Not Dead Head



We Don’t Need No Stinkin’ Architects! #infoq

Michael Stiefel attempts to clear up some misunderstandings regarding the role played by the software architect.

An intimidating title, but don't let it scare you: you will see why the so-called generalist is a myth and why, even though you practice DevOps, agile, and so on, you still need architects — as well as test engineers and testers, senior and junior developers, system engineers/administrators, scrum masters, and so on.


Watch the talk at InfoQ: Architects? We Don't Need No Stinkin’ Architects!.

Resistance is futile!

Know your Kung-Fu: Bash scripting … dynamic variable names using arrays/hashes.

When you read about arrays in the bash manpage, you will find:

… Arrays are assigned to using compound assignments of the form name=(value1 … valuen), where each value is of the form [subscript]=string. Only string is required. If the optional brackets and subscript are supplied, that index is assigned to; otherwise the index of the element assigned is the last index assigned to by the statement plus one. Indexing starts at zero. …

Any element of an array may be referenced using ${name[subscript]}.  The braces are required to avoid conflicts with pathname expansion. …

So to me that sounds like bash arrays are actually hashes, with dynamic keys that default to integers starting at 0. Nice.
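A quick illustration of the manpage behavior described above (the values here are made up):

```shell
#!/usr/bin/env bash
# indexed array: an omitted subscript continues from the last assigned index
fruits=( "apple" "banana" [5]="fig" "grape" )
echo "${fruits[1]}"    # banana
echo "${fruits[6]}"    # grape -- follows the explicit [5]
echo "${#fruits[@]}"   # 4 -- only assigned indices count

# associative array (bash >= 4): string keys instead of integers
declare -A ages=( ["alice"]=30 ["bob"]=25 )
echo "${ages[bob]}"    # 25
```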

I must say, I don't use hashes or arrays in bash very often (if you need complex data structures, you should use an appropriate programming language), and here I actually used them to get something else: dynamic variable names. By the way, this is also known as indirect variable reference.
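In a nutshell, indirect variable reference means building the variable's name at runtime and resolving it afterwards. A minimal sketch (the cluster names and keys here are made up):

```shell
#!/usr/bin/env bash
# two associative arrays whose name we want to pick at runtime
declare -A cluster_1=( ["DB_USER"]="alice" )
declare -A cluster_2=( ["DB_USER"]="bob" )

REGION="cluster_2"

# eval expands ${REGION} first, so the inner expansion
# becomes ${cluster_2[DB_USER]}
user=$(eval echo "\${${REGION}[DB_USER]}")
echo "$user"   # bob

# since bash 4.3 a nameref does the same without eval
declare -n region_ref="$REGION"
echo "${region_ref[DB_USER]}"   # bob
```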

This is just a "Mommy, mommy, look what I can do!" script; the correct way to go would be to use only one CLUSTER array, or you could store this information in JSON format somewhere and access it from there.
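The single-CLUSTER-array variant could look like this — bash can't nest arrays, so composite keys are one way to fake an "array of hashes" (the field names and values here are illustrative):

```shell
#!/usr/bin/env bash
# one associative array keyed "<cluster>,<field>"
# instead of one array per cluster
declare -A CLUSTER=(
  ["cluster_1,DB_USER"]="some_user"
  ["cluster_1,DB_HOST_RO"]="My_Slave"
  ["cluster_2,DB_USER"]="other_user"
  ["cluster_2,DB_HOST_RO"]="Other_Slave"
)

REGION="cluster_2"
echo "${CLUSTER[${REGION},DB_USER]}"   # other_user
```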

#!/bin/bash
set -e

usage() {
cat << EOF
usage: $0 options

This script creates mysql dumps from the given cluster's mysql server or
RDS instance and dumps it into the production database.

       -d Database to dump
       -h Show this message
       -r the region [cluster_1|cluster_2|cluster_3]
       -t database table
       -v Verbose
EOF
exit 0
}

while getopts "hd:r:t:v" OPTION; do
  case $OPTION in
    h) usage ;;
    d) DATABASE=$OPTARG ;;
    # internal hack because array names in bash can't contain '-'
    r) REGION=${OPTARG//-/_} ;;
    t) DB_TABLE=$OPTARG ;;
    v) set -x ;;
    *) usage ;;
  esac
done

if [[ -z $REGION ]] || [[ -z $DATABASE ]] || [[ -z $DB_TABLE ]]; then
  usage
fi

# define some stuff
DUMP_DIR="/var/backups/mysql"   # dump target (placeholder; not defined in the original snippet)
MYSQL_OPT=""                    # extra mysqldump options, if any

# all the target location and credentials
declare -A cluster_1
declare -A cluster_2
declare -A cluster_3

# TODO: get the server names using the ec2-api-tools
cluster_1=(["DB_USER"]="some_user" \
           ["DB_PASSWD"]="p@ssw0rd" \
           ["DB_HOST_RO"]="My_Slave" \
           ["DB_HOST_RW"]="My_Master" )

cluster_2=(["DB_USER"]="some_user" \
           ["DB_PASSWD"]="p@ssw0rd" \
           ["DB_HOST_RO"]="My_Slave" \
           ["DB_HOST_RW"]="My_Master" )

cluster_3=(["DB_USER"]="some_user" \
           ["DB_PASSWD"]="p@ssw0rd" \
           ["DB_HOST_RO"]="My_Slave" \
           ["DB_HOST_RW"]="My_Master" )

# backup production data
# dynamic variable names are resolved using bash's indirect variable reference
BCKP_DUMP_FILE="${DUMP_DIR}/bckp_${REGION}_${DB_TABLE}_$(date +%Y%m%d%H%M).dump"
echo "Dumping \"${DB_TABLE}\"-table from ${REGION} production database"
time /usr/bin/mysqldump --single-transaction ${MYSQL_OPT} \
                        -u$(eval echo "\${${REGION}[DB_USER]}") \
                        -p$(eval echo "\${${REGION}[DB_PASSWD]}") \
                        -h$(eval echo "\${${REGION}[DB_HOST_RO]}") \
                        ${DATABASE} ${DB_TABLE} > ${BCKP_DUMP_FILE}
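And for completeness, a hedged sketch of the JSON alternative mentioned above — the same lookup if the credentials lived in a JSON file instead of bash arrays (assumes jq is installed; the file name and fields are made up):

```shell
#!/usr/bin/env bash
# clusters.json is a hypothetical stand-in for "somewhere"
cat > clusters.json <<'EOF'
{ "cluster_1": { "DB_USER": "some_user", "DB_HOST_RO": "My_Slave" } }
EOF

REGION="cluster_1"
# -r prints the raw string, --arg passes the region in safely
jq -r --arg r "$REGION" '.[$r].DB_USER' clusters.json   # some_user
```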