Using Drush to move stuff from dev to prd


Revision 2.0

If you’re hosting Drupal sites, you have probably searched for scripts to make publishing from development to production easier, and stumbled upon things like aliases, rsync and sql-dump to help you along.

To save you some time, here is the short script that we use for propagation between our servers.

First you need your aliases.drushrc.php (we keep ours in the ~/.drush folder):

<?php
$aliases['tbs-dev'] = array(
  'root' => '/sites/tbs/www',
  'uri' => 'http://www.travelbysuitcases.local',
  'path-aliases' => array(
    '%drush' => '/usr/bin',
    '%site' => 'sites/default/',
  ),
  'command-specific' => array(
    'rsync' => array(
      'mode' => 'avz',
      'exclude-paths' => 'files',
    ),
  ),
);

$aliases['tbs-prd'] = array(
  'root' => '/sites/tbs/www',
  'uri' => 'http://www.travelbysuitcases.com',
  'remote-host' => '10.1.4.12',
  'path-aliases' => array(
    '%drush' => '/usr/bin',
    '%site' => 'sites/default/',
  ),
  'command-specific' => array(
    'sql-sync' => array(
      'sanitize' => FALSE,
      'no-ordered-dump' => TRUE,
      'structure-tables' => array(
        'common' => array(
          'cache', 'cache_filter', 'cache_menu', 'cache_page', 'history', 'sessions', 'watchdog',
        ),
      ),
    ),
    'rsync' => array(
      'mode' => 'avz',
      'exclude-paths' => 'files',
    ),
  ),
);

There might be a few things that you notice. We use three-letter shortnames (CMS, INT, TBS and so on) for our websites, followed by either -prd or -dev. Further, we do not sanitize data when we push to the prd database, as we develop with the full dataset; you might want a different approach there.
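If you do want scrubbed data on the receiving side, sql-sync can do that for you on a one-off basis. A hedged example, using the alias names from above; --sanitize rewrites user e-mail addresses and passwords in the copied database:

```shell
# Hypothetical one-off sync that sanitizes user data on the way over,
# overriding the 'sanitize' => FALSE set in the alias file above.
drush sql-sync @tbs-prd @tbs-dev --sanitize
```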

Also, as we found it makes no sense to push our entire media library across on every update, we exclude it from the rsync. Instead we have the /sites/default/files folder mounted as an NFS share on both our prod and dev servers.
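For completeness, a sketch of what that shared files mount could look like; the storage host and export path below are invented for illustration, not our actual setup:

```shell
# Hypothetical NFS mount for the shared Drupal files folder; adjust
# host, export and mount point to your own environment.
sudo mount -t nfs storage.local:/export/tbs-files /sites/tbs/www/sites/default/files

# Or make it permanent via /etc/fstab:
# storage.local:/export/tbs-files  /sites/tbs/www/sites/default/files  nfs  defaults  0  0
```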

Adjust where necessary.

Now for our little bash script that moves stuff around:

#!/bin/bash
# PRD 2 DEV Drush by j.kool@integrative.it
# Revision 2.0
#

STARTTIME=$(date +%s)
if [ -z "$1" ] ; then 
 echo "Usage: prd2dev <shortname>"
 exit 1
fi

cd /sites

# Check if shortname is a valid alias (~/.drush/aliases.drushrc.php)
if drush sa | grep -Fxq "@$1-prd"; then
 # Get the DEV and PRD paths
 dpath=$(drush @$1-dev dd)
 ppath=$(drush @$1-prd dd)
 # Get DEV and PRD URL for later (uuid)
 devurl=$(drush sa --table |grep "@$1-dev" |awk '{print $3}')
 prdurl=$(drush sa --table |grep "@$1-prd" |awk '{print $3}')
 date=$(date +%Y-%m-%d.%H-%M-%S)
 # Clean cache / set maintenance and backup the DEV site
 drush @$1-prd vset maintenance_mode 1 &
 drush @$1-dev vset maintenance_mode 1 &
 wait
 echo @$1-prd and @$1-dev in maintenance mode
 drush @$1-prd cc all &
 drush @$1-dev cc all &
 wait
 echo @$1-prd and @$1-dev cache cleared
 rm /sites/backup/$1-prd-*.tar.gz 
 drush @$1-dev ard --destination=/sites/backup/$1-prd-$date.tar.gz --tar-options="--exclude=files"
 
 # Save the dPath and pPath and Build.txt
 echo $dpath > $dpath/dev-path.txt
 echo $ppath > $dpath/prd-path.txt
 echo $date > $dpath/build-$date.txt
 echo Local backup made of @$1-dev to /sites/backup/$1-prd-$date.tar.gz 
 #RSYNC and SQL Sync the PRD to DEV (remove all obsolete files)
 drush --yes -v rsync @$1-prd @$1-dev &
 drush --yes sql-sync @$1-prd @$1-dev &
 wait
 echo SQL and RSYNC completed
 # Set css and js compression to 0
 drush @$1-dev vset preprocess_css 0 &
 drush @$1-dev vset preprocess_js 0 &
 # Disable Google Analytics and Mollom - set UUID base
 if [ -d $dpath/sites/all/modules/mollom ]; then
 drush @$1-dev --yes dis mollom &
 fi
 if [ -d $dpath/sites/all/modules/google_analytics ]; then
 drush @$1-dev --yes dis googleanalytics &
 fi
 if [ -d $dpath/sites/all/modules/uuid ]; then
 drush @$1-dev vset uuid_redirect_external_base_url $devurl &
 fi
 wait
 echo Module maintenance completed
 drush @$1-prd vset maintenance_mode 0 &
 drush @$1-dev vset maintenance_mode 0 &
 wait
 echo @$1-prd and @$1-dev Back ON-LINE
else
 echo "$1 not found as Shortname in alias list"
 exit 1
fi
ENDTIME=$(date +%s)
echo "$1-prd deployed to $1-dev in $(($ENDTIME - $STARTTIME)) seconds... COPY Backup in background"
scp -c arcfour128 /sites/backup/$1-prd-$date.tar.gz netbackup@192.168.1.199:~/$1-prd-$date.tar.gz &


It’s no rocket science: we first check if the shortname is available on the server, then stash the URLs and paths in some variables as we might need them later. We also take a backup of the dev side (it is about to be overwritten), rsync and sql-sync everything over, do some maintenance on modules that should stay disabled on dev (things like Mollom and Google Analytics), and finally switch the maintenance mode off again, et c’est ça.
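The shortname check boils down to an exact-match grep against the drush sa output. A self-contained sketch of that pattern, with the alias list faked by printf instead of coming from a live drush:

```shell
#!/bin/sh
# Demo of the exact-match alias check; the alias list is faked here,
# whereas the real script reads it from `drush sa`.
aliases=$(printf '@tbs-dev\n@tbs-prd\n@cms-dev\n@cms-prd\n')

check() {
  # -F: fixed string, -x: whole-line match, -q: quiet (exit code only)
  if printf '%s\n' "$aliases" | grep -Fxq "@$1-prd"; then
    echo "$1: ok"
  else
    echo "$1: not found"
  fi
}

check tbs   # prints: tbs: ok
check tb    # prints: tb: not found (-x prevents a partial match on @tbs-prd)
```

The -x flag is what keeps a shortname like "tb" from accidentally matching the "@tbs-prd" alias.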

As of revision 2.0 I run some jobs in parallel on dev and prd to speed everything up a bit (the & and wait combinations). Also, we move our backups to an NFS share.
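The &/wait pattern is plain POSIX shell: background each job with &, then let wait block until all of them have finished. A tiny standalone illustration, with sleep standing in for the drush calls:

```shell
#!/bin/sh
# Demo of the &/wait parallelization used in the script above.
STARTTIME=$(date +%s)

sleep 2 &   # imagine: drush @tbs-prd cc all &
sleep 2 &   # imagine: drush @tbs-dev cc all &
wait        # blocks until every background job has exited

ENDTIME=$(date +%s)
ELAPSED=$((ENDTIME - STARTTIME))
echo "both jobs finished in ${ELAPSED}s"   # ~2s instead of the ~4s a serial run would take
```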

The script above is PRD2DEV; DEV2PRD is basically the reverse of it.

I could go into detail on each little piece of bash code, but it’s pretty self-explanatory, I think.

Hope it saves someone some time.

Cheers