
Long Road - My Site Backup Process

By Ryan Lambert -- Published December 03, 2013

Warning: This post is outdated. It is here for reference purposes only.

This post is part of the series Long Road to the Right Solution and gives an overview of how I have set up our backup collection process for all client sites.

System Components

The list below covers the main components my system requires to collect the backups from our client sites:

- A central VPS that collects and stores the backups
- The PHP program that runs the backup collection process
- A back-end MySQL database that stores the backup schedules and connection details
- SSH public/private key pairs so servers can connect securely without stored passwords
- rsync for transferring site files
- Jenkins and Phing for scheduling, deployment, and testing

Backup Collection Inner Workings

The main guts of this system is the little program I've built in PHP to run on the server. When executed, the program runs a query against the database to pull the needed data for any backups that are scheduled for that hour. If no backups need to be taken, it exits. If backups are scheduled, it loops through the backup process: rsync the site files, run the database backup, transfer the db backup, package the files & db backup in an archive, and move it to the proper location for short-term backup storage. Sounds simple, right?
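
To make that flow concrete, here's a minimal sketch of the hourly run. The table, function, and variable names here are illustrative stand-ins, not the actual code from my project:

// Hypothetical outline of the hourly run; the helper functions are stand-ins
$scheduled = getScheduledBackups($db, date('G'));  // backups due this hour

if (count($scheduled) === 0) {
    exit(0);  // nothing scheduled this hour
}

foreach ($scheduled as $site) {
    rsyncSiteFiles($site);             // pull site files to the VPS via rsync
    runDatabaseBackup($site);          // mysqldump on the client server
    transferDatabaseBackup($site);     // scp the dump back to the VPS
    $zip = packageBackup($site);       // archive files + dump into one .zip
    moveToShortTermStorage($zip);      // drop it in short-term backup storage
}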

Connecting the Pieces

Getting everything to connect properly was the trickiest part for me. Each website is on its own server with its own IP and login information. Part of the difficulty was that when I started this project we didn't have a VPS, so the tools at my disposal were much more limited. Also, this was my first real experience with using public/private keys to allow servers to talk to each other securely, and to securely store passwords. Asymmetric encryption is a big, confusing topic to get into, but it was one of the more interesting things I have researched this year!

A big piece of the puzzle was getting the secure connection between the central VPS and each client site, and transferring a large number of files. I didn't want to have to store these account passwords, so I spent a lot of time researching and testing SSH connections using public/private keys. For transferring the files for each website I tried working with SFTP, which was very buggy, I assume because of the large file sizes. I tried using SCP, which worked fairly well but was slower than expected. Another issue with both of those methods was the need to archive the site files on the client server. If I didn't .zip the files before transferring, it was much slower because of the large number of files. Then I tried rsync, which it turns out is just what I was looking for all along! I don't have to .zip the files on the client server first, and it all comes over blazing fast. Below is the code that copies the target site files to the VPS to prepare them to be archived.

// Build the remote-shell option so rsync connects over SSH with our private key
$ssh = ' "' . $config->pathToSSH . ' -i ' . $config->keyPrivateSSH . '" ';
// -a preserves permissions and timestamps, -z compresses data during transfer
$cmd = 'rsync -az -e ' . $ssh . ' ' . $username . '@' . $remoteHost . ':' . $backupPath . ' ' . $tmpPathLocal;
exec($cmd, $output, $status);

Get the Database

Now we have the files for the site, but we need to get the database as well. The code below shows the command I use. Note that the dump gets saved on the client site's server, and I scp it back to the VPS. Also, the password is stored in an encrypted file that can only be decrypted using the account's super-secret private key.

$command = "ssh -i $config->keyPrivateSSH $sshUser@$sshHost ";
$command .= "'mysqldump --skip-comments --skip-extended-insert ";
$command .= " -u$user -p$pw $db > $filepath'";
$output = array();
$status = null;
exec ($command, $output, $status);

As I mentioned, I scp the file back to the VPS into the tmp directory along with the site files. Lastly, I archive it all in one .zip and move it to the short-term backup storage.
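
Sketching those last two steps, in the same style as the snippets above (the $tmpPathLocal, $archiveName, and $storagePath variables are my assumptions, not the original code):

// Copy the database dump from the client server back to the VPS tmp directory
$cmd = "scp -i $config->keyPrivateSSH $sshUser@$sshHost:$filepath $tmpPathLocal";
exec($cmd, $output, $status);

// Package the site files and database dump into a single .zip archive,
// then move it to short-term backup storage
$cmd = "cd $tmpPathLocal && zip -r $archiveName . && mv $archiveName $storagePath";
exec($cmd, $output, $status);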

Jenkins

Alright, so when I said that Jenkins was required, that was a lie. Sorry about that. I designed everything so it can be scheduled with just CRON, but I have really come to love the convenience of using Jenkins. The best part is that each job documents all of its steps in its configuration, and it never forgets a step or three like a human might (ahem... me).
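
If you skip Jenkins, a single CRON entry is all you need to drive the collector. The paths below are made-up placeholders:

# Check for scheduled backups at the top of every hour
0 * * * * php /path/to/backup-collector/run.php >> /var/log/backup-collector.log 2>&1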

The beauty of Jenkins is the ease of setting up different development and production environments. I have only one repository of code for this entire project, and it has been designed to be run in various pieces. Most of the work Jenkins does is to trigger various targets in Phing, but it runs some shell commands as well. Having the steps documented in the jobs makes it easy to replicate and/or modify for different server environments.
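
For anyone unfamiliar with Phing, the build file is XML, much like Ant's. Here's a stripped-down sketch of what a properties-loading target could look like; the file layout and property names are my assumptions, not my actual build file:

<?xml version="1.0"?>
<project name="backup-collector" default="loadProperties">
    <!-- Hypothetical sketch; ${server} is passed in on the command line -->
    <target name="loadProperties">
        <property file="config/${server}.properties" />
        <echo msg="Loaded properties for ${server}" />
    </target>
</project>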

Development Jobs

On my development VM I have a few jobs based on this code for different purposes, but the main job builds and tests everything available. The first step of all the jobs is to run a couple chown commands on directories to ensure that Jenkins can do everything it needs; this also keeps anyone else from messing with the files while Jenkins is doing its thing. The next command that every job runs is to trigger a target in Phing called "loadProperties". This target is passed an option that defines the server running it and loads the appropriate server's properties file for Phing. This allows me to easily switch between different servers (such as dev and prod) while having all the appropriate file paths and settings for the current server. After this, most steps are optional for various purposes, but I'm going over all of them.
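
As a rough sketch, the first build steps of a job boil down to something like this (the paths and the jenkins user/group are placeholders):

# Lock the workspace down so only Jenkins can touch it during the build
chown -R jenkins:jenkins /var/lib/jenkins/workspace/backup-collector

# Load the properties file for this server; dev here, prod elsewhere
phing -Dserver=dev loadProperties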

Next, the job copies a couple files from source control and applies REGEX using the freshly created properties files. It also triggers a target that uses dbdeploy to ensure the back-end database exists and is up-to-date. Once the database has been checked, it deploys the files for the web interface (built using Yii) and copies the code that handles restoring backups to sites on the local server. The last thing Jenkins does before changing file permissions back to normal is to run a suite of tests using PHPUnit. Unit testing is something I've added very recently to my process, but it has radically improved how I develop. At the time of writing I have the main pieces of this repository at 80% code coverage. (Ok, so not all the classes are included yet so it's way lower overall, but I can only tackle so much at a time!)
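
Continuing the sketch, the remaining steps look roughly like this. Only loadProperties is a real target name from my build file; the others are invented for illustration:

# Ensure the back-end schema is current, then deploy the web interface
phing updateDatabase
phing deployWebApp

# Run the test suite, logging results and coverage for the post-build steps
phpunit --log-junit build/logs/phpunit.xml --coverage-html build/coverage tests/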

In the post-build steps Jenkins publishes the PHPUnit test results that show any errors that might have occurred as well as the code coverage report in HTML format, accessible from a click in the interface! This comes in quite handy, I must say. You can see that I have one failing test in this screenshot. That's because it's my development environment, and I just recently made a change I knew would break that test!

Production Jobs

One thing I didn't mention above is that the jobs on my development server are almost always set to build from whatever branch/commit is most recent. This is good for testing, but Jenkins has the perfect option for production servers as well. I use git for my version control, and Jenkins has a great plug-in to integrate with your existing repositories. Jenkins gives you the option to specify a branch -- this means you can limit the job to ONLY build from a specific branch. So, the logical thing for me to do was create the job on the production server to only build from the */master branch. This allows me to push my development branches to the production server (I don't mind having another backup!) without having to worry about them being accidentally built to production.

Now, the actual job... Production is much simpler than development! On our production server I don't want to run the full suite of tests every time it runs backups; I just want it to collect backups! So, it uses Phing to ensure the files and database schema are up-to-date, and then it runs the script that checks for backups (outlined above!).

Final Thoughts

I do know that my system has a lot of room for improvement and I don't doubt that it has holes, but it's infinitely better than managing backups using Filezilla and plain FTP. Because managing backups that way means it isn't getting done at all!

Last Updated April 13, 2019