Automatic WordPress backups to Amazon S3

I use WordPress for many sites, and it's really important for me to be able to automatically perform nightly backups of not just the database, but all files, including uploads, themes, plugins and the core WordPress files. I need the ability to revert any file back to a previous version.

I found a variety of solutions for backing up files to S3, but none that exactly met my needs, so I decided to post my work here.

Note that these instructions assume your web server is Unix, Linux or Mac OS X based. They also assume you have a basic understanding of FTP, SSH, Firefox extensions and compressed files.

1. Get the s3-bash code library

The s3-bash library is a collection of scripts that can get, put or delete files from Amazon S3. You can grab them from Google Code.

2. Unpack and upload these scripts to your server

They should probably live outside of your webroot. Depending on how your server is set up, you might put them in:

/home/USER/s3

Where USER is your username.
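
Once the archive is on your server (uploaded over FTP, or fetched with wget if you have SSH access), unpacking and moving it into place looks something like this. The archive and folder names below are placeholders for whatever the Google Code download is actually called, so adjust accordingly.

# Unpack the s3-bash archive and move the scripts into /home/USER/s3
# (archive and folder names are placeholders; use the real ones from the download)
cd /home/USER
tar -xzf s3-bash.tar.gz
mv s3-bash /home/USER/s3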

3. Give s3-put the proper permissions

We need to be able to execute s3-put, so chmod it to 744.

chmod 744 /home/USER/s3/s3-put

4. Sign up for an Amazon S3 Account

You'll need an Amazon S3 account, so sign up and then get your access key ID and secret access key on the AWS access identifiers page.

5. Put your secret access key in a text file and upload it to the server

You could call it s3.key and put it in the same folder you uploaded the s3-bash scripts to. Chmod it to 640.

chmod 640 /home/USER/s3/s3.key
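
If you'd rather create the key file over SSH than upload it, something like the following works. printf avoids the trailing newline that echo would add, since s3-bash may reject a key file that isn't exactly the 40-character secret key; the key shown is obviously a placeholder.

# Write the secret access key to s3.key without a trailing newline
# (YOUR_SECRET_ACCESS_KEY is a placeholder for your real 40-character key)
printf '%s' 'YOUR_SECRET_ACCESS_KEY' > /home/USER/s3/s3.key
chmod 640 /home/USER/s3/s3.key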

6. Create a folder for the backups

Make a folder on your web server for backups, perhaps:

/home/USER/backups

Where USER is your username.
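
If you're creating the folder over SSH, it's a one-liner; tightening the permissions is optional but sensible, since the database dump will sit in there.

# Create the backups folder and keep it private to your user
mkdir -p /home/USER/backups
chmod 700 /home/USER/backups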

7. Create a bucket on Amazon S3 for your backups

The easiest way I found to do this is to use the Firefox extension S3 Firefox Organizer. Add the extension, restart Firefox and then go to "Tools" > "S3 Organizer". Click "Manage Accounts", enter your access key ID and secret access key, and then log in. Create a bucket (directory) for your backups. I created one called bnee.

8. Write the backup script

This is the tricky part. Here is the script I came up with. The [[Double Bracketed]] items are places where you'll have to enter information specific to your account (see below).

#!/bin/bash

USER=[[USER]]

# Dump all MySQL databases into the backups folder
mysqldump --add-drop-table -u [[DBUSER]] -p[[DBPASSWORD]] -A > /home/$USER/backups/mysql.sql

# Remove the previous archive, then compress the webroot and the database dump together
rm -f /home/$USER/backups/backup.tgz
tar -czvf /home/$USER/backups/backup.tgz [[PATHTOWEBROOT]] /home/$USER/backups/mysql.sql

# Upload the archive to S3 with a date-stamped filename
NOW=$(date +_%b_%d_%y)
/home/$USER/s3/s3-put -k [[S3ACCESSKEYID]] -s /home/$USER/s3/s3.key -T /home/$USER/backups/backup.tgz -v /[[S3BUCKET]]/backup$NOW.tgz

[[USER]] - Because the username of my web server is used a lot in the paths, this is a variable.
[[DBUSER]] - The username for your MySQL database. You can get this from the wp-config.php file of your WordPress installation.
[[DBPASSWORD]] - The password for your MySQL database. You can get this from the wp-config.php file of your WordPress installation.
[[PATHTOWEBROOT]] - The path to the webroot. It's likely something like /home/USER/public_html.
[[S3ACCESSKEYID]] - The access key ID for Amazon S3 (see step 4).
[[S3BUCKET]] - The name of the bucket (directory) you created in step 7.
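
To make the naming concrete: the date command builds a suffix from the current date, so each night's upload gets its own filename in the bucket instead of overwriting the previous one. For example:

NOW=$(date +_%b_%d_%y)   # e.g. _Mar_15_09
echo backup$NOW.tgz      # prints something like backup_Mar_15_09.tgz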

You might have to change the various paths depending on the way your web server is set up and the folder names you chose in previous steps.

Paste all of this into a text file and name it something like backup.sh. Upload it to your web server, outside of the webroot.

/home/USER/backup.sh
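
One extra precaution: backup.sh contains your database password in plain text, so it's worth restricting its permissions the same way you did for s3.key.

chmod 700 /home/USER/backup.sh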

9. Test the script

Verify that the script is working by running it.

sh /home/USER/backup.sh

If everything is working OK, you'll see a lot of output as each file of your website is added to the compressed backup and then uploaded to Amazon S3. It may take a minute or so to run, but when it is done you can verify that the script worked by using the S3 Firefox Organizer extension to check that the file ended up where you expected it to.
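
s3-bash also includes an s3-get script, so you can verify from the command line instead of from Firefox. This is only a sketch: I'm assuming s3-get takes the same -k and -s flags as s3-put and writes the object to standard output, so double-check against the s3-bash documentation.

# Sketch: download the latest backup and list its contents to confirm it's intact
NOW=$(date +_%b_%d_%y)
/home/USER/s3/s3-get -k [[S3ACCESSKEYID]] -s /home/USER/s3/s3.key /[[S3BUCKET]]/backup$NOW.tgz > /tmp/backup-check.tgz
tar -tzf /tmp/backup-check.tgz | head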

10. Automate it

Add a cron job to run it every night:

0 0 * * * sh /home/USER/backup.sh

If everything is configured correctly, this should perform a full backup nightly at midnight.
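
To install the job, run crontab -e as your user and add that line. If you'd like a record of each run (and less cron email), a variation like this redirects the output to a log file:

# Nightly at midnight, with output appended to a log file
0 0 * * * sh /home/USER/backup.sh >> /home/USER/backups/backup.log 2>&1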

Note for Subversion users: if you'd like to exclude all of the .svn folders and files from your backup, just use

--exclude=".svn"

before the [[PATHTOWEBROOT]] in step 8 above.
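
In other words, the tar line in the backup script becomes something like:

tar -czvf /home/$USER/backups/backup.tgz --exclude=".svn" [[PATHTOWEBROOT]] /home/$USER/backups/mysql.sql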