A while ago we posted a simple database dump and FTP backup script we use to back up all the site databases daily. Since jet lag is setting in pretty badly, at 4:30 this morning I added an extra option to upload the dumps to Amazon’s S3 storage service.
You need a command-line app to get the database dumps into S3. For our Debian server I installed s3cmd with the usual apt-get install s3cmd.
Next, set up s3cmd with s3cmd --configure to create a profile. You’ll need the Access Key and Secret Key from your Amazon account (under Security Credentials). We made a new key pair just for backups. I didn’t bother setting up the GPG encryption or HTTPS options for now. The s3cmd documentation is here.
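The configure wizard writes your profile to ~/.s3cfg. A minimal sketch of the lines that matter (the key values here are placeholders, and the file contains many more optional settings):

```ini
# ~/.s3cfg — written by s3cmd --configure
[default]
access_key = YOUR_ACCESS_KEY
secret_key = YOUR_SECRET_KEY
```

Keep this file readable only by the user running the backup script, since it holds your credentials in plain text.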
Run s3cmd ls to list all your buckets and make sure the connection is working. If you haven’t already, create a bucket for your backups using the S3 web interface or from the command line with s3cmd mb s3://yourbucket, as described in the documentation.
To upload to S3, just add a single line to the old backup script’s processing loop:

s3cmd put back_$i-$day-$hostname.zip s3://dpdbbackup/back_$i-$day-$hostname.zip

Replace dpdbbackup with your bucket name (the version on the wiki now uses a variable to set this).
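To show where that line lands, here is a rough sketch of the loop, assuming the original script’s naming scheme (back_$i-$day-$hostname.zip); the database list and the way $day is derived are placeholders, so adjust them to match your own script:

```shell
#!/bin/sh
# Hypothetical sketch of the backup loop with the S3 upload step added.
BUCKET=dpdbbackup            # replace with your own bucket name
day=$(date +%A)              # e.g. "Monday"; use whatever your script uses
hostname=$(hostname)

for i in sitedb forumdb; do  # placeholder database names
  file="back_$i-$day-$hostname.zip"
  # ... mysqldump/zip step from the original script goes here ...
  if [ -f "$file" ]; then
    # the single new line: push the dump to S3
    s3cmd put "$file" "s3://$BUCKET/$file"
  fi
done
```

The [ -f ... ] guard just skips the upload if the dump step failed, so one broken database doesn’t abort the whole run.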
Run the script with ./db.sh (or whatever you named it) to test.