HOW-TO: MYSQL database remote backup script

in how-to, scripts by DP | 6 comments

Thanks to amazing Linux server admin Alex, this site is extremely well backed up. Every hour we make a complete copy of the local hard drive, and then incrementally sync any new data to an offsite storage location.

If all this fails, we want to preserve the most important part of the site – the databases that hold three years of posts, wikis, and forum discussions. During the recent server maintenance we spun up a new shell script to dump the databases, zip them up, and FTP a copy to a secondary offsite server. It’s way more flexible than the dump script we used previously, and maybe someone out there will find it useful.

Script and walk-through follow below the fold. There is also a wiki version of this post if you’d like to contribute, or see the latest version.

This is a simple command line script for a Linux server that:

  1. Dumps a list of databases to .SQL files
  2. Zips the .SQL files
  3. FTPs the .ZIP backups to a remote server
  4. Deletes the .SQL backup files (the .ZIP files accumulate indefinitely)

There is no garbage collection or backup rotation. .ZIPed backups will accumulate on the local server and the remote FTP server until you manually clear them out. This might be updated in the future.
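If unbounded growth is a concern, a small pruning step can be bolted on locally. This is a sketch, not part of the original script: `BACKUPDIR` and the 30-day window are example values you’d adjust to match your `shdir`.

```shell
## Prune local ZIP backups older than 30 days.
## BACKUPDIR is an example path -- point it at your shdir.
BACKUPDIR="${BACKUPDIR:-/home/dp/}"
if [ -d "$BACKUPDIR" ]; then
  find "$BACKUPDIR" -maxdepth 1 -name 'back_*.zip' -mtime +30 -delete
fi
```

Note this only trims the local copies; the ZIPs on the remote FTP server would still need their own cleanup.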

The script goodness

 #!/bin/bash

 ## list of databases to back up
 declare -a arr=(forum mediawiki track wordpress zencart)

 ## MySQL backup user login and password
 ## should have SELECT and LOCK TABLES privileges on the databases to back up
 user='backup_guy'
 pass=''

 ## Remote FTP backup location details
 ftpurl='ftp.remoteserver.com'
 ftpfolder='/' #if user's home directory, be sure to keep the single /
 ftpuser='ftp_login'
 ftppass=''

 ## Home directory of the script, used to store ZIPs and delete the daily SQL dumps
 shdir='/home/dp/' #replace with your home directory

 ## Date and hostname strings
 day=$(date | awk '{ print $2"-"$3"-"$6 }')
 hostname=$(hostname -s)

 ## now loop through the above array
 for i in "${arr[@]}"
 do
   echo "$i" # current DB
   mysqldump -u "$user" --password="$pass" "$i" > "$i.sql" #dump db to file
   zip "back_$i-$day-$hostname.zip" "$i.sql" #zip db to dated file
   ncftpput -u "$ftpuser" -p "$ftppass" "$ftpurl" "$ftpfolder" "back_$i-$day-$hostname.zip"
 done

 rm -f "$shdir"*.sql #clear out the local SQL files

Basic use

You’ll need some basic login and directory info to configure the script.

MySQL database names

 ## list of databases to back up
 declare -a arr=(forum mediawiki track wordpress zencart)

These are the databases you’ll back up. Separate each name with a space.

MySQL backup user login

 ## MySQL backup user login and password
 ## should have SELECT and LOCK TABLES privileges on the databases to back up
 user='backup_guy'
 pass=''

Enter the user name and password of the MySQL backup user. This user should have SELECT and LOCK TABLES permissions on all the databases to back up. We use a dedicated user with only these permissions for security.
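A dedicated user like that can be created in one step. This is a hypothetical example (the user, password, and database names are placeholders), run once by a MySQL admin:

```sql
-- Example one-time setup for a least-privilege backup user
CREATE USER 'backup_guy'@'localhost' IDENTIFIED BY 'choose_a_password';
GRANT SELECT, LOCK TABLES ON forum.* TO 'backup_guy'@'localhost';
-- repeat the GRANT for each database in the backup list
FLUSH PRIVILEGES;
```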

Remote FTP backup location info

 ## Remote FTP backup location details
 ftpurl='ftp.remoteserver.com'
 ftpfolder='/' #if user's home directory, be sure to keep the single /
 ftpuser='ftp_login'
 ftppass='' 

Address and FTP login details for the server where the backups will be sent via FTP.

Home directory

 ## Home directory of the script, used to store ZIPs and delete the daily SQL dumps
 shdir='/home/dp/' #replace with your home directory 

Set the home directory of the script. This location is used to store ZIPs and delete the daily SQL dumps.

This should be the SAME directory the .sh script is run from! It could probably be derived from the script’s own path or an environment variable, but this works OK.
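For what it’s worth, a common bash idiom can resolve the directory automatically instead of hard-coding it. This is a sketch, not part of the script above:

```shell
## Sketch: resolve the directory containing the script at runtime,
## so shdir no longer has to be hard-coded.
shdir="$(cd "$(dirname "$0")" && pwd)/"
echo "dumps will land in $shdir"
```

This assumes the script is invoked by path (e.g. `./db.sh` or from cron with a full path), in which case `$0` holds that path.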

Test the script

1. Get the script onto your server, then set your database list, MySQL backup user, remote FTP login, and local shell location (for deleting the daily .SQL dumps)

2. Set execute permissions:

  • chmod u+rx db.sh (for user only) -or-
  • chmod +rx db.sh (all users, less safe)

3. Test from the command line

  • ./db.sh

4. Add it to your crontab and schedule it to run daily
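For step 4, a crontab entry along these lines (the path and schedule are examples) runs the script every night and keeps a log of its output:

```
30 2 * * * /home/dp/db.sh >> /home/dp/db-backup.log 2>&1
```

Edit with `crontab -e` as the user that owns the script.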


Comments

  1. Mark says:

    I use a very similar approach but push the MySQL dumps to my Dropbox account using some command line tools. No particular reason to do so other than it being rather neat. Your databases may well be too large to make this a practical proposition. The command line tools that I use are here: http://www.andreafabrizi.it

    Mark

  2. ducksauz says:

    Nice to see you’re doing security right with the backup_guy mysql account, that’s a very nice touch. However, you’re then using FTP to transfer the files. Is this because the dest host doesn’t allow you to use scp?

  3. fredb says:

    You should be able to backup and gzip in one step to avoid the need for deleting the sql files.

    mysqldump -u $user --password=$pass $i | gzip > "back_$i-$day-$hostname.sql.gz" #dump db to file

    To restore a database backup saved with gzip, you can use either gunzip or zcat

    gunzip < backupfile.sql.gz | mysql -u username -p dbname
    or
    zcat backupfile.sql.gz | mysql -u username -p dbname

    Both of the above commands should restore the backup. I usually use zcat to restore.

  4. Ian says:

    Thanks for the tips. I’ll check out scp and update to dump zip directly.

  5. Have you ever taken a look at http://aws.amazon.com/glacier/ ? Should be the most cost-effective alternative.

  6. Can’t get this to work. Maybe someone can help me.

    I am getting this error:

    fun_db
    updating: fun_db.sql (deflated 82%)
    ./dp: line 35: ncftpput: command not found

    All of my settings have been checked 3 times and are correct
