An easy-to-use MySQL database remote backup script

From DP


Overview

This is a simple command line script for a Linux server that:

  1. Dumps a list of databases to .SQL files
  2. Zips the .SQL files
  3. FTPs the .ZIP backups to a remote server
  4. Uploads the .ZIP backups to Amazon S3
  5. Deletes the .SQL backup files (the .ZIP files are left to accumulate)

There is no garbage collection or backup rotation: .ZIP backups will accumulate on the local server and the remote FTP server until you clear them out manually. This might be updated in the future.
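Until rotation is built in, a small helper run from cron can prune old local ZIPs. This is only a sketch; the directory, filename pattern, and 30-day window are assumptions to adjust for your setup:

```shell
#!/bin/bash
# Prune local .zip backups older than a given number of days.
# Sketch only -- the path and retention window below are assumptions.
prune_backups() {
  local dir="$1" days="$2"
  # only the backup zips in this one directory, nothing recursive
  find "$dir" -maxdepth 1 -name 'back_*.zip' -mtime +"$days" -delete
}

# usage (example path): prune_backups /home/dp 30
```

The same idea applies to the remote FTP copies, but deletions there would have to be scripted through your FTP client.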

The script goodness

#!/bin/bash
 
## list of databases to back up
declare -a arr=(forum mediawiki track wordpress zencart)
 
## MySQL backup user login and password
## should have SELECT and LOCK TABLES privileges on the databases to back up
user='backup_guy'
pass=''
 
## Remote FTP backup location details
ftpurl='ftp.remoteserver.com'
ftpfolder='/' #if user's home directory, be sure to keep the single /
ftpuser='ftp_login'
ftppass=''
 
## Home directory of the script, used to store ZIPs and delete the daily SQL dumps
shdir='/home/dp/' #replace with your home directory
 
## Amazon S3 bucket to save to (or delete the S3 upload line below)
s3bucket='dbbackup'  #replace with your S3 bucket name
 
## Date and hostname strings
day=$(date | awk '{ print $2"-"$3"-"$6 }')
hostname=$(hostname -s)
 
## now loop through the above array
for i in "${arr[@]}"
do
   echo "$i" # current DB
   mysqldump -u "$user" --password="$pass" "$i" > "$i.sql" #dump db to file
   zip "back_$i-$day-$hostname.zip" "$i.sql" #zip db to dated file
   ncftpput -u "$ftpuser" -p "$ftppass" "$ftpurl" "$ftpfolder" "back_$i-$day-$hostname.zip"
   s3cmd put "back_$i-$day-$hostname.zip" "s3://$s3bucket/back_$i-$day-$hostname.zip" #delete this line if S3 is unused
done
 
rm -f "$shdir"*.sql  #clear out the SQL files; set shdir above to your directory

Basic use

You'll need some basic login and directory info to configure the script.

MySQL database names

## list of databases to back up
declare -a arr=(forum mediawiki track wordpress zencart)

These are the databases you'll back up, listed with a space between each name.
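The list is an ordinary bash array, so quoting the expansion keeps each name intact when looping. A quick sanity check, reusing the script's own example names:

```shell
#!/bin/bash
# Same array declaration as the backup script (names are its examples).
declare -a arr=(forum mediawiki track wordpress zencart)

# Loop the same way the backup script does and print each database name.
for db in "${arr[@]}"; do
  echo "will back up: $db"
done
```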

MYSQL backup user login

## MySQL backup user login and password
## should have SELECT and LOCK TABLES privileges on the databases to back up
user='backup_guy'
pass=''

Enter the user name and password of the MySQL backup user. This user should have SELECT and LOCK TABLES privileges on all the databases to back up. For security, we use a dedicated user with only these permissions.
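Creating such a user takes one GRANT from a MySQL admin account. A sketch that prints the statements (the user name, host, and password are placeholders; pipe the output into mysql -u root -p to apply them):

```shell
#!/bin/bash
# Emit the SQL that creates a minimal-privilege backup user.
# The user name, host, and password below are placeholders.
backup_user_sql() {
  cat <<'SQL'
CREATE USER 'backup_guy'@'localhost' IDENTIFIED BY 'choose_a_password';
GRANT SELECT, LOCK TABLES ON *.* TO 'backup_guy'@'localhost';
FLUSH PRIVILEGES;
SQL
}

# usage: backup_user_sql | mysql -u root -p
```

Scoping the grant to the individual databases (e.g. `ON forum.*`) instead of `*.*` is tighter still if your list rarely changes.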

Remote FTP backup location info

## Remote FTP backup location details
ftpurl='ftp.remoteserver.com'
ftpfolder='/' #if user's home directory be sure to keep the single /
ftpuser='ftp_login'
ftppass=''

The address and login details for the remote server that will receive the backups via FTP.

Home directory

## Home directory of the script, used to store ZIPs and delete the daily SQL dumps
shdir='/home/dp/' #replace with your home directory

Set the home directory of the script. This location is used to store ZIPs and delete the daily SQL dumps.

This should be the SAME directory the .sh script is run from! It could probably be derived from an environment variable or the script's own path, but hard-coding it works fine.
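If you'd rather not hard-code the directory, the script can work it out from its own path at runtime. A sketch (assumes the script is invoked by path rather than through a symlink):

```shell
#!/bin/bash
# Derive the script's directory at runtime instead of hard-coding shdir.
# dirname "$0" gives the path the script was invoked with; cd + pwd
# turns it into an absolute path, and the trailing / matches the
# original shdir convention.
shdir="$(cd "$(dirname "$0")" && pwd)/"
echo "script directory: $shdir"
```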

Setup for Amazon S3 storage

You can use Amazon S3 storage to keep an additional copy of your database dumps. To remove this feature, delete the s3cmd line from the processing loop:

s3cmd put back_$i-$day-$hostname.zip s3://$s3bucket/back_$i-$day-$hostname.zip

Install s3cmd

You need a command-line app to get the database dumps into S3. For our Debian server I installed s3cmd with the usual apt-get install s3cmd. (The script also needs the zip and ncftp packages for the zipping and FTP upload steps.)

Configure s3cmd

Next, set up s3cmd with s3cmd --configure to create a profile. You'll need the Access and Secret keys from your Amazon account (under security credentials); we made a new set just for backups. I didn't bother setting up the GPG encryption or HTTPS options for now. See the s3cmd documentation for details.

Test s3cmd

Run s3cmd ls to list all buckets and make sure the connection is working.

Set the bucket name

## Amazon S3 bucket to save to (or delete the S3 upload line below)
s3bucket='dbbackup'  #replace with your S3 bucket name

If you haven't already, create a bucket for your backups using the S3 web interface or the s3cmd utility (s3cmd mb s3://dbbackup) as described in the documentation.

Set the s3bucket variable at the top of the script with your bucket name.

Test the script

  1. Get the script onto your server, then set your database list, MySQL backup user, remote FTP login, and local shell directory (used when deleting the daily .SQL dumps)
  2. Set execute permissions:
    1. chmod u+rx db.sh (for user only) -or-
    2. chmod +rx db.sh (all users, less safe)
  3. Test from the command line
    1. ./db.sh
  4. Add it to your crontab and schedule it to run daily
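A crontab entry for step 4 might look like this (the time of day, paths, and log file are assumptions; run crontab -e to add it):

```
# m  h  dom mon dow  command
 30  3   *   *   *   cd /home/dp && ./db.sh >> /home/dp/db_backup.log 2>&1
```

The cd matters because the script writes its dumps and ZIPs into the current directory, which should be the same as shdir.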