How to backup a MySQL database to Amazon Simple Storage Service (S3)

One of my projects is running on a Digital Ocean VPS, deployed there using Laravel Forge (an amazing tool which automates a lot of painful server-related tasks). Although I have enabled the backup service offered by Digital Ocean, I needed an additional off-site backup solution for the project’s database, so I decided to store the backups in Amazon S3.

This small tutorial covers the task of creating a shell script that will automatically back up a MySQL database to a set of AWS S3 buckets.

The backup plan

For my project’s needs, I needed three sets of complete database backups (i.e. without excluding any tables), running at different frequencies: daily, weekly, and monthly.

At any given point in time, I need access to the latest 7 daily, 4 weekly, and 12 monthly backups.

The idea is to create a shell script that runs at regular intervals (i.e. daily, weekly, and monthly) via cron jobs. Each run creates a database backup, saves it in a local folder, and then synchronises that folder with the relevant S3 bucket.
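For example, the cron entries could look like the following sketch (assuming the script described below is saved as /usr/database_backups/db_backup.sh; adjust the path and schedules to your needs):

# edit the crontab of the user who owns the backup folders, e.g. crontab -e as forge
# every day at 02:00
0 2 * * * /usr/database_backups/db_backup.sh daily
# every Sunday at 03:00
0 3 * * 0 /usr/database_backups/db_backup.sh weekly
# on the 1st of each month at 04:00
0 4 1 * * /usr/database_backups/db_backup.sh monthly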

Amazon S3 setup

Obviously, you need to have an AWS account. You also need to create an IAM user (remember to store its credentials) that will have access to the S3 service.
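If you already have the AWS CLI configured under an administrative account on another machine (otherwise the web console works just as well), creating such a user could look like this sketch. The user name backup-user is just an example, and the AmazonS3FullAccess managed policy is broader than strictly necessary; a custom policy scoped to the backup buckets would be tighter:

# create the user, grant it S3 access, and generate its access keys
aws iam create-user --user-name backup-user
aws iam attach-user-policy --user-name backup-user --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess
aws iam create-access-key --user-name backup-user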

Then, you need to create one or more S3 buckets to store the backups. I decided to store each type of backup in a separate S3 bucket, so I created the following three buckets:

  1. projectname-daily-db-backup
  2. projectname-weekly-db-backup
  3. projectname-monthly-db-backup
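If you prefer the command line over the console, the buckets can also be created with aws s3 mb (this assumes the AWS CLI is already configured, as described in the next section):

aws s3 mb s3://projectname-daily-db-backup
aws s3 mb s3://projectname-weekly-db-backup
aws s3 mb s3://projectname-monthly-db-backup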

Server Setup

Moving to your own server now (mine runs Ubuntu 14.04), you need a way for the server to communicate with S3. Install the AWS Command Line Interface and then run aws configure to set up the credentials of the IAM user you just created. You can verify that you have access to your S3 account by running aws s3 ls, which should list your S3 buckets.
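For example (a sketch; on Ubuntu 14.04, installing via pip is one easy option):

# install the AWS CLI
sudo pip install awscli

# enter the IAM user's access key, secret key, and your default region
aws configure

# verify access: this should list your S3 buckets
aws s3 ls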

After you are done with that, you need to create the folders where the backups will be saved. For my needs, I created the following set of folders:

  1. /usr/database_backups/daily
  2. /usr/database_backups/weekly
  3. /usr/database_backups/monthly
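All three can be created in one go:

sudo mkdir -p /usr/database_backups/daily /usr/database_backups/weekly /usr/database_backups/monthly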

Remember to change the owner of the folders to the user/group that will run the script. For my setup, the user and group are both called forge, so I ran the following command:

sudo chown -R forge:forge /usr/database_backups

Shell script

Create a file, give it a name of your preference (e.g. db_backup.sh), and paste in the following commands. Don’t forget to add your database name and credentials to the appropriate variables, and to modify the paths based on your server and AWS setup. Also, remember to make the script executable: chmod +x db_backup.sh

#!/bin/sh

DB_NAME=""
DB_USERNAME=""
DB_PASSWORD=""
# timestamp embedded in the backup filename (day, month, year, hour, minute)
BACKUP_DATE=$(date +%d%m%y%H%M)

# local retention period (in days) for each backup type
case "$1" in
"daily")
    DAYS_TO_DELETE=7;;
"weekly")
    DAYS_TO_DELETE=31;;
"monthly")
    DAYS_TO_DELETE=365;;
*)
    echo "Usage: $0 daily|weekly|monthly"
    exit 1;;
esac

# export database
mysqldump -u "${DB_USERNAME}" -p"${DB_PASSWORD}" "${DB_NAME}" | gzip > "/usr/database_backups/${1}/db_backup_${DB_NAME}_${BACKUP_DATE}.sql.gz"

# remove local backups older than $DAYS_TO_DELETE days
find "/usr/database_backups/${1}" -name "db_backup_*" -type f -mtime +"${DAYS_TO_DELETE}" -exec rm {} \;

# synchronise relevant AWS S3 bucket
/usr/local/bin/aws s3 sync "/usr/database_backups/${1}" "s3://projectname-${1}-db-backup" --delete

The script accepts a single argument, which must be one of the following: daily, weekly, or monthly. Example usage: ./db_backup.sh daily
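You can test it manually and then confirm that the backup reached the relevant bucket:

./db_backup.sh daily
aws s3 ls s3://projectname-daily-db-backup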
