How to Automate MySQL Database Backups on EC2 to Amazon S3

Backing up your database is crucial for protecting your data against corruption and loss. In this tutorial, we’ll guide you through the process of automatically backing up a MySQL database from an Amazon EC2 instance to an S3 bucket.

Prerequisites

Before we start, make sure you have:

  - An EC2 instance (this guide assumes Ubuntu) with a running MySQL server
  - An S3 bucket created to store the backups
  - Access to the AWS Management Console with permission to create IAM users

Step-by-Step Guide

Step 1. Create an IAM User

Firstly, you need to create an IAM user with the necessary permissions to access the S3 bucket.

  1. Sign in to the AWS Management Console and open the IAM console.
  2. In the navigation pane, click on “Users” and then choose “Add user”.
  3. Enter the username and select “Programmatic access” as the access type.
  4. Click on “Next: Permissions”.
  5. Choose “Attach existing policies directly” and search for AmazonS3FullAccess (or create a custom policy if needed).
  6. Review the details and create the user.

Remember to note down the Access Key ID and Secret Access Key; you will need them later.
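If you'd rather script this step, the same user can be created with the AWS CLI from any machine that already has administrative credentials configured. The user name mysql-backup-user below is just an example:

```shell
# Create the backup user (example name), attach the managed S3 policy,
# and generate an access key pair for programmatic access.
aws iam create-user --user-name mysql-backup-user
aws iam attach-user-policy \
    --user-name mysql-backup-user \
    --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess
aws iam create-access-key --user-name mysql-backup-user
```

The create-access-key call prints the Access Key ID and Secret Access Key, so save its output.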

Step 2. Set S3 Bucket Permissions

Update your S3 bucket policy to allow the new IAM user to interact with the bucket.

  1. Go to your S3 bucket settings in the AWS Console.
  2. Find the “Permissions” tab.
  3. Add a bucket policy that grants the necessary permissions to the IAM user.

Example bucket policy (replace ‘YOUR_IAM_USER_ARN’ and ‘YOUR_BUCKET_NAME’):

{
    "Version": "2012-10-17",
    "Id": "PolicyID",
    "Statement": [
        {
            "Sid": "StmtID",
            "Effect": "Allow",
            "Principal": { "AWS": "YOUR_IAM_USER_ARN" },
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::YOUR_BUCKET_NAME",
                "arn:aws:s3:::YOUR_BUCKET_NAME/*"
            ]
        }
    ]
}
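If you save the policy above as policy.json, it can also be applied from the command line (YOUR_BUCKET_NAME is a placeholder, as before):

```shell
# Attach the bucket policy saved in policy.json to the bucket
aws s3api put-bucket-policy \
    --bucket YOUR_BUCKET_NAME \
    --policy file://policy.json
```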

Step 3. Install and Configure AWS CLI

Next, install the AWS CLI on your EC2 instance.

sudo apt update
sudo apt install awscli -y

Now, configure it using the credentials of the IAM user you created earlier.

aws configure

Input the Access Key ID, Secret Access Key, Default region name, and Default output format when prompted.
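Before moving on, it's worth confirming the credentials actually work. These two read-only checks should print the IAM user's identity and the bucket contents (the bucket name is a placeholder):

```shell
# Verify which identity the CLI is using, then list the target bucket
aws sts get-caller-identity
aws s3 ls s3://YOUR_BUCKET_NAME
```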

Step 4. Install S3fs and Mount S3 Bucket

To mount the S3 bucket, install s3fs.

sudo apt-get install s3fs

Configure s3fs by entering your AWS credentials.

echo YOUR_ACCESS_KEY_ID:YOUR_SECRET_ACCESS_KEY > ~/.passwd-s3fs
chmod 600 ~/.passwd-s3fs

Create a directory to mount your S3 bucket.

mkdir /path/to/my-s3-mount

Mount the S3 bucket to the directory.

s3fs YOUR_BUCKET_NAME /path/to/my-s3-mount -o passwd_file=~/.passwd-s3fs

To ensure the bucket mounts on reboot, add it to /etc/fstab.

YOUR_BUCKET_NAME /path/to/my-s3-mount fuse.s3fs _netdev,allow_other,passwd_file=/home/ubuntu/.passwd-s3fs 0 0

For more details, you can reference my full guide on installing s3fs on Ubuntu: https://linuxbeast.com/blog/mount-s3-bucket-on-ubuntu-with-s3fs-fuse/
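A quick way to verify the mount works end to end is to write a small file through it and read it back; if this succeeds, the file should also appear in the S3 console. The mount path below is the placeholder used throughout this guide:

```shell
# Confirm s3fs is mounted, then round-trip a test file through the bucket
mount | grep s3fs
echo "mount test $(date)" > /path/to/my-s3-mount/mount-test.txt
cat /path/to/my-s3-mount/mount-test.txt
rm /path/to/my-s3-mount/mount-test.txt
```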

Step 5. Set Up Cron Job for MySQL Backup

Finally, you’ll set up a cron job to run the backup script.

  1. Create a backup script backup-mysql-to-s3.sh with the following contents:
#!/bin/bash
# Include the date so daily backups don't overwrite each other
DATE=$(date +%F-%H-%M-%S)
BACKUP_FILE="/path/to/my-s3-mount/my-db-backup-${DATE}.sql"

# Dump the MySQL database
mysqldump -u your_mysql_user -p'your_mysql_password' your_database_name > "$BACKUP_FILE"

# To delete old backups after X days, uncomment the line below
# find /path/to/my-s3-mount/* -mtime +X -exec rm {} \;

Make sure to replace your_mysql_user, your_mysql_password, and your_database_name with your actual MySQL credentials and database name.
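Plain .sql dumps can grow large. One common variation, shown here as a sketch with the same placeholder credentials, pipes the dump through gzip so less data is stored in (and transferred to) S3:

```shell
#!/bin/bash
DATE=$(date +%F-%H-%M-%S)
BACKUP_FILE="/path/to/my-s3-mount/my-db-backup-${DATE}.sql.gz"

# Compress the dump on the fly instead of writing a raw .sql file
mysqldump -u your_mysql_user -p'your_mysql_password' your_database_name \
    | gzip > "$BACKUP_FILE"
```

To restore from a compressed dump, decompress it first with gunzip, or pipe it straight into mysql with zcat.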

  2. Give execute permission to the backup script.
chmod +x backup-mysql-to-s3.sh
  3. Open the crontab configuration.
crontab -e
  4. Add a line to schedule your backup, for instance, every day at 2am:
0 2 * * * /path/to/backup-mysql-to-s3.sh > /path/to/my-s3-mount/backup.log 2>&1

Conclusion

You’ve now automated daily backups of your MySQL database to an S3 bucket. Make sure to regularly verify your backups and simulate recovery procedures to ensure data integrity.
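A recovery drill can be as simple as loading the most recent dump into a scratch database. This sketch uses the same placeholder credentials and mount path as the rest of the guide, and the restore_test database name is just an example:

```shell
# Pick the newest backup file and restore it into a throwaway database
LATEST=$(ls -t /path/to/my-s3-mount/my-db-backup-*.sql | head -n 1)
mysql -u your_mysql_user -p'your_mysql_password' \
    -e "CREATE DATABASE IF NOT EXISTS restore_test"
mysql -u your_mysql_user -p'your_mysql_password' restore_test < "$LATEST"
```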

Note: It’s important to keep your backup strategy secure and compliant with your data protection policies.
