Automate MySQLdump Backup in EC2 to Amazon S3

May 13, 2020 | By Gerald | Filed in: AWS.
MySQLdump Backup to AWS S3

In this tutorial, you will learn how to automate MySQLdump backups from EC2 to an Amazon S3 bucket.

We will archive the backups to S3 Glacier after 30 days using an S3 Lifecycle policy and then schedule them for permanent deletion after 90 days.

What you will do

  1. Create IAM User and S3 Permission
  2. Install AWS CLI and Configure
  3. AWS S3 List Verification
  4. Install S3fs and Mount S3 Bucket
  5. Create an Automation Backup Script for MySQLdump
  6. Testing MySQLdump to S3 Bucket
  7. Verify the files on S3 Bucket Console

Requirements

To follow this guide you will need an EC2 instance with MySQL installed, an existing S3 bucket, and access to the AWS Management Console. The steps below walk you through automating mysqldump backups from EC2 to Amazon S3.

Step 1. Create IAM User and S3 Permission

Sign in to the AWS Management Console and go to the IAM service. Navigate to Users, create a new IAM user, and attach the AmazonS3FullAccess policy, or use a custom policy if you want to restrict access to a specific S3 bucket.
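
If you go the custom-policy route, a minimal sketch that limits access to a single backup bucket could look like the policy below. S3_BUCKET_NAME is a placeholder for your own bucket, and the s3:ListAllMyBuckets statement is only there so the bucket listing in Step 3 works:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ListAllBuckets",
      "Effect": "Allow",
      "Action": "s3:ListAllMyBuckets",
      "Resource": "*"
    },
    {
      "Sid": "ListBackupBucket",
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::S3_BUCKET_NAME"
    },
    {
      "Sid": "ReadWriteBackupObjects",
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
      "Resource": "arn:aws:s3:::S3_BUCKET_NAME/*"
    }
  ]
}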

Step 2. Install AWS CLI and Configure

SSH into your EC2 instance and then run the apt update command to refresh the package lists on the system.

sudo apt update

Now install the AWS CLI using:

sudo apt install awscli -y
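
You can confirm the installation by checking the CLI version:

aws --version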

After installing AWS CLI, run aws configure command:

aws configure

The AWS CLI will prompt you to enter your AWS Access Key ID and Secret Access Key. Then provide the region where your EC2 instance is located and leave the output format at its default.

AWS Access Key ID [None]: AAAABBBBBBCCCCCCDDDDD
AWS Secret Access Key [None]: aabbccffeeffggghhiijjkklmn
Default region name [None]: us-west-2
Default output format [None]:

Note: If you do not yet have credentials, generate new ones in AWS Identity and Access Management (IAM).

See: How to create new IAM users and generate new credentials

Step 3. Verify AWS S3 List

After the AWS CLI installation is complete, verify your S3 bucket list by running:

aws s3 ls

You will see your buckets listed:

2020-05-06 08:02:56 testbucket1
2020-05-13 14:21:09 testbucket2

If the bucket list does not print, check your IAM user and make sure the IAM policy mentioned in Step 1 is attached to it.
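
If you are unsure which identity the CLI is using or which policies are attached, you can check with the commands below (replace YOUR_IAM_USER with your own user name; the second command also requires permission to call IAM read actions):

aws sts get-caller-identity
aws iam list-attached-user-policies --user-name YOUR_IAM_USER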

Step 4. Install S3fs and Mount S3 Bucket

You can move your backups to Amazon S3 without spending much time on configuration. This is done by installing the S3fs agent on your EC2 instance and mounting the S3 bucket on a specific folder.

Connect to your EC2 instance, then update your system.

sudo apt update

Once your system is updated, install the S3fs agent.

sudo apt install s3fs

Create a credentials file for S3fs, replacing the placeholders below with your IAM user's Access Key ID and Secret Access Key:

echo ACCESS_KEY_ID:SECRET_ACCESS_KEY > /home/ubuntu/.s3fs-creds

Your file will be saved at /home/ubuntu/.s3fs-creds. Restrict its permissions so only your user can read it:

chmod 600 /home/ubuntu/.s3fs-creds

Then create a new folder in your home directory.

mkdir /home/ubuntu/s3_uploads

Mount the S3 bucket on the folder you created earlier, making sure to replace the S3 bucket name below:

s3fs S3_BUCKET_NAME /home/ubuntu/s3_uploads -o passwd_file=/home/ubuntu/.s3fs-creds

Next, check whether your S3 bucket is properly mounted by running:

df -h

The output should include an s3fs entry mounted on /home/ubuntu/s3_uploads, confirming the bucket is mounted.
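
Note that a mount made this way does not survive a reboot. To have the bucket remounted automatically, one option (a sketch based on s3fs-fuse's fstab support; adjust the bucket name, mount point, and options to your setup) is to add a line like this to /etc/fstab:

S3_BUCKET_NAME /home/ubuntu/s3_uploads fuse.s3fs _netdev,allow_other,passwd_file=/home/ubuntu/.s3fs-creds 0 0

Depending on which user needs to read the mount, you may also need uid and gid options, and allow_other may require enabling user_allow_other in /etc/fuse.conf.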

For more details, you can also refer to the full S3fs tutorial.

Step 5. Create an Automation Backup Script for MySQLdump

Before creating the automated backup script for MySQLdump, create a configuration file for your MySQL credentials:

sudo vim /home/ubuntu/.config.cnf

Then add the following:

[client]
user = whatever
password = Wh5t3v3r
host = localhost

Save and close the file.
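
Since this file contains a database password, it is a good idea to make it readable only by the user that will run the backup script (assumed here to be the ubuntu user):

sudo chown ubuntu:ubuntu /home/ubuntu/.config.cnf
sudo chmod 600 /home/ubuntu/.config.cnf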

Next, create the automated backup script for MySQLdump:

sudo vim /home/ubuntu/mysqldump.sh

Add the following script:
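
The sketch below assumes a database named whatever (matching the credentials file above) and date-stamped, gzip-compressed dump files written to the mounted /home/ubuntu/s3_uploads folder; adjust the names and paths to your setup.

#!/bin/bash
# Minimal MySQLdump backup sketch.
# Assumed values: database name, credentials file, and mounted S3 folder.
DB_NAME=whatever
BACKUP_DIR=/home/ubuntu/s3_uploads
TIMESTAMP=$(date +%F-%H%M)

# Dump the database using the credentials in .config.cnf and compress the output.
mysqldump --defaults-extra-file=/home/ubuntu/.config.cnf "$DB_NAME" | gzip > "$BACKUP_DIR/$DB_NAME-$TIMESTAMP.sql.gz"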

Save and close the file.

Make the file executable by running:

sudo chmod +x /home/ubuntu/mysqldump.sh

To automate the backup, use the command below to add the mysqldump script to the cron scheduler.

crontab -e

If this is your first time opening the crontab, you will be asked to choose an editor; select one (vim, for example), then add the cron entry at the bottom of the file.
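
For example, to run the backup script every day at midnight (a sketch; adjust the schedule to your needs):

0 0 * * * /home/ubuntu/mysqldump.sh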

Save and close the file.

Step 6. Testing MySQLdump to S3 Bucket

To test the MySQL backup manually, first make sure you are in your home directory.

cd ~

Now run the mysqldump script.

./mysqldump.sh

After the backup completes, open the mounted s3_uploads folder.

cd s3_uploads/

Next, type the command:

ls -lah

You should see the newly created dump file listed, confirming that the backup landed in the mounted S3 folder.

Step 7. Verify the files on S3 Bucket Console

Sign in to the AWS Management Console and open the Amazon S3 console at https://console.aws.amazon.com/s3/home.

Select your S3 bucket. As you can see in the screenshot below, the database backup has automatically been replicated to your S3 bucket.

AWS S3 MySQLdump backups

Now create an S3 Lifecycle policy to transition your backups to Glacier storage after 30 days and permanently delete them after 90 days.

AWS S3 Lifecycle Policy Database Backups
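
If you prefer to set this up from the command line instead of the console, one approach (a sketch assuming the rule applies to the whole bucket and that the 90-day expiration is counted from object creation; adjust the prefix and day counts to your needs) is to save the following as lifecycle.json:

{
  "Rules": [
    {
      "ID": "archive-and-expire-mysqldump-backups",
      "Status": "Enabled",
      "Filter": { "Prefix": "" },
      "Transitions": [
        { "Days": 30, "StorageClass": "GLACIER" }
      ],
      "Expiration": { "Days": 90 }
    }
  ]
}

Then apply it to your bucket:

aws s3api put-bucket-lifecycle-configuration --bucket S3_BUCKET_NAME --lifecycle-configuration file://lifecycle.json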

That’s all.

Notes:

The cron job runs the shell script daily, which in turn executes the backup commands for the tools you installed. If you want to archive or compress the MySQL backup before storing it in the S3 bucket, please contact us and let me handle the hard work for you.

I hope this tutorial helped you. Feel free to leave suggestions in the comment section below.


7 comments on “Automate MySQLdump Backup in EC2 to Amazon S3”

  1. Rochel says:

    Thank you so much! This saved me a lot of headaches!

  2. Jellyfer Echavez says:

    Thank you!! So informative

  3. Samia Rahman says:

    Q1-I am confused about the user ‘whatever’ can you please clarify? is it a user who has permission to create MYSQL backup?

    Q2- I noticed you used ‘dbname-=whatever’. can you please clarify this as well?

  4. SHAFI SHAIK says:

    Thanks for the post, its really useful.
