


Configuring Hostgator VPS for automated CPanel backups to Amazon S3

September 4

Warning! Geekiness ahead

Ok, so I've just had a wonderful time trying to set up a way to automatically back up my Hostgator VPS to Amazon S3.

I did it with the help of some googling, head scratching and tricky bastardness so I thought I’d better document it here to remind me how to do it when I need to do it again.

I got most of the information from the post here and from its comments, although I had to piss about a lot before it finally worked, mainly because of curly quotes in the original page, windoze line breaks and my monkey heritage.

(this is for VPS on Hostgator running Centos 5)

1. Create an Amazon S3 bucket

Easy bit here, just create a bucket

2. Install S3 client for Linux

First you need to install the s3tools repo

cd /etc/yum.repos.d
wget http://s3tools.org/repo/CentOS_5/s3tools.repo

Next, install the package

yum install s3cmd

Answer the questions with Y

3. Configure s3cmd

s3cmd --configure

Enter the access key and secret key from your Amazon Security Credentials page

4. Enable daily backups from WHM

You can select which accounts are backed up by clicking the button marked ‘Select’

If it’s already configured, find out the backup directory by typing

grep BACKUPDIR /etc/cpbackup.conf
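If you want to see what that grep returns without poking at a live server, here's a sketch against a fake config file (the real file is /etc/cpbackup.conf, and /backup is a typical but assumed value):

```shell
# Sketch only: fake a cpbackup.conf to show the shape of the grep output.
# On a real server you'd grep /etc/cpbackup.conf instead.
printf 'BACKUPACCTS yes\nBACKUPDIR /backup\nBACKUPINT daily\n' > /tmp/cpbackup.conf.example
grep BACKUPDIR /tmp/cpbackup.conf.example
# prints: BACKUPDIR /backup
```

The directory it reports is the one the script in step 6 reads from, so note it down.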

5. Create the log directories

mkdir /var/log/backuplogs

6. Write a script to automate the backup and save it as /root/dailybackup.sh

You should change the email address and bucket name to reflect your own values

#!/bin/bash

##Notification email address
_EMAIL=youremail@yourdomain.com

ERRORLOG=/var/log/backuplogs/backup.err`date +%F`
ACTIVITYLOG=/var/log/backuplogs/activity.log`date +%F`

##Directory which needs to be backed up
SOURCE=/backup/cpbackup/daily/*.gz

##Name of the backup in bucket
DESTINATION=`date +%F`

##Backup degree (days of backups to keep)
DEGREE=3

#Clear the logs if the script is executed a second time
:> ${ERRORLOG}
:> ${ACTIVITYLOG}

##Uploading the daily backup to Amazon s3
/usr/bin/s3cmd -r put ${SOURCE} s3://yourbucketname/${DESTINATION}/ 1>>${ACTIVITYLOG} 2>>${ERRORLOG}
ret2=$?

##Send email alert
msg="BACKUP NOTIFICATION ALERT FROM `hostname`"

if [ $ret2 -eq 0 ]; then
    msg1="Amazon s3 Backup Uploaded Successfully"
else
    msg1="Amazon s3 Backup Failed!!\n Check ${ERRORLOG} for more details"
fi
echo -e "$msg1"|mail -s "$msg" ${_EMAIL}

#######################
##Delete backups older than DEGREE days
## Delete from both server and amazon
#######################
DELETENAME=$(date --date="${DEGREE} days ago" +%F)

/usr/bin/s3cmd -r --force del s3://yourbucketname/${DELETENAME} 1>>${ACTIVITYLOG} 2>>${ERRORLOG}
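The deletion step leans on GNU date's relative-date syntax (standard on CentOS). If you want to sanity-check which folder name a given DEGREE will target before letting the script loose, this stand-alone snippet reproduces just that calculation:

```shell
# Reproduces the DELETENAME calculation from the script above.
# GNU date accepts relative dates like "3 days ago"; +%F gives YYYY-MM-DD,
# matching the folder names the upload step creates from `date +%F`.
DEGREE=3
DELETENAME=$(date --date="${DEGREE} days ago" +%F)
echo "Would delete s3://yourbucketname/${DELETENAME}"
```

Because both the upload destination and the deletion target come from the same `date +%F` format, the names always line up.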

7. Grant execute privilege to the script

chmod u+x /root/dailybackup.sh

8. Set up a cpanel hook to run the script after the backup has completed

nano /scripts/postcpbackup

enter this as the contents

#!/usr/bin/perl
system("/root/dailybackup.sh");

make it executable

chmod u+x /scripts/postcpbackup

That’s it!

In case of disaster, copy the file from Amazon s3 with

mkdir restore
s3cmd -r get s3://yourbucketname/2011-02-23/filename.gz restore

Notes
I changed the bash script so it only copies *.gz files
I wanted to change the time at which the cpbackup occurs, so I went to ‘manage plugins’ in WHM, put an ‘install and keep updated’ tick on ‘cronconfig’, then went to ‘configure cpanel cron times’ and set the time I wanted cpbackup to run.
