Configuring a HostGator VPS for automated cPanel backups to Amazon S3

September 4

Warning! Geekiness ahead

Ok, so I just had a wonderful time trying to set up a way to automatically back up my HostGator VPS to Amazon S3.

I did it with the help of some googling, head scratching and tricky bastardness, so I thought I’d better document it here to remind myself how to do it when I need to do it again.

I got most of the information from the post here and its comments, although I had to piss about a lot to finally get it working, mainly because of curly quotes in the original page, windoze line breaks and my monkey heritage.

(this is for a VPS on HostGator running CentOS 5)

1. Create an Amazon S3 bucket

Easy bit here, just create a bucket
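
If you prefer the shell, you can also create the bucket with s3cmd once it’s installed and configured (steps 2 and 3 below); the bucket name here is just a placeholder:

s3cmd mb s3://yourbucketname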

2. Install S3 client for Linux

First you need to install the s3tools repo

cd /etc/yum.repos.d
wget http://s3tools.org/repo/CentOS_5/s3tools.repo

Next, install s3cmd itself

yum install s3cmd

Answer the questions with Y
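
To check the install worked, ask s3cmd for its version number:

s3cmd --version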

3. Configure s3cmd

s3cmd --configure

Enter the access key and secret key from your Amazon Security Credentials page
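
To make sure the keys work, list your buckets; the bucket you created in step 1 should show up:

s3cmd ls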

4. Enable daily backups from WHM

You can select which accounts by clicking the button marked ‘select’

If it’s already configured, find out the backup directory by typing

grep BACKUPDIR /etc/cpbackup.conf
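
On a stock setup that should print something like the line below (the path may differ on yours):

BACKUPDIR /backup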

5. Create the log directories

mkdir /var/log/backuplogs

6. Write a script to automate the backup and save it as /root/dailybackup.sh

You should change the email and bucket name to reflect your own values

#!/bin/bash

##Notification email address
_EMAIL=youremail@yourdomain.com

ERRORLOG=/var/log/backuplogs/backup.err`date +%F`
ACTIVITYLOG=/var/log/backuplogs/activity.log`date +%F`

##Directory which needs to be backed up
SOURCE=/backup/cpbackup/daily/*.gz

##Name of the backup in bucket
DESTINATION=`date +%F`

##Backup degree
DEGREE=3

#Clear the logs if the script is executed second time
:> ${ERRORLOG}
:> ${ACTIVITYLOG}

##Uploading the daily backup to Amazon s3
/usr/bin/s3cmd -r put ${SOURCE} s3://yourbucketname/${DESTINATION}/ 1>>${ACTIVITYLOG} 2>>${ERRORLOG}
ret2=$?

##Send email alert
msg="BACKUP NOTIFICATION ALERT FROM `hostname`"

if [ $ret2 -eq 0 ]; then
    msg1="Amazon s3 Backup Uploaded Successfully"
else
    msg1="Amazon s3 Backup Failed!!\n Check ${ERRORLOG} for more details"
fi
echo -e "$msg1" | mail -s "$msg" ${_EMAIL}

#######################
##Deleting backups older than DEGREE days
## Delete from both server and amazon
#######################
DELETENAME=$(date --date="${DEGREE} days ago" +%F)

/usr/bin/s3cmd -r --force del s3://yourbucketname/${DELETENAME} 1>>${ACTIVITYLOG} 2>>${ERRORLOG}
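
A few people have asked in the comments below how to also remove the local copies once they are safely on S3. I haven’t tested this myself, but a minimal sketch would be to add something like this to the end of the script, reusing the $ret2 status from the upload:

##Optional (untested sketch): remove the local backups once the upload has succeeded
if [ $ret2 -eq 0 ]; then
    rm -f ${SOURCE}
fi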

7. Grant execute privilege to the script

chmod u+x /root/dailybackup.sh

8. Set up a cPanel hook to run the script after the backup has completed

nano /scripts/postcpbackup

enter this as the contents

#!/usr/bin/perl
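# run the S3 upload script once cPanel has finished its backup run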
system("/root/dailybackup.sh");

make it executable

chmod u+x /scripts/postcpbackup
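
You can test the whole chain by running the hook by hand and waiting for the notification email:

/scripts/postcpbackup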

That’s it!

In case of disaster, copy the file from Amazon S3 with

mkdir restore
s3cmd -r get s3://yourbucketname/2011-02-23/filename.gz restore
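
If you’re not sure which dated folder you need, list the bucket contents first:

s3cmd ls s3://yourbucketname/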

Notes
I changed the bash script so it only copies *.gz files
I wanted to change the time at which the cpbackup occurs, so I went to ‘manage plugins’ in WHM, ticked ‘install and keep updated’ on ‘cronconfig’, then went to ‘configure cpanel cron times’ and set the time I wanted cpbackup to run.
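
If you’d rather not use the postcpbackup hook at all, an alternative I haven’t tried is a root cron job (crontab -e) that runs the upload script a while after the nightly cpbackup; the time here is just an example:

30 4 * * * /root/dailybackup.sh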



36 Comments on “Configuring a HostGator VPS for automated cPanel backups to Amazon S3”

  1. Shivam Garg
    08.09.11 8:46 am

    Hi Andy,
This is a really good and informative article. Actually I don’t really understand the usefulness of a HostGator VPS. Well, thanks for sharing.

  2. Indifest
    14.09.11 7:25 am

Kool. Been looking for an option like that. It would save a damn lot of time logging on to cPanel, and my main problem has been remembering to take a backup. Automated does the trick. Thanks brother.

  3. Andrew Bailey
    14.09.11 4:38 pm

I know what you mean, I always forgot, but now I can relax because it keeps 7 days’ worth of backups for every account that I have in WHM. Very nice!

  4. Benjamin Kerensa
    18.09.11 8:20 pm

    Why would one use Hostgator over say Slicehost or Linode who are the leaders in VPS performance and pricing?

  5. Genie
    08.10.11 10:10 am

I’ve long been in search of that and I finally have it answered. It saves a lot of time to get a backup. I completely understand the whole point of it.

  6. Nick Sotos
    27.10.11 11:18 am

    Thanks for this useful and informative post Andy.
    You saved me hours in front of cpanel and live chat of hostgator.
    Keep on the good work!

  7. Cheolsu
    05.11.11 7:43 pm

    I recently started using a VPS hosting for one of my blogs. I should try setting up a similar automated backups.
    Btw, I am glad that I stumbled upon your personal blog. I am a fan of your commentluv plugin. Good luck with your commentluv premium launch.

  8. Jun
    21.11.11 3:27 pm

Thanks for sharing this information. In the past, I created backups once a week; with this I won’t have problems backing up, and it’s automatic. Also, it’s step by step, making it useful for someone like me with just a basic understanding of this kind of stuff.

  9. Clifford P
    24.11.11 7:26 am

    Thanks for the great write-up. I don’t have HostGator, but it still worked. Only thing was that I needed to do this for the /root/dailybackup.sh file: http://www.gizmola.com/blog/archives/87-Linux-shell-scripting-bad-interpreter-No-such-file-or-directory.html

    :-)

  10. Clifford
    04.12.11 1:38 pm

    Thanks for this. I’m not on HostGator, but all’s been working fine. :)

    How can I delete the backup from the server after 1 day but from S3 after 30 days (or possibly even delete from server immediately after upload completed successfully – would that be 0 days)? Instead of deleting from both sources at the same interval.
The comments in /root/dailybackup.sh say that the DEGREE variable is used for both server and S3, but the command that you use (/usr/bin/s3cmd -r --force del s3://t…..) looks like it would only delete from S3.

    Thanks!

  11. Andy
    06.12.11 12:04 pm

    The script will delete the local backups right after uploading them to S3 (at least it does on mine)

  12. Cliff
    06.12.11 7:43 pm

    The code includes this comment:
    #######################
##Deleting backups older than DEGREE days
    ## Delete from both server and amazon
    #######################

    Yet the code’s only delete/rm command is on S3, not on the local server (as the comment mentions).

Looking in the /backup/cpbackup/ daily, monthly, etc. directories, the backup files remain as .tar.gz files.
    And that makes sense since there’s no ‘rm’ command to the local server, only to S3.

    Are you sure you’re using the same script that you’ve posted here?

    tyvm

  13. Andy
    06.12.11 8:22 pm

    Hi Cliff,
the backups on the server are overwritten by the next day’s, so there is only one set of backups in the folder at any one time.

  14. Cliff
    06.12.11 8:37 pm

    Correct. Same as mine. Thanks.

    Would you mind suggesting a command to explicitly remove the server’s backup after uploading to S3?

    Also, is the /daily .tar.gz file the same one as in the /monthly folder?

    I don’t want a /daily or /monthly or /yearly folder – I just want backups on S3… maybe just 1 backup on local server would be ok.

  15. Andy
    06.12.11 8:49 pm

I just set the WHM backup settings to only do daily backups, so there are only ever files in the daily folder

  16. Cliff
    06.12.11 10:22 pm

That’s how mine is set up too. Are you sure you don’t have anything in /monthly or /yearly ?
    Screen shot: http://ScrnSht.com/kbalde

  17. Andy
    07.12.11 9:20 pm

    yep I’m sure, I have just got the daily folder in there

  18. Clifford P
    08.12.11 6:11 am

How can I get this to happen at the end of the script?

    ##Remove local backup folder after S3 PUT
    rm -rf /backup/cpbackup/monthly
    rm -rf /backup/cpbackup/weekly

    I appreciate the assistance.

  19. Andrew Bailey
    08.12.11 9:18 am

sorry Cliff, I am not familiar enough with bash scripting to tell you how to do that. see the original article that is referenced in this post

  20. Salvado
    11.12.11 11:39 pm

Hey Andy, thanks for this detailed post. It really helped me because I’m a starter blogger. I just started a blog like 2 weeks ago and I need this kind of advice. Keep up the good work ;)

  21. Joshua Miller
    20.12.11 11:03 am

Maybe this is a silly question, but running this backup daily in no way affects your bandwidth quota, does it?

  22. Kevin
    22.12.11 11:19 pm

Since this is for cPanel/WHM, is there any reason these directions would not work on any cPanel server?

  23. Andy
    28.12.11 9:28 pm

    to be honest, I do not know! I always get hosting with unlimited bandwidth :-)

  24. Jamie
    12.01.12 7:11 pm

Great code; having a cloud to store backups makes keeping websites secure easier than ever.

  25. Joe
    24.01.12 3:04 pm

    Thanks, you just solidified my decision to NOT go this route. Going to use backup buddy for my WordPress backups to S3 instead.
    Joe

  26. Paul
    05.04.12 3:25 pm

Thanks for the post, it really looks like a great idea. I got the following error though during setup:
    yum install s3cmd
    Loaded plugins: fastestmirror
    Loading mirror speeds from cached hostfile
    * base: mirror.raystedman.net
    * extras: mirror.raystedman.net
    * update: mirror.raystedman.net
    Segmentation fault (core dumped)

What is the matter? I’ve tried looking into this but can’t see anything specific…

  27. Andy
    06.04.12 8:51 pm

    looks like a server software issue if it’s doing a core dump.

    time to ask your hosting provider!

  28. Mark
    09.04.12 10:57 am

    Hi! Andy,
Thank you for your step-by-step tutorial. :)

  29. matt
    15.04.12 10:31 pm

Thanks for this page and guide. Honestly, I use it at least once a month. :)

  30. Paul
    16.04.12 3:42 pm

    Thanks for the heads up, got it sorted.

There was an issue with libxml which needed to be recompiled. To solve it I had to recompile libxml and update yum.

let’s hope it works :)

  31. David
    01.06.12 1:35 pm

    Thanks Andy – brilliant! I’m on Hostdime and it worked for me

  32. Andy
    26.07.12 9:46 am

    good to know thanks Clifford!

  33. Andy
    26.07.12 9:48 am

    haha actually I had problems with backbuddy and so I used this because it runs independently of wordpress and doesn’t cause wordpress to bork if another plugin is using too much memory

  34. Jesus Perez
    13.08.13 4:38 pm

I just set this up on my WiredTree VPS using the http://s3tools.org/repo/RHEL_6/s3tools.repo since I’m running CentOS 6. The instructions were smooth as silk. Thanks so much, Andy. Only one minor step is missing: CTRL+X to exit Nano and “Y” to save.

Question: when I copied the backup.sh code, I see some very slanted single quotes. Are these ok? Or should I switch them to the straighter single quotes in my text editor?

  35. Andy
    13.08.13 4:58 pm

glad you got it working for your site Jesus. the slanted quotes are probably a copy-and-paste thing. you should be ok to use the quotes that you are used to, but if it works, keep it as is, or keep a backup copy in case the changed quotes don’t work on your system

  36. Jesus Perez
    13.08.13 5:02 pm

    Thanks Andy!

One last tip. To force the cPanel backups and immediately test the setup, use the following commands in SSH:

    For newer VPS servers using the newer backup system:
/usr/local/cpanel/bin/backup --force

For older VPS servers using the Legacy backup system:
/scripts/cpbackup --force

Just keep in mind that it can 1) take a while depending on your backup size, and 2) bring your server down while it’s working.
