I am working on a small backup automation script for AWS (Amazon Web Services). Previously, I wrote a script, shared as a GitHub Gist here, that takes a snapshot of a specific volume for a single user; it is already being used in production.
Now I plan to add the following to the script:
- Add support for multiple AWS accounts in different regions.
- Remove older snapshots, keeping only the latest 3 copies.
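Since the original Gist is not reproduced here, the pruning step can only be sketched. Below is a minimal sketch of the "keep only the latest 3" selection, written as plain text processing so it does not depend on AWS credentials; in the real script the input lines would presumably come from something like `aws ec2 describe-snapshots` (an assumption about the eventual implementation, not the author's actual code):

```shell
#!/bin/sh
# Read "snapshot-id timestamp" pairs on stdin and print the IDs of every
# snapshot except the 3 newest. A real script would feed this from the
# AWS CLI/API and then delete each printed ID (hypothetical pipeline).
prune_candidates() {
    sort -k2,2 -r | awk 'NR > 3 { print $1 }'
}
```

For example, five snapshots dated 2014-01-01 through 2014-01-05 would yield the two oldest snapshot IDs as deletion candidates.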
I will share this script as a separate project on GitHub once it is complete.
Backup. No one can deny that it is a core requirement for IT businesses. We had been using a script to back up data to a specified folder on our Linux server, and from there we used to burn CDs on a weekly or bi-weekly basis. Due to lack of time, we couldn’t improve this process for quite some time.
Recently, when an extremely busy schedule meant I couldn’t take backups on CDs, I decided to improve our backup process. Our requirements were simple: we needed a process that could copy our required folders to a remote server and then keep taking incremental backups there. After some research, I found two tools commonly used for this purpose: rsync and rdiff-backup. I chose rdiff-backup, as it was written purely for backups and it builds on librsync, the library behind rsync’s delta algorithm.
I studied and evaluated it and found it worth trying. After a few days of experimenting on my local machine, I installed it on both our local server and our online server and started backing up with it. The following steps should help anyone set it up on their own machines:
- To use rdiff-backup for remote backups of local servers, both servers need rdiff-backup installed. On Debian systems it is available via apt-get; for Red Hat based systems it can be downloaded from http://rdiff-backup.nongnu.org/
- After installing rdiff-backup on both servers, start a backup with the following command:
rdiff-backup /home/www/ firstname.lastname@example.org::/home/meraj/backup/ (where /home/www/ is the local source folder to back up, and firstname.lastname@example.org::/home/meraj/backup/ is the remote server address and destination folder)
Now my next step is to automate this. I don’t want to do it manually every day, so I will try to find some time to write a shell script and run it through crontab.
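The automation the author has in mind could look something like the wrapper below. This is only a sketch: the paths mirror the command above, the four-week retention window and the script name are assumptions, and the backup command is kept overridable so the script can be exercised without rdiff-backup installed.

```shell
#!/bin/sh
# Nightly rdiff-backup wrapper (a sketch; paths and retention are examples).
SRC="${SRC:-/home/www/}"
DEST="${DEST:-firstname.lastname@example.org::/home/meraj/backup/}"
BACKUP_CMD="${BACKUP_CMD:-rdiff-backup}"   # overridable, e.g. for dry runs

run_backup() {
    # Push the source tree to the remote increment store.
    "$BACKUP_CMD" "$SRC" "$DEST" || return 1
    # Prune increments older than four weeks (rdiff-backup's own
    # --remove-older-than flag; --force allows removing several at once).
    "$BACKUP_CMD" --remove-older-than 4W --force "$DEST"
}
```

Saved as, say, /usr/local/bin/nightly-backup.sh, it could then run from crontab with an entry like `0 2 * * * /usr/local/bin/nightly-backup.sh` (2 a.m. daily).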
While setting up and running rdiff-backup, I noticed the following issues:
- To make sure that rdiff-backup is working correctly, run 'rdiff-backup --version' on both servers before starting a backup and confirm that each returns a version number; that means rdiff-backup is correctly installed and configured.
- Make sure that SELinux (on Red Hat machines) is not running on the destination server, or if it is running, put it in permissive mode.
- In case running SELinux in permissive mode also doesn’t help, try relabeling _librsync.so using 'chcon -t textrel_shlib_t /usr/lib/python2.4/site-packages/rdiff_backup/_librsync.so'
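The SELinux check above can be scripted. The following is a sketch that assumes a Red Hat style host with the SELinux userland (getenforce/setenforce); setenforce needs root, and the change does not survive a reboot:

```shell
#!/bin/sh
# Report the SELinux mode; if it is Enforcing, drop to permissive so the
# rdiff-backup shared library is not blocked. Falls back to "Disabled"
# when getenforce is absent (no SELinux userland installed).
mode=$(getenforce 2>/dev/null || echo "Disabled")
echo "SELinux mode: $mode"
if [ "$mode" = "Enforcing" ]; then
    setenforce 0   # permissive until reboot; edit /etc/selinux/config to persist
fi
```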