I wrote a post last year on how to implement rotational backups. It worked great until I moved my backups to a VPS.
The problem was that in my previous implementation, other users rsynced their backups through their own SSH logins, which let me secure the server account by account: if someone's SSH key was compromised, they still couldn't access any of my data on the backup server.
In the VPS environment, you only get one SSH login per account. I could have used the same process, but it would have meant giving root login access to other people, which I wasn't happy with.
To get around the problem, I changed the process: the rotation and the rsync delta are now handled by cron tasks on the VPS, and data is moved from the source machine to the VPS using the BitTorrent Sync tool, so I don't have to hand out any keys.
The same script also backs up folders mastered on the VPS, such as my music library and website.
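As a sketch, the cron side of this might look like the following crontab entries. The paths, schedule, and script name here are hypothetical, not taken from my actual setup:

```shell
# Hypothetical crontab entries on the VPS (edit with: crontab -e)
# Rotate the music backup nightly at 02:30, keeping 16 snapshots
30 2 * * * /usr/local/bin/rotate_backup.sh /srv/backups/music 16
# Rotate the website backup nightly at 03:00
0 3 * * * /usr/local/bin/rotate_backup.sh /srv/backups/website 16
```

BitTorrent Sync keeps the "live" folder up to date continuously, so the cron jobs only need to snapshot whatever is there when they fire.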
- A “live” folder was created on the VPS to sync with the client through BitTorrent Sync
- The source folder on the client was set to Read Only, providing a one-way push to the VPS
- A new backup folder was created for each folder being synced
- A .latest symlink was created in each backup folder, pointing to the folder being synced
- I wrote a script to rotate the backup on a cron:
  - Hard-link copy (cp -al) the last modified folder (the last backup) to a folder named with the current time
  - Rsync the .latest folder onto the new folder (the delta)
  - Remove the oldest backup if we've exceeded the number of backups to maintain
The script takes two parameters:
- path: the root location of the backups (the parent folder of .latest)
- numRotations: the maximum number of backups to keep before old ones are deleted; defaults to 16
#!/bin/bash

# Check for required parameters and prerequisites
if [ -z "$1" ]; then
    echo "Missing parameter 1: usage rotate_backup path [numRotations]"
    echo " - path: the path to a folder that should be rotated"
    echo " - numRotations: optional parameter detailing how many rotations should be maintained - defaults to 16"
    exit 1
else
    basefolder=`readlink -mn "$1"`
    # Check the .latest folder exists as either a directory or a symlink
    if [[ ! -d "$basefolder/.latest" && ! -L "$basefolder/.latest" ]]; then
        echo "Missing a .latest directory within \"$basefolder\". You can either: -"
        echo " - symlink to the data to be backed up: ln -s /path/to/data \"$basefolder/.latest\""
        echo " - create the directory and populate it with the data: mkdir \"$basefolder/.latest\""
        exit 1
    else
        latestfolder=`readlink -mn "$basefolder/.latest"`
    fi
fi

# Check for optional parameters, setting defaults if unset
if [ -z "$2" ]; then
    rotations=16
else
    rotations=$2
fi

# Get the latest snapshot (newest entry last with -tr)
lastSnapshot=`ls -1tr "$basefolder" | tail -1`

# Create a folder for today's snapshot
newfolder=$basefolder/`date +%Y-%m-%d_%H.%M.%S`
newfolder=`readlink -mn "$newfolder"`
mkdir -p "$newfolder"

# Copy as hard links from the latest snapshot (or the live data) to today's folder
if [ ! "$lastSnapshot" = "" ]; then
    lastSnapshot="$basefolder/$lastSnapshot"
    cp -al "$lastSnapshot/." "$newfolder/" >/dev/null 2>&1
else
    cp -al "$latestfolder/." "$newfolder/" >/dev/null 2>&1
fi

# Make the changes on top of the hard links using rsync
rsync -aLk --delete "$latestfolder/" "$newfolder/" >/dev/null 2>&1

# Find the oldest folders in the directory and remove them (over max rotations)
while [ "$rotations" -lt `ls -l "$basefolder" | grep -c ^d` ]
do
    oldestfolder=$basefolder/`ls -1t "$basefolder" | tail -1`
    oldestfolder=`readlink -mn "$oldestfolder"`
    if ! rm -Rf "$oldestfolder" >/dev/null 2>&1
    then
        echo "Failed to remove old rotation \"$oldestfolder\"" >&2
        exit 1
    fi
done

echo "$latestfolder"
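The pruning loop at the end can be exercised on its own. Below is a minimal sketch of just that step, using made-up snapshot folder names and GNU touch -d to give them distinct modification times:

```shell
#!/bin/bash
# Sketch of the pruning step in isolation (hypothetical snapshot names).
set -e
base=$(mktemp -d)
rotations=3

# Create four dated snapshot folders with distinct mtimes
for day in 01 02 03 04; do
    mkdir "$base/2024-01-$day"
    touch -d "2024-01-$day" "$base/2024-01-$day"
done

# Remove the oldest folders while we hold more than $rotations snapshots
while [ "$rotations" -lt "$(ls -l "$base" | grep -c ^d)" ]; do
    oldest=$base/$(ls -1t "$base" | tail -1)
    rm -rf "$oldest"
done

ls -1 "$base"   # the oldest snapshot (2024-01-01) is gone
```

ls -t sorts newest first, so tail -1 always picks the oldest snapshot; the loop stops as soon as the directory count is back within the limit.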
I've been running the backup for a few weeks now and haven't encountered any problems.