
Nilesh D Kapadia



Incremental backups on Linux with rsync

UPDATE: I have been informed that there is a more efficient way of doing this. Instead of using cp to create hard-linked copies, rsync's --link-dest parameter can be used.

So instead of this:

cp -al $LASTDIR $DST
rsync -av --delete $SRC $DST

You can do this:

rsync -av --delete --link-dest=$LASTDIR $SRC $DST

I have not had a chance to modify the examples and scripts below, and have not tested this myself yet, but it appears correct.
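
For reference, the first script below would look something like this with --link-dest (untested, so treat it as a sketch; the paths match the script as written):

#!/bin/bash
# Rotate the old snapshots, dropping the oldest
/bin/rm -rf /backup/home-backup.5
/bin/mv /backup/home-backup.4 /backup/home-backup.5
/bin/mv /backup/home-backup.3 /backup/home-backup.4
/bin/mv /backup/home-backup.2 /backup/home-backup.3
/bin/mv /backup/home-backup.1 /backup/home-backup.2
/bin/mv /backup/home-backup.0 /backup/home-backup.1
# The current backup becomes the newest snapshot; rsync rebuilds it below
/bin/mv /backup/home-backup /backup/home-backup.0

# Unchanged files are hard-linked against the previous snapshot instead of copied
/usr/bin/rsync -aHx --numeric-ids --delete \
    --exclude-from=/backup/home-backup.excludes --delete-excluded \
    --link-dest=/backup/home-backup.0 \
    /home/ /backup/home-backup/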

END UPDATE

Using rsync in combination with hard links, it's very simple to set up incremental backups on Linux (and UNIX). You end up with a full snapshot for each increment, accessible as if it were the original copy that was backed up. Here are the two links that helped me:

The first link shows how to set up a separate backup server that connects to your other machines and backs up files from them, but I wrote a backup script that copies to a local drive. The following is my first attempt at a script to back up home directories. A better script with fewer hard-coded values is shown later, but this one is a clearer example of how it works:

#!/bin/bash
# Rotate the existing snapshots: drop the oldest and shift the rest up by one
/bin/rm -rf /backup/home-backup.5
/bin/mv /backup/home-backup.4 /backup/home-backup.5
/bin/mv /backup/home-backup.3 /backup/home-backup.4
/bin/mv /backup/home-backup.2 /backup/home-backup.3
/bin/mv /backup/home-backup.1 /backup/home-backup.2
/bin/mv /backup/home-backup.0 /backup/home-backup.1
# Snapshot the current backup as hard links (no extra space for unchanged files)
/bin/cp -al /backup/home-backup /backup/home-backup.0

# Sync /home into the current backup, updating only files that changed
/usr/bin/rsync -aHx --numeric-ids --delete \
    --exclude-from=/backup/home-backup.excludes --delete-excluded \
    /home/ /backup/home-backup/

Note that “cp -al” copies the files using hard links. Hard links differ from soft links: the file isn't deleted until all hard links to it are gone, and each hard link acts as if it is the file itself (whereas a soft link is just a pointer to the real file). So each hard-linked copy of my home directory is a complete snapshot of it, yet no space is wasted on duplicate files. A cron job running as root executes this script every morning, so I end up with a complete snapshot for each day of the last 7 days. Restoring from backup is as easy as copying the files back to their original location, and any of the increments can be copied as is.
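
Since the snapshots are hard links, a file that hasn't changed shows the same inode number in every snapshot (the first column of “ls -i”), and restoring really is just a plain copy. The file and user names here are only placeholders:

# Unchanged files share an inode across snapshots
ls -li /backup/home-backup/someuser/somefile /backup/home-backup.0/someuser/somefile

# Restore a file from an older increment by copying it back
cp -a /backup/home-backup.3/someuser/somefile /home/someuser/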

“home-backup.excludes” is a list of directories that are excluded from the backup (see the rsync man page for the format of this file). Here is an example of some excludes (“someuser” being a user’s home directory):

- someuser/.Trash
- someuser/.firefox/default/abcdefgh.slt/Cache
- someuser/.firefox/default/abcdefgh.slt/Cache.Trash

Also note that only root should have any permissions on the /backup directory, so that if a user account gets compromised, the attacker can’t delete that user’s backups (permissions on the backed-up files remain identical). If an attacker gains root access, all bets are off; that’s why a separate backup server is a good idea.
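
Locking down the backup directory comes down to ownership and mode bits, something along these lines:

chown root:root /backup
chmod 700 /backup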

I have made a more modular version of the script. It could still use some improvements, but here is the current version:

#!/bin/bash

##########################################################
# do_backup function
#
# parameters:
# $1 = backup target (directory that is getting backed up)
# $2 = location of where backup will be stored
# $3 = name of directory to store backup in (also used for excludes)
#
##########################################################
do_backup() {
    local backuptarget=$1
    local backupdir=$2
    local backupname=$3

    # Drop the oldest snapshot, then shift the remaining ones up by one
    /bin/rm -rf ${backupdir}/${backupname}.5
    for ((n=4, m=5; n >= 0; n--, m--))
    do
        /bin/mv ${backupdir}/${backupname}.${n} ${backupdir}/${backupname}.${m}
    done

    # Snapshot the current backup as hard links
    /bin/cp -al ${backupdir}/${backupname} ${backupdir}/${backupname}.0

    # Sync the target into the current backup
    /usr/bin/rsync -aHx --numeric-ids --delete \
        --exclude-from=${backupdir}/${backupname}.excludes --delete-excluded \
        ${backuptarget}/ ${backupdir}/${backupname}/
}
# END do_backup function

# Set variables to point to backup destination folder
backuproot=/backup
backupdest=${backuproot}/backup

# Make calls to do_backup function to backup /home and /etc
do_backup "/home" "${backupdest}" "home-backup"
do_backup "/etc" "${backupdest}" "etc"

© 2017 Nilesh D Kapadia