When Kelsey and I were both using OS X, we had automated incremental backups over our wireless network: Time Machine on each Mac, backing up through an AirPort Extreme to a Western Digital MyBook 500 GB HDD. After I moved to Ubuntu, however, I needed to find a different solution for network backups. I wanted the network backups for Kelsey’s MacBook to continue uninterrupted, but I also wanted to back up my own system to the same network drive. To do that, I had to enable SMB (Windows) file sharing on the AirPort Extreme and then mount the drive locally. Even if your configuration doesn’t match mine exactly, the following instructions should be useful to anyone trying to do network backups to an SMB share.
Most of the backup utilities available for Ubuntu can’t use SMB shares natively; you have to mount the share locally – i.e., map it to a local folder. However, this creates problems with most backup utilities, since they rely on rsync, and when rsync copies files to what it thinks is a local folder, it disables its delta-transfer algorithm and falls back to copying whole files, which is slow over a network (particularly a wireless network). And even if you force delta transfer back on, rsync has to read the entire existing target file over the network to work out which blocks have changed, which also significantly slows down incremental backups (again, particularly over wireless).
My solution to this was to use tar for backups. The advantage here is that tar can do incremental backups out of the box without needing to read the existing backup archive. It keeps a snapshot file recording metadata about every file you backed up, so it can determine which files to pull incrementally without even being connected to the backup share.
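As a quick illustration of tar’s listed-incremental mode (the directory and file names here are just examples):

```shell
# Level-0 (full) backup: -g creates the snapshot (.snar) file as a side effect
rm -rf /tmp/tar-demo
mkdir -p /tmp/tar-demo/data /tmp/tar-demo/out
echo "one" > /tmp/tar-demo/data/a.txt
tar -g /tmp/tar-demo/out/data.snar -cf /tmp/tar-demo/out/data-0.tar -C /tmp/tar-demo data

# Later runs consult the snapshot file and archive only new or changed files,
# without ever reading the earlier archives
echo "two" > /tmp/tar-demo/data/b.txt
tar -g /tmp/tar-demo/out/data.snar -cf /tmp/tar-demo/out/data-1.tar -C /tmp/tar-demo data
```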
Although you can have tar gzip or bzip2 your backup files to shrink them, for my needs I determined that the CPU time required to do the compression wasn’t worth the 2-3% average compression across all of my files. Part of that is that many of the files I am backing up are already compressed – e.g., MP3s, PDFs, etc. – or are difficult to compress – e.g., full-resolution 10 MP JPEG photos. If you want super-small files, you could tar them and then send them through 7-Zip, which will make teeny tiny files but will take forever to run. I have plenty of space on my backup drive, so I decided to save myself the cycles and just stick with a raw .tar file.
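If you’re curious what that looks like in practice, here’s a rough sketch using random bytes as a stand-in for already-compressed media files (the paths and sizes are arbitrary):

```shell
# Random data compresses about as well as MP3s or JPEGs do: essentially not
# at all, so gzip buys almost nothing here
rm -rf /tmp/comp-demo && mkdir -p /tmp/comp-demo
head -c 1048576 /dev/urandom > /tmp/comp-demo/media.bin

tar -cf  /tmp/comp-demo/media.tar    -C /tmp/comp-demo media.bin
tar -czf /tmp/comp-demo/media.tar.gz -C /tmp/comp-demo media.bin

# The gzipped archive is barely smaller than the raw one, and you paid
# CPU time for the privilege
ls -l /tmp/comp-demo/media.tar /tmp/comp-demo/media.tar.gz
```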
Incremental backups using tar can be automated via cron, but I choose to run the backups manually so that I don’t inadvertently close my laptop lid, stop the process mid-run, and leave a half-finished backup for the shell script to cope with. The content of my shell script for backups is below. It doesn’t include everything on my system (although it could); I’m only concerned with backing up my data directories.
#!/bin/bash

# The timestamp format and log path here are assumptions; define them to taste
timestamp=$(date +%F)
logfile=/home/kevin/backup.log

# Creates incremental backups for specified directories using tar, then copies them to network storage
backupDir () {
    tar -g /mnt/backup/$1.snar -cvf /mnt/backup/$1-$2.tar /home/kevin/$1 >> $3
}

# Mounts network storage drive in preparation for copying backed up files
umount /mnt/backup >> $logfile 2>&1  # clear any stale mount first
if mount -t cifs //server/backup/whitemac /mnt/backup -o username=guest,password=
then
    backupDir Documents $timestamp $logfile
    backupDir iso $timestamp $logfile
    backupDir Music $timestamp $logfile
    backupDir Pictures $timestamp $logfile
    backupDir Sites $timestamp $logfile
    backupDir Videos $timestamp $logfile
    umount /mnt/backup >> $logfile
fi
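For completeness, restoring from these archives is just a matter of extracting them oldest-first; per the GNU tar manual, passing /dev/null as the snapshot file tells tar to replay each archive’s incremental metadata, including deletions. A self-contained sketch with example paths:

```shell
# Build a tiny full + incremental archive pair to restore from
rm -rf /tmp/restore-demo && mkdir -p /tmp/restore-demo/data
cd /tmp/restore-demo
echo "one" > data/a.txt
tar -g full.snar -cf data-full.tar data
rm data/a.txt && echo "two" > data/b.txt
tar -g full.snar -cf data-incr.tar data

# Restore: extract oldest-first with /dev/null as the snapshot file so tar
# replays each archive's incremental metadata, including file deletions
rm -rf data
tar -g /dev/null -xf data-full.tar
tar -g /dev/null -xf data-incr.tar
```

After the second extraction, data/ contains only b.txt: the deletion of a.txt recorded in the incremental archive is replayed too.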