August 26, 2010

It's dead, Jim
Photo by Martin Deutsch

I thought it was time I wrote about my obsession with backups. It’s the sort of madness you only get after a hard drive’s gone bad and left you wondering if you’ll ever recover the digital photographs of your babies.

Some years ago I was running a PC that I had built to run BeOS, but I also had Linux and Windows partitions and a shared data area. At that time my backup strategy consisted of occasionally burning a CD or two containing all my data in zip files (it was that long ago), relying on Windows to burn the CDs. When my hard drive started to fail I was unable to burn my files to CD because Windows would not run. Mercifully, though, I could use the combination of BeOS and Linux to recover the data I needed.

From that point I was determined to have a better backup strategy, and I came to appreciate three important points:

1. Backups have to happen automatically, without relying on me to remember to run them.
2. There should be more than one backup, made in more than one way.
3. At least one copy of the data should be kept offsite.

This last point is only necessary to cover the worst-case scenario in which both the master drive and the backup are lost. The probability is very small, but it is worth thinking about.

Fast forward to 2010 and I'm now running multiple backups, most of which happen automatically without any user intervention. The downside is that there are so many layers to this that it really needs a picture to explain.

Backup Madness

Every night the entire hard drive is backed up using SuperDuper to an external drive configured as RAID 1 (mirrored). No intervention is needed to kick this off; it runs automatically during the night, and occasionally I test it by booting from the backup drive.
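
For the curious, the shape of that nightly job looks roughly like the sketch below. It is only a Python illustration of a mirror-style copy running unattended; the paths are invented, and SuperDuper's bootable clones do considerably more than a plain rsync.

#!/usr/bin/env python3
"""Rough sketch of an unattended nightly mirror (not what SuperDuper does).

The source and destination paths are hypothetical; the real job is a
bootable clone handled entirely by SuperDuper's own scheduler.
"""
import subprocess
import sys

SOURCE = "/"                            # hypothetical: the whole startup drive
DESTINATION = "/Volumes/BackupRAID/"    # hypothetical: the mirrored external drive


def clone():
    # --archive preserves permissions and timestamps; --delete keeps the
    # mirror exact by removing files that no longer exist on the source.
    result = subprocess.run(
        ["rsync", "--archive", "--delete",
         "--exclude", "/Volumes/*",     # don't copy the backup drive into itself
         SOURCE, DESTINATION],
        capture_output=True, text=True,
    )
    if result.returncode != 0:
        sys.exit(f"Backup failed: {result.stderr}")


if __name__ == "__main__":
    clone()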

My user data (including the all-important photos) is backed up offsite to CrashPlan, and occasionally to a bus-powered hard drive using ChronoSync to synchronise the two drives.

Our documents folders are synchronised to an iPod and a USB memory stick each time they are connected. Again, this is achieved with the fantastic ChronoSync and requires no action from the user: ChronoSync detects the connection of the target drive and synchronises on the first connection each day. The advantage of using these kinds of storage devices is that they are usually offsite.
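
That "first connection each day" rule is easy to picture as a small script. The sketch below is only an illustration, not how ChronoSync actually works; the volume name and stamp file are invented. It simply checks whether the drive is mounted and whether a sync has already run today.

#!/usr/bin/env python3
"""Sketch of a "sync on first connection each day" rule.

ChronoSync does this itself; the volume name and stamp file here are
hypothetical, purely to illustrate the idea.
"""
import datetime
import os
import subprocess

TARGET_VOLUME = "/Volumes/USBStick"                  # hypothetical removable drive
SOURCE_FOLDER = os.path.expanduser("~/Documents/")
STAMP_FILE = os.path.join(TARGET_VOLUME, ".last_sync")


def already_synced_today():
    # The stamp file's modification time records the last successful sync.
    if not os.path.exists(STAMP_FILE):
        return False
    stamp = datetime.date.fromtimestamp(os.path.getmtime(STAMP_FILE))
    return stamp == datetime.date.today()


def sync():
    # Mirror the documents folder onto the removable drive.
    subprocess.run(
        ["rsync", "--archive", "--delete",
         SOURCE_FOLDER, os.path.join(TARGET_VOLUME, "Documents/")],
        check=True,
    )
    # Touch the stamp file so the sync only runs once per day.
    with open(STAMP_FILE, "w"):
        pass


if __name__ == "__main__":
    if os.path.ismount(TARGET_VOLUME) and not already_synced_today():
        sync()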

Whenever I make a major addition to my Aperture library I synchronise it with Aperture vaults on whatever drives are connected at the time. This may seem like overkill, but my photos are the one type of data I could not cope with losing.

My email is provided by fastmail.fm, so it is available on their servers, and occasionally I archive mail to MailSteward to keep a local archive (backed up, of course).

As well as being backed up by the methods above, music and audiobooks are synced to an iPod Classic, and some of them to an iPhone.

But the real test of any backup strategy is whether it works…

Recently the logic board on my MacBook Pro died and a repair would have been prohibitively expensive. Once the (even more expensive) replacement was available, I was able to restore pretty much everything from the SuperDuper backup drive. Because the computer failed overnight I don't know whether that night's backup completed, so at worst I might have lost documents created in the preceding 24 hours, though I don't think I did. Restoring email was simply a case of re-enabling my IMAP accounts, although there is a small risk that one day of my wife's POP3 email was lost.

Even after choosing to transfer the data back manually (the logical structure of an OS X drive makes this very easy) and de-cruft the machine, I was up and running on the same data within 24 hours of getting the new computer.

Software

SuperDuper
ChronoSync
CrashPlan
MailSteward
Aperture