I recently put a lot of thought into how I perform my computer backups. I’m one of those people who, while I would only be mildly pissed off by the failure of a hard drive, would be quite angry at myself if I lost even the merest hint of data that I wanted to keep. I used to perform my backups manually, using the Windows backup utility to back data up onto an external hard drive. It worked fine most of the time, but it definitely had process defects… the largest being that I had to remember to do it. It required my interaction to succeed (because I had to plug the drive in), and this meant there was always a human element involved. And humans are lazy.
So I set about designing myself the ultimate foolproof backup system. There would be multiple storage media, there would be encryption, there would be checks and validations and several custom-written applications. Then I started thinking, “What exactly am I protecting myself against?” It’s a good question. Here’s the list I came up with:
- I need my data to be safe from storage media failure. This may mean a single backup DVD being unreadable, or maybe my primary hard disk drives its head into the sand.
- I need my data to be safe from the failure of every drive of a particular type, simultaneously. It happens more than you would think, and the consequences usually aren’t pretty (whole RAID arrays failing, with all their ‘safe’ data, usually makes people a bit upset).
- I need to make sure my data can’t be stolen. If it is stolen (or people I don’t want reading my data try to do so) then it should appear as meaningless gibberish.
- I need my data to be safe from being corrupted while in storage, or while being transferred between storage devices.
- My data needs to be safe from theft or fire, which could mean every storage device in a particular location is unusable.
- My data needs to be safe from natural disasters, which could take out an entire city or state. Unlikely, but it’s the kind of thing most people don’t plan for.
- I need to be able to search for data that I’ve accidentally deleted and need back.
- If my data is anywhere not under my direct control, I need to be able to trust the people who do control it.
- I have to assume that if my backup hasn’t been tested (i.e. I haven’t tried to restore from it) then the backup isn’t any good.
- Finally, I shouldn’t have to do anything… computers should be smart enough these days to back themselves up.
That was all I could think of, though I’m sure there are additional points (leave a comment or email me, please!). Then I figured out what I had to do in order to prevent these situations from happening.
- Points one and two are the easiest to solve, and are really what most people think of when they think of “backup” plans. The solution is simple: keep your data on multiple storage media, and make sure those media are of different types.
- Point three is pretty simple to solve: encrypt everything you can possibly encrypt. This also partially side-steps point eight, because if your data is encrypted, you don’t have to trust whoever holds it not to read it; you only have to trust them not to delete it. And you don’t even need to trust them not to delete it if you’ve got the data in multiple locations (i.e. somewhere not under their control).
- Point four can be partially solved by taking checksums of the data, which can be done at the same time it is encrypted (there’s a small sketch of this after the list). If a checksum doesn’t match, something has gone wrong, and the copy should be tried again or looked at by a human. There is the issue of what happens if the original data is corrupted. I’ve put that in the too-hard basket for now, though the use of a RAID array can reduce the likelihood of it happening.
- Points five and six are closely related, and are solved together. Every good backup plan should make use of off-site backups, where a copy of the data is kept away from the original. Point five might mean keeping a copy in another building (or in my case, at my parents’ house a few kilometres away). Point six means I might consider going further. Ideally I’d like to store a copy of my data on another continent, just in case of nuclear war. If I survive, my data should too.
- Point seven means I should be creating archives of my data, keeping copies of old files so that I can go back in time. I would like to be able to choose copies from every day for a week, then every week for a year. After a year, I’m probably not going to remember that I once had a file.
- Points nine and ten are quite possibly the trickiest. To solve them, I have to write automatic scripts to do all these backup tasks, then write more automatic scripts that try restoring the data from the backups and make sure it comes back in a perfect state. I also need to do this manually now and then, just in case my scripts stop working (it is a computer, after all).
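To make the checksum idea from point four a little more concrete, here’s a minimal sketch in Python. The paths are placeholders, and in the real script this would run right alongside the encryption step; the idea is just to hash every file and record the digests in a manifest that can be checked later.

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Hash a file in chunks so large archives don't have to fit in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def write_manifest(backup_dir: Path, manifest: Path) -> None:
    """Record a checksum for every file under the backup directory."""
    with manifest.open("w") as out:
        for path in sorted(backup_dir.rglob("*")):
            if path.is_file():
                out.write(f"{sha256_of(path)}  {path.relative_to(backup_dir)}\n")

if __name__ == "__main__":
    # Placeholder locations; the manifest lives outside the directory it describes.
    write_manifest(Path("/srv/backups"), Path("/srv/backups.sha256"))
```

Verifying a copy later is then just a matter of recomputing the hashes and comparing them against what was recorded.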
So that was my analysis of the backup problem; now for the design stage. My current working computer systems consist of a laptop (running Windows 7), a desktop (dual-booting Windows 7 and Debian GNU/Linux), and my home server (which runs Debian GNU/Linux). So I chose to do the following:
- I decided that, since it is turned on all the time, my home server would be the primary location for all my treasured data. Every other location for my data would feed off that. My laptop and my desktop will be synchronised to my server using software such as rsync running on a very frequent schedule (the first sketch after this list shows roughly what I mean). Ideally I will code a switch into the script on my laptop so that it syncs less often when I’m not at home, to avoid wasting bandwidth. This will give me three or four working copies of my data, depending on how implementation goes.
- My server has two hard drives, and I’m going to use this to my advantage. The first hard drive holds my primary working copy of the data, and the second drive is where the backups go. So I’ll write another script that takes the working copies from the first hard drive, archives them (using tar), encrypts and checksums them (using encryption that money can’t buy), and copies them to the second hard drive (see the second sketch below). This gives me the ability to go back in time through my data, if need be. At this stage there are some things I won’t back up, either for legal reasons (I’m fairly sure the MP3 backups of my music collection shouldn’t be stored off-site under Australian law) or for practical reasons (videos are just too large to transfer off-site over the Internet).
- I still haven’t solved the problem of off-site backups. To solve this, I’m planning to make use of Amazon S3, the cloud storage service offered by everybody’s favourite, friendly, forgettable online book store, Amazon. Because my data has already been encrypted, I don’t have to trust them at all. I can just copy it across, mark it as invisible to the wider world, and forget about it (third sketch below). I will also take up an offer from my friend Jamie to store my data on his NAS, which gives me another off-site backup location. I’m in Tasmania, Jamie is in Queensland, Amazon is in the U.S.A., and my data is safe.
- I’m also planning to fit my server with a DVD burner and write a script that backs up my most crucial data (such as financial information and treasured memories) onto a DVD every week or so. Encrypted, of course. The only problem is that I need to remember to go and change the DVD over every week.
- Finally, I have to write scripts that occasionally check the consistency of my data, so that nothing quietly suffers from bit rot (last sketch below).
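To give a feel for the synchronisation in the first point above, here is a minimal sketch that simply shells out to rsync. The host name and paths are placeholders rather than my real layout, and something like cron (or a smarter wrapper that notices whether I’m at home) would actually drive it.

```python
import subprocess
import sys

# Placeholder paths and host name; a real script would read these from config.
SOURCE = "/home/me/data/"
DESTINATION = "me@home-server:/srv/working-copy/"

def sync() -> int:
    """Mirror the local working copy to the server.
    -a preserves permissions and timestamps, -z compresses over the wire,
    and --delete keeps the server copy an exact mirror of the source."""
    result = subprocess.run(["rsync", "-az", "--delete", SOURCE, DESTINATION])
    return result.returncode

if __name__ == "__main__":
    sys.exit(sync())
```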
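The archive-encrypt-checksum step from the second point might look something like the sketch below. It makes a few assumptions: the mount points and passphrase file are made up, tar and gpg are assumed to be installed, and symmetric GPG with AES-256 is standing in for “encryption that money can’t buy”.

```python
import datetime
import hashlib
import subprocess
from pathlib import Path

# Made-up layout: working copy on the first drive, backups on the second.
WORKING_COPY = Path("/mnt/drive1/working-copy")
BACKUP_DIR = Path("/mnt/drive2/backups")
PASSPHRASE_FILE = Path("/root/.backup-passphrase")  # placeholder

def nightly_backup() -> Path:
    stamp = datetime.date.today().isoformat()
    archive = BACKUP_DIR / f"backup-{stamp}.tar.gz"
    encrypted = BACKUP_DIR / (archive.name + ".gpg")

    # 1. Archive the working copy with tar.
    subprocess.run(
        ["tar", "-czf", str(archive),
         "-C", str(WORKING_COPY.parent), WORKING_COPY.name],
        check=True,
    )

    # 2. Encrypt the archive symmetrically with gpg.
    subprocess.run(
        ["gpg", "--batch", "--yes", "--symmetric", "--cipher-algo", "AES256",
         "--passphrase-file", str(PASSPHRASE_FILE),
         "-o", str(encrypted), str(archive)],
        check=True,
    )
    archive.unlink()  # keep only the encrypted copy on disk

    # 3. Checksum the encrypted archive so corruption can be spotted later.
    digest = hashlib.sha256(encrypted.read_bytes()).hexdigest()
    (BACKUP_DIR / (encrypted.name + ".sha256")).write_text(f"{digest}  {encrypted.name}\n")
    return encrypted

if __name__ == "__main__":
    print(f"Wrote {nightly_backup()}")
```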
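For the Amazon S3 copy, a sketch of the upload might look like this. I’m using the boto3 client here purely for illustration, and the bucket name is a placeholder; the important property is that only the already-encrypted archives ever leave the house, and S3 objects stay private unless you explicitly share them.

```python
from pathlib import Path

import boto3  # one possible S3 client; credentials come from the usual AWS config

BUCKET = "my-offsite-backups"  # placeholder bucket name
BACKUP_DIR = Path("/mnt/drive2/backups")

def push_offsite() -> None:
    """Upload every encrypted archive (and its checksum file) to S3."""
    s3 = boto3.client("s3")
    for path in sorted(BACKUP_DIR.glob("*.gpg")) + sorted(BACKUP_DIR.glob("*.sha256")):
        s3.upload_file(str(path), BUCKET, path.name)

if __name__ == "__main__":
    push_offsite()
```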
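And the consistency check is just the checksum step run in reverse: recompute each archive’s hash and compare it with the one recorded at backup time. The paths are placeholders again, matching the earlier sketch.

```python
import hashlib
from pathlib import Path

BACKUP_DIR = Path("/mnt/drive2/backups")  # placeholder, matching the sketch above

def verify_all() -> list:
    """Recompute each archive's checksum and compare it with the stored one.
    Returns the archives whose checksums no longer match (possible bit rot)."""
    bad = []
    for checksum_file in sorted(BACKUP_DIR.glob("*.sha256")):
        expected, name = checksum_file.read_text().split(maxsplit=1)
        archive = BACKUP_DIR / name.strip()
        actual = hashlib.sha256(archive.read_bytes()).hexdigest()
        if actual != expected:
            bad.append(archive)
    return bad

if __name__ == "__main__":
    for archive in verify_all():
        print(f"Checksum mismatch: {archive}")
```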
I haven’t completed the implementation yet (in fact, I’ve hardly started). Already, though, I feel a lot safer just knowing that I’ve properly thought through how my data is stored. Most people don’t think about backups until it’s too late; perhaps they should think about them a little sooner.