On the Importance of Backups and Storage Decisions
I admit it’s a bit of a geek thing, but I’m in the habit of wiping and reloading my computer systems at least once a year to keep everything working right. As careful as I am about what I install, I still work with some major software packages, and Windows systems tend to slow down and get a little flaky after a while, with random hiccups that take time to track down. Sometimes it’s easier just to take a day, back everything up and re-install Windows from scratch.
The key word there is, of course, backup. On my systems, I keep as much of my own data as possible under one central data directory so that I can quickly copy it all to an external drive. I also go through the collection occasionally to see what can be moved to the external drive permanently and removed from the PC itself. Still, things tend to get scattered over time: various programs like to store their data in out-of-the-way places such as the user profile and documents folders. So these periodic reloads are also an exercise in evaluating how attentive I’ve been to where my data is stored. I generally forget something, usually something minor.
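The copy-to-external-drive step above can be scripted. Here’s a minimal sketch in Python – the function name and paths are my own examples, not any particular backup tool – that mirrors a central data directory to a backup location, preserving file timestamps:

```python
import shutil
from pathlib import Path

def backup_data(source: str, destination: str) -> int:
    """Mirror a central data directory to a backup location.

    Returns the number of files copied. The paths are placeholders --
    substitute your own data directory and external drive.
    """
    src, dst = Path(source), Path(destination)
    dst.mkdir(parents=True, exist_ok=True)
    copied = 0
    for item in src.rglob("*"):
        if item.is_file():
            target = dst / item.relative_to(src)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(item, target)  # copy2 preserves modification times
            copied += 1
    return copied
```

On Windows you might call this with something like `backup_data("C:/Data", "E:/Backup/Data")`; dedicated tools like robocopy do the same job with more options, but a script makes it easy to see exactly what gets copied.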
This time, I made a serious effort to go through every program I had installed and verify that I had backed up its data somewhere. The gotcha came when I was restoring my Thunderbird e-mail stores and realized that some e-mail I’d kept in a separate account from my main e-mail had actually been stored deep within my Windows profile instead of the main data directory and, consequently, had not been backed up. Fortunately, I was able to recover the most important of the e-mails. When you’re setting up your POP3 e-mail accounts, it’s a good idea to let the server retain copies of the e-mails you download for a certain amount of time.
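The Thunderbird incident is exactly the kind of thing a pre-wipe sweep can catch. As a rough sketch, a script can walk the user profile and flag directories belonging to applications known to stash data outside the central data folder. The watch list here is only an example; you’d extend it for the applications you actually use:

```python
import os
from pathlib import Path

# Example watch list -- extend it for the applications you actually use.
WATCH_NAMES = {"Thunderbird", "Outlook", "Firefox"}

def find_stray_data(profile_root: str) -> list:
    """Walk a user profile and report directories whose names match
    applications known to keep data outside the central data folder."""
    hits = []
    for dirpath, dirnames, _ in os.walk(profile_root):
        for name in dirnames:
            if name in WATCH_NAMES:
                hits.append(str(Path(dirpath) / name))
    return sorted(hits)
```

Running this against `C:/Users/YourName` before a wipe gives you a list of folders to check and back up before anything is formatted.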
There are a few ways to do a system rebuild like I’m describing here and they each have their pros and cons.
Manual Rebuild

This is the method I used the other day. It takes the longest but gives you the most control over what goes on the new setup. For me, it was a matter of backing everything up, inserting the Windows 7 installation disc and restarting the machine after making sure it was set to boot from the optical drive. During the installation, I specified that the drive should be formatted to clear out all existing data, ensuring the installer wouldn’t leave behind folders from the previous install that I wouldn’t be able to access. This also wipes out all of the old data, so the backup is, again, essential.
After Windows is set up, I decide what software absolutely needs to be installed based on what I’m doing these days and what can wait until I need it. Then there are all the Windows updates – for Windows 7, that’s about 140 separate updates issued since the package on my installation DVD was pressed. It’s also important to have any special drivers in place for things like web cams and printers that have been added to the system, along with access codes for the wireless network and registration codes for the software being re-installed.
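Those registration codes, driver notes and install priorities are easy to lose track of between rebuilds, so it helps to keep them in a small manifest file stored alongside the backup. Here’s a minimal sketch using JSON – the entries and keys are placeholders, not real codes:

```python
import json
from pathlib import Path

def save_manifest(path: str, entries: list) -> None:
    """Write the reinstall checklist to a JSON file kept with the backup."""
    Path(path).write_text(json.dumps(entries, indent=2))

def load_manifest(path: str) -> list:
    """Read the checklist back after the rebuild."""
    return json.loads(Path(path).read_text())

# Example entries -- names and keys are placeholders, not real codes.
checklist = [
    {"software": "Antivirus Suite", "key": "XXXX-XXXX", "priority": "install now"},
    {"software": "Photo Editor", "key": None, "priority": "install when needed"},
    {"software": "Printer driver", "key": None, "priority": "download from vendor"},
]
```

A plain text file works just as well; the point is that the list lives with the backup, not on the drive about to be wiped.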
All of this takes about a day and a good portion of it involves just letting the more long-running installations do their thing while I go off and do something else so it’s handy to have another machine to work with in the meantime.
Disk Imaging

Imaging software takes a complete image of all the software and files on your hard disk and packages it in a file, or series of files, that can be restored later. To restore an image, you would, again, back up your current files, start the imaging software (usually from a CD, since it’s overwriting everything on the hard disk) and instruct it to restore the contents of the backup file you created earlier. After the restoration finishes, you restart the PC and the system is back to exactly what it looked like when you created the image.
This is the fastest option since it simply unpacks the image file onto the hard disk and includes all of the software that was installed in the image. It might take about a half hour. It’s also the option used by organizations that have a lot of installations to do such as companies that need to load the same selection of software and utilities on every one of their machines. Multiple images can be created to accommodate different hardware configurations and can be deployed quickly.
Years ago, I used Norton Ghost to maintain and restore images of my machines and it worked well. From my brief search, one of the current favorites seems to be Acronis True Image. You can also find some other selections on BackupReview.com. It’s best to compare a few packages, their features, their reviews and their prices before committing to one.
The only downside to this method is that it takes a little more planning than the average user is accustomed to. Images are best taken when the PC is new, before any extra software and data are loaded, or right after the kind of manual rebuild described above. The backup images have to be stored somewhere safe so they can be reached later if necessary, and the user needs to be aware of what hardware has been added or removed since the image was taken – if that includes external storage, it can complicate backup and restore operations. Still, if you’re willing to do that extra bit of planning and want to rebuild relatively often, this is a great option.
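Since an image file may sit untouched for months before you need it, it’s worth recording a checksum when the image is created and verifying it before a restore. Here’s a small sketch of that idea – the file names are examples, and most imaging packages offer their own built-in verification:

```python
import hashlib
from pathlib import Path

def sha256_of(path: str) -> str:
    """Hash a backup image in chunks so large files don't exhaust memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_image(image_path: str, checksum_path: str) -> bool:
    """Compare the image against the checksum recorded when it was made."""
    expected = Path(checksum_path).read_text().strip()
    return sha256_of(image_path) == expected
```

Catching a corrupted image this way beats finding out mid-restore, after the disk has already been wiped.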
Factory Reset

Many PCs and laptops come with a factory reset option, where an image like the one described in the last section is maintained in an isolated partition on the hard drive. By pressing a special key combination at startup, you can launch the restoration software that came with the computer, and it will return the system to the way it looked when it was new. All existing software and data will be overwritten with the original factory image, including the operating system and any pre-installed extras.
When this option works, it’s an easy alternative to maintaining your own imaging software and image files. I used it a number of times on my old Dell laptop and was happy with it until I found out that sometimes it doesn’t work. On older machines, the disk partition that contains the factory image and restoration software can become corrupted and the restoration will fail without warning – after it’s already wiped your hard disk in preparation. Unfortunately, I learned this during a service call on a customer’s old PC. Fortunately, everything was backed up and the customer was ready to get rid of the machine anyway but it was still not something I wanted as part of the customer experience.
I don’t use this method anymore unless I’m prepared to do a manual rebuild afterward.
Windows System Restore
Of course, there’s Windows System Restore which will restore your system to an earlier time. Windows maintains system checkpoints which record the software and settings on your machine at the time, essentially partial images. It can then roll your system back to one of these checkpoints, if necessary, to reverse any changes that have caused problems.
The upside of using this is that it leaves your documents and files alone and only affects the software and system settings although it’s best to do a backup, just in case. You also won’t have to re-install all of your software afterward or make all those little adjustments to get the machine to look just like you want it to.
The downside is that System Restore only maintains a certain number of checkpoints based on the amount of disk space that it’s been allowed. Windows creates periodic checkpoints of its own and many software packages will create their own checkpoints before installing. The oldest checkpoints are deleted first so, chances are, you will not be able to use it to restore your machine to its factory condition.
Another disadvantage is that sometimes checkpoint restorations fail. Unlike the factory reset, this won’t turn your machine into a brick but it will leave you with the same problem that prompted you to do a restoration. I’ve often noticed these failures when an external hard drive was connected to the machine when a checkpoint was created but was not connected during the restore operation. I don’t know if flash drives will have the same effect. Still, if you’re going to make use of this, it’s best to create your own checkpoints before doing any software installations and disconnect as many USB drives and devices as you can first.
People are storing more and more of their data on computers and other devices, from e-mails to photographs and music collections. While computers make a lot of things easier and enable us to do things we couldn’t years ago, they require just as much attention and maintenance as your car or any other machine that plays a large role in your life. For this reason, it’s important to pay attention to where your data is stored and to maintain backups on a regular basis.