The Reluctant Sysadmin: NAS Time Machine


I recently set up a Synology DiskStation thinking it would be wonderful to have a simple wireless backup solution for the handful of laptops in our apartment. Unfortunately, Time Machine has been regularly breaking down, and the troubleshooting advice I've found in most places amounts to holding your breath and hoping things will magically work. The error looks like this:

Time Machine completed a verification of your backups. To improve reliability, Time Machine must create a new backup for you.
In other words, it’s all gone. I know, wtf?

Forum posts suggest everything from getting a UPS (it might have been a power spike or dip that ruined it) to using a wired connection (since wifi might not always be stable and apparently no one accounted for that), to making sure your computer doesn’t go to sleep while backing up to Time Machine. While I did go for the UPS and even tried plugging in for a while, none of those solutions worked.

The advice I received from Synology in an (impressively prompt) email was much the same (emphasis theirs):

This can happen when you let the system go into hibernation/sleep during a backup or if you are performing the backups over wireless.

I don’t buy it. Wireless technology has been around for decades, and packet retransmission is no big deal. As a test, I have purposely closed my laptop in the middle of large backups over wireless with no ill effects.

What I do believe could fix this issue and kill it dead is something I found in some older posts about enabling Time Machine on a NAS before Synology officially supported it (although that official support doesn't extend to making sure backups don't get corrupted). Apparently, HFS+ file systems can handle crazy huge numbers of files coexisting in a single folder. Whatever flavor of Linux is running on the NAS cannot.

The failed verification results from Apple software trying to dump too many files into the same folder (in this case, the files are sparse bundle bands, which default to just 8 MB each). Too many files get created once you have a large-ish backup, say 300GB+. For me, that can happen after a week or so of file changes, since my initial backup is in the 290GB range. The solution is to increase the band size of the Time Machine backup so that your NAS's filesystem doesn't get overwhelmed by the number of files Time Machine attempts to store in a single folder. Fewer, bigger files.
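To put rough numbers on that (assuming the stock 8 MB band size that hdiutil uses by default, and my ~300GB backup):

```shell
# Band files in a single bands/ directory for a 300 GB backup,
# at the default 8 MB band size versus a 128 MB band size:
echo $((300 * 1024 / 8))     # 38400 files at 8 MB per band
echo $((300 * 1024 / 128))   # 2400 files at 128 MB per band
```

Tens of thousands of files in one directory is exactly the kind of thing a NAS filesystem might choke on, while a couple thousand is unremarkable.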

This is not a how-to post, nor do I make any claims about the usefulness of this information, but the following concepts should point you in the right direction if you've experienced the issue (or would like to avoid it in the first place). Create a sparse bundle with a 128MB sparse-band-size and then replace the bundle generated by Time Machine with it. You will also need to copy the *.plist files from the Time Machine-generated bundle. Those plists will let Time Machine identify the sparse bundle as one it can and should use. Don't copy the *.bckup files, token, Info.plist, or bands/.

Here are some bash scripts for creating your own sparse bundle and copying the important plists from one bundle to another:
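A minimal sketch of the two steps looks like this. The machine name, volume name, and paths are placeholders for your own setup, and note that hdiutil's `sparse-band-size` is given in 512-byte sectors, so 262144 sectors works out to 128 MB bands:

```shell
#!/bin/bash
# Sketch only -- names, sizes, and paths below are placeholders.

MACHINE="MyLaptop"                                    # name used in the bundle
NEW="${MACHINE}.sparsebundle"                         # replacement bundle
OLD="/Volumes/TimeMachine/${MACHINE}.sparsebundle"    # bundle Time Machine made

# 1. Create a sparse bundle with 128 MB bands.
#    262144 sectors * 512 bytes/sector = 128 MB per band.
#    -size is the maximum the bundle may grow to, not space used now.
hdiutil create -size 600g \
  -type SPARSEBUNDLE \
  -fs "HFS+J" \
  -volname "Time Machine Backups" \
  -imagekey sparse-band-size=262144 \
  "$NEW"

# 2. Copy only the Time Machine plists from the old bundle so Time Machine
#    recognizes the new one. Do NOT copy Info.plist, token, *.bckup, or bands/.
cp "$OLD"/com.apple.TimeMachine.*.plist "$NEW"/
```

After that, the new bundle replaces the old one at the same path on the NAS share, and Time Machine should pick it up as its own.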

I have only just now set this up, and I will definitely follow up if this process doesn't fix the issues that plagued me over the last month or so. Also, I didn't come up with this on my own. After hours of tedious forum reading and web searching (and weeks of trying non-solutions), I found the following posts and some lightbulbs came on:

This seems like a sane culprit for the errors I've been having, since it points at a real software limitation rather than bad luck. The power flickers and wireless network problems feel like chasing ghosts, and notably, my partner's smaller (100GB) backup hasn't become corrupted.