Best practice for low-powered NAS and off-site backup

I have two questions: one regarding off-site backup to Amazon Glacier, and one regarding doing the initial backup on a different device.

I have an old synology NAS that I use for personal and work storage.
I often have to work with different computers, and I therefore use the NAS as a sort of “home” directory, so I have access to the same files regardless of the computer (we are talking roughly ~1e6 files; a large fraction are small ASCII data files from various scientific instruments).

In addition to my “home” dir, I store Duplicati backups from several PCs on the NAS as well. In order to have both an on-site and an off-site backup, I want to back up the NAS, ideally to Glacier storage.

Since I know that Duplicati does not play well with Glacier, my idea was to make a local Duplicati backup of my “home” dir to a dedicated share on the NAS (the same share that holds the Duplicati backups of my various PCs). I would then use the Glacier software that comes with the Synology NAS to store a copy of the backup share on Glacier.
The advantage of this is that my backups are encrypted and compressed, which is not the case when using the Synology software alone.

However, I’m not sure whether this scheme makes sense.

In addition, the NAS is fairly low powered: a 1.2 GHz ARM CPU with 128 MB(!) of RAM. It works fine for serving my files, but it’s obviously going to be hard pressed when running the backups.

Is it possible to run the initial backup on a computer (i.e. have the computer use Duplicati to back up the “home” share to the backup share), and then have Duplicati running on the NAS do the incremental backups on a daily basis? With that setup I could leverage the much greater processing power of the computer for the initial heavy lifting.

Hi @mesalasnano, welcome to the forum!

Yep! Seeding a backup like this has been done many times. In your case, the thing to be aware of is that unless the folder paths on the “seeding” computer are the same as on the NAS, the first time you run the seeded backup on the NAS Duplicati will think all the files have moved from one folder to another (or, more accurately, been deleted from one and magically appeared in another).

This won’t cause any problems because of the de-duplication: the file data itself won’t be re-uploaded. However, the first “seed” backup might look a bit odd when trying to do a restore.
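For reference, the seeding flow can be sketched roughly like this with the Duplicati command-line client. All paths, the passphrase, and the mount points are hypothetical placeholders, and this is only a sketch of the idea rather than an exact recipe:

```shell
# --- On the fast PC (the NAS shares are mounted locally) ---
# Hypothetical mounts: /mnt/nas/home is the "home" share,
# /mnt/nas/backups is the dedicated backup share.
duplicati-cli backup "file:///mnt/nas/backups/home-backup" /mnt/nas/home \
    --passphrase="my-secret"

# --- Later, on the NAS itself ---
# Point a job at the same target with the same passphrase. A repair
# rebuilds the local database from the seeded files, after which the
# regular backup only uploads changed blocks.
duplicati-cli repair "file:///volume1/backups/home-backup" \
    --passphrase="my-secret"
duplicati-cli backup "file:///volume1/backups/home-backup" /volume1/home \
    --passphrase="my-secret"
```

Because the source path changes between the two machines (/mnt/nas/home vs. /volume1/home), the first NAS run will show the moved-files effect described above, but de-duplication keeps it from re-uploading the data.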

As far as syncing your local backups to an off-site destination (in your case Amazon Glacier), I believe a few people are using this method, and it should work just fine with the following caveats:

  1. Be careful of sync settings - specifically deletes. For example, if something caused your local NAS Duplicati files to be deleted, the Glacier tool would likely sync those deletes to the cloud, thus erasing your backups there as well.

    HOWEVER - if Glacier supports versioning, such that you could go back to a previous version (from before the delete), then this isn’t as much of an issue.

  2. If you need to restore from Glacier, you will have to manually download ALL the Duplicati files to somewhere Duplicati can actually see (such as a local drive) and do the restore from there.

    This means that if your local NAS dies (:crossed_fingers: that it won’t, but if it did) and you wanted to restore just ONE file from the Glacier backup, you’d still have to download the ENTIRE backup set to somewhere Duplicati could get to it.
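To illustrate caveat 2, a restore from a downloaded copy might look something like this (the local download path, file pattern, and passphrase are all hypothetical examples):

```shell
# Hypothetical: the ENTIRE backup set has already been retrieved from
# Glacier into /restore/files on a machine where Duplicati can run.
# Restore a single file from that local copy into /restore/out:
duplicati-cli restore "file:///restore/files" "home/projects/data.txt" \
    --passphrase="my-secret" --restore-path="/restore/out"
```

The key point is that the first argument must be a location Duplicati can read directly, which is why the full set has to come down from Glacier first.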

Thank you very much.
Good to know that it’s possible to bootstrap the backup. I did a small test run yesterday, and bootstrapping the full backup will literally save me weeks :exploding_head:

It’s also a good point that I will need to download the full backup if I want to restore from Glacier, although my strategy is that I would only need it if the NAS suffers a catastrophic failure (dies in a fire or similar). It does mirroring between two disks, so unless both fail, I will have a local copy.

Instead of using Duplicati to back up the home folder, I’m considering using rsync to do local incremental versioning of the home folder. This has the advantage that the versions can easily be browsed and old versions of files can readily be compared. This versioned library would then be compressed and stored in Glacier at regular intervals (once a month or so).

Kind regards

It sounds like you’ve got a nice plan in mind, I hope it works out well for you! :crossed_fingers: