Back Up the Duplicati Backup Files

Hi,
Is it a problem if I do this:
back up my files with Duplicati, and then additionally back up that backup data via Duplicati to another location?
Can this lead to problems, or is this completely okay?

Hi!
The two jobs are two independent “entities”, so they can’t affect each other. But personally I’m not a fan of “nested backups”: what happens if a bug occurs in the backup engine? It could corrupt both backup sets! I use a program (WinSCP) to synchronize my backup files (*.dlist and *.dblock.aes) from my NAS to a local USB spinning disk, so if the NAS breaks I still have a redundant backup set.
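
For illustration, here is a minimal Python sketch of that kind of one-way copy (the NAS and USB paths are hypothetical; WinSCP or rsync do the same job more robustly):

```python
import shutil
from pathlib import Path

# Hypothetical mount points; adjust to your NAS share and USB disk.
SRC = Path("/mnt/nas/duplicati-backup")
DST = Path("/mnt/usb/duplicati-backup")

def sync_backup_files(src: Path, dst: Path) -> None:
    """Copy Duplicati volume files that are missing from the destination
    or newer than the copy already there."""
    dst.mkdir(parents=True, exist_ok=True)
    # dlist/dblock hold the actual data; dindex files also belong to the set.
    for pattern in ("*.dlist.*", "*.dblock.*", "*.dindex.*"):
        for src_file in src.glob(pattern):
            dst_file = dst / src_file.name
            if (not dst_file.exists()
                    or src_file.stat().st_mtime > dst_file.stat().st_mtime):
                shutil.copy2(src_file, dst_file)  # copy2 keeps timestamps

if __name__ == "__main__":
    sync_backup_files(SRC, DST)
```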

Hi,
but if a bug in the backup engine occurs, will the copied backup no longer be usable?

First of all, I’m talking about a hypothetical case. IMHO Duplicati’s code is already stable: a big bug in the “backup engine” should be a rare event. Anyway, in this scenario a Duplicati installation (with the hypothetical fix) should be able to purge the affected versions, but everything still depends on the bug itself, and nobody can predict the domino effect.

But you understand that, in order to have a consistent backup set, the backup of the backup can’t merely be “recoverable”: it must be “perfect”. So backing up the backup with the same tool makes no sense, IMHO.

The backup of the backup should not be run while the first-level backup is running, or it will pick up a half-finished backup. If you do this, make sure the first-level job is done before starting the second-level one.
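
If you script the two jobs yourself, here is a minimal sketch of that sequencing (the duplicati-cli launcher name, URLs, paths, and passphrases are assumptions to adjust to your setup; on Windows the launcher is Duplicati.CommandLine.exe):

```python
import subprocess
import sys

# Hypothetical jobs: adjust storage URLs, source paths, and passphrases.
FIRST_LEVEL = [
    "duplicati-cli", "backup", "file:///mnt/nas/duplicati-backup",
    "/home/me/documents", "--passphrase=secret1",
]
SECOND_LEVEL = [
    "duplicati-cli", "backup", "ssh://offsite.example.com/backup-of-backup",
    "/mnt/nas/duplicati-backup", "--passphrase=secret2",
]

# Only start the second-level job after the first one has finished,
# so it never picks up a half-written backup.
first = subprocess.run(FIRST_LEVEL)
if first.returncode != 0:
    sys.exit("first-level backup failed; skipping the second level")
subprocess.run(SECOND_LEVEL, check=True)
```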

If the goal is fast local restores most of the time, with the remote copy for a local disaster, you can achieve that by running basically the same backup to two places using two jobs. There’s a feature request (or at least a request) somewhere to make this nicer, but given limited help, it may take a while.
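
A sketch of that layout, two independent jobs over the same source (again with hypothetical names and URLs):

```python
import subprocess

SOURCE = "/home/me/documents"
DESTINATIONS = [
    "file:///mnt/nas/duplicati-local",      # fast local restores
    "ssh://offsite.example.com/duplicati",  # survives a local disaster
]

# Each run is an independent job with its own backup set and local
# database, so corruption in one does not touch the other.
for url in DESTINATIONS:
    subprocess.run(
        ["duplicati-cli", "backup", url, SOURCE, "--passphrase=secret"],
        check=True,
    )
```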

The other advantage of independent backups is that if one breaks, you might still have the other. This avoids @xblitz’s concern about running things through Duplicati twice and needing both to be OK. The Beta code isn’t terrible, but every now and then things can break. The next Beta should work better.

You also recover somewhat faster if the local copy gets damaged, because you can restore straight from the remote, whereas with your proposal you’d need to restore the second-level backup first and then do the first-level restore. This is not just slower; you’re also running things through Duplicati twice more, and both runs must be OK.

As a side note, your local database will be lost if your drive dies. Some people add a second job which backs up the primary backup job’s local database to the remote, to avoid waiting for the database to be recreated if a drive breaks. This matters more for very large backups… There’s also a bug in the current Beta which can make recreate slow; that bug is fixed in the next Beta.
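
A sketch of such a second job (the database folder location varies by install, so check the job’s Database page in the UI; the URL and passphrase here are hypothetical):

```python
import subprocess
from pathlib import Path

# Duplicati usually keeps each job's local database as a .sqlite file
# under ~/.config/Duplicati on Linux or %LOCALAPPDATA%\Duplicati on
# Windows; verify the exact path on the job's "Database" page.
DB_DIR = Path.home() / ".config" / "Duplicati"

# Hypothetical second job that pushes the databases offsite, so a dead
# drive does not force a slow database recreate.
subprocess.run(
    ["duplicati-cli", "backup",
     "ssh://offsite.example.com/duplicati-databases",
     str(DB_DIR), "--passphrase=secret3"],
    check=True,
)
```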