How do I back up the files Duplicati creates to an off-site backup that I'm able to restore?

Hello!
I have 4 backup jobs running locally on my machine that back up to a /home/username/backups folder. I then have a backup task that backs this entire folder up to network storage.
Since the backups in /home/username/backups are already Duplicati files, when I go to “restore” from the network storage, all I see is a list of the Duplicati zips that have been created, instead of the folder and file names that went into the original 4 backups in /home/username/backups.

I must have the logic wrong here. What is the correct method?

Hi Tom. I’m a bit confused about what you’re trying to do …

Do you mean you have 4 jobs configured in Duplicati to back up to the network folder?

Yes, sorry, I should have added a screenshot.

It’s 2 Minecraft servers, so I’m backing up the /worlds folder for each as its own job and also their /plugin folders as jobs, then I have the “bigbackup” job which copies these local backups to network storage!

You made a backup of a backup, right? I think you should take the opposite approach.

In my opinion it’s simpler to make 1 backup of all the folders to your remote storage, don’t you think?

Regards,

I would not use Duplicati to back up Duplicati data a second time. This only complicates your restore process. Instead, I recommend using a file synchronization tool to sync your Duplicati backup data to another location. This way it stays in exactly the same format. You can restore from the second backup copy if needed by simply changing your destination path in the backup job.

Edit to add:

For the synchronization you can use several tools. I would configure one-way sync only (main Duplicati backup destination → second copy location). rclone is an excellent tool that supports several cloud storage targets. In my case I use Synology Cloud Sync (my Duplicati installs target the NAS, the NAS syncs to S3).
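For anyone wanting a concrete starting point, here is a minimal rclone sketch, assuming your Duplicati jobs write to /home/username/backups and you have already configured an rclone remote named `remote` (both names are just examples):

```bash
# One-way sync: make a second copy of the Duplicati destination folder.
# "remote:duplicati-copy" is a placeholder; set up your own with `rclone config`.
rclone sync /home/username/backups remote:duplicati-copy --checksum

# Tip: add --dry-run first to preview what would be transferred or deleted.
```

Note that `rclone sync` makes the destination match the source, so deletions propagate too, which is exactly what you want for a second copy of Duplicati data.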


I agree with the other posters that it’s not advisable to back up a backup. You should run the backups of the /worlds and /plugins folders using Duplicati and store them directly on your CIFS/SMB/NFS share.

The issue I have is limited space on the local disk, but my network drive can store a lot more!
The network share does keep 20 snapshots, mind, so that in itself is a history of these backups; it may just work.

So would you tell Duplicati to run the rsync command each time one of these backups completes? So I just have the 4 backup jobs, and when each one ends it does a one-way rsync to the network drive?

@samw yeah, it’s a CIFS share! So use rsync like above?
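Yes, rsync works fine for this since the share is mounted locally. A minimal sketch, assuming the CIFS share is mounted at /mnt/networkshare (a placeholder path):

```bash
# Push the local Duplicati files to the mounted share, one-way.
# --delete makes the copy exact by removing files Duplicati has pruned locally.
rsync -av --delete /home/username/backups/ /mnt/networkshare/duplicati-backups/
```

The trailing slash on the source matters: it copies the folder’s contents rather than nesting the folder itself inside the destination.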

If you are trying to get additional retention using the sync method, well that won’t work. Syncing your data using rclone or another tool just makes an exact copy in an alternate location.

If you would like to have different Duplicati retention rules for local and remote, then the best way to achieve that in Duplicati is with two separate backup jobs: one targets local storage with more limited retention, and one targets remote storage with longer retention.
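As a rough sketch of what two jobs with different retention could look like from the command line (paths, passphrases, and retention values here are made-up examples, and most people would set the same options in the web UI instead):

```bash
# Job 1: local destination, shorter retention to save disk space.
duplicati-cli backup file:///home/username/backups/worlds \
  /srv/minecraft/worlds --passphrase="local-secret" --keep-time=2W

# Job 2: network share destination, longer retention.
duplicati-cli backup file:///mnt/networkshare/worlds-backup \
  /srv/minecraft/worlds --passphrase="remote-secret" --keep-time=6M
```

Both jobs read the same source; only the destination and `--keep-time` differ.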

But let’s back up a bit. Duplicati’s deduplication engine enables it to keep many versions of your backup without it taking too much space. It depends on your data change rate of course, but you could potentially have dozens or hundreds of backup versions without it taking much more space than the source size of your data. In other words, keeping 4 backup versions does not require 4x the space of the source data.

Example: on my main machine my documents backup protects about 50GB of data. I have 284 backup versions going back to 2017, when I first started using Duplicati. The space required for all those versions is only 113GB, where 284 full copies would naively take over 14TB. So think about whether you really do need different retention on the remote vs. the local backups. If you do, then go ahead and use two backup jobs in Duplicati. If you don’t, then maybe only a local backup job is needed (with a sync to remote).

That’s what I was thinking!

I’m definitely going to need to test this more. My data changes regularly but it’s so hard to predict. I wish there were some sort of calculation that looks at existing backups in the database to estimate the size if I were to add more “versions to keep”! It should be possible!

I set up the rsync last night and I have a script that runs when the backup jobs finish, which seems to work fine. But now I’m thinking about condensing my backups: those world/plugin backups are both inside folders already, so I could make it all a lot simpler and have just 2 backups. I may have made it more complicated for myself.
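If it helps anyone else, here is roughly what such a post-backup hook can look like. Duplicati can call a script via its `--run-script-after` advanced option and exports the result in the `DUPLICATI__PARSED_RESULT` environment variable; the paths below are placeholders:

```bash
#!/bin/bash
# Post-backup hook, registered with --run-script-after=/path/to/this/script.
# Only push the copy when the backup finished successfully.
if [ "$DUPLICATI__PARSED_RESULT" = "Success" ]; then
  rsync -av --delete /home/username/backups/ /mnt/networkshare/duplicati-backups/
fi
```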

I think I need to just try it and see how much space this deduplication uses up.

My networked storage actually has its own snapshot feature, with 20 days’ worth. So with the Duplicati files being rsynced there, it’s like a snapshot of snapshots…