I want to make a backup to a local drive and to an FTP server (offsite).
Is it possible to do this in one and the same job?
I have now created two jobs, but then I have a timing issue. I only want to run the backups during the night. How can I be sure that the first job (local) has finished before the offsite backup starts? Or can those jobs run simultaneously?
Hello and welcome!
As you discovered, with a single job you cannot target two locations. What you did (two jobs) is one way to accomplish this, and it’s how I did it when I first started using Duplicati. I didn’t like that Duplicati had to do the work twice, and I didn’t like that the two backups weren’t identical (they run at different times, since the web UI engine cannot run jobs simultaneously). It was also annoying to have to manage twice as many job configurations.
Because of those issues I switched to having Duplicati only back up locally (to my NAS), and then I sync the resulting backup data to a remote destination with a third party tool. rclone is excellent for this, but I actually use the “Cloud Sync” program on my Synology NAS to do it.
You could consider setting up a serialized backup, if Duplicati is able to also use your local drive as a source. With this setup you’d actually be backing up the files Duplicati creates as part of your first backup. It might be easier to get the timing between the two backup sets right this way, and it would avoid having Duplicati do the work twice.
I’m pretty new to Duplicati, but I’m guessing that in the event of a loss of your primary backup, you could repoint your first backup job to the FTP server to do file-level restores. Someone with more Duplicati experience might have advice on what confusion, if any, this would cause for Duplicati.
It would also let you rebuild your local backup using the Duplicati files from the FTP server. Again, I’m not sure whether any of this might cause Duplicati some confusion.
I would strongly recommend NOT doing this. While technically it would work, you are running the backup files through Duplicati’s deduplication engine again, creating a new set of backup files that differ from the originals. To do a restore you’d have to do TWO restores.
It’s much better to do an actual synchronization of the backup files to an alternate location. If you do a sync, then the back-end files are identical, and you can actually accomplish your original goal.
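For example, such a sync can be a single rclone command. This is just a sketch: the local path and the “offsite” remote name are assumptions for illustration (rclone has an FTP backend, among many others, that you would configure with `rclone config`):

```shell
# Mirror Duplicati's local backup folder to an off-site remote so both
# locations hold byte-identical backup files.
# SRC and DEST are hypothetical; "offsite" stands in for an rclone remote.
SRC="/backups/duplicati"
DEST="offsite:duplicati"

# Run only where rclone is actually installed.
if command -v rclone >/dev/null 2>&1; then
    rclone sync "$SRC" "$DEST" --verbose
fi
```

Scheduled nightly (e.g. via cron) after the Duplicati job, this keeps the off-site copy an exact mirror of the local backup, so either copy can be used for a normal one-step restore.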
If it’s any consolation, I did mean to mention that you should turn compression and deduplication off. On the other hand, I just looked it up and saw that it is not possible to disable deduplication. So in the end I was wrong regardless! Everyone, forget I said anything.
No worries! When I was new to Duplicati I thought of doing the SAME thing you suggested!!
Please note that there is no way to turn off deduplication in Duplicati.
Thank you for your participation in the forum!
Can you share more details about this synchronization?
Can I set up a synchronisation job in Duplicati? If yes, how?
If this is something external like rsync, I don’t see the point of using Duplicati in the first place.
For example, if I want to have a local backup and a remote backup with Duplicati, I need to back up locally, then back up the backup to the remote.
If I want to avoid this backup of backup, I need to somehow sync the Duplicati local backup files to remote, but this means I need to configure two backup solutions, right?
To me, it makes more sense for Duplicati to have this functionality.
Either backup to multiple locations in the same job, or have a sync job.
Is my understanding correct?
Duplicati is a backup program, and it gives you features a synchronization program doesn’t typically provide: efficient storage of multiple versions (thanks to deduplication), encryption, compression, etc.
So in my opinion using both Duplicati and a sync tool gets you the best of both worlds: local backup target for Duplicati for fast backups and restores, and then sync off-site to help protect against disaster at the location.
Duplicati doesn’t have a sync feature built-in (as mentioned it was designed to be a backup program, not a sync tool). That being said, you could have Duplicati run a post-backup script that triggered a third party sync tool like rclone if you really wanted them linked more tightly. I personally prefer running Cloud Sync on my NAS. It watches the Duplicati target area and synchronizes changes almost instantly to off-site cloud storage.
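As a sketch of how that linkage could look: Duplicati can run a script after each backup (the `--run-script-after` advanced option) and exposes the outcome to that script in the `DUPLICATI__PARSED_RESULT` environment variable, so you can sync only after a successful run. The local path and the “offsite” rclone remote below are assumptions, not anything Duplicati provides:

```shell
#!/bin/sh
# Hypothetical post-backup script for Duplicati's --run-script-after option.
# Duplicati sets DUPLICATI__PARSED_RESULT ("Success", "Warning", "Error", ...)
# in the environment of scripts it runs; only push the backup files
# off-site when the backup actually succeeded.
# "/backups/duplicati" and the "offsite" rclone remote are placeholders.

if [ "$DUPLICATI__PARSED_RESULT" = "Success" ]; then
    rclone sync /backups/duplicati offsite:duplicati
else
    echo "Backup ended with '$DUPLICATI__PARSED_RESULT'; skipping sync" >&2
fi
```

Point the backup job’s `--run-script-after` at a script like this and the off-site copy refreshes itself right after each nightly backup, which solves the original timing question without a second Duplicati job.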
You’re not the only one to think so. There has been a feature request to implement multi-destination support for quite a while. (See Allow targeting multiple destinations [$200] · Issue #234 · duplicati/duplicati · GitHub)
Until something like that gets implemented, I’m really happy with using Cloud Sync. (Previously I used rclone and think highly of it.)
The one thing I wouldn’t do is a second Duplicati backup of your Duplicati backup. That runs all the data through the Duplicati backup engine again, and any restore of this second backup would require two steps and be quite clunky.
Thank you @drwtsn32, I gave CloudSync a try on my Synology NAS and it’s a great solution.
I will follow your setup: back up with Duplicati to my NAS, then sync the backup files off-site with CloudSync to Google Drive, Dropbox …
Great! Only thing I would add is that I would configure Cloud Sync to do a one-way sync only: NAS to cloud storage. I think this might add an extra layer of protection in case your cloud files get deleted somehow.