Questions about destinations, rclone, scripts and jobs

I’m new to Duplicati and have been using it for a couple of days. I have a couple of questions I could not find answers to:

  1. I want to back up a folder to a local destination and a B2 bucket. Is there a way to have two destinations for one job? I only need the backup to be created once, not twice.

  2. I’ve read about people using rclone to do the backup to the cloud. I don’t mind writing a script that does an rclone sync (although I think this is a task better suited to Duplicati), but I want that script to start after the last job is done. Can I use the run-script-after option?

  3. This option executes a script after performing an operation. I have 6 jobs scheduled at 3:00 at night. Is this set of jobs considered ‘an operation’? Can I be sure my script does not get triggered 6 times?

  4. Are all these jobs run one by one or simultaneously? Can I influence this behavior? Is the order in the web UI the order in which the jobs run?

I’m running Duplicati through the official Docker container on a server running Ubuntu Server.

Hello and welcome!

  1. No, a single job can only target a single destination.

  2. My Duplicati jobs target a local NAS, and then I use rclone to sync to cloud storage. I use scheduling that is independent of Duplicati, but I don’t see why you couldn’t use the run-script-after option (see the sketch after this list). By default Duplicati will only wait 60 seconds for the script to finish; if it doesn’t finish in that time, Duplicati lets the script keep running but won’t capture its output for you.

  3. Each backup job is an independent operation, so if you have the run-script-after option set on each job (or in the global options), then yes, it would run a total of six times. That may not be a big deal if each run of the script only synchronizes that specific backup job’s files to cloud storage. If you only want the script to run once and have it sync all Duplicati backup data to the cloud, then configure only the final job to do the sync. You’ll probably want to adjust your schedule timing so it’s more obvious which job runs last, instead of scheduling them all at the same time (which they don’t actually do anyway, as Duplicati only runs one backup at a time).

  4. Duplicati only runs one backup operation at a time; you cannot have the web UI engine run more than one simultaneously. (You’d have to stop using the web UI and switch to the command line if you wanted to go that route.) I’m not sure what order the jobs run in if they are all scheduled at the same time. It may be the same order they appear in the web UI, or maybe not; I haven’t tested that.
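
Regarding the run-script-after idea in points 2 and 3: as a rough sketch rather than a tested recipe, a script along these lines could do the sync. I’m assuming a Linux host, a local destination of /backups, and an rclone remote named b2remote (all placeholders), and if I remember correctly Duplicati passes the name of the finished operation to the script in the DUPLICATI__OPERATIONNAME environment variable.

```bash
#!/bin/bash
# Hypothetical run-script-after hook: mirror the local Duplicati destination to B2.
# "/backups" and "b2remote:my-bucket" are placeholders - adjust for your own setup.

# Only sync after backup operations (not after restores, verifies, etc.).
# DUPLICATI__OPERATIONNAME should be set by Duplicati for run-script-after scripts;
# double-check the exact variable name against Duplicati's example scripts.
if [ "$DUPLICATI__OPERATIONNAME" != "Backup" ]; then
    exit 0
fi

# Push the local backup folder to the B2 bucket and keep a log of what rclone did.
rclone sync /backups b2remote:my-bucket \
    --log-file /var/log/rclone-duplicati.log --log-level INFO
```

If a full sync regularly takes more than a minute, you may also want to raise the run-script-timeout advanced option so Duplicati actually waits for the script and captures its output.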

Thank you very much for your thorough and informative response. This makes a lot of things much clearer.

Looking at the timestamps of each job I think I know which one runs last, but I would indeed need to do some testing.

I also found an alternative method: instead of using a script and rclone, create a separate job that uploads the backup folder unencrypted (because whatever is in the backup folder is already encrypted) to the cloud. This is where I got the idea from. However, he alternates the backup jobs and the remote sync on different nights, which I don’t like.

What I want is to have a fully automated process every night. I don’t want to rely on guessing, so when the last regular backup job is done it needs to trigger a remote sync. Before exploring the possibilities with a script and rclone:

  • Is it possible to manually trigger a remote sync job programmatically? I will not schedule the remote sync job, so it will never run automatically. I found the backup CLI command, which I think would be good enough for a script.
  • Is it possible to trigger this remote sync job as an advanced option on the last scheduled backup job? This would eliminate the need for a script; it could be called chaining, I guess. I can’t find a specific advanced option for it.

So, long story short: my six scheduled jobs run at 3 am, and after the last job finishes it should trigger the remote sync job.

This would be acceptable to me because it’s automated, doesn’t depend on guesses or assumptions, is maintainable, gives good log access for all tasks, and the backup system actually does all the backing up.

After digging more into duplicati-cli I understand it’s a completely separate process from the web GUI. So I can’t start a job configured through the web GUI from the CLI, or through any other means?

I could of course make a script that uses duplicati-cli backup to make a remote backup, and fire that script after the last backup job, but I wouldn’t be able to check the logs through the GUI, which is important to me.

Actually, forget that. I ran into this topic: if you export the command from the web GUI and run it in the CLI, the logs are stored in that job’s database, so they are also viewable from the web GUI. Fantastic!
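
From what I can tell, the exported command looks roughly like the sketch below. Every value in it (bucket, credentials, source path, database name) is a made-up placeholder, not my real configuration; the important part seems to be that --dbpath points at the job’s existing database, which is what makes the logs show up in the web GUI.

```bash
# Rough shape of a command exported via "Export > As Command-line" in the web UI,
# for a remote-sync style job. All values are placeholders; use the exact command
# the UI generates, since --dbpath must point at that job's database for the logs
# to appear in the web GUI.
duplicati-cli backup \
    "b2://my-bucket/duplicati?auth-username=B2_KEY_ID&auth-password=B2_APP_KEY" \
    /backups \
    --backup-name="Remote sync" \
    --no-encryption=true \
    --dbpath=/config/ABCDEFGHIJ.sqlite
```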

I will soon look further into this and build a script.

I would not do this because it isn’t a “sync.” The files are re-run through the Duplicati engine and reprocessed, so to restore anything you’d have to do TWO levels of restore. It adds unnecessary complexity to your setup which increases time to restore and probably reduces reliability.

I would create a regular script / batch file that uses rclone to do the sync up to the cloud. You can then trigger this script manually, via an OS scheduler (cron on Linux, Task Scheduler on Windows, etc.), or via the run-script-after option on your final job.
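
If you go the cron route, a hypothetical crontab entry could look like the one below (the time, script name, and log path are just examples). The trade-off is that a fixed start time means guessing how long the backups take, which is exactly what you said you wanted to avoid; the run-script-after approach on the final job sidesteps that.

```bash
# Run the rclone sync script at 04:30 every night, after the 03:00 Duplicati jobs
# have presumably finished, appending the script's output to a log file.
30 4 * * * /usr/local/bin/duplicati-rclone-sync.sh >> /var/log/duplicati-rclone-sync.log 2>&1
```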

If you’re talking about Duplicati.CommandLine.exe, then you’re correct. It is totally independent from the web UI scheduler and job configs, and you won’t be able to see the status of such a job in the web UI.

You can with this excellent tool. It is a way to trigger web UI jobs to run via the command line.

Good luck with whatever you decide. But I definitely recommend rclone for your second copy of the backup data.

I came to the same conclusion after running the backup and noticing that it did indeed back up the existing backup files. I think you’re absolutely right that this is not a good approach.

As I mentioned in my last post, I later discovered that if you export the command from the web UI and run it with Duplicati.CommandLine.exe (duplicati-cli on Linux), you can view the results through the web UI. That’s nice, although it’s no longer useful for me.

Thanks for the tip. I will definitely look into that one.

I think it’s a shame that I need to rely on external tools for something quite basic, something I thought was core functionality of Duplicati, since they claim to be designed for online backups. It’s not hard for me to create a script for it, and probably to run it after the last backup job every night, but now I have another log file I have to check regularly and another tool I have to keep updated.

When I have a good script running the way I want it, I will come back and share it here. Thanks for your help, my friend; I appreciate it very much.

To be fair, Duplicati certainly has no trouble backing up directly to cloud/online storage. It’s people like you and me, who want two copies of our backup data, who need to do some scripting. I’m not sure how common it is for Duplicati users to want two copies.

Personally, it didn’t bother me to use a separate sync tool; rclone is great at syncing. I actually use Cloud Sync on my Synology NAS, which has also been really reliable. It monitors the target storage location used by Duplicati (which is the same NAS) and automatically starts syncing changes within a minute of them occurring.

Maybe some day Duplicati will have the native ability to do two backup copies though. I believe there is a feature request for this on the Github project page already.
