Questions about destinations, rclone, scripts and jobs

Hello and welcome!

  1. No, a single job can only target a single destination.

  2. My Duplicati jobs target a local NAS, and then I use rclone to sync to cloud storage. My rclone scheduling is independent of Duplicati, but I don’t see why you couldn’t use the run-script-after option. Note that by default Duplicati only waits 60 seconds for the script to finish. If the script hasn’t finished by then, Duplicati lets it keep running but stops capturing its output for you.
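As a sketch of what such an after-script could look like (the paths, remote name, and function name are placeholders, and the `DUPLICATI__PARSED_RESULT` variable is based on Duplicati’s example run-script, so double-check it against your version; there is also a `--run-script-timeout` advanced option if 60 seconds isn’t enough, again worth verifying on your version):

```shell
#!/bin/sh
# Sketch of a run-script-after for one Duplicati job.
# SRC/DEST paths and the env-var name are assumptions/placeholders.

sync_to_cloud() {
    # Skip the sync unless the backup finished with Success or Warning.
    result=${DUPLICATI__PARSED_RESULT:-Success}
    case "$result" in
        Success|Warning) ;;
        *) echo "skipping sync (result: $result)"; return 0 ;;
    esac
    echo "syncing $1 -> $2"
    rclone sync "$1" "$2"
}

# Real invocation (commented out here so the sketch runs without rclone):
# sync_to_cloud /mnt/nas/duplicati/job1 remote:bucket/duplicati/job1
```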

  3. Each backup job is an independent operation. If the run-script-after option is set on each job (or in the global options), then yes, the script would run a total of six times. That may not be a big deal if each run only synchronizes that specific job’s files to cloud storage. If you instead want the script to run once and sync all Duplicati backup data to the cloud, configure it on the final job only. You’ll probably want to stagger your schedule so it’s obvious which job runs last, rather than scheduling them all at the same time (they won’t actually run simultaneously anyway, since Duplicati only runs one backup at a time).
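Another hedged way to get the “only the last job syncs” behavior without relying on schedule timing: give every job the same after-script and have it act only when the finishing job matches a designated name. The job name “Nightly 6”, the paths, and the `DUPLICATI__backup_name` variable are all assumptions here (the variable comes from Duplicati’s example run-script, so verify it on your install):

```shell
#!/bin/sh
# Shared run-script-after: all six jobs call this, but only the job
# designated as last actually triggers the cloud sync.
# Job name and paths are placeholders.

FINAL_JOB="Nightly 6"

is_final_job() {
    [ "$1" = "$FINAL_JOB" ]
}

if is_final_job "${DUPLICATI__backup_name:-}"; then
    # Sync the whole NAS backup area in one pass.
    rclone sync /mnt/nas/duplicati remote:bucket/duplicati
fi
```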

  4. Duplicati’s web UI engine only runs one backup operation at a time, so you can’t run backups in parallel there. (You’d have to stop using the web UI and switch to the command line if you want to go that route.) I’m not sure what order the jobs run in if they’re all scheduled at the same time; it may be the order they appear in the web UI, or maybe not. I haven’t tested that.
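For illustration only, parallel backups from the command line might look something like the following. The binary name, storage URLs, and paths are assumptions that may differ by platform and version; each job needs its own local database, which is what `--dbpath` points at. The guard makes the sketch a no-op on machines without the CLI installed:

```shell
#!/bin/sh
# Hedged sketch: two backups run concurrently by bypassing the web UI.
# duplicati-cli name/flags and all paths are placeholders; verify locally.
if command -v duplicati-cli >/dev/null 2>&1; then
    duplicati-cli backup "ssh://nas/backups/job1" /home/alice \
        --dbpath=/var/lib/duplicati/job1.sqlite &
    duplicati-cli backup "ssh://nas/backups/job2" /home/bob \
        --dbpath=/var/lib/duplicati/job2.sqlite &
    wait  # block until both background backups finish
fi
```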
