How to set up schedule and retention

How to achieve the following?

  • One job per day
  • The job should run hourly between 6:00 and 21:00 each day (less frequently on the weekend)
  • In the following week, keep only the last version of each day from the previous week
  • Smart retention: xx:yy,2M:1W,6M:1M,1Y:3M,U:6M -> xx:yy?


  • Having (most) current backups
  • Getting some redundancy (-> the forum has quite a few threads about broken jobs)

I broke this out into its own topic; I suspect it will lead to a few posts and I didn’t want to clutter the other topic with them.

Do you mean if you wanted backups Monday through Friday you’d have 5 jobs?

At present Duplicati doesn’t support windowed start/stop times, only a start time. However, using --run-script-before-required you could call a script that checks the current hour and returns the “don’t run” result if it’s not between 06:00 and 21:00.
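A minimal sketch of such a pre-run script, assuming that a non-zero exit status from a --run-script-before-required script makes Duplicati skip the run (the exact exit-code conventions are documented in Duplicati’s run-script examples, so check those before relying on this):

```shell
#!/bin/sh
# Sketch of a pre-run script for --run-script-before-required.
# Assumption: a non-zero exit status here makes Duplicati skip this run;
# check Duplicati's run-script examples for the exact exit-code meanings.

# in_window HOUR -> status 0 if HOUR (0-23) is inside the backup window
in_window() {
    [ "$1" -ge 6 ] && [ "$1" -lt 21 ]
}

# The actual script would end with something like:
#   in_window "$(date +%H)" && exit 0 || exit 1
```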

@kenkendk, could the “Run again every” custom option be used to enter a --retention-policy like string to allow for ranged start times? For example, 0600-1000:1h would mean hourly start times at 6 AM, 7 AM, 8 AM, 9 AM, and 10 AM. Similarly, 0600-1000:1h,1400-1600:30m would add 30 min. start times between 2 PM and 4 PM.

Note that I see these as JUST start times, they won’t cancel a job if it’s running past the end of the time range. (Though I suppose there’s no reason it couldn’t do that…)

So again assuming Monday through Friday runs you’d want 5 versions, one from each day, right? Assuming your definition of “last week” is a rolling “the last 7 days” then I think you’d want something like --retention-policy=7d:1d (meaning for the last 7 days, keep only 1 copy per day - I THINK the retention process keeps the newest version in a time period by default).
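As a hedged sketch, a fuller policy combining the 7-day rule with longer-term thinning might look like this on the command line (the source path and destination URL are placeholders, not a recommendation):

```shell
# Hypothetical invocation; source path and destination are placeholders.
# 7D:1D  -> for the last 7 days, keep one version per day
# 4W:1W  -> for the last 4 weeks, keep one version per week
# 12M:1M -> for the last 12 months, keep one version per month
duplicati-cli backup "file:///mnt/backup-target" /home/user/data \
    --retention-policy="7D:1D,4W:1W,12M:1M"
```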


Hopefully I’ll find some time on the weekend to do some tests with --run-script-before-required.
(Linux newbie: I have to learn about scripts, checking weekdays, times, …)
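As a starting point for those tests, here is a sketch that also checks the weekday so weekend runs can be less frequent. Assumptions: `date +%u` gives the ISO weekday (1=Monday … 7=Sunday), a non-zero exit status makes Duplicati skip the run, and the weekend hours chosen (08, 12, 16, 20) are just an example:

```shell
#!/bin/sh
# Sketch: allow hourly runs on weekdays, fewer runs on the weekend.
# Assumption: a non-zero exit status here makes Duplicati skip this run.

# should_run DAY HOUR -> status 0 (run) or 1 (skip)
should_run() {
    day=$1                 # 1=Mon ... 7=Sun (as from `date +%u`)
    hour=${2#0}            # strip a leading zero from e.g. "08"
    if [ "$day" -le 5 ]; then
        # Weekdays: hourly between 06:00 and 20:59
        [ "$hour" -ge 6 ] && [ "$hour" -lt 21 ]
    else
        # Weekend: only four runs per day (example hours)
        case "$hour" in
            8|12|16|20) return 0 ;;
            *)          return 1 ;;
        esac
    fi
}

# The actual script would end with something like:
#   should_run "$(date +%u)" "$(date +%H)" || exit 1
```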

Even your thread about the “missing files” error after switching to a custom retention policy confirms my thoughts about having multiple jobs (one per day) for the same data.

=> I’m sure that we need something like what I described in the first post.

Another example:
One job on Monday and Tuesday running multiple times a day, saving to Cloud1.
One job on Wednesday and Thursday running multiple times a day, saving to Cloud2.
One job on Friday and the weekend running multiple times a day, saving to Cloud3.

PLUS a good retention scheme, like keeping just the last backup of each day (applied the week after the backup), giving:

  • multiple backup versions from Monday and Tuesday in the current week
  • reduced one week later to the latest versions of Monday and Tuesday
  • later on reduced to the latest version of Tuesday

The cause of trouble with the backend will not always be Duplicati itself.
Many issues may be out of our control.