Multiple schedules in a fixed time frame


There is no option for multiple schedules; the only way is to schedule it every X number of hours. I want to run the backups every, say, 2 hours during the day and freeze for the night while there is no activity.

Or is there a way to schedule enabling/disabling the Duplicati service?

The scheduler is fairly simple, so there is no way to set it up like you describe.

One workaround that I can think of would be to use an external scheduler that has the features you want and then have it signal “start backup”.

To get that, there is a script here that can pause/resume:

Lines 32 and 33 look like this:

def resume(self):
    return self.fetch('api/v1/serverstate/resume', post=True, data=b'')

You can make it run the backup by adding something like this:

def run(self, id):
    return self.fetch('api/v1/backup/' + str(id) + '/run', post=True, data=b'')

Alternatively, you can set the backup to run each 2 hours, and then call the pause/resume script from an external scheduler.
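To sketch that idea, here is a minimal example of the kind of check an external scheduler wrapper could use. The 08:00–20:00 active window is an assumption for illustration, not anything Duplicati defines:

```python
from datetime import datetime, time

# Assumed active window: backups allowed between 08:00 and 20:00.
ACTIVE_START = time(8, 0)
ACTIVE_END = time(20, 0)

def backups_allowed(now):
    """Return True if the given datetime falls inside the active window."""
    return ACTIVE_START <= now.time() < ACTIVE_END

# An external scheduler (cron, Windows Task Scheduler, etc.) could run this
# check periodically and call the pause/resume script accordingly:
# resume when backups_allowed(...) becomes True, pause when it becomes False.
print(backups_allowed(datetime(2024, 1, 1, 14, 0)))  # inside the window: True
```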

Not that I have any ability to make this happen, but I’m curious whether this is still a feature people are interested in and how its use is envisioned.

For example, is it machine responsiveness people want? Network bandwidth conservation? Etc…

  • a simple “run from 9P to 6A then interrupt the backup even if not done” (aka don’t run when I’m likely to be using the computer)
  • run for 2 hours every other hour (aka let other tasks have chance)
  • run when machine “not in use” (aka don’t run when I’m at the keyboard)
  • run when network “not in use” (aka don’t run when I’m watching Netflix)
  • let two (or more) jobs flip back and forth rather than make one job wait until another is fully completed (aka faux multitasking)
  • etc.

Note that I don’t know how doable it is with the current scheduler system, so a fairly big rewrite would likely be needed. Until that possibly happens, it sounds like some Windows users have had luck with the SysInternals pssuspend tool:

I’ve taken the code that was provided, added this functionality, and tested it, so I thought I’d share it in case it helps others. My thanks to the original author and @kenkendk.

#!/usr/bin/env python3

import requests
import urllib.parse
import sys

class DuplicatiServer(object):
    def __init__(self, base_url):
        self.base_url = base_url
        self.cookiejar = requests.cookies.RequestsCookieJar()
        self.headers = dict()

    def fetch(self, path, post=False, data=None):
        for attempt in range(2): # we may get a "Missing XSRF token" error on the first attempt
            if post:
                r = requests.post(self.base_url + path, cookies=self.cookiejar, headers=self.headers, data=data)
            else:
                r = requests.get(self.base_url + path, cookies=self.cookiejar, headers=self.headers, data=data)
            self.headers['X-XSRF-Token'] = urllib.parse.unquote(r.cookies['xsrf-token'])
            if r.status_code != 400:
                break
        return r

    def pause(self):
        return self.fetch('api/v1/serverstate/pause', post=True, data=b'')

    def resume(self):
        return self.fetch('api/v1/serverstate/resume', post=True, data=b'')
    def run(self, id):
        return self.fetch('api/v1/backup/' + str(id) + '/run', post=True, data=b'')

if __name__ == '__main__':
    ds = DuplicatiServer("http://localhost:8200/")
    r = None
    try:
        cmd = sys.argv[1]
    except IndexError:
        cmd = None
    if cmd == 'pause':
        r = ds.pause()
    elif cmd == 'resume':
        r = ds.resume()
    elif cmd == 'run':
        r = ds.run(sys.argv[2])
    else:
        print("Syntax: %s [pause|resume|run id]" % sys.argv[0])
        print("Where id = the number of the backup task to run")
    if r and r.status_code != 200:
        print("Something went wrong -- %d %s" % (r.status_code, r.reason))

Hey JonMikeIV…this is an old topic, but I would be interested in this feature. I’m using a machine where it would be beneficial to have multiple backups during the day while it is in use, but after a certain time they wouldn’t be needed. Even if it were just twice a day, say at noon and at 6.

Right now I have the backups set to run every 6 hours, which gets the job done, but it also means I have 4 backups a day when only 2 are needed.
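One way to get exactly two runs a day without a second job would be a small wrapper that an OS scheduler fires every hour, but that only triggers the backup at chosen hours. A rough sketch, where the noon/18:00 hours are assumptions to match the example above:

```python
from datetime import datetime

# Assumed desired run times: noon and 18:00.
RUN_HOURS = frozenset({12, 18})

def should_trigger(now, run_hours=RUN_HOURS):
    """True when the current hour is one of the desired run times."""
    return now.hour in run_hours

# Scheduled hourly, the wrapper would only start the backup at those hours,
# e.g. by invoking the "run" script command with the job id when this is True.
print(should_trigger(datetime(2024, 1, 1, 18, 5)))  # 18:xx, so True
```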

I could set up two separate backups to run at those times, but that would require twice the backup storage space (which maybe I need to do anyway for redundancy).

Thanks for letting us know you’d find this useful.

I don’t know whether any progress has been made on it so far (we’re mostly focused on performance and bug squashing right now), but I do know there’s a “Custom” selection in the “Run again every” field - but I have no idea how that works. :blush:

It’s possible somebody else like @Pectojin, @kees-z, or @renestach might have some thoughts as to whether or not it could be used to do what you want.