Changed-files in Duplicati server mode

There is an in-depth discussion from 2017-2018 on implementing this, and from it I got a fairly good grasp of how to go about implementing a watcher for my needs:


However, I cannot find any documentation or instructions on how to pass this information to Duplicati when it is running in server mode (on Linux).
There is a changed-files field available in the advanced job options, but how exactly does that work? The post above references environment variables; are these used on Linux as well, or am I supposed to pass the list to it - in which case… how?

This feature sounds extremely valuable for systems where filesystem IO and bandwidth are limited - or is there already another built-in mechanism for this?

For reference, I’m currently on version 2.0.5.1_beta_2020-01-18

The feature is currently only directly accessible from the CLI.
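
For illustration, a minimal sketch of what that could look like from the CLI (the destination URL, source path and file lists below are placeholders):

duplicati-cli backup "file:///mnt/backup-target" /home/user/data \
  --changed-files="/home/user/data/123.txt:/home/user/data/abc.zip" \
  --deleted-files="/home/user/data/xyz.rst"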

If you are running the server with the WebUI, there is no great way to patch in the two lists. The idea was that there would be a monitor built into the server, so that it could maintain these lists while running and then pass them once a backup started.

You can request that a backup starts using an HTTP request (there is a Python script that helps with this), but you cannot pass any arguments to it.
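
As a rough sketch of such a trigger with curl (the /api/v1/backup/<id>/run endpoint and the default port 8200 are assumptions on my part; the Python helper takes care of authentication and the exact paths):

# Hypothetical sketch: queue backup job 1 on a local server.
# The endpoint path is an assumption - check the helper script for the real calls.
curl -s -X POST "http://localhost:8200/api/v1/backup/1/run"

Note that this only queues the job; it cannot carry any per-run options.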

As mentioned in the post you reference, @JonMikelV suggests making a script and using that with --run-script-before. Such a script would just need to emit the options:

#!/bin/bash
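# Any line printed to stdout that starts with -- is treated as an option
# for the backup run; multiple paths are joined with ":" on Linux.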
echo "--changed-files=123.txt:abc.zip"
echo "--deleted-files=xyz.rst"

You can get more information from the example script files.

If whatever watcher you have can be queried to obtain these lists, it should be trivial to get it working.
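
For example, a crude sketch of such a watcher (assuming inotify-tools is installed; the source directory and list file are placeholders) could collect changes into a list file:

#!/bin/bash
# watcher.sh - sketch: record every file written or created under the source
# directory, one path per line (deletions are omitted for brevity).
inotifywait -m -r -e close_write,create --format '%w%f' /home/user/data \
  >> /var/tmp/changed-files.list

The --run-script-before script would then emit the collected list and reset it for the next run:

#!/bin/bash
# Sketch: join the unique collected paths with ":" as --changed-files expects,
# then truncate the list file for the next backup.
echo "--changed-files=$(sort -u /var/tmp/changed-files.list | paste -sd: -)"
: > /var/tmp/changed-files.list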

--use-background-io-priority might be another way to reduce system impact, though I hope Duplicati doesn’t starve.

This option instructs the operating system to set the current process to the lowest IO priority level, which can make operations run slower but will interfere less with other operations running at the same time.
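
If you want to try it, it is just another boolean option; a sketch of a CLI invocation (destination and source paths are placeholders):

duplicati-cli backup "file:///mnt/backup-target" /home/user/data \
  --use-background-io-priority=true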

If most of your load is walking the filesystem looking for changes (without opening files), then moving to a filesystem watcher might help. Ultimately though, the files need to be opened, examined and backed up…

Duplicati also performs maintenance operations, such as compacting files at the backend, that need some I/O.

I was afraid that was the case with the web server; I will check out the samples when I have some more time. And you’re correct in assuming it will be trivial to get the backups going, but I’d love to manage this from the web interface, where I can do restores as well.

Indeed, the system I’m working with is IO-limited more than anything else, so I want to reduce access operations as much as possible while keeping backups as frequent as I can.