CPU load is too high

Great point. Scanning for changes, packaging changed data into volumes, and uploading those volumes can run in parallel. The initial backup has an easy time finding work, while later backups spend more of their time looking for it, so scanning the filesystem alone presumably produces less CPU load than scanning plus all the other tasks.

usn-policy can sometimes eliminate filesystem scanning on Windows by reading the NTFS USN change journal instead.

Duplicati is quite concurrent, as described in Channel Pipeline, and is intended to boost performance by using more CPU cores. Core counts keep increasing because individual cores have hit their limits.
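To make the parallelism concrete, here is a minimal toy sketch of a three-stage scan → package → upload pipeline built on queues. It is only an illustration of the idea, not Duplicati's actual code; the stage names and the `volume(...)` marker are invented for the example.

```python
import queue
import threading

def run_pipeline(files):
    """Toy pipeline: scan -> package -> upload, each stage its own thread."""
    to_package = queue.Queue()
    to_upload = queue.Queue()
    uploaded = []

    def scanner():
        for f in files:             # stage 1: find work (changed files)
            to_package.put(f)
        to_package.put(None)        # sentinel: no more work

    def packager():
        while (f := to_package.get()) is not None:
            to_upload.put(f"volume({f})")   # stage 2: pack into a volume
        to_upload.put(None)

    def uploader():
        while (v := to_upload.get()) is not None:
            uploaded.append(v)      # stage 3: "upload" the volume

    threads = [threading.Thread(target=t) for t in (scanner, packager, uploader)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return uploaded

print(run_pipeline(["a.txt", "b.txt"]))  # ['volume(a.txt)', 'volume(b.txt)']
```

Because each stage runs on its own thread, a slow scan no longer blocks packaging or uploading of work already found, which is the point of the channel design.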

Duplicati 2 work began in 2012 with a less concurrent design; a 2018 revision increased the concurrency.

thread-priority can make Duplicati yield the CPU to other processes.

use-background-io-priority does the same yielding for disk I/O.

throttle-upload can slow uploading.

asynchronous-concurrent-upload-limit can slow things similarly by limiting how many uploads run at once.
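Conceptually, upload throttling works by pacing sends so the average rate stays near a target, which also trims the CPU spent on compression and encryption feeding the uploader. A hedged sketch of that pacing idea in Python (names invented; this is not Duplicati's implementation):

```python
import time

def throttled_send(chunks, bytes_per_sec):
    """Pace 'uploads' so the average rate stays near bytes_per_sec."""
    sent = 0
    start = time.monotonic()
    for chunk in chunks:
        sent += len(chunk)
        # Sleep until the elapsed time matches the target average rate.
        due = start + sent / bytes_per_sec
        delay = due - time.monotonic()
        if delay > 0:
            time.sleep(delay)
    return sent

# Four 1 KiB chunks at a 4 KiB/s target take about one second in total.
total = throttled_send([b"x" * 1024] * 4, 4096)
print(total)  # 4096
```

The same average-rate bookkeeping works whether the limiter sits per connection or across all concurrent uploads.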

Usually users ask for more performance, and CPU load doesn't come up as often. However, for those who would prefer Duplicati to run less perceptibly, there are ways to get that.

Battle plan for migrating to .Net8 (now in test) improves many things, and one of them is performance.

I don’t know if anyone has run comparisons yet against the current .NET Framework and mono versions.
