CPU load is too high

As noted in the support forum, the program generates an unacceptable CPU load, usually close to 10% here on Win 11 Pro, and sometimes higher. No program should do this for more than a few seconds. The problem is not merely abstract: I notice a real slowdown across the board. I strongly recommend revising the program to deliver better performance with a smaller burden. Due to the CPU drain, I view Duplicati as currently unusable.

Knowing more about your hardware would be helpful: CPU (cores, threads), RAM, disk type, and network throughput of both the client and the backup target system.
I personally have no issues with Duplicati running in the background.

I am accessing NAS over Wi-Fi.

BaseBoard Manufacturer ASUSTeK COMPUTER INC.
BaseBoard SerialNumber M207NXCV00F0MTMB
BIOS Manufacturer ASUSTeK COMPUTER INC.
BIOS Name B9400CEAV.311
BIOS SerialNumber M2NXCV10H352089
Processor AddressWidth 64
Processor Caption Intel64 Family 6 Model 140 Stepping 1
Processor Name 11th Gen Intel(R) Core(TM) i7-1185G7 @ 3.00GHz

So which NAS are you using? Does it have an NVMe or SATA SSD? Which Wi-Fi band are you using (2.4 or 5 GHz) and how fast is it? When you copy a large file from your PC to the NAS, what speed (MB/sec) does Windows Explorer report? Does your PC have an SSD or a hard disk installed?

Thank you for the help. I recently stopped using Duplicati due to the CPU load, so I do not have all of the answers. I have been using QNAP and TrueNAS. The TrueNAS OS is on SSD, with main data on HDD. I do not recall which Wi-Fi band was being used. My desktop and laptop computers run from SSD.

Best wishes for success.

Duplicati has a deduplication engine which requires CPU time. This may be viewed as a negative, but the benefit is extremely efficient storage of multiple versions on the back end.

If you want something that uses nearly zero CPU time, it is probably not going to have source-side deduplication or compression capabilities.

Depends on what is important to you. It is a trade-off.

Thank you for the reply. My opinion is that programs can and should use better approaches to managing their CPU loads!

Duplicati does have ways to adjust the aggressiveness (concurrency) of the deduplication and compression so it uses less CPU time. If your goal is to have it be nearly zero, then Duplicati is almost certainly not for you.
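For example, a backup can be dialed down with the concurrency options (a sketch with Unix-style line continuations; duplicati-cli is the cross-platform CLI name, the option names come from Duplicati's advanced options, and the values are illustrative only):

duplicati-cli backup <target-url> <source-path> \
  --concurrency-max-threads=2 \
  --concurrency-block-hashers=1 \
  --concurrency-compressors=1

Lower thread counts trade backup speed for a lighter, steadier load.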

If you are able to find one that does source-side deduplication and compression with near zero CPU load, I'd be interested in hearing about it. You could post to this thread as it may be of interest to other people looking for that type of solution.

Kopia does dedupe and compression. It uses about 1% of my CPU during the backups. This is acceptable in terms of usability.

Nice, that is much better than I've ever seen with backup products that do client-side dedupe and compression. Thanks for sharing.

BTW, I found this relevant open issue on GitHub: High CPU usage while backing up · Issue #2563 · duplicati/duplicati · GitHub

Now, my personal biased opinion:
I frankly fail to see how 10% CPU usage is an issue in itself, even on a modern, fast CPU.

For fairness' sake, if the program is not using 100% CPU, that means something else is wasting its time waiting, and in Duplicati's case that is usually a local disk or network access bottleneck.

Other than what's mentioned in the "Issue" link above, I am not aware of an option to throttle Duplicati's CPU usage; however, some changes might help:

  1. Use a larger Block Size setting (see the sketch after this list).
  2. Use a lighter hashing algorithm. This choice very much depends on your CPU's capabilities.
  3. Use a lower compression setting, or disable compression outright.
  4. Throttle the backup storage's network transfer speed (if the remote storage offers such an option).
  5. Use a program like Prio, and set Duplicati's process priority to the lowest level possible (from Task Manager). Prio saves the priority and applies it the next time the same program starts. Disclaimer: Prio worked well on Windows 7; I'm not confident about it on later versions of Windows.
  6. Use a remote database server. I never tried this, but I have read that it is possible to use a remote SQL server.
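To illustrate items 1 to 3 above, an invocation might look like this (a sketch with Unix-style line continuations; the option names are Duplicati advanced options, the values are illustrative only, and note that the block size can only be chosen when a backup is first created):

duplicati-cli backup <target-url> <source-path> \
  --blocksize=1MB \
  --block-hash-algorithm=MD5 \
  --zip-compression-level=1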

It's worth noting that different parts of the backup operation can exhibit different CPU usage levels.

I hope this helps.

Great point. Looking for changes to back up, packaging them, and uploading can all run in parallel. An initial backup has an easy time finding work; later backups spend more time looking for it. Presumably scanning the filesystem alone carries less CPU load than scanning plus all the other tasks.

usn-policy can sometimes eliminate scanning on Windows.

Duplicati is quite concurrent, as described in Channel Pipeline, and is intended to boost performance by using more CPU cores. Core counts keep increasing because individual cores have hit their limits.

Work on Duplicati 2 began in 2012 and was less concurrent; a 2018 revision increased concurrency:

thread-priority can make Duplicati yield to other CPU users.

use-background-io-priority can do the same yielding for I/O.

throttle-upload can slow uploading.

asynchronous-concurrent-upload-limit can slow things similarly by limiting parallel uploads.
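Put together, a deliberately low-impact run using the options above might look like this (a sketch with Unix-style line continuations; duplicati-cli is the cross-platform CLI name and the values are illustrative only, so check your version's documentation for accepted values):

duplicati-cli backup <target-url> <source-path> \
  --usn-policy=auto \
  --thread-priority=lowest \
  --use-background-io-priority=true \
  --throttle-upload=1MB \
  --asynchronous-concurrent-upload-limit=1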

Usually users ask for more performance, and CPU load tends not to come up as much. For those who would prefer Duplicati to run less perceptibly, however, there are ways to have that.

Battle plan for migrating to .Net8 (in test) improves many things, and one of them is performance.

I don't know if anyone has run comparisons yet to the current .NET Framework and mono version.

I can happily announce that the next canary build includes an "intensity level" option that allows you to toggle how much pressure Duplicati puts on the system. There are also quite a few performance-positive changes in the canary builds.

The intensity level is set with the option:

--cpu-intensity=<level>

Where level is an integer from 1 to 10, and 10 means unlimited.
The implementation is based on reducing the number of blocks processed per second, so despite the name, it will also reduce I/O and memory bandwidth load as a by-product.

You can set this in the UI under "Settings" → "Advanced options" so it is applied to all backups.
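From the command line it can also be applied per run, something like this (a sketch; duplicati-cli is the cross-platform CLI name and the arguments are placeholders):

duplicati-cli backup <target-url> <source-path> --cpu-intensity=3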

There will be a post detailing this later, but for now there are some measurements in the PR:

Thanks for the revisions and update.