I’m using Duplicati 2.0.5.1_beta_2020-01-18 running as a service on Windows Server 2012 R2, backing up to Google Drive. The server has 16GB of RAM and floats around 40% memory usage and 12% CPU. The backup covers a very large number (roughly 150k) of smallish (<10MB) files.
When a backup to Google Drive is running, Duplicati’s memory consumption looks odd: its working set is only about 32MB, while its commit size is around 1.0GB.
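In case it helps anyone reproduce the measurement, here’s a rough Python sketch of how I’m reading those numbers (psutil is a third-party package, and the process-name match is an assumption about the install - adjust it to whatever your Duplicati service process is called):

```python
# Sketch: compare working set vs. commit charge for the Duplicati process.
# Assumes psutil is installed and the process name contains "duplicati".
import psutil

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] and "duplicati" in proc.info["name"].lower():
        mem = proc.memory_info()
        # On Windows, psutil exposes wset (physical RAM in the working set)
        # and pagefile (commit charge backed by the pagefile).
        print(f"PID {proc.pid}: working set {mem.wset / 2**20:.0f} MB, "
              f"commit {mem.pagefile / 2**20:.0f} MB")
```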
Duplicati is also triggering a lot of page faults - I’ve seen values of over 100 - and as a result it thrashes the pagefile horrendously, making it a real choke point for I/O across the whole server; end users are starting to complain about poor file-serving performance. (Unfortunately, the pagefile and the data volume sit on the same physical disk/RAID array.)
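Extending the same idea, a small loop can sample the cumulative fault counter once per second so you can see whether the fault rate lines up with backup activity (again, the process-name match is an assumption; stop the loop with Ctrl+C):

```python
# Sketch: print the per-second page-fault delta for the Duplicati process.
import time
import psutil

def find_duplicati():
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] and "duplicati" in proc.info["name"].lower():
            return proc
    raise RuntimeError("Duplicati process not found")

proc = find_duplicati()
prev = proc.memory_info().num_page_faults
while True:
    time.sleep(1)
    faults = proc.memory_info().num_page_faults
    # num_page_faults counts both soft and hard faults; only hard faults
    # actually hit the pagefile/disk, so treat this as an upper bound.
    print(f"page faults/sec: {faults - prev}")
    prev = faults
```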
I can’t understand why Duplicati isn’t using more physical RAM, or what is triggering the page faults. I’ve already set some options to try to calm it down (a command-line equivalent follows the list):
backup-test-percentage=0
backup-test-samples=0
disable-file-scanner=true
snapshot-policy=off
use-background-io-policy=true
tempdir=(a directory on a separate USB3.0 connected drive)
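For completeness, this is roughly how that set of options would look on a one-off command-line run. The storage URL, source path, and tempdir below are placeholders rather than my real values (the actual tempdir points at the USB3.0 drive mentioned above):

```
Duplicati.CommandLine.exe backup "googledrive://Backups?authid=..." "D:\Shares" ^
  --backup-test-percentage=0 ^
  --backup-test-samples=0 ^
  --disable-file-scanner=true ^
  --snapshot-policy=off ^
  --use-background-io-policy=true ^
  --tempdir="X:\DuplicatiTemp"
```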
Any thoughts on how I can stop it thrashing the pagefile and clogging up the file-system I/O?