Windows: High numbers of page faults and pagefile thrashing

That’s how I understand it as well. So I don’t really understand why there’s so much paging going on if Standby is really “free”.

I wouldn’t think it’d matter, but I can try testing.

My current assumption of the trigger circumstances is:

  • Duplicati created a SQLite DB with a very large number of entries (c. 9,000,000): small blocks, many files, large total backup
  • The backup kept failing because it exceeded Google Drive’s opaque API call limits
  • When resuming the backup, Duplicati had to quickly thrash through the SQLite DB checking for incomplete blocks (identified by an epoch date/time stamp), without being throttled by processing file payloads or uploading to Google Drive
  • SQLite / Duplicati seems unable to claim actual memory beyond the initial 32 MB
  • The lack of working memory is causing SQLite (?) to thrash the pagefile
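For what it’s worth, that apparent 32 MB ceiling could be checked against what the SQLite page cache is actually configured to use. A minimal sketch (the path to a real Duplicati DB is a placeholder assumption; I’m only using Python’s stdlib `sqlite3` here):

```python
import sqlite3

def cache_ceiling_bytes(db_path: str) -> int:
    """Read page_size and cache_size from a DB and compute the
    implied page-cache ceiling in bytes."""
    conn = sqlite3.connect(db_path)
    try:
        page_size = conn.execute("PRAGMA page_size").fetchone()[0]
        cache_size = conn.execute("PRAGMA cache_size").fetchone()[0]
    finally:
        conn.close()
    if cache_size < 0:
        # Negative cache_size means "limit the cache to -N KiB",
        # independent of the page size.
        return -cache_size * 1024
    return cache_size * page_size

# Against an in-memory DB; a real Duplicati .sqlite path would go here.
print(cache_ceiling_bytes(":memory:"))
```

If the number that comes out is close to 32 MB, that would point at a configured cache limit rather than anything the OS is imposing.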

Potential causes for this apparent hard limit on RAM:

  • Duplicati running as a Service
  • Duplicati backup job marked with Background I/O priority

ETA to finish the current job is Saturday evening UTC. After that I’ll uninstall Duplicati as a service, re-run it as a normal process, start an identical backup job (but to a different Google Drive instance), and then trigger some sort of failure after about 30–40 GB of files.

I won’t argue too hard, since I’m not an expert, but maybe I can clarify my previous note:

From the above link:

change the suggested maximum number of database disk pages that SQLite will hold in memory

The Page Cache

Because main-memory is a limited resource, the page cache cannot be allowed to grow indefinitely.

This fixed memory configuration is why I’m not sure SQLite can be expected to grow as its load grows.
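Note that the cache limit is a per-connection suggestion, not an OS-level cap, so in principle it can be raised with `PRAGMA cache_size` — whether Duplicati exposes a way to do that is a separate question. A quick sketch of the behaviour:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Default suggestion: a fixed-size page cache (a negative value means KiB).
print(conn.execute("PRAGMA cache_size").fetchone()[0])

# Raise the suggestion to ~256 MiB for this connection only. SQLite still
# won't exceed it, which is why the cache doesn't grow with load on its own.
conn.execute("PRAGMA cache_size = -262144")
print(conn.execute("PRAGMA cache_size").fetchone()[0])
conn.close()
```

The second print shows `-262144`, confirming the new per-connection ceiling took effect.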

was suggested as one way to hint at whether the big commit is (somehow) coming from SQLite or from the .NET Framework.

I’m not a Windows memory expert either, but I’d have expected hard faults to grow process working set, especially when there’s free memory available. So I’ll just drop hints and wish luck to the experiments…

SetProcessWorkingSetSize function (winbase.h) and SetProcessWorkingSetSizeEx function (memoryapi.h) show a low-level way to control the working set maximum, but there are some big working sets in the screenshot. I’m not aware of all the UI options, but the deprecated Windows System Resource Manager could do it.
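For experimenting, those APIs can be reached from Python via `ctypes` without writing any C. A hedged sketch (the 64/512 MiB values are arbitrary illustrations, not recommendations, and the call is Windows-only):

```python
import ctypes
import sys

# Flag from winnt.h: treat the maximum as a soft limit rather than a hard cap.
QUOTA_LIMITS_HARDWS_MAX_DISABLE = 0x8

def mib(n: int) -> int:
    """Convert MiB to bytes."""
    return n * 1024 * 1024

def raise_working_set_max(min_mib: int, max_mib: int) -> bool:
    """Ask Windows to allow the current process a larger working set.
    Returns False on non-Windows platforms or on API failure."""
    if sys.platform != "win32":
        return False  # SetProcessWorkingSetSizeEx is a Windows API
    kernel32 = ctypes.windll.kernel32
    handle = kernel32.GetCurrentProcess()
    return bool(kernel32.SetProcessWorkingSetSizeEx(
        handle,
        ctypes.c_size_t(mib(min_mib)),
        ctypes.c_size_t(mib(max_mib)),
        QUOTA_LIMITS_HARDWS_MAX_DISABLE,
    ))

print(raise_working_set_max(64, 512))
```

If this has no effect on the observed 32,768 KB ceiling, that would suggest the limit isn’t a working-set quota at all.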

Thanks, I’ll try to use those tools when I dig in a bit deeper. I’ll be happier once I know I’ve got one complete off-site backup somewhere, and then I can start tinkering.

Update: The backup completed at 01:00 this morning. I’ve uninstalled Duplicati as a Windows Service and tried running it as a user process; the working set is again stuck hard at 32,768 KB, and the pagefile thrashing is still there. So I’ll put it back as a Service and rule that out as a cause.