"Unexpected difference in fileset" on every backup

I guess we’ll see how the current direction goes. The ideal is to get a reproducible test case, then let the developers put it under a microscope with heavy logging, possibly debuggers attached to the program, and so on.

After lots of chasing, the last big case of this was fortunately reproducible enough for me to narrow it down:

“Unexpected difference in fileset” test case and code clue #3800

but that scenario was fixed in mid-2019. Things are much better now, though maybe there’s another case.

One thing you could look at is whether this one also involves a compact, e.g. does checking no-auto-compact postpone the error until you manually use the Compact now button? You can also use the Verify files button to run the same verification that’s failing in the automatic run. If you’re lucky enough to get the usual job log, it breaks down the individual pieces of a run (such as the compact), but the log may be skipped on a failed backup. If you’d rather script that check, see the sketch below.
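Here’s a minimal sketch of running the same verification from Duplicati’s command line via its `test` verb, which downloads and checks sample files the way the post-backup verification does. The storage URL, database path, and sample count below are hypothetical placeholders for your own job’s values:

```python
import subprocess

# Hedged sketch: run Duplicati's command-line "test" verb, roughly what the
# automatic post-backup verification and the Verify files button perform.
# All paths and URLs here are hypothetical placeholders, not from this thread.
result = subprocess.run(
    [
        "Duplicati.CommandLine.exe",
        "test",
        "file://D:/backup-destination",          # hypothetical destination URL
        "1",                                     # number of sample sets to test
        "--dbpath=D:/Duplicati/EXAMPLE.sqlite",  # hypothetical local DB path
    ],
    capture_output=True,
    text=True,
)
print(result.stdout or result.stderr)
```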

To add to the fine thoughts from @Xavron (thanks!), Duplicati benefits from a larger blocksize on larger backups: the default 100 KB blocksize creates too many blocks, which slows database operations.
The previous rule of thumb was to aim for about 1 million blocks per backup, but more recent testing here suggests going a bit higher.
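To make the arithmetic concrete, here’s a small sketch; the 500 GB source size is a made-up example, not a figure from this thread:

```python
# Block count = source size / blocksize; the default 100 KB blocksize
# produces millions of blocks on a multi-hundred-GB backup.
source_bytes = 500 * 1024**3  # hypothetical 500 GB of source data
for blocksize_kb in (100, 500, 1000):
    blocks = source_bytes / (blocksize_kb * 1024)
    print(f"{blocksize_kb:>4} KB blocksize -> {blocks / 1e6:.1f} million blocks")
```

One caveat: blocksize can only be chosen before a job’s first backup; Duplicati doesn’t support changing it on an existing backup.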