Hello,
I’m using Duplicati 2.0.5.1 beta on Windows 10, running 18 backups (9 sources, each to two different destinations: local and remote). Duplicati runs as a service.
I set the “usn-policy” option to “required” in the global settings; however, I recently noticed that Duplicati takes a significant amount of time to scan some of the sources even when there has been no change to the files. When I checked the databases of the affected backups, I noticed that the ChangeJournalData table is empty.
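For reference, this is essentially the check I ran against the local databases (a minimal, self-contained sketch: an in-memory stand-in is used here, and the table schema is a placeholder; only the table name comes from the real per-backup .sqlite files):

```python
import sqlite3

# In-memory stand-in for one of the affected per-backup databases.
# The column definition is illustrative, not Duplicati's actual schema.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE ChangeJournalData (id INTEGER PRIMARY KEY)")

# The same COUNT query, run against the real databases, returned 0
# for the affected backups.
count = con.execute("SELECT COUNT(*) FROM ChangeJournalData").fetchone()[0]
print("ChangeJournalData rows:", count)
```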
Deleting and re-creating the database does not help, and adding the “usn-policy” option to the individual backup settings does not improve the situation either.
Interestingly, in at least one case one backup of a file set (the local one) does not use USN while the second (the remote one) does. These are exactly the same files, so the problem does not seem to be on the Windows side. Additionally, all the sources are on the same filesystem; they are simply different directories.
Finally, my understanding is that setting “usn-policy” to “required” should cause a backup to abort if USN is not available, but this does not happen either.
Does anyone have a clue about what is happening? Any suggestions for troubleshooting the issue?
Thanks!
Roberto
P.S.
In the backup logs, I noticed that when Duplicati doesn’t use USN it doesn’t do a full scan of the source files, but it ALWAYS scans the same number of files (e.g. 13065 of 44078 in one case), even if nothing has changed between backups. This is completely inexplicable to me.