I recently had to rebuild the database for one of my backups, and since then the backups are taking a lot longer than they should. I’ve looked into it, and Duplicati seems to be ‘opening’ files even though they haven’t changed in a long time. It’s ‘opening’ 106 of the 551 files in the backup set, even after I ran the backup and then immediately ran it again without any changes being made to the files being backed up.
If I use the live log viewer with ‘profile’ selected, I see entries like:
Assuming that those two timestamps listed at the end are Duplicati’s ‘remembered’ timestamp, and the actual timestamp of the file (they certainly match the latter), you can see that they’re identical. However, it is still considering them different.
I’m running 2.0.5.1_beta_2020-01-18, which I believe is the latest beta.
That’s exactly right, from what I recall. After a database recreate the timestamps have low precision, and Duplicati notices that they don’t match the higher-precision timestamps it reads from disk, so it forces the files to be reprocessed.
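To illustrate the mechanism (this is a hypothetical sketch, not Duplicati’s actual code): if the recreated database only stores second precision while the filesystem reports sub-second precision, an exact-equality comparison flags the file as changed even though it was never touched.

```python
from datetime import datetime, timezone

def file_changed(stored_ts: datetime, disk_ts: datetime) -> bool:
    # Exact comparison: any precision loss in the stored value makes this True.
    return stored_ts != disk_ts

# Timestamp as reported by the filesystem, with microsecond precision.
disk_ts = datetime(2020, 1, 18, 12, 30, 45, 123456, tzinfo=timezone.utc)

# Same instant as it might look after a low-precision database recreate.
stored_ts = disk_ts.replace(microsecond=0)

print(file_changed(stored_ts, disk_ts))  # True: sub-second mismatch forces a rescan
print(file_changed(disk_ts, disk_ts))    # False: identical timestamps, file skipped
```

The two values describe the same moment to the nearest second, but the strict inequality treats them as different, which matches the “identical-looking timestamps still considered different” behaviour above.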
Interesting that subsequent backups still do it. I haven’t seen that myself. Let me try some testing…
Ok, I was able to reproduce your problem (subsequent backups keep doing the full file scan). I think this is caused by Duplicati discarding the backup version if nothing changes (no file additions or deletions and no file content changes). Metadata-only changes apparently aren’t enough for Duplicati to keep the most recent backup version.
If you add or remove a single file, or edit a file’s contents, that is enough for Duplicati to keep the most recent backup, and the full file rescan stops on subsequent backups.
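The retention behaviour described above can be sketched as follows (again hypothetical, not Duplicati’s actual code): a backup version is kept only when files were added, deleted, or had content changes, so a metadata-only run is discarded and the corrected high-precision timestamps never persist, triggering the same rescan next time.

```python
def keep_backup_version(added: int, deleted: int,
                        content_changed: int, metadata_changed: int) -> bool:
    # Metadata-only changes deliberately do not count toward keeping
    # the version, per the behaviour observed in testing above.
    return added > 0 or deleted > 0 or content_changed > 0

print(keep_backup_version(0, 0, 0, 106))  # False: metadata-only run is discarded
print(keep_backup_version(1, 0, 0, 0))    # True: a single new file keeps the version
```

This is why touching even one file breaks the cycle: the kept version records the updated timestamps, and later backups compare against full-precision values.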
I think Duplicati should retain the most recent backup even if it’s only metadata that has changed.