I’m backing up a small NAS (around 2 TB, with a lot of small files) with Duplicati.
Even when nothing in the files has significantly changed, Duplicati reports around 30 “changed files” in the log. The combined size of these files seems to stay about the same, around 27 GB, slightly increasing over time.
This wouldn’t be a problem for a local backup, but since I sync the backup to an offsite location afterwards, I have to upload 27 useless gigabytes per day.
Running the compare command reports 0 changed files, so I can’t even tell which files actually changed.
Has anyone else run into this issue? Is there a fix for it?
Let’s clarify where some of those numbers come from, and then perhaps look more closely using extra logging.
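For the extra logging, one approach (option names here are from Duplicati 2’s advanced options; please verify them against your version) is to run the backup job with a file log at verbose level, which records per-file decisions about what was considered changed:

```
--log-file=/path/to/duplicati-verbose.log
--log-file-log-level=verbose
```

Grepping that log after a run should show which source paths Duplicati examined and why it treated them as modified.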
Do you mean “Modified files” in the backup job log’s summary of source files and their changes? If not, what do you mean?
Or do you mean “SizeOfModifiedFiles” in the Complete log section below the job log summary?
That’s the total size of the modified files (even if they changed only slightly), not the size of the modifications within those files.
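To illustrate the distinction: with block-based deduplication, a tiny edit to a large file counts the whole file toward the “modified files” size, while only the changed blocks need to be uploaded. A minimal sketch (the 1 KiB block size is just for the demo; Duplicati’s actual `--blocksize` is configurable and larger by default):

```python
import hashlib

BLOCK = 1024  # demo block size, not Duplicati's real default

def block_hashes(data, block=BLOCK):
    """Hash fixed-size blocks, the way a dedup engine splits a file."""
    return [hashlib.sha256(data[i:i + block]).hexdigest()
            for i in range(0, len(data), block)]

old = bytes(8 * BLOCK)        # an 8 KiB "file" of zeros
new = bytearray(old)
new[3000] = 0xFF              # change a single byte

changed = sum(a != b for a, b in
              zip(block_hashes(old), block_hashes(bytes(new))))

print(len(new))               # the full file size is counted as "modified": 8192
print(changed * BLOCK)        # but only one block actually differs: 1024
```

So a large “SizeOfModifiedFiles” is compatible with a small upload, which is why the BackendStatistics numbers matter.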
Is this confirmed by looking at the newer destination files, for example after sorting them by modification time?
Also look in the Complete log under BackendStatistics for stats like BytesUploaded and FilesUploaded.
You should be able to get the paths of any “candidate” files that Duplicati considered changed enough to need rereading.
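If the verbose log doesn’t make the candidates obvious, you can narrow them down yourself by snapshotting size and mtime between runs, since those are the usual triggers for a file being re-examined. A minimal sketch (not a Duplicati feature, just a helper script):

```python
import os

def snapshot(root):
    """Map each file path under root to (size, mtime in ns)."""
    snap = {}
    for dirpath, _, names in os.walk(root):
        for name in names:
            path = os.path.join(dirpath, name)
            st = os.stat(path)
            snap[path] = (st.st_size, st.st_mtime_ns)
    return snap

def changed_paths(old, new):
    """Paths added, removed, or with a different size/mtime."""
    return sorted(p for p in old.keys() | new.keys()
                  if old.get(p) != new.get(p))
```

Run `snapshot()` on the source tree before and after a day’s changes and feed both results to `changed_paths()`; if some application is touching timestamps without changing content, those ~30 files should show up here too.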