I’ve been unlucky in that while my data has been increasing, my backups have failed a few times due to power outages, BSODs, etc. So each attempt to back up takes longer, because there are more previously uploaded files to scan through, which increases the chance that something will interrupt the backup and put me back to square one.
Since I know that most of these files haven’t changed since they were first uploaded, what’s the advanced option to turn on so I can skip reading all of them on every attempt? Once I’ve managed to complete a backup again, I’ll switch it back.
I don’t believe you can turn that off completely, but I think there’s a parameter that will use timestamps only to determine what to back up.
Unfortunately, I don’t recall the name of the parameter and am not finding it in search, so maybe I’m misremembering… though I do recall people used it to improve performance.
Looks like it’s --check-filetime-only. I’ll give it a go if and when this backup fails again.
Yep, that’s what I was thinking of. Here’s some detail for others that might be reading:
This flag instructs Duplicati to not look at metadata or file size when deciding whether to scan a file for changes; only the timestamp is checked. Use this option if you have a large number of files and notice that scanning takes a long time with unmodified files.
Default value: “false”
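For anyone driving backups from the command line, it can be passed like any other advanced option. A hedged sketch — the target URL and source path below are placeholders, not details from this thread:

```shell
# Temporarily decide "changed vs. unchanged" by timestamp only,
# skipping the metadata/size comparison that slows large scans.
# Target URL and source path are illustrative placeholders.
duplicati-cli backup "file:///mnt/backup-target" "/home/user/data" \
  --check-filetime-only=true
```

Once a backup completes, drop the flag (or set it to false, the default) so the fuller metadata and size checks resume on subsequent runs.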