I'm searching for a cross-platform backup tool. Because it offers encryption, Duplicati is a candidate!
May I ask how Duplicati ensures the integrity of files when backing them up? Are checksums used when reading and writing the files to be backed up?
Welcome to the forum @sisyphosloughs
This is a two-part question, because the files at the backup destination are very different from the source files.
How the backup process works
Source files have a hash computed at backup time, and that hash is verified at restore time by default, but
skip-restore-verification can turn the verification off for speed, although this seems like a less safe plan.
--skip-restore-verification = false
After restoring files, the file hashes of all restored files are checked to verify that the restore was successful.
Use this option to disable the check and avoid waiting for the verification.
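The idea behind that verification can be sketched in a few lines of Python. This is a conceptual illustration, not Duplicati's actual code; the file contents are made up, and the only assumption carried over from Duplicati is that SHA-256 is the default hash algorithm:

```python
import hashlib

def sha256_of(data: bytes) -> str:
    """Hex SHA-256 digest (Duplicati's default file-hash algorithm)."""
    return hashlib.sha256(data).hexdigest()

# At backup time: record a hash of each source file's content.
source_content = b"important document"
recorded_hash = sha256_of(source_content)

# At restore time (unless verification is skipped): re-hash the
# restored bytes and compare with the recorded value.
restored_content = b"important document"
assert sha256_of(restored_content) == recorded_hash  # restore verified
```

If the restored bytes differed from the source in any way, the digests would not match and the restore would be flagged as failed.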
Files for the destination have a hash computed at creation time, prior to upload. Verifying the hash after an upload isn't done by default, because it might require downloading the file that was just uploaded. Instead, a listing of the destination is done after the backup, to check that every expected file is present at its expected size. A sample test is done too:
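A minimal sketch of that post-backup listing check, using hypothetical destination file names (the `duplicati-*.dblock.zip.aes` naming only mimics the general pattern) and made-up sizes: expected names and sizes are compared against what the remote listing reports, without downloading anything.

```python
# What the backup believes should exist at the destination.
expected = {"duplicati-b1.dblock.zip.aes": 52428800,
            "duplicati-i1.dindex.zip.aes": 4096}

# What a file listing of the destination actually reports.
listed = {"duplicati-b1.dblock.zip.aes": 52428800,
          "duplicati-i1.dindex.zip.aes": 4096}

# A missing file or a wrong size is a sign of destination trouble.
missing = expected.keys() - listed.keys()
wrong_size = {name for name in expected.keys() & listed.keys()
              if expected[name] != listed[name]}
assert not missing and not wrong_size  # listing matches expectations
```

Note that this check catches missing or truncated files, but not silent corruption inside a file of the right size; that is what the download-and-verify sampling below is for.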
Verifying backend files
To do heavier sampling of destination files, one can use backup-test-samples or backup-test-percentage.
The TEST command explains what a sample is (usually 3 files, and a given file is not tested twice in a run).
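The sample-selection idea can be sketched like this. The file names are hypothetical, and this is only an illustration of "pick N distinct destination files at random", not Duplicati's actual selection logic:

```python
import random

# Hypothetical list of files at the backup destination.
remote_files = [f"duplicati-b{i}.dblock.zip.aes" for i in range(10)]

# random.sample picks k distinct items, so no file is tested
# twice in the same run; the default sample size is 3 files.
samples = random.sample(remote_files, k=3)
assert len(set(samples)) == 3  # three distinct files chosen
```

Raising backup-test-samples (or using backup-test-percentage) simply increases `k`, at the cost of more download traffic per backup run.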
Regardless of checksums, it’s good practice to test restore occasionally, including disaster recovery use.
@ts678 thanks for the detailed explanation!