@T-6 is correct that it’s automatic. The manual says more, and note that it’s not in the background:
Verifying backend files
At the end of each backup job, Duplicati checks the integrity by downloading a few files from the backend. The contents of these files are checked against what Duplicati expects them to be. This procedure increases the reliability of the backup files, but backups take a bit longer to complete and use some download bandwidth.
backup-test-samples defaults to 1, but you can raise it far higher with that option or with this one (example commands follow the option text below):
--backup-test-percentage (Integer): The percentage of samples to test after
a backup
After a backup is completed, some (dblock, dindex, dlist) files from the
remote backend are selected for verification. Use this option to specify
the percentage (between 0 and 100) of files to test. If the
backup-test-samples option is also provided, the number of samples tested
is the maximum implied by the two options. If the no-backend-verification
option is provided, no remote files are verified.
* default value: 0
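As a rough sketch of using these on the command line (the storage URL and source path are placeholders, and you would also add your job's usual options such as the passphrase), it might look like:

Duplicati.CommandLine.exe backup <storage-URL> "<source-path>" --backup-test-samples=10
Duplicati.CommandLine.exe backup <storage-URL> "<source-path>" --backup-test-percentage=20

If both options are given, the larger implied sample count wins, as the description above says.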
The TEST command explains what a sample is – typically 3 files in 1 sample set, but it may be fewer.
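You can also run a bigger spot check yourself with TEST. A hedged sketch (storage URL is a placeholder, and I believe --full-remote-verification makes it look inside the files rather than just checking their hashes):

Duplicati.CommandLine.exe test <storage-URL> 10
Duplicati.CommandLine.exe test <storage-URL> 10 --full-remote-verification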
There is also a complete file listing on every backup to check remote file names and sizes, which in theory will catch most damage on the remote without having to download anything to look.
A fine but important point is that this neither performs nor replaces file restore tests. It checks that the backup files look the way the local database says they should, and it also self-checks the local database.
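If you want an occasional restore test of your own, a hedged sketch (paths are placeholders, and if I recall the option correctly, --restore-path redirects the output so you don't overwrite the originals):

Duplicati.CommandLine.exe restore <storage-URL> "<filename-or-pattern>" --restore-path="<empty-test-folder>"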
This is a legitimate concern for any space-efficient backup format that does things like saving only changes rather than entire files. Loss of a single remote file may affect many source files and versions. The AFFECTED command shows which versions and files are impacted. In contrast, keeping many separate full copies of each file takes more space, but it makes it even less likely that damage to a single version of a single file will cause wider damage.
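For example, to see what a damaged or missing destination file would touch (the remote file name is a placeholder):

Duplicati.CommandLine.exe affected <storage-URL> <remote-filename>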
Duplicati’s database keeps records not only of the destination files but also of the source files that were backed up, and when a restore is done, the final step is to verify that the right file content was restored (you’ll be told if it wasn’t).
So the database checks for things that may go wrong, but the question is what happens if things go wrong with the database itself, which can sometimes happen. Generally issues are either repairable with various tools, or the database can be recreated from the destination files. Fixing a database that’s gone wrong can vary in its pain level.
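A hedged sketch of the recreate route (placeholders again): move the old database aside, then run REPAIR, which rebuilds it from the destination when no database is found. If I recall correctly, --dbpath points at the job's database location:

Duplicati.CommandLine.exe repair <storage-URL> --dbpath="<path-to-job-database.sqlite>"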
So from a reliability point of view, especially if you will keep only one backup copy (which is dangerous), and even more so if you delete the original after making it remote (something I never suggest), the copy-the-file-lots-of-times approach probably wins, provided there’s some check that the file made it. Duplicati checks that the specialized files it uploads list correctly; it’s not simply “assumed”.
So these are some of the tradeoffs, and we haven’t even started on other things you may want in backup. Ultimately of course, it’s your business needs that need to be met, and your choice on how to meet them.