[Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-20200105T000000Z.dlist.zip

The dlist files are not ordinarily used for a restore, because their information (along with the dindex info) is kept in the job's local database. If that database is damaged or its drive fails, the dlist file is what's needed to recover that version.

Figuring out what caused this is probably worthwhile, but you'll need more detail than the summary gives.

If you see an interesting-looking error message, try clicking on it; sometimes it will expand to show more detail.

The "Verifying backend files" step is the testing that was referred to.

At the end of each backup job, Duplicati checks the integrity by downloading a few files from the backend. The contents of these files are checked against what Duplicati expects them to be.

--backup-test-samples

--backup-test-samples = 1
After a backup is completed, some files are selected for verification on the remote backend. Use this option to change how many.
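If you want each backup run to verify more than one sample set, you could raise that value, for example by adding this as an advanced option on the job (the value 3 here is just an illustration):

--backup-test-samples=3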

The TEST command gives a more technical description of it:

Verifies integrity of a backup. A random sample of dlist, dindex, dblock files is downloaded, decrypted and the content is checked against recorded size values and data hashes. <samples> specifies the number of samples to be tested. If “all” is specified, all files in the backup will be tested. This is a rolling check, i.e. when executed another time different samples are verified than in the first run. A sample consists of 1 dlist, 1 dindex, 1 dblock.

So by default there will often be three files tested, but they wouldn't all be dlist files (and not the same dlist each time).
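If you want to run this check by hand, the TEST command in Duplicati.CommandLine can do it. A rough sketch, where the storage URL, sample count, and database path are placeholders you'd replace with your own job's settings (the job's Export As Command-line output shows the correct URL and options for your destination):

Duplicati.CommandLine.exe test "file://D:\Duplicati Backup" 5 --dbpath="C:\path\to\MYJOB.sqlite"

Passing "all" instead of a number tests every remote file, which can mean a lot of downloading on a large backup.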

Are these completely independent backups, meaning two Duplicati jobs, one to the SATA drive and one to Dropbox? Generally I'd expect a SATA drive to be quite reliable; network destinations are more likely to suffer file damage.

Another odd thing: the dlist file name comes from the backup's start time, so seeing a duplicati-20200105T000000Z.dlist.zip error on both would mean both backups started at exactly the same second, and that doesn't happen. Or is this some sort of sync to Dropbox?