Duplicati will attempt to use data from source files to minimize the amount of downloaded data. Use this option to skip this optimization and only use remote data.
Good practices for well-maintained backups suggests using no-local-blocks for test restores, but the way to have spotted the corrupted encrypted file would have been the TEST command, which is similar to the default backend file verification except that it lets you set the number of sample sets to test and can also unzip the files to check their contents.
Duplicati always checks the destination for file presence and length, but internal damage is harder to spot because it typically requires actually downloading the files, which can be slow and expensive.
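For reference, a TEST run from the command line might look like the sketch below. The storage URL, passphrase, and sample count are placeholders; check `help test` in your Duplicati version for the exact syntax.

```shell
# Test 5 sample sets; --full-remote-verification downloads the files
# and inspects their contents instead of checking only name and size.
Duplicati.CommandLine.exe test "ssh://example.com/backup" 5 \
    --full-remote-verification=true \
    --passphrase="your-passphrase"

# Or pass "all" to test every remote volume:
Duplicati.CommandLine.exe test "ssh://example.com/backup" all
```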
backup-test-samples defaults to 1, so by default this is a very light sampling of a large backup.
backup-test-percentage was added to allow better coverage of such backups.
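As an illustration, both options can be set on a command-line backup run. The option names are real Duplicati options, but the URL, source path, and values here are only examples:

```shell
# Illustrative values: sample 3 sets after each backup, and use a
# percentage so coverage scales with the size of the backup.
Duplicati.CommandLine.exe backup "ssh://example.com/backup" /home/user \
    --backup-test-samples=3 \
    --backup-test-percentage=10
```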
Backup Test block selection logic explains how testing of the destination files is tracked.
I’m not sure if there’s an option to make sure everything is tested at least once.
upload-verification-file can be used for an OS-readable destination, such as yours.
You would run utility-scripts/DuplicatiVerify.py to verify all of the destination files.
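In spirit, that kind of verification is a size and hash comparison against what the database recorded. Below is a minimal sketch, not the actual DuplicatiVerify.py: it assumes a list of `{"Name", "Size", "Hash"}` entries (base64 SHA-256), which is similar to, but not guaranteed identical to, what the real verification file contains.

```python
import base64
import hashlib
import os


def sha256_b64(path):
    """Return the base64-encoded SHA-256 digest of a file."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return base64.b64encode(h.digest()).decode("ascii")


def verify(dest_dir, entries):
    """Check each listed file in dest_dir against its recorded size and hash.

    `entries` is an assumed format: a list of dicts with "Name", "Size",
    and "Hash" keys. Returns a list of (name, problem) tuples; an empty
    list means everything matched.
    """
    problems = []
    for e in entries:
        path = os.path.join(dest_dir, e["Name"])
        if not os.path.exists(path):
            problems.append((e["Name"], "missing"))
        elif os.path.getsize(path) != e["Size"]:
            problems.append((e["Name"], "wrong size"))
        elif sha256_b64(path) != e["Hash"]:
            problems.append((e["Name"], "hash mismatch"))
    return problems
```

The point of the sketch is that this check only trusts the recorded hashes: it will catch a remote file that was truncated or bit-flipped after upload, but not a file that was already wrong when the hash was recorded.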
Note that this (like many tests that avoid downloads) verifies against what is recorded in the local database.