Failed: Invalid header marker

Does 250 MB match the remote volume size you had configured back then? (The default is 50 MB.)

On those bad files, you've clearly got large runs of NUL (00) characters where the header ought to be. Scanning the rest of each file might find further runs, which would be very unexpected in an encrypted file.
Instead of piping the hex dump to head, try piping it to grep '00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00'
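
For example, something like this should work, assuming the dump comes from od with single-space-separated hex bytes (the filename below is a placeholder for one of your bad dblock files):

```
# Placeholder filename -- substitute one of the actual bad dblock files.
# od -An -tx1 prints 16 hex bytes per line, separated by single spaces,
# so this pattern matches any 16-byte stretch that is entirely NUL.
od -An -tx1 duplicati-xxxx.dblock.zip.aes \
  | grep -c '00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00'
```

grep -c just counts the matching lines; drop the -c if you want to see them instead.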

Adding and checking the Advanced option --upload-verification-file will upload a duplicati-verification.json file to the destination.
How to verify duplicati-verification.json? shows how to do a more thorough verification of the expected SHA-256 hash and file length against what's actually on the destination. If you don't have Python, you can test manually by finding the filenames (at least the three known bad ones) in that file, then running sha256sum on them.
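
A manual spot check might look roughly like this (filename is a placeholder; also note that, if I recall correctly, the Hash values in duplicati-verification.json are Base64-encoded SHA-256 rather than hex, so a digest piped through base64 is easier to compare directly):

```
# Placeholder filename -- substitute a file listed in duplicati-verification.json.
# Length check: compare against the Size field in the JSON.
stat -c %s duplicati-xxxx.dblock.zip.aes
# Hash check: the JSON stores (I believe) Base64-encoded SHA-256,
# so produce the digest in the same encoding for a direct comparison.
openssl dgst -sha256 -binary duplicati-xxxx.dblock.zip.aes | base64
# Plain hex output, if you'd rather convert the JSON value instead:
sha256sum duplicati-xxxx.dblock.zip.aes
```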

Try a good file too, to check that the procedure is working right. The idea is that a mismatch suggests the problem happened during upload or storage, while a match suggests the file was already bad at an earlier step. Running the verification script will do a better job of testing this same theory, so hopefully you do have Python.

To find out which source files from the backup a broken dblock file affects, you can use the affected command, probably most easily run from the GUI's Commandline screen by replacing the source paths with those dblock filenames.
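
From a terminal, the standalone CLI version would look roughly like this (the storage URL, filenames, and database path below are placeholders for your own values):

```
# Sketch only -- substitute your real storage URL, the bad dblock names,
# and your local database path. On Windows, use Duplicati.CommandLine.exe.
duplicati-cli affected "ssh://example.com/backup?auth-username=user" \
  duplicati-xxxx.dblock.zip.aes \
  --dbpath=/path/to/backup-database.sqlite
```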