> Failed: Invalid header marker

Although it was asked lightly earlier, let me ask directly: is anybody using any sort of speed throttle on upload or download, e.g. via --throttle-upload, --throttle-download, or the GUI throttle button at the top of the page?
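
For reference, each throttle option takes a speed value, where I believe 0 (the default) means unthrottled. On the command line the usage would look roughly like this, with the URL, path, and values being placeholders:

```
Duplicati.CommandLine.exe backup <target-url> <source-folder> --throttle-upload=100KB --throttle-download=0
```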

Upload throttle corrupts backup, especially OneDrive. Analyzed, with code proposed. #3787
(the OneDrive mention in the title just means it was the worst of the several backends tested; there could easily be more affected)

Determining the nature of the corruption would help, but that may involve looking inside the SQLite databases (not so difficult, but it probably needs DB Browser for SQLite or similar installed). A simpler test is to check the size of the corrupted file and convert it to hexadecimal: if the number looks suspiciously round, that might point to a filesystem error rather than bad data from Duplicati.

Setting --upload-verification-file makes Duplicati write a text file, duplicati-verification.json, that describes the expected remote files. Running DuplicatiVerify.* from the Duplicati utility-scripts folder can then check that the destination files look as expected, assuming you can either get that sort of access to the destination or are willing to make the files more accessible. If a complete download is required anyway, you might as well just run the test command over all; that should do a hash check of all remote files. You can also do your own simple manual check of a particular file by looking it up in duplicati-verification.json, to at least see whether the actual size is as expected.
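
If you can reach the destination files (or a downloaded copy of them) from a machine with Python, that kind of manual spot-check can also be scripted. Below is a minimal sketch; note that the manifest field names ("Files" entries with "Name", "Size", and a Base64 SHA-256 "Hash") are my assumption based on what the bundled DuplicatiVerify.py reads, so compare against the script in your install's utility-scripts folder before relying on it.

```python
#!/usr/bin/env python3
"""Spot-check destination files against duplicati-verification.json.

Sketch only: the manifest layout ("Files" entries with "Name", "Size",
and a Base64 SHA-256 "Hash") is assumed from what DuplicatiVerify.py
reads, and may differ between Duplicati versions.
"""
import base64
import hashlib
import json
import sys
from pathlib import Path


def spot_check(dest: Path, verify_hashes: bool = False) -> None:
    manifest = json.loads((dest / "duplicati-verification.json").read_text())
    for entry in manifest["Files"]:
        name, want_size = entry["Name"], entry["Size"]
        local = dest / name
        if not local.exists():
            print(f"MISSING {name}")
            continue
        got_size = local.stat().st_size
        if got_size != want_size:
            # Show the actual size in hex as well: a suspiciously round
            # value can point at filesystem-level truncation.
            print(f"SIZE    {name}: expected {want_size}, got {got_size} ({got_size:#x})")
            continue
        if verify_hashes:
            h = hashlib.sha256()
            with local.open("rb") as f:
                for chunk in iter(lambda: f.read(1 << 20), b""):
                    h.update(chunk)
            if base64.b64encode(h.digest()).decode() != entry["Hash"]:
                print(f"HASH    {name}: size matches but contents differ")


if __name__ == "__main__":
    spot_check(Path(sys.argv[1]), verify_hashes="--hash" in sys.argv[2:])
```

Something like `python3 spot_check.py /path/to/destination-copy --hash` would then check both sizes and hashes; leaving off --hash compares sizes only, which is a much faster first pass (the script name here is just for illustration).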

Stepping back to more ordinary steps: logs are good for seeing what led up to a problem, but for problems with older files (how old is the bad file, based on remote info?), you can't get details retroactively. Sometimes one gets lucky and finds relevant info in the default job logs or in About → Show log → Stored. Beyond that, use --log-file with --log-file-log-level=retry (a reasonable compromise, but other levels may also do).
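
As a concrete shape for that (the log path here is a placeholder; in the GUI the same two options can be added as advanced options on the job):

```
Duplicati.CommandLine.exe backup <target-url> <source-folder> --log-file=/path/to/duplicati.log --log-file-log-level=retry
```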