"Invalid header marker" cryptographic error when trying to delete unwanted files

This file isn’t mentioned earlier. Is it part of the history, or newly found as corrupted?
A test of all samples should find the same corruption, but it would probably take longer…

As an aside, if checks are slow even server-side, it would probably be possible
to script a check of only the not-yet-checked files. Once written, they don’t change…
Or if one does change after having checked good once, that would be significant.
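If it helps, Duplicati’s upload-verification-file option drops a duplicati-verification.json next to the backup files, and the project ships utility scripts that verify against it. Below is a rough sketch of the “only check new files” idea built on that file. The JSON field names (Name, Hash, Size) and the base64 SHA-256 encoding are my assumptions from memory, so inspect an actual verification file and adjust:

```python
#!/usr/bin/env python3
"""Rough sketch: verify only not-yet-checked backup files server-side.

Assumes Duplicati's --upload-verification-file option has placed
duplicati-verification.json next to the backup files, and that each
entry carries Name, Size, and a base64 SHA-256 Hash (assumed field
names -- check your actual file). A local state file remembers which
names already verified good, so each run hashes only new files.
"""
import base64, hashlib, json, os, sys

BACKUP_DIR = sys.argv[1] if len(sys.argv) > 1 else "."
# Kept outside the backup folder so Duplicati doesn't see an extra file.
STATE_FILE = os.path.expanduser("~/verified-ok.txt")  # hypothetical name

def sha256_b64(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            h.update(chunk)
    return base64.b64encode(h.digest()).decode("ascii")

def main():
    with open(os.path.join(BACKUP_DIR, "duplicati-verification.json")) as f:
        entries = json.load(f)  # assumed: list of {"Name":..., "Hash":..., "Size":...}
    done = set()
    if os.path.exists(STATE_FILE):
        done = set(open(STATE_FILE).read().split())
    bad = 0
    with open(STATE_FILE, "a") as state:
        for e in entries:
            name, want = e["Name"], e["Hash"]
            if name in done:
                continue  # passed on a prior run; premise is files don't change
            path = os.path.join(BACKUP_DIR, name)
            if not os.path.exists(path):
                print("MISSING", name); bad += 1; continue
            if sha256_b64(path) != want:
                print("HASH MISMATCH", name); bad += 1
            else:
                state.write(name + "\n")  # remember, so next run skips it
    print("done,", bad, "problem file(s)")

if __name__ == "__main__":
    main()
```

Since passed files are skipped on later runs, this leans on the “written once, never changes” premise; an occasional full pass would catch the significant case where a previously-good file changed.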

If it’s a new find, I guess the prior tests apply: examine the timestamp, DB, file data, etc.
Now that you have a file that the affected command sees, you can test list-broken-files.
The intent was to see if there are cases where only one of the two tests catches a problem.

I don’t use Nextcloud, but from my research it seems like it supports numerous protocols for connecting out to secondary storage (where Nextcloud acts as the client), but only supports WebDAV for programs to access it. Correct?
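If WebDAV is indeed the program-facing path, one cheap test is to pull the suspect dblock through that same path and hash it, then compare against what a direct server-side copy or Duplicati’s records say it should be. A rough sketch using Nextcloud’s documented remote.php/dav layout; the host, user, app password, and file name are all placeholders:

```python
#!/usr/bin/env python3
"""Rough sketch: fetch one backup file over Nextcloud's WebDAV
endpoint and print its SHA-256, for comparison against a copy taken
some other way. Host, user, password, and file name are placeholders."""
import base64, hashlib
import requests  # pip install requests

HOST = "https://cloud.example.com"                   # placeholder
USER = "backupuser"                                  # placeholder
APP_PASSWORD = "app-password-here"                   # placeholder
REMOTE_FILE = "Backups/duplicati-xxxx.dblock.zip.aes"  # placeholder name

url = f"{HOST}/remote.php/dav/files/{USER}/{REMOTE_FILE}"
h = hashlib.sha256()
with requests.get(url, auth=(USER, APP_PASSWORD), stream=True) as r:
    r.raise_for_status()
    for chunk in r.iter_content(1024 * 1024):  # hash without storing the file
        h.update(chunk)
print("sha256 (hex):   ", h.hexdigest())
print("sha256 (base64):", base64.b64encode(h.digest()).decode())
```

If the hash over WebDAV differs from a hash taken directly on the server’s filesystem, that would point at the transfer path rather than the storage.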

Do you have the use-ssl option set on your Duplicati Destination? If not, enabling it might help avoid undetected transmission errors (I’m not certain), while leaving it unset does make monitoring packet data possible.

Large data captures need either a lot of storage or quick detection of an issue, so capture can be stopped for a look. Chasing down rare problems is not an easy job; I’m in the middle of that with Linux CIFS bugs right now.

Is the Nextcloud server Ethernet-connected all the way to Duplicati? WiFi or Internet add more suspects.
You can get some low-level stats with netstat -s or ifconfig, but interpreting them is not very simple.
Still, errors at lower levels seem unlikely to be able to turn a whole dblock file into NULs. That’s quite odd.
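If you want to confirm the bad dblock really is NULs end-to-end (rather than, say, a damaged header with intact data after it), here’s a quick sketch; pass it the suspect file’s path:

```python
#!/usr/bin/env python3
"""Quick sketch: report whether a file is entirely NUL bytes,
or where the first non-NUL byte appears."""
import sys

path = sys.argv[1]
offset = 0
with open(path, "rb") as f:
    for chunk in iter(lambda: f.read(1024 * 1024), b""):
        stripped = chunk.lstrip(b"\x00")
        if stripped:
            # Everything before the first non-NUL is \x00, so the first
            # occurrence of this byte value is the first non-NUL position.
            first = offset + chunk.index(stripped[0:1])
            print(f"first non-NUL byte at offset {first}")
            break
        offset += len(chunk)
    else:
        print(f"all {offset} bytes are NUL")
```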

There are only a few places to get information on how things are going, so might as well hit what we can.
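Along those lines, the counters behind netstat -s come from /proc/net/snmp on Linux, so a script can snapshot a few TCP error indicators before and after a backup run. A minimal sketch (Linux-only, and which counters matter is still the hard part):

```python
#!/usr/bin/env python3
"""Minimal sketch: print a few TCP error counters that `netstat -s`
summarizes, read straight from Linux's /proc/net/snmp.
Linux-only; counter availability varies by kernel."""

def read_snmp(path="/proc/net/snmp"):
    stats = {}
    with open(path) as f:
        lines = f.read().splitlines()
    # The file comes in pairs: a header line of counter names,
    # then a line of values, both prefixed with the protocol name.
    for head, vals in zip(lines[::2], lines[1::2]):
        proto = head.split(":")[0]
        names = head.split()[1:]
        values = vals.split()[1:]
        stats[proto] = dict(zip(names, (int(v) for v in values)))
    return stats

if __name__ == "__main__":
    tcp = read_snmp().get("Tcp", {})
    for key in ("RetransSegs", "InErrs", "InCsumErrors", "EstabResets"):
        print(key, tcp.get(key, "n/a"))
```

If something like InCsumErrors or RetransSegs climbs noticeably during a backup, the network becomes a stronger suspect; flat counters push suspicion elsewhere, though as said above, interpreting these is not simple.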