Occasional "HashMismatchException" errors

Lately I have been getting occasional errors with one of my backup jobs. It just backs up my local D: drive to the NAS next to me, every 30 minutes, so most of the time it finishes within seconds. But about once a day I get a pop-up “[Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-id558a946141545b6861b981e1c49d559.dindex.zip.aes”
I looked into the log files I save and found a possible reason:

[Retry-Duplicati.Library.Main.BackendManager-RetryGet]: Operation Get with file duplicati-ia087833b572f4071beb1921a7e67c95d.dindex.zip.aes attempt 5 of 5 failed with message: Incorrect hash value for file “C:\Users<myuser>\AppData\Local\Temp\dup-0ac9daf2-c11e-4de9-902e-f6975c99ff9d”, stored hash: M2AB7Gxx+7i0ZAHkvy9NCMM6OIoYqwsh4Hp+bodx17s=, current hash: 9Job3cosun/UY4pZjlx6HOrWS/8BnqO+ckV2T6OuTs0=
Duplicati.Library.Main.BackendManager+HashMismatchException: Incorrect hash value for file “C:\Users<myuser>\AppData\Local\Temp\dup-0ac9daf2-c11e-4de9-902e-f6975c99ff9d”, stored hash: M2AB7Gxx+7i0ZAHkvy9NCMM6OIoYqwsh4Hp+bodx17s=, current hash: 9Job3cosun/UY4pZjlx6HOrWS/8BnqO+ckV2T6OuTs0=

So it complains about a mismatch between a stored and an actual hash value.

The odd thing is that there are no such files in C:\Users<myuser>\AppData\Local\Temp - at least not after the job has completed.
If I manually trigger the job again, it completes with no errors.

Should I worry?

Hello

By default, after each backup Duplicati verifies a few files picked at random; that is why you see downloads during a backup. Index files are part of a backup and are zipped/encrypted like the other files. When Duplicati reads these files, it puts them in a temp directory, decrypts and unzips them, and verifies the hash (signature) stored in its local database against the file downloaded from the backend.
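If it helps to picture what that check does, here is a rough Python sketch of the idea (not Duplicati's actual code). It assumes the stored hash is a base64-encoded SHA-256 of the downloaded volume as-is, which is what the values in your log look like.

```python
import base64
import hashlib

def volume_hash(path: str) -> str:
    """Base64-encoded SHA-256 of a downloaded volume file (assumed format)."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            h.update(chunk)
    return base64.b64encode(h.digest()).decode("ascii")

# Values taken from the log above; the temp path is a placeholder.
stored_hash = "M2AB7Gxx+7i0ZAHkvy9NCMM6OIoYqwsh4Hp+bodx17s="
downloaded = r"C:\Users\<myuser>\AppData\Local\Temp\dup-0ac9daf2-c11e-4de9-902e-f6975c99ff9d"

if volume_hash(downloaded) != stored_hash:
    # Conceptually this is where Duplicati reports a HashMismatchException.
    print("hash mismatch")
```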
What happens in your case is that Duplicati has a problem with this check: there is a file it cannot verify. When you run the backup again, that file is not necessarily tested again, since data is sampled randomly after each backup.
The cause is probably not on the backend, because if the backend had corrupted the data, Duplicati could not have decrypted it.

If the problem concerns only this particular file, the likely reason is a hardware error while the file was being created. If that is the case, you can try deleting it and letting Duplicati recreate it (possibly using repair).

If the problematic file names keep changing, you may have a defect in your computer (probably memory). Unfortunately, no backup can fully protect against defective hardware when it is corrupting data.

In any case, you may consider adding an additional backup in the cloud; that is what Duplicati exists for. Two backups on two different media are twice as good as a single backup.

The hash is on the encrypted file. My guess is that its decryption and hashing are tested in parallel.
This is kind of worrisome because I “think” there’s a length check on files. Did file contents go bad?

One way to check everything is The TEST command in GUI Commandline, asking it to test all files.
Verifying backend files after backup is a small sample, typically 1 set of 3 files, so it can miss things.
backup-test-samples and backup-test-percentage can test more, but reliable storage is important…

Is this SMB? That sometimes causes trouble for unknown reasons. One issue is it’s pretty complex.
If it’s now just file-like access from Windows, the good news is there’s a file-oriented verification tool
utility-scripts\DuplicatiVerify.ps1 you can use, after a backup with upload-verification-file.

Thanks for the reference to the command-line options. I ran a “test all”, of which I’ll only post the last lines:

duplicati-i79f96c38158d4f2794b774e8b12a3011.dindex.zip.aes: 935 errors
Extra: +185iT3IzImG+f99k0BSBZU5HhEwlj48joIvwlNr7Wc=
Extra: +2RnZG3E+Xz6AVpXKEUTPwg5F5iniR+JU9dMRlMXipQ=
Extra: +3c/YI9uZLC1H6qIPDD7iHJr96rFnRjUAgdSceOHksY=
Extra: +E8/xP1n55qrSTlgy2s2GXppixatCaLxwVVrNs4pXmM=
Extra: +ES7p/j30TS0cecZPOuZf4Czz7TKXjF0EZVaVbB0uQY=
Extra: +IAA/Dl0F80Aujb1rLxDBFPTAEVnb/G9xJb9HRmW+ps=
Extra: +ZLFvguzB5nCW2xkhVQDrq/jIZJdYK2f1IL1xb1c6/4=
Extra: +fC8K6rmyk8qXCmgXx+AAz6V+UMUA48upVGfDDOUjwM=
Extra: +fwTcdfwJMC3kTpWmjy9UrrxnAprdlcl3iEXCRQJ9p4=
Extra: +u9I/BRfZ+zlRzd0FPe1Utd52av8YV4DdIYl4dzrFr8=
… and 925 more
duplicati-i3b4aedc471504ec797bbd8c9591e7d9e.dindex.zip.aes: 382 errors
Extra: +AQ7LHPXCJwJpTP7pqDMjiIVoQ7eHtts8HByJkN+rwo=
Extra: +CqUAnmDHfgM9ES+mfrsdbb8sJ4KaLbrnlWYdVkXYSA=
Extra: +LUFmmsUgoEFkxGCgJBAz+4eei/uqSk2r3T3+ukJLiA=
Extra: +OsAFn6tiTQeCFt9CbJuqfCj4FRop/GZZuBMw7TkURM=
Extra: +QJN3mW2L4DHQ9kQldzMjX20h4bwiQGq6CXGsqM4chM=
Extra: +lBDnKctn6l6/W+WAgsmOtpFTh+jCE6T8v2Kyv9v7vQ=
Extra: +pMioF2tJ1PKRPvYtXZbLTklsXWlPlVbXdInbIzxpyc=
Extra: +pgjcu7hkhXJXeIQH4Pbs/xPzV82HMOOGPGlygZSfrU=
Extra: +uCKShlJU9oy7WCh1qbUACkI1/Oe8HQLTR0yLQuI+cw=
Extra: +w3h8bB3y6lB782NXFjupzXo23WUxVPvYAK+SdXyR4U=
… and 372 more
duplicati-ie562e530fd0547d99d4524602b1d8628.dindex.zip.aes: 590 errors
Extra: +/r9RbIhLvK7yd5HJmTxaCCcAv8YOR/ScE3totlmnmY=
Extra: +9qiCn2B3OSOMoCH4CHNpNp1hm/umDpxpdx+XnJOiYQ=
Extra: +H+BDaZm0pPneVYh9/C0KPYhebifshbRHbvzna4zKAM=
Extra: +JZiJcnT9pKdHsPOySsMUgIiiNaP1Jj3CX7x3XXzH8E=
Extra: +OETNK9M8a37ByDsskEkHOSSSYiDZDbG2tHQhnpP6eA=
Extra: +S0xM0qaB089J72YsubZsL9gkP+3WLvdTQE2D2agkqk=
Extra: +gJvAO8Lkj55uRfAGMwcsJniNO4n0V+f/oiVqIee5r8=
Extra: +p1R1A/NispgBF5tmsBO5AM0UITfZ2+w9YH7Ts7Hgx4=
Extra: +pAGqTKX1gPeJY+xEbhnhuGM/NWg2UAo3LP7S7+uqPM=
Extra: /Bcx1LW1BAeGK2+NN/My+B6A4/dJU6bX+p8r3oPCh84=
… and 580 more
Return code: 3

So in total it reported errors in 14 index files.
After running a backup with -upload-verification-file, I guess I need handholding on which parameters I should provide to the DuplicatiVerify script. The first thing it asks me for is “FileOrDir:” – maybe more?

Connection to my Synology NAS is via SMB, I think.
The odd thing is that I now see the very same kind of error when running a similarly configured backup to the same share (different folder) from a laptop (also Win10). That also used to run without errors in the past.

Seemingly full-remote-verification was used too, which wasn’t really necessary just to do a file test.
If you really want to see all the output (e.g. to make sure all is “Extra”), I think full-result does that.

Is the original post’s duplicati-id558a946141545b6861b981e1c49d559.dindex.zip.aes readable or not?
Actually, is anything unreadable like that (ignoring Extra, which proves that the file could be read)?

Assuming you’ve told Duplicati that destination is a Local folder or drive, tell the script the same path.
There should be a duplicati-verification.json there now because you turned on upload-verification-file.
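If PowerShell gives you trouble, the same check can be sketched in Python. Treat this as an illustration rather than a substitute for DuplicatiVerify.ps1; the Name/Hash field names and the base64-encoded SHA-256 hashing are my assumptions about the duplicati-verification.json format.

```python
import base64, hashlib, json, os, sys

backend = sys.argv[1]  # the same folder you point Duplicati at as destination

with open(os.path.join(backend, "duplicati-verification.json"), encoding="utf-8") as f:
    entries = json.load(f)  # assumed: a list of {"Name": ..., "Hash": ..., "Size": ...}

for e in entries:
    if not e.get("Hash"):          # skip entries without a recorded hash
        continue
    path = os.path.join(backend, e["Name"])
    if not os.path.exists(path):
        print("missing:", e["Name"])
        continue
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            h.update(chunk)
    if base64.b64encode(h.digest()).decode("ascii") != e["Hash"]:
        print("hash check failed:", e["Name"])
```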

As of today, all runs of this job have ended with errors, most on different index files. The last one reported:

“2023-03-23 15:38:05 +01 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-ied188a07f2e7453c842ee2ff96713fcb.dindex.zip.aes”

I looked up this file in the target: it exists, has a size of 42.6 kB, and looks “reasonably scrambled” in a text editor.

The DuplicatiVerify script only reported errors:



FWIW I still have the option “upload-verification-file” enabled, so every run should upload a (new?) hash file. I also see such a file with a time stamp of the latest run in my target folder.

BTW all my other jobs that back up into a cloud drive are still running without errors.

If these are all TestHandler errors, you should have a job log which would explain the exact failure better.

I’m not sure what that’s saying. It’s encrypted, so should have very little readable beyond initial bytes.
Is your texteditor smart enough to look for balanced square braces? Your duplicati-verification.json is supposed to start with one, e.g. [{"ID":, and the matching right square brace should be at file’s end.

If you’re using Local folder or drive destination type and doing multiple backups, the verification
file can have leftover data at the end if it shrinks. If I dummy up such data, I get a result similar to yours.

Actually, it doesn’t need a fancy text editor as there’s only one level, so see if you can see data past ], comparing it to the text of your error message. If it’s extra, you can edit it off, or delete the file and run another backup.
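A quick way to check for a leftover tail without any special editor, assuming the file is a single top-level JSON array: let a JSON parser tell you where the array ends and see whether anything follows. A small Python sketch (the path is a placeholder for your destination):

```python
import json

# Point this at the duplicati-verification.json on your destination (hypothetical path).
path = r"\\nas\backup\duplicati-verification.json"

with open(path, encoding="utf-8") as f:
    text = f.read().lstrip()

# raw_decode() stops at the end of the first complete JSON value and reports the offset,
# so anything after that offset is leftover from an earlier, longer upload.
obj, end = json.JSONDecoder().raw_decode(text)
tail = text[end:].strip()
if tail:
    print(f"{len(tail)} leftover characters after the closing ]:")
    print(tail[:200])
else:
    print("file ends cleanly at the closing ]")
```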

2.0.6.100 Canary and beyond fix this, but it’s not in any Beta release yet. Are you running latest Beta?

File backend now overwrites files, thanks @warwickmm

This is supposed to work, but the bug fixed in Canary means an overwrite with a shorter file leaves leftovers.

Ah - now there is light: I assumed that each run with “upload-verification-file” would completely overwrite an existing one, and missed that the hash file only makes sense together with the matching block/index/list files. Now that I deleted the previous hash file and created a new one, DuplicatiVerify.ps1 did its job on the matching remote files. As a result it reported (again) 14 errors like

*** Hash check failed for file: duplicati-i<14 different names>dindex.zip.aes

What can I do to fix these broken index files? Delete them and somehow repair?
What might be the reason for such mismatches to occur out of the blue?
Is there a way to find out which files of the backup set may actually be affected, so I can check whether they are corrupt?

BTW

yes I think so: 2.0.6.3_beta_2021-06-17

That was meant tongue in cheek; I didn’t expect to be able to read it, just checked whether it looked “encrypted”. After a closer look it actually starts with “AES … !CREATED_BY … SharpAESCrypt v1.3.3.0”, so it definitely is…

If you’ve never done a database recreate, and it still has needed info, repair will recreate the files.
Because you might not be sure, you could probably move them out of the way instead of deleting.
Before that, you could check whether they’re decryptable by attempting decryption, e.g. with the AES Crypt GUI or the CLI SharpAESCrypt command in your Duplicati folder.

SMB as used by Duplicati is sometimes not reliable; something gets lost between Windows and the NAS. Looking at the file closely can sometimes at least reveal symptoms. For example, corruption is commonly a truncation. Did the tool mention sizes? If not, you can read them in the verification file.
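If you want the sizes quickly, a small sketch like this compares what duplicati-verification.json records against what is actually on the share (again assuming Name/Size fields); an on-disk size smaller than the recorded one would point at truncation.

```python
import json, os, sys

backend = sys.argv[1]  # destination folder that holds duplicati-verification.json

with open(os.path.join(backend, "duplicati-verification.json"), encoding="utf-8") as f:
    entries = json.load(f)

for e in entries:
    if not e.get("Size"):
        continue
    path = os.path.join(backend, e["Name"])
    if os.path.exists(path) and os.path.getsize(path) != e["Size"]:
        print(e["Name"], "recorded", e["Size"], "bytes, on disk", os.path.getsize(path), "bytes")
```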

For whatever reason (possibly a feature omission), The AFFECTED command hasn’t (IIRC) worked successfully for me given a dindex file, but you can try it in GUI Commandline. I think a dblock works.

A more direct way is to look in a copy of the database with DB Browser for SQLite or similar, where a table called IndexBlockLink shows which dblock goes with a dindex. Then you can ask with affected.
The Remotevolume table knows all the destination files, and its ID is what the IndexBlockLink table uses.
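For example, something like this against a copy of the job database lists the dblock behind one of the failing dindex files; the IndexVolumeID/BlockVolumeID column names are my assumption about the IndexBlockLink schema, so check them in DB Browser and adjust if they differ.

```python
import sqlite3

# Work on a COPY of the job database, not the live one (hypothetical path).
db = sqlite3.connect(r"C:\temp\job-copy.sqlite")

# One of the dindex files that failed verification.
dindex_name = "duplicati-ied188a07f2e7453c842ee2ff96713fcb.dindex.zip.aes"

rows = db.execute(
    """
    SELECT blk.Name
    FROM Remotevolume idx
    JOIN IndexBlockLink link ON link.IndexVolumeID = idx.ID
    JOIN Remotevolume blk ON blk.ID = link.BlockVolumeID
    WHERE idx.Name = ?
    """,
    (dindex_name,),
).fetchall()

for (name,) in rows:
    print("dblock behind it:", name)  # this is the name to try with the affected command
```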

So you’ve now done one part of the index file check: you’ve found the header, so the damage is later in the file. Truncation is sometimes to an even binary size, if you want to post sizes or type them into a calculator.

Losing an index file doesn’t directly lose data, but not having them is not good and can slow recreates. The actual data is in the dblock file, and each one normally has a dindex to make it easier to know its content.