So I got this error today during the scheduled backup:
Failed to process file duplicati-ba144e47ab35b40beab74611f9130d454.dblock.zip.aes => Hash mismatch on file "C:\Users\XXXX\AppData\Local\Temp\dup-e63fa3d6-73f5-4ad2-9eac-da04147fe424", recorded hash: jCRDdChnWhvFPTk1TIS6Y1JedePIiPkYflmxaWr4Be0=, actual hash 0EeJe1xLBLLpXdM4TdaAY+9HoTSz9mESETHzRcAvzzU=
It seems that somehow one of my dblock files got corrupted. What should I do now? Is there a way to re-create this corrupted file?
If not, how do I re-create the whole backup? Should I just physically delete all Duplicati files from the destination folder and run the backup as I normally would?
Any suggestions here? My backup is quite big, so redoing it from scratch would be very time-consuming. Is there a way to fix just that one dblock file with the mismatched hash?
Hi @alamos54 and welcome to the forum. I’m sorry about the dblock, but I’ll offer some suggestions. This isn’t something I’m an expert in, so I’ll partly be pointing to things the experts have said, and hope those help you.
Lost dblock files are generally not considered recoverable, since they hold the backed-up source file blocks at the destination. It’s not clear from your post whether this is failing repeatedly, or whether it failed just once (at which point you stopped to ask).
Having a file come down bad would be one thing. Having it go bad on the backend would be another. Please save copies of all files if it’s not too late already, and consider saving the job database, which is where information on what to expect at the destination is kept. The job’s database location can be found in dbpath on the job’s Commandline page.
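For example, if the Commandline page shows a dbpath like the one below (the file name here is made up; job databases get random names), a plain file copy is enough to save it:

C:\>copy "C:\Users\XXXX\AppData\Local\Duplicati\ABCDEFGHIJ.sqlite" "D:\safekeeping\"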
That page might also help if you usually use the web UI, because you can adapt it to run your new command.
The AFFECTED command can show the source file impacts of losing that file. There is also a Disaster Recovery article in which one dblock file is intentionally corrupted and another intentionally deleted, and the damage is then cleaned up.
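If you want to try that command, a sketch would look something like this, where &lt;storage-URL&gt; is your destination in the form shown on the Commandline page (add --dbpath if the right job database isn’t found automatically):

C:\>Duplicati.CommandLine.exe affected &lt;storage-URL&gt; duplicati-ba144e47ab35b40beab74611f9130d454.dblock.zip.aes

That should list the source files and backup versions that depend on blocks in that dblock.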
That article might be a good guide for you. However, the consistency checking sometimes looks only at file presence, not corruption (a full scan for corrupted backend files would currently require downloading all of them), so there is a chance you will have to turn the dblock corruption into a deletion yourself by removing the bad file from the backend.
Here is a post by the expert that gives me further confidence that a delete (if necessary) and purge-broken-files may help, if it actually comes to that point. I’m wondering if you’re up for running file hash checks or binary comparisons.
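If it does come to that point, my understanding of the rough sequence (after saving copies of everything) is: delete the bad dblock from the destination, then preview and purge, something like

C:\>Duplicati.CommandLine.exe list-broken-files &lt;storage-URL&gt;
C:\>Duplicati.CommandLine.exe purge-broken-files &lt;storage-URL&gt;

where list-broken-files shows what purge-broken-files would then remove from the backup records. Treat this as a sketch to verify against the documentation, not a tested recipe.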
If you’re technically inclined, there’s more that could be done to gain a better understanding of this rare issue. Regardless, it would be helpful to hear about your environment, e.g. the Duplicati version and the destination.
I’m having problems using the command line tool on Windows.
For example, when I just want to test my backup, I get this:
C:\Program Files\Duplicati 2>Duplicati.CommandLine.exe test z:\backup_duplicati 1
Enter encryption passphrase: *************
Listing remote folder ...
Extra unknown file: duplicati-20180803T230001Z.dlist.zip.aes
Extra unknown file: duplicati-20180805T024352Z.dlist.zip.aes
Extra unknown file: duplicati-20180805T025523Z.dlist.zip.aes
(...) lots of files here
Extra unknown file: duplicati-ifb7481059e3d48f8b1fd7997016dd325.dindex.zip.aes
Extra unknown file: duplicati-ife0b5332f4b147d0ba59feb1fe5bd5bc.dindex.zip.aes
Found 440 remote files that are not recorded in local storage, please run repair
This works fine when run through the GUI. What should I do to make the command line tool read the local database that the GUI uses?
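I’m guessing I need to point it at the job’s database explicitly, maybe something like this (dbpath value copied from the job’s Commandline page; the file name below is just a placeholder), but I’m not sure that’s the right approach:

C:\Program Files\Duplicati 2>Duplicati.CommandLine.exe test z:\backup_duplicati 1 --dbpath="C:\Users\XXXX\AppData\Local\Duplicati\ABCDEFGHIJ.sqlite"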
I’m not totally sure repair fixed things (partly because the details of the original situation were never fully described).
For example, the default after a backup is to verify an actual file somewhat randomly (like the test you just ran). This is a chance to notice problems early, without having to verify everything every time. If this is where your backup (if it was a backup) saw a bad dblock, the next backup will very likely verify a different file and dblock.
To test that original maybe-bad dblock, you could use the article I gave to see which source file you could actually try to restore; restoring it pulls that dblock down and gives Duplicati the chance to complain again (or not). You could also look at the destination to see if the maybe-bad file is still there with its previous date. If so, it may still be bad.
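A sketch of such a test restore (the source file path here is a stand-in for whatever affected reports, and --no-local-blocks should force a real download rather than rebuilding from local data; please double-check the options against the manual):

C:\>Duplicati.CommandLine.exe restore z:\backup_duplicati "C:\Users\XXXX\Documents\somefile.txt" --restore-path=C:\restore-test --no-local-blocks=true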
Unfortunately, “repair” is not the cure-all that one might think from its naming. Sometimes one requires more…
Glad to hear you resolved your hash mismatch error. I went ahead and flagged your post about running repair as the solution.
Note that @ts678 is correct that a test restore of some files that use the duplicati-ba144e47ab35b40beab74611f9130d454.dblock.zip.aes file would be a good confirmation.
The database Repair command MOSTLY resyncs local database information with the destination, which might (as in your case) mean storing the new hash from the destination into the database, or it could go the opposite way and use the database to recreate a missing dlist or dindex file.
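For reference, running it from the command line looks roughly like this (same destination as your test command; add --dbpath if the right database isn’t picked up automatically):

C:\Program Files\Duplicati 2>Duplicati.CommandLine.exe repair z:\backup_duplicati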
Unfortunately, dblock files themselves can’t be recreated, which is why the post from the expert (actually, the primary developer) linked two messages ago suggested purging the bad file.
If you don’t feel like running affected to see what files / versions could be restored to test that dblock, you could also try running a full test (see link below) or just wait until Duplicati (eventually) randomly tests that particular file.
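As a sketch, a full test would look something like this ("all" as the sample count and --full-remote-verification are how I understand the TEST command; note that this downloads every remote file, so it can take a long time on a big backup):

C:\Program Files\Duplicati 2>Duplicati.CommandLine.exe test z:\backup_duplicati all --full-remote-verification=true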
No matter what you choose, just note that Duplicati will restore what it can, even from mangled destination files.
The bidirectional description of the REPAIR command that was just reinforced has always bothered me.
Regardless, fixing a recorded hash by downloading and then hashing a possibly corrupted file seems odd; however, I’m not familiar with its design. I do recall discussion of how far it can see without file downloads.