How do I use --rebuild-missing-dblock-files


I have a single dblock file with a problem:
[Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingRemoteHash]: remote file is listed as Verified with size 41549824 but should be 52366617, please verify the sha256 hash “uzRIrCaJfyQ3Q3VasCRjcbBB8D2i6HqtBzehfRmdnlo=”
The file is corrupted, and cannot be extracted by unzip.

I wonder if I could use --rebuild-missing-dblock-files to recreate the file, since the original source files affected by the blocks still exist.
I have tried adding the option to the backup, but I get the same warning.
If I move the dblock file out of the destination and retry the backup with --rebuild-missing-dblock-files, I get a warning, but the dblock file is not rebuilt.

Any suggestions?


Actually this is related to this:

After getting the same error message you got, I deleted the broken files and ended up with the situation described in my post.

Let me know if the rebuild works.


You could certainly try. I think that’s an option on the “repair” command; the rebuild used to be done automatically but now isn’t.

Release: (canary) 2018-08-30 talks about it, and about its success rate. It didn’t help on my test.

C:\Program Files\Duplicati 2>Duplicati.CommandLine.exe help rebuild-missing-dblock-files
  --rebuild-missing-dblock-files (Boolean): Rebuild dblock files when missing
    If dblock files are missing from the destination, you can attempt to
    rebuild them using local source data. However, since the local data may
    have changed, it may not be possible to retrieve all the required data
    and the process may be slow. Use this option to attempt to rebuild
    missing dblock files.
    * default value: false

C:\Program Files\Duplicati 2>
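If you want to try it, my understanding is the option is passed to the repair command. A rough sketch of the invocation (the B2 URL, authid, and database path below are placeholders you would replace with your own values):

```shell
REM Hypothetical example: ask repair to attempt rebuilding missing dblock
REM files from local source data. URL, credentials, and dbpath are placeholders.
Duplicati.CommandLine.exe repair "b2://my-bucket/my-folder?authid=..." --dbpath="C:\path\to\local\database.sqlite" --rebuild-missing-dblock-files=true
```

Since the help text warns the local data may have changed, I wouldn’t expect every missing block to be recoverable.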

BTW your new reduced file size is a nice even binary value (multiple of 8 KB I think). Filesystem damage?


Is this still an available command to use? I created a problem for myself by un-checking a folder in the source data selection, panicking when Duplicati started removing that data from B2, and forcefully stopping the process in the middle of it. (I incorrectly thought smart retention would keep that data on B2 from an earlier backup.) The result was missing dblock files on the next run. I tried creating dummy dblock files with the same names, but I just ended up with hash errors.

I’m mostly just concerned in general about bad/missing dblock files breaking the backup.



The very-new Duplicati.CommandLine.exe help advanced seems to think it’s still available, and shows the limitations mentioned in the release notice cited earlier. I can’t say I’ve tried it though.

Before letting this loose, you might consider having B2 snapshot your files, or adding some other protection. The good news, safety-wise, is that this used to run automatically, so it is probably harmless, even if not very helpful.

The AFFECTED command can show what sort of impact leaving dblock files missing can cause, and right below it in the manual, the --list-broken-files and --purge-broken-files options may help. The Disaster Recovery article shows them in use.
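As a rough sketch of that sequence (storage URL, authid, and the dblock file name are placeholders, not your actual values):

```shell
REM Hypothetical example: see which backups/files a missing dblock touches,
REM then list and purge what can no longer be restored. Placeholders throughout.
Duplicati.CommandLine.exe affected "b2://my-bucket/my-folder?authid=..." duplicati-b0123456789.dblock.zip.aes
Duplicati.CommandLine.exe list-broken-files "b2://my-bucket/my-folder?authid=..."
Duplicati.CommandLine.exe purge-broken-files "b2://my-bucket/my-folder?authid=..."
```

Running affected and list-broken-files first is the cautious path, since purge-broken-files permanently removes the unrecoverable entries from the backup.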

This should only affect future backups (unlike CrashPlan and perhaps some other backup programs). There are quite a few forum posts from people who had to be told about the extra steps needed to get fast deletion.
Possibly your backups are still healthy enough to start down the Restore path and look at what’s available? Retention (smart or not) should only delete complete backup versions; I don’t think it would remove individual folders.


This should only affect future backups

Yeah, that’s what I was originally expecting. Perhaps my issue came about because I didn’t have one complete backup yet. That’s why I unchecked that folder in the first place – to get a faster complete backup of the other, more important data. I’ll definitely make use of B2 snapshots next time.