I have a single dblock file with a problem:
[Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingRemoteHash]: remote file duplicati-b047d06492fa346f29275216588b2302c.dblock.zip is listed as Verified with size 41549824 but should be 52366617, please verify the sha256 hash “uzRIrCaJfyQ3Q3VasCRjcbBB8D2i6HqtBzehfRmdnlo=”
The file is corrupted, and cannot be extracted by unzip.
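For anyone wanting to check a downloaded copy themselves: the value in the warning looks like a base64-encoded SHA-256 of the whole file (note the trailing "="), so a PowerShell one-liner along these lines (the local path is a placeholder) should reproduce the expected hash for an intact copy:

C:\>powershell -Command "[Convert]::ToBase64String([System.Security.Cryptography.SHA256]::Create().ComputeHash([System.IO.File]::ReadAllBytes('C:\temp\duplicati-b047d06492fa346f29275216588b2302c.dblock.zip')))"

If the output does not match the hash quoted in the warning, the remote copy really is damaged.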
I wonder if I could try --rebuild-missing-dblock-files to recreate the file, because the original files affected by the block still exist.
I have tried adding the option to the backup, but I get the same warning.
If I move the dblock file out of the destination and retry the backup with --rebuild-missing-dblock-files, I get a warning but no rebuild of the dblock.
C:\Program Files\Duplicati 2>Duplicati.CommandLine.exe help rebuild-missing-dblock-files
--rebuild-missing-dblock-files (Boolean): Rebuild dblock files when missing
If dblock files are missing from the destination, you can attempt to
rebuild them using local source data. However, since the local data may
have changed, it may not be possible to retrieve all the required data
and the process may be slow. Use this option to attempt to rebuild
missing dblock files.
* default value: false
C:\Program Files\Duplicati 2>
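For reference, a rough sketch of a CLI backup run with the option added might look like this (the storage URL and source path are placeholders; an actual B2 destination needs its own credentials and options):

C:\Program Files\Duplicati 2>Duplicati.CommandLine.exe backup <storage-URL> "D:\Data" --rebuild-missing-dblock-files=true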
BTW, your new reduced file size is a nice even binary value (a multiple of 8 KB, I think). Filesystem damage?
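For the record: 41549824 = 5072 × 8192, i.e. exactly 5072 blocks of 8 KB, whereas the expected 52366617 is odd and so not a multiple of 8 KB at all.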
Is this still an available command to use? I created a problem for myself by un-checking a folder in the source data selection, panicking when Duplicati started removing that data from B2, and forcefully stopping the process in the middle of it. (I incorrectly thought smart retention would keep that data on B2 from an earlier backup.) The result was missing dblock files on the next run. I tried creating dummy dblock files with the same names, but I just ended up with hash errors.
I’m mostly just concerned in general about bad/missing dblock files breaking the backup.
The very new 2.0.3.12 Duplicati.CommandLine.exe help advanced output seems to think it's still available, and it shows the limitations mentioned in the 2.0.3.10 release notice cited earlier. I can't say I've tried it, though.
Before letting this loose, you might consider having B2 snapshot your files, or adding some other protection. Possibly the good news, safety-wise, is that this rebuild used to happen automatically, so it's probably harmless, just not too helpful.
The AFFECTED command can show what sort of impact simply leaving dblock files missing can cause, and right below it, the list-broken-files and purge-broken-files commands may help. Disaster Recovery shows how they are used.
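A rough sketch of how these could be run from the command line (the storage URL and database path are placeholders, and an actual B2 destination needs its own options; purge-broken-files is shown with --dry-run first so nothing is changed):

C:\Program Files\Duplicati 2>Duplicati.CommandLine.exe affected <storage-URL> duplicati-b047d06492fa346f29275216588b2302c.dblock.zip --dbpath="C:\path\to\local.sqlite"
C:\Program Files\Duplicati 2>Duplicati.CommandLine.exe list-broken-files <storage-URL> --dbpath="C:\path\to\local.sqlite"
C:\Program Files\Duplicati 2>Duplicati.CommandLine.exe purge-broken-files <storage-URL> --dbpath="C:\path\to\local.sqlite" --dry-run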
This should only affect future backups (unlike CrashPlan and perhaps some other backup programs). There are quite a few forum posts from people who are told about the extra steps needed to get fast deletion.
Possibly your backups are still healthy enough to start down the Restore path and look at what's available? Retention (smart or not) should only delete complete backup versions; I don't think it would cut out individual folders.
Yeah, that’s what I was originally expecting. Perhaps my issue came about because I didn’t have one complete backup yet. That’s why I unchecked that folder in the first place – to get a faster complete backup on the other more important data. I’ll definitely make use of B2 snapshotting next time.
Since September, I have now tried adding rebuild-missing-dblock-files=true to the backup, but I still have the problem:
" 2018-11-27 00:52:36 +01 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingRemoteHash]: remote file duplicati-b047d06492fa346f29275216588b2302c.dblock.zip is listed as Verified with size 41549824 but should be 52366617, please verify the sha256 hash “uzRIrCaJfyQ3Q3VasCRjcbBB8D2i6HqtBzehfRmdnlo=”,"
Does anyone have a suggestion to rebuild this file?
It is not clear to me whether --rebuild-missing-dblock-files has to be added to the backup or to another operation.
/Thomas