Stuck with "size was expected..." error

One of my backups stopped working some time ago.
I keep getting an error when I run it (at the "Deleting unwanted files" stage):
The file duplicati-b5087f7ba32974df0b904e7be79cfa967.dblock.zip was downloaded and had size 185597952 but the size was expected to be 196755197

In the log I see:
Warnings 2
• 2022-05-14 13:18:53 +03 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingRemoteHash]: remote file duplicati-b5087f7ba32974df0b904e7be79cfa967.dblock.zip is listed as Verified with size 185597952 but should be 196755197, please verify the sha256 hash “yMYGZbwRkfhEwpUtQ2XRkv1FG+AgnyVjZDGfiVAHaxw=”
• 2022-05-14 13:19:12 +03 - [Warning-Duplicati.Library.Compression.FileArchiveZip-BrokenCentralHeaderFallback]: Zip archive appears to have a broken Central Record Header, switching to stream mode
Errors 1
• 2022-05-14 13:19:12 +03 - [Error-Duplicati.Library.Main.Operation.RepairHandler-RemoteFileVerificationError]: Failed to perform verification for file: duplicati-b5087f7ba32974df0b904e7be79cfa967.dblock.zip, please run verify; message: Object reference not set to an instance of an object.
And the complete log is:

{
  "MainOperation": "Repair",
  "RecreateDatabaseResults": null,
  "ParsedResult": "Error",
  "Version": "2.0.6.102 (2.0.6.102_canary_2022-04-06)",
  "EndTime": "2022-05-14T10:20:18.923435Z",
  "BeginTime": "2022-05-14T10:18:51.5536089Z",
  "Duration": "00:01:27.3698261",
  "MessagesActualLength": 7,
  "WarningsActualLength": 2,
  "ErrorsActualLength": 1,
  "Messages": [
    "2022-05-14 13:18:51 +03 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Repair has started",
    "2022-05-14 13:18:52 +03 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  ()",
    "2022-05-14 13:18:52 +03 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (201 bytes)",
    "2022-05-14 13:18:53 +03 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b5087f7ba32974df0b904e7be79cfa967.dblock.zip ()",
    "2022-05-14 13:19:12 +03 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-b5087f7ba32974df0b904e7be79cfa967.dblock.zip (177.00 MB)",
    "2022-05-14 13:20:16 +03 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-ibf948f0611c44fa9aed2935054ce4503.dindex.zip (244.82 KB)",
    "2022-05-14 13:20:18 +03 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-ibf948f0611c44fa9aed2935054ce4503.dindex.zip (244.82 KB)"
  ],
  "Warnings": [
    "2022-05-14 13:18:53 +03 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingRemoteHash]: remote file duplicati-b5087f7ba32974df0b904e7be79cfa967.dblock.zip is listed as Verified with size 185597952 but should be 196755197, please verify the sha256 hash \"yMYGZbwRkfhEwpUtQ2XRkv1FG+AgnyVjZDGfiVAHaxw=\"",
    "2022-05-14 13:19:12 +03 - [Warning-Duplicati.Library.Compression.FileArchiveZip-BrokenCentralHeaderFallback]: Zip archive appears to have a broken Central Record Header, switching to stream mode"
  ],
  "Errors": [
    "2022-05-14 13:19:12 +03 - [Error-Duplicati.Library.Main.Operation.RepairHandler-RemoteFileVerificationError]: Failed to perform verification for file: duplicati-b5087f7ba32974df0b904e7be79cfa967.dblock.zip, please run verify; message: Object reference not set to an instance of an object."
  ],
  "BackendStatistics": {
    "RemoteCalls": 3,
    "BytesUploaded": 250697,
    "BytesDownloaded": 185597952,
    "FilesUploaded": 1,
    "FilesDownloaded": 1,
    "FilesDeleted": 0,
    "FoldersCreated": 0,
    "RetryAttempts": 0,
    "UnknownFileSize": 0,
    "UnknownFileCount": 0,
    "KnownFileCount": 200,
    "KnownFileSize": 90387706446,
    "LastBackupDate": "2022-04-22T20:39:56+03:00",
    "BackupListCount": 16,
    "TotalQuotaSpace": 1990727823360,
    "FreeQuotaSpace": 801719767040,
    "AssignedQuotaSpace": -1,
    "ReportedQuotaError": false,
    "ReportedQuotaWarning": false,
    "MainOperation": "Repair",
    "ParsedResult": "Success",
    "Version": "2.0.6.102 (2.0.6.102_canary_2022-04-06)",
    "EndTime": "0001-01-01T00:00:00",
    "BeginTime": "2022-05-14T10:18:51.5536089Z",
    "Duration": "00:00:00",
    "MessagesActualLength": 0,
    "WarningsActualLength": 0,
    "ErrorsActualLength": 0,
    "Messages": null,
    "Warnings": null,
    "Errors": null
  }
}

Trying Database → Repair gives an error that looks the same:
Warnings 2
• 2022-05-14 13:50:14 +03 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingRemoteHash]: remote file duplicati-b5087f7ba32974df0b904e7be79cfa967.dblock.zip is listed as Verified with size 185597952 but should be 196755197, please verify the sha256 hash “yMYGZbwRkfhEwpUtQ2XRkv1FG+AgnyVjZDGfiVAHaxw=”
• 2022-05-14 13:50:32 +03 - [Warning-Duplicati.Library.Compression.FileArchiveZip-BrokenCentralHeaderFallback]: Zip archive appears to have a broken Central Record Header, switching to stream mode
Errors 1
• 2022-05-14 13:50:32 +03 - [Error-Duplicati.Library.Main.Operation.RepairHandler-RemoteFileVerificationError]: Failed to perform verification for file: duplicati-b5087f7ba32974df0b904e7be79cfa967.dblock.zip, please run verify; message: Object reference not set to an instance of an object.

What can I do to get the backup working again?
I use the latest version, 2.0.6.102_canary_2022-04-06.

185597952 is a nice binary-even 0xB100000, which usually makes me suspect filesystem problems.
What sort of storage is this? How is it accessed, e.g. is it over SMB share? What OS is being used?

It’s not clear how many bad files exist. duplicati-b5087f7ba32974df0b904e7be79cfa967.dblock.zip definitely looks bad. If you like, try it with some .zip program. Or just assume it’s bad, and delete it.
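
If you want to check it yourself, a quick sketch, assuming unzip and openssl are available (the filename and expected hash are taken from the log above):

    # Test the archive structure; a truncated zip usually fails here
    unzip -t duplicati-b5087f7ba32974df0b904e7be79cfa967.dblock.zip

    # Duplicati logs the Base64-encoded SHA-256, so compute the same form
    # and compare it against the expected value from the warning:
    #   yMYGZbwRkfhEwpUtQ2XRkv1FG+AgnyVjZDGfiVAHaxw=
    openssl dgst -sha256 -binary duplicati-b5087f7ba32974df0b904e7be79cfa967.dblock.zip | base64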

Recovering by purging files is basically list-broken-files to see what was broken by the dblock loss.
purge-broken-files deletes those from the backup. You can run both of those in GUI Commandline.
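
From a terminal, the same two steps would look roughly like this sketch (the storage URL and passphrase are placeholders; on Windows the commands go through Duplicati.CommandLine.exe rather than the duplicati-cli wrapper, and GUI Commandline fills in the database path and credentials for you):

    # See which source files and versions are broken by the dblock loss
    duplicati-cli list-broken-files "ssh://user@host/backup" --passphrase="..."

    # Remove those entries from the backup so it becomes consistent again
    duplicati-cli purge-broken-files "ssh://user@host/backup" --passphrase="..."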


@ts678 this was super helpful; I have my backup running again!!

I was not aware of the command line feature and these two commands.

My files are stored on a Samba share served by my router (running OpenWrt).
Opening that zip file in WinRAR gave an "Unexpected end of archive" error.

I was able to restore some of its content, but it did not help; I still got errors.
In the end, I deleted the bad file and then ran database restore + delete and recreate, but I still got errors.
It looked to me as if purge-broken-files did nothing (it did not even use CPU, disk, or network) for a very long time, until I force-stopped it.

I think that the errors stopped only after manually running the backup, which now runs with no errors.

Thanks!

SMB tends to get short files occasionally. It’s thought to be due to poor or late OS error reporting.
Some other protocol might be more reliable, but there’s now an earlier sanity check on file errors.

Release: 2.0.6.103 (canary) 2022-06-12

Added check to detect partially written files on network drives, thanks @dkrahmer

The Canary above is already old, and Canary builds are always a first release of code, so issues may exist.

You were supposed to delete the file and then run list-broken-files using the current database, which knows the files.
Delete and recreate won’t work because you’ve destroyed all information about the bad dblock…

I’m not confident that all is well. If you can’t recreate the database, disaster recovery will also fail.
Did you happen to save a copy of the old database? If so, there’s still time to go down the good path.
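
For future reference, a cheap safeguard before any risky repair or recreate is to copy the job’s database aside first. A sketch, assuming a Linux install; the random filename is a placeholder, and the real path is shown on the job’s Database page:

    # Copy the job database aside before experimenting with Repair / Recreate
    # (XXXXXXXXXX.sqlite is a placeholder; check the job's Database page for the real name)
    cp ~/.config/Duplicati/XXXXXXXXXX.sqlite ~/XXXXXXXXXX.sqlite.bak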

If by this you mean "deleted from the backend using a system (non-Duplicati) command" (such as rm or del), it’s something one should do only if you are an expert; I’d even say a master.

I had a problem on a backup this week with a strange message ("Detected non-empty blocksets with no associated blocks!") and solved it by using the ‘delete’ function in the command line tool of the Web UI.

Just asking for version 0 in the additional expert parameters of the ‘delete’ command got rid of the last backup that had crashed, and everything started to work again after that - no need to repair the database.
Always try this first if your last backup is so bad that it stops Duplicati from working.
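
In CLI terms, that corresponds to something like this sketch (the storage URL and passphrase are again placeholders; in the Web UI you would pick the delete command in Commandline and add --version=0 to the options):

    # Drop only the newest (crashed) backup version, keeping all earlier ones
    duplicati-cli delete "ssh://user@host/backup" --version=0 --passphrase="..."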

It’s a reasonable alternative first attempt; it may drop a bit more data, but might be a bit simpler.
Its advantage increases if the destination isn’t manually reachable, as Duplicati does all the access…

Agree in general, but in this case the file was well-proven corrupted, so keeping it has little purpose.
The list-broken-files command seems to require the file to be deleted to actually work as it’s named.

I just tested it on a file that I shortened in a hex editor, and it (maybe trying to be helpful?) just said:

Listing remote folder …

remote file duplicati-b7d1c2140e1bd4aa99eb4e913a8abff5e.dblock.zip is listed as Verified with size 1137 but should be 1138, please verify the sha256 hash “QSkC76OgfA17DE6MzuIE/GX/E7vgRnEJ/aOxVjje1G8=”

If I delete the dead file, then I get what I wanted:

Listing remote folder ...
0 : 6/19/2022 8:17:27 PM (2 match(es))
C:\backup source\A.txt (1 bytes)
C:\backup source\short.txt (19 bytes)

purge-broken-files would then delete only the files from the backup that are affected by the bad file. Potentially this would be less than the entire backup version, so it’s a more tailored way of cleaning up.

Alternatively one could use the affected command to see what the dead dblock affects, but to actually clean the backup up by purging affected source files and versions would still take a purge-broken-files, which I think has the same need for the file to be deleted; so for this route, you might as well delete the file if it’s dead.
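
For completeness, a sketch of that lookup (placeholder URL again; affected takes the remote filename as an argument, here the bad dblock from the log above):

    # List the source files and backup versions that depend on the bad dblock
    duplicati-cli affected "ssh://user@host/backup" duplicati-b5087f7ba32974df0b904e7be79cfa967.dblock.zip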