Backup file size issue

Hello,

I am using Duplicati to create backups of my files on my PC to a local server. All works well, but at the end I get a warning like this every day when the backup is created:

  • 2021-08-17 12:06:53 +05 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingRemoteHash]: remote file duplicati-b4be9d218ed314918a40b7dcfd611c2ee.dblock.zip.aes is listed as Uploaded with size 45088768 but should be 52418413, please verify the sha256 hash "yUv7E7ONgSC/XI4RlJQ+2pv60o/VdvWvy+opTYuBGls="

I checked the file size manually and it is actually 45088768. It would be great if someone could suggest how to get this fixed.

version: Duplicati - 2.0.6.3_beta_2021-06-17

Thanks

What kind of local server? What protocol are you using (what is “Storage Type” set to on the Destination page of your backup config)?

Thanks a lot for replying!!

It's just a permission-based local server to which my employer has given me access for storing files as needed; the server is maintained by other departments. Storage Type is set to "Local Folder or Drive", after which I just pasted the address as "\\companyname.com\yyy\zzz".

I am not sure what exactly you mean by protocol; in the configuration I did not have to specify any protocol while creating this backup schedule. For encryption I selected "AES-256, built-in".

Might be good (or maybe not). Is there a professional IT department with expertise and time to help?

Because I suspect the issue lies between your system and the server, can you or the other departments help at all?
A simple way to bypass SMB problems might exist if there is some other access, e.g. NFS, SFTP, etc.

Seeing what Duplicati wrote is very possible with Sysinternals Process Monitor, though it can be rather memory-intensive if it has to run for a long time. Can you reproduce this issue with a nicely small backup?
If so, just monitor all activity to the destination folder. When the issue happens, look for the file that was named.

Alternatively, you can see if you can reproduce it with Duplicati.CommandLine.BackendTester.exe, with the target URL taken from Export As Command-line and then edited to point to an empty folder for this test.
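
If running the tester is inconvenient, here's a rough Python sketch of the same round-trip idea (the UNC path and folder name are placeholders based on the earlier post, and this is only a sanity check of the share, not what BackendTester actually does internally):

```python
import hashlib
import os

# Hypothetical empty test folder on the share; adjust to your server.
DEST = r"\\companyname.com\yyy\zzz\duplicati-test"
os.makedirs(DEST, exist_ok=True)

# Write a file roughly the size of the one in the warning.
payload = os.urandom(52418413)
local_hash = hashlib.sha256(payload).hexdigest()
path = os.path.join(DEST, "testfile.bin")
with open(path, "wb") as f:
    f.write(payload)
    f.flush()
    os.fsync(f.fileno())  # ask the OS to push it through any write cache

# Read it back and compare size and hash with what was written.
size = os.path.getsize(path)
with open(path, "rb") as f:
    remote_hash = hashlib.sha256(f.read()).hexdigest()

print("size matches:", size == len(payload), size)
print("hash matches:", remote_hash == local_hash)
```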

You can also look for other wrong-sized files, e.g. by running the TEST command in GUI Commandline with a larger sample size in the Commandline arguments box (or even ask for all if you're willing to do that).

45088768 is a suspiciously even 0x2B00000 in hexadecimal, and seeing this always makes me suspect a filesystem or SMB problem, because file operations tend to work in large powers of two, from what I see. Specific design details and capabilities may depend on things like the SMB version used and its settings.
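
To see the pattern for yourself, here is the arithmetic in a few lines of Python (nothing Duplicati-specific, just the two sizes from the warning):

```python
reported = 45088768   # size the server reported
expected = 52418413   # size Duplicati expected

print(hex(reported))             # 0x2b00000 -- suspiciously round
print(hex(expected))             # 0x31fd76d -- not round at all
print(reported % (1024 * 1024))  # 0: the reported size is exactly 43 MiB
```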

To improve performance there is also a lot of caching, so one question is whether doing less of it would help.

When to use SMB WriteThrough in Windows Server 2019
Controlling write-through behaviors in SMB

I wonder if the hash is different too. It could be that Duplicati could use a fix here. Maybe the allocation size used with that drive's format is different; I don't think I've tested Duplicati with a different allocation size. But if Duplicati is having a problem putting that file together for some reason, then the size could be off as well, depending on what's going on with the total size variable.

Though MissingRemoteHash is kind of weird as well.

Previously I would have said that the wrong size guarantees that, but we’ve seen SMB actually have the whole file with the correct hash, yet give the wrong size, which leads to a misleadingly-named message.

Corrupted files on shared folder in local LAN #4076

(So, sha256 the file, then Hex to Base64 and compare with the log entry)
All three files actually have the sha256 hash reported in the log as above

There are lots more debug ideas there, including Wireshark. The user didn’t pursue, and didn’t find a fix.
If @Rahul_Kumar_Sharma or the IT department (?) can help, that’d be great. I can’t do this remotely…
On a Windows test I just ran, I used “Hash Tool” from DigitalVolcano Software, from the Microsoft Store.
Hex to Base64: Encode and decode bytes online was then run. If you like other tools, run those instead.
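
If you'd rather script the check than install tools, here's a minimal Python sketch of the same two steps (the path is just the file named in the warning on the placeholder share; adjust it to yours):

```python
import base64
import hashlib

# Path to the remote file named in the warning (placeholder share path).
path = r"\\companyname.com\yyy\zzz\duplicati-b4be9d218ed314918a40b7dcfd611c2ee.dblock.zip.aes"

h = hashlib.sha256()
with open(path, "rb") as f:
    for chunk in iter(lambda: f.read(1024 * 1024), b""):
        h.update(chunk)

# Duplicati logs the hash Base64-encoded, so encode before comparing.
print(base64.b64encode(h.digest()).decode())
# Compare with "yUv7E7ONgSC/XI4RlJQ+2pv60o/VdvWvy+opTYuBGls=" from the warning.
```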

I suppose you can also just try reading the file into something that will show how much of it is truly there.
You'd want something that doesn't mind binary files, maybe a hex editor. Or just try to decrypt the file in AES Crypt: if that works, it's almost certainly not chopped short, though you won't get the actual hash that way.
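
For the read-it-all test, a small Python sketch works too (assumption: AES output should look essentially random, so a zero-filled tail would suggest the file was allocated but never fully written):

```python
import os

# Placeholder share path; use the file named in the warning.
path = r"\\companyname.com\yyy\zzz\duplicati-b4be9d218ed314918a40b7dcfd611c2ee.dblock.zip.aes"

readable = 0
tail = b""
with open(path, "rb") as f:
    while True:
        chunk = f.read(1024 * 1024)
        if not chunk:
            break
        readable += len(chunk)
        tail = chunk[-16:]

print("size per the filesystem:", os.path.getsize(path))
print("bytes actually readable:", readable)
print("last 16 bytes:", tail.hex())  # a run of zeros here would be suspicious
```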
