Backup file size issue

That might be good (or maybe not). Is there a professional IT department with the expertise and time to help?

Because I suspect the issue lies somewhere between your system and the server, can you or another department investigate?
A simple way to bypass SMB problems might exist if the server offers some other access method, e.g. NFS, SFTP, etc.
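If SFTP happens to be available, a quick sketch of what the destination URL could look like (the hostname and path here are hypothetical):

```
ssh://backupserver.example.com/backups/duplicati
```

Duplicati's SFTP backend uses the ssh:// scheme, so pointing the job at that would take SMB out of the picture entirely.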

You can see exactly what Duplicati wrote with Sysinternals Process Monitor, though it can be rather memory-intensive if it has to run for a long time. Can you reproduce this issue with a nicely small backup?
If so, just monitor all activity to the destination folder. When the issue happens, look for the file that was named in the error.
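If memory is a concern on a long run, Process Monitor can log to a backing file on disk instead of virtual memory. A sketch, assuming a hypothetical trace-file path:

```
Procmon.exe /AcceptEula /BackingFile C:\Temp\duplicati-trace.pml /Quiet /Minimized
```

Then add a filter such as Path begins-with your destination folder, so only destination activity is kept.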

Alternatively, you can see whether you can reproduce it with Duplicati.CommandLine.BackendTester.exe, using a target URL taken from Export As Command-line and then edited to point to an empty folder for this test.
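For example, a hypothetical invocation (substitute your own edited URL; the UNC path below is made up):

```
Duplicati.CommandLine.BackendTester.exe "file://\\server\share\empty-test-folder"
```

The folder should be empty, since the tester creates, verifies, and deletes its own test files.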

You can also look for other wrong-sized files, e.g. by running the TEST command in GUI Commandline with a larger sample size in the Commandline arguments box (or even ask for all samples if you're willing to wait that long).
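From a terminal, the equivalent would be the test command with all samples; a sketch, assuming placeholder destination URL and passphrase:

```
Duplicati.CommandLine.exe test "file://\\server\share\backup-folder" all --passphrase=<your-passphrase>
```

Testing all samples downloads and verifies every dlist, dindex, and dblock file, so it can take a while on a large backup, but it will also flag any other files whose size or content is wrong.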

45088768 is a suspiciously even 0x2B00000 in hexadecimal (exactly 43 × 1,048,576 bytes, i.e. 43 MiB), and seeing this always makes me suspect a filesystem or SMB problem, because file operations tend to work in large powers of two, from what I see. Specific design details and capabilities may depend on things like the SMB version used and its settings.

To improve performance, there is also a lot of caching along the way, so one question is whether doing less of it would help.

- When to use SMB WriteThrough in Windows Server 2019
- Controlling write-through behaviors in SMB
