I have Duplicati running as a service on Linux Mint and back up to a share on a Samba server.
The backup always ends with an error like this:
2019-04-26 20:40:58 -05 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-bda9629895ad44012af979ea3333cef45.dblock.zip
I get a number of warnings like this:
2019-04-26 19:02:21 -05 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingRemoteHash]: remote file duplicati-b4bf3dd54f55f4b03a2fa3e06ed8d03ee.dblock.zip is listed as Uploaded with size 26935296 but should be 52344694, please verify the sha256 hash "SbYZrYWBKTv6HirKw04AkdRo28fJv4dwIHIGUs53BVY="
After the backup ends, a file is left open on the server. In this case:
duplicati-b8a0f36297a604c4c890e7b57f1b880d1.dblock.zip
The file left open is not one of the files the error or warnings mention.
The only way I have found to close the file is to kill the smbd process on the server. Even restarting the Duplicati service on the Linux machine does not release it.
Has anyone else encountered this? Any suggestions?
In “Uploaded with size 26935296 but should be 52344694”, the 52344694 is probably a nearly full 50 MiB dblock. 26935296 is 0x19B0000, the size Duplicati recorded when it first saw that the file had uploaded. Files whose sizes are cut off at such round binary boundaries make me think of buffered file operations. Caches, maybe?
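A quick sanity check on those two numbers (just arithmetic on the values quoted in the warning above):

```python
reported = 26935296   # size Duplicati recorded when it first saw the upload
expected = 52344694   # size the file should be, per the warning

print(hex(reported))           # 0x19b0000 -- a suspiciously round value
print(reported % 65536 == 0)   # True: an exact multiple of 64 KiB
print(expected / 2**20)        # ~49.9, i.e. a nearly full 50 MiB dblock
```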
Comparing the current sizes as seen from the client with those seen on the server could be helpful, and the same goes for the open files. That killing the server process is what releases the file implies the handle is open on the server side; is anything still holding it open on the client?
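For the size half of that, a small script run on the client against the mounted share (the /mnt/backup path is just a placeholder for wherever your share is mounted) produces a listing you can diff against an ls -l taken directly on the server. It also prints the base64-encoded SHA-256 that the MissingRemoteHash warning quotes, since sha256sum prints hex instead:

```python
import base64, hashlib, os

BACKUP_DIR = "/mnt/backup"  # placeholder: wherever the share is mounted

def b64_sha256(path):
    """SHA-256 digest, base64-encoded the way Duplicati reports it."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return base64.b64encode(h.digest()).decode()

for name in sorted(os.listdir(BACKUP_DIR)):
    if name.startswith("duplicati-"):
        full = os.path.join(BACKUP_DIR, name)
        print(os.path.getsize(full), b64_sha256(full), name)
```

For the open-file half, smbstatus on the server shows which client holds which lock, and lsof on the client shows whether a Duplicati/mono process still has a handle on the dblock from that side.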
For a view of what Duplicati thinks ought to be in the backup, you can set --upload-verification-file, then either read the uploaded duplicati-verification.json manually with a text viewer or check all your files with /usr/lib/duplicati/utility-scripts/DuplicatiVerify.py.
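If I recall correctly, DuplicatiVerify.py is pointed at the folder that holds the uploaded files along with the duplicati-verification.json the option writes, so roughly python /usr/lib/duplicati/utility-scripts/DuplicatiVerify.py /mnt/backup (the mount point again being a placeholder); check the script's own header for the exact usage.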
I'm not using Samba here as either client or server, so I can't say much about fixes if this turns out to be a Samba issue. Maybe once you have characterized the problem further, a wider Internet search will turn something up.