In “Uploaded with size 26935296 but should be 52344694”, the 52344694 is probably a full 50 MiB dblock. 26935296 is 0x19B0000, which is the size Duplicati saw when it first noticed the file had finished uploading. Seeing a file truncated at an even binary boundary like that makes me think of file operations somewhere in the path, maybe caches, cutting the write short.
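As a quick sanity check on that “even binary value” observation, here’s a small sketch (plain Python, no Duplicati internals) showing that the truncated size lands exactly on a 64 KiB boundary while the expected size does not:

```python
# Sketch: check whether the observed truncated size falls on an even
# binary boundary, which would point at block-sized file operations.
observed = 26935296   # size the destination ended up with
expected = 52344694   # size Duplicati expected for the dblock

print(hex(observed))                 # 0x19b0000
print(observed % (64 * 1024) == 0)   # True  -> exact multiple of 64 KiB
print(expected % (64 * 1024) == 0)   # False -> expected size is not aligned
```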
Comparing the current sizes as seen from the client view and the server view could be helpful, and the same goes for checking which files are held open on each side. The implication that killing the server process fixes an open file suggests the file is held open at the server. What about the client?
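If it helps, a minimal sketch of that comparison from the client side; the path here is just an example, so point it at your actual mount and the dblock in question, and run the same lsof on the server against its local path:

```python
# Sketch: compare the size of one dblock as seen through the client's
# mount of the share against what Duplicati reported uploading.
import os
import subprocess

client_path = "/mnt/backup/duplicati-b...dblock.zip.aes"  # example path over the mount
expected_size = 52344694                                  # size Duplicati reported

st = os.stat(client_path)
print("client sees:", st.st_size, "expected:", expected_size)

# See whether anything on this machine still holds the file open.
subprocess.run(["lsof", client_path])
```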
For a view of what Duplicati thinks ought to be in the backup, you can set --upload-verification-file, then either read the resulting duplicati-verification.json manually with a text viewer or check all your destination files using /usr/lib/duplicati/utility-scripts/DuplicatiVerify.py.
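For the manual-read route, here is a rough sketch of comparing destination file sizes against the verification file. I’m assuming the usual duplicati-verification.json layout with per-entry Name and Size fields, so treat the field names and top-level structure as assumptions and prefer DuplicatiVerify.py for the authoritative check:

```python
# Sketch: compare file sizes in the destination folder against what the
# verification file says should be there. Field names (Name, Size) are
# assumed here; DuplicatiVerify.py is the authoritative checker.
import json
import os

dest = "/path/to/backup/destination"  # example path
with open(os.path.join(dest, "duplicati-verification.json")) as f:
    entries = json.load(f)

for entry in entries:
    name, want = entry["Name"], entry["Size"]
    path = os.path.join(dest, name)
    have = os.path.getsize(path) if os.path.exists(path) else None
    if have != want:
        print(f"MISMATCH {name}: expected {want}, found {have}")
```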
I’m not using Samba here as either client or server, so I can’t say much about fixes if this turns out to be a Samba issue. Maybe once you characterize the problem further, a wider Internet search will find some information on it.
“Please verify the sha256 hash?” and “Restore failed 2 files” were quite an extensive chase, but I think the final solution was to stop using CIFS and start using NFS…