File length is invalid?

I still have some interest in trying to understand this (partly because you made an excellent find in the event log), however any chase will probably not end in a direct code fix, but it might be convertible into a GitHub issue.

Unfortunately, the final ending in a System.Exception with get_CurrentScope at the top of the stack is probably beyond my knowledge of the code, but I can do a bit to identify the path leading up to it (which might help).

03:49:04 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Failed: duplicati-b59269cd903f3457fa4d0a28c5a78298c.dblock.zip.aes (49,48 MB)
03:49:04 +01 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b067629cc95ea4ed1a5117368a3b237da.dblock.zip.aes (49,41 MB)
03:49:25 +01 - [Error-Duplicati.Library.Main.Operation.BackupHandler-FatalError]: Fatal error
03:49:27 +01 - [Information-Duplicati.Library.Modules.Builtin.SendMail-SendMailComplete]: Email sent successfully using server: smtp://smtp.gmail.com:587/?starttls=when-available
(end of Information level log -- the rest is in event log)
03:49:49 WriteRetryMessage led to The process was terminated due to an unhandled exception.
03:52:08 WriteRetryMessage led to The process was terminated due to an unhandled exception.

This ended almost (except for the timing) like a continuation of the earlier retry pattern (five tries) leading to “Failed”.
The WriteRetryMessage format is like “Operation {0} with file {1} attempt {2} of {3} failed with message: {4}”
Raising your log level to Profiling (or at least to Retry) would log more; a sketch of the relevant options follows, and after that is an example from my own Profiling level log.
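
For reference, a hedged sketch of the advanced options I used to get a Profiling level log file (the path is just a placeholder; pick whatever location suits you):

--log-file=C:\Duplicati\job.log
--log-file-log-level=Profiling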

2018-12-05 11:51:45 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-bd24ba5d9c516442a9cab8ae8abaea396.dblock.zip (273.68 KB)
2018-12-05 11:52:09 -05 - [Profiling-Duplicati.Library.Main.BackendManager-DownloadSpeed]: Downloaded 273.68 KB in 00:00:24.0206847, 11.39 KB/s
2018-12-05 11:52:09 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-bd24ba5d9c516442a9cab8ae8abaea396.dblock.zip (273.68 KB)
2018-12-05 11:52:09 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.BackendManager-RemoteOperationGet]: RemoteOperationGet took 0:00:00:24.022
2018-12-05 11:52:09 -05 - [Retry-Duplicati.Library.Main.BackendManager-RetryGet]: Operation Get with file duplicati-bd24ba5d9c516442a9cab8ae8abaea396.dblock.zip attempt 4 of 5 failed with message: Hash mismatch on file "C:\WINDOWS\TEMP\dup-fa172149-0e78-4536-8fe8-184416f441cc", recorded hash: kX+m7GbjoomBrjyx9S83PDRtqKEsXiFB9UjNItn+tn4=, actual hash ZMu04I7abDfkQpbaEmtMfSazb1otEhgMrQan+skANHE=
Duplicati.Library.Main.BackendManager+HashMismatchException: Hash mismatch on file "C:\WINDOWS\TEMP\dup-fa172149-0e78-4536-8fe8-184416f441cc", recorded hash: kX+m7GbjoomBrjyx9S83PDRtqKEsXiFB9UjNItn+tn4=, actual hash ZMu04I7abDfkQpbaEmtMfSazb1otEhgMrQan+skANHE=
   at Duplicati.Library.Main.BackendManager.DoGet(FileEntryItem item)
   at Duplicati.Library.Main.BackendManager.ThreadRun()
2018-12-05 11:52:09 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Retrying: duplicati-bd24ba5d9c516442a9cab8ae8abaea396.dblock.zip (273.68 KB)
2018-12-05 11:52:19 -05 - [Profiling-Timer.Begin-Duplicati.Library.Main.BackendManager-RemoteOperationGet]: Starting - RemoteOperationGet
2018-12-05 11:52:19 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-bd24ba5d9c516442a9cab8ae8abaea396.dblock.zip (273.68 KB)
2018-12-05 11:52:43 -05 - [Profiling-Duplicati.Library.Main.BackendManager-DownloadSpeed]: Downloaded 273.68 KB in 00:00:24.0197099, 11.39 KB/s
2018-12-05 11:52:43 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-bd24ba5d9c516442a9cab8ae8abaea396.dblock.zip (273.68 KB)
2018-12-05 11:52:43 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.BackendManager-RemoteOperationGet]: RemoteOperationGet took 0:00:00:24.022
2018-12-05 11:52:43 -05 - [Retry-Duplicati.Library.Main.BackendManager-RetryGet]: Operation Get with file duplicati-bd24ba5d9c516442a9cab8ae8abaea396.dblock.zip attempt 5 of 5 failed with message: Hash mismatch on file "C:\WINDOWS\TEMP\dup-fe2d0ecc-9f5d-4584-8e67-50c723c1a598", recorded hash: kX+m7GbjoomBrjyx9S83PDRtqKEsXiFB9UjNItn+tn4=, actual hash ZMu04I7abDfkQpbaEmtMfSazb1otEhgMrQan+skANHE=
Duplicati.Library.Main.BackendManager+HashMismatchException: Hash mismatch on file "C:\WINDOWS\TEMP\dup-fe2d0ecc-9f5d-4584-8e67-50c723c1a598", recorded hash: kX+m7GbjoomBrjyx9S83PDRtqKEsXiFB9UjNItn+tn4=, actual hash ZMu04I7abDfkQpbaEmtMfSazb1otEhgMrQan+skANHE=
   at Duplicati.Library.Main.BackendManager.DoGet(FileEntryItem item)
   at Duplicati.Library.Main.BackendManager.ThreadRun()
2018-12-05 11:52:43 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Failed: duplicati-bd24ba5d9c516442a9cab8ae8abaea396.dblock.zip (273.68 KB)

What this demonstrates is that “Get - Failed” can also be declared when the download to a temporary file doesn’t have the expected hash. It’s again not immediately known whether the database records are wrong, the destination had a problem that corrupted the file, something happened during upload or download, or something else caused it.

The other good question (which I can’t answer) is why a download is still retrying after the job should be done. Before that, the question would be why the failure of a dblock download just moved on to another dblock. There is a similar bug where a failed dblock upload doesn’t end the backup. See the link under DuplicatiVerify below.

Although we won’t know for sure whether your case would show similar details (if you turned logging higher), one method for a comprehensive search for destination file problems is the test command, with all as the sample count. This is the same test that the verification step after a backup runs, just turned up and run manually. The easiest way to run it is probably the job’s GUI Commandline interface, converting the “backup” command format into a “test” command. Raising --console-log-level to at least Retry seems wise. I’m not sure exactly what happens when it finds an error.
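
As a rough sketch of what the converted command could look like if run from a command prompt instead (the Target URL and passphrase here are placeholders; take the real values and any other options from the job’s Commandline screen or its Export As Command-line output):

Duplicati.CommandLine.exe test "<your Target URL>" all --passphrase=<your passphrase> --console-log-level=Retry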

Another way to get a comprehensive audit is more feasible if the destination can be read directly as files. Running a backup with --upload-verification-file will upload a duplicati-verification.json file after the backup. Running the DuplicatiVerify.ps1 (PowerShell) or DuplicatiVerify.py (Python) script will then compare the actual files against it. Sample output of a run is here. Looking at the source, hash failures will say *** Hash check failed for file: followed by the full path.
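
If your destination folder is locally readable, a loose sketch of running the Python script against it (the path is a placeholder, and exact argument handling may vary between script versions):

python DuplicatiVerify.py "X:\path\to\destination"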

A non-comprehensive test would look at just the specific file that failed after five tries. I don’t know how quickly your destination can serve files, but for slow destinations a comprehensive test of a big backup takes time, and service providers may charge download fees. A targeted test is easier on the system, but harder on you. :wink:

To get a file, the best case is if your destination just lets you grab it. If not, Duplicati.CommandLine.BackendTool.exe supports the same storage types Duplicati supports, although it might be harder to set up. You can find your storage URL in the job’s Commandline “Target URL” box. After obtaining the file, you could indirectly test its health by decrypting it with the SharpAESCrypt.exe in your Duplicati folder, or with AES Crypt. The resulting file (which might not appear if decryption fails) would be a zip file, which you can open or test. The file names inside will look meaningless, though they’re actually based on the hash of each block that the backup put in the dblock file. For this path I suggest checking duplicati-b59269cd903f3457fa4d0a28c5a78298c.dblock.zip.aes
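
A hedged sketch of that manual path from a command prompt, run from the Duplicati install folder (the Target URL and passphrase are placeholders; check each tool’s built-in usage text if its syntax differs):

Duplicati.CommandLine.BackendTool.exe GET "<your Target URL>" duplicati-b59269cd903f3457fa4d0a28c5a78298c.dblock.zip.aes
SharpAESCrypt.exe d <your passphrase> duplicati-b59269cd903f3457fa4d0a28c5a78298c.dblock.zip.aes test.zip

If the decrypt succeeds, test.zip should open (or pass an integrity test) in your archive tool of choice.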

If the above was way too much, you can certainly see whether starting fresh works. A compromise would be starting fresh on a copy of this job (export, import, change the destination to another location, run), which might get your backup healthy faster (assuming you have the space) while we poke at the troubled one. Backing up one source to two destinations is not a problem, but don’t point multiple backups at one destination.

TL;DR
Which way you go is sort of up to you. You could also try pieces of the above in some different arrangement. Helping to understand this would be a service to others who might see it, and maybe also lead to eventual fixes, but people sometimes just want to get back to normal ASAP, which is completely understandable.