Timeout and missing files on B2 with canary 2.1.0.125

New install of .125 on a new install of Fedora 41 using b2.

Yesterday I performed the first backup. I had to increase read-write-timeout as mentioned in a prior post, but the backup completed with no apparent errors. The overnight backup failed as follows:

```

  • Jul 18, 2025 3:36 AM: UsageReporter failed

  • Jul 18, 2025 3:30 AM: A Task’s exception(s) were not observed either by Waiting on the Task or accessing its Exception property. As a result, the unobserved exception was rethrown by the finalizer thread. (The CancellationTokenSource has been disposed.)

  • Jul 18, 2025 2:30 AM: A Task’s exception(s) were not observed either by Waiting on the Task or accessing its Exception property. As a result, the unobserved exception was rethrown by the finalizer thread. (The CancellationTokenSource has been disposed.)

  • Jul 18, 2025 2:25 AM: A Task’s exception(s) were not observed either by Waiting on the Task or accessing its Exception property. As a result, the unobserved exception was rethrown by the finalizer thread. (The CancellationTokenSource has been disposed.)

  • Jul 18, 2025 2:21 AM: The operation Backup has failed

  • Jul 18, 2025 2:21 AM: Fatal error

  • Jul 18, 2025 2:21 AM: Found 3 files that are missing from the remote storage, please run repair

  • Jul 18, 2025 2:20 AM: A Task’s exception(s) were not observed either by Waiting on the Task or accessing its Exception property. As a result, the unobserved exception was rethrown by the finalizer thread. (The CancellationTokenSource has been disposed.)

  • Jul 18, 2025 2:20 AM: The operation Backup has started
```

Please let me know if there is any other data I can provide.

EDIT:
Attempting to run a backup now, it shows the following:

Jul 18, 2025 9:15 AM: Missing file: duplicati-i46da0b105ed14312867b16f11cd7d8c8.dindex.zip.aes
Jul 18, 2025 9:15 AM: Missing file: duplicati-b58364c23f0ca48a9955cd2f6155926fb.dblock.zip.aes
Jul 18, 2025 9:15 AM: Missing file: duplicati-20250718T062000Z.dlist.zip.aes 

I’ve been having timeout trouble (lots of reports filed here) for a long time on Backblaze B2, and was curious if their S3 works better. Unfortunately my bucket is too old and migration is some work (I’d probably rclone in hopes of server-side copy). Does yours look S3 enabled?
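
For the record, the sort of rclone transfer I have in mind is sketched below. The remote and bucket names are placeholders, and whether the copy actually stays server-side depends on what rclone can negotiate between the two remotes.

```
# Hypothetical remotes "b2old" (native B2) and "b2s3" (S3-compatible), both on Backblaze.
# --server-side-across-configs asks rclone to attempt server-side copies between remotes.
rclone copy b2old:old-bucket/backup b2s3:new-bucket/backup \
  --server-side-across-configs --progress
```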

I’ve also been wondering how to observe behavior, given a lack of Duplicati or .NET logging and the challenges of HTTPS. After looking at external tools, I’m just guessing in Wireshark.

That was the idea long ago, and possibly it’s invalid now. To raise the failure rate from my prior roughly-half at the default 30 second timeout (I had switched to 5 minutes, though 3 was working OK), I reduced the timeout to 15 seconds, got ample timeouts, and looked at one quite closely.

From the packet log, this starts at 19:16:24 and ends at 19:16:59. From the profiling log, there is:

2025-07-21 19:16:23 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-bf346b005a8934a7ba3e4d5673dac47fa.dblock.zip.aes (9.94 MiB)
2025-07-21 19:16:39 -04 - [Profiling-Duplicati.Library.Main.Backend.PutOperation-UploadSpeed]: Uploaded 9.94 MiB in 00:00:15.1426118, 671.85 KiB/s
2025-07-21 19:16:39 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-bf346b005a8934a7ba3e4d5673dac47fa.dblock.zip.aes (9.94 MiB)
2025-07-21 19:16:39 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-ia215b46c24cc49c78082ab426a1557f5.dindex.zip.aes (5.37 KiB)
2025-07-21 19:16:39 -04 - [Profiling-Duplicati.Library.Main.Backend.PutOperation-UploadSpeed]: Uploaded 5.37 KiB in 00:00:00.1822640, 29.47 KiB/s
2025-07-21 19:16:39 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-ia215b46c24cc49c78082ab426a1557f5.dindex.zip.aes (5.37 KiB)
2025-07-21 19:16:40 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b907243c763ee4658b188bf445c95e46d.dblock.zip.aes (9.94 MiB)
2025-07-21 19:16:59 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Failed: duplicati-b907243c763ee4658b188bf445c95e46d.dblock.zip.aes ()

with the second dblock starting about 17 seconds in. Remote volume size is 10 MB, so a graph of 20 MB of transfers is probably showing both dblocks, with other work in between.

From the web UI for the bucket, it looks like all files in the profiling log actually made it up:

[three screenshots: B2 web UI listing of the uploaded files]

even though Duplicati closed the TCP connection with a FIN when it claimed a timeout.
I saw this on an earlier run too, where a dblock got a timeout and the next run uploaded its dindex. This surprised me, but I verified the dblock was in the DB list data at the start of the second run.

The key question for development is: where’s the connection that stalled for 15 seconds?

EDIT 1:

Latest test is actually on 2.1.1.0 Experimental. I set parallel uploads to one to simplify study.

EDIT 2:

Backblaze IP Addresses told me what I should be filtering on when setting up Wireshark.
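
For anyone repeating this, the capture can be narrowed the same way from the command line. The address below is just one from my traces; substitute the ranges from that Backblaze page.

```
# Capture only TLS traffic with one Backblaze address (illustrative) and save
# the packets for later analysis in Wireshark.
tshark -i eth0 -f "host 149.137.139.26 and tcp port 443" -w b2-timeout.pcap
```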

I don’t see anything that mentions “S3”, other than the backblaze destination site. The bucket for the backup I reported in this thread was created within the last week, if that provides any indication?

I assume this is related to the b2 timeouts - I am getting “Found n files that are missing from the remote storage, please run repair” on 1 or 2 backups every night.

  • Sometimes repair works and I can then run the backup
  • Sometimes repair does not work and does not show an error. At least once going into the backup and clicking “show log” did not show any entry for the repair.
  • When repair does not work, sometimes adding the “rebuild missing blocks” option works.
  • When repair does not work and the “rebuild missing blocks” option does not work, doing a “purge-broken-files” usually works.
  • There have been 1 or 2 times where the only way I could get the backup working again was to delete and recreate the database (I think these were on versions prior to .125 though).
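
In case it helps anyone reproduce the sequence outside the GUI, a rough CLI equivalent of those steps might look like the following; the URL, passphrase, and binary name (duplicati-cli here) are placeholders to adapt to the actual install and job.

```
# Mirrors the GUI steps above; all credentials and paths are placeholders.
duplicati-cli repair "b2://BUCKET/folder?auth-username=ID&auth-password=KEY" --passphrase=SECRET
# If repair reports missing dblock files:
duplicati-cli repair "b2://BUCKET/folder?auth-username=ID&auth-password=KEY" --passphrase=SECRET --rebuild-missing-dblock-files=true
# If blocks still cannot be recovered:
duplicati-cli purge-broken-files "b2://BUCKET/folder?auth-username=ID&auth-password=KEY" --passphrase=SECRET
```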

It seems like it should. You can see if the Buckets list has an entry starting like this:

[screenshot: Buckets list entry showing an Endpoint field]

Maybe your “backblaze destination site” is the “Endpoint”? But it’s only needed for S3.

My old one says “Buckets created before May 4, 2020 cannot be used with our S3 …”

You can test the assumption by avoiding timeouts; mine so far just picks back up on the next run. Troubleshooting steps include Advanced options such as log-file=<path> and log-file-log-level=retry, or the live log at Information level. Looking in the B2 UI as I did is also possible, and a database bug report might be an option, especially if the dev gets into this situation…
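
As a concrete example, the logging options could look like this, whether pasted into the job’s Advanced options or given as CLI flags (the path is a placeholder):

```
--log-file=/var/log/duplicati/job.log
--log-file-log-level=Retry
```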

My buckets all show:

Endpoint: s3.us-west-000.backblazeb2.com

I would love to avoid timeouts - but is a timeout of more than 5 minutes reasonable? I have the read-write-timeout set at the global level - I suppose I could try a longer timeout on backups that show this issue? But it seems to me that there must be something more fundamental going on with error detection as I would expect the default of 30 seconds to be sufficient for a stable network connection, and 5 retries (= 2.5 minutes total) to be enough to handle any typical network glitch due to route updates, etc. And if there is a network glitch that takes more than 2.5 minutes to resolve, I would expect the backup to fail rather than succeed and the next attempt to fail due to missing files.
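
For reference, the knobs in that math, written as advanced options (the values shown are the defaults I’m describing, not a recommendation):

```
--read-write-timeout=30s   # per-operation stall timeout (the default in question)
--number-of-retries=5      # retries per failed transfer
--retry-delay=10s          # pause between retries (not part of the math above; value illustrative)
```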

The backups in question “pick back up” but only after I clear the error with manual intervention.

Thanks for paying attention to this!

I tried to add a new backup using S3 Compatible and my b2 credentials, but “Test Connection” fails.

I tried using:

Storage Type: S3 Compatible
Use SSL:  [checked]
Server: Custom server url (s3.us-west-000.backblazeb2.com)
Bucket name: Duplicati-scanner
Bucket region: (default) ()
Storage class: (default) ()
Folder path: /scanner
AWS Access ID: my b2 id here
AWS Access Key: •••••••••••••••••••••••••••••••
Client library to use: Amazon AWS SDK

What do I need to change/update?

I have the same concern, but it would be nice to know any ugly workaround.
3 minutes may be enough for me. For you, 5 isn’t enough, and the end result is worse.

You can certainly force a nice hard error by disconnecting from the network mid-upload.
There is some complicated state tracking, and fix-ups on the next run, that should cover you.
In the job database, an upload goes from Uploading to Uploaded to Verified (file seen in a list).
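
One can watch those states directly with a quick query against a copy of the job database (the random database filename here is a placeholder):

```
sqlite3 ~/.config/Duplicati/XXXXXXXX.sqlite \
  "SELECT Name, State FROM Remotevolume ORDER BY ID DESC LIMIT 10;"
```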

I’ve been lucky sometimes (maybe in one of these tests such as I described) where the upload worked even though Duplicati never heard that it finished, or called timeout, etc.

Below is a series of example adjustments. Apparently they automatically help me more than they do you.

2025-07-21 19:09:33 -04 - [Information-Duplicati.Library.Main.Operation.FilelistProcessor-RemoteUnwantedMissingFile]: Removing file listed as Deleting: duplicati-20250721T224032Z.dlist.zip.aes
2025-07-21 19:09:33 -04 - [Information-Duplicati.Library.Main.Operation.FilelistProcessor-KeepIncompleteFile]: Keeping protected incomplete remote file listed as Temporary: duplicati-20250721T224807Z.dlist.zip.aes
2025-07-21 19:09:33 -04 - [Information-Duplicati.Library.Main.Operation.FilelistProcessor-PromotingCompleteFile]: Promoting uploaded complete file from Uploading to Uploaded: duplicati-bd0a9393500834b04b4db2562d5f4b730.dblock.zip.aes
2025-07-21 19:09:33 -04 - [Information-Duplicati.Library.Main.Operation.FilelistProcessor-SchedulingMissingFileForDelete]: Scheduling missing file for deletion, currently listed as Uploading: duplicati-bf182fa6b565e4c7abc1ceba34b0fd576.dblock.zip.aes
2025-07-21 19:09:34 -04 - [Information-Duplicati.Library.Main.Operation.FilelistProcessor-SchedulingMissingFileForDelete]: Scheduling missing file for deletion, currently listed as Uploading: duplicati-id2f83889c49c4d70a0e2e6d27aafd31a.dindex.zip.aes
2025-07-21 19:09:34 -04 - [Information-Duplicati.Library.Main.Operation.FilelistProcessor-SchedulingMissingFileForDelete]: Scheduling missing file for deletion, currently listed as Uploading: duplicati-b4443f39bc7d2490998b6ebdfd809e311.dblock.zip.aes
2025-07-21 19:09:34 -04 - [Information-Duplicati.Library.Main.Operation.FilelistProcessor-RemoteUnwantedMissingFile]: Removing file listed as Temporary: duplicati-i1b784ed68d634139afd4f16bbdff8443.dindex.zip.aes
2025-07-21 19:09:34 -04 - [Information-Duplicati.Library.Main.Operation.FilelistProcessor-RemoteUnwantedMissingFile]: Removing file listed as Temporary: duplicati-i662c99aa24df4b6ea07dc698365ee9d9.dindex.zip.aes

That looks plausible, except folder paths beginning with / have always been troublesome; if you got away with it before, maybe it will still work. Can it Test connection? The next gentle step could be a Verify files. If that all works, then maybe test a backup?

There’s a chance that your application key isn’t S3 enabled, but new keys can be created.

That appears to have been the problem. The leading / doesn’t seem to be a problem with the b2 backend, but the s3 backend doesn’t like it.

It occurred to me that there is a simple test I could - and should - have performed:

rclone ls b2-scanner:Duplicati-scanner/scanner | grep dlist
 15021821 duplicati-20250717T232834Z.dlist.zip.aes

i.e., check to confirm whether or not the dlist file really is missing. It turns out that, yes, the dlist file it is looking for does not exist. However, an older dlist file from a fileset that was deleted does exist.

I’m thinking at this point I should just delete any existing filesets, delete any files that are left in the bucket, and start over.

EDIT:
Apparently the existing dlist file is correct - it is the 1st backup which reported no errors. The missing dlist file is for the 2nd backup which is causing the problem.

This looks like an alternative to the B2 web UI method, or to looking in the database at a list which, BTW, is also available in the server live log and (for history, though it can require scrolling) in <job> → Show log → Remote. Trying to line up the log sources with the actual destination contents might yield insight.

One can always do that if willing, but I can’t predict whether it will improve the results any.

I have this specific backup running again - or at least trying - after:

  • repair - fails, suggests using the rebuild missing blocks option
  • repair w/rebuild missing blocks - succeeds, reports issues and suggests using purge-broken-files
  • purge-broken-files - works
  • delete --version=0 - because I don’t like that fileset

I will enable a log file for one of the other backups that has been complaining about missing files on a more frequent basis.

Was this on the Backblaze B2 S3 API? Does it seem to time out less than on the B2 native API?

I went to look for a simple tester for B2, and wound up using the BackendTester tool like:

C:\Duplicati\duplicati-2.1.0.124_canary_2025-07-11-win-x64-gui>Duplicati.CommandLine.BackendTester "b2://BUCKET/timeout?auth-username=REDACTED&auth-password=REDACTED&read-write-timeout=15" --reruns=1
7/23/2025 8:54:31 PM
[20:54:31 774] Starting run no 0
[20:54:34 153] Read Write Timeout set to 60000 ms
[20:54:34 155] Generating file 0 (27.02 MiB)
[20:54:34 637] Generating file 1 (41.98 MiB)
[20:54:35 438] Generating file 2 (15.40 MiB)
[20:54:35 686] Generating file 3 (45.93 MiB)
[20:54:36 455] Generating file 4 (7.67 MiB)
[20:54:36 587] Generating file 5 (5.95 MiB)
[20:54:36 722] Generating file 6 (7.18 MiB)
[20:54:36 819] Generating file 7 (28.29 MiB)
[20:54:37 405] Generating file 8 (33.61 MiB)
[20:54:38 048] Generating file 9 (37.19 MiB)
[20:54:38 711] Uploading wrong files ...
[20:54:38 711] Generating file 10 (1.26 KiB)
[20:54:38 727] Uploading file 0, 1.26 KiB ...  done! in 591 ms (~2.14 KiB/s)
[20:54:39 320] Uploading file 0, 1.26 KiB ...  done! in 581 ms (~2.17 KiB/s)
[20:54:39 902] Uploading file 9, 1.26 KiB ...  done! in 341 ms (~3.70 KiB/s)
[20:54:40 244] Uploading files ...
[20:54:40 244] Uploading file 0, 27.02 MiB ...  done! in 7309 ms (~3.70 MiB/s)
[20:54:47 554] Uploading file 1, 41.98 MiB ... [20:55:21 500] Failed to upload file 1, error message: System.TimeoutException: The operation has timed out.
   at Duplicati.StreamUtil.TimeoutObservingStream.ReadImplAsync(Byte[] buffer, Int32 offset, Int32 count, CancellationToken cancellationToken)
   at Duplicati.StreamUtil.ThrottleEnabledStream.ReadAsync(Byte[] buffer, Int32 offset, Int32 count, CancellationToken cancellationToken)
   at Duplicati.StreamUtil.TimeoutObservingStream.ReadImplAsync(Byte[] buffer, Int32 offset, Int32 count, CancellationToken cancellationToken)
   at System.IO.Stream.<CopyToAsync>g__Core|27_0(Stream source, Stream destination, Int32 bufferSize, CancellationToken cancellationToken)
   at System.Net.Http.StreamToStreamCopy.<CopyAsync>g__DisposeSourceAsync|1_0(Task copyTask, Stream source)
   at System.Net.Http.HttpContent.<CopyToAsync>g__WaitAsync|56_0(ValueTask copyTask)
   at System.Net.Http.HttpConnection.SendRequestContentAsync(HttpRequestMessage request, HttpContentWriteStream stream, Boolean async, CancellationToken cancellationToken)
   at System.Net.Http.HttpConnection.SendAsync(HttpRequestMessage request, Boolean async, CancellationToken cancellationToken)
   at System.Net.Http.HttpConnection.SendAsync(HttpRequestMessage request, Boolean async, CancellationToken cancellationToken)
   at System.Net.Http.HttpConnectionPool.SendWithVersionDetectionAndRetryAsync(HttpRequestMessage request, Boolean async, Boolean doRequestAuth, CancellationToken cancellationToken)
   at System.Net.Http.RedirectHandler.SendAsync(HttpRequestMessage request, Boolean async, CancellationToken cancellationToken)
   at System.Net.Http.HttpClient.<SendAsync>g__Core|83_0(HttpRequestMessage request, HttpCompletionOption completionOption, CancellationTokenSource cts, Boolean disposeCts, CancellationTokenSource pendingRequestsCts, CancellationToken originalCancellationToken)
   at Duplicati.Library.Backend.Backblaze.B2.PutAsync(String remotename, Stream stream, CancellationToken cancelToken)
   at Duplicati.Library.Utility.Utility.Await(Task task)
   at Duplicati.CommandLine.BackendTester.Program.Uploadfile(String localfilename, Int32 i, String remotefilename, IBackend backend, Boolean disableStreaming, Int64 throttle, Int32 readWriteTimeout), remote name: gHQBTkjh2eHRH9pHhQDPTjbLX after 33946 ms
[20:55:21 513] Uploading file 2, 15.40 MiB ...  done! in 2258 ms (~6.82 MiB/s)
[20:55:23 772] Uploading file 3, 45.93 MiB ...  done! in 13738 ms (~3.34 MiB/s)
[20:55:37 511] Uploading file 4, 7.67 MiB ...  done! in 9771 ms (~803.51 KiB/s)
[20:55:47 283] Uploading file 5, 5.95 MiB ...  done! in 6736 ms (~904.69 KiB/s)
[20:55:54 020] Uploading file 6, 7.18 MiB ...  done! in 5763 ms (~1.25 MiB/s)
[20:55:59 784] Uploading file 7, 28.29 MiB ... [20:56:26 539] Failed to upload file 7, error message: System.TimeoutException: The operation has timed out.
   at Duplicati.StreamUtil.TimeoutObservingStream.ReadImplAsync(Byte[] buffer, Int32 offset, Int32 count, CancellationToken cancellationToken)
   at Duplicati.StreamUtil.ThrottleEnabledStream.ReadAsync(Byte[] buffer, Int32 offset, Int32 count, CancellationToken cancellationToken)
   at Duplicati.StreamUtil.TimeoutObservingStream.ReadImplAsync(Byte[] buffer, Int32 offset, Int32 count, CancellationToken cancellationToken)
   at System.IO.Stream.<CopyToAsync>g__Core|27_0(Stream source, Stream destination, Int32 bufferSize, CancellationToken cancellationToken)
   at System.Net.Http.StreamToStreamCopy.<CopyAsync>g__DisposeSourceAsync|1_0(Task copyTask, Stream source)
   at System.Net.Http.HttpContent.<CopyToAsync>g__WaitAsync|56_0(ValueTask copyTask)
   at System.Net.Http.HttpConnection.SendRequestContentAsync(HttpRequestMessage request, HttpContentWriteStream stream, Boolean async, CancellationToken cancellationToken)
   at System.Net.Http.HttpConnection.SendAsync(HttpRequestMessage request, Boolean async, CancellationToken cancellationToken)
   at System.Net.Http.HttpConnection.SendAsync(HttpRequestMessage request, Boolean async, CancellationToken cancellationToken)
   at System.Net.Http.HttpConnectionPool.SendWithVersionDetectionAndRetryAsync(HttpRequestMessage request, Boolean async, Boolean doRequestAuth, CancellationToken cancellationToken)
   at System.Net.Http.RedirectHandler.SendAsync(HttpRequestMessage request, Boolean async, CancellationToken cancellationToken)
   at System.Net.Http.HttpClient.<SendAsync>g__Core|83_0(HttpRequestMessage request, HttpCompletionOption completionOption, CancellationTokenSource cts, Boolean disposeCts, CancellationTokenSource pendingRequestsCts, CancellationToken originalCancellationToken)
   at Duplicati.Library.Backend.Backblaze.B2.PutAsync(String remotename, Stream stream, CancellationToken cancelToken)
   at Duplicati.Library.Utility.Utility.Await(Task task)
   at Duplicati.CommandLine.BackendTester.Program.Uploadfile(String localfilename, Int32 i, String remotefilename, IBackend backend, Boolean disableStreaming, Int64 throttle, Int32 readWriteTimeout), remote name: HYF81nf after 26754 ms
[20:56:26 541] Uploading file 8, 33.61 MiB ...  done! in 6487 ms (~5.18 MiB/s)
[20:56:33 029] Uploading file 9, 37.19 MiB ...  done! in 15008 ms (~2.48 MiB/s)
[20:56:48 037] Verifying file list ...
[20:56:48 118] *** File 2 with name 8HwksHAbUmi3PVTMtMLKpi1wpOLqp6WsAirPZRGpca4M7qbWQBVboUr4cyWTxhxp8 was uploaded but not found afterwards
[20:56:48 118] *** File 8 with name MyTFPA4 was uploaded but not found afterwards
[20:56:48 119] Downloading files
[20:56:48 120] Downloading file 0 ...  done in 2231 ms (~12.11 MiB/s)
[20:56:50 352] Checking hash ... done
[20:56:50 577] Downloading file 1 ...  done in 2589 ms (~16.21 MiB/s)
[20:56:53 168] Checking hash ... done
[20:56:53 647] Downloading file 2 ... [20:56:54 058] failed
*** Error: Duplicati.Library.Interface.FileMissingException: The requested file does not exist
   at Duplicati.Library.Backend.Backblaze.B2.GetAsync(String remotename, Stream stream, CancellationToken cancellationToken)
   at Duplicati.Library.Utility.Utility.Await(Task task)
   at Duplicati.CommandLine.BackendTester.Program.Run(List`1 args, Dictionary`2 options, Boolean first) after 409 ms
[20:56:54 059] Checking hash ... [20:56:54 059] failed
*** Downloaded file was corrupt
[20:56:54 060] Downloading file 3 ...  done in 3001 ms (~15.30 MiB/s)
[20:56:57 062] Checking hash ... done
[20:56:57 539] Downloading file 4 ...  done in 519 ms (~14.76 MiB/s)
[20:56:58 061] Checking hash ... done
[20:56:58 131] Downloading file 5 ...  done in 451 ms (~13.18 MiB/s)
[20:56:58 585] Checking hash ... done
[20:56:58 649] Downloading file 6 ...  done in 518 ms (~13.85 MiB/s)
[20:56:59 169] Checking hash ... done
[20:56:59 235] Downloading file 7 ...  done in 1715 ms (~16.49 MiB/s)
[20:57:00 952] Checking hash ... done
[20:57:01 256] Downloading file 8 ... [20:57:01 658] failed
*** Error: Duplicati.Library.Interface.FileMissingException: The requested file does not exist
   at Duplicati.Library.Backend.Backblaze.B2.GetAsync(String remotename, Stream stream, CancellationToken cancellationToken)
   at Duplicati.Library.Utility.Utility.Await(Task task)
   at Duplicati.CommandLine.BackendTester.Program.Run(List`1 args, Dictionary`2 options, Boolean first) after 401 ms
[20:57:01 659] Checking hash ... [20:57:01 659] failed
*** Downloaded file was corrupt
[20:57:01 660] Downloading file 9 ...  done in 2176 ms (~17.09 MiB/s)
[20:57:03 836] Checking hash ... done
[20:57:04 135] Deleting files...
[20:57:04 135] Deleting file 0
[20:57:04 219] Deleting file 1
[20:57:04 302] Deleting file 2
[20:57:04 385] *** Failed to delete file 8HwksHAbUmi3PVTMtMLKpi1wpOLqp6WsAirPZRGpca4M7qbWQBVboUr4cyWTxhxp8, message: Duplicati.Library.Interface.FileMissingException: The requested file does not exist
   at Duplicati.Library.Backend.Backblaze.B2.DeleteAsync(String remotename, CancellationToken cancellationToken)
   at Duplicati.Library.Utility.Utility.Await(Task task)
   at Duplicati.CommandLine.BackendTester.Program.Run(List`1 args, Dictionary`2 options, Boolean first)
[20:57:04 386] Deleting file 3
[20:57:04 550] Deleting file 4
[20:57:04 644] Deleting file 5
[20:57:04 728] Deleting file 6
[20:57:04 812] Deleting file 7
[20:57:04 898] Deleting file 8
[20:57:04 979] *** Failed to delete file MyTFPA4, message: Duplicati.Library.Interface.FileMissingException: The requested file does not exist
   at Duplicati.Library.Backend.Backblaze.B2.DeleteAsync(String remotename, CancellationToken cancellationToken)
   at Duplicati.Library.Utility.Utility.Await(Task task)
   at Duplicati.CommandLine.BackendTester.Program.Run(List`1 args, Dictionary`2 options, Boolean first)
[20:57:04 979] Deleting file 9
[20:57:05 232] Checking retrieval of non-existent file...
[20:57:05 423] Caught expected FileMissingException
[20:57:05 435] Checking DNS names used by this backend...
api.backblazeb2.com
api001.backblazeb2.com
f001.backblazeb2.com

C:\Duplicati\duplicati-2.1.0.124_canary_2025-07-11-win-x64-gui>

Summary of the Wireshark TCP stream that covered the several uploads up to the timeout:

Frame  Time   Event
   87  54:38  start of stream
   95  54:38  start of upload
25006  54:47  start of upload that timed out at 55:21
72818  55:23  B2 sends FIN and Duplicati PC sends RST

It’s a 45-second TCP run, and its upload activity looks like this:

[throughput graph]

Roughly 9 seconds in on the graph is the last upload, the one Duplicati timed out.
I’m not seeing anything that I would call a 15-second stalled connection in there.
As a side note, TCP-level stalls will retry for a while, so Duplicati can time out first.
But I’m not sure there’s any such network hiccup here. FWIW, downstream is:

[throughput/goodput graph]

so if one is motivated enough, one might be able to label the other uploads in it.

Interestingly, the two timed-out uploads completed and then got verified successfully.
There were also two that Duplicati thought it uploaded that didn’t actually arrive.

EDIT 1:

The hover-over definition of Goodput, which my capture tool wasn’t able to capture:

“Display graph of mean ACKd Bytes vs Time”. Throughput is Transmitted Bytes.
They seem to track very closely though. Throughput is brown. Goodput is green.

Using the B2 backend.

I repeated the prior 15-second-timeout BackendTester run with an S3-enabled bucket and a different access key, and got a clean run. However, I think I’ve had clean runs with the B2 backend too, when upload speeds ran faster. Possibly the speed depends on what particular equipment Backblaze picks for the upload.

Are you willing to try your luck using a Backblaze S3-enabled bucket to see if it works?

EDIT 1:

It might not help though. I did a run with S3 minio, then one with S3 aws. Both had three timeouts, then downloaded and verified those files successfully. Other prior oddities persist, such as the minio run having two files that seemingly uploaded cleanly but weren’t there.

EDIT 2:

I found that a simpler test and result are available with a BackendTester put to native B2.
It ignores some options such as throttle-upload, but read-write-timeout=15s seems to be honored.
The test was a 40 MB file. The graph is nice and smooth as usual, and yet Duplicati says timeout.
I’m not sure I like the way B2 is ending the upload by closing the connection after its ACKs:

No.     Time                            Source                          Destination     Protocol        Length  Info
75494   2025-07-24 18:45:52.973512      pod-000-1041-16.backblaze.com   192.168.86.175  TCP             54      https(443) → 64485 [ACK] Seq=3090 Ack=41336415 Win=2227584 Len=0
75495	2025-07-24 18:45:52.973512	pod-000-1041-16.backblaze.com	192.168.86.175	TCP	54	https(443) → 64485 [ACK] Seq=3090 Ack=41338120 Win=2227584 Len=0
75496	2025-07-24 18:45:52.973512	pod-000-1041-16.backblaze.com	192.168.86.175	TLSv1.2	85	Encrypted Alert
75497	2025-07-24 18:45:52.973512	pod-000-1041-16.backblaze.com	192.168.86.175	TCP	54	https(443) → 64485 [FIN, ACK] Seq=3121 Ack=41338120 Win=2227584 Len=0
75498	2025-07-24 18:45:52.973557	192.168.86.175	pod-000-1041-16.backblaze.com	TCP	54	64485 → https(443) [RST, ACK] Seq=41338120 Ack=3121 Win=0 Len=0

My first put worked, and I think this is how that one ended, maybe with an HTTP response:

No.     Time                            Source          Destination     Protocol        Length  Info
36171   2025-07-24 18:45:16.785470      149.137.139.26  192.168.86.175  TCP             60      https(443) → 64480 [ACK] Seq=3090 Ack=41338030 Win=2136192 Len=0
36172   2025-07-24 18:45:16.785470      149.137.139.26  192.168.86.175  TCP             60      https(443) → 64480 [ACK] Seq=3090 Ack=41338119 Win=2136192 Len=0
36173   2025-07-24 18:45:16.874994      149.137.139.26  192.168.86.175  TLSv1.2         1086    Application Data
36176   2025-07-24 18:45:16.926728      192.168.86.175  149.137.139.26  TCP             54      64480 → https(443) [ACK] Seq=41338119 Ack=4122 Win=130304 Len=0
36197   2025-07-24 18:45:17.466268      192.168.86.175  149.137.139.26  TCP             54      64480 → https(443) [FIN, ACK] Seq=41338119 Ack=4122 Win=130304 Len=0
36199   2025-07-24 18:45:17.543008      149.137.139.26  192.168.86.175  TCP             54      https(443) → 64480 [FIN, ACK] Seq=4122 Ack=41338120 Win=2136192 Len=0
36200   2025-07-24 18:45:17.543058      192.168.86.175  149.137.139.26  TCP             54      64480 → https(443) [ACK] Seq=41338120 Ack=4123 Win=130304 Len=0

What’s a bit puzzling is why two B2 S3 clients also see timeouts. Someone else should test.

EDIT 3:

At least some of my B2 S3 testing was inadvertently done to an S3-enabled bucket over the B2 protocol. At this point, the Backblaze S3 option seems to be working. I also filed an issue:

Backblaze B2 uploads fail, often with timeout. Big read-write-timeout may help. #6438

Did you ever try S3? See the edits to the post above for more news, including an opened issue.

In Wireshark testing, the pattern of B2 closing the connection in the timeout case continued, although I couldn’t figure out how the 15-second Duplicati timeout signaled B2 to do that.

My low-level debug tools remain weak with encrypted data. Hoping that devs have ways. Backend types proliferate, but the technical ability to support them doesn’t seem good to me.

EDIT 1:

I’ve been deliberately provoking my production B2 backup to see if I could get the reported backup corruption you mentioned. After numerous timeout failures, it always fixes itself up automatically on the next backup, and the fixup appears good according to my validation test.

I posted some examples of fixups at the start of backup that you should expect to be seeing if staying on B2 with timeouts. If you move to S3, things may go better, except for figuring out the corruption problem. There are other ways to troubleshoot too, and maybe the dev will help.

I have not tried S3 yet, as I have been hoping to get more info from the log files I have enabled. [See next post!]

I have a failure recorded with verbose logging.

Extracting from the logs, this morning it reports:

2025-07-29 03:50:08 -04 - [Error-Duplicati.Library.Main.Controller-FailedOperation]: The operation Backup has failed
Duplicati.Library.Interface.RemoteListVerificationException: Found 2 files that are missing from the remote storage, please run repair

The missing files are reported as:

2025-07-29 03:50:08 -04 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingFile]: Missing file: duplicati-bc30024dd65654e5882cb31ce1c09b7b4.dblock.zip.aes
2025-07-29 03:50:08 -04 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingFile]: Missing file: duplicati-b52eb81712ff6466a86a2b6e222a56348.dblock.zip.aes

Looking at the log file entries for the 1st missing file, we find:

[from the backup that claimed to be successful]

2025-07-29 03:42:45 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b52eb81712ff6466a86a2b6e222a56348.dblock.zip.aes (49.42 MiB)
2025-07-29 03:44:19 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b52eb81712ff6466a86a2b6e222a56348.dblock.zip.aes (49.42 MiB)

[from this morning’s backup that failed]

2025-07-29 03:50:08 -04 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingFile]: Missing file: duplicati-b52eb81712ff6466a86a2b6e222a56348.dblock.zip.aes

[from running repair]

Duplicati.Library.Interface.UserInformationException: The backup storage destination is missing data files. You can either enable `--rebuild-missing-dblock-files` or run the purge command to remove these files. The following files are missing: duplicati-bc30024dd65654e5882cb31ce1c09b7b4.dblock.zip.aes, duplicati-b52eb81712ff6466a86a2b6e222a56348.dblock.zip.aes

[repair failed, from re-running with “rebuild missing blocks”]

2025-07-29 08:59:52 -04 - [Warning-Duplicati.Library.Main.Operation.RepairHandler-RepairMissingBlocks]: Repair acquired 2 blocks for volume duplicati-b52eb81712ff6466a86a2b6e222a56348.dblock.zip.aes, but 71 blocks are still missing. If you want to continue working with the database, you can use the "list-broken-files" and "purge-broken-files" commands to purge the missing data from the database and the remote storage.

[purge-broken-files was performed, from running backup]

duplicati-Config-Dir-2025-07-29.4.backup.log:2025-07-29 09:00:38 -04 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingFile]: Missing file: duplicati-b52eb81712ff6466a86a2b6e222a56348.dblock.zip.aes
duplicati-Config-Dir-2025-07-29.6.backup.log:2025-07-29 09:02:21 -04 - [Information-Duplicati.Library.Main.Operation.FilelistProcessor-RemoteUnwantedMissingFile]: Removing file listed as Deleting: duplicati-b52eb81712ff6466a86a2b6e222a56348.dblock.zip.aes

So according to the logs the initial “Put” completed, but the file never showed up afterwards, suggesting the “completed” status was a false positive. This is consistent with ts678’s tests.

I am assuming that the need to use “rebuild missing blocks” with repair is due to the data no longer being present on the disk; the repair-with-rebuild then creates a broken file due to the unavailable blocks, and the broken file has to be purged before the backup is able to run again.

If someone wants to look over the log files I can put them up on Google Drive. The only other thing of interest I see is entries similar to:

2025-07-29 09:00:01 -04 - [Warning-Duplicati.Library.Main.Operation.RepairHandler-LargeEmptyIndexFile]: The empty index file duplicati-id31d35115da74a948b2fd44700989f10.dindex.zip.aes is larger than expected (3917 bytes), choosing not to delete it

Since I can’t repro exactly that failure, I looked into a possible path by which I get to success.
If you kept an archive of the old intermediate databases, there would be more to look at.

My theory is that a dblock goes from Uploading to Uploaded when B2 reports a success.
Uploading has a recovery code path for a missing file case that Uploaded does not have.

Its style of message looks like:

2025-07-25 20:20:32 -04 - [Information-Duplicati.Library.Main.Operation.FilelistProcessor-SchedulingMissingFileForDelete]: Scheduling missing file for deletion, currently listed as Uploading: duplicati-b2a1113bb76f5498fa922b6e829bc7ef1.dblock.zip.aes

Rough steps to a successful recovery or simpler investigation of issues here:

  • Prepare A.txt and B.txt containing some text like A and B respectively.
  • Set advanced option no-backend-verification true. Its help is wrong.
  • Add A.txt to Source and backup
  • Add B.txt to Source and backup
  • Sort the destination to find the dblock from the second backup. Note its name. Delete it.
  • With a DB editor, change its State in the Remotevolume table to Uploading (see the sqlite3 sketch after the log excerpts below).
  • Insert terminated-with-active-uploads as true in Configuration.
  • Toggle no-backend-verification false or delete that option instead.
  • Backup and notice its recovery without GUI noise or manual assistance.
  • Look in the job log’s “Complete log” and notice early messages like this:
  "Messages": [
    "2025-07-29 11:39:57 -04 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started",
    "2025-07-29 11:39:59 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  ()",
    "2025-07-29 11:39:59 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (6 bytes)",
    "2025-07-29 11:39:59 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: QuotaInfo - Started:  ()",
    "2025-07-29 11:39:59 -04 - [Information-Duplicati.Library.Main.Operation.FilelistProcessor-SchedulingMissingFileForDelete]: Scheduling missing file for deletion, currently listed as Uploading: duplicati-bebf4a0fbc49f47b3bc9a0b3bcdcdffdf.dblock.zip",

It looks like it gave up on the dblock that got lost while Uploading, and uploaded another.
This doesn’t happen if the dblock gets as far as Uploaded, but then isn’t actually present.

    "2025-07-29 11:39:59 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-bfff31862819c4151aad96d44ea34e52b.dblock.zip (686 bytes)",
    "2025-07-29 11:39:59 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-bfff31862819c4151aad96d44ea34e52b.dblock.zip (686 bytes)",
    "2025-07-29 11:39:59 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-i8dec0db490df4d579e07c4301966e019.dindex.zip (623 bytes)",
    "2025-07-29 11:39:59 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-i8dec0db490df4d579e07c4301966e019.dindex.zip (623 bytes)",
    "2025-07-29 11:39:59 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-20250729T153957Z.dlist.zip (763 bytes)",
    "2025-07-29 11:39:59 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-20250729T153957Z.dlist.zip (763 bytes)",

So there’s some speculation for the devs about the hard recovery that you get but I don’t.
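
For the DB-editor steps above, here is a sqlite3 sketch of the two edits, assuming the usual Remotevolume and Configuration schema; the database path and dblock name are placeholders, and this should only be done on a test job with the server stopped.

```
sqlite3 ~/.config/Duplicati/TESTJOB.sqlite <<'SQL'
-- Mark the deleted dblock as if its upload never finished.
UPDATE Remotevolume SET State = 'Uploading'
  WHERE Name = 'duplicati-bXXXX.dblock.zip';
-- Tell the next backup that uploads were interrupted.
INSERT INTO Configuration (Key, Value)
  VALUES ('terminated-with-active-uploads', 'True');
SQL
```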

EDIT 1:

I ran my backup, got a failure with the 15 second timeout, then had it clear up successfully on the next backup.
Below is a look at various levels, with the default job log, then a log-file, plus a peek into the database.

The default job log's "Complete log" gives an overview, but no detail on the reason for a retry or the result of a rename:
2025-07-29 17:35:54 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  ()
2025-07-29 17:35:56 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (939 bytes)
2025-07-29 17:39:31 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b56d89db9eb3842bcb98e274a5e27d968.dblock.zip.aes (39.97 MiB)
2025-07-29 17:40:06 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-bef519763c4c540e7b79f9169ac77cfde.dblock.zip.aes (39.96 MiB)
2025-07-29 17:40:15 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Retrying: duplicati-b56d89db9eb3842bcb98e274a5e27d968.dblock.zip.aes ()
2025-07-29 17:40:25 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Rename: duplicati-b56d89db9eb3842bcb98e274a5e27d968.dblock.zip.aes (39.97 MiB)
2025-07-29 17:40:25 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Rename: duplicati-bfec57db1599e494b8f3f47459df4dbfb.dblock.zip.aes (39.97 MiB)
2025-07-29 17:40:25 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-bfec57db1599e494b8f3f47459df4dbfb.dblock.zip.aes (39.97 MiB)
2025-07-29 17:40:41 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Retrying: duplicati-bef519763c4c540e7b79f9169ac77cfde.dblock.zip.aes ()
2025-07-29 17:40:51 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Rename: duplicati-bef519763c4c540e7b79f9169ac77cfde.dblock.zip.aes (39.96 MiB)
2025-07-29 17:40:51 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Rename: duplicati-b5c96f269b9184cf4be0e2d2aca37e61e.dblock.zip.aes (39.96 MiB)
2025-07-29 17:40:52 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b5c96f269b9184cf4be0e2d2aca37e61e.dblock.zip.aes (39.96 MiB)
2025-07-29 17:41:38 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Failed: duplicati-bfec57db1599e494b8f3f47459df4dbfb.dblock.zip.aes ()
2025-07-29 17:41:39 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Failed: duplicati-b5c96f269b9184cf4be0e2d2aca37e61e.dblock.zip.aes ()

Summarizing activity from the log-file (enabled via the options above) by filtering lines matching (BackendEvent|Retry|Rename) fills in those gaps:
2025-07-29 17:35:54 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  ()
2025-07-29 17:35:56 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (939 bytes)
2025-07-29 17:39:31 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b56d89db9eb3842bcb98e274a5e27d968.dblock.zip.aes (39.97 MiB)
2025-07-29 17:40:06 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-bef519763c4c540e7b79f9169ac77cfde.dblock.zip.aes (39.96 MiB)
2025-07-29 17:40:15 -04 - [Retry-Duplicati.Library.Main.Backend.Handler-RetryPut]: Operation Put with file duplicati-b56d89db9eb3842bcb98e274a5e27d968.dblock.zip.aes attempt 1 of 1 failed with message: The operation has timed out.
2025-07-29 17:40:15 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Retrying: duplicati-b56d89db9eb3842bcb98e274a5e27d968.dblock.zip.aes ()
2025-07-29 17:40:25 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Rename: duplicati-b56d89db9eb3842bcb98e274a5e27d968.dblock.zip.aes (39.97 MiB)
2025-07-29 17:40:25 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Rename: duplicati-bfec57db1599e494b8f3f47459df4dbfb.dblock.zip.aes (39.97 MiB)
2025-07-29 17:40:25 -04 - [Information-Duplicati.Library.Main.Backend.PutOperation-RenameRemoteTargetFile]: Renaming "duplicati-b56d89db9eb3842bcb98e274a5e27d968.dblock.zip.aes" to "duplicati-bfec57db1599e494b8f3f47459df4dbfb.dblock.zip.aes"
2025-07-29 17:40:25 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-bfec57db1599e494b8f3f47459df4dbfb.dblock.zip.aes (39.97 MiB)
2025-07-29 17:40:41 -04 - [Retry-Duplicati.Library.Main.Backend.Handler-RetryPut]: Operation Put with file duplicati-bef519763c4c540e7b79f9169ac77cfde.dblock.zip.aes attempt 1 of 1 failed with message: The operation has timed out.
2025-07-29 17:40:41 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Retrying: duplicati-bef519763c4c540e7b79f9169ac77cfde.dblock.zip.aes ()
2025-07-29 17:40:51 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Rename: duplicati-bef519763c4c540e7b79f9169ac77cfde.dblock.zip.aes (39.96 MiB)
2025-07-29 17:40:51 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Rename: duplicati-b5c96f269b9184cf4be0e2d2aca37e61e.dblock.zip.aes (39.96 MiB)
2025-07-29 17:40:51 -04 - [Information-Duplicati.Library.Main.Backend.PutOperation-RenameRemoteTargetFile]: Renaming "duplicati-bef519763c4c540e7b79f9169ac77cfde.dblock.zip.aes" to "duplicati-b5c96f269b9184cf4be0e2d2aca37e61e.dblock.zip.aes"
2025-07-29 17:40:52 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b5c96f269b9184cf4be0e2d2aca37e61e.dblock.zip.aes (39.96 MiB)
2025-07-29 17:41:38 -04 - [Retry-Duplicati.Library.Main.Backend.Handler-RetryPut]: Operation Put with file duplicati-bfec57db1599e494b8f3f47459df4dbfb.dblock.zip.aes attempt 2 of 1 failed with message: The operation has timed out.
2025-07-29 17:41:38 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Failed: duplicati-bfec57db1599e494b8f3f47459df4dbfb.dblock.zip.aes ()
2025-07-29 17:41:38 -04 - [Retry-Duplicati.Library.Main.Backend.Handler-RetryPut]: Operation Put with file duplicati-b5c96f269b9184cf4be0e2d2aca37e61e.dblock.zip.aes attempt 2 of 1 failed with message: The operation was canceled.
2025-07-29 17:41:39 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Failed: duplicati-b5c96f269b9184cf4be0e2d2aca37e61e.dblock.zip.aes ()

Looking in the folder would say what made it, but I run "rclone sync" back to local, and that log is saying:
2025/07/29 17:42:16 DEBUG : duplicati-b56d89db9eb3842bcb98e274a5e27d968.dblock.zip.aes: Need to transfer - File not found at Destination
2025/07/29 17:42:16 DEBUG : duplicati-bef519763c4c540e7b79f9169ac77cfde.dblock.zip.aes: Need to transfer - File not found at Destination
2025/07/29 17:42:16 DEBUG : duplicati-bfec57db1599e494b8f3f47459df4dbfb.dblock.zip.aes: Need to transfer - File not found at Destination
showing that both of the uploads that Duplicati failed and retried actually made it on the initial try, though I can't check whether the size is right.
The last one shown is a retry, and it too timed out but worked. The DB shows that it's the right size. Its DB State is Uploading, not Uploaded.
The DB also has terminated-with-active-uploads true, meaning no State could go to Verified from list verification. None made it to Uploaded.
This might not be a bad thing, as I'm speculating that Uploading has a recovery that Uploaded does not have. Is that a possible area to fix?
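
Backing up a step, the mirror command itself, for anyone copying the approach (remote, bucket, and paths are placeholders):

```
# Pull the bucket down to a local mirror; DEBUG logging produces the
# "Need to transfer" decisions quoted above.
rclone sync b2remote:BUCKET/folder /srv/duplicati-mirror \
  --log-level DEBUG --log-file rclone-sync.log
```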

I'll gloss over a Verify files run, as it was smart enough to not change things. I put back the old database to see if Backup could fix things.

From the job log's "Complete log", I think it hit the message limit, but one can see some of the things that are going on to get the recovery:
  "Messages": [
    "2025-07-30 07:20:05 -04 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started",
    "2025-07-30 07:20:11 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  ()",
    "2025-07-30 07:20:13 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (942 bytes)",
    "2025-07-30 07:20:13 -04 - [Information-Duplicati.Library.Main.Operation.FilelistProcessor-KeepIncompleteFile]: Keeping protected incomplete remote file listed as Temporary: duplicati-20250729T213548Z.dlist.zip.aes",
    "2025-07-30 07:20:13 -04 - [Information-Duplicati.Library.Main.Operation.FilelistProcessor-PromotingCompleteFile]: Promoting uploaded complete file from Uploading to Uploaded: duplicati-bfec57db1599e494b8f3f47459df4dbfb.dblock.zip.aes",
    "2025-07-30 07:20:13 -04 - [Information-Duplicati.Library.Main.Operation.FilelistProcessor-SchedulingMissingFileForDelete]: Scheduling missing file for deletion, currently listed as Uploading: duplicati-i42321fa81fe94516b0d4878a335994f9.dindex.zip.aes",
    "2025-07-30 07:20:13 -04 - [Information-Duplicati.Library.Main.Operation.FilelistProcessor-SchedulingMissingFileForDelete]: Scheduling missing file for deletion, currently listed as Uploading: duplicati-b5c96f269b9184cf4be0e2d2aca37e61e.dblock.zip.aes",
    "2025-07-30 07:20:13 -04 - [Information-Duplicati.Library.Main.Operation.FilelistProcessor-SchedulingMissingFileForDelete]: Scheduling missing file for deletion, currently listed as Uploading: duplicati-i4cdf2f3b3dd347aa83e86ed62fa70111.dindex.zip.aes",
    "2025-07-30 07:20:13 -04 - [Information-Duplicati.Library.Main.Operation.FilelistProcessor-RemoteUnwantedMissingFile]: Removing file listed as Temporary: duplicati-b25558dc589874b4c83dead4d3011b82e.dblock.zip.aes",
    "2025-07-30 07:20:13 -04 - [Information-Duplicati.Library.Main.Operation.FilelistProcessor-RemoteUnwantedMissingFile]: Removing file listed as Temporary: duplicati-ic975bc727387490e8c5614e90d02e7da.dindex.zip.aes",
    "2025-07-30 07:20:13 -04 - [Information-Duplicati.Library.Main.Operation.FilelistProcessor-RemoveUnwantedRemoteFile]: Removing remote file listed as Deleting: duplicati-b56d89db9eb3842bcb98e274a5e27d968.dblock.zip.aes",
    "2025-07-30 07:20:13 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Started: duplicati-b56d89db9eb3842bcb98e274a5e27d968.dblock.zip.aes ()",
    "2025-07-30 07:20:13 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Completed: duplicati-b56d89db9eb3842bcb98e274a5e27d968.dblock.zip.aes ()",
    "2025-07-30 07:20:13 -04 - [Information-Duplicati.Library.Main.Operation.FilelistProcessor-RemoveUnwantedRemoteFile]: Removing remote file listed as Deleting: duplicati-bef519763c4c540e7b79f9169ac77cfde.dblock.zip.aes",
    "2025-07-30 07:20:13 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Started: duplicati-bef519763c4c540e7b79f9169ac77cfde.dblock.zip.aes ()",
    "2025-07-30 07:20:13 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Completed: duplicati-bef519763c4c540e7b79f9169ac77cfde.dblock.zip.aes ()",
    "2025-07-30 07:21:17 -04 - [Information-Duplicati.Library.Main.Operation.Backup.UploadSyntheticFilelist-PreviousBackupFilelistUpload]: Uploading filelist from previous interrupted backup",
    "2025-07-30 07:21:28 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-20250729T213549Z.dlist.zip.aes (955.01 KiB)",
    "2025-07-30 07:21:29 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-20250729T213549Z.dlist.zip.aes (955.01 KiB)",
    "2025-07-30 07:21:29 -04 - [Information-Duplicati.Library.Main.Operation.Backup.RecreateMissingIndexFiles-RecreateMissingIndexFile]: Re-creating missing index file for duplicati-bfec57db1599e494b8f3f47459df4dbfb.dblock.zip.aes"
showing that the dblock that actually made it despite a timeout got promoted from Uploading to Uploaded, and its index file got uploaded.
The result seems to be a consistent backup; however, if some file had made it to Uploaded, was actually missing, and the backup then failed, it might not have been.
It would be interesting to see if timeout issues that don't recover can put files at Uploaded despite the file not really being uploaded.
This is sort of what my BackendTester run saw, but a database from an actual failed backup that goes on to not recover right would be best.

Another minor issue with the repair process when files are missing…

“repair” indicates files are missing and to either use the “rebuild missing blocks” option or use “purge”:

  • “purge” should presumably be “purge-broken-files”
  • “purge-broken-files” fails with a null hash error
  • as “purge-broken-files” fails, it seems that the “rebuild missing blocks” option is effectively required