Duplicati didn't stop on a dblock Put failure, and Put the dlist referencing it. Later, some seemingly successful B2 Puts didn't actually leave files

Duplicati 2.1.0.107 on Windows 10, backing up to Backblaze B2 native, did some strange things.
Allowable retries are set to 0 to invite issues, and flaky Wi-Fi probably started this.

Evidence of a network problem was pretty solid, with speeds a fraction of what's expected.
Fixing it just took putting the system to sleep and letting it wake, but the damage was done.

My logging and error detection plan worked imperfectly, but there's still a lot of evidence.

First failed job log has times of:

Start 2025-02-16 07:20:55
End 2025-02-16 07:35:15
Duration 00:14:20

Messages in server Profiling log with BackendEvent and BackendUploader tags are:

2025-02-16 07:21:04 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  ()
2025-02-16 07:21:06 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (493 bytes)
2025-02-16 07:23:56 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-be476ffad0d544711a050b3a356b03f60.dblock.zip.aes (50.01 MiB)
2025-02-16 07:23:56 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-bed1695fdb3d64f728d1e0751a8b0ee6b.dblock.zip.aes (50.03 MiB)
2025-02-16 07:24:57 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b125bf82ede184f1bbfc31eb05ca2783e.dblock.zip.aes (23.25 MiB)
2025-02-16 07:25:28 -05 - [Retry-Duplicati.Library.Main.Operation.Backup.BackendUploader-RetryPut]: Operation Put with file duplicati-b125bf82ede184f1bbfc31eb05ca2783e.dblock.zip.aes attempt 1 of 1 failed with message: The operation was canceled.
2025-02-16 07:34:25 -05 - [Profiling-Duplicati.Library.Main.Operation.Backup.BackendUploader-UploadSpeed]: Uploaded 50.01 MiB in 00:10:29.2687382, 81.39 KiB/s
2025-02-16 07:34:25 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-be476ffad0d544711a050b3a356b03f60.dblock.zip.aes (50.01 MiB)
2025-02-16 07:34:25 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-i2d1174abd6ca41519e7a7dff8383afb0.dindex.zip.aes (27.25 KiB)
2025-02-16 07:34:27 -05 - [Profiling-Duplicati.Library.Main.Operation.Backup.BackendUploader-UploadSpeed]: Uploaded 50.03 MiB in 00:10:30.8189377, 81.21 KiB/s
2025-02-16 07:34:27 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-bed1695fdb3d64f728d1e0751a8b0ee6b.dblock.zip.aes (50.03 MiB)
2025-02-16 07:34:27 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-i85a7b316825e4b6bb617b6c0db4bd59a.dindex.zip.aes (35.01 KiB)
2025-02-16 07:34:29 -05 - [Profiling-Duplicati.Library.Main.Operation.Backup.BackendUploader-UploadSpeed]: Uploaded 27.25 KiB in 00:00:03.6168321, 7.53 KiB/s
2025-02-16 07:34:29 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-i2d1174abd6ca41519e7a7dff8383afb0.dindex.zip.aes (27.25 KiB)
2025-02-16 07:34:30 -05 - [Profiling-Duplicati.Library.Main.Operation.Backup.BackendUploader-UploadSpeed]: Uploaded 35.01 KiB in 00:00:02.9578903, 11.84 KiB/s
2025-02-16 07:34:30 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-i85a7b316825e4b6bb617b6c0db4bd59a.dindex.zip.aes (35.01 KiB)
2025-02-16 07:34:30 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-20250216T122100Z.dlist.zip.aes (948.67 KiB)
2025-02-16 07:34:38 -05 - [Profiling-Duplicati.Library.Main.Operation.Backup.BackendUploader-UploadSpeed]: Uploaded 948.67 KiB in 00:00:07.7408506, 122.55 KiB/s
2025-02-16 07:34:38 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-20250216T122100Z.dlist.zip.aes (948.67 KiB)
2025-02-16 07:34:49 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Started: duplicati-20240216T132340Z.dlist.zip.aes (938.00 KiB)
2025-02-16 07:34:52 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Completed: duplicati-20240216T132340Z.dlist.zip.aes (938.00 KiB)
2025-02-16 07:34:52 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Started: duplicati-20250202T122003Z.dlist.zip.aes (947.98 KiB)
2025-02-16 07:34:53 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Completed: duplicati-20250202T122003Z.dlist.zip.aes (947.98 KiB)
2025-02-16 07:34:55 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Started: duplicati-b3df2619cb6264febba5affb69d66510c.dblock.zip.aes (16.93 MiB)
2025-02-16 07:34:56 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Completed: duplicati-b3df2619cb6264febba5affb69d66510c.dblock.zip.aes (16.93 MiB)
2025-02-16 07:34:56 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Started: duplicati-i89772384028847d28da08e28ad9cfcc7.dindex.zip.aes (57.37 KiB)
2025-02-16 07:34:57 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Completed: duplicati-i89772384028847d28da08e28ad9cfcc7.dindex.zip.aes (57.37 KiB)
2025-02-16 07:34:58 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-verification.json (102.96 KiB)
2025-02-16 07:35:15 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Failed: duplicati-verification.json (102.96 KiB)

The detail on the dblock error was:

2025-02-16 07:24:57 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b125bf82ede184f1bbfc31eb05ca2783e.dblock.zip.aes (23.25 MiB)
...
2025-02-16 07:25:28 -05 - [Retry-Duplicati.Library.Main.Operation.Backup.BackendUploader-RetryPut]: Operation Put with file duplicati-b125bf82ede184f1bbfc31eb05ca2783e.dblock.zip.aes attempt 1 of 1 failed with message: The operation was canceled.
System.Threading.Tasks.TaskCanceledException: The operation was canceled.
 ---> System.Threading.Tasks.TaskCanceledException: The operation was canceled.
 ---> System.IO.IOException: Unable to read data from the transport connection: The I/O operation has been aborted because of either a thread exit or an application request..
 ---> System.Net.Sockets.SocketException (995): The I/O operation has been aborted because of either a thread exit or an application request.
   --- End of inner exception stack trace ---
   at System.Net.Sockets.Socket.AwaitableSocketAsyncEventArgs.ThrowException(SocketError error, CancellationToken cancellationToken)
   at System.Net.Sockets.Socket.AwaitableSocketAsyncEventArgs.System.Threading.Tasks.Sources.IValueTaskSource<System.Int32>.GetResult(Int16 token)
   at System.Net.Security.SslStream.EnsureFullTlsFrameAsync[TIOAdapter](CancellationToken cancellationToken, Int32 estimatedSize)
   at System.Runtime.CompilerServices.PoolingAsyncValueTaskMethodBuilder`1.StateMachineBox`1.System.Threading.Tasks.Sources.IValueTaskSource<TResult>.GetResult(Int16 token)
   at System.Net.Security.SslStream.ReadAsyncInternal[TIOAdapter](Memory`1 buffer, CancellationToken cancellationToken)
   at System.Runtime.CompilerServices.PoolingAsyncValueTaskMethodBuilder`1.StateMachineBox`1.System.Threading.Tasks.Sources.IValueTaskSource<TResult>.GetResult(Int16 token)
   at System.Net.Http.HttpConnection.SendAsync(HttpRequestMessage request, Boolean async, CancellationToken cancellationToken)
   --- End of inner exception stack trace ---
   at System.Net.Http.HttpConnection.SendAsync(HttpRequestMessage request, Boolean async, CancellationToken cancellationToken)
   at System.Net.Http.HttpConnectionPool.SendWithVersionDetectionAndRetryAsync(HttpRequestMessage request, Boolean async, Boolean doRequestAuth, CancellationToken cancellationToken)
   at System.Net.Http.RedirectHandler.SendAsync(HttpRequestMessage request, Boolean async, CancellationToken cancellationToken)
   at Microsoft.Extensions.Http.Logging.LoggingHttpMessageHandler.<SendCoreAsync>g__Core|5_0(HttpRequestMessage request, Boolean useAsync, CancellationToken cancellationToken)
   at Microsoft.Extensions.Http.Logging.LoggingScopeHttpMessageHandler.<SendCoreAsync>g__Core|5_0(HttpRequestMessage request, Boolean useAsync, CancellationToken cancellationToken)
   at System.Net.Http.HttpClient.<SendAsync>g__Core|83_0(HttpRequestMessage request, HttpCompletionOption completionOption, CancellationTokenSource cts, Boolean disposeCts, CancellationTokenSource pendingRequestsCts, CancellationToken originalCancellationToken)
   --- End of inner exception stack trace ---
   at System.Net.Http.HttpClient.HandleFailure(Exception e, Boolean telemetryStarted, HttpResponseMessage response, CancellationTokenSource cts, CancellationToken cancellationToken, CancellationTokenSource pendingRequestsCts)
   at System.Net.Http.HttpClient.<SendAsync>g__Core|83_0(HttpRequestMessage request, HttpCompletionOption completionOption, CancellationTokenSource cts, Boolean disposeCts, CancellationTokenSource pendingRequestsCts, CancellationToken originalCancellationToken)
   at Duplicati.Library.JsonWebHelperHttpClient.GetResponseAsync(HttpRequestMessage req, CancellationToken cancellationToken)
   at Duplicati.Library.JsonWebHelperHttpClient.ReadJsonResponseAsync[T](HttpRequestMessage req, CancellationToken cancellationToken)
   at Duplicati.Library.JsonWebHelperHttpClient.GetJsonDataAsync[T](String url, CancellationToken cancellationToken, Action`1 setup)
   at Duplicati.Library.JsonWebHelperHttpClient.PostAndGetJsonDataAsync[T](String url, Object item, CancellationToken cancellationToken)
   at Duplicati.Library.Backend.Backblaze.B2.ListAsync(CancellationToken cancellationToken)
   at Duplicati.Library.Backend.Backblaze.B2.PutAsync(String remotename, Stream stream, CancellationToken cancelToken)
   at Duplicati.Library.Main.Operation.Backup.BackendUploader.DoPut(FileEntryItem item, IBackend backend, CancellationToken cancelToken)
   at Duplicati.Library.Main.Operation.Backup.BackendUploader.<>c__DisplayClass25_0.<<UploadFileAsync>b__0>d.MoveNext()
--- End of stack trace from previous location ---
   at Duplicati.Library.Main.Operation.Backup.BackendUploader.DoWithRetry(Func`1 method, FileEntryItem item, Worker worker, CancellationToken cancelToken)

and was possibly caused by this new timeout option:

C:\Duplicati\duplicati-2.1.0.107_canary_2025-01-17-win-x64-gui>Duplicati.CommandLine help read-write-timeout
  --read-write-timeout (Timespan): Set the read/write timeout for the connection
    The read/write timeout is the maximum amount of time to wait for any activity during a transfer. If no activity is
    detected for this period, the connection is considered broken and the transfer is aborted. Set to 0s to disabled
    * default value: 30s
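With retries at 0 and flaky Wi-Fi, a 30-second stall is enough to kill a transfer. As a sketch (hypothetical bucket, folder, and source path; only the option name comes from the help text above), the timeout could be raised for a run like this:

```shell
REM Hypothetical invocation: raise the inactivity timeout from the
REM 30s default to 2 minutes for one backup run.
Duplicati.CommandLine.exe backup "b2://mybucket/myfolder" "C:\Data" --read-write-timeout=120s
```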

My first surprise was that it kept going after a Put failure. Didn't it used to stop?
What it wound up doing was uploading a dlist referencing a dblock that wasn't there.
This was reported by my backup sanity checker script. Let's look at some errors:

Missing dlist blocklist from dindex vol/ blocks {'ci8QqioewhgvToVX0PxXlWxWlVc59fOJe8XOpTsYHck=', '6PPDzlV79kzDgSc5ICBaht4nwph2hmFNYMnbSZzYF/8='}

In database history from around 7:25, both blocklist hashes lead to the missing file.

Missing small file data block from dindex vol/ blocks {'DFMUdE9GfSZ9koSo/tK98uF7dEdEaSZvp/kBNQ4nY/g='}

Missing metadata block from dindex vol/ blocks {'SxktK2gNItsQDoOC3aFkkLzpnZRSQNF181j0uau4BwQ='}

Both of those blocks are also in the missing dblock file.
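For context, the core of such a sanity check is just a set difference (a simplified sketch, not my actual script): gather every block hash that the dlist and dindex files reference, gather the blocks present in uploaded dblock volumes, and flag what is referenced but absent.

```python
def find_missing_blocks(referenced, present_by_volume):
    """Report block hashes that are referenced (by dlist blocklists or
    dindex entries) but not found in any uploaded dblock volume."""
    present = set()
    for volume, blocks in present_by_volume.items():
        present.update(blocks)
    return referenced - present

# Toy data mimicking the situation above: two blocklist hashes point
# into a dblock volume that never made it to the backend.
referenced = {"ci8Qqio...", "6PPDzlV...", "DFMUdE9..."}
present_by_volume = {
    "duplicati-be476...dblock.zip.aes": {"DFMUdE9..."},
}
print(sorted(find_missing_blocks(referenced, present_by_volume)))
```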

Duplicati tries to clean up file issues at the start of each backup. Profiling log:

2025-02-16 08:52:42 -05 - [Information-Duplicati.Library.Main.Operation.FilelistProcessor-SchedulingMissingFileForDelete]: scheduling missing file for deletion, currently listed as Uploading: duplicati-b125bf82ede184f1bbfc31eb05ca2783e.dblock.zip.aes

It doesn't actually wind up doing the delete, because the dblock was never uploaded.
I think I overwrote one historical database, but regardless, the bad dblock is gone.
The BlocklistHash table no longer has either reference pointing to the missing file.

The dblock to dindex mappings look like this, based on the vol entries in each dindex:

duplicati-bed1695fdb3d64f728d1e0751a8b0ee6b.dblock.zip.aes
duplicati-i85a7b316825e4b6bb617b6c0db4bd59a.dindex.zip.aes

duplicati-be476ffad0d544711a050b3a356b03f60.dblock.zip.aes
duplicati-i2d1174abd6ca41519e7a7dff8383afb0.dindex.zip.aes

duplicati-b125bf82ede184f1bbfc31eb05ca2783e.dblock.zip.aes
(which failed its upload and, seemingly, never got a dindex uploaded either)

So is everything fixed? Probably not, because duplicati-20250216T122100Z.dlist.zip.aes
still references the dblock that never got uploaded (while the dlist itself did).

I used BackendTool to get a current copy, got its filelist.json, and found this:

"blocklists":["ci8QqioewhgvToVX0PxXlWxWlVc59fOJe8XOpTsYHck="]
"blocklists":["6PPDzlV79kzDgSc5ICBaht4nwph2hmFNYMnbSZzYF/8="]
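Extracting those references can be scripted. A sketch, assuming the dlist has already been decrypted to a plain zip whose filelist.json is a JSON array of file entries, some carrying a "blocklists" list (hypothetical helper, not a Duplicati API):

```python
import json
import zipfile

def dlist_blocklists(dlist_zip):
    """Collect every blocklist hash referenced by filelist.json inside
    a decrypted dlist zip (accepts a path or a file-like object)."""
    with zipfile.ZipFile(dlist_zip) as z:
        entries = json.loads(z.read("filelist.json"))
    # Not every entry has blocklists; small files keep their block inline.
    return {h for entry in entries for h in entry.get("blocklists", [])}
```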

So I suspect Recreate would be messy. I won’t immediately try, as B2 added this:

Found 5 files that are missing from the remote storage, please run repair

A red popup error appeared on a backup after getting the Wi-Fi working again. The live log has:

Feb 16, 2025 10:55 AM: The operation Test has failed with error: Found 5 files that are missing from the remote storage, please run repair
Feb 16, 2025 10:55 AM: Found 5 files that are missing from the remote storage, please run repair
Feb 16, 2025 10:55 AM: Missing file: duplicati-i55861405c35f4c569ad171016b5f66d5.dindex.zip.aes
Feb 16, 2025 10:55 AM: Missing file: duplicati-i162a52604e33461d8844a395e75c97df.dindex.zip.aes
Feb 16, 2025 10:55 AM: Missing file: duplicati-b29f1b378849c4e55bd43dbe9da8cf5d7.dblock.zip.aes
Feb 16, 2025 10:55 AM: Missing file: duplicati-b8deaff454e8c46d9b0524a8d3d41573b.dblock.zip.aes
Feb 16, 2025 10:55 AM: Missing file: duplicati-20250216T154934Z.dlist.zip.aes

Profiling log has:

2025-02-16 10:51:41 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-i55861405c35f4c569ad171016b5f66d5.dindex.zip.aes (62.22 KiB)
2025-02-16 10:51:41 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-i55861405c35f4c569ad171016b5f66d5.dindex.zip.aes (62.22 KiB)
2025-02-16 10:51:48 -05 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingFile]: Missing file: duplicati-i55861405c35f4c569ad171016b5f66d5.dindex.zip.aes
2025-02-16 10:55:56 -05 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingFile]: Missing file: duplicati-i55861405c35f4c569ad171016b5f66d5.dindex.zip.aes
2025-02-16 10:51:17 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-i162a52604e33461d8844a395e75c97df.dindex.zip.aes (29.23 KiB)
2025-02-16 10:51:17 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-i162a52604e33461d8844a395e75c97df.dindex.zip.aes (29.23 KiB)
2025-02-16 10:51:48 -05 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingFile]: Missing file: duplicati-i162a52604e33461d8844a395e75c97df.dindex.zip.aes
2025-02-16 10:55:56 -05 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingFile]: Missing file: duplicati-i162a52604e33461d8844a395e75c97df.dindex.zip.aes
2025-02-16 10:51:36 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b29f1b378849c4e55bd43dbe9da8cf5d7.dblock.zip.aes (22.62 MiB)
2025-02-16 10:51:40 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b29f1b378849c4e55bd43dbe9da8cf5d7.dblock.zip.aes (22.62 MiB)
2025-02-16 10:51:48 -05 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingFile]: Missing file: duplicati-b29f1b378849c4e55bd43dbe9da8cf5d7.dblock.zip.aes
2025-02-16 10:55:56 -05 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingFile]: Missing file: duplicati-b29f1b378849c4e55bd43dbe9da8cf5d7.dblock.zip.aes
2025-02-16 10:50:54 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b8deaff454e8c46d9b0524a8d3d41573b.dblock.zip.aes (50.01 MiB)
2025-02-16 10:51:11 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b8deaff454e8c46d9b0524a8d3d41573b.dblock.zip.aes (50.01 MiB)
2025-02-16 10:51:48 -05 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingFile]: Missing file: duplicati-b8deaff454e8c46d9b0524a8d3d41573b.dblock.zip.aes
2025-02-16 10:55:56 -05 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingFile]: Missing file: duplicati-b8deaff454e8c46d9b0524a8d3d41573b.dblock.zip.aes
2025-02-16 10:51:41 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-20250216T154934Z.dlist.zip.aes (948.67 KiB)
2025-02-16 10:51:42 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-20250216T154934Z.dlist.zip.aes (948.67 KiB)
2025-02-16 10:51:48 -05 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingFile]: Missing file: duplicati-20250216T154934Z.dlist.zip.aes
2025-02-16 10:55:56 -05 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingFile]: Missing file: duplicati-20250216T154934Z.dlist.zip.aes

I can't find any of those files in the Backblaze B2 web UI for the bucket and folder either.
This isn't the sort of thing I usually see Duplicati or B2 native do, so it's pretty odd.
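One way to catch this class of problem automatically: diff the names the log claims were completed against a fresh remote listing (a sketch; the regex matches the BackendEvent lines quoted above):

```python
import re

# Matches lines like:
# "... Backend event: Put - Completed: duplicati-xxx.dblock.zip.aes (50.01 MiB)"
PUT_COMPLETED = re.compile(r"Backend event: Put - Completed: (\S+)")

def completed_but_missing(log_text, remote_names):
    """Files the log claims were uploaded that the backend listing lacks."""
    claimed = set(PUT_COMPLETED.findall(log_text))
    return claimed - set(remote_names)
```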

EDIT 1:

Here are the Put - Completed lines. It seemingly finished 7 backup files successfully, but only 2 of them are still around:

2025-02-16 10:51:11 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b8deaff454e8c46d9b0524a8d3d41573b.dblock.zip.aes (50.01 MiB)
2025-02-16 10:51:16 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b5a714fb2fee94bf1b6fb131d48d23a65.dblock.zip.aes (49.98 MiB)
2025-02-16 10:51:17 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-ib1a923650b104045a91fb8992fb81af1.dindex.zip.aes (24.62 KiB)
2025-02-16 10:51:17 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-i162a52604e33461d8844a395e75c97df.dindex.zip.aes (29.23 KiB)
2025-02-16 10:51:40 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b29f1b378849c4e55bd43dbe9da8cf5d7.dblock.zip.aes (22.62 MiB)
2025-02-16 10:51:41 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-i55861405c35f4c569ad171016b5f66d5.dindex.zip.aes (62.22 KiB)
2025-02-16 10:51:42 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-20250216T154934Z.dlist.zip.aes (948.67 KiB)
2025-02-16 10:51:47 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-verification.json (104.96 KiB)

These backup data files actually are around, according to B2 web UI:

duplicati-b5a714fb2fee94bf1b6fb131d48d23a65.dblock.zip.aes
duplicati-ib1a923650b104045a91fb8992fb81af1.dindex.zip.aes

EDIT 2:

As expected, the DB after that backup shows the five missing files in State Uploaded, never making it as far as Verified, which is the State of the two that actually uploaded. There was a recent forum topic:

Random files not found on Backblaze target, except that one used B2 via S3, and seemed to show the files actually present. In my case here, even a fresh login to the Backblaze site browser doesn't see them.

duplicati-b8deaff454e8c46d9b0524a8d3d41573b.dblock.zip.aes is roughly 50 MB, and its upload duration seems to reflect that compared to the others. The timing looks like a real upload, but where did the file go?

EDIT 3:

Here are the uploads of the two files that showed up, to see if maybe times of Put give any clues:

2025-02-16 10:50:56 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b5a714fb2fee94bf1b6fb131d48d23a65.dblock.zip.aes (49.98 MiB)
2025-02-16 10:51:16 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b5a714fb2fee94bf1b6fb131d48d23a65.dblock.zip.aes (49.98 MiB)
2025-02-16 10:51:17 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-ib1a923650b104045a91fb8992fb81af1.dindex.zip.aes (24.62 KiB)
2025-02-16 10:51:17 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-ib1a923650b104045a91fb8992fb81af1.dindex.zip.aes (24.62 KiB)

Not clear to me. Maybe someone else's look will spot more, but passes and fails seem intermixed.

A BackendTest run is completely fine at default settings. Google isn't finding a burst of complaints about missing files from Backblaze B2 uploads, so this second part of the issue is still pretty mysterious.

If need be, maybe the focus can be on the first part, but I don't know if it could have caused the second. There's no network trace to help with the second part, but TrayIcon has no lingering connections to B2.

That seems to be the core of the problem. If an upload has failed, it should not be able to upload the dlist.

I re-wrote the backend manager for 2.1.0.108, so I hope this is fixed.
But we do have tests for failed uploads, and they passed in 2.1.0.107 as well :confused:

That makes sense; it will transition once the backend has reported the files as present. Those two files are actually present, as I understand it, whereas the other 5 are not.

Not sure how to crack this, so I am just listing some ideas here.

First, I am trying to see if it is possible that after the failed upload, the other uploads would "pretend" to continue but actually fail. This could then explain why some files were not uploaded.

But it does not quite hold, because each file upload appears to take some time, and the "Put - Completed" message is logged from within the code that does the actual upload. The log string cannot be emitted unless the stream has been fully passed to the destination.

Also, 2.1.0.107 does not actually use the BackendManager for the dlist upload, so it could be possible that the list file proceeds to upload despite errors in the BackendManager. This could perhaps explain why the backup can continue, but this part of the code had not been changed for years (it was then changed for 2.1.0.108).

But even stranger, the dlist is also among the missing files. It is almost as if the failed upload somehow made a broken B2 backend that “fake uploads” the files. :question:

To get a friendlier view of the sequence, here are the log lines with Upload or BackendEvent tags:

2025-02-16 07:21:04 -05 - [Profiling-Timer.Begin-Duplicati.Library.Main.BackendManager-RemoteOperationList]: Starting - RemoteOperationList
2025-02-16 07:21:04 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  ()
2025-02-16 07:21:06 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (493 bytes)
2025-02-16 07:21:06 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.BackendManager-RemoteOperationList]: RemoteOperationList took 0:00:00:02.054
2025-02-16 07:23:56 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-be476ffad0d544711a050b3a356b03f60.dblock.zip.aes (50.01 MiB)
2025-02-16 07:23:56 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-bed1695fdb3d64f728d1e0751a8b0ee6b.dblock.zip.aes (50.03 MiB)
2025-02-16 07:24:57 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b125bf82ede184f1bbfc31eb05ca2783e.dblock.zip.aes (23.25 MiB)
2025-02-16 07:25:00 -05 - [Profiling-Timer.Begin-Duplicati.Library.Main.Operation.Backup.UploadRealFilelist-UpdateChangeStatistics]: Starting - UpdateChangeStatistics
2025-02-16 07:25:01 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.Operation.Backup.UploadRealFilelist-UpdateChangeStatistics]: UpdateChangeStatistics took 0:00:00:00.509
2025-02-16 07:25:01 -05 - [Profiling-Timer.Begin-Duplicati.Library.Main.Operation.Backup.UploadRealFilelist-UploadNewFileset]: Starting - Uploading a new fileset
2025-02-16 07:25:03 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.Operation.Backup.UploadRealFilelist-UploadNewFileset]: Uploading a new fileset took 0:00:00:02.405
2025-02-16 07:25:28 -05 - [Retry-Duplicati.Library.Main.Operation.Backup.BackendUploader-RetryPut]: Operation Put with file duplicati-b125bf82ede184f1bbfc31eb05ca2783e.dblock.zip.aes attempt 1 of 1 failed with message: The operation was canceled.
2025-02-16 07:34:25 -05 - [Profiling-Duplicati.Library.Main.Operation.Backup.BackendUploader-UploadSpeed]: Uploaded 50.01 MiB in 00:10:29.2687382, 81.39 KiB/s
2025-02-16 07:34:25 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-be476ffad0d544711a050b3a356b03f60.dblock.zip.aes (50.01 MiB)
2025-02-16 07:34:25 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-i2d1174abd6ca41519e7a7dff8383afb0.dindex.zip.aes (27.25 KiB)
2025-02-16 07:34:27 -05 - [Profiling-Duplicati.Library.Main.Operation.Backup.BackendUploader-UploadSpeed]: Uploaded 50.03 MiB in 00:10:30.8189377, 81.21 KiB/s
2025-02-16 07:34:27 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-bed1695fdb3d64f728d1e0751a8b0ee6b.dblock.zip.aes (50.03 MiB)
2025-02-16 07:34:27 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-i85a7b316825e4b6bb617b6c0db4bd59a.dindex.zip.aes (35.01 KiB)
2025-02-16 07:34:29 -05 - [Profiling-Duplicati.Library.Main.Operation.Backup.BackendUploader-UploadSpeed]: Uploaded 27.25 KiB in 00:00:03.6168321, 7.53 KiB/s
2025-02-16 07:34:29 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-i2d1174abd6ca41519e7a7dff8383afb0.dindex.zip.aes (27.25 KiB)
2025-02-16 07:34:30 -05 - [Profiling-Duplicati.Library.Main.Operation.Backup.BackendUploader-UploadSpeed]: Uploaded 35.01 KiB in 00:00:02.9578903, 11.84 KiB/s
2025-02-16 07:34:30 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-i85a7b316825e4b6bb617b6c0db4bd59a.dindex.zip.aes (35.01 KiB)
2025-02-16 07:34:30 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-20250216T122100Z.dlist.zip.aes (948.67 KiB)
2025-02-16 07:34:38 -05 - [Profiling-Duplicati.Library.Main.Operation.Backup.BackendUploader-UploadSpeed]: Uploaded 948.67 KiB in 00:00:07.7408506, 122.55 KiB/s
2025-02-16 07:34:38 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-20250216T122100Z.dlist.zip.aes (948.67 KiB)
2025-02-16 07:34:49 -05 - [Profiling-Timer.Begin-Duplicati.Library.Main.BackendManager-RemoteOperationDelete]: Starting - RemoteOperationDelete
2025-02-16 07:34:49 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Started: duplicati-20240216T132340Z.dlist.zip.aes (938.00 KiB)
2025-02-16 07:34:52 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Completed: duplicati-20240216T132340Z.dlist.zip.aes (938.00 KiB)
2025-02-16 07:34:52 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.BackendManager-RemoteOperationDelete]: RemoteOperationDelete took 0:00:00:03.301
2025-02-16 07:34:52 -05 - [Profiling-Timer.Begin-Duplicati.Library.Main.BackendManager-RemoteOperationDelete]: Starting - RemoteOperationDelete
2025-02-16 07:34:52 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Started: duplicati-20250202T122003Z.dlist.zip.aes (947.98 KiB)
2025-02-16 07:34:53 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Completed: duplicati-20250202T122003Z.dlist.zip.aes (947.98 KiB)
2025-02-16 07:34:53 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.BackendManager-RemoteOperationDelete]: RemoteOperationDelete took 0:00:00:00.649
2025-02-16 07:34:53 -05 - [Profiling-Timer.Begin-Duplicati.Library.Main.BackendManager-RemoteOperationNothing]: Starting - RemoteOperationNothing
2025-02-16 07:34:53 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.BackendManager-RemoteOperationNothing]: RemoteOperationNothing took 0:00:00:00.000
2025-02-16 07:34:55 -05 - [Profiling-Timer.Begin-Duplicati.Library.Main.BackendManager-RemoteOperationNothing]: Starting - RemoteOperationNothing
2025-02-16 07:34:55 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.BackendManager-RemoteOperationNothing]: RemoteOperationNothing took 0:00:00:00.000
2025-02-16 07:34:55 -05 - [Profiling-Timer.Begin-Duplicati.Library.Main.BackendManager-RemoteOperationDelete]: Starting - RemoteOperationDelete
2025-02-16 07:34:55 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Started: duplicati-b3df2619cb6264febba5affb69d66510c.dblock.zip.aes (16.93 MiB)
2025-02-16 07:34:56 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Completed: duplicati-b3df2619cb6264febba5affb69d66510c.dblock.zip.aes (16.93 MiB)
2025-02-16 07:34:56 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.BackendManager-RemoteOperationDelete]: RemoteOperationDelete took 0:00:00:01.038
2025-02-16 07:34:56 -05 - [Profiling-Timer.Begin-Duplicati.Library.Main.BackendManager-RemoteOperationDelete]: Starting - RemoteOperationDelete
2025-02-16 07:34:56 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Started: duplicati-i89772384028847d28da08e28ad9cfcc7.dindex.zip.aes (57.37 KiB)
2025-02-16 07:34:57 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Delete - Completed: duplicati-i89772384028847d28da08e28ad9cfcc7.dindex.zip.aes (57.37 KiB)
2025-02-16 07:34:57 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.BackendManager-RemoteOperationDelete]: RemoteOperationDelete took 0:00:00:01.182
2025-02-16 07:34:57 -05 - [Profiling-Timer.Begin-Duplicati.Library.Main.BackendManager-RemoteOperationNothing]: Starting - RemoteOperationNothing
2025-02-16 07:34:57 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.BackendManager-RemoteOperationNothing]: RemoteOperationNothing took 0:00:00:00.000
2025-02-16 07:34:58 -05 - [Profiling-Timer.Begin-Duplicati.Library.Main.BackendManager-RemoteOperationTerminate]: Starting - RemoteOperationTerminate
2025-02-16 07:34:58 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.BackendManager-RemoteOperationTerminate]: RemoteOperationTerminate took 0:00:00:00.000
2025-02-16 07:34:58 -05 - [Profiling-Timer.Begin-Duplicati.Library.Main.BackendManager-RemoteOperationPut]: Starting - RemoteOperationPut
2025-02-16 07:34:58 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-verification.json (102.96 KiB)
2025-02-16 07:35:15 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.BackendManager-RemoteOperationPut]: RemoteOperationPut took 0:00:00:17.310
2025-02-16 07:35:15 -05 - [Retry-Duplicati.Library.Main.BackendManager-RetryPut]: Operation Put with file duplicati-verification.json attempt 1 of 0 failed with message: An error occurred while sending the request.
2025-02-16 07:35:15 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Failed: duplicati-verification.json (102.96 KiB)

The simultaneous upload starts of two 50 MB dblocks at 07:23:56 are probably due to my two concurrency-compressors hitting the dblock size limit at the same time. This leads to bursty uploads.

One interesting thing is the early “Uploading a new fileset took 0:00:00:02.405” at 07:25:03. The dlist upload actually finished at 07:25:30 and took 7 seconds. Timing aside, was the decision made early (prior to any dblock upload failure) that a dlist would eventually be uploaded?

I thought there was some sort of interlock to keep the dlist from finishing before, say, the final dblock, because if a crash happened at that point, the dlist would reference missing data, just as it did here.
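The interlock idea can be sketched conceptually (a hypothetical illustration, not Duplicati's actual uploader code): queue the dlist only after every dblock Put in the fileset has completed, so a dblock failure prevents a dlist that would reference missing data.

```python
import asyncio

# Hypothetical sketch of the interlock idea, not Duplicati's actual code:
# the dlist Put is gated behind completion of every dblock Put, so a
# failed dblock upload prevents a dlist that would reference missing data.
async def upload_fileset(put, dblock_names, dlist_name):
    tasks = [asyncio.create_task(put(name)) for name in dblock_names]
    try:
        await asyncio.gather(*tasks)  # raises on the first dblock failure
    except Exception:
        for t in tasks:
            t.cancel()  # stop any still-running dblock uploads
        raise           # and never reach the dlist Put below
    await put(dlist_name)  # only runs if every dblock made it
```

The design choice is simply that the dlist Put is sequenced after the gather, so there is no window where a dlist can land while a dblock is still unconfirmed.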

EDIT 1:

Attempted to reproduce the problem with a new simple job, using the same Duplicati that was running before.
Couldn’t get a failure with read-write-timeout=1s, even with the throttle at 1 MB/sec, so I watched the profiling log for “Uploading a new fileset” and disconnected Wi-Fi. No dlist upload was attempted…

2025-02-19 19:53:12 -05 - [Profiling-Timer.Begin-Duplicati.Library.Main.BackendManager-RemoteOperationList]: Starting - RemoteOperationList
2025-02-19 19:53:12 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  ()
2025-02-19 19:53:13 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  ()
2025-02-19 19:53:13 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.BackendManager-RemoteOperationList]: RemoteOperationList took 0:00:00:00.825
2025-02-19 19:53:13 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b3210fe11000444359031e9cec68450cb.dblock.zip.aes (9.00 MiB)
2025-02-19 19:53:13 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-ba53218a056b0483d933db2098706c142.dblock.zip.aes (9.00 MiB)
2025-02-19 19:53:14 -05 - [Profiling-Timer.Begin-Duplicati.Library.Main.Operation.Backup.UploadRealFilelist-UpdateChangeStatistics]: Starting - UpdateChangeStatistics
2025-02-19 19:53:14 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.Operation.Backup.UploadRealFilelist-UpdateChangeStatistics]: UpdateChangeStatistics took 0:00:00:00.002
2025-02-19 19:53:14 -05 - [Profiling-Timer.Begin-Duplicati.Library.Main.Operation.Backup.UploadRealFilelist-UploadNewFileset]: Starting - Uploading a new fileset
2025-02-19 19:53:14 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-bd1012ba3a50d4ff684fa4ec7d4ac3a70.dblock.zip.aes (2.00 MiB)
2025-02-19 19:53:14 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.Operation.Backup.UploadRealFilelist-UploadNewFileset]: Uploading a new fileset took 0:00:00:00.279
2025-02-19 19:53:44 -05 - [Retry-Duplicati.Library.Main.Operation.Backup.BackendUploader-RetryPut]: Operation Put with file duplicati-b3210fe11000444359031e9cec68450cb.dblock.zip.aes attempt 1 of 1 failed with message: Error while copying content to a stream.
2025-02-19 19:53:44 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Failed: duplicati-b3210fe11000444359031e9cec68450cb.dblock.zip.aes (9.00 MiB)
2025-02-19 19:53:44 -05 - [Retry-Duplicati.Library.Main.Operation.Backup.BackendUploader-RetryPut]: Operation Put with file duplicati-ba53218a056b0483d933db2098706c142.dblock.zip.aes attempt 1 of 1 failed with message: Error while copying content to a stream.
2025-02-19 19:53:44 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Failed: duplicati-ba53218a056b0483d933db2098706c142.dblock.zip.aes (9.00 MiB)
2025-02-19 19:54:09 -05 - [Retry-Duplicati.Library.Main.Operation.Backup.BackendUploader-RetryPut]: Operation Put with file duplicati-bd1012ba3a50d4ff684fa4ec7d4ac3a70.dblock.zip.aes attempt 1 of 1 failed with message: Error while copying content to a stream.
2025-02-19 19:54:09 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Failed: duplicati-bd1012ba3a50d4ff684fa4ec7d4ac3a70.dblock.zip.aes (2.00 MiB)

That should certainly not happen. The failed upload should crash the process with an exception.

That is also how I would expect it to work.

I have added a new test that I think replicates your traces:

At least for the updated backend manager code, this works as expected. I also tried running it on 2.1.0.107 and there were no errors there either.

That moves my focus to the B2 interaction. I wonder if it is possible for the B2 code to fail an upload, but still report success?
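One generic defense against that failure mode can be sketched as follows (a hypothetical `put_and_confirm` helper, not something Duplicati is known to do): treat the backend's success report as untrusted and confirm the upload with an independent listing.

```python
# Hypothetical defensive pattern, not Duplicati's current behavior: treat
# the backend's "upload succeeded" report as untrusted, and confirm the
# file is actually present via an independent listing before recording it.
def put_and_confirm(backend, name, data):
    backend.put(name, data)  # backend reports success (maybe falsely)
    if name not in backend.list():
        raise IOError(f"{name} reported as uploaded but is not in listing")
```

The obvious cost is an extra list call per upload, so a real implementation would likely batch the confirmation rather than list after every Put.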

To expand the final error from the 7 A.M. backup (the Wi-Fi was flaky at the time, which may have caused the errors):

2025-02-16 07:34:58 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-verification.json (102.96 KiB)
2025-02-16 07:35:15 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.BackendManager-RemoteOperationPut]: RemoteOperationPut took 0:00:00:17.310
2025-02-16 07:35:15 -05 - [Retry-Duplicati.Library.Main.BackendManager-RetryPut]: Operation Put with file duplicati-verification.json attempt 1 of 0 failed with message: An error occurred while sending the request.
System.Net.Http.HttpRequestException: An error occurred while sending the request.
 ---> System.Net.Http.HttpIOException: The response ended prematurely. (ResponseEnded)
   at System.Net.Http.HttpConnection.SendAsync(HttpRequestMessage request, Boolean async, CancellationToken cancellationToken)
   --- End of inner exception stack trace ---
   at System.Net.Http.HttpConnection.SendAsync(HttpRequestMessage request, Boolean async, CancellationToken cancellationToken)
   at System.Net.Http.HttpConnectionPool.SendWithVersionDetectionAndRetryAsync(HttpRequestMessage request, Boolean async, Boolean doRequestAuth, CancellationToken cancellationToken)
   at System.Net.Http.RedirectHandler.SendAsync(HttpRequestMessage request, Boolean async, CancellationToken cancellationToken)
   at Microsoft.Extensions.Http.Logging.LoggingHttpMessageHandler.<SendCoreAsync>g__Core|5_0(HttpRequestMessage request, Boolean useAsync, CancellationToken cancellationToken)
   at Microsoft.Extensions.Http.Logging.LoggingScopeHttpMessageHandler.<SendCoreAsync>g__Core|5_0(HttpRequestMessage request, Boolean useAsync, CancellationToken cancellationToken)
   at System.Net.Http.HttpClient.<SendAsync>g__Core|83_0(HttpRequestMessage request, HttpCompletionOption completionOption, CancellationTokenSource cts, Boolean disposeCts, CancellationTokenSource pendingRequestsCts, CancellationToken originalCancellationToken)
   at Duplicati.Library.JsonWebHelperHttpClient.GetResponseAsync(HttpRequestMessage req, CancellationToken cancellationToken)
   at Duplicati.Library.JsonWebHelperHttpClient.ReadJsonResponseAsync[T](HttpRequestMessage req, CancellationToken cancellationToken)
   at Duplicati.Library.JsonWebHelperHttpClient.GetJsonDataAsync[T](String url, CancellationToken cancellationToken, Action`1 setup)
   at Duplicati.Library.JsonWebHelperHttpClient.PostAndGetJsonDataAsync[T](String url, Object item, CancellationToken cancellationToken)
   at Duplicati.Library.Backend.Backblaze.B2.GetUploadUrlDataAsync(CancellationToken cancellationToken)
   at Duplicati.Library.Backend.Backblaze.B2.PutAsync(String remotename, Stream stream, CancellationToken cancelToken)
   at Duplicati.Library.Main.BackendManager.DoPutAsync(FileEntryItem item)
   at Duplicati.Library.Utility.Utility.Await(Task task)
   at Duplicati.Library.Main.BackendManager.ThreadRun()
2025-02-16 07:35:15 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Failed: duplicati-verification.json (102.96 KiB)

For a friendlier view of the 10 A.M. backup with missing files, here are the entries tagged Upload or BackendEvent:

2025-02-16 10:49:39 -05 - [Profiling-Timer.Begin-Duplicati.Library.Main.BackendManager-RemoteOperationList]: Starting - RemoteOperationList
2025-02-16 10:49:39 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  ()
2025-02-16 10:49:40 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (506 bytes)
2025-02-16 10:49:40 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.BackendManager-RemoteOperationList]: RemoteOperationList took 0:00:00:01.251
2025-02-16 10:50:54 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b8deaff454e8c46d9b0524a8d3d41573b.dblock.zip.aes (50.01 MiB)
2025-02-16 10:50:56 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b5a714fb2fee94bf1b6fb131d48d23a65.dblock.zip.aes (49.98 MiB)
2025-02-16 10:51:11 -05 - [Profiling-Duplicati.Library.Main.Operation.Backup.BackendUploader-UploadSpeed]: Uploaded 50.01 MiB in 00:00:16.7923503, 2.98 MiB/s
2025-02-16 10:51:11 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b8deaff454e8c46d9b0524a8d3d41573b.dblock.zip.aes (50.01 MiB)
2025-02-16 10:51:14 -05 - [Profiling-Duplicati.Library.Main.Operation.Backup.BackendUploader-UploadSpeed]: Uploaded 49.98 MiB in 00:00:18.2696385, 2.74 MiB/s
2025-02-16 10:51:16 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b5a714fb2fee94bf1b6fb131d48d23a65.dblock.zip.aes (49.98 MiB)
2025-02-16 10:51:17 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-i162a52604e33461d8844a395e75c97df.dindex.zip.aes (29.23 KiB)
2025-02-16 10:51:17 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-ib1a923650b104045a91fb8992fb81af1.dindex.zip.aes (24.62 KiB)
2025-02-16 10:51:17 -05 - [Profiling-Duplicati.Library.Main.Operation.Backup.BackendUploader-UploadSpeed]: Uploaded 24.62 KiB in 00:00:00.1768233, 139.25 KiB/s
2025-02-16 10:51:17 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-ib1a923650b104045a91fb8992fb81af1.dindex.zip.aes (24.62 KiB)
2025-02-16 10:51:17 -05 - [Profiling-Duplicati.Library.Main.Operation.Backup.BackendUploader-UploadSpeed]: Uploaded 29.23 KiB in 00:00:00.4153594, 70.38 KiB/s
2025-02-16 10:51:17 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-i162a52604e33461d8844a395e75c97df.dindex.zip.aes (29.23 KiB)
2025-02-16 10:51:36 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b29f1b378849c4e55bd43dbe9da8cf5d7.dblock.zip.aes (22.62 MiB)
2025-02-16 10:51:37 -05 - [Profiling-Timer.Begin-Duplicati.Library.Main.Operation.Backup.UploadRealFilelist-UpdateChangeStatistics]: Starting - UpdateChangeStatistics
2025-02-16 10:51:37 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.Operation.Backup.UploadRealFilelist-UpdateChangeStatistics]: UpdateChangeStatistics took 0:00:00:00.398
2025-02-16 10:51:37 -05 - [Profiling-Timer.Begin-Duplicati.Library.Main.Operation.Backup.UploadRealFilelist-UploadNewFileset]: Starting - Uploading a new fileset
2025-02-16 10:51:39 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.Operation.Backup.UploadRealFilelist-UploadNewFileset]: Uploading a new fileset took 0:00:00:01.436
2025-02-16 10:51:40 -05 - [Profiling-Duplicati.Library.Main.Operation.Backup.BackendUploader-UploadSpeed]: Uploaded 22.62 MiB in 00:00:03.8972025, 5.80 MiB/s
2025-02-16 10:51:40 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b29f1b378849c4e55bd43dbe9da8cf5d7.dblock.zip.aes (22.62 MiB)
2025-02-16 10:51:41 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-i55861405c35f4c569ad171016b5f66d5.dindex.zip.aes (62.22 KiB)
2025-02-16 10:51:41 -05 - [Profiling-Duplicati.Library.Main.Operation.Backup.BackendUploader-UploadSpeed]: Uploaded 62.22 KiB in 00:00:00.2984505, 208.46 KiB/s
2025-02-16 10:51:41 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-i55861405c35f4c569ad171016b5f66d5.dindex.zip.aes (62.22 KiB)
2025-02-16 10:51:41 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-20250216T154934Z.dlist.zip.aes (948.67 KiB)
2025-02-16 10:51:42 -05 - [Profiling-Duplicati.Library.Main.Operation.Backup.BackendUploader-UploadSpeed]: Uploaded 948.67 KiB in 00:00:00.6454318, 1.44 MiB/s
2025-02-16 10:51:42 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-20250216T154934Z.dlist.zip.aes (948.67 KiB)
2025-02-16 10:51:44 -05 - [Profiling-Timer.Begin-Duplicati.Library.Main.BackendManager-RemoteOperationNothing]: Starting - RemoteOperationNothing
2025-02-16 10:51:44 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.BackendManager-RemoteOperationNothing]: RemoteOperationNothing took 0:00:00:00.000
2025-02-16 10:51:44 -05 - [Profiling-Timer.Begin-Duplicati.Library.Main.BackendManager-RemoteOperationPut]: Starting - RemoteOperationPut
2025-02-16 10:51:44 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-verification.json (104.96 KiB)
2025-02-16 10:51:47 -05 - [Profiling-Duplicati.Library.Main.BackendManager-UploadSpeed]: Uploaded 104.96 KiB in 00:00:02.0918700, 50.18 KiB/s
2025-02-16 10:51:47 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-verification.json (104.96 KiB)
2025-02-16 10:51:47 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.BackendManager-RemoteOperationPut]: RemoteOperationPut took 0:00:00:02.094
2025-02-16 10:51:47 -05 - [Profiling-Timer.Begin-Duplicati.Library.Main.BackendManager-RemoteOperationTerminate]: Starting - RemoteOperationTerminate
2025-02-16 10:51:47 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.BackendManager-RemoteOperationTerminate]: RemoteOperationTerminate took 0:00:00:00.000
2025-02-16 10:51:47 -05 - [Profiling-Timer.Begin-Duplicati.Library.Main.BackendManager-RemoteOperationList]: Starting - RemoteOperationList
2025-02-16 10:51:47 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  ()
2025-02-16 10:51:47 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (508 bytes)
2025-02-16 10:51:47 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.BackendManager-RemoteOperationList]: RemoteOperationList took 0:00:00:00.525

Of the two initial roughly 50 MB dblock files, the first vanished somehow, but the second made it. The second file that made it overall was duplicati-ib1a923650b104045a91fb8992fb81af1.dindex.zip.aes. Possibly by coincidence (or maybe there’s a connection?), that’s the dindex for the successful dblock.

Which aspect? Unfortunately, the notes after it suggest it didn’t reproduce the odd dlist-after-failure.

I looked at emails from over a year of failures (remember, I’m “asking for it” with a no-retry policy):

503 - service_unavailable: no tomes available

503 - service_unavailable: CPU too busy

500 - internal_error

401 - bad_auth_token

System.Net.Sockets.SocketException: A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond

System.AggregateException: Unable to read data from the transport connection: The connection was closed.

Details: System.AggregateException: One or more errors occurred. (Error while copying content to a stream. (Error while copying content to a stream.) (One or more errors occurred. (Error while copying content to a stream.)))
 ---> System.AggregateException: Error while copying content to a stream. (Error while copying content to a stream.) (One or more errors occurred. (Error while copying content to a stream.))
 ---> System.Net.WebException: Error while copying content to a stream.
 ---> System.Net.Http.HttpRequestException: Error while copying content to a stream.
 ---> System.IO.IOException: Unable to write data to the transport connection: An existing connection was forcibly closed by the remote host..
 ---> System.Net.Sockets.SocketException (10054): An existing connection was forcibly closed by the remote host.

The 503s (which were most common) are actually part of normal operation per B2’s upload guidance.

Other errors were handled well enough that I don’t remember problems like this one previously.

I’m not sure either of these failures is heading for any answer, so I might give up and clean up.

To save downloading and to keep state around for a while, I’ll practice on a cloned job and destination.

EDIT 1:

Duplicati crashed (while seemingly idle except for polling). The server profiling log’s first messages of today are:

2025-02-20 11:19:03 -05 - [Information-Duplicati.Server.Program-ServerStopping]: Server is stopping, tearing down handlers
2025-02-20 11:19:13 -05 - [Information-Duplicati.Server.Program-ServerStopped]: Server has stopped

There was seemingly nothing relevant in the event logs. Terminal output has no dates, as usual, but does have:

[WebSocket] Client closed connection prematurely.
[WebSocket] Client closed connection prematurely.
[WebSocket] Client closed connection prematurely.
[WebSocket] Client closed connection prematurely.
[WebSocket] Client closed connection prematurely.
[WebSocket] Client closed connection prematurely.
Unexpected error: System.ComponentModel.Win32Exception (0x80004005): The operation completed successfully.
   at Avalonia.Win32.Interop.Win32Icon.CreateIcon(Bitmap bitmap, PixelPoint hotSpot)
   at Avalonia.Win32.Interop.Win32Icon..ctor(Win32Icon original, PixelSize size)
   at Avalonia.Win32.TrayIconImpl.UpdateIcon(Boolean remove)
   at Avalonia.Controls.TrayIcon.OnPropertyChanged(AvaloniaPropertyChangedEventArgs change)
   at Avalonia.AvaloniaObject.RaisePropertyChanged[T](AvaloniaProperty`1 property, Optional`1 oldValue, BindingValue`1 newValue, BindingPriority priority, Boolean isEffectiveValue)
   at Avalonia.PropertyStore.EffectiveValue`1.NotifyValueChanged(ValueStore owner, StyledProperty`1 property, T oldValue)
   at Avalonia.PropertyStore.EffectiveValue`1.SetAndRaiseCore(ValueStore owner, StyledProperty`1 property, T value, BindingPriority priority, Boolean isOverriddenCurrentValue, Boolean isCoercedDefaultValue)
   at Duplicati.GUI.TrayIcon.AvaloniaApp.SetIcon(TrayIcons icon)
   at Avalonia.Threading.DispatcherOperation.InvokeCore()
   at Avalonia.Threading.DispatcherOperation.Execute()
   at Avalonia.Threading.Dispatcher.ExecuteJob(DispatcherOperation job)
   at Avalonia.Threading.Dispatcher.ExecuteJobsCore(Boolean fromExplicitBackgroundProcessingCallback)
   at Avalonia.Threading.Dispatcher.Signaled()
   at Avalonia.Win32.Win32Platform.WndProc(IntPtr hWnd, UInt32 msg, IntPtr wParam, IntPtr lParam)
   at Avalonia.Win32.Interop.UnmanagedMethods.DispatchMessage(MSG& lpmsg)
   at Avalonia.Win32.Win32DispatcherImpl.RunLoop(CancellationToken cancellationToken)
   at Avalonia.Threading.DispatcherFrame.Run(IControlledDispatcherImpl impl)
   at Avalonia.Threading.Dispatcher.PushFrame(DispatcherFrame frame)
   at Avalonia.Threading.Dispatcher.MainLoop(CancellationToken cancellationToken)
   at Avalonia.Controls.ApplicationLifetimes.ClassicDesktopStyleApplicationLifetime.StartCore(String[] args)
   at Avalonia.Controls.ApplicationLifetimes.ClassicDesktopStyleApplicationLifetime.Start(String[] args)
   at Duplicati.GUI.TrayIcon.AvaloniaRunner.Run(String[] args)
   at Duplicati.GUI.TrayIcon.TrayIconBase.Init(String[] args)
   at Duplicati.GUI.TrayIcon.AvaloniaRunner.Init(String[] args)
   at Duplicati.GUI.TrayIcon.Program.StartTray(String[] _args, Dictionary`2 options, HostedInstanceKeeper hosted, PasswordSource passwordSource, String password, String acceptedHostCertificate)
An error occurred on server tear down: System.Threading.ThreadInterruptedException: Thread was interrupted from a waiting state.
   at System.Threading.Monitor.ObjWait(Int32 millisecondsTimeout, Object obj)
   at System.Threading.ManualResetEventSlim.Wait(Int32 millisecondsTimeout, CancellationToken cancellationToken)
   at System.Threading.Tasks.Task.SpinThenBlockingWait(Int32 millisecondsTimeout, CancellationToken cancellationToken)
   at System.Threading.Tasks.Task.InternalWaitCore(Int32 millisecondsTimeout, CancellationToken cancellationToken)
   at Duplicati.Server.Program.ShutdownModernWebserver()
   at Duplicati.Server.Program.<>c.<Main>b__60_9()
   at Duplicati.Server.Program.Main(String[] _args)

however, I’m not sure whether that was cause or effect. I probably did mouse over the TrayIcon as a test.

EDIT 2:

This was an attempt to see whether following the directions works. I’d also prefer not to recreate the DB; however, I did try a test (not written up yet) going down that path, which had its own quirks.

Backup says:
Found 5 files that are missing from the remote storage, please run repair

Repair says:
The backup storage destination is missing data files. You can either enable --rebuild-missing-dblock-files or run the purge command to remove these files. The following files are missing: duplicati-b8deaff454e8c46d9b0524a8d3d41573b.dblock.zip.aes, duplicati-b29f1b378849c4e55bd43dbe9da8cf5d7.dblock.zip.aes

Set advanced option rebuild-missing-dblock-files

Repair says:
The backup storage destination is missing data files. You can either enable --rebuild-missing-dblock-files or run the purge command to remove these files. The following files are missing: duplicati-b8deaff454e8c46d9b0524a8d3d41573b.dblock.zip.aes, duplicati-b29f1b378849c4e55bd43dbe9da8cf5d7.dblock.zip.aes

So I tried the other path, ignoring the likely incorrect direction to run the purge command. It almost certainly means purge-broken-files; purge is very different.

list-broken-files from Commandline says:
Listing remote folder …
0 : 2/16/2025 10:49:34 AM (7 match(es))
and then actually shows only 6 lines. That seems wrong.

purge-broken-files from Commandline says:
Listing remote folder …
Uploading file duplicati-20250216T154935Z.dlist.zip.aes (947.93 KiB) …
Deleting file duplicati-20250216T154934Z.dlist.zip.aes …
and the Verbose live log again says:
Feb 21, 2025 8:05 AM: Purging 7 file(s) from fileset 2/16/2025 10:49:34 AM
and it itemizes each one, but lists only 6 of them.
This isn’t a live-log omission; the profiling log to file also shows only 6 purges.

Verify files says:
(log says success)

Repair says:
(log says success)

BUT an rclone --dry-run sync shows two dindex files were uploaded, pointing at missing dblocks.
What’s annoying is that the reuploads are not in the RemoteOperation table. Looking at Data, they
first show up on a list-broken-files after Repair, and I found the Repair upload in the profiling log.

is where I wrote up, primarily, the weirdness about the failed Repair with rebuild-missing-dblock-files.

EDIT 3:

I turned off rebuild-missing-dblock-files because it was harmful (beyond merely ineffective).

The Recreate button produced a red popup error. The log does exist, and it shows what was expected:

  • 2025-02-21 19:53:27 -05 - [Error-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-MissingFileDetected]: Remote file referenced as duplicati-b8deaff454e8c46d9b0524a8d3d41573b.dblock.zip.aes by duplicati-i162a52604e33461d8844a395e75c97df.dindex.zip.aes, but not found in list, registering a missing remote file
  • 2025-02-21 19:53:36 -05 - [Error-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-MissingFileDetected]: Remote file referenced as duplicati-b29f1b378849c4e55bd43dbe9da8cf5d7.dblock.zip.aes by duplicati-i55861405c35f4c569ad171016b5f66d5.dindex.zip.aes, but not found in list, registering a missing remote file
  • 2025-02-21 19:56:18 -05 - [Error-Duplicati.Library.Main.Controller-FailedOperation]: The operation Repair has failed with error: Recreated database has missing blocks and 1 broken filelists. Consider using “list-broken-files” and “purge-broken-files” to purge broken data from the remote store and the database. UserInformationException: Recreated database has missing blocks and 1 broken filelists. Consider using “list-broken-files” and “purge-broken-files” to purge broken data from the remote store and the database.

The bad references are in replacement dindex files that Repair somehow decided to reupload without noticing that the associated dblock wasn’t there. The missing files are from the original issue, where files thought to be uploaded somehow never showed up.

EDIT 4:

I was trying to track down a problem where purge-broken-files says:

  Uploading file duplicati-20250216T122101Z.dlist.zip.aes (948.40 KiB) ...
  Deleting file duplicati-20250216T122100Z.dlist.zip.aes  ...
The operation PurgeBrokenFiles has failed with error: Found 1 file(s) with missing blocklist hashes => Found 1 file(s) with missing blocklist hashes

System.IO.InvalidDataException: Found 1 file(s) with missing blocklist hashes
   at Duplicati.Library.Main.Database.LocalDatabase.VerifyConsistency(Int64 blocksize, Int64 hashsize, Boolean verifyfilelists, IDbTransaction transaction)
   at Duplicati.Library.Main.Operation.PurgeBrokenFilesHandler.Run(IFilter filter)
   at Duplicati.Library.Main.Controller.<>c__DisplayClass43_0.<PurgeBrokenFiles>b__0(PurgeBrokenFilesResults result)
   at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)
   at Duplicati.Library.Main.Controller.RunAction[T](T result, Action`1 method)
   at Duplicati.Library.Main.Controller.PurgeBrokenFiles(IFilter filter)
   at Duplicati.CommandLine.Commands.PurgeBrokenFiles(TextWriter outwriter, Action`1 setup, List`1 args, Dictionary`2 options, IFilter filter)
   at Duplicati.CommandLine.Program.ParseCommandLine(TextWriter outwriter, Action`1 setup, Boolean& verboseErrors, String[] args)
   at Duplicati.CommandLine.Program.RunCommandLine(TextWriter outwriter, TextWriter errwriter, Action`1 setup, String[] args)
Return code: 100

and was trying to see if it would say it again. I’m pretty sure I can get it if I Recreate again, but it now appears to be stuck in an otherwise unrecoverable situation, because purge-broken-files says:

The operation PurgeBrokenFiles has failed with error: Cannot continue because the database is marked as being under repair, but does not have broken files. => Cannot continue because the database is marked as being under repair, but does not have broken files.

ErrorID: CannotListOnDatabaseInRepair
Cannot continue because the database is marked as being under repair, but does not have broken files.
Return code: 100

list-broken-files says:

The operation ListBrokenFiles has failed with error: Cannot continue because the database is marked as being under repair, but does not have broken files. => Cannot continue because the database is marked as being under repair, but does not have broken files.

ErrorID: CannotListOnDatabaseInRepair
Cannot continue because the database is marked as being under repair, but does not have broken files.
Return code: 100

Repair button says:

The database was attempted repaired, but the repair did not complete. This database may be incomplete and the repair process is not allowed to alter remote files as that could result in data loss.

Looking in the source code at the RepairInProgress variable, it looks like setting it to false can be done in:

if it had gotten that far; however, I think it failed earlier than that, so how is the variable ever clearable?
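As a sketch of the hazard I think I’m seeing (hypothetical code, not the actual Duplicati source): if an in-progress marker is only cleared at the end of the success path, any failure before that point leaves the database marked forever.

```python
# Hypothetical sketch of the stuck-flag hazard, not actual Duplicati code:
# a repair-in-progress marker that is cleared only at the end of the
# success path stays set forever if the operation fails partway through.
class JobDb:
    def __init__(self):
        self.repair_in_progress = False

def recreate(db, do_work):
    db.repair_in_progress = True
    do_work()                       # any exception skips the line below
    db.repair_in_progress = False   # only reached on full success
```

Typical fixes for this shape of bug are a finally/rollback path, or some later operation being designated as the one allowed to clear the marker.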

EDIT 5:

Since Repair stated its reasons for apparently refusing to run after an initial failure, does this make purge-broken-files the place to clear RepairInProgress? If so, after finding no issues now, should it perhaps just declare that all looks good? If not, how does one get that, short of a new Recreate?

EDIT 6:

Testing reproducibility of that after an rclone sync of the destination to the local test copy.

Recreate says (as expected from damage):

Recreated database has missing blocks and 1 broken filelists. Consider using “list-broken-files” and “purge-broken-files” to purge broken data from the remote store and the database.

purge-broken-files says (as it’s done before):

  Uploading file duplicati-20250216T122101Z.dlist.zip.aes (948.40 KiB) ...
  Deleting file duplicati-20250216T122100Z.dlist.zip.aes  ...
The operation PurgeBrokenFiles has failed with error: Found 1 file(s) with missing blocklist hashes => Found 1 file(s) with missing blocklist hashes

System.IO.InvalidDataException: Found 1 file(s) with missing blocklist hashes
   at Duplicati.Library.Main.Database.LocalDatabase.VerifyConsistency(Int64 blocksize, Int64 hashsize, Boolean verifyfilelists, IDbTransaction transaction)
   at Duplicati.Library.Main.Operation.PurgeBrokenFilesHandler.Run(IFilter filter)
   at Duplicati.Library.Main.Controller.<>c__DisplayClass43_0.<PurgeBrokenFiles>b__0(PurgeBrokenFilesResults result)
   at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)
   at Duplicati.Library.Main.Controller.RunAction[T](T result, Action`1 method)
   at Duplicati.Library.Main.Controller.PurgeBrokenFiles(IFilter filter)
   at Duplicati.CommandLine.Commands.PurgeBrokenFiles(TextWriter outwriter, Action`1 setup, List`1 args, Dictionary`2 options, IFilter filter)
   at Duplicati.CommandLine.Program.ParseCommandLine(TextWriter outwriter, Action`1 setup, Boolean& verboseErrors, String[] args)
   at Duplicati.CommandLine.Program.RunCommandLine(TextWriter outwriter, TextWriter errwriter, Action`1 setup, String[] args)
Return code: 100

purge-broken-files says (as it’s done before):

The operation PurgeBrokenFiles has failed with error: Cannot continue because the database is marked as being under repair, but does not have broken files. => Cannot continue because the database is marked as being under repair, but does not have broken files.

ErrorID: CannotListOnDatabaseInRepair
Cannot continue because the database is marked as being under repair, but does not have broken files.
Return code: 100

Now to look for some result that can be improved.

From a guess about why purge-broken-files was complaining, I changed blocksize to 100 KB.
This is what the destination actually uses, and some parts of Duplicati evidently go by the blocksize option rather than the destination’s actual value.

Another Recreate, and purge-broken-files says:

  Uploading file duplicati-20250216T122101Z.dlist.zip.aes (948.40 KiB) ...
  Deleting file duplicati-20250216T122100Z.dlist.zip.aes  ...
Return code: 0

If I look in the job DB Configuration table, it looks like this cleared the entry behind RepairInProgress.

Repair is now also willing to run (and makes a successful job log). So, some progress on those.

The suspect code is below. Is it assuming the option value is the real blocksize? Is that assumption safe?

I already knew that I had only one file big enough to have two blocklists, and had laboriously attempted to hand-build the SQL behind the error. But initially I used 3200 block hashes per blocklist, succeeding every time. I changed it to 32000 and got a failure from the one big blockset I had.
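The 3200 figure is just the blocksize divided by the hash size (assuming 32-byte SHA-256 hashes, which is my understanding of Duplicati’s scheme): a 100 KiB blocksize holds 102400 / 32 = 3200 block hashes per blocklist, and the SQL’s ("BlockCount" + 3200 - 1) / 3200 is integer ceiling division. A quick sketch of that arithmetic:

```python
def hashes_per_blocklist(blocksize_bytes, hash_size_bytes=32):
    # A blocklist is itself at most one blocksize of concatenated hashes,
    # so this many block hashes fit per blocklist (32 bytes per SHA-256).
    return blocksize_bytes // hash_size_bytes

def expected_blocklist_count(block_count, per_list):
    # Ceiling division, mirroring ("BlockCount" + N - 1) / N in the SQL.
    return (block_count + per_list - 1) // per_list
```

Swapping in the wrong divisor shifts the expected count, which is exactly how the 3200-vs-32000 substitution flips the query between zero rows and one row.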

Is that line set up specifically to not log its SQL to the profiling log? This would’ve been easier if it had.

My manual attempt with the original value, prettified:

SELECT COUNT(*)
FROM (
	SELECT *
	FROM (
		SELECT "N"."BlocksetID"
			,(("N"."BlockCount" + 3200 - 1) / 3200) AS "BlocklistHashCountExpected"
			,CASE 
				WHEN "G"."BlocklistHashCount" IS NULL
					THEN 0
				ELSE "G"."BlocklistHashCount"
				END AS "BlocklistHashCountActual"
		FROM (
			SELECT "BlocksetID"
				,COUNT(*) AS "BlockCount"
			FROM "BlocksetEntry"
			GROUP BY "BlocksetID"
			) "N"
		LEFT OUTER JOIN (
			SELECT "BlocksetID"
				,COUNT(*) AS "BlocklistHashCount"
			FROM "BlocklistHash"
			GROUP BY "BlocksetID"
			) "G" ON "N"."BlocksetID" = "G"."BlocksetID"
		WHERE "N"."BlockCount" > 1
		)
	WHERE "BlocklistHashCountExpected" != "BlocklistHashCountActual"
	)
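
The hand-built query above can also be run against a copy of the job database with the hashes-per-blocklist value as a parameter. A minimal sketch (table and column names are taken from the query above, not verified against every schema version; run it on a copy, never the live database):

```python
import sqlite3

# Same consistency check as the prettified SQL above, with the
# hashes-per-blocklist figure passed in instead of hard-coded.
CHECK_SQL = """
SELECT COUNT(*) FROM (
  SELECT "N"."BlocksetID",
         (("N"."BlockCount" + :per - 1) / :per) AS "Expected",
         COALESCE("G"."BlocklistHashCount", 0) AS "Actual"
  FROM (SELECT "BlocksetID", COUNT(*) AS "BlockCount"
        FROM "BlocksetEntry" GROUP BY "BlocksetID") "N"
  LEFT OUTER JOIN (SELECT "BlocksetID", COUNT(*) AS "BlocklistHashCount"
                   FROM "BlocklistHash" GROUP BY "BlocksetID") "G"
    ON "N"."BlocksetID" = "G"."BlocksetID"
  WHERE "N"."BlockCount" > 1
) WHERE "Expected" != "Actual"
"""

def mismatch_count(db_path: str, hashes_per_blocklist: int) -> int:
    # Number of blocksets whose recorded blocklist-hash count disagrees
    # with the count expected for the given hashes-per-blocklist value.
    con = sqlite3.connect(db_path)
    try:
        (count,) = con.execute(CHECK_SQL, {"per": hashes_per_blocklist}).fetchone()
        return count
    finally:
        con.close()
```

With my data this returns 0 at the 100 KB-derived value and nonzero at a 1 MB-derived value, matching what I saw by hand.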

EDIT 7:

If I change the 3200 to 32000, I get one row back. If I change COUNT(*) to * I get my big file:

[screenshot of the returned row, showing the big file]

I have to say that the error message Duplicati put out was no help at all for quick understanding. Possibly something like this is impossible to explain simply, though, so a big chase like this is needed?

I exceeded the 32000 character limit on what would have been EDIT 8, so here’s a new post:

So that appears to be one possible path to (sort-of) recovery, except it loses some DB history, meaning things like logs, but hopefully not file history. If so, that's not bad compared to a start-over.

In fairness this is also a double failure. I’ll show below what that means, as I’m still trying more.
Goal is to see if there’s any recovery that fixes the destination and database via Duplicati tools.

I’ve added blocksize=100KB to try to keep purge-broken-files happy. I’m using rclone sync to refresh the local destination copy as needed, and a copy command to reset the test DB to the failed one.

One idea I hadn’t tried yet is to just delete versions starting from the mess-up, but that tells me:

  Listing remote folder ...
Missing file: duplicati-20250216T154934Z.dlist.zip.aes
Missing file: duplicati-b8deaff454e8c46d9b0524a8d3d41573b.dblock.zip.aes
Missing file: duplicati-b29f1b378849c4e55bd43dbe9da8cf5d7.dblock.zip.aes
Missing file: duplicati-i162a52604e33461d8844a395e75c97df.dindex.zip.aes
Missing file: duplicati-i55861405c35f4c569ad171016b5f66d5.dindex.zip.aes
Found 5 files that are missing from the remote storage, please run repair
The operation Delete has failed with error: Found 5 files that are missing from the remote storage, please run repair => Found 5 files that are missing from the remote storage, please run repair

ErrorID: MissingRemoteFiles
Found 5 files that are missing from the remote storage, please run repair
Return code: 100

Without checking the names, this sounds like the second part of the foul-up stopping the fix of the first part.

Reset things, and list-broken-files says:

  Listing remote folder ...
0	: 2/16/2025 10:49:34 AM	(7 match(es))
(shows 6 paths)
Return code: 0

purge-broken-files says:

  Listing remote folder ...
The operation PurgeBrokenFiles has failed with error: Unexpected number of remote volumes marked as deleted. Found 1 filesets, but 0 volumes => Unexpected number of remote volumes marked as deleted. Found 1 filesets, but 0 volumes

System.Exception: Unexpected number of remote volumes marked as deleted. Found 1 filesets, but 0 volumes
   at Duplicati.Library.Main.Database.LocalDeleteDatabase.DropFilesetsFromTable(DateTime[] toDelete, IDbTransaction transaction)+MoveNext()
   at System.Collections.Generic.LargeArrayBuilder`1.AddRange(IEnumerable`1 items)
   at System.Collections.Generic.EnumerableHelpers.ToArray[T](IEnumerable`1 source)
   at Duplicati.Library.Main.Operation.PurgeFilesHandler.DoRun(LocalPurgeDatabase db, IFilter filter, Action`3 filtercommand, Single pgoffset, Single pgspan)
   at Duplicati.Library.Main.Operation.PurgeFilesHandler.Run(LocalPurgeDatabase db, Single pgoffset, Single pgspan, Action`3 filtercommand)
   at Duplicati.Library.Main.Operation.PurgeBrokenFilesHandler.Run(IFilter filter)
   at Duplicati.Library.Main.Controller.<>c__DisplayClass43_0.<PurgeBrokenFiles>b__0(PurgeBrokenFilesResults result)
   at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)
   at Duplicati.Library.Main.Controller.RunAction[T](T result, Action`1 method)
   at Duplicati.Library.Main.Controller.PurgeBrokenFiles(IFilter filter)
   at Duplicati.CommandLine.Commands.PurgeBrokenFiles(TextWriter outwriter, Action`1 setup, List`1 args, Dictionary`2 options, IFilter filter)
   at Duplicati.CommandLine.Program.ParseCommandLine(TextWriter outwriter, Action`1 setup, Boolean& verboseErrors, String[] args)
   at Duplicati.CommandLine.Program.RunCommandLine(TextWriter outwriter, TextWriter errwriter, Action`1 setup, String[] args)
Return code: 100

and one guess at the message (I could look at the code, but won’t now) is that 0 volumes refers to the dlist file that somehow didn’t show up at the destination in the second part of the problem. It’s still in the database records, explaining the 1 filesets. Repair has some ability to re-upload these from an intact database. I already mentioned a case where it uploaded a bad dindex, so let’s see if it will upload a dlist.
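
One hedged way to test this guess is to look for Fileset rows whose VolumeID no longer resolves to a Remotevolume row. This assumes the job database links the two tables via a VolumeID column, which is my reading of the schema and may differ between versions:

```python
import sqlite3

# Hypothetical check: filesets still in the database whose backing
# remote volume row is gone. Column names (ID, Timestamp, VolumeID)
# are assumptions from my reading of the job-database schema.
ORPHAN_FILESETS = """
SELECT "F"."ID", "F"."Timestamp", "F"."VolumeID"
FROM "Fileset" "F"
LEFT OUTER JOIN "Remotevolume" "R" ON "R"."ID" = "F"."VolumeID"
WHERE "R"."ID" IS NULL
"""

def orphan_filesets(db_path: str):
    # Returns (fileset id, timestamp, dangling volume id) tuples.
    con = sqlite3.connect(db_path)
    try:
        return con.execute(ORPHAN_FILESETS).fetchall()
    finally:
        con.close()
```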

is where things started after the reset, although I see that these went to Deleting in some step:

Reset and watch more closely. This might explain why Repair in earlier run didn’t upload a dlist.

It wasn’t the Delete. It wasn’t list-broken-files. Test Repair before things get worse. Repair says:

The backup storage destination is missing data files. You can either enable --rebuild-missing-dblock-files or run the purge command to remove these files. The following files are missing: duplicati-b8deaff454e8c46d9b0524a8d3d41573b.dblock.zip.aes, duplicati-b29f1b378849c4e55bd43dbe9da8cf5d7.dblock.zip.aes

So here it is again asking for a previously ineffective Repair with rebuild-missing-dblock-files. Running Repair first isn’t working, and running purge-broken-files shown above wasn’t either.

EDIT 1:

To update this: the oddity with the incorrect-blocksize SQL theory didn’t lead to any easy repro; however, I haven’t yet put a debugger on a backup that’s actually able to demonstrate the error.

I thought of a possible additional way to back things out to, say, Feb 15, because although some destination files have been deleted, my rclone sync to local uses --backup-dir, so I still have them. Thinking about this is why I was looking at SyncTool, as I only gave rclone read-only B2 access.

My rolling history of job database saves will provide a database, if I can get the destination to match again.

It’d still be nice if Duplicati methods can fix up the busted database, but this might be beyond it…

Meanwhile, my other Duplicati backup is taking care of things, until this one gets back, somehow.

Turns out it does not in all cases:

That note was aimed at the continuation after dblock Put fail in the first part of this problem.
Does it also plausibly explain the second part, where Puts didn’t actually leave files (ouch)?

If there’s some signature to this that I haven’t already mentioned, I’ve got old DBs and logs.
Knowing symptoms at different levels would be handy in handling any future reports of this.

Yes (ouch indeed). The destination server is told in advance how large the file is. When the timeout happens it stops the transfer, and the destination server correctly discards the upload.

Since the failure is treated as a requested cancellation, no errors are triggered. During the post-backup verification step, this missing file is then seen as a partial upload and purged from the database, leaving no trace of the problem, besides an entry in the table RemoteOperation.

This post-backup cleanup was the reason that no testing found it. It simply purges the data associated with the missing file, making it as if some operations never happened. Only in the rare cases where it affects dlist files does it become immediately visible to the user.

It is hard to find in the database, but what you will see is that there is a PUT entry in RemoteOperation mentioning the file, but it is missing in Remotevolume.

Because files can be removed later, a missing entry in Remotevolume can be correct or a symptom of the error.
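
A hedged sketch of that check, run against a copy of the job database (the lowercase 'put' value and the column names are assumptions from my reading of the schema, and as noted above a hit can also be a legitimately deleted file, so treat results as leads, not proof):

```python
import sqlite3

# Files with a put entry in RemoteOperation but no row in Remotevolume.
# "Operation" / "Path" / "Name" column names are assumptions.
SUSPECT_PUTS = """
SELECT DISTINCT "O"."Path"
FROM "RemoteOperation" "O"
LEFT OUTER JOIN "Remotevolume" "R" ON "R"."Name" = "O"."Path"
WHERE LOWER("O"."Operation") = 'put' AND "R"."Name" IS NULL
"""

def suspect_puts(db_path: str):
    # Sorted list of uploaded names with no surviving Remotevolume row.
    con = sqlite3.connect(db_path)
    try:
        return sorted(row[0] for row in con.execute(SUSPECT_PUTS))
    finally:
        con.close()
```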

I have been building a more strict verification and cleanup strategy, so errors of this kind will be caught should they happen in other places.

Not seen. Below are the RemoteOperation backup Paths, each with its State in Remotevolume:

Uploaded	duplicati-b8deaff454e8c46d9b0524a8d3d41573b.dblock.zip.aes
Verified	duplicati-b5a714fb2fee94bf1b6fb131d48d23a65.dblock.zip.aes
Uploaded	duplicati-i162a52604e33461d8844a395e75c97df.dindex.zip.aes
Verified	duplicati-ib1a923650b104045a91fb8992fb81af1.dindex.zip.aes
Uploaded	duplicati-b29f1b378849c4e55bd43dbe9da8cf5d7.dblock.zip.aes
Uploaded	duplicati-i55861405c35f4c569ad171016b5f66d5.dindex.zip.aes
Uploaded	duplicati-20250216T154934Z.dlist.zip.aes

This from a database last modified 10:51:55 AM. Profiling log view of the same files:

2025-02-16 10:50:54 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b8deaff454e8c46d9b0524a8d3d41573b.dblock.zip.aes (50.01 MiB)
2025-02-16 10:50:56 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b5a714fb2fee94bf1b6fb131d48d23a65.dblock.zip.aes (49.98 MiB)
2025-02-16 10:51:11 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b8deaff454e8c46d9b0524a8d3d41573b.dblock.zip.aes (50.01 MiB)
2025-02-16 10:51:16 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b5a714fb2fee94bf1b6fb131d48d23a65.dblock.zip.aes (49.98 MiB)
2025-02-16 10:51:17 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-i162a52604e33461d8844a395e75c97df.dindex.zip.aes (29.23 KiB)
2025-02-16 10:51:17 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-ib1a923650b104045a91fb8992fb81af1.dindex.zip.aes (24.62 KiB)
2025-02-16 10:51:17 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-ib1a923650b104045a91fb8992fb81af1.dindex.zip.aes (24.62 KiB)
2025-02-16 10:51:17 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-i162a52604e33461d8844a395e75c97df.dindex.zip.aes (29.23 KiB)
2025-02-16 10:51:36 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b29f1b378849c4e55bd43dbe9da8cf5d7.dblock.zip.aes (22.62 MiB)
2025-02-16 10:51:40 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b29f1b378849c4e55bd43dbe9da8cf5d7.dblock.zip.aes (22.62 MiB)
2025-02-16 10:51:41 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-i55861405c35f4c569ad171016b5f66d5.dindex.zip.aes (62.22 KiB)
2025-02-16 10:51:41 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-i55861405c35f4c569ad171016b5f66d5.dindex.zip.aes (62.22 KiB)
2025-02-16 10:51:41 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-20250216T154934Z.dlist.zip.aes (948.67 KiB)
2025-02-16 10:51:42 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-20250216T154934Z.dlist.zip.aes (948.67 KiB)

EDIT 1:

Above was made by search for BackendEvent, narrowed to capture times of interest.

EDIT 2:

Isolating on the logging of the first file: it’s in Remotevolume before its upload even starts,
although as Temporary. Later changes to each file’s State don’t seem to be in this log.

2025-02-16 10:50:25 -05 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteScalarInt64]: Starting - ExecuteScalarInt64: INSERT INTO "Remotevolume" ("OperationID", "Name", "Type", "State", "Size", "VerificationCount", "DeleteGraceTime") VALUES (2138, "duplicati-b8deaff454e8c46d9b0524a8d3d41573b.dblock.zip.aes", "Blocks", "Temporary", -1, 0, 0); SELECT last_insert_rowid();
2025-02-16 10:50:25 -05 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteScalarInt64]: ExecuteScalarInt64: INSERT INTO "Remotevolume" ("OperationID", "Name", "Type", "State", "Size", "VerificationCount", "DeleteGraceTime") VALUES (2138, "duplicati-b8deaff454e8c46d9b0524a8d3d41573b.dblock.zip.aes", "Blocks", "Temporary", -1, 0, 0); SELECT last_insert_rowid(); took 0:00:00:00.000
2025-02-16 10:50:54 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b8deaff454e8c46d9b0524a8d3d41573b.dblock.zip.aes (50.01 MiB)
2025-02-16 10:51:11 -05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b8deaff454e8c46d9b0524a8d3d41573b.dblock.zip.aes (50.01 MiB)
2025-02-16 10:51:48 -05 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingFile]: Missing file: duplicati-b8deaff454e8c46d9b0524a8d3d41573b.dblock.zip.aes

You can see here that they started and completed, so those files are good wrt the 2.1.0.4 bug.
You are looking for files that only have “Started”. These will be missing in Remotevolume.

That looks different. The last warning can only be emitted if the remotevolume has state “Uploaded”, which is consistent with “Put - Completed”.

But why is the update to the Remotevolume table not recorded?
And it does not explain why the file is missing.

I’m keeping some of the historical evidence, but wanted to get the backup up again.

  • Setting blocksize=100KB was the mystery ingredient (speculation above on why)
  • Recreate and get an error, so
  • purge-broken-files which runs nice and clean thanks to the blocksize fix.

Without that, it complains:

The operation PurgeBrokenFiles has failed with error: Found 1 file(s) with missing blocklist hashes => Found 1 file(s) with missing blocklist hashes

so I went to the database to (guessing) find my only file big enough to get two.
They’re there, but if the code is expecting one (computed from a 1 MB blocksize), it’ll fail.

EDIT 1:

Found steps to reproduce the missing blocklist hashes error, and filed issue: