How to recover from an aborted/stuck/crashed backup run?

I’m having trouble completing runs of an existing backup job. There are other posts describing somewhat similar issues (backup appears stuck / status not clear 1 2), but I could not really extract from those posts what I should do in my specific case.

Here is an extensive description of the context of my problem - feel free to first jump to the description of my questions and come back later here to understand the scenario:
I have a backup job (job 1) backing up data to a USB-connected remote.
As described previously, I recently moved to new hardware for the source device and after migrating Duplicati application data incl. jobs to the new hardware, used the most recently run backup of job 1 to restore the backed up data on the new source device.
After initial trouble caused by my restore strategy, I decided to delay all backup runs of the existing job for some weeks and to create an additional job 2, with an increased blocksize and another remote destination, and added the source folders of job 1 to it iteratively, with backup runs in between, so that each individual run only took a couple of hours. Having two remotes for the same data is part of my redundancy strategy, so I want both jobs 1 & 2 to run in the future.
Within the past month after moving to the new source device, I have added some new source folders to jobs 1 & 2, even though only the latter job was actually scheduled to run. That went generally well, so I assumed I was ready to try to un-retire job 1, which had originally run on the old source device. Knowing that the source of job 1 prior to the backup run was about 80 GB smaller than that of job 2, I removed some of the source folders from job 1 (with the idea of adding them back step by step, that is, running the backup job multiple times and each time adding some of the source folders to ensure manageable run times for each backup run).

The issue:

  1. First attempt of job run:

Even though I expected backup job 1 to capture only a few changes, it initially showed 1xx GB to be backed up and ran for about 1.5 hours until (according to the GUI progress bar) 0 bytes remained. From then on, the GUI no longer gave away what was happening, and there was no observable traffic to the remote for most of the time. After an additional hour of patience, I decided to abort the job run (first using the option “after current file”, then the ASAP option). Three hours later the job ended with the error message The channel “BackendRequests” is retired and an exception logged in About → Show Log → Stored:

Apr 24, 2024 9:29 PM: Failed while executing "Backup" with id: 2
CoCoL.RetiredException: The channel "BackendRequests" is retired
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at CoCoL.Channel`1.<WriteAsync>d__32.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Duplicati.Library.Main.Operation.BackupHandler.<FlushBackend>d__19.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Duplicati.Library.Main.Operation.BackupHandler.<RunAsync>d__20.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at CoCoL.ChannelExtensions.WaitForTaskOrThrow(Task task)
   at Duplicati.Library.Main.Controller.<>c__DisplayClass14_0.<Backup>b__0(BackupResults result)
   at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)
   at Duplicati.Library.Main.Controller.Backup(String[] inputsources, IFilter filter)
   at Duplicati.Server.Runner.Run(IRunnerData data, Boolean fromQueue)
  2. Second attempt of job run:

The next day, I reduced the list of source folders for the job one more time and gave it a second try. This time, the size of data to be backed up indicated in the progress bar was only about 20 GB, and after only a few minutes the GUI status indicated that 0 bytes remained. On the remote, 5 files had been created in that time. Afterwards, nothing observable happened; there was zero traffic to the remote for hours. Duplicati, however, actively used the CPU, and the local database was still open (with a -journal temporary file existing).
After some hours, I lost my patience and tried to abort the backup. Again, there was no indicator of the abort being in progress except for the live log stating that the abort request had been received. There was also no more traffic to the remote and no files were written there. After about seven hours, I paused and resumed the job using the GUI option, hoping for a sign of a changed status. The only effect was that Duplicati no longer used considerable amounts of CPU (even after the resume), and the GUI was no longer responsive (reloading the localhost webpage took minutes). After two more hours in which no data had been written to the remote, the local database file’s last-modified timestamp remained unchanged and the GUI was still not responsive, so I quit the tray icon application and shut down the source device.

So that is the current status:

  • Duplicati reports three partial backups:
    – Two from the first attempt with the same time according to the GUI dropdown - not sure why there are two of them
    – One from the second attempt
  • The local temp DB file with the “-journal” suffix seems to have vanished.
  • Log → General for the backup job does not mention those backups.
  • Log → Remote looks quite normal to me but reflects the long intervals in which nothing visible happened in the GUI or on the remote (unfortunately too long to paste directly):
    job_remote_log.zip (23.6 KB)

Consequently, I do have a couple of questions:

  1. I’m unsure what this state means for the integrity of the remote data and of the job as such, and thus do not know how best to proceed. Should I do any checking, cleanup or repair before proceeding?
  2. What would be the best way to proceed with the backup job? Would it help to reduce the source folder list to a minimal set of data (let’s say, a single file) for the next run? Should I enable any logging and if so, where and how (GUI live log vs. logging to file via command line option)?
  3. I am also wondering how I can ensure that my attempts and partial backups do not overwrite the data of previous successful backup runs. Should I turn off smart retention temporarily, or is there an option to flag a specific backup revision as undeletable?

I am okay with the next job run taking up to 24 hours; more is not really feasible for me.

If there is any data missing in the description, let me know.
Thanks for your advice!

Edit: Found & added exception message from first attempt of job run.

This could use some developer input. Meanwhile, I’ll comment some, although lack of proper logs hurts…

I think the status bar shows what is seen to be examined for backup. This is both a file count and a file size. The number can go far higher than the amount actually backed up, after reading through any files that got modified.

[screenshot: example job log summary]

The screenshot is an example of a log report at the end of a backup, showing lots of candidates and little new or modified.

Complete log shows what got uploaded. Deduplication means modified files might result in only a partial upload.

“BytesUploaded”: 98859909,

You probably noticed the popup explaining:

You can stop the backup after any file uploads currently in progress have finished.

although why uploads would have trouble is unclear. Is the USB-connected remote for the jobs a hard drive?

The author once wrote: “That is unfortunately a “follow error” that happens when something else breaks.”

The stack trace isn’t that great, but maybe it’s here:

How much? If SQLite is pegged, it’s usually one core’s worth, so (for example) CPU is 25% if four cores.
Generally About → Show log → Live → Profiling is what to try, but sometimes it’s too late to see the start (although one might catch the end, and a long SQL run). Logging to a file is best (if issue is reproducible).

There’s a sixth that popped out somehow. I wonder if this last one was after your “seven hours” activities?

Apr 25, 2024 7:02 PM: put duplicati-i4c520942678141a88b5ffb3a2ab3bf95.dindex.zip.aes
Apr 25, 2024 12:39 PM: put duplicati-ba8062607fee047ae8cbe4a9a9f8fe4e8.dblock.zip.aes
Apr 25, 2024 12:39 PM: put duplicati-i16facfc35b914afdaa5e62752a736080.dindex.zip.aes
Apr 25, 2024 12:39 PM: put duplicati-b6606c1c7572c4236b11e7ae68e5a2a13.dblock.zip.aes
Apr 25, 2024 12:39 PM: put duplicati-i2971351cba5d4822978daf35451953b6.dindex.zip.aes
Apr 25, 2024 12:39 PM: put duplicati-b6ea6bb0a5b8a431f96ebe3555f3f4c9b.dblock.zip.aes
Apr 25, 2024 12:37 PM: list

Journals are pretty transient, and used for rollback, e.g. next DB open might use the journal, then discard it.

Although some of this might be the general job log messiness that’s improved in next Beta, job logs are also in the database, so any unclean end can roll back the logs, along with all of the other uncommitted changes.

I already asked about the trailing sixth entry, but as this is also in the database, maybe you got a clean close.

Improve result reporting for interrupted/failed backups, thanks @Jojo-1000

is what should get more job logs, instead of making the user discover logs in About → Show log → Stored.

Was the CPU idle then? One other way to get a slow GUI is to open lots of tabs to Duplicati (clogs browser).

I suppose you could check the USB-connected remote. Is job 2 (which works) on the same drive as job 1 or a different one?
Export As Command-line will get the URL. Edit it to point at an empty folder for Duplicati.CommandLine.BackendTester.exe.
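
For example, run from the Duplicati install directory; the URL here is only a placeholder for an empty test folder on the destination drive:

Duplicati.CommandLine.BackendTester.exe "file://X:/empty-test-folder/"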

You could use the Verify files button if you like, but it’s the same test as runs before and after a backup. Repair button can do more powerful changes (which is nice if it helps, but worse if it hurts), but maybe wait.
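
If you prefer the command line, roughly the same check can be run with the test command. This is only a sketch; the storage URL and sample count are placeholders, and the real URL plus passphrase/database options should come from Export As Command-line:

Duplicati.CommandLine.exe test "file://X:/job1-destination/" 5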

That’s going to get a lot of files tallied as deleted (see image above), which is not a huge problem if an old backup exists that still has the file, but it’ll just mean extra work reconnecting as you build back to present.

You might want to set a log-file=<path> log-file-log-level=verbose, or profiling if you can stand a very big log when you try backup next. For a minimal log, information or retry is good. Maybe it’ll just run OK next time…
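
As a sketch (the log path is just a placeholder), the two options can be added under the job’s Options → Advanced options, or appended to an Export As Command-line run:

--log-file="C:\DuplicatiLogs\job1.log"
--log-file-log-level=Verbose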

Partial just means it didn’t finish backing up all the changes. Remote files don’t get overwritten; however, the retention policy can delete versions (dlist files), turning some of the dblock files into waste to be compacted.

It’d be a good idea to inspect your retention on the Options screen to see if you want to halt deletes temporarily. Turning on no-auto-compact for now would make extra sure that the not-sure-it’s-healthy remote gets less use.
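
For the compact part, this is the advanced option I mean (the retention change itself is made in the Backup retention dropdown on the Options screen, e.g. temporarily switching it to keep all backups):

--no-auto-compact=true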

Compacting files at the backend also adds an unpredictable element to your seemingly limited time window.

That is correct. Based on the GUI progress bar stating 0 bytes to go, I would also not expect issues in uploading. During the time in which no more activity was visible on the remote, I opened the destination directory from time to time in Windows Explorer and that worked normally.

I think I’m lacking the expertise to interpret that, but maybe someone involved in development can. Does that mean the exception occurred in the write process to the remote?

CPU consumption was about 8.x %; the system has 20 logical processors. Note that Task Manager displayed two entries for Duplicati, which together account for the mentioned consumption:
[screenshot: Task Manager showing two Duplicati processes]

I think at that time I had not yet experimented with the pause/resume functionality, but I probably asked for an abort more than once. When I saw the sixth put (file size: 9 KB) I hoped that this was an indicator of progress.
For completeness, I’ll mention that on the first attempt a single file was also put at the very end, after hours of no visible activity (but that may be another story, as that run had ended with the RetiredException and not due to me shutting down the system).

Good to know! I initially planned to make a copy of the -journal file before restarting Duplicati, but while writing my first post here I wanted to look up some data in the GUI and forgot that the -journal file may get lost when starting Duplicati or interacting with the backup job.

How can I find that out? Will Duplicati notify me in the next job run, if the close was not clean?

Yes, it was no more than 0.x %. And yes to the second question, I had probably 6-8 Duplicati tabs open at that time.

Thanks for the suggestion; I checked that one. I had to modify the path retrieved from the export option not only to point to an empty folder but also to remove all backslashes (even the escaped ones) and replace them with single slashes, otherwise the command would fail with

Unittest failed: System.ArgumentException: Illegal characters in path.

PS C:\Program Files\Duplicati 2> .\Duplicati.CommandLine.BackendTester.exe "file://A:/butester/"
Starting run no 0
Generating file 0 (4.29 MB)
Generating file 1 (47.54 MB)
Generating file 2 (11.83 MB)
Generating file 3 (28.12 MB)
Generating file 4 (17.36 MB)
Generating file 5 (29.59 MB)
Generating file 6 (21.79 MB)
Generating file 7 (29.20 MB)
Generating file 8 (28.38 MB)
Generating file 9 (1.02 MB)
Uploading wrong files …
Generating file 10 (1.24 KB)
Uploading file 0, 1.24 KB … done!
Uploading file 0, 1.24 KB … done!
Uploading file 9, 1.24 KB … done!
Uploading files …
Uploading file 0, 4.29 MB … done!
Uploading file 1, 47.54 MB … done!
Uploading file 2, 11.83 MB … done!
Uploading file 3, 28.12 MB … done!
Uploading file 4, 17.36 MB … done!
Uploading file 5, 29.59 MB … done!
Uploading file 6, 21.79 MB … done!
Uploading file 7, 29.20 MB … done!
Uploading file 8, 28.38 MB … done!
Uploading file 9, 1.02 MB … done!
Renaming file 1 from BQr2ShRDtmW7VoNSynNrVcTM65BmgD1jyZzJUmB8EzeTGmcSkqPcDT4z to ewVd6hzYyOApURs3YxtZ3ab553I1rDJre4GBc7blywc2
Verifying file list …
Downloading files
Downloading file 0 … done
Checking hash … done
Downloading file 1 … done
Checking hash … done
Downloading file 2 … done
Checking hash … done
Downloading file 3 … done
Checking hash … done
Downloading file 4 … done
Checking hash … done
Downloading file 5 … done
Checking hash … done
Downloading file 6 … done
Checking hash … done
Downloading file 7 … done
Checking hash … done
Downloading file 8 … done
Checking hash … done
Downloading file 9 … done
Checking hash … done
Deleting files…
Checking retrieval of non-existent file…
*** Retrieval of non-existent file failed: System.IO.FileNotFoundException: Could not find file ‘\?\A:\butester\NonExistentFile-ab097135-39ae-400d-9983-ed6ba34e8028’.
File name: ‘\?\A:\butester\NonExistentFile-ab097135-39ae-400d-9983-ed6ba34e8028’
at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
at System.IO.File.InternalCopy(String sourceFileName, String destFileName, Boolean overwrite, Boolean checkHost)
at Duplicati.CommandLine.BackendTester.Program.Run(List1 args, Dictionary2 options, Boolean first)
*** Retrieval of non-existent file should have failed with FileMissingException
Checking quota…
Free Space: 75.99 GB
Total Space: 300.00 GB
Checking DNS names used by this backend…
No DNS names reported
Starting run no 1
Generating file 0 (23.82 MB)
Generating file 1 (26.22 MB)
Generating file 2 (14.66 MB)
Generating file 3 (24.22 MB)
Generating file 4 (46.10 MB)
Generating file 5 (12.46 MB)
Generating file 6 (8.35 MB)
Generating file 7 (9.44 MB)
Generating file 8 (1.10 MB)
Generating file 9 (6.58 MB)
Uploading wrong files …
Generating file 10 (1.80 KB)
Uploading file 0, 1.80 KB … done!
Uploading file 0, 1.80 KB … done!
Uploading file 9, 1.80 KB … done!
Uploading files …
Uploading file 0, 23.82 MB … done!
Uploading file 1, 26.22 MB … done!
Uploading file 2, 14.66 MB … done!
Uploading file 3, 24.22 MB … done!
Uploading file 4, 46.10 MB … done!
Uploading file 5, 12.46 MB … done!
Uploading file 6, 8.35 MB … done!
Uploading file 7, 9.44 MB … done!
Uploading file 8, 1.10 MB … done!
Uploading file 9, 6.58 MB … done!
Renaming file 1 from ae3fTyJchR71pWIA8q3MGUWFsz6Ysgt5xPnbTI6ZbdBWKOEkvQd to AolDKHKaykelH2ZzGqU7TfsWhqtAqvJxzNAFnLfNhAOYKikZ36
Verifying file list …
Downloading files
Downloading file 0 … done
Checking hash … done
Downloading file 1 … done
Checking hash … done
Downloading file 2 … done
Checking hash … done
Downloading file 3 … done
Checking hash … done
Downloading file 4 … done
Checking hash … done
Downloading file 5 … done
Checking hash … done
Downloading file 6 … done
Checking hash … done
Downloading file 7 … done
Checking hash … done
Downloading file 8 … done
Checking hash … done
Downloading file 9 … done
Checking hash … done
Deleting files…
Checking retrieval of non-existent file…
*** Retrieval of non-existent file failed: System.IO.FileNotFoundException: Could not find file ‘\?\A:\butester\NonExistentFile-3d8f9c3d-dc06-4913-9b0e-1bb8668d52a0’.
File name: ‘\?\A:\butester\NonExistentFile-3d8f9c3d-dc06-4913-9b0e-1bb8668d52a0’
at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
at System.IO.File.InternalCopy(String sourceFileName, String destFileName, Boolean overwrite, Boolean checkHost)
at Duplicati.CommandLine.BackendTester.Program.Run(List1 args, Dictionary2 options, Boolean first)
*** Retrieval of non-existent file should have failed with FileMissingException
Checking quota…
Free Space: 75.99 GB
Total Space: 300.00 GB
Checking DNS names used by this backend…
No DNS names reported
Starting run no 2
Generating file 0 (14.39 MB)
Generating file 1 (11.07 MB)
Generating file 2 (31.78 MB)
Generating file 3 (48.75 MB)
Generating file 4 (15.62 MB)
Generating file 5 (17.68 MB)
Generating file 6 (32.19 MB)
Generating file 7 (502.74 KB)
Generating file 8 (6.69 MB)
Generating file 9 (32.79 MB)
Uploading wrong files …
Generating file 10 (1.75 KB)
Uploading file 0, 1.75 KB … done!
Uploading file 0, 1.75 KB … done!
Uploading file 9, 1.75 KB … done!
Uploading files …
Uploading file 0, 14.39 MB … done!
Uploading file 1, 11.07 MB … done!
Uploading file 2, 31.78 MB … done!
Uploading file 3, 48.75 MB … done!
Uploading file 4, 15.62 MB … done!
Uploading file 5, 17.68 MB … done!
Uploading file 6, 32.19 MB … done!
Uploading file 7, 502.74 KB … done!
Uploading file 8, 6.69 MB … done!
Uploading file 9, 32.79 MB … done!
Renaming file 1 from 5Q7IfvafCggomPuIoyRDCDM9rgP11 to uAvFJsmu6ltrkRoeVsjEMD1JbF99WqebnTdnFsY4k7gn4
Verifying file list …
Downloading files
Downloading file 0 … done
Checking hash … done
Downloading file 1 … done
Checking hash … done
Downloading file 2 … done
Checking hash … done
Downloading file 3 … done
Checking hash … done
Downloading file 4 … done
Checking hash … done
Downloading file 5 … done
Checking hash … done
Downloading file 6 … done
Checking hash … done
Downloading file 7 … done
Checking hash … done
Downloading file 8 … done
Checking hash … done
Downloading file 9 … done
Checking hash … done
Deleting files…
Checking retrieval of non-existent file…
*** Retrieval of non-existent file failed: System.IO.FileNotFoundException: Could not find file ‘\?\A:\butester\NonExistentFile-8ba71f6f-87f8-4afd-a247-62f9535df396’.
File name: ‘\?\A:\butester\NonExistentFile-8ba71f6f-87f8-4afd-a247-62f9535df396’
at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
at System.IO.File.InternalCopy(String sourceFileName, String destFileName, Boolean overwrite, Boolean checkHost)
at Duplicati.CommandLine.BackendTester.Program.Run(List1 args, Dictionary2 options, Boolean first)
*** Retrieval of non-existent file should have failed with FileMissingException
Checking quota…
Free Space: 75.99 GB
Total Space: 300.00 GB
Checking DNS names used by this backend…
No DNS names reported
Starting run no 3
Generating file 0 (28.11 MB)
Generating file 1 (31.53 MB)
Generating file 2 (36.36 MB)
Generating file 3 (6.13 MB)
Generating file 4 (10.85 MB)
Generating file 5 (39.52 MB)
Generating file 6 (15.27 MB)
Generating file 7 (31.30 MB)
Generating file 8 (7.28 MB)
Generating file 9 (10.01 MB)
Uploading wrong files …
Generating file 10 (1.36 KB)
Uploading file 0, 1.36 KB … done!
Uploading file 0, 1.36 KB … done!
Uploading file 9, 1.36 KB … done!
Uploading files …
Uploading file 0, 28.11 MB … done!
Uploading file 1, 31.53 MB … done!
Uploading file 2, 36.36 MB … done!
Uploading file 3, 6.13 MB … done!
Uploading file 4, 10.85 MB … done!
Uploading file 5, 39.52 MB … done!
Uploading file 6, 15.27 MB … done!
Uploading file 7, 31.30 MB … done!
Uploading file 8, 7.28 MB … done!
Uploading file 9, 10.01 MB … done!
Renaming file 1 from b9LVIyaTjSJy4auWH2zyjav4icecPU4wMHxXNJ6k7ISJR4tZbnF5SEGb to agYUTGV5scssHT2sdALunEbVuuICGSqfDW7UhdMxjlT1VOl4rMHNPgJFeczz8tyIU
Verifying file list …
Downloading files
Downloading file 0 … done
Checking hash … done
Downloading file 1 … done
Checking hash … done
Downloading file 2 … done
Checking hash … done
Downloading file 3 … done
Checking hash … done
Downloading file 4 … done
Checking hash … done
Downloading file 5 … done
Checking hash … done
Downloading file 6 … done
Checking hash … done
Downloading file 7 … done
Checking hash … done
Downloading file 8 … done
Checking hash … done
Downloading file 9 … done
Checking hash … done
Deleting files…
Checking retrieval of non-existent file…
*** Retrieval of non-existent file failed: System.IO.FileNotFoundException: Could not find file ‘\?\A:\butester\NonExistentFile-90ce62c9-0f8a-4938-82a8-3c5fe065ab78’.
File name: ‘\?\A:\butester\NonExistentFile-90ce62c9-0f8a-4938-82a8-3c5fe065ab78’
at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
at System.IO.File.InternalCopy(String sourceFileName, String destFileName, Boolean overwrite, Boolean checkHost)
at Duplicati.CommandLine.BackendTester.Program.Run(List1 args, Dictionary2 options, Boolean first)
*** Retrieval of non-existent file should have failed with FileMissingException
Checking quota…
Free Space: 75.99 GB
Total Space: 300.00 GB
Checking DNS names used by this backend…
No DNS names reported
Starting run no 4
Generating file 0 (32.57 MB)
Generating file 1 (49.08 MB)
Generating file 2 (46.68 MB)
Generating file 3 (2.41 MB)
Generating file 4 (28.68 MB)
Generating file 5 (15.70 MB)
Generating file 6 (7.31 MB)
Generating file 7 (4.86 MB)
Generating file 8 (6.38 MB)
Generating file 9 (40.79 MB)
Uploading wrong files …
Generating file 10 (1.32 KB)
Uploading file 0, 1.32 KB … done!
Uploading file 0, 1.32 KB … done!
Uploading file 9, 1.32 KB … done!
Uploading files …
Uploading file 0, 32.57 MB … done!
Uploading file 1, 49.08 MB … done!
Uploading file 2, 46.68 MB … done!
Uploading file 3, 2.41 MB … done!
Uploading file 4, 28.68 MB … done!
Uploading file 5, 15.70 MB … done!
Uploading file 6, 7.31 MB … done!
Uploading file 7, 4.86 MB … done!
Uploading file 8, 6.38 MB … done!
Uploading file 9, 40.79 MB … done!
Renaming file 1 from k9YiClFB47tANJjahj9Lan6TvISV7Rgn3JQLAhNyhNo to TPYKYQCbRRYbkRxMf5lkXpN3GVdH8DfqJkDczjhG2fxFovy1XDOQZFeEb6ddM5dnMIBaoun5
Verifying file list …
Downloading files
Downloading file 0 … done
Checking hash … done
Downloading file 1 … done
Checking hash … done
Downloading file 2 … done
Checking hash … done
Downloading file 3 … done
Checking hash … done
Downloading file 4 … done
Checking hash … done
Downloading file 5 … done
Checking hash … done
Downloading file 6 … done
Checking hash … done
Downloading file 7 … done
Checking hash … done
Downloading file 8 … done
Checking hash … done
Downloading file 9 … done
Checking hash … done
Deleting files…
Checking retrieval of non-existent file…
*** Retrieval of non-existent file failed: System.IO.FileNotFoundException: Could not find file ‘\?\A:\butester\NonExistentFile-aee56595-3296-4545-a543-a9bff21a09da’.
File name: ‘\?\A:\butester\NonExistentFile-aee56595-3296-4545-a543-a9bff21a09da’
at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
at System.IO.File.InternalCopy(String sourceFileName, String destFileName, Boolean overwrite, Boolean checkHost)
at Duplicati.CommandLine.BackendTester.Program.Run(List1 args, Dictionary2 options, Boolean first)
*** Retrieval of non-existent file should have failed with FileMissingException
Checking quota…
Free Space: 75.99 GB
Total Space: 300.00 GB
Checking DNS names used by this backend…
No DNS names reported
Unittest complete!

Besides the expected FileMissingException, which is in fact a FileNotFoundException (probably not really an issue?), I do not see any problems with the remote. Do you agree?

I have added these options with log level verbose, compact disabled, and all data now being retained.
Having done that, I will give the job one more try; hopefully the changed options will provide some clue for further investigation.

Thanks for your help so far!

A developer who knows the area could say more certainly, but Duplicati has lots of processing components that pass data to each other through channels, somewhat like the pipes that Linux and even Command Prompt can use.

I don’t know why the BackendUploader would have gone away, but it might explain why channel to it broke.

The first is the original install, and just launches the second after looking for the latest updated version of it.

I don’t know that it looks. I think that SQLite rolls back the transaction automatically on open.

Rollback Journals

If a crash or power loss occurs in the middle of a transaction, then the rollback journal file is left on disk. The next time another application attempts to open the database file, it notices the presence of the abandoned rollback journal (we call it a “hot journal” in this circumstance) and uses the information in the journal to restore the database to its state prior to the start of the incomplete transaction.

Can’t spot anything wrong, but it was worth a test because I’m not sure why things are being troublesome.

I searched for this phrase with Google, and found a couple of references. The most interesting one was:

2.0.7.101_canary_2024-03-08

Fixed exception happening when stopping the running backup, thanks @Jojo-1000

which matches:

So this is possibly also fixed in next Beta.

Thanks for the explanations!

That is great to hear! Is there anything I can do in the meantime? From my understanding, the fix tackles the exception that I encountered but the problem of the backup job run being seemingly stuck is probably a separate one to be investigated.

The job has now been running for 1 h 15 min:

  • First it took a few minutes to start; then, after a status message that the previous backup would be completed, the expected progress bar status appeared: 8824 files (-10648784 bytes) to go at …
  • Duplicati modified or created four files on the remote in an interval of about one minute.
  • Since then, no further progress or activity on the remote has been observable. CPU consumption is between 6.x and 8.x %.
    Neither the live log nor the log file has been extended since; the last message is from about 5 minutes after the start, when the last file had been put to the remote:
  1. Last log file entries:
2024-04-27 00:24:12 +02 - [Verbose-Duplicati.Library.Main.Operation.Backup.FilePreFilterProcess.FileEntry-CheckFileForChanges]: Checking file for changes C:\Users\user\AppData\Local\Duplicati\NJTFCXSFBJ.sqlite, new: False, timestamp changed: True, size changed: False, metadatachanged: True, 2024-03-20 5:33:22 PM vs 2024-03-11 2:59:32 PM
2024-04-27 00:24:14 +02 - [Verbose-Duplicati.Library.Main.Operation.Backup.FileBlockProcessor.FileEntry-ChangedFile]: File has changed C:\Users\user\AppData\Local\Duplicati\NJTFCXSFBJ.sqlite
2024-04-27 00:24:41 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b89935303fcf54ceea825ae3fef3931f0.dblock.zip.aes (99.91 MB)
2024-04-27 00:24:46 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b89935303fcf54ceea825ae3fef3931f0.dblock.zip.aes (99.91 MB)
2024-04-27 00:24:46 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-i034a63b4872d4dcd83dd136b906dac00.dindex.zip.aes (77.59 KB)
2024-04-27 00:24:46 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-i034a63b4872d4dcd83dd136b906dac00.dindex.zip.aes (77.59 KB)
2024-04-27 00:24:49 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-bdd98b82632464cafaefb9ad65dff6c9f.dblock.zip.aes (39.88 MB)
  2. Last live log entries (starting from the entry corresponding to the last log file entry, ending with the most recent entry on top):
Apr 27, 2024 12:24 AM: ExecuteReader: SELECT f."Path", fs."FileID", fs."Lastmodified", COALESCE(bs."Length", -1) FROM (SELECT DISTINCT "FileID", "Lastmodified" FROM "FilesetEntry-19F3ACBA2C66B04BB4A1064D13BA5EE4") AS fs LEFT JOIN "File" AS f ON fs."FileID" = f."ID" LEFT JOIN "Blockset" AS bs ON f."BlocksetID" = bs."ID"; took 0:00:00:06.546
Apr 27, 2024 12:24 AM: Starting - ExecuteReader: SELECT f."Path", fs."FileID", fs."Lastmodified", COALESCE(bs."Length", -1) FROM (SELECT DISTINCT "FileID", "Lastmodified" FROM "FilesetEntry-19F3ACBA2C66B04BB4A1064D13BA5EE4") AS fs LEFT JOIN "File" AS f ON fs."FileID" = f."ID" LEFT JOIN "Blockset" AS bs ON f."BlocksetID" = bs."ID";
Apr 27, 2024 12:24 AM: ExecuteNonQuery: CREATE TEMPORARY TABLE "FilesetEntry-19F3ACBA2C66B04BB4A1064D13BA5EE4" AS SELECT "FileID", "Lastmodified" FROM (SELECT DISTINCT "FilesetID", "FileID", "Lastmodified" FROM "FilesetEntry" WHERE "FilesetID" = 16 AND "FileID" NOT IN (SELECT "FileID" FROM "FilesetEntry" WHERE "FilesetID" = 17)) took 0:00:00:00.444
Apr 27, 2024 12:24 AM: Uploaded 39.88 MB in 00:00:00.0313744, 1.24 GB/s
Apr 27, 2024 12:24 AM: Starting - ExecuteNonQuery: CREATE TEMPORARY TABLE "FilesetEntry-19F3ACBA2C66B04BB4A1064D13BA5EE4" AS SELECT "FileID", "Lastmodified" FROM (SELECT DISTINCT "FilesetID", "FileID", "Lastmodified" FROM "FilesetEntry" WHERE "FilesetID" = 16 AND "FileID" NOT IN (SELECT "FileID" FROM "FilesetEntry" WHERE "FilesetID" = 17))
Apr 27, 2024 12:24 AM: ExecuteScalarInt64: SELECT "ID" FROM "Fileset" WHERE "Timestamp" < 1714170037 AND "ID" != 17 ORDER BY "Timestamp" DESC took 0:00:00:00.000
Apr 27, 2024 12:24 AM: Starting - ExecuteScalarInt64: SELECT "ID" FROM "Fileset" WHERE "Timestamp" < 1714170037 AND "ID" != 17 ORDER BY "Timestamp" DESC
Apr 27, 2024 12:24 AM: Backend event: Put - Started: duplicati-bdd98b82632464cafaefb9ad65dff6c9f.dblock.zip.aes (39.88 MB)

One more comment, just in case you suspect the backed-up source files to be faulty: the source file mentioned in the log is the last one listed to be backed up. It is the Duplicati database file of another retired job I had used on the old device but deactivated on the new device (last run on 2024-03-11 on the old device, periodic run probably deactivated on 2024-03-20). That database file is also included in job 2. If it helps in the investigation of the issue, I can check whether this file has successfully been included in the previous, aborted partial runs of un-retired job 1 (I could try to restore it from one of those and see which of the two timestamps it has).

I’m not sure how long it is reasonable to wait for a happy ending this time (I can give Duplicati the rest of the night, but I’m unsure whether that really helps), nor what the proper way to abort would be if the GUI function does not respond, as previously observed. If you have any recommendations, let me know.

While backing up the currently active database is suspected of possibly causing hangs, this one is old. Unrelated unused databases “should” be fine. Another clue is that it looks like your Options screen has set “Remote volume size” to 100 MB, twice the default. You should see a series of files about that size; however, when the data to back up finally runs out, the last dblock file is smaller, and here yours was 40 MB.

Did duplicati-bdd98b82632464cafaefb9ad65dff6c9f.dblock.zip.aes appear at the destination? duplicati-i034a63b4872d4dcd83dd136b906dac00.dindex.zip.aes?
I don’t know why it’s so difficult writing files to the USB drive.

You can also look at About → System info → Server state properties → lastPgEvent to check write info.

lastPgEvent : {"BackupID":"1","TaskID":5,"BackendAction":"Put","BackendPath":"ParallelUpload","BackendFileSize":104336049,"BackendFileProgress":103373172,"BackendSpeed":1351751,"BackendIsBlocking":false,"CurrentFilename":null,"CurrentFilesize":0,"CurrentFileoffset":0,"CurrentFilecomplete":false,"Phase":"Backup_WaitForUpload","OverallProgress":0,"ProcessedFileCount":6833,"ProcessedFileSize":8146975251,"TotalFileCount":6833,"TotalFileSize":8146975251,"StillCounting":false}

is an example at the end of a backup where BackendFileProgress hasn’t quite reached BackendFileSize, meaning uploading is a bit behind. On the production side, ProcessedFile* look like it’s done with that part.

EDIT 1:

has the explanation for that, however it bothers me a lot more when the backup stops like yours is doing.

EDIT 2:

(deleted because I was reading a live log going down, and I need to be reading bottom to top. Try again.)

EDIT 3:

Actually, it looks like the bottom-most 39.88 MB file made it (reading bottom to top). Normally a dindex will follow, and at the end of all the dblock/dindex pairs, a dlist should go up. I guess we’ll see what logs show.

Both files exist at remote.

According to that, BackendFileProgress equals BackendFileSize:

lastPgEvent : {"BackupID":"2","TaskID":2,"BackendAction":"Put","BackendPath":"ParallelUpload","BackendFileSize":146665844,"BackendFileProgress":146665844,"BackendSpeed":12554,"BackendIsBlocking":false,"CurrentFilename":"C:\\Users\\user\\AppData\\Local\\Duplicati\\NJTFCXSFBJ.sqlite","CurrentFilesize":1127022592,"CurrentFileoffset":1127022592,"CurrentFilecomplete":true,"Phase":"Backup_ProcessingFiles","OverallProgress":0,"ProcessedFileCount":18523,"ProcessedFileSize":36087390675,"TotalFileCount":27347,"TotalFileSize":36076741891,"StillCounting":false}

Correct, the file is present on the remote; no further dindex or dlist file has followed up to now. The job is still running, GUI status unchanged.
No new lines in the log file, but three new messages in the live log:

Apr 27, 2024 1:48 AM: Cannot open WMI provider \\localhost\root\virtualization\v2. Hyper-V is probably not installed.
Apr 27, 2024 1:48 AM: Using WMI provider \\localhost\root\virtualization\v2
Apr 27, 2024 1:48 AM: Cannot find any MS SQL Server instance. MS SQL Server is probably not installed.

Regarding the WMI provider and SQL Server messages:

It’s too bad nothing helpful is coming in log. The live log seems to sometimes miss log lines though.

If it’s still using CPU of roughly one core, plus possible OS overhead from any I/O that SQLite wants,
Task Manager has I/O reads (and similar) columns you can enable to see if anything’s happening.

Sysinternals Process Explorer gives a better view, but it’s probably not installed on system currently.
Sysinternals Process Monitor can show what files Duplicati is accessing (if it’s accessing files now).

Commonly when a backup gets too big for its blocksize, SQLite gets very busy accessing its files. Sometimes adding SQLite cache helps, but it’s done by an environment variable, so a little awkward.

I have some good news:
The backup job completed 5 h 25 min after starting (start: 00:20 AM, end: 05:46 AM).
On the remote, one dindex and two dlist files appeared at the very end of the backup process at 05:46 AM.
From the available logs I do not really understand what happened during most of that time (00:25 AM to 05:45 AM) because there are almost no entries, neither in the live log nor in the saved log file. The next messages after the ones copied in my previous posts are:

  1. New entries in live log (too many characters to post directly):
    job_live_log_new_entries.zip (3.7 KB)
    So for the live log, in the interval in which no visible progress happens, only the Hyper-V / MS SQL Server / WMI provider messages appear twice, but nothing else.
  2. New entries in saved log file:
2024-04-27 05:45:45 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-bdd98b82632464cafaefb9ad65dff6c9f.dblock.zip.aes (39.88 MB)
2024-04-27 05:46:27 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-i94139bbd21cb408ca1bcced2d5bd664a.dindex.zip.aes (32.04 KB)
2024-04-27 05:46:27 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-i94139bbd21cb408ca1bcced2d5bd664a.dindex.zip.aes (32.04 KB)
2024-04-27 05:46:27 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-20240426T222037Z.dlist.zip.aes (2.28 MB)
2024-04-27 05:46:27 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-20240426T222037Z.dlist.zip.aes (2.28 MB)
2024-04-27 05:46:28 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-20240425T103610Z.dlist.zip.aes (62.84 MB)
2024-04-27 05:46:31 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-20240425T103610Z.dlist.zip.aes (62.84 MB)
2024-04-27 05:46:32 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  ()
2024-04-27 05:46:32 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (2.82 KB)
2024-04-27 05:46:32 +02 - [Information-Duplicati.Library.Main.Operation.FilelistProcessor-RemoteUnwantedMissingFile]: removing file listed as Temporary: duplicati-20240425T103609Z.dlist.zip.aes
2024-04-27 05:46:33 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-20240426T222037Z.dlist.zip.aes (2.28 MB)
2024-04-27 05:46:33 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-20240426T222037Z.dlist.zip.aes (2.28 MB)
2024-04-27 05:46:33 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-ia41acd4ef57e4049ace8b09b5ca35509.dindex.zip.aes (14.56 KB)
2024-04-27 05:46:33 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-ia41acd4ef57e4049ace8b09b5ca35509.dindex.zip.aes (14.56 KB)
2024-04-27 05:46:33 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-be550b9dfae8044cc9e24e6ffc689d1d5.dblock.zip.aes (99.95 MB)
2024-04-27 05:46:34 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-be550b9dfae8044cc9e24e6ffc689d1d5.dblock.zip.aes (99.95 MB)
  3. The general log of the job run is now also available (Expand job → Reporting → Show Log → General), which looks like a summary of the job log file:
Apr 27, 2024 5:46 AM - Operation: Backup
Time
  Start     2024-04-27 00:20:37
  End       2024-04-27 05:46:34
  Duration  05:25:57
Source Files
  Examined  18523 (33.61 GB)
  Opened    2955 (9.91 GB)
  Added     2681 (8.62 GB)
  Modified  274 (1.29 GB)
  Deleted   887041
Test Phase
  Start         2024-04-27 05:46:33
  End           2024-04-27 05:46:34
  Duration      00:00:01
  Verifications 3
  Warnings      0
  Errors        0
Warnings 0
Errors 0
Complete log

{
  "DeletedFiles": 887041,
  "DeletedFolders": 80747,
  "ModifiedFiles": 274,
  "ExaminedFiles": 18523,
  "OpenedFiles": 2955,
  "AddedFiles": 2681,
  "SizeOfModifiedFiles": 1381506370,
  "SizeOfAddedFiles": 9254360244,
  "SizeOfExaminedFiles": 36087387660,
  "SizeOfOpenedFiles": 10639404036,
  "NotProcessedFiles": 0,
  "AddedFolders": 1548,
  "TooLargeFiles": 0,
  "FilesWithError": 0,
  "ModifiedFolders": 0,
  "ModifiedSymlinks": 0,
  "AddedSymlinks": 0,
  "DeletedSymlinks": 0,
  "PartialBackup": false,
  "Dryrun": false,
  "MainOperation": "Backup",
  "CompactResults": null,
  "VacuumResults": null,
  "DeleteResults": null,
  "RepairResults": null,
  "TestResults": {
    "MainOperation": "Test",
    "VerificationsActualLength": 3,
    "Verifications": [
      {
        "Key": "duplicati-20240426T222037Z.dlist.zip.aes",
        "Value": []
      },
      {
        "Key": "duplicati-ia41acd4ef57e4049ace8b09b5ca35509.dindex.zip.aes",
        "Value": []
      },
      {
        "Key": "duplicati-be550b9dfae8044cc9e24e6ffc689d1d5.dblock.zip.aes",
        "Value": []
      }
    ],
    "ParsedResult": "Success",
    "Version": "2.0.7.1 (2.0.7.1_beta_2023-05-25)",
    "EndTime": "2024-04-27T03:46:34.6571012Z",
    "BeginTime": "2024-04-27T03:46:33.7145473Z",
    "Duration": "00:00:00.9425539",
    "MessagesActualLength": 0,
    "WarningsActualLength": 0,
    "ErrorsActualLength": 0,
    "Messages": null,
    "Warnings": null,
    "Errors": null,
    "BackendStatistics": {
      "RemoteCalls": 12,
      "BytesUploaded": 214984043,
      "BytesDownloaded": 107207767,
      "FilesUploaded": 7,
      "FilesDownloaded": 3,
      "FilesDeleted": 0,
      "FoldersCreated": 0,
      "RetryAttempts": 0,
      "UnknownFileSize": 0,
      "UnknownFileCount": 0,
      "KnownFileCount": 2891,
      "KnownFileSize": 150696707391,
      "LastBackupDate": "2024-04-27T00:20:37+02:00",
      "BackupListCount": 9,
      "TotalQuotaSpace": 322122280960,
      "FreeQuotaSpace": 81383854080,
      "AssignedQuotaSpace": -1,
      "ReportedQuotaError": false,
      "ReportedQuotaWarning": false,
      "MainOperation": "Backup",
      "ParsedResult": "Success",
      "Version": "2.0.7.1 (2.0.7.1_beta_2023-05-25)",
      "EndTime": "0001-01-01T00:00:00",
      "BeginTime": "2024-04-26T22:20:37.5682708Z",
      "Duration": "00:00:00",
      "MessagesActualLength": 0,
      "WarningsActualLength": 0,
      "ErrorsActualLength": 0,
      "Messages": null,
      "Warnings": null,
      "Errors": null
    }
  },
  "ParsedResult": "Success",
  "Version": "2.0.7.1 (2.0.7.1_beta_2023-05-25)",
  "EndTime": "2024-04-27T03:46:34.665164Z",
  "BeginTime": "2024-04-26T22:20:37.5682708Z",
  "Duration": "05:25:57.0968932",
  "MessagesActualLength": 32,
  "WarningsActualLength": 0,
  "ErrorsActualLength": 0,
  "Messages": [
    "2024-04-27 00:20:37 +02 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started",
    "2024-04-27 00:22:23 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  ()",
    "2024-04-27 00:22:27 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (2.82 KB)",
    "2024-04-27 00:22:27 +02 - [Information-Duplicati.Library.Main.Operation.FilelistProcessor-RemoteUnwantedMissingFile]: removing file listed as Temporary: duplicati-20240424T134522Z.dlist.zip.aes",
    "2024-04-27 00:22:27 +02 - [Information-Duplicati.Library.Main.Operation.FilelistProcessor-SchedulingMissingFileForDelete]: scheduling missing file for deletion, currently listed as Uploading: duplicati-20240424T134523Z.dlist.zip.aes",
    "2024-04-27 00:22:27 +02 - [Information-Duplicati.Library.Main.Operation.FilelistProcessor-KeepIncompleteFile]: keeping protected incomplete remote file listed as Temporary: duplicati-20240425T103609Z.dlist.zip.aes",
    "2024-04-27 00:22:36 +02 - [Information-Duplicati.Library.Main.Operation.Backup.UploadSyntheticFilelist-PreviousBackupFilelistUpload]: Uploading filelist from previous interrupted backup",
    "2024-04-27 00:23:06 +02 - [Information-Duplicati.Library.Main.Operation.Backup.RecreateMissingIndexFiles-RecreateMissingIndexFile]: Re-creating missing index file for duplicati-ba8062607fee047ae8cbe4a9a9f8fe4e8.dblock.zip.aes",
    "2024-04-27 00:23:09 +02 - [Information-Duplicati.Library.Main.Operation.BackupHandler-SkipUsnForVolume]: Performing full scan for volume \"C:\\\": Previous backup did not record USN journal info",
    "2024-04-27 00:23:09 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-ifc87eb0863e74e57b90e65d36d8f510a.dindex.zip.aes (6.39 KB)",
    "2024-04-27 00:23:09 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-ifc87eb0863e74e57b90e65d36d8f510a.dindex.zip.aes (6.39 KB)",
    "2024-04-27 00:24:41 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b89935303fcf54ceea825ae3fef3931f0.dblock.zip.aes (99.91 MB)",
    "2024-04-27 00:24:46 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b89935303fcf54ceea825ae3fef3931f0.dblock.zip.aes (99.91 MB)",
    "2024-04-27 00:24:46 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-i034a63b4872d4dcd83dd136b906dac00.dindex.zip.aes (77.59 KB)",
    "2024-04-27 00:24:46 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-i034a63b4872d4dcd83dd136b906dac00.dindex.zip.aes (77.59 KB)",
    "2024-04-27 00:24:49 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-bdd98b82632464cafaefb9ad65dff6c9f.dblock.zip.aes (39.88 MB)",
    "2024-04-27 05:45:45 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-bdd98b82632464cafaefb9ad65dff6c9f.dblock.zip.aes (39.88 MB)",
    "2024-04-27 05:46:27 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-i94139bbd21cb408ca1bcced2d5bd664a.dindex.zip.aes (32.04 KB)",
    "2024-04-27 05:46:27 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-i94139bbd21cb408ca1bcced2d5bd664a.dindex.zip.aes (32.04 KB)",
    "2024-04-27 05:46:27 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-20240426T222037Z.dlist.zip.aes (2.28 MB)"
  ],
  "Warnings": [],
  "Errors": [],
  "BackendStatistics": {
    "RemoteCalls": 12,
    "BytesUploaded": 214984043,
    "BytesDownloaded": 107207767,
    "FilesUploaded": 7,
    "FilesDownloaded": 3,
    "FilesDeleted": 0,
    "FoldersCreated": 0,
    "RetryAttempts": 0,
    "UnknownFileSize": 0,
    "UnknownFileCount": 0,
    "KnownFileCount": 2891,
    "KnownFileSize": 150696707391,
    "LastBackupDate": "2024-04-27T00:20:37+02:00",
    "BackupListCount": 9,
    "TotalQuotaSpace": 322122280960,
    "FreeQuotaSpace": 81383854080,
    "AssignedQuotaSpace": -1,
    "ReportedQuotaError": false,
    "ReportedQuotaWarning": false,
    "MainOperation": "Backup",
    "ParsedResult": "Success",
    "Version": "2.0.7.1 (2.0.7.1_beta_2023-05-25)",
    "EndTime": "0001-01-01T00:00:00",
    "BeginTime": "2024-04-26T22:20:37.5682708Z",
    "Duration": "00:00:00",
    "MessagesActualLength": 0,
    "WarningsActualLength": 0,
    "ErrorsActualLength": 0,
    "Messages": null,
    "Warnings": null,
    "Errors": null
  }
}
  4. A couple of entries also exist in Expand job → Reporting → Show Log → Remote:
Apr 27, 2024 5:46 AM: get duplicati-be550b9dfae8044cc9e24e6ffc689d1d5.dblock.zip.aes
Apr 27, 2024 5:46 AM: get duplicati-ia41acd4ef57e4049ace8b09b5ca35509.dindex.zip.aes
Apr 27, 2024 5:46 AM: get duplicati-20240426T222037Z.dlist.zip.aes
Apr 27, 2024 5:46 AM: list
Apr 27, 2024 5:46 AM: put duplicati-20240425T103610Z.dlist.zip.aes
Apr 27, 2024 5:46 AM: put duplicati-20240426T222037Z.dlist.zip.aes
Apr 27, 2024 5:46 AM: put duplicati-i94139bbd21cb408ca1bcced2d5bd664a.dindex.zip.aes
Apr 27, 2024 12:24 AM: put duplicati-bdd98b82632464cafaefb9ad65dff6c9f.dblock.zip.aes
Apr 27, 2024 12:24 AM: put duplicati-i034a63b4872d4dcd83dd136b906dac00.dindex.zip.aes
Apr 27, 2024 12:24 AM: put duplicati-b89935303fcf54ceea825ae3fef3931f0.dblock.zip.aes
Apr 27, 2024 12:23 AM: put duplicati-ifc87eb0863e74e57b90e65d36d8f510a.dindex.zip.aes
Apr 27, 2024 12:22 AM: list

Do you have any idea what may have happened between 00:25 AM and 05:45 AM?
With compact disabled and full retention configured, we at least know that those process steps are not responsible for the delay observed this time.

Now that the job appears to have finished in a clean state, I may try to restart it and see how it behaves now (there should be barely any difference in source compared to the previous run). If you have any other suggestions of what to do next, I’m happy to hear them.
Also, if you need the full log in a single block for the live log or the log file, let me know. For the former, I can check whether my partial copies capture the whole job run interval.

EDIT, responding to the additional information you posted in the meantime:

Okay, that explanation is plausible. In fact, I remember going into “Edit” for the running job to check whether it has a snapshot-policy configured (it has not), then leaving without saving.

I can experiment a bit using those tools if you think restarting the job would be the most valuable option to proceed.

Is there an easy-to-follow way to check whether that is the case? The job details report (note that the source is so small because I removed source folders in the previous partial job runs; it will be about 230 GB after re-adding them):

Source: 33.61 GB
Backup: 140.35 GB

Configured blocksize is 250 KB.

We posted at almost the same time, so see my last post, but it’s a little late to catch the action live now.

The good news is that it finished, but the bad news in that is that it’s harder to figure out where it was before.

You might find something in a log (level varies) near start of backup talking about things like the following:

2024-04-20 06:51:45 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Operation.Common.DatabaseCommon-CommitTransactionAsync]: Starting - PreSyntheticFilelist
2024-04-20 06:51:46 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Operation.Common.DatabaseCommon-CommitTransactionAsync]: PreSyntheticFilelist took 0:00:00:00.452
2024-04-20 06:51:46 -04 - [Information-Duplicati.Library.Main.Operation.Backup.UploadSyntheticFilelist-PreviousBackupFilelistUpload]: Uploading filelist from previous interrupted backup

The synthetic file list is a recovery for an interrupted backup that reflects how far it had gotten by interrupt.
I’m not sure why it’s prepared at start of backup but its old dlist is uploaded at end, next to the current dlist.

2024-04-27 05:46:27 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-20240426T222037Z.dlist.zip.aes (2.28 MB)
2024-04-27 05:46:27 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-20240426T222037Z.dlist.zip.aes (2.28 MB)
2024-04-27 05:46:28 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-20240425T103610Z.dlist.zip.aes (62.84 MB)
2024-04-27 05:46:31 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-20240425T103610Z.dlist.zip.aes (62.84 MB)

The sizes seem strange, unless you cut the backup down as you were thinking of, but I didn’t really advise.
You can look at your other dlist file sizes. Unless your backup shrank a lot, dlist sizes should be similar, but

makes me think you might have cut back, unless there’s a huge amount of file removal on the source drive.

Not much, and we just missed our chance to go in with heavier duty tools to explore the SQLite read theory.
Sometimes when SQLite slows, I see it reading something easy to recognize, like the database or -journal.
Other times it’s reading a file in the user Temp folder that begins with etilqs (which is sqlite backwards).

What I find odd (and again, input from a skilled developer would help) is how the logs got a little misaligned.

was the live log, and reading up finds a good looking progress message of almost certainly the same file:

plus you said you saw it. The question is why the log file had the Started message but not the Completed:

and hours later:

The most definitive way is to catch it with Task Manager, Process Explorer, or Process Monitor; however, normally there would also be a log entry showing some SQL starting. I mentioned the worse reliability of logging via the live log compared to a log file. I’ve also been noting that some SQL doesn’t even log to a log file.

So that probably explains why the dlist got so small (comments above), but you can see if sizes look OK.

You can’t change blocksize on an existing backup. The default is 100 KB. Is that 250 KB for job 1 or job 2?

My rough rule of thumb is to limit a backup to 1 million blocks, but even a short file makes 2 blocks, so effectiveness of increasing blocksize works well on large files, and not as well with lots of smaller files.
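
As a rough worked example with your numbers (block count is roughly unique source size divided by blocksize, ignoring metadata and small-file overhead):

1,000,000 blocks × 250 KB ≈ 250 GB of source data
1,000,000 blocks × 100 KB (the default) ≈ 100 GB of source data

So if the ~230 GB you mentioned is backed up at 250 KB, it is already close to that rule of thumb, and at the default it would be far past it.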

This could use a whole lot more performance testing of backup sizes, but people and gear are scarce.

SQLite’s default memory cache size is only 2 MB, and databases are almost always larger, so getting temporary file activity instead of memory activity is easy to induce – and I/O is far slower than memory.

Query planning can also make bad choices, especially if PRAGMA optimize isn’t run often enough, and there’s some thought that its decisions aren’t always good, so ANALYZE (which PRAGMA optimize runs internally when it sees a need) should be run directly.
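
A minimal sketch of running that directly, assuming Duplicati is stopped and the separate sqlite3 command-line shell is available (it does not ship with Duplicati); the database path is whatever the job’s Database screen shows:

sqlite3 "C:\Users\user\AppData\Local\Duplicati\<job-database>.sqlite" "ANALYZE; PRAGMA optimize;"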

All of this is pointing at general SQLite concern, but I’d note we found no proof that yours was in SQLite.

Thanks, I have tried to incorporate your findings in the edit of my previous post.

That is correct; the chosen log level does not seem to show it in as much detail as your example, but I also have the UploadSyntheticFilelist-PreviousBackupFilelistUpload entry at the beginning of the job run:

2024-04-27 00:20:37 +02 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started
2024-04-27 00:22:23 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  ()
2024-04-27 00:22:27 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (2.82 KB)
2024-04-27 00:22:27 +02 - [Information-Duplicati.Library.Main.Operation.FilelistProcessor-RemoteUnwantedMissingFile]: removing file listed as Temporary: duplicati-20240424T134522Z.dlist.zip.aes
2024-04-27 00:22:27 +02 - [Information-Duplicati.Library.Main.Operation.FilelistProcessor-SchedulingMissingFileForDelete]: scheduling missing file for deletion, currently listed as Uploading: duplicati-20240424T134523Z.dlist.zip.aes
2024-04-27 00:22:27 +02 - [Information-Duplicati.Library.Main.Operation.FilelistProcessor-KeepIncompleteFile]: keeping protected incomplete remote file listed as Temporary: duplicati-20240425T103609Z.dlist.zip.aes
2024-04-27 00:22:36 +02 - [Information-Duplicati.Library.Main.Operation.Backup.UploadSyntheticFilelist-PreviousBackupFilelistUpload]: Uploading filelist from previous interrupted backup

The dlist file size continuously increased up to 62.8 MB in March, when the job was retired. The same size is observable for the interrupted backup’s dlist file. For the successful run afterwards, as you already noted, it decreased considerably to 2.28 MB.
To clarify modifications in the job configuration:

  • I had removed source folders from the job two times: 1) before un-retiring the job (that is, before the first partial job run), and 2) before the second partial job run, hoping for faster progress. Afterwards, I posted in this forum and, following your advice, did not reduce the source folder list any further. I also did not re-add any source folders, to keep the scenario constant and not add another variable to the investigation.
    My interpretation of the dlist file sizes is that a change in the source folder list is only reflected in the dlist if a run has completed successfully (at least, that is what the data in my case suggests).
  • The only modifications in job configuration after posting here in the forum are:

Regarding the odd put:

Well spotted! This is indeed also reflected in the log file (there are no other log messages in between those two):

2024-04-27 00:24:49 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-bdd98b82632464cafaefb9ad65dff6c9f.dblock.zip.aes (39.88 MB)
2024-04-27 05:45:45 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-bdd98b82632464cafaefb9ad65dff6c9f.dblock.zip.aes (39.88 MB)

Correct, I saw that file, and since you already referred to it by its file size in your previous post, I am also pretty sure that it already had a size of 39.88 MB. Even though Duplicati reports the Put as completed at 05:45 AM, the file metadata (last-modified date) still shows 00:24 AM (but I can’t tell whether that really means the file was no longer being changed on the remote at 05:45 AM).
I would also be interested in developers’ thoughts about that.

If that helps, I can change log-file-log-level from verbose to profiling. Verbose level resulted in 85k lines in a 25 MB log file.

The block sizes of the two jobs are:

  1. Job 1: 250 KB
  2. Job 2: 400 KB
    I set the size in the initial configuration, before running the backup for the first time. When creating job 2, I decided to choose a larger block size (mainly because I read that too small a block size may cause problems when the overall number of blocks grows large).

I can test both, but to be honest I do not really know what I would have to do specifically. If I understand correctly, there is an environment variable I could modify for the cache size. If I know its name and which value format it expects, I can test increasing the cache size.
For ANALYZE I am even more clueless. I recall having read about it from time to time when browsing the forum, but I am not sure what to do precisely: Is this something I should run manually, and if so, at which point in time? How can I run it (within the Duplicati GUI, as a command-line option, or even by directly accessing the SQLite database while Duplicati is not running)?

I think I will rerun the backup job one more time with log-file-log-level profiling and in case it appears stuck, check out whether SysInternals tools provide any clue about what is actually happening.

EDIT: Corrected reported log file size (it’s MB, not KB)

EDIT 2: The rerun completed in 2 minutes, so processing time was not suspicious this time. As expected, not much data had to be transferred to the remote (one dindex/dblock pair and a single dlist file). I think I can now gradually extend the source folder list back to the configuration from before the partial backup job runs, and may also enable auto-compact again. I will report whether I am successful. If there is anything else I can do to help reproduce the original issue, let me know.

What’s especially strange to me is that there’s seemingly not much high-level code distance between Uploaded and Completed messages:

A more normal look (except for speed on this debug build, which seems oddly low and unpredictable) is:

2024-04-25 15:45:18 -04 - [Profiling-Duplicati.Library.Main.Operation.Backup.BackendUploader-UploadSpeed]: Uploaded 48.34 MB in 00:01:04.3566658, 769.22 KB/s
2024-04-25 15:45:18 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b5e32ab48f30e4e4da36d66fa08f8d902.dblock.zip.aes (48.34 MB)

I suppose you could look at your logs to see if your Completed is typically a quick follow to UploadSpeed.

CUSTOMSQLITEOPTIONS_DUPLICATI is its name. It was formerly EXPERIMENTAL_CUSTOMSQLITEOPTIONS_DUPLICATI, and you can probably find several usages under one name or the other in the forum and possibly in GitHub Issues.

set CUSTOMSQLITEOPTIONS_DUPLICATI=cache_size=-200000 is how a batch file gets about 200 MB of cache (SQLite treats a negative cache_size as a size in KiB rather than pages).
Command Prompt and a manual launch can also be used. Changing environment variables for a Windows service is awkward compared with a Linux service, where it’s very easy. Windows might need a global set (see the sketch below).
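
To make the “global set” concrete (my sketch, not an official recipe; whether the Duplicati service needs a restart or a reboot to pick it up is an assumption to verify):

setx CUSTOMSQLITEOPTIONS_DUPLICATI "cache_size=-200000" /M
:: Windows, elevated Command Prompt: sets the variable machine-wide; the service may need a restart or reboot to see it

export CUSTOMSQLITEOPTIONS_DUPLICATI=cache_size=-200000
# Linux shell: sets the variable for a Duplicati instance launched manually from the same shell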

Optimize/analyze are probably something the code would have to do, so that’s a developer suggestion. You might have read of running PRAGMA optimize at database close, but for an initial backup, that is a long time away. A proper database close also doesn’t happen if the process is killed, so the next try still hasn’t had the ANALYZE benefits.

This is asking too much of most people, but if something like DB Browser for SQLite is used, you can run ANALYZE there (assuming the database will open) after you have had to kill the process (and therefore missed the optimize at close).
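
For completeness (my example; the filename is a placeholder, and the real job database has a random name in Duplicati’s data folder), the sqlite3 command-line shell can do the same thing while Duplicati is stopped:

sqlite3 /path/to/job-database.sqlite "ANALYZE;"
# refreshes the statistics SQLite's query planner uses; back up the database first if unsure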

Just following up for the current status:
After iteratively re-adding the previously omitted source folders, re-enabling auto-compact, and activating a retention policy, the backup is now up to date. There were no more occurrences of the job appearing stuck for a longer time.

For the backup job runs after the interrupted one, Completed is indeed typically a quick follow to UploadSpeed. For those backup runs, there is only one instance in which the two messages do not appear within the same second (Completed follows about 5 seconds later). In this one instance, the logs indicate that VerifyConsistency finished between UploadSpeed and Completed:

2024-04-27 14:00:21 +02 - [Profiling-Timer.Finished-Duplicati.Library.Main.Operation.BackupHandler-VerifyConsistency]: VerifyConsistency took 0:00:00:06.642

Thanks for explaining. I may experiment a bit with the environment variable to see whether it can generally improve the performance of the backup job. Regarding ANALYZE, I hope I will not need it, but it could be a feature request to add an option in the Duplicati Web GUI to run it (maybe as an alternative to a full repair?), or even to run it automatically if Duplicati can detect that it quit improperly.

Does that mean that even if Duplicati quits improperly, log entries up to that point should exist? Or are they affected by the rollback when restarting Duplicati?
If the latter, could submitting log entries over a second connection and committing them immediately maybe help?

Another (generic) idea could be to make the currently performed step more transparent in the GUI progress bar / status; especially if a single step appears to take hours, it would help to know where exactly we are. I am not sure, though, what a user-friendly description could be, maybe just a “… takes longer than expected, see … for details” message, where … could be a new/extended entry in About → System info that indicates more accurately than currently where the process is stuck.

Thanks again for your help!

It’s good to hear that the huge unexplainable gap isn’t common. I’m not enough of an expert dev to guess at where your hours of delay came from, although if it comes back again, you have new things you can check.

The forum is not a good issue tracker, but you can certainly check whether it’s already in GitHub Issues. If not, you can open one.
The way PRAGMA optimize is explained, it sounds like it’s very fast if it doesn’t actually have to run ANALYZE. Running it at each database open might therefore be quick, although there’s still a question of how well optimize works.

The change makes some things that used to go only into the hard-to-find server log also go to the usual job log; however, I think the job log can still be lost in a transaction rollback. The exact commit points are not documented.

SQLite only allows one writer at a time, I think, although I’m not sure when it’s enforced. Maybe at connect?

Also, wouldn’t commit apply to the whole database, including other changes which aren’t ready to commit?

This is something that worries me currently. Lots of concurrency, and each part commits as it feels a need. What happens if a good time for part A is a bad time for part B? Still looking at how the design handles that.

The status bar is already too crowded for what it tries to show, so it has to use a variable display sequence. Posting specific internal details of SQL and such seems like far too much to show, and unlikely to be comprehensible.

This is where I thought the details could be hidden. Your idea of a combination approach might be workable; however, it would be up to some developer (and they are scarce) to figure out what might be implemented…

shows some early thinking, although a threshold of “a moment” or “a second” seems way too sensitive for large backups. The idea of showing “what it’s currently doing” is also old, as 2.0.3.6_canary_2018-04-23 went concurrent.

I’m still wishing there was a design document on concurrency, with transaction design that fits in properly.

Lacking that, I tried expanding my thinking in the channel pipeline topic into some pipelines. Main might be:

FileEnumerationProcess.cs → SourcePaths → MetadataPreProcess.cs → ProcessedFiles → FilePreFilterProcess.cs → AcceptedChangedFile → FileBlockProcessor.cs → StreamBlock → StreamBlockSplitter.cs → OutputBlocks → DataBlockProcessor.cs → BackendRequest

and several of those (not yet charted here) are doing database actions and occasional commits as they see fit.