Duplicati hangs after backup attempt (OSX) Was: Crashplan Refugee can't get Duplicati to restore

Installed Duplicati on my OS X 10.13.6 machine to experiment with it before migrating the rest of my family over to it. It seems to back up ‘ok’ to my Google Drive, so I thought I’d try a restore. Setting aside that it takes a really long time to drill down to my “~/Documents” folder (several minutes per directory on my 150down/25up link), when I tell it to actually start restoring (to my ~/Desktop/ folder), it says “Waiting for task to begin” for several days. The last log message is from Sept 1st and says:

I’m afraid someone is going to suggest rebooting (because “Mono”, I guess?) but I’m loath to actually do that. I’ve got a ton of terminal windows doing actual work that I don’t want to restart (not all of them are tmux sessions on remote hosts).

Duplicati - 2.0.3.3_beta_2018-04-02

Edit: it must be hung or something. When I click ‘Run now’, nothing happens.

What’s the total size of the backup set in question? If it’s really a “test” set (several gigs or less), I wouldn’t expect it to behave like it is.

Can you try restarting the Duplicati stuff itself? I don’t think restarting your machine would be the first thing on my list of things to try, as long as you can be sure you’ve fully stopped and restarted Duplicati.

It’s 160GB. It’s the backup of 1 host out of the 5 that I am currently backing up to CP.

I restarted Duplicati yesterday by killing Mono and then checking there were no other Duplicati processes running (ps -ef | grep -i duplicat). Then I restarted it. It immediately did a backup that claimed to finish, and the user interface was hung again afterward. “Run now” doesn’t do anything and a restore says “Waiting for task to begin” again.
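
(For reference, the restart cycle described here boils down to something like the following minimal sketch. The assembly name is the one the OS X launcher uses, as seen later in this thread, and the exact paths may differ per install.)

ps -ef | grep -i '[d]uplicat'          # anything still running?
pkill -f Duplicati.GUI.TrayIcon.exe    # stop the tray icon / server process
ps -ef | grep -i '[d]uplicat'          # confirm nothing is left
open /Applications/Duplicati.app       # start it again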

I killed Duplicati again and restarted it, then quickly stopped the backup it tried to start and then began a restore. It appears to work in that case (it’s actually downloading data now).

A few things I’ve noticed are:

  1. after installing Duplicati in early July, it’s only ever done 4 backups (July 5, July 11, Aug 30, Sept 5). They correspond to either reboots or restarts of the Duplicati processes, which strongly suggests that Duplicati hangs after a backup.

  2. Duplicati says there has never been a successful backup. It logs errors about permissions on files I don’t care about (named sockets, the odd file created while I was sudo’d, etc.), so I understand why the backup is never ‘successful’. I doubt that’s relevant to this issue.

Looks like I need a script to kill Duplicati after each backup. Seems sub-optimal.

Further to last, after my restore, if I click “Run now” and then Abort the backup, the UI hangs in the same way, so it’s very repeatable. In fact, very fragile.

The typical “because Mono” heard in the forum involves a need to upgrade from whatever the distribution shipped to a newer Mono straight from the Mono project. I don’t know if this ever requires a reboot, but mine didn’t yesterday when I updated to clear some odd behavior where “Check for updates now” found nothing. It’s OK now.

I don’t run OS X, and I suspect you had to pull things together manually, but Mono is still worth mentioning.

For examining the apparent Duplicati hang, the best thing to try might be running from a job Commandline, setting --log-level (--console-log-level for newer releases) to Information or even Profiling (highly verbose).

Although the boundary between runs gets less clear, one can also watch server activities live by using the About --> Show log --> Live facility, and setting the drop-down to use Information, or Profiling, or whatever.

For long-term unattended logs, one can set the --log-file and related options to log data to look over later.

If you’re going to go into scripts, you might consider scripting the backup instead of killing Duplicati. Exporting a job in a script-friendly format can be done with Configuration --> Export --> As Command-line.
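
For illustration only, an exported command line tends to look roughly like the sketch below once a log file is added for unattended runs; the googledrive URL, authid, dbpath and passphrase are placeholders, not the real job’s values:

mono Duplicati.CommandLine.exe backup \
    "googledrive://Duplicati-backup?authid=PLACEHOLDER" \
    "/Users/you/Documents/" \
    --backup-name="Test backup" \
    --dbpath="/Users/you/.config/Duplicati/XXXXXXXXXX.sqlite" \
    --passphrase="PLACEHOLDER" \
    --log-file="/Users/you/duplicati-cli.log" \
    --log-level=Information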

I used “because Mono” not because I saw it in the forum but because my general experience with Mono under both OSX and Linux is pretty crappy. Personal opinions on software development aside, I’m not inclined to invest a lot of time in mission-critical software that has to run under Mono.

I didn’t have to pull anything together manually; I just downloaded Duplicati and installed it. It wasn’t until this problem arose that I even realized it ran under Mono.

Thanks for the debugging tips. I’ll dig in as time permits.

Interesting about Mono. I hadn’t heard that Duplicati bundles it, but it does need it. I wonder which Mono version you have?

Duplicati won’t startup after update on MacOS has some discussion. Maybe some Mac user can help clarify.

I just upgraded to 5.14 from 5.10. It looks like 5.10 is what is bundled with OSX. I was still able to reproduce the problem.

I’ve browsed that thread and will spend some time on it tomorrow, which is my next window for debugging.

No change. Nothing additional is logged with --console-log-level Information (or even --log-level Information). The server log doesn’t say anything useful. I switched browsers (from Opera to Safari) in case it’s browser related, but it appears not to be. It still hung after doing a backup.

I tried a dtruss on the duplicati process and it appears to be in a poll/timeout loop so presumably one of the other processes has gone awry. The dtruss output is different from when it’s not in the hung state.
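
(In case anyone wants to repeat this kind of poking, a rough sketch, assuming the Mono process shows up under the name “Duplicati” that the launcher sets with exec -a; on 10.13, System Integrity Protection can get in the way of dtruss.)

PID=$(pgrep -x Duplicati | head -1)                 # hypothetical way to find the process
sudo dtruss -p "$PID"                               # watch its system calls; Ctrl-C to stop
sample "$PID" 10 -file /tmp/duplicati-sample.txt    # user-level stack samples for 10 seconds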

I’m not sure what you tried, but --console-log-level Information definitely makes my “Commandline” log more:

Running commandline entry
Finished!

            
Backup started at 9/7/2018 10:34:56 AM
The operation Backup has started
Checking remote backup ...
Backend event: List - Started:  ()
  Listing remote folder ...
Backend event: List - Completed:  (33 bytes)
Scanning local files ...
  4 files need to be examined (45 bytes)
Backend event: Put - Started: duplicati-b25b93732d78b4889bc52d497204e05b8.dblock.zip (512 bytes)
  Uploading file (512 bytes) ...
Backend event: Put - Completed: duplicati-b25b93732d78b4889bc52d497204e05b8.dblock.zip (512 bytes)
Backend event: Put - Started: duplicati-i5343db5750d74dbbbfdff3952628efde.dindex.zip (568 bytes)
  Uploading file (568 bytes) ...
Backend event: Put - Completed: duplicati-i5343db5750d74dbbbfdff3952628efde.dindex.zip (568 bytes)
Backend event: Put - Started: duplicati-20180907T143456Z.dlist.zip (801 bytes)
  Uploading file (801 bytes) ...
Backend event: Put - Completed: duplicati-20180907T143456Z.dlist.zip (801 bytes)
Start checking if backups can be removed
Time frames and intervals pairs: Unlimited / Keep all
Backups to consider: 9/7/2018 10:33:33 AM, 9/6/2018 6:12:38 AM, 9/5/2018 7:26:09 PM, 9/5/2018 7:24:20 PM, 9/5/2018 7:20:52 PM, 9/4/2018 10:37:04 AM, 9/1/2018 9:42:47 AM, 9/1/2018 9:41:24 AM, 8/28/2018 2:03:40 PM, 8/28/2018 2:03:15 PM
Backups outside of all time frames and thus getting deleted: 
All backups to delete: 
No remote filesets were deleted
Compacting not required
Backend event: Put - Started: duplicati-verification.json (6.68 KB)
  Uploading file (6.68 KB) ...
Backend event: Put - Completed: duplicati-verification.json (6.68 KB)
Checking remote backup ...
Backend event: List - Started:  ()
  Listing remote folder ...
Backend event: List - Completed:  (36 bytes)
Verifying remote backup ...
Remote backup verification completed
Backend event: Get - Started: duplicati-20180907T143456Z.dlist.zip (801 bytes)
  Downloading file (801 bytes) ...
Backend event: Get - Completed: duplicati-20180907T143456Z.dlist.zip (801 bytes)
Backend event: Get - Started: duplicati-i7351adc9b9904809a8471e8a2dc25d0d.dindex.zip (567 bytes)
  Downloading file (567 bytes) ...
Backend event: Get - Completed: duplicati-i7351adc9b9904809a8471e8a2dc25d0d.dindex.zip (567 bytes)
Backend event: Get - Started: duplicati-b9b1d2200e07a485ebcdc601f79c33702.dblock.zip (512 bytes)
  Downloading file (512 bytes) ...
Backend event: Get - Completed: duplicati-b9b1d2200e07a485ebcdc601f79c33702.dblock.zip (512 bytes)
  0 files need to be examined (0 bytes)
  Duration of backup: 00:00:02
  Remote files: 35
  Remote size: 2.76 MB
  Total remote quota: 231.42 GB
  Available remote quota: 25.38 GB
  Files added: 0
  Files deleted: 0
  Files changed: 1
  Data uploaded: 8.52 KB
  Data downloaded: 1.84 KB
Backup completed successfully!
Return code: 0

Even without the additional information, the idea is that on an otherwise idle Duplicati, the job’s Commandline option should give a more step-by-step-readable display of what’s going on, compared to the regular job log. Several things happen after the backup itself, such as deletion per retention rules, and a sample verification. Seeing things go by in sequence helps narrow the area, so we can know where it hung after doing a backup.

Went away for the weekend. This is all Duplicati had to say:

chimay MacOS $ ./duplicati --log-level Information
Server has started and is listening on 127.0.0.1, port 8200

Does the same thing with “--console-log-level Information”.

That said, it also hasn’t done anything since 9/7/2018. I just restarted it with ‘--console-log-level Information’:

chimay MacOS $ ./duplicati --console-log-level Information
Server has started and is listening on 127.0.0.1, port 8200

In the browser, it’s starting a backup and is saying stuff like:

Counting (36377 files found, 1.09 GB)

After digging further, it looks like ‘duplicati’ is a shell script under OS X, which gathers some information and runs another script called ‘run-with-mono.sh’. The latter script doesn’t appear to pass any command-line arguments to the executable, so that explains it.

I tried modifying ‘run-with-mono.sh’:

--- run-with-mono.sh.orig	2018-09-10 05:49:46.000000000 -0600
+++ run-with-mono.sh	2018-09-10 05:46:21.000000000 -0600
@@ -100,7 +100,7 @@
 	fi
 else
 	# On a modern OSX, so we avoid modifying the bundle contents
-	exec -a "$APP_NAME" "${MONO_BIN}" "$ASSEMBLY" $@
+	exec -a "$APP_NAME" "${MONO_BIN}" "$ASSEMBLY" $@ --log-level Information
 fi

and called it the same way the ‘duplicati’ script calls it:

chimay MacOS $ ASSEMBLY=Duplicati.GUI.TrayIcon.exe APP_NAME=Duplicati SCRIPT_DIR=/Applications/Duplicati.app/Contents/MacOS sh -x ./run-with-mono.sh
+++ dirname ./run-with-mono.sh
++ cd .
++ pwd
+ SCRIPT_DIR=/Applications/Duplicati.app/Contents/MacOS
+ '[' z/Applications/Duplicati.app/Contents/MacOS == z ']'
++ dirname /Applications/Duplicati.app/Contents/MacOS
+ APP_PATH=/Applications/Duplicati.app/Contents
+ export GDIPLUS_NOX=1
+ GDIPLUS_NOX=1
+ REQUIRED_MAJOR=3
+ REQUIRED_MINOR=0
+ VERSION_TITLE='Cannot launch Duplicati'
+ VERSION_MSG='Duplicati requires the Mono Framework version 3.0 or later.'
+ DOWNLOAD_URL=http://www.mono-project.com/download/stable/#download-mac
+ '[' z == z ']'
++ which mono
+ MONO_BIN=/Library/Frameworks/Mono.framework/Versions/Current/Commands/mono
+ '[' '!' -f /Library/Frameworks/Mono.framework/Versions/Current/Commands/mono ']'
+ '[' z/Library/Frameworks/Mono.framework/Versions/Current/Commands/mono == z ']'
++ /Library/Frameworks/Mono.framework/Versions/Current/Commands/mono --version
++ grep 'Mono JIT compiler version '
++ cut -f5 '-d '
+ MONO_VERSION=5.14.0.177
++ echo 5.14.0.177
++ cut -f1 -d.
+ MONO_VERSION_MAJOR=5
++ echo 5.14.0.177
++ cut -f2 -d.
+ MONO_VERSION_MINOR=14
+ '[' -z 5.14.0.177 ']'
+ '[' 5 -lt 3 ']'
+ '[' 5 -eq 3 -a 14 -lt 0 ']'
+ cd /Applications/Duplicati.app/Contents/Resources
++ uname -r
++ cut -f1 -d.
+ OSX_VERSION=17
+ '[' 17 -lt 9 ']'
+ exec -a Duplicati /Library/Frameworks/Mono.framework/Versions/Current/Commands/mono Duplicati.GUI.TrayIcon.exe --log-level Information
Server has started and is listening on 127.0.0.1, port 8200
^C

Does the same thing with ‘--console-log-level Information’.

--log-file-log-level usually uses --log-file to say where to log. --log-file documentation doesn’t give any default:

C:\Program Files\Duplicati 2>duplicati.commandline.exe help log-file
  --log-file (Path): Log internal information to a file
    Logs information to the file specified

C:\Program Files\Duplicati 2>

--console-log-level seems like it’s mostly relevant to the command line case. Not sure if it works with tray icon.

Possibly OSX is different. You could try just saying “duplicati help” or maybe go with my initial log explanation.
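
The OS X equivalent of the Windows command above is presumably something along these lines, assuming the CLI assembly sits in the app bundle’s Resources directory like the GUI one does:

cd /Applications/Duplicati.app/Contents/Resources
mono Duplicati.CommandLine.exe help log-file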

Linux /usr/bin/duplicati passes options:

#!/bin/bash
INSTALLDIR=/usr/lib/duplicati
export LD_LIBRARY_PATH="${INSTALLDIR}${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
export MONO_PATH=$MONO_PATH:${INSTALLDIR}

EXE_FILE=${INSTALLDIR}/Duplicati.GUI.TrayIcon.exe
APP_NAME=Duplicati

exec -a "$APP_NAME" mono "$EXE_FILE" "$@"

I’m generally on Windows, where Duplicati.GUI.TrayIcon.exe launches in the background and doesn’t even have a console window open to write data to, so maybe TrayIcon just isn’t talky. I suggest --log-file or even using the “Commandline” option under “Advanced” on the main menu, which isolates the output for that backup. The example output I showed demonstrating more verbosity at “Information” level was from “Commandline”.
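
For concreteness, on a launcher that forwards its arguments (like the Linux wrapper above) a persistent file log could be requested at startup roughly like this; the path is a placeholder, and as found earlier in the thread the stock OS X wrapper drops the arguments, so there it has to go through the run-with-mono.sh edit or per-job advanced options instead:

duplicati --log-file=/tmp/duplicati-server.log --log-level=Information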

I think I now understand how this is put together. I’m running an Advanced->CommandLine task right now with log-level set to Information. I’m sure it’ll be running long after I’ve left for work. Will report back tonight or tomorrow morning.

Well, I can’t say there’s a lot to say here but I’ll paste the log:

Backup started at 9/10/2018 6:31:26 AM
Checking remote backup …
Listing remote folder …
Scanning local files …
6821 files need to be examined (668.30 MB) (still counting)
Failed to process path: /Users/hpeyerl/duplicati.log => Sharing violation on path /Users/hpeyerl/duplicati.log
6939 files need to be examined (404.00 MB) (still counting)
7224 files need to be examined (86.72 MB) (still counting)
7805 files need to be examined (148.22 MB) (still counting)
8871 files need to be examined (156.39 MB) (still counting)
10060 files need to be examined (176.09 MB) (still counting)
10856 files need to be examined (198.10 MB) (still counting)
11940 files need to be examined (229.83 MB) (still counting)
13293 files need to be examined (246.21 MB) (still counting)
14321 files need to be examined (265.41 MB) (still counting)
15004 files need to be examined (274.18 MB) (still counting)
15759 files need to be examined (263.54 MB) (still counting)
16482 files need to be examined (246.41 MB) (still counting)
---- snip----
210258 files need to be examined (28.87 GB) (still counting)
211000 files need to be examined (28.91 GB) (still counting)
211286 files need to be examined (29.00 GB) (still counting)
211387 files need to be examined (29.03 GB) (still counting)
211872 files need to be examined (29.05 GB) (still counting)
212847 files need to be examined (29.22 GB) (still counting)
213691 files need to be examined (30.87 GB)
211703 files need to be examined (30.84 GB)
209962 files need to be examined (30.82 GB)
208046 files need to be examined (30.80 GB)
206097 files need to be examined (30.78 GB)
204118 files need to be examined (30.75 GB)
—snip—
15723 files need to be examined (4.77 GB)
14055 files need to be examined (3.94 GB)
12068 files need to be examined (3.21 GB)
10284 files need to be examined (2.81 GB)
8370 files need to be examined (2.04 GB)
6805 files need to be examined (1.92 GB)
5260 files need to be examined (1.87 GB)
3003 files need to be examined (1.70 GB)
1052 files need to be examined (1.60 GB)
1051 files need to be examined (156.01 MB)
0 files need to be examined (0 bytes)
Uploading file (571.07 MB) …
Uploading file (1.35 MB) …
Uploading file (38.86 MB) …
Checking remote backup …
Listing remote folder …
Verifying remote backup …
Remote backup verification completed
Downloading file (38.86 MB) …
Downloading file (756.09 KB) …
Downloading file (1,023.99 MB) …

In the browser it says “Finished!” … If I click ‘run now’, it’s back to being hung. If I go to “Advanced” and try to select “Restore”, none of the fields change from the state they were in with the ‘Backup’… If I try to start a restore, it’s hung at “Waiting for task to start” again…

I found the ‘duplicati.log’ file, which also contains nothing of substantial interest. A few errors on files that are inaccessible, but that’s it:

2018-09-10 12:32:28Z - Information: Backend event: List - Started: ()
2018-09-10 12:32:33Z - Information: Backend event: List - Completed: (225 bytes)
2018-09-10 12:32:37Z - Warning: Failed to process path: /Users/hpeyerl/duplicati.log
System.IO.IOException: Sharing violation on path /Users/hpeyerl/duplicati.log
at System.IO.FileStream…ctor (System.String path, System.IO.FileMode mode, System.IO.FileAccess access, System.IO.FileShare share, System.Int32 bufferSize, System.Boolean anonymous, System.IO.FileOptions options) [0x0019e] in <016f22b59bd64951b06e38b9f5a0506c>:0
at System.IO.FileStream…ctor (System.String path, System.IO.FileMode mode, System.IO.FileAccess access, System.IO.FileShare share) [0x00000] in <016f22b59bd64951b06e38b9f5a0506c>:0
at (wrapper remoting-invoke-with-check) System.IO.FileStream…ctor(string,System.IO.FileMode,System.IO.FileAccess,System.IO.FileShare)
at System.IO.File.Open (System.String path, System.IO.FileMode mode, System.IO.FileAccess access, System.IO.FileShare share) [0x00000] in <016f22b59bd64951b06e38b9f5a0506c>:0
at Duplicati.Library.Snapshots.NoSnapshot.OpenRead (System.String file) [0x00000] in <2be220318a714f97a1ac5d13c83b05be>:0
at Duplicati.Library.Main.Operation.BackupHandler.HandleFilesystemEntry (Duplicati.Library.Snapshots.ISnapshotService snapshot, Duplicati.Library.Main.BackendManager backend, System.String path, System.IO.FileAttributes attributes) [0x00433] in :0

2018-09-10 13:21:37Z - Warning: Failed to process metadata for “/Users/hpeyerl/Library/Google/GoogleSoftwareUpdate/TicketStore/”, storing empty metadata
UnixSupport.File+FileAccesException: Unable to access the file “/Users/hpeyerl/Library/Google/GoogleSoftwareUpdate/TicketStore” with method listxattr, error: EACCES (13)
at UnixSupport.File.GetExtendedAttributes (System.String path, System.Boolean isSymlink, System.Boolean followSymlink) [0x00077] in <3fbec333f978484785726c089e2a43ac>:0
at Duplicati.Library.Snapshots.SystemIOLinux.GetMetadata (System.String file, System.Boolean isSymlink, System.Boolean followSymlink) [0x0000d] in <2be220318a714f97a1ac5d13c83b05be>:0
at Duplicati.Library.Snapshots.NoSnapshotLinux.GetMetadata (System.String file, System.Boolean isSymlink, System.Boolean followSymlink) [0x00007] in <2be220318a714f97a1ac5d13c83b05be>:0
at Duplicati.Library.Main.Operation.BackupHandler.GenerateMetadata (Duplicati.Library.Snapshots.ISnapshotService snapshot, System.String path, System.IO.FileAttributes attributes) [0x00027] in :0

2018-09-10 13:21:37Z - Warning: Error reported while accessing file: /Users/hpeyerl/Library/Google/GoogleSoftwareUpdate/TicketStore/
System.UnauthorizedAccessException: Access to the path ‘/Users/hpeyerl/Library/Google/GoogleSoftwareUpdate/TicketStore’ is denied.
at System.IO.__Error.WinIOError (System.Int32 errorCode, System.String maybeFullPath) [0x00129] in <016f22b59bd64951b06e38b9f5a0506c>:0
at System.IO.FileSystemEnumerableIterator1[TSource].HandleError (System.Int32 hr, System.String path) [0x00006] in <016f22b59bd64951b06e38b9f5a0506c>:0 at System.IO.FileSystemEnumerableIterator1[TSource].CommonInit () [0x00054] in <016f22b59bd64951b06e38b9f5a0506c>:0
at System.IO.FileSystemEnumerableIterator1[TSource]..ctor (System.String path, System.String originalUserPath, System.String searchPattern, System.IO.SearchOption searchOption, System.IO.SearchResultHandler1[TSource] resultHandler, System.Boolean checkHost) [0x000d6] in <016f22b59bd64951b06e38b9f5a0506c>:0
at System.IO.FileSystemEnumerableFactory.CreateFileNameIterator (System.String path, System.String originalUserPath, System.String searchPattern, System.Boolean includeFiles, System.Boolean includeDirs, System.IO.SearchOption searchOption, System.Boolean checkHost) [0x00009] in <016f22b59bd64951b06e38b9f5a0506c>:0
at System.IO.Directory.InternalGetFileDirectoryNames (System.String path, System.String userPathOriginal, System.String searchPattern, System.Boolean includeFiles, System.Boolean includeDirs, System.IO.SearchOption searchOption, System.Boolean checkHost) [0x00000] in <016f22b59bd64951b06e38b9f5a0506c>:0
at System.IO.Directory.InternalGetDirectories (System.String path, System.String searchPattern, System.IO.SearchOption searchOption) [0x00000] in <016f22b59bd64951b06e38b9f5a0506c>:0
at System.IO.Directory.GetDirectories (System.String path) [0x0000e] in <016f22b59bd64951b06e38b9f5a0506c>:0
at Duplicati.Library.Snapshots.NoSnapshot.ListFolders (System.String folder) [0x00000] in <2be220318a714f97a1ac5d13c83b05be>:0
at Duplicati.Library.Utility.Utility+d__22.MoveNext () [0x0013c] in <0828ce86ffa94a4bbbb2da4331bcc67b>:0

2018-09-10 13:21:37Z - Warning: Error reported while accessing file: /Users/hpeyerl/Library/Google/GoogleSoftwareUpdate/TicketStore/
System.UnauthorizedAccessException: Access to the path ‘/Users/hpeyerl/Library/Google/GoogleSoftwareUpdate/TicketStore’ is denied.
at System.IO.__Error.WinIOError (System.Int32 errorCode, System.String maybeFullPath) [0x00129] in <016f22b59bd64951b06e38b9f5a0506c>:0
at System.IO.FileSystemEnumerableIterator1[TSource].HandleError (System.Int32 hr, System.String path) [0x00006] in <016f22b59bd64951b06e38b9f5a0506c>:0 at System.IO.FileSystemEnumerableIterator1[TSource].CommonInit () [0x00054] in <016f22b59bd64951b06e38b9f5a0506c>:0
at System.IO.FileSystemEnumerableIterator1[TSource]..ctor (System.String path, System.String originalUserPath, System.String searchPattern, System.IO.SearchOption searchOption, System.IO.SearchResultHandler1[TSource] resultHandler, System.Boolean checkHost) [0x000d6] in <016f22b59bd64951b06e38b9f5a0506c>:0
at System.IO.FileSystemEnumerableFactory.CreateFileNameIterator (System.String path, System.String originalUserPath, System.String searchPattern, System.Boolean includeFiles, System.Boolean includeDirs, System.IO.SearchOption searchOption, System.Boolean checkHost) [0x00009] in <016f22b59bd64951b06e38b9f5a0506c>:0
at System.IO.Directory.InternalGetFileDirectoryNames (System.String path, System.String userPathOriginal, System.String searchPattern, System.Boolean includeFiles, System.Boolean includeDirs, System.IO.SearchOption searchOption, System.Boolean checkHost) [0x00000] in <016f22b59bd64951b06e38b9f5a0506c>:0
at System.IO.Directory.InternalGetFiles (System.String path, System.String searchPattern, System.IO.SearchOption searchOption) [0x00000] in <016f22b59bd64951b06e38b9f5a0506c>:0
at System.IO.Directory.GetFiles (System.String path) [0x0000e] in <016f22b59bd64951b06e38b9f5a0506c>:0
at Duplicati.Library.Snapshots.NoSnapshot.ListFiles (System.String folder) [0x00000] in <2be220318a714f97a1ac5d13c83b05be>:0
at Duplicati.Library.Utility.Utility+d__22.MoveNext () [0x00244] in <0828ce86ffa94a4bbbb2da4331bcc67b>:0

2018-09-10 13:43:43Z - Information: Backend event: Put - Started: duplicati-b039426f3a9ac4c62a2dd3a27c1432406.dblock.zip.aes (571.07 MB)
2018-09-10 13:51:27Z - Information: Backend event: Put - Completed: duplicati-b039426f3a9ac4c62a2dd3a27c1432406.dblock.zip.aes (571.07 MB)
2018-09-10 13:51:27Z - Information: Backend event: Put - Started: duplicati-ifb520288d20b4c968486b9eacde7af90.dindex.zip.aes (1.35 MB)
2018-09-10 13:51:31Z - Information: Backend event: Put - Completed: duplicati-ifb520288d20b4c968486b9eacde7af90.dindex.zip.aes (1.35 MB)
2018-09-10 13:51:31Z - Information: Backend event: Put - Started: duplicati-20180910T123126Z.dlist.zip.aes (38.86 MB)
2018-09-10 13:52:04Z - Information: Backend event: Put - Completed: duplicati-20180910T123126Z.dlist.zip.aes (38.86 MB)
2018-09-10 13:52:04Z - Information: Backend event: List - Started: ()
2018-09-10 13:52:13Z - Information: Backend event: List - Completed: (228 bytes)
2018-09-10 13:52:13Z - Information: Backend event: Get - Started: duplicati-20180910T123126Z.dlist.zip.aes (38.86 MB)
2018-09-10 13:52:22Z - Information: Backend event: Get - Completed: duplicati-20180910T123126Z.dlist.zip.aes (38.86 MB)
2018-09-10 13:52:22Z - Information: Backend event: Get - Started: duplicati-iee50dc71c48a4d6c9fe44e4eeb9389e3.dindex.zip.aes (756.09 KB)
2018-09-10 13:52:22Z - Information: Backend event: Get - Completed: duplicati-iee50dc71c48a4d6c9fe44e4eeb9389e3.dindex.zip.aes (756.09 KB)
2018-09-10 13:52:22Z - Information: Backend event: Get - Started: duplicati-baa6c348118484f718a9d22ecf81c9237.dblock.zip.aes (1,023.99 MB)
2018-09-10 13:53:29Z - Information: Backend event: Get - Completed: duplicati-baa6c348118484f718a9d22ecf81c9237.dblock.zip.aes (1,023.99 MB)

So are we officially out of ideas then?

Is that the end of duplicati.log or is there more? Normally it should continue. To give examples I did some tests on a small backup done using the web UI Commandline option, then copied log lines from a Profiling-level log.

2.0.3.11 canary “backup”. SQL looks like it’s noting what was verified, and deleting old logs. Do you get SQL?

2018-09-13 08:08:41 -04 - [Information-Duplicati.Library.Main.Database.LocalDeleteDatabase-CompactReason]: Compacting not required
2018-09-13 08:08:41 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Operation.BackupHandler-CommitFinalizingBackup]: Starting - CommitFinalizingBackup
2018-09-13 08:08:41 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Operation.BackupHandler-CommitFinalizingBackup]: CommitFinalizingBackup took 0:00:00:00.000
2018-09-13 08:08:41 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Operation.BackupHandler-AfterBackupVerify]: Starting - AfterBackupVerify
... (various Get and SQL work)
2018-09-13 08:08:42 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b91c4c456f4c748669615bb6c3ea121ba.dblock.zip (700 bytes)
2018-09-13 08:08:42 -04 - [Profiling-Duplicati.Library.Main.BackendManager-DownloadSpeed]: Downloaded 700 bytes in 00:00:00, -8,388,608.00 TB/s
2018-09-13 08:08:42 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-b91c4c456f4c748669615bb6c3ea121ba.dblock.zip (700 bytes)
2018-09-13 08:08:42 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.BackendManager-RemoteOperationGet]: RemoteOperationGet took 0:00:00:00.000
2018-09-13 08:08:42 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: UPDATE "RemoteVolume" SET "VerificationCount" = MAX(1, CASE WHEN "VerificationCount" <= 0 THEN (SELECT MAX("VerificationCount") FROM "RemoteVolume") ELSE "VerificationCount" + 1 END) WHERE "Name" = "duplicati-b91c4c456f4c748669615bb6c3ea121ba.dblock.zip"
2018-09-13 08:08:43 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: UPDATE "RemoteVolume" SET "VerificationCount" = MAX(1, CASE WHEN "VerificationCount" <= 0 THEN (SELECT MAX("VerificationCount") FROM "RemoteVolume") ELSE "VerificationCount" + 1 END) WHERE "Name" = "duplicati-b91c4c456f4c748669615bb6c3ea121ba.dblock.zip" took 0:00:00:00.131
2018-09-13 08:08:43 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: DELETE FROM "LogData" WHERE "Timestamp" < 1534248523
2018-09-13 08:08:43 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: DELETE FROM "LogData" WHERE "Timestamp" < 1534248523 took 0:00:00:00.000
2018-09-13 08:08:43 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: DELETE FROM "RemoteOperation" WHERE "Timestamp" < 1534248523
2018-09-13 08:08:43 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: DELETE FROM "RemoteOperation" WHERE "Timestamp" < 1534248523 took 0:00:00:00.000
2018-09-13 08:08:43 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Controller-RunBackup]: Running Backup took 0:00:00:03.609
2018-09-13 08:08:43 -04 - [Information-Duplicati.Library.Main.Controller-CompletedOperation]: The operation Backup has completed

2.0.3.11 canary “backup” with --backup-test-samples=0 looks like it suppresses Get to look inside the files, but still does a quick List:

2018-09-13 11:36:30 -04 - [Information-Duplicati.Library.Main.Database.LocalDeleteDatabase-CompactReason]: Compacting not required
2018-09-13 11:36:30 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Operation.BackupHandler-CommitFinalizingBackup]: Starting - CommitFinalizingBackup
2018-09-13 11:36:30 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Operation.BackupHandler-CommitFinalizingBackup]: CommitFinalizingBackup took 0:00:00:00.000
2018-09-13 11:36:30 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Operation.BackupHandler-AfterBackupVerify]: Starting - AfterBackupVerify
2018-09-13 11:36:30 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.BackendManager-RemoteOperationList]: Starting - RemoteOperationList
2018-09-13 11:36:30 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  ()
2018-09-13 11:36:30 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (10 bytes)
2018-09-13 11:36:30 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.BackendManager-RemoteOperationList]: RemoteOperationList took 0:00:00:00.000
2018-09-13 11:36:30 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: Starting - ExecuteReader: SELECT DISTINCT "Name", "State" FROM "Remotevolume" WHERE "Name" IN (SELECT "Name" FROM "Remotevolume" WHERE "State" IN ("Deleted", "Deleting")) AND NOT "State" IN ("Deleted", "Deleting")
2018-09-13 11:36:30 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: ExecuteReader: SELECT DISTINCT "Name", "State" FROM "Remotevolume" WHERE "Name" IN (SELECT "Name" FROM "Remotevolume" WHERE "State" IN ("Deleted", "Deleting")) AND NOT "State" IN ("Deleted", "Deleting") took 0:00:00:00.000
2018-09-13 11:36:30 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Operation.BackupHandler-AfterBackupVerify]: AfterBackupVerify took 0:00:00:00.600
2018-09-13 11:36:30 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.BackendManager-RemoteOperationTerminate]: Starting - RemoteOperationTerminate
2018-09-13 11:36:30 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.BackendManager-RemoteOperationTerminate]: RemoteOperationTerminate took 0:00:00:00.000
2018-09-13 11:36:30 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: DELETE FROM "LogData" WHERE "Timestamp" < 1534260990
2018-09-13 11:36:30 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: DELETE FROM "LogData" WHERE "Timestamp" < 1534260990 took 0:00:00:00.000
2018-09-13 11:36:30 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: DELETE FROM "RemoteOperation" WHERE "Timestamp" < 1534260990
2018-09-13 11:36:30 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: DELETE FROM "RemoteOperation" WHERE "Timestamp" < 1534260990 took 0:00:00:00.000
2018-09-13 11:36:30 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Controller-RunBackup]: Running Backup took 0:00:00:01.845
2018-09-13 11:36:30 -04 - [Information-Duplicati.Library.Main.Controller-CompletedOperation]: The operation Backup has completed

2.0.3.11 canary “backup” with --no-backend-verification goes further, removing the List of the backend files:

2018-09-13 12:05:53 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-b22058baae7994a50b3a21393c2dde132.dblock.zip (512 bytes)
2018-09-13 12:05:53 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.BackendManager-RemoteOperationGet]: RemoteOperationGet took 0:00:00:00.013
2018-09-13 12:05:53 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: DROP TABLE IF EXISTS "Blocklist-0D8D90EED6F72043A1EEE99760097F1C"
2018-09-13 12:05:53 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: DROP TABLE IF EXISTS "Blocklist-0D8D90EED6F72043A1EEE99760097F1C" took 0:00:00:00.000
2018-09-13 12:05:53 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: CREATE TEMPORARY TABLE "Indexlist-53732DC907B4FC44A3C52300152D840B" ("Name" TEXT NOT NULL, "Hash" TEXT NOT NULL, "Size" INTEGER NOT NULL)
2018-09-13 12:05:53 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: CREATE TEMPORARY TABLE "Indexlist-53732DC907B4FC44A3C52300152D840B" ("Name" TEXT NOT NULL, "Hash" TEXT NOT NULL, "Size" INTEGER NOT NULL) took 0:00:00:00.000
2018-09-13 12:05:53 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: CREATE TEMPORARY TABLE "CmpTable-28A2819C834CD4469486A8893CC6C5A6" AS SELECT "A"."Name", "A"."Hash", "A"."Size" FROM "Remotevolume" A, "Remotevolume" B, "IndexBlockLink" WHERE "B"."Name" = "duplicati-iece85d0092a34c7b8d570d412109cfe2.dindex.zip" AND "A"."ID" = "IndexBlockLink"."BlockVolumeID" AND "B"."ID" = "IndexBlockLink"."IndexVolumeID" 
2018-09-13 12:05:53 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: CREATE TEMPORARY TABLE "CmpTable-28A2819C834CD4469486A8893CC6C5A6" AS SELECT "A"."Name", "A"."Hash", "A"."Size" FROM "Remotevolume" A, "Remotevolume" B, "IndexBlockLink" WHERE "B"."Name" = "duplicati-iece85d0092a34c7b8d570d412109cfe2.dindex.zip" AND "A"."ID" = "IndexBlockLink"."BlockVolumeID" AND "B"."ID" = "IndexBlockLink"."IndexVolumeID"  took 0:00:00:00.000
2018-09-13 12:05:53 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: Starting - ExecuteReader: SELECT 1 AS "Type", "Indexlist-53732DC907B4FC44A3C52300152D840B"."Name" AS "Name" FROM "Indexlist-53732DC907B4FC44A3C52300152D840B" WHERE "Indexlist-53732DC907B4FC44A3C52300152D840B"."Name" NOT IN ( SELECT "Name" FROM "CmpTable-28A2819C834CD4469486A8893CC6C5A6" ) UNION SELECT 0 AS "Type", "Name" AS "Name" FROM "CmpTable-28A2819C834CD4469486A8893CC6C5A6" WHERE "Name" NOT IN (SELECT "Name" FROM "Indexlist-53732DC907B4FC44A3C52300152D840B") UNION SELECT 2 AS "Type", "E"."Name" AS "Name" FROM "Indexlist-53732DC907B4FC44A3C52300152D840B" E, "CmpTable-28A2819C834CD4469486A8893CC6C5A6" D WHERE "D"."Name" = "E"."Name" AND ("D"."Hash" != "E"."Hash" OR "D"."Size" != "E"."Size") 
2018-09-13 12:05:53 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: ExecuteReader: SELECT 1 AS "Type", "Indexlist-53732DC907B4FC44A3C52300152D840B"."Name" AS "Name" FROM "Indexlist-53732DC907B4FC44A3C52300152D840B" WHERE "Indexlist-53732DC907B4FC44A3C52300152D840B"."Name" NOT IN ( SELECT "Name" FROM "CmpTable-28A2819C834CD4469486A8893CC6C5A6" ) UNION SELECT 0 AS "Type", "Name" AS "Name" FROM "CmpTable-28A2819C834CD4469486A8893CC6C5A6" WHERE "Name" NOT IN (SELECT "Name" FROM "Indexlist-53732DC907B4FC44A3C52300152D840B") UNION SELECT 2 AS "Type", "E"."Name" AS "Name" FROM "Indexlist-53732DC907B4FC44A3C52300152D840B" E, "CmpTable-28A2819C834CD4469486A8893CC6C5A6" D WHERE "D"."Name" = "E"."Name" AND ("D"."Hash" != "E"."Hash" OR "D"."Size" != "E"."Size")  took 0:00:00:00.000
2018-09-13 12:05:53 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: DROP TABLE IF EXISTS "CmpTable-28A2819C834CD4469486A8893CC6C5A6" 
2018-09-13 12:05:53 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: DROP TABLE IF EXISTS "CmpTable-28A2819C834CD4469486A8893CC6C5A6"  took 0:00:00:00.000
2018-09-13 12:05:53 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: DROP TABLE IF EXISTS "Indexlist-53732DC907B4FC44A3C52300152D840B"
2018-09-13 12:05:53 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: DROP TABLE IF EXISTS "Indexlist-53732DC907B4FC44A3C52300152D840B" took 0:00:00:00.000
2018-09-13 12:05:53 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: UPDATE "RemoteVolume" SET "VerificationCount" = MAX(1, CASE WHEN "VerificationCount" <= 0 THEN (SELECT MAX("VerificationCount") FROM "RemoteVolume") ELSE "VerificationCount" + 1 END) WHERE "Name" = "duplicati-iece85d0092a34c7b8d570d412109cfe2.dindex.zip"
2018-09-13 12:05:53 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: UPDATE "RemoteVolume" SET "VerificationCount" = MAX(1, CASE WHEN "VerificationCount" <= 0 THEN (SELECT MAX("VerificationCount") FROM "RemoteVolume") ELSE "VerificationCount" + 1 END) WHERE "Name" = "duplicati-iece85d0092a34c7b8d570d412109cfe2.dindex.zip" took 0:00:00:00.134
2018-09-13 12:05:53 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: CREATE TEMPORARY TABLE "Blocklist-79EC607C1E73F745B3997C31858B03D9" ("Hash" TEXT NOT NULL, "Size" INTEGER NOT NULL)
2018-09-13 12:05:53 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: CREATE TEMPORARY TABLE "Blocklist-79EC607C1E73F745B3997C31858B03D9" ("Hash" TEXT NOT NULL, "Size" INTEGER NOT NULL) took 0:00:00:00.000
2018-09-13 12:05:53 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: CREATE TEMPORARY TABLE "CmpTable-6E292AE78573384E908EB84918CF1CC3" AS SELECT DISTINCT "Hash" AS "Hash", "Size" AS "Size" FROM (SELECT "Block"."Hash" AS "Hash", "Block"."Size" AS "Size" FROM "Remotevolume", "Block" WHERE "Remotevolume"."Name" = "duplicati-b22058baae7994a50b3a21393c2dde132.dblock.zip" AND "Remotevolume"."ID" = "Block"."VolumeID" UNION SELECT "DeletedBlock"."Hash" AS "Hash", "DeletedBlock"."Size" AS "Size" FROM "DeletedBlock", "RemoteVolume" WHERE "RemoteVolume"."Name" = "duplicati-b22058baae7994a50b3a21393c2dde132.dblock.zip" AND "RemoteVolume"."ID" = "DeletedBlock"."VolumeID")
2018-09-13 12:05:53 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: CREATE TEMPORARY TABLE "CmpTable-6E292AE78573384E908EB84918CF1CC3" AS SELECT DISTINCT "Hash" AS "Hash", "Size" AS "Size" FROM (SELECT "Block"."Hash" AS "Hash", "Block"."Size" AS "Size" FROM "Remotevolume", "Block" WHERE "Remotevolume"."Name" = "duplicati-b22058baae7994a50b3a21393c2dde132.dblock.zip" AND "Remotevolume"."ID" = "Block"."VolumeID" UNION SELECT "DeletedBlock"."Hash" AS "Hash", "DeletedBlock"."Size" AS "Size" FROM "DeletedBlock", "RemoteVolume" WHERE "RemoteVolume"."Name" = "duplicati-b22058baae7994a50b3a21393c2dde132.dblock.zip" AND "RemoteVolume"."ID" = "DeletedBlock"."VolumeID") took 0:00:00:00.000
2018-09-13 12:05:53 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: Starting - ExecuteReader: SELECT 1 AS "Type", "Blocklist-79EC607C1E73F745B3997C31858B03D9"."Hash" AS "Hash" FROM "Blocklist-79EC607C1E73F745B3997C31858B03D9" WHERE "Blocklist-79EC607C1E73F745B3997C31858B03D9"."Hash" NOT IN ( SELECT "Hash" FROM "CmpTable-6E292AE78573384E908EB84918CF1CC3" ) UNION SELECT 0 AS "Type", "Hash" AS "Hash" FROM "CmpTable-6E292AE78573384E908EB84918CF1CC3" WHERE "Hash" NOT IN (SELECT "Hash" FROM "Blocklist-79EC607C1E73F745B3997C31858B03D9") UNION SELECT 2 AS "Type", "E"."Hash" AS "Hash" FROM "Blocklist-79EC607C1E73F745B3997C31858B03D9" E, "CmpTable-6E292AE78573384E908EB84918CF1CC3" D WHERE "D"."Hash" = "E"."Hash" AND "D"."Size" != "E"."Size"  
2018-09-13 12:05:53 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: ExecuteReader: SELECT 1 AS "Type", "Blocklist-79EC607C1E73F745B3997C31858B03D9"."Hash" AS "Hash" FROM "Blocklist-79EC607C1E73F745B3997C31858B03D9" WHERE "Blocklist-79EC607C1E73F745B3997C31858B03D9"."Hash" NOT IN ( SELECT "Hash" FROM "CmpTable-6E292AE78573384E908EB84918CF1CC3" ) UNION SELECT 0 AS "Type", "Hash" AS "Hash" FROM "CmpTable-6E292AE78573384E908EB84918CF1CC3" WHERE "Hash" NOT IN (SELECT "Hash" FROM "Blocklist-79EC607C1E73F745B3997C31858B03D9") UNION SELECT 2 AS "Type", "E"."Hash" AS "Hash" FROM "Blocklist-79EC607C1E73F745B3997C31858B03D9" E, "CmpTable-6E292AE78573384E908EB84918CF1CC3" D WHERE "D"."Hash" = "E"."Hash" AND "D"."Size" != "E"."Size"   took 0:00:00:00.000
2018-09-13 12:05:53 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: DROP TABLE IF EXISTS "CmpTable-6E292AE78573384E908EB84918CF1CC3" 
2018-09-13 12:05:53 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: DROP TABLE IF EXISTS "CmpTable-6E292AE78573384E908EB84918CF1CC3"  took 0:00:00:00.000
2018-09-13 12:05:53 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: DROP TABLE IF EXISTS "Blocklist-79EC607C1E73F745B3997C31858B03D9"
2018-09-13 12:05:53 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: DROP TABLE IF EXISTS "Blocklist-79EC607C1E73F745B3997C31858B03D9" took 0:00:00:00.000
2018-09-13 12:05:53 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: UPDATE "RemoteVolume" SET "VerificationCount" = MAX(1, CASE WHEN "VerificationCount" <= 0 THEN (SELECT MAX("VerificationCount") FROM "RemoteVolume") ELSE "VerificationCount" + 1 END) WHERE "Name" = "duplicati-b22058baae7994a50b3a21393c2dde132.dblock.zip"
2018-09-13 12:05:53 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: UPDATE "RemoteVolume" SET "VerificationCount" = MAX(1, CASE WHEN "VerificationCount" <= 0 THEN (SELECT MAX("VerificationCount") FROM "RemoteVolume") ELSE "VerificationCount" + 1 END) WHERE "Name" = "duplicati-b22058baae7994a50b3a21393c2dde132.dblock.zip" took 0:00:00:00.123
2018-09-13 12:05:54 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Controller-RunTest]: Running Test took 0:00:00:01.542
2018-09-13 12:05:54 -04 - [Information-Duplicati.Library.Main.Controller-CompletedOperation]: The operation Test has completed

These said [Information-Duplicati.Library.Main.Controller-CompletedOperation]. Did yours?

We’re closing in on the point of the possible hang, but we’re not there yet. If we get close enough, code-level work might be possible; however, we’d probably have to call in someone else. The coders seem to be more at the https://github.com/duplicati/duplicati/issues site, whereas the forum is more for users to try to help each other.

Do you have any other backups? If so, how are they going? If not, would you be willing to set up a small one with files that don’t generate complaints, and see if it does backups and restores from a fresh Duplicati start without the troubled job running (I’m not sure how easy or difficult it will be to keep the other from “finishing”).

Things done via the web UI can definitely interact with each other in terms of scheduling. Some might wait… MENU --> About --> Server state properties can help see these, and that view is live-updated with changes.

activeTask being empty would probably indicate it’s ready for the next. A non-blank is now hard to decipher.
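
(Purely as an aside, the web UI itself polls a server-state endpoint, so something like the line below may show the same activeTask / proposedSchedule fields from a terminal; the exact path and whatever XSRF-token handling the server wants are assumptions here, not verified.)

curl -s 'http://127.0.0.1:8200/api/v1/serverstate'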

That was the entire contents of ‘duplicati.log’.

That backup was the only Duplicati backup I’ve been working on. I’m only experimenting at this point so the data is not vital. I’m still running Crashplan in parallel and I’ve got some time to get a reliable alternative before I have to switch away from Crashplan.

This morning was a bit of an adventure trying to delete/eradicate the really big backup we’ve been working through in this thread. Something is up with Google Drive. If I dump the Duplicati backup folder into the Google Drive Trash bin and then empty the trash, it appears to be ‘gone’, but when I attempt to back up to that folder, Duplicati thinks all of the files are still there. I’ve tried blowing away $HOME/.config/Duplicati/JASRAADOTX.sqlite (the local cache DB) but Duplicati still sees the remote files. So each time I try to create a fresh backup set, I have to get Duplicati to create a new remote folder (test, test2, test3, etc)… Google Drive also reports that the data has not been freed up, so perhaps there’s some lag between what’s reported in their web interface and what’s actually happening behind the scenes.
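
(As an aside, one way to see exactly what Duplicati thinks is on the remote, independent of the Drive web UI, would be the bundled backend tool, assuming this package ships Duplicati.CommandLine.BackendTool.exe; the URL and authid below are placeholders.)

cd /Applications/Duplicati.app/Contents/Resources
mono Duplicati.CommandLine.BackendTool.exe list "googledrive://test?authid=PLACEHOLDER"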

Anyway, working around that, I was able to create a new backup that was much smaller (26M). It ran for a few minutes.

The backup I created was with “debug-output” set, “log-file” set to “duplicati.log”, and “log-level” set to “Information”. After the browser said “Finished!”, I gathered the commandline output and log-file output. It appears that they’re both incomplete and the interface is hung again.

I ran it in “commandline” and this is the browser output:

Running commandline entry
Running … stop now
Backup started at 9/14/2018 6:56:45 AM
Checking remote backup …
Listing remote folder …
Scanning local files …
228 files need to be examined (25.56 MB)
0 files need to be examined (0 bytes)
Uploading file (20.71 MB) …
Uploading file (29.56 KB) …
Uploading file (13.11 KB) …
No remote filesets were deleted
Checking remote backup …
Listing remote folder …
Verifying remote backup …
Remote backup verification completed
Downloading file (13.11 KB) …
Downloading file (29.56 KB) …
Downloading file (20.71 MB) …

…and this is “duplicati.log” (which ends up in the OS X application bundle):

chimay Resources $ pwd
/Applications/Duplicati.app/Contents/Resources
chimay Resources $ cat duplicati.log
2018-09-14 12:56:46Z - Information: Backend event: List - Started: ()
2018-09-14 12:56:47Z - Information: Backend event: List - Completed: ()
2018-09-14 12:56:52Z - Information: Backend event: Put - Started: duplicati-b5b228fae5a2c4825ad02bb478f23b633.dblock.zip.aes (20.71 MB)
2018-09-14 12:57:05Z - Information: Backend event: Put - Completed: duplicati-b5b228fae5a2c4825ad02bb478f23b633.dblock.zip.aes (20.71 MB)
2018-09-14 12:57:05Z - Information: Backend event: Put - Started: duplicati-ie8fcc4e18498428a9f33aee982d7f330.dindex.zip.aes (29.56 KB)
2018-09-14 12:57:09Z - Information: Backend event: Put - Completed: duplicati-ie8fcc4e18498428a9f33aee982d7f330.dindex.zip.aes (29.56 KB)
2018-09-14 12:57:09Z - Information: Backend event: Put - Started: duplicati-20180914T125645Z.dlist.zip.aes (13.11 KB)
2018-09-14 12:57:10Z - Information: Backend event: Put - Completed: duplicati-20180914T125645Z.dlist.zip.aes (13.11 KB)
2018-09-14 12:57:10Z - Information: No remote filesets were deleted
2018-09-14 12:57:10Z - Information: Backend event: List - Started: ()
2018-09-14 12:57:12Z - Information: Backend event: List - Completed: (3 bytes)
2018-09-14 12:57:12Z - Information: Backend event: Get - Started: duplicati-20180914T125645Z.dlist.zip.aes (13.11 KB)
2018-09-14 12:57:14Z - Information: Backend event: Get - Completed: duplicati-20180914T125645Z.dlist.zip.aes (13.11 KB)
2018-09-14 12:57:14Z - Information: Backend event: Get - Started: duplicati-ie8fcc4e18498428a9f33aee982d7f330.dindex.zip.aes (29.56 KB)
2018-09-14 12:57:14Z - Information: Backend event: Get - Completed: duplicati-ie8fcc4e18498428a9f33aee982d7f330.dindex.zip.aes (29.56 KB)
2018-09-14 12:57:14Z - Information: Backend event: Get - Started: duplicati-b5b228fae5a2c4825ad02bb478f23b633.dblock.zip.aes (20.71 MB)
2018-09-14 12:57:17Z - Information: Backend event: Get - Completed: duplicati-b5b228fae5a2c4825ad02bb478f23b633.dblock.zip.aes (20.71 MB)
chimay Resources $

Further to last, I re-ran the above backup (after restarting Duplicati) with log-level set to “Profiling”. Since it was a re-run, there were no files to upload, but here is the output:

Backup started at 9/14/2018 7:12:29 AM
Checking remote backup …
Listing remote folder …
Scanning local files …
228 files need to be examined (25.56 MB)
0 files need to be examined (0 bytes)
removing temp files, as no data needs to be uploaded
No remote filesets were deleted
Checking remote backup …
Listing remote folder …
removing file listed as Temporary: duplicati-b4245674cf28045b08fa5249667f345a3.dblock.zip.aes
removing file listed as Temporary: duplicati-ida31effd15314ccf909404affb277488.dindex.zip.aes
Verifying remote backup …
Remote backup verification completed
Downloading file (13.11 KB) …
Downloading file (29.56 KB) …
Downloading file (20.71 MB) …

and the duplicati.log is substantially larger, with a bunch of SQL and so forth (probably with private info in it), so I’m loath to post it. I can upload it somewhere for you if the details are of interest.

It ends like this:

2018-09-14 13:12:36Z - Profiling: Starting - ExecuteReader: SELECT “ID”, “Name”, “Size”, “Hash”, “VerificationCount” FROM “Remotevolume” WHERE “Type” = ? AND “State” IN (?, ?)
2018-09-14 13:12:36Z - Profiling: ExecuteReader: SELECT “ID”, “Name”, “Size”, “Hash”, “VerificationCount” FROM “Remotevolume” WHERE “Type” = ? AND “State” IN (?, ?) took 00:00:00.000
2018-09-14 13:12:36Z - Profiling: Starting - ExecuteReader: SELECT “ID”, “Name”, “Size”, “Hash”, “VerificationCount” FROM “Remotevolume” WHERE “Type” = ? AND “State” IN (?, ?)
2018-09-14 13:12:36Z - Profiling: ExecuteReader: SELECT “ID”, “Name”, “Size”, “Hash”, “VerificationCount” FROM “Remotevolume” WHERE “Type” = ? AND “State” IN (?, ?) took 00:00:00.000
2018-09-14 13:12:36Z - Profiling: Starting - RemoteOperationGet
2018-09-14 13:12:36Z - Information: Backend event: Get - Started: duplicati-20180914T125645Z.dlist.zip.aes (13.11 KB)
2018-09-14 13:12:38Z - Profiling: Downloaded 13.11 KB in 00:00:01.6495620, 7.95 KB/s
2018-09-14 13:12:38Z - Information: Backend event: Get - Completed: duplicati-20180914T125645Z.dlist.zip.aes (13.11 KB)
2018-09-14 13:12:38Z - Profiling: RemoteOperationGet took 00:00:01.654
2018-09-14 13:12:38Z - Profiling: Starting - ExecuteNonQuery: UPDATE “RemoteVolume” SET “VerificationCount” = MAX(1, CASE WHEN “VerificationCount” <= 0 THEN (SELECT MAX(“VerificationCount”) FROM “RemoteVolume”) ELSE “VerificationCount” + 1 END) WHERE “Name” = ?
2018-09-14 13:12:38Z - Profiling: ExecuteNonQuery: UPDATE “RemoteVolume” SET “VerificationCount” = MAX(1, CASE WHEN “VerificationCount” <= 0 THEN (SELECT MAX(“VerificationCount”) FROM “RemoteVolume”) ELSE “VerificationCount” + 1 END) WHERE “Name” = ? took 00:00:00.001
2018-09-14 13:12:38Z - Profiling: Starting - RemoteOperationGet
2018-09-14 13:12:38Z - Information: Backend event: Get - Started: duplicati-ie8fcc4e18498428a9f33aee982d7f330.dindex.zip.aes (29.56 KB)
2018-09-14 13:12:38Z - Profiling: Downloaded 29.56 KB in 00:00:00.3557830, 83.08 KB/s
2018-09-14 13:12:38Z - Information: Backend event: Get - Completed: duplicati-ie8fcc4e18498428a9f33aee982d7f330.dindex.zip.aes (29.56 KB)
2018-09-14 13:12:38Z - Profiling: RemoteOperationGet took 00:00:00.356
2018-09-14 13:12:38Z - Profiling: Starting - ExecuteNonQuery: UPDATE “RemoteVolume” SET “VerificationCount” = MAX(1, CASE WHEN “VerificationCount” <= 0 THEN (SELECT MAX(“VerificationCount”) FROM “RemoteVolume”) ELSE “VerificationCount” + 1 END) WHERE “Name” = ?
2018-09-14 13:12:38Z - Profiling: ExecuteNonQuery: UPDATE “RemoteVolume” SET “VerificationCount” = MAX(1, CASE WHEN “VerificationCount” <= 0 THEN (SELECT MAX(“VerificationCount”) FROM “RemoteVolume”) ELSE “VerificationCount” + 1 END) WHERE “Name” = ? took 00:00:00.000
2018-09-14 13:12:38Z - Profiling: Starting - RemoteOperationGet
2018-09-14 13:12:38Z - Information: Backend event: Get - Started: duplicati-b5b228fae5a2c4825ad02bb478f23b633.dblock.zip.aes (20.71 MB)
2018-09-14 13:12:41Z - Profiling: Downloaded 20.71 MB in 00:00:03.0153380, 6.87 MB/s
2018-09-14 13:12:41Z - Information: Backend event: Get - Completed: duplicati-b5b228fae5a2c4825ad02bb478f23b633.dblock.zip.aes (20.71 MB)
2018-09-14 13:12:41Z - Profiling: RemoteOperationGet took 00:00:03.020
2018-09-14 13:12:41Z - Profiling: Starting - ExecuteNonQuery: UPDATE “RemoteVolume” SET “VerificationCount” = MAX(1, CASE WHEN “VerificationCount” <= 0 THEN (SELECT MAX(“VerificationCount”) FROM “RemoteVolume”) ELSE “VerificationCount” + 1 END) WHERE “Name” = ?
2018-09-14 13:12:41Z - Profiling: ExecuteNonQuery: UPDATE “RemoteVolume” SET “VerificationCount” = MAX(1, CASE WHEN “VerificationCount” <= 0 THEN (SELECT MAX(“VerificationCount”) FROM “RemoteVolume”) ELSE “VerificationCount” + 1 END) WHERE “Name” = ? took 00:00:00.000
2018-09-14 13:12:41Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM “LogData” WHERE “Timestamp” < ?
2018-09-14 13:12:41Z - Profiling: ExecuteNonQuery: DELETE FROM “LogData” WHERE “Timestamp” < ? took 00:00:00.000
2018-09-14 13:12:41Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM “RemoteOperation” WHERE “Timestamp” < ?
2018-09-14 13:12:41Z - Profiling: ExecuteNonQuery: DELETE FROM “RemoteOperation” WHERE “Timestamp” < ? took 00:00:00.000
2018-09-14 13:12:41Z - Profiling: Running Backup took 00:00:12.037

The interface is hung again, as in, if I try to do anything, including running the backup again, it just says “Waiting for task to start …” and no new information is appended to “duplicati.log”.

To get a better comparison, my output from 2.0.3.3 looks similar:

2018-09-13 14:08:41Z - Profiling: Starting - ExecuteNonQuery: UPDATE “RemoteVolume” SET “VerificationCount” = MAX(1, CASE WHEN “VerificationCount” <= 0 THEN (SELECT MAX(“VerificationCount”) FROM “RemoteVolume”) ELSE “VerificationCount” + 1 END) WHERE “Name” = ?
2018-09-13 14:08:41Z - Profiling: ExecuteNonQuery: UPDATE “RemoteVolume” SET “VerificationCount” = MAX(1, CASE WHEN “VerificationCount” <= 0 THEN (SELECT MAX(“VerificationCount”) FROM “RemoteVolume”) ELSE “VerificationCount” + 1 END) WHERE “Name” = ? took 00:00:00.023
2018-09-13 14:08:42Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM “LogData” WHERE “Timestamp” < ?
2018-09-13 14:08:42Z - Profiling: ExecuteNonQuery: DELETE FROM “LogData” WHERE “Timestamp” < ? took 00:00:00.000
2018-09-13 14:08:42Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM “RemoteOperation” WHERE “Timestamp” < ?
2018-09-13 14:08:42Z - Profiling: ExecuteNonQuery: DELETE FROM “RemoteOperation” WHERE “Timestamp” < ? took 00:00:00.000
2018-09-13 14:08:42Z - Profiling: Running Backup took 00:00:01.296

which (to me) might mean that your job completed but perhaps the scheduler (part of the server) is confused.
What does Menu --> About --> System info --> Server state properties show? (EDIT: corrected the menu sequence.)
I’d expect a normal result to show activeTask blank and maybe proposedSchedule only showing any planned future job runs (if you have any). The format is cryptic and internal, but you can turn the quoted number into a job name by substituting it for the similar small number in a URL from a UI job operation, such as “Show log”.

(for others in the future who might stumble on this thread, Menu->About->System Info->Server state).

I’m out of town for the weekend so I’m doing this via remote desktop (cringe) but here’s what Server state properties has to say.

Server state properties

  • lastEventId : 309
  • lastDataUpdateId : 2
  • lastNotificationUpdateId : 0
  • estimatedPauseEnd : 0001-01-01T00:00:00
  • activeTask : {“Item1”:3,“Item2”:null}
  • programState : Running
  • lastErrorMessage :
  • connectionState : connected
  • xsfrerror : false
  • connectionAttemptTimer : 0
  • failedConnectionAttempts : 0
  • lastPgEvent : {“BackupID”:null,“TaskID”:3,“BackendAction”:“Get”,“BackendPath”:“duplicati-b5b228fae5a2c4825ad02bb478f23b633.dblock.zip.aes”,“BackendFileSize”:21721229,“BackendFileProgress”:21721229,“BackendSpeed”:242,“BackendIsBlocking”:false,“CurrentFilename”:null,“CurrentFilesize”:0,“CurrentFileoffset”:0,“Phase”:“Backup_Complete”,“OverallProgress”:1,“ProcessedFileCount”:228,“ProcessedFileSize”:26802068,“TotalFileCount”:228,“TotalFileSize”:26802068,“StillCounting”:false}
  • updaterState : Waiting
  • updatedVersion :
  • updateReady : false
  • updateDownloadProgress : 0
  • proposedSchedule : [{“Item1”:“1”,“Item2”:“2018-09-16T07:00:00Z”}]
  • schedulerQueueIds : [{“Item1”:4,“Item2”:“1”},{“Item1”:6,“Item2”:null}]
  • pauseTimeRemain : 0