Duplicati hangs after backup attempt (OSX) Was: Crashplan Refugee can't get Duplicati to restore

--log-file-log-level usually works together with --log-file, which says where to log. The --log-file documentation doesn't give any default:

C:\Program Files\Duplicati 2>duplicati.commandline.exe help log-file
  --log-file (Path): Log internal information to a file
    Logs information to the file specified

C:\Program Files\Duplicati 2>

--console-log-level seems mostly relevant to the command-line case. I'm not sure whether it works with the tray icon.

Possibly OSX is different. You could try just running "duplicati help", or maybe go with my initial log explanation.
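As a hedged sketch of what I mean (the install path below is an assumption for a typical macOS install and may differ on your machine), you can start the tray icon from a terminal with file logging turned on, then watch the log:

```shell
# Launch the tray icon under mono with a log file at Information level.
# Path and option spellings are assumptions; check "duplicati help" locally.
mono /Applications/Duplicati.app/Contents/Resources/Duplicati.GUI.TrayIcon.exe \
    --log-file="$HOME/duplicati-tray.log" \
    --log-file-log-level=Information &

# Follow the log while a backup runs
tail -f "$HOME/duplicati-tray.log"
```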

The Linux /usr/bin/duplicati launcher passes options through to the tray icon:

#!/bin/bash
INSTALLDIR=/usr/lib/duplicati
export LD_LIBRARY_PATH="${INSTALLDIR}${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
export MONO_PATH=$MONO_PATH:${INSTALLDIR}

EXE_FILE=${INSTALLDIR}/Duplicati.GUI.TrayIcon.exe
APP_NAME=Duplicati

exec -a "$APP_NAME" mono "$EXE_FILE" "$@"

I’m generally on Windows, where Duplicati.GUI.TrayIcon.exe launches in the background and doesn’t even have a console window to write to, so maybe the TrayIcon just isn’t talkative. I suggest --log-file, or even the “Commandline” option under “Advanced” on the main menu, which isolates output for that backup. The example output I showed earlier, demonstrating more verbosity at “Information” level, came from “Commandline”.

I think I now understand how this is put together. I’m running an Advanced -> Commandline task right now with log-level set to Information. I’m sure it’ll still be running long after I’ve left for work; I will report back tonight or tomorrow morning.

Well, there isn’t a lot to say here, but I’ll paste the log:

Backup started at 9/10/2018 6:31:26 AM
Checking remote backup …
Listing remote folder …
Scanning local files …
6821 files need to be examined (668.30 MB) (still counting)
Failed to process path: /Users/hpeyerl/duplicati.log => Sharing violation on path /Users/hpeyerl/duplicati.log
6939 files need to be examined (404.00 MB) (still counting)
7224 files need to be examined (86.72 MB) (still counting)
7805 files need to be examined (148.22 MB) (still counting)
8871 files need to be examined (156.39 MB) (still counting)
10060 files need to be examined (176.09 MB) (still counting)
10856 files need to be examined (198.10 MB) (still counting)
11940 files need to be examined (229.83 MB) (still counting)
13293 files need to be examined (246.21 MB) (still counting)
14321 files need to be examined (265.41 MB) (still counting)
15004 files need to be examined (274.18 MB) (still counting)
15759 files need to be examined (263.54 MB) (still counting)
16482 files need to be examined (246.41 MB) (still counting)
---- snip ----
210258 files need to be examined (28.87 GB) (still counting)
211000 files need to be examined (28.91 GB) (still counting)
211286 files need to be examined (29.00 GB) (still counting)
211387 files need to be examined (29.03 GB) (still counting)
211872 files need to be examined (29.05 GB) (still counting)
212847 files need to be examined (29.22 GB) (still counting)
213691 files need to be examined (30.87 GB)
211703 files need to be examined (30.84 GB)
209962 files need to be examined (30.82 GB)
208046 files need to be examined (30.80 GB)
206097 files need to be examined (30.78 GB)
204118 files need to be examined (30.75 GB)
---- snip ----
15723 files need to be examined (4.77 GB)
14055 files need to be examined (3.94 GB)
12068 files need to be examined (3.21 GB)
10284 files need to be examined (2.81 GB)
8370 files need to be examined (2.04 GB)
6805 files need to be examined (1.92 GB)
5260 files need to be examined (1.87 GB)
3003 files need to be examined (1.70 GB)
1052 files need to be examined (1.60 GB)
1051 files need to be examined (156.01 MB)
0 files need to be examined (0 bytes)
Uploading file (571.07 MB) …
Uploading file (1.35 MB) …
Uploading file (38.86 MB) …
Checking remote backup …
Listing remote folder …
Verifying remote backup …
Remote backup verification completed
Downloading file (38.86 MB) …
Downloading file (756.09 KB) …
Downloading file (1,023.99 MB) …

In the browser it says “Finished!”. If I click “Run now”, it’s back to being hung. If I go to “Advanced” and try to select “Restore”, none of the fields change from the state they were in for the backup. If I try to start a restore, it hangs at “Waiting for task to start” again.

I found the ‘duplicati.log’ file; it also contains nothing of substantial interest. A few errors on inaccessible files, but that’s it:

2018-09-10 12:32:28Z - Information: Backend event: List - Started: ()
2018-09-10 12:32:33Z - Information: Backend event: List - Completed: (225 bytes)
2018-09-10 12:32:37Z - Warning: Failed to process path: /Users/hpeyerl/duplicati.log
System.IO.IOException: Sharing violation on path /Users/hpeyerl/duplicati.log
at System.IO.FileStream..ctor (System.String path, System.IO.FileMode mode, System.IO.FileAccess access, System.IO.FileShare share, System.Int32 bufferSize, System.Boolean anonymous, System.IO.FileOptions options) [0x0019e] in <016f22b59bd64951b06e38b9f5a0506c>:0
at System.IO.FileStream..ctor (System.String path, System.IO.FileMode mode, System.IO.FileAccess access, System.IO.FileShare share) [0x00000] in <016f22b59bd64951b06e38b9f5a0506c>:0
at (wrapper remoting-invoke-with-check) System.IO.FileStream..ctor(string,System.IO.FileMode,System.IO.FileAccess,System.IO.FileShare)
at System.IO.File.Open (System.String path, System.IO.FileMode mode, System.IO.FileAccess access, System.IO.FileShare share) [0x00000] in <016f22b59bd64951b06e38b9f5a0506c>:0
at Duplicati.Library.Snapshots.NoSnapshot.OpenRead (System.String file) [0x00000] in <2be220318a714f97a1ac5d13c83b05be>:0
at Duplicati.Library.Main.Operation.BackupHandler.HandleFilesystemEntry (Duplicati.Library.Snapshots.ISnapshotService snapshot, Duplicati.Library.Main.BackendManager backend, System.String path, System.IO.FileAttributes attributes) [0x00433] in :0

2018-09-10 13:21:37Z - Warning: Failed to process metadata for "/Users/hpeyerl/Library/Google/GoogleSoftwareUpdate/TicketStore/", storing empty metadata
UnixSupport.File+FileAccesException: Unable to access the file "/Users/hpeyerl/Library/Google/GoogleSoftwareUpdate/TicketStore" with method listxattr, error: EACCES (13)
at UnixSupport.File.GetExtendedAttributes (System.String path, System.Boolean isSymlink, System.Boolean followSymlink) [0x00077] in <3fbec333f978484785726c089e2a43ac>:0
at Duplicati.Library.Snapshots.SystemIOLinux.GetMetadata (System.String file, System.Boolean isSymlink, System.Boolean followSymlink) [0x0000d] in <2be220318a714f97a1ac5d13c83b05be>:0
at Duplicati.Library.Snapshots.NoSnapshotLinux.GetMetadata (System.String file, System.Boolean isSymlink, System.Boolean followSymlink) [0x00007] in <2be220318a714f97a1ac5d13c83b05be>:0
at Duplicati.Library.Main.Operation.BackupHandler.GenerateMetadata (Duplicati.Library.Snapshots.ISnapshotService snapshot, System.String path, System.IO.FileAttributes attributes) [0x00027] in :0

2018-09-10 13:21:37Z - Warning: Error reported while accessing file: /Users/hpeyerl/Library/Google/GoogleSoftwareUpdate/TicketStore/
System.UnauthorizedAccessException: Access to the path '/Users/hpeyerl/Library/Google/GoogleSoftwareUpdate/TicketStore' is denied.
at System.IO.__Error.WinIOError (System.Int32 errorCode, System.String maybeFullPath) [0x00129] in <016f22b59bd64951b06e38b9f5a0506c>:0
at System.IO.FileSystemEnumerableIterator`1[TSource].HandleError (System.Int32 hr, System.String path) [0x00006] in <016f22b59bd64951b06e38b9f5a0506c>:0
at System.IO.FileSystemEnumerableIterator`1[TSource].CommonInit () [0x00054] in <016f22b59bd64951b06e38b9f5a0506c>:0
at System.IO.FileSystemEnumerableIterator`1[TSource]..ctor (System.String path, System.String originalUserPath, System.String searchPattern, System.IO.SearchOption searchOption, System.IO.SearchResultHandler`1[TSource] resultHandler, System.Boolean checkHost) [0x000d6] in <016f22b59bd64951b06e38b9f5a0506c>:0
at System.IO.FileSystemEnumerableFactory.CreateFileNameIterator (System.String path, System.String originalUserPath, System.String searchPattern, System.Boolean includeFiles, System.Boolean includeDirs, System.IO.SearchOption searchOption, System.Boolean checkHost) [0x00009] in <016f22b59bd64951b06e38b9f5a0506c>:0
at System.IO.Directory.InternalGetFileDirectoryNames (System.String path, System.String userPathOriginal, System.String searchPattern, System.Boolean includeFiles, System.Boolean includeDirs, System.IO.SearchOption searchOption, System.Boolean checkHost) [0x00000] in <016f22b59bd64951b06e38b9f5a0506c>:0
at System.IO.Directory.InternalGetDirectories (System.String path, System.String searchPattern, System.IO.SearchOption searchOption) [0x00000] in <016f22b59bd64951b06e38b9f5a0506c>:0
at System.IO.Directory.GetDirectories (System.String path) [0x0000e] in <016f22b59bd64951b06e38b9f5a0506c>:0
at Duplicati.Library.Snapshots.NoSnapshot.ListFolders (System.String folder) [0x00000] in <2be220318a714f97a1ac5d13c83b05be>:0
at Duplicati.Library.Utility.Utility+d__22.MoveNext () [0x0013c] in <0828ce86ffa94a4bbbb2da4331bcc67b>:0

2018-09-10 13:21:37Z - Warning: Error reported while accessing file: /Users/hpeyerl/Library/Google/GoogleSoftwareUpdate/TicketStore/
System.UnauthorizedAccessException: Access to the path '/Users/hpeyerl/Library/Google/GoogleSoftwareUpdate/TicketStore' is denied.
at System.IO.__Error.WinIOError (System.Int32 errorCode, System.String maybeFullPath) [0x00129] in <016f22b59bd64951b06e38b9f5a0506c>:0
at System.IO.FileSystemEnumerableIterator`1[TSource].HandleError (System.Int32 hr, System.String path) [0x00006] in <016f22b59bd64951b06e38b9f5a0506c>:0
at System.IO.FileSystemEnumerableIterator`1[TSource].CommonInit () [0x00054] in <016f22b59bd64951b06e38b9f5a0506c>:0
at System.IO.FileSystemEnumerableIterator`1[TSource]..ctor (System.String path, System.String originalUserPath, System.String searchPattern, System.IO.SearchOption searchOption, System.IO.SearchResultHandler`1[TSource] resultHandler, System.Boolean checkHost) [0x000d6] in <016f22b59bd64951b06e38b9f5a0506c>:0
at System.IO.FileSystemEnumerableFactory.CreateFileNameIterator (System.String path, System.String originalUserPath, System.String searchPattern, System.Boolean includeFiles, System.Boolean includeDirs, System.IO.SearchOption searchOption, System.Boolean checkHost) [0x00009] in <016f22b59bd64951b06e38b9f5a0506c>:0
at System.IO.Directory.InternalGetFileDirectoryNames (System.String path, System.String userPathOriginal, System.String searchPattern, System.Boolean includeFiles, System.Boolean includeDirs, System.IO.SearchOption searchOption, System.Boolean checkHost) [0x00000] in <016f22b59bd64951b06e38b9f5a0506c>:0
at System.IO.Directory.InternalGetFiles (System.String path, System.String searchPattern, System.IO.SearchOption searchOption) [0x00000] in <016f22b59bd64951b06e38b9f5a0506c>:0
at System.IO.Directory.GetFiles (System.String path) [0x0000e] in <016f22b59bd64951b06e38b9f5a0506c>:0
at Duplicati.Library.Snapshots.NoSnapshot.ListFiles (System.String folder) [0x00000] in <2be220318a714f97a1ac5d13c83b05be>:0
at Duplicati.Library.Utility.Utility+d__22.MoveNext () [0x00244] in <0828ce86ffa94a4bbbb2da4331bcc67b>:0

2018-09-10 13:43:43Z - Information: Backend event: Put - Started: duplicati-b039426f3a9ac4c62a2dd3a27c1432406.dblock.zip.aes (571.07 MB)
2018-09-10 13:51:27Z - Information: Backend event: Put - Completed: duplicati-b039426f3a9ac4c62a2dd3a27c1432406.dblock.zip.aes (571.07 MB)
2018-09-10 13:51:27Z - Information: Backend event: Put - Started: duplicati-ifb520288d20b4c968486b9eacde7af90.dindex.zip.aes (1.35 MB)
2018-09-10 13:51:31Z - Information: Backend event: Put - Completed: duplicati-ifb520288d20b4c968486b9eacde7af90.dindex.zip.aes (1.35 MB)
2018-09-10 13:51:31Z - Information: Backend event: Put - Started: duplicati-20180910T123126Z.dlist.zip.aes (38.86 MB)
2018-09-10 13:52:04Z - Information: Backend event: Put - Completed: duplicati-20180910T123126Z.dlist.zip.aes (38.86 MB)
2018-09-10 13:52:04Z - Information: Backend event: List - Started: ()
2018-09-10 13:52:13Z - Information: Backend event: List - Completed: (228 bytes)
2018-09-10 13:52:13Z - Information: Backend event: Get - Started: duplicati-20180910T123126Z.dlist.zip.aes (38.86 MB)
2018-09-10 13:52:22Z - Information: Backend event: Get - Completed: duplicati-20180910T123126Z.dlist.zip.aes (38.86 MB)
2018-09-10 13:52:22Z - Information: Backend event: Get - Started: duplicati-iee50dc71c48a4d6c9fe44e4eeb9389e3.dindex.zip.aes (756.09 KB)
2018-09-10 13:52:22Z - Information: Backend event: Get - Completed: duplicati-iee50dc71c48a4d6c9fe44e4eeb9389e3.dindex.zip.aes (756.09 KB)
2018-09-10 13:52:22Z - Information: Backend event: Get - Started: duplicati-baa6c348118484f718a9d22ecf81c9237.dblock.zip.aes (1,023.99 MB)
2018-09-10 13:53:29Z - Information: Backend event: Get - Completed: duplicati-baa6c348118484f718a9d22ecf81c9237.dblock.zip.aes (1,023.99 MB)

So are we officially out of ideas then?

Is that the end of duplicati.log, or is there more? Normally it should continue. To give examples, I ran some tests on a small backup using the web UI Commandline option, then copied log lines from a Profiling-level log.

2.0.3.11 canary “backup”: the SQL looks like it’s recording what was verified and deleting old logs. Do you see SQL in your log?

2018-09-13 08:08:41 -04 - [Information-Duplicati.Library.Main.Database.LocalDeleteDatabase-CompactReason]: Compacting not required
2018-09-13 08:08:41 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Operation.BackupHandler-CommitFinalizingBackup]: Starting - CommitFinalizingBackup
2018-09-13 08:08:41 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Operation.BackupHandler-CommitFinalizingBackup]: CommitFinalizingBackup took 0:00:00:00.000
2018-09-13 08:08:41 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Operation.BackupHandler-AfterBackupVerify]: Starting - AfterBackupVerify
... (various Get and SQL work)
2018-09-13 08:08:42 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b91c4c456f4c748669615bb6c3ea121ba.dblock.zip (700 bytes)
2018-09-13 08:08:42 -04 - [Profiling-Duplicati.Library.Main.BackendManager-DownloadSpeed]: Downloaded 700 bytes in 00:00:00, -8,388,608.00 TB/s
2018-09-13 08:08:42 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-b91c4c456f4c748669615bb6c3ea121ba.dblock.zip (700 bytes)
2018-09-13 08:08:42 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.BackendManager-RemoteOperationGet]: RemoteOperationGet took 0:00:00:00.000
2018-09-13 08:08:42 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: UPDATE "RemoteVolume" SET "VerificationCount" = MAX(1, CASE WHEN "VerificationCount" <= 0 THEN (SELECT MAX("VerificationCount") FROM "RemoteVolume") ELSE "VerificationCount" + 1 END) WHERE "Name" = "duplicati-b91c4c456f4c748669615bb6c3ea121ba.dblock.zip"
2018-09-13 08:08:43 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: UPDATE "RemoteVolume" SET "VerificationCount" = MAX(1, CASE WHEN "VerificationCount" <= 0 THEN (SELECT MAX("VerificationCount") FROM "RemoteVolume") ELSE "VerificationCount" + 1 END) WHERE "Name" = "duplicati-b91c4c456f4c748669615bb6c3ea121ba.dblock.zip" took 0:00:00:00.131
2018-09-13 08:08:43 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: DELETE FROM "LogData" WHERE "Timestamp" < 1534248523
2018-09-13 08:08:43 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: DELETE FROM "LogData" WHERE "Timestamp" < 1534248523 took 0:00:00:00.000
2018-09-13 08:08:43 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: DELETE FROM "RemoteOperation" WHERE "Timestamp" < 1534248523
2018-09-13 08:08:43 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: DELETE FROM "RemoteOperation" WHERE "Timestamp" < 1534248523 took 0:00:00:00.000
2018-09-13 08:08:43 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Controller-RunBackup]: Running Backup took 0:00:00:03.609
2018-09-13 08:08:43 -04 - [Information-Duplicati.Library.Main.Controller-CompletedOperation]: The operation Backup has completed

2.0.3.11 canary “backup” with --backup-test-samples=0: this looks like it suppresses the Get that looks inside the files, but still does a quick List:

2018-09-13 11:36:30 -04 - [Information-Duplicati.Library.Main.Database.LocalDeleteDatabase-CompactReason]: Compacting not required
2018-09-13 11:36:30 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Operation.BackupHandler-CommitFinalizingBackup]: Starting - CommitFinalizingBackup
2018-09-13 11:36:30 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Operation.BackupHandler-CommitFinalizingBackup]: CommitFinalizingBackup took 0:00:00:00.000
2018-09-13 11:36:30 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Operation.BackupHandler-AfterBackupVerify]: Starting - AfterBackupVerify
2018-09-13 11:36:30 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.BackendManager-RemoteOperationList]: Starting - RemoteOperationList
2018-09-13 11:36:30 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  ()
2018-09-13 11:36:30 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (10 bytes)
2018-09-13 11:36:30 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.BackendManager-RemoteOperationList]: RemoteOperationList took 0:00:00:00.000
2018-09-13 11:36:30 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: Starting - ExecuteReader: SELECT DISTINCT "Name", "State" FROM "Remotevolume" WHERE "Name" IN (SELECT "Name" FROM "Remotevolume" WHERE "State" IN ("Deleted", "Deleting")) AND NOT "State" IN ("Deleted", "Deleting")
2018-09-13 11:36:30 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: ExecuteReader: SELECT DISTINCT "Name", "State" FROM "Remotevolume" WHERE "Name" IN (SELECT "Name" FROM "Remotevolume" WHERE "State" IN ("Deleted", "Deleting")) AND NOT "State" IN ("Deleted", "Deleting") took 0:00:00:00.000
2018-09-13 11:36:30 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Operation.BackupHandler-AfterBackupVerify]: AfterBackupVerify took 0:00:00:00.600
2018-09-13 11:36:30 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.BackendManager-RemoteOperationTerminate]: Starting - RemoteOperationTerminate
2018-09-13 11:36:30 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.BackendManager-RemoteOperationTerminate]: RemoteOperationTerminate took 0:00:00:00.000
2018-09-13 11:36:30 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: DELETE FROM "LogData" WHERE "Timestamp" < 1534260990
2018-09-13 11:36:30 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: DELETE FROM "LogData" WHERE "Timestamp" < 1534260990 took 0:00:00:00.000
2018-09-13 11:36:30 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: DELETE FROM "RemoteOperation" WHERE "Timestamp" < 1534260990
2018-09-13 11:36:30 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: DELETE FROM "RemoteOperation" WHERE "Timestamp" < 1534260990 took 0:00:00:00.000
2018-09-13 11:36:30 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Controller-RunBackup]: Running Backup took 0:00:00:01.845
2018-09-13 11:36:30 -04 - [Information-Duplicati.Library.Main.Controller-CompletedOperation]: The operation Backup has completed

2.0.3.11 canary “backup” with --no-backend-verification goes further, also removing the List of the backend files:

2018-09-13 12:05:53 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-b22058baae7994a50b3a21393c2dde132.dblock.zip (512 bytes)
2018-09-13 12:05:53 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.BackendManager-RemoteOperationGet]: RemoteOperationGet took 0:00:00:00.013
2018-09-13 12:05:53 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: DROP TABLE IF EXISTS "Blocklist-0D8D90EED6F72043A1EEE99760097F1C"
2018-09-13 12:05:53 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: DROP TABLE IF EXISTS "Blocklist-0D8D90EED6F72043A1EEE99760097F1C" took 0:00:00:00.000
2018-09-13 12:05:53 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: CREATE TEMPORARY TABLE "Indexlist-53732DC907B4FC44A3C52300152D840B" ("Name" TEXT NOT NULL, "Hash" TEXT NOT NULL, "Size" INTEGER NOT NULL)
2018-09-13 12:05:53 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: CREATE TEMPORARY TABLE "Indexlist-53732DC907B4FC44A3C52300152D840B" ("Name" TEXT NOT NULL, "Hash" TEXT NOT NULL, "Size" INTEGER NOT NULL) took 0:00:00:00.000
2018-09-13 12:05:53 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: CREATE TEMPORARY TABLE "CmpTable-28A2819C834CD4469486A8893CC6C5A6" AS SELECT "A"."Name", "A"."Hash", "A"."Size" FROM "Remotevolume" A, "Remotevolume" B, "IndexBlockLink" WHERE "B"."Name" = "duplicati-iece85d0092a34c7b8d570d412109cfe2.dindex.zip" AND "A"."ID" = "IndexBlockLink"."BlockVolumeID" AND "B"."ID" = "IndexBlockLink"."IndexVolumeID" 
2018-09-13 12:05:53 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: CREATE TEMPORARY TABLE "CmpTable-28A2819C834CD4469486A8893CC6C5A6" AS SELECT "A"."Name", "A"."Hash", "A"."Size" FROM "Remotevolume" A, "Remotevolume" B, "IndexBlockLink" WHERE "B"."Name" = "duplicati-iece85d0092a34c7b8d570d412109cfe2.dindex.zip" AND "A"."ID" = "IndexBlockLink"."BlockVolumeID" AND "B"."ID" = "IndexBlockLink"."IndexVolumeID"  took 0:00:00:00.000
2018-09-13 12:05:53 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: Starting - ExecuteReader: SELECT 1 AS "Type", "Indexlist-53732DC907B4FC44A3C52300152D840B"."Name" AS "Name" FROM "Indexlist-53732DC907B4FC44A3C52300152D840B" WHERE "Indexlist-53732DC907B4FC44A3C52300152D840B"."Name" NOT IN ( SELECT "Name" FROM "CmpTable-28A2819C834CD4469486A8893CC6C5A6" ) UNION SELECT 0 AS "Type", "Name" AS "Name" FROM "CmpTable-28A2819C834CD4469486A8893CC6C5A6" WHERE "Name" NOT IN (SELECT "Name" FROM "Indexlist-53732DC907B4FC44A3C52300152D840B") UNION SELECT 2 AS "Type", "E"."Name" AS "Name" FROM "Indexlist-53732DC907B4FC44A3C52300152D840B" E, "CmpTable-28A2819C834CD4469486A8893CC6C5A6" D WHERE "D"."Name" = "E"."Name" AND ("D"."Hash" != "E"."Hash" OR "D"."Size" != "E"."Size") 
2018-09-13 12:05:53 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: ExecuteReader: SELECT 1 AS "Type", "Indexlist-53732DC907B4FC44A3C52300152D840B"."Name" AS "Name" FROM "Indexlist-53732DC907B4FC44A3C52300152D840B" WHERE "Indexlist-53732DC907B4FC44A3C52300152D840B"."Name" NOT IN ( SELECT "Name" FROM "CmpTable-28A2819C834CD4469486A8893CC6C5A6" ) UNION SELECT 0 AS "Type", "Name" AS "Name" FROM "CmpTable-28A2819C834CD4469486A8893CC6C5A6" WHERE "Name" NOT IN (SELECT "Name" FROM "Indexlist-53732DC907B4FC44A3C52300152D840B") UNION SELECT 2 AS "Type", "E"."Name" AS "Name" FROM "Indexlist-53732DC907B4FC44A3C52300152D840B" E, "CmpTable-28A2819C834CD4469486A8893CC6C5A6" D WHERE "D"."Name" = "E"."Name" AND ("D"."Hash" != "E"."Hash" OR "D"."Size" != "E"."Size")  took 0:00:00:00.000
2018-09-13 12:05:53 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: DROP TABLE IF EXISTS "CmpTable-28A2819C834CD4469486A8893CC6C5A6" 
2018-09-13 12:05:53 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: DROP TABLE IF EXISTS "CmpTable-28A2819C834CD4469486A8893CC6C5A6"  took 0:00:00:00.000
2018-09-13 12:05:53 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: DROP TABLE IF EXISTS "Indexlist-53732DC907B4FC44A3C52300152D840B"
2018-09-13 12:05:53 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: DROP TABLE IF EXISTS "Indexlist-53732DC907B4FC44A3C52300152D840B" took 0:00:00:00.000
2018-09-13 12:05:53 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: UPDATE "RemoteVolume" SET "VerificationCount" = MAX(1, CASE WHEN "VerificationCount" <= 0 THEN (SELECT MAX("VerificationCount") FROM "RemoteVolume") ELSE "VerificationCount" + 1 END) WHERE "Name" = "duplicati-iece85d0092a34c7b8d570d412109cfe2.dindex.zip"
2018-09-13 12:05:53 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: UPDATE "RemoteVolume" SET "VerificationCount" = MAX(1, CASE WHEN "VerificationCount" <= 0 THEN (SELECT MAX("VerificationCount") FROM "RemoteVolume") ELSE "VerificationCount" + 1 END) WHERE "Name" = "duplicati-iece85d0092a34c7b8d570d412109cfe2.dindex.zip" took 0:00:00:00.134
2018-09-13 12:05:53 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: CREATE TEMPORARY TABLE "Blocklist-79EC607C1E73F745B3997C31858B03D9" ("Hash" TEXT NOT NULL, "Size" INTEGER NOT NULL)
2018-09-13 12:05:53 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: CREATE TEMPORARY TABLE "Blocklist-79EC607C1E73F745B3997C31858B03D9" ("Hash" TEXT NOT NULL, "Size" INTEGER NOT NULL) took 0:00:00:00.000
2018-09-13 12:05:53 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: CREATE TEMPORARY TABLE "CmpTable-6E292AE78573384E908EB84918CF1CC3" AS SELECT DISTINCT "Hash" AS "Hash", "Size" AS "Size" FROM (SELECT "Block"."Hash" AS "Hash", "Block"."Size" AS "Size" FROM "Remotevolume", "Block" WHERE "Remotevolume"."Name" = "duplicati-b22058baae7994a50b3a21393c2dde132.dblock.zip" AND "Remotevolume"."ID" = "Block"."VolumeID" UNION SELECT "DeletedBlock"."Hash" AS "Hash", "DeletedBlock"."Size" AS "Size" FROM "DeletedBlock", "RemoteVolume" WHERE "RemoteVolume"."Name" = "duplicati-b22058baae7994a50b3a21393c2dde132.dblock.zip" AND "RemoteVolume"."ID" = "DeletedBlock"."VolumeID")
2018-09-13 12:05:53 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: CREATE TEMPORARY TABLE "CmpTable-6E292AE78573384E908EB84918CF1CC3" AS SELECT DISTINCT "Hash" AS "Hash", "Size" AS "Size" FROM (SELECT "Block"."Hash" AS "Hash", "Block"."Size" AS "Size" FROM "Remotevolume", "Block" WHERE "Remotevolume"."Name" = "duplicati-b22058baae7994a50b3a21393c2dde132.dblock.zip" AND "Remotevolume"."ID" = "Block"."VolumeID" UNION SELECT "DeletedBlock"."Hash" AS "Hash", "DeletedBlock"."Size" AS "Size" FROM "DeletedBlock", "RemoteVolume" WHERE "RemoteVolume"."Name" = "duplicati-b22058baae7994a50b3a21393c2dde132.dblock.zip" AND "RemoteVolume"."ID" = "DeletedBlock"."VolumeID") took 0:00:00:00.000
2018-09-13 12:05:53 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: Starting - ExecuteReader: SELECT 1 AS "Type", "Blocklist-79EC607C1E73F745B3997C31858B03D9"."Hash" AS "Hash" FROM "Blocklist-79EC607C1E73F745B3997C31858B03D9" WHERE "Blocklist-79EC607C1E73F745B3997C31858B03D9"."Hash" NOT IN ( SELECT "Hash" FROM "CmpTable-6E292AE78573384E908EB84918CF1CC3" ) UNION SELECT 0 AS "Type", "Hash" AS "Hash" FROM "CmpTable-6E292AE78573384E908EB84918CF1CC3" WHERE "Hash" NOT IN (SELECT "Hash" FROM "Blocklist-79EC607C1E73F745B3997C31858B03D9") UNION SELECT 2 AS "Type", "E"."Hash" AS "Hash" FROM "Blocklist-79EC607C1E73F745B3997C31858B03D9" E, "CmpTable-6E292AE78573384E908EB84918CF1CC3" D WHERE "D"."Hash" = "E"."Hash" AND "D"."Size" != "E"."Size"  
2018-09-13 12:05:53 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: ExecuteReader: SELECT 1 AS "Type", "Blocklist-79EC607C1E73F745B3997C31858B03D9"."Hash" AS "Hash" FROM "Blocklist-79EC607C1E73F745B3997C31858B03D9" WHERE "Blocklist-79EC607C1E73F745B3997C31858B03D9"."Hash" NOT IN ( SELECT "Hash" FROM "CmpTable-6E292AE78573384E908EB84918CF1CC3" ) UNION SELECT 0 AS "Type", "Hash" AS "Hash" FROM "CmpTable-6E292AE78573384E908EB84918CF1CC3" WHERE "Hash" NOT IN (SELECT "Hash" FROM "Blocklist-79EC607C1E73F745B3997C31858B03D9") UNION SELECT 2 AS "Type", "E"."Hash" AS "Hash" FROM "Blocklist-79EC607C1E73F745B3997C31858B03D9" E, "CmpTable-6E292AE78573384E908EB84918CF1CC3" D WHERE "D"."Hash" = "E"."Hash" AND "D"."Size" != "E"."Size"   took 0:00:00:00.000
2018-09-13 12:05:53 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: DROP TABLE IF EXISTS "CmpTable-6E292AE78573384E908EB84918CF1CC3" 
2018-09-13 12:05:53 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: DROP TABLE IF EXISTS "CmpTable-6E292AE78573384E908EB84918CF1CC3"  took 0:00:00:00.000
2018-09-13 12:05:53 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: DROP TABLE IF EXISTS "Blocklist-79EC607C1E73F745B3997C31858B03D9"
2018-09-13 12:05:53 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: DROP TABLE IF EXISTS "Blocklist-79EC607C1E73F745B3997C31858B03D9" took 0:00:00:00.000
2018-09-13 12:05:53 -04 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: UPDATE "RemoteVolume" SET "VerificationCount" = MAX(1, CASE WHEN "VerificationCount" <= 0 THEN (SELECT MAX("VerificationCount") FROM "RemoteVolume") ELSE "VerificationCount" + 1 END) WHERE "Name" = "duplicati-b22058baae7994a50b3a21393c2dde132.dblock.zip"
2018-09-13 12:05:53 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: UPDATE "RemoteVolume" SET "VerificationCount" = MAX(1, CASE WHEN "VerificationCount" <= 0 THEN (SELECT MAX("VerificationCount") FROM "RemoteVolume") ELSE "VerificationCount" + 1 END) WHERE "Name" = "duplicati-b22058baae7994a50b3a21393c2dde132.dblock.zip" took 0:00:00:00.123
2018-09-13 12:05:54 -04 - [Profiling-Timer.Finished-Duplicati.Library.Main.Controller-RunTest]: Running Test took 0:00:00:01.542
2018-09-13 12:05:54 -04 - [Information-Duplicati.Library.Main.Controller-CompletedOperation]: The operation Test has completed

These logs end with [Information-Duplicati.Library.Main.Controller-CompletedOperation]. Did yours?

We’re closing in on the point of the possible hang, but we’re not there yet. If we get close enough, code-level work might be possible; however, we’d probably have to call in someone else. The developers seem to be more active at https://github.com/duplicati/duplicati/issues, whereas the forum is more for users helping each other.

Do you have any other backups? If so, how are they going? If not, would you be willing to set up a small one with files that don’t generate complaints, and see whether it backs up and restores from a fresh Duplicati start, without the troubled job running? (I’m not sure how easy or difficult it will be to keep the other job from “finishing”.)
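Something along these lines might do for the small isolated test — a sketch only, with placeholder paths and passphrase, assuming Duplicati.CommandLine.exe is on your path and runs under mono on macOS:

```shell
# Create a tiny source tree and local destinations (all placeholders)
mkdir -p /tmp/dupl-src /tmp/dupl-dst /tmp/dupl-restore
echo "hello duplicati" > /tmp/dupl-src/test.txt

# Back up to a local folder, with a Profiling-level log so we can see
# exactly where it stops if it hangs
mono Duplicati.CommandLine.exe backup "file:///tmp/dupl-dst" /tmp/dupl-src \
    --passphrase=testonly \
    --log-file=/tmp/dupl-test.log --log-file-log-level=Profiling

# Restore everything to a separate folder and compare
mono Duplicati.CommandLine.exe restore "file:///tmp/dupl-dst" "*" \
    --restore-path=/tmp/dupl-restore --passphrase=testonly
diff -r /tmp/dupl-src /tmp/dupl-restore
```

Running this entirely from the CLI keeps the web UI scheduler out of the picture, which should help separate a backend problem from a UI/server problem.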

Things done via the web UI can definitely interact with each other in terms of scheduling. Some might wait… MENU --> About --> Server state properties can help see these, and that view is live-updated with changes.

activeTask being empty would probably indicate it’s ready for the next task. A non-blank value is currently hard to decipher.

That was the entire contents of ‘duplicati.log’.

That backup was the only Duplicati backup I’ve been working on. I’m only experimenting at this point so the data is not vital. I’m still running Crashplan in parallel and I’ve got some time to get a reliable alternative before I have to switch away from Crashplan.

This morning was a bit of an adventure trying to delete/eradicate the really big backup we’ve been working through in this thread. Something is up with Google Drive. If I dump the Duplicati backup folder into the Google Drive Trash bin and then empty the trash, it appears to be ‘gone’. When I attempt to backup to that folder, Duplicati thinks all of the files are still there. I’ve tried blowing away $HOME/.config/Duplicati/JASRAADOTX.sqlite (the local cache DB) but Duplicati still sees the remote files. So each time I try to create a fresh backup set, I have to get Duplicati to create a new remote folder (test, test2, test3, etc)… Google Drive also reports that the data has not been freed up. So perhaps there’s some lag between what’s reported in their web-interface and what’s actually happening behind the scenes.
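One way to see exactly what files the backend reports, independent of Duplicati’s local database, is the backend tool that ships alongside the main executables. This is only a sketch: I’m assuming the OSX bundle includes Duplicati.CommandLine.BackendTool.exe like the Windows install does, and the URL/authid below are placeholders, not real values.

```shell
# Lists the raw remote folder contents as the backend reports them. If files
# still show up here after emptying the Google Drive trash, the lag is on
# Google's side rather than in Duplicati's local database.
mono /Applications/Duplicati.app/Contents/Resources/Duplicati.CommandLine.BackendTool.exe \
  list "googledrive://Backups/test?authid=PLACEHOLDER"
```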

Anyway, working around that, I was able to create a new backup that was much smaller (26M). It ran for a few minutes.

The backup I created was with “debug-output” set, “log-file” set to “duplicati.log”, and “log-level” set to “Information”. After the browser said “Finished!”, I gathered the commandline output and logfile output. It appears that they’re both incomplete and the interface is hung again.

I ran it in “commandline” and this is the browser output:

Running commandline entry

Running … stop now

Backup started at 9/14/2018 6:56:45 AM

Checking remote backup …

Listing remote folder …

Scanning local files …

228 files need to be examined (25.56 MB)

0 files need to be examined (0 bytes)

Uploading file (20.71 MB) …

Uploading file (29.56 KB) …

Uploading file (13.11 KB) …

No remote filesets were deleted

Checking remote backup …

Listing remote folder …

Verifying remote backup …

Remote backup verification completed

Downloading file (13.11 KB) …

Downloading file (29.56 KB) …

Downloading file (20.71 MB) …

…and this is “duplicati.log” (which ends up in the OS X application bundle):

chimay Resources $ pwd
/Applications/Duplicati.app/Contents/Resources
chimay Resources $ cat duplicati.log
2018-09-14 12:56:46Z - Information: Backend event: List - Started: ()
2018-09-14 12:56:47Z - Information: Backend event: List - Completed: ()
2018-09-14 12:56:52Z - Information: Backend event: Put - Started: duplicati-b5b228fae5a2c4825ad02bb478f23b633.dblock.zip.aes (20.71 MB)
2018-09-14 12:57:05Z - Information: Backend event: Put - Completed: duplicati-b5b228fae5a2c4825ad02bb478f23b633.dblock.zip.aes (20.71 MB)
2018-09-14 12:57:05Z - Information: Backend event: Put - Started: duplicati-ie8fcc4e18498428a9f33aee982d7f330.dindex.zip.aes (29.56 KB)
2018-09-14 12:57:09Z - Information: Backend event: Put - Completed: duplicati-ie8fcc4e18498428a9f33aee982d7f330.dindex.zip.aes (29.56 KB)
2018-09-14 12:57:09Z - Information: Backend event: Put - Started: duplicati-20180914T125645Z.dlist.zip.aes (13.11 KB)
2018-09-14 12:57:10Z - Information: Backend event: Put - Completed: duplicati-20180914T125645Z.dlist.zip.aes (13.11 KB)
2018-09-14 12:57:10Z - Information: No remote filesets were deleted
2018-09-14 12:57:10Z - Information: Backend event: List - Started: ()
2018-09-14 12:57:12Z - Information: Backend event: List - Completed: (3 bytes)
2018-09-14 12:57:12Z - Information: Backend event: Get - Started: duplicati-20180914T125645Z.dlist.zip.aes (13.11 KB)
2018-09-14 12:57:14Z - Information: Backend event: Get - Completed: duplicati-20180914T125645Z.dlist.zip.aes (13.11 KB)
2018-09-14 12:57:14Z - Information: Backend event: Get - Started: duplicati-ie8fcc4e18498428a9f33aee982d7f330.dindex.zip.aes (29.56 KB)
2018-09-14 12:57:14Z - Information: Backend event: Get - Completed: duplicati-ie8fcc4e18498428a9f33aee982d7f330.dindex.zip.aes (29.56 KB)
2018-09-14 12:57:14Z - Information: Backend event: Get - Started: duplicati-b5b228fae5a2c4825ad02bb478f23b633.dblock.zip.aes (20.71 MB)
2018-09-14 12:57:17Z - Information: Backend event: Get - Completed: duplicati-b5b228fae5a2c4825ad02bb478f23b633.dblock.zip.aes (20.71 MB)
chimay Resources $

Further to last, I re-ran the above backup again (after restarting duplicati) with log-level set to “Profile”. Since it was a re-run, there were no files to upload but here is the output:

Backup started at 9/14/2018 7:12:29 AM

Checking remote backup …

Listing remote folder …

Scanning local files …

228 files need to be examined (25.56 MB)

0 files need to be examined (0 bytes)

removing temp files, as no data needs to be uploaded

No remote filesets were deleted

Checking remote backup …

Listing remote folder …

removing file listed as Temporary: duplicati-b4245674cf28045b08fa5249667f345a3.dblock.zip.aes

removing file listed as Temporary: duplicati-ida31effd15314ccf909404affb277488.dindex.zip.aes

Verifying remote backup …

Remote backup verification completed

Downloading file (13.11 KB) …

Downloading file (29.56 KB) …

Downloading file (20.71 MB) …

and the duplicati.log is substantially larger, with a bunch of SQL and so forth (probably with private info in it, so I’m loath to post it; I can upload it somewhere for you if the details are of interest).

It ends like this:

2018-09-14 13:12:36Z - Profiling: Starting - ExecuteReader: SELECT “ID”, “Name”, “Size”, “Hash”, “VerificationCount” FROM “Remotevolume” WHERE “Type” = ? AND “State” IN (?, ?)
2018-09-14 13:12:36Z - Profiling: ExecuteReader: SELECT “ID”, “Name”, “Size”, “Hash”, “VerificationCount” FROM “Remotevolume” WHERE “Type” = ? AND “State” IN (?, ?) took 00:00:00.000
2018-09-14 13:12:36Z - Profiling: Starting - ExecuteReader: SELECT “ID”, “Name”, “Size”, “Hash”, “VerificationCount” FROM “Remotevolume” WHERE “Type” = ? AND “State” IN (?, ?)
2018-09-14 13:12:36Z - Profiling: ExecuteReader: SELECT “ID”, “Name”, “Size”, “Hash”, “VerificationCount” FROM “Remotevolume” WHERE “Type” = ? AND “State” IN (?, ?) took 00:00:00.000
2018-09-14 13:12:36Z - Profiling: Starting - RemoteOperationGet
2018-09-14 13:12:36Z - Information: Backend event: Get - Started: duplicati-20180914T125645Z.dlist.zip.aes (13.11 KB)
2018-09-14 13:12:38Z - Profiling: Downloaded 13.11 KB in 00:00:01.6495620, 7.95 KB/s
2018-09-14 13:12:38Z - Information: Backend event: Get - Completed: duplicati-20180914T125645Z.dlist.zip.aes (13.11 KB)
2018-09-14 13:12:38Z - Profiling: RemoteOperationGet took 00:00:01.654
2018-09-14 13:12:38Z - Profiling: Starting - ExecuteNonQuery: UPDATE “RemoteVolume” SET “VerificationCount” = MAX(1, CASE WHEN “VerificationCount” <= 0 THEN (SELECT MAX(“VerificationCount”) FROM “RemoteVolume”) ELSE “VerificationCount” + 1 END) WHERE “Name” = ?
2018-09-14 13:12:38Z - Profiling: ExecuteNonQuery: UPDATE “RemoteVolume” SET “VerificationCount” = MAX(1, CASE WHEN “VerificationCount” <= 0 THEN (SELECT MAX(“VerificationCount”) FROM “RemoteVolume”) ELSE “VerificationCount” + 1 END) WHERE “Name” = ? took 00:00:00.001
2018-09-14 13:12:38Z - Profiling: Starting - RemoteOperationGet
2018-09-14 13:12:38Z - Information: Backend event: Get - Started: duplicati-ie8fcc4e18498428a9f33aee982d7f330.dindex.zip.aes (29.56 KB)
2018-09-14 13:12:38Z - Profiling: Downloaded 29.56 KB in 00:00:00.3557830, 83.08 KB/s
2018-09-14 13:12:38Z - Information: Backend event: Get - Completed: duplicati-ie8fcc4e18498428a9f33aee982d7f330.dindex.zip.aes (29.56 KB)
2018-09-14 13:12:38Z - Profiling: RemoteOperationGet took 00:00:00.356
2018-09-14 13:12:38Z - Profiling: Starting - ExecuteNonQuery: UPDATE “RemoteVolume” SET “VerificationCount” = MAX(1, CASE WHEN “VerificationCount” <= 0 THEN (SELECT MAX(“VerificationCount”) FROM “RemoteVolume”) ELSE “VerificationCount” + 1 END) WHERE “Name” = ?
2018-09-14 13:12:38Z - Profiling: ExecuteNonQuery: UPDATE “RemoteVolume” SET “VerificationCount” = MAX(1, CASE WHEN “VerificationCount” <= 0 THEN (SELECT MAX(“VerificationCount”) FROM “RemoteVolume”) ELSE “VerificationCount” + 1 END) WHERE “Name” = ? took 00:00:00.000
2018-09-14 13:12:38Z - Profiling: Starting - RemoteOperationGet
2018-09-14 13:12:38Z - Information: Backend event: Get - Started: duplicati-b5b228fae5a2c4825ad02bb478f23b633.dblock.zip.aes (20.71 MB)
2018-09-14 13:12:41Z - Profiling: Downloaded 20.71 MB in 00:00:03.0153380, 6.87 MB/s
2018-09-14 13:12:41Z - Information: Backend event: Get - Completed: duplicati-b5b228fae5a2c4825ad02bb478f23b633.dblock.zip.aes (20.71 MB)
2018-09-14 13:12:41Z - Profiling: RemoteOperationGet took 00:00:03.020
2018-09-14 13:12:41Z - Profiling: Starting - ExecuteNonQuery: UPDATE “RemoteVolume” SET “VerificationCount” = MAX(1, CASE WHEN “VerificationCount” <= 0 THEN (SELECT MAX(“VerificationCount”) FROM “RemoteVolume”) ELSE “VerificationCount” + 1 END) WHERE “Name” = ?
2018-09-14 13:12:41Z - Profiling: ExecuteNonQuery: UPDATE “RemoteVolume” SET “VerificationCount” = MAX(1, CASE WHEN “VerificationCount” <= 0 THEN (SELECT MAX(“VerificationCount”) FROM “RemoteVolume”) ELSE “VerificationCount” + 1 END) WHERE “Name” = ? took 00:00:00.000
2018-09-14 13:12:41Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM “LogData” WHERE “Timestamp” < ?
2018-09-14 13:12:41Z - Profiling: ExecuteNonQuery: DELETE FROM “LogData” WHERE “Timestamp” < ? took 00:00:00.000
2018-09-14 13:12:41Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM “RemoteOperation” WHERE “Timestamp” < ?
2018-09-14 13:12:41Z - Profiling: ExecuteNonQuery: DELETE FROM “RemoteOperation” WHERE “Timestamp” < ? took 00:00:00.000
2018-09-14 13:12:41Z - Profiling: Running Backup took 00:00:12.037

The interface is hung again, as in, if I try to do anything, including running the backup again, it just says “Waiting for task to start …” and no new information is appended to “duplicati.log”.

To get a better comparison, my output from 2.0.3.3 looks similar:

2018-09-13 14:08:41Z - Profiling: Starting - ExecuteNonQuery: UPDATE “RemoteVolume” SET “VerificationCount” = MAX(1, CASE WHEN “VerificationCount” <= 0 THEN (SELECT MAX(“VerificationCount”) FROM “RemoteVolume”) ELSE “VerificationCount” + 1 END) WHERE “Name” = ?
2018-09-13 14:08:41Z - Profiling: ExecuteNonQuery: UPDATE “RemoteVolume” SET “VerificationCount” = MAX(1, CASE WHEN “VerificationCount” <= 0 THEN (SELECT MAX(“VerificationCount”) FROM “RemoteVolume”) ELSE “VerificationCount” + 1 END) WHERE “Name” = ? took 00:00:00.023
2018-09-13 14:08:42Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM “LogData” WHERE “Timestamp” < ?
2018-09-13 14:08:42Z - Profiling: ExecuteNonQuery: DELETE FROM “LogData” WHERE “Timestamp” < ? took 00:00:00.000
2018-09-13 14:08:42Z - Profiling: Starting - ExecuteNonQuery: DELETE FROM “RemoteOperation” WHERE “Timestamp” < ?
2018-09-13 14:08:42Z - Profiling: ExecuteNonQuery: DELETE FROM “RemoteOperation” WHERE “Timestamp” < ? took 00:00:00.000
2018-09-13 14:08:42Z - Profiling: Running Backup took 00:00:01.296

which (to me) might mean that your job completed but perhaps the scheduler (part of the server) is confused.
What does Menu --> About --> System info --> Server state properties show? EDIT: Corrected the sequence
I’d expect a normal result to show activeTask blank, and proposedSchedule only showing any planned future job runs (if you have any). The format is cryptic-internal, but you can turn the quoted number into a job name by substituting it for the similar small number in a URL from a UI job operation, such as “Show log”.

(for others in the future who might stumble on this thread, Menu->About->System Info->Server state).

I’m out of town for the weekend so I’m doing this via remote desktop (cringe) but here’s what Server state properties has to say.

Server state properties

  • lastEventId : 309
  • lastDataUpdateId : 2
  • lastNotificationUpdateId : 0
  • estimatedPauseEnd : 0001-01-01T00:00:00
  • activeTask : {“Item1”:3,“Item2”:null}
  • programState : Running
  • lastErrorMessage :
  • connectionState : connected
  • xsfrerror : false
  • connectionAttemptTimer : 0
  • failedConnectionAttempts : 0
  • lastPgEvent : {“BackupID”:null,“TaskID”:3,“BackendAction”:“Get”,“BackendPath”:“duplicati-b5b228fae5a2c4825ad02bb478f23b633.dblock.zip.aes”,“BackendFileSize”:21721229,“BackendFileProgress”:21721229,“BackendSpeed”:242,“BackendIsBlocking”:false,“CurrentFilename”:null,“CurrentFilesize”:0,“CurrentFileoffset”:0,“Phase”:“Backup_Complete”,“OverallProgress”:1,“ProcessedFileCount”:228,“ProcessedFileSize”:26802068,“TotalFileCount”:228,“TotalFileSize”:26802068,“StillCounting”:false}
  • updaterState : Waiting
  • updatedVersion :
  • updateReady : false
  • updateDownloadProgress : 0
  • proposedSchedule : [{“Item1”:“1”,“Item2”:“2018-09-16T07:00:00Z”}]
  • schedulerQueueIds : [{“Item1”:4,“Item2”:“1”},{“Item1”:6,“Item2”:null}]
  • pauseTimeRemain : 0

Back from the weekend. Server state properties has only changed slightly, but it’s still hung and hasn’t done anything since Friday.

  • lastEventId : 694
  • lastDataUpdateId : 2
  • lastNotificationUpdateId : 0
  • estimatedPauseEnd : 0001-01-01T00:00:00
  • activeTask : {“Item1”:3,“Item2”:null}
  • programState : Running
  • lastErrorMessage :
  • connectionState : connected
  • xsfrerror : false
  • connectionAttemptTimer : 0
  • failedConnectionAttempts : 0
  • lastPgEvent : {“BackupID”:null,“TaskID”:3,“BackendAction”:“Get”,“BackendPath”:“duplicati-b5b228fae5a2c4825ad02bb478f23b633.dblock.zip.aes”,“BackendFileSize”:21721229,“BackendFileProgress”:21721229,“BackendSpeed”:106,“BackendIsBlocking”:false,“CurrentFilename”:null,“CurrentFilesize”:0,“CurrentFileoffset”:0,“Phase”:“Backup_Complete”,“OverallProgress”:1,“ProcessedFileCount”:228,“ProcessedFileSize”:26802068,“TotalFileCount”:228,“TotalFileSize”:26802068,“StillCounting”:false}
  • updaterState : Waiting
  • updatedVersion :
  • updateReady : false
  • updateDownloadProgress : 0
  • proposedSchedule : [{“Item1”:“1”,“Item2”:“2018-09-17T07:00:00Z”}]
  • schedulerQueueIds : [{“Item1”:4,“Item2”:“1”},{“Item1”:6,“Item2”:null},{“Item1”:8,“Item2”:null}]
  • pauseTimeRemain : 0

Well, this morning I decided to simplify even further. I deleted the Google Drive backup and created a new backup within Duplicati. It took a few attempts because every time it did something, the backend hung and I’d have to restart it. Once I turned off ‘Automatic’, I could start Duplicati and have it not immediately hang the backend. Then it complained about missing data and to run ‘repair’. I tried that, and also tried ‘repair/recreate’. After that completed successfully, I had to restart and then manually ran my backup. Again, the backend complained that a file was missing and then was hung again.

I’m pretty sure Duplicati is just not for me. It probably works fine under Windows and maybe Linux but this is a house full of Macs. It’s very unfortunate for me because it ticks almost all of my requirements and what’s missing is probably scriptable.

I’m happy to continue to debug the problem if someone is interested in giving me instructions but I need to move on to finding a replacement for Crashplan.

2018-09-17 11:43:56Z - Information: Backend event: List - Started: ()
2018-09-17 11:43:56Z - Information: Backend event: List - Completed: (2 bytes)
2018-09-17 11:43:56Z - Warning: Missing file: duplicati-b5b228fae5a2c4825ad02bb478f23b633.dblock.zip.aes
2018-09-17 11:43:56Z - Error: Found 1 files that are missing from the remote storage, please run repair
2018-09-17 11:44:07Z - Information: Backend event: List - Started: ()
2018-09-17 11:44:07Z - Information: Backend event: List - Completed: (2 bytes)
2018-09-17 11:44:07Z - Information: Rebuild database started, downloading 1 filelists
2018-09-17 11:44:07Z - Information: Backend event: Get - Started: duplicati-20180914T125645Z.dlist.zip.aes (13.11 KB)
2018-09-17 11:44:12Z - Information: Backend event: Get - Completed: duplicati-20180914T125645Z.dlist.zip.aes (13.11 KB)
2018-09-17 11:44:13Z - Information: Filelists restored, downloading 1 index files
2018-09-17 11:44:13Z - Information: Backend event: Get - Started: duplicati-ie8fcc4e18498428a9f33aee982d7f330.dindex.zip.aes (29.56 KB)
2018-09-17 11:44:13Z - Information: Backend event: Get - Completed: duplicati-ie8fcc4e18498428a9f33aee982d7f330.dindex.zip.aes (29.56 KB)
2018-09-17 11:44:13Z - Error: Remote file referenced as duplicati-b5b228fae5a2c4825ad02bb478f23b633.dblock.zip.aes, but not found in list, registering a missing remote file
2018-09-17 11:44:13Z - Information: Recreate completed, verifying the database consistency
2018-09-17 11:44:13Z - Information: Recreate completed, and consistency checks completed, marking database as complete
2018-09-17 11:44:32Z - Information: Backend event: List - Started: ()
2018-09-17 11:44:32Z - Information: Backend event: List - Completed: (2 bytes)
2018-09-17 11:44:32Z - Warning: Missing file: duplicati-b5b228fae5a2c4825ad02bb478f23b633.dblock.zip.aes
2018-09-17 11:44:32Z - Error: Found 1 files that are missing from the remote storage, please run repair
2018-09-17 11:44:32Z - Error: Fatal error
Duplicati.Library.Interface.UserInformationException: Found 1 files that are missing from the remote storage, please run repair
at Duplicati.Library.Main.Operation.FilelistProcessor.VerifyRemoteList (Duplicati.Library.Main.BackendManager backend, Duplicati.Library.Main.Options options, Duplicati.Library.Main.Database.LocalDatabase database, Duplicati.Library.Main.IBackendWriter log, System.String protectedfile) [0x001aa] in :0
at Duplicati.Library.Main.Operation.BackupHandler.PreBackupVerify (Duplicati.Library.Main.BackendManager backend, System.String protectedfile) [0x000fd] in :0
at Duplicati.Library.Main.Operation.BackupHandler.Run (System.String sources, Duplicati.Library.Utility.IFilter filter) [0x003c6] in :0

Macs do get used: https://usage-reporter.duplicati.com/ shows about 2000 backups a day (if I read that right). Windows sees the biggest use, with Linux somewhere in between. Here I spot a few Mac users who help out with many things (and I might try to get one of them involved in this debug session, because I’m getting a bit past my skills).

I hinted earlier at how one might translate numbers to backup job names via URL substitution, and I think what you would plug in (based on this) would be the Item2 number for activeTask and schedulerQueueIds, and Item1 for proposedSchedule. That might lead you to a backup (i.e. you’re stuck in task 3 now, have backup 1 queued to run when/if 3 finishes, and it’s due to be scheduled again at 2018-09-17T07:00:00Z), but the work to run is piling up.

Let me be more specific about how I think you can map backup numbers into human-recognizable job names.

If I run “Show log” on a job I have scheduled, URL is localhost:8200/ngax/index.html#/log/2 (notice “2” at end)
The reason I think “2” is a backup id is because “200” gives “Failed to connect: Invalid or missing backup id”. Plugging in a valid backup id at the end of whatever URL you get (details may vary) should get the job name.

Though you could still try to map backup numbers to names (to see if what I just said makes any sense), you have queued tasks that don’t seem to have an associated backup number, and I’m not sure how to see them.

Before leaving this train for a bit, do you have anything after the backup, e.g. reports by email or something? Earlier logging seemed to show you getting to the end of the backup, and yet the backup seems to be active.

Pushing some automatic scheduling aside can be done by running a web UI job Export --> As Command-line then using that. I think it runs mono for you. If it runs it wrong, peel off the mono and use /usr/bin/duplicati-cli. You can add additional logging options as desired. What I hope happens is when backup is done, you return to your shell prompt without any mysterious hangs or next activities like the web UI is seemingly getting into…
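As a sketch of what an exported run might look like once the mono wrapper is peeled off (the destination URL, source path, dbpath, and log path below are placeholders, not the actual exported values):

```shell
# Exported job, run directly with the CLI wrapper instead of mono + TrayIcon,
# with an isolated Profiling-level log so a hang leaves evidence of its last
# completed step.
/usr/bin/duplicati-cli backup \
  "file:///Volumes/NAS/duplicati-test" \
  "$HOME/Software/" \
  --dbpath="$HOME/.config/Duplicati/TEST.sqlite" \
  --log-file=/tmp/duplicati-test.log \
  --log-level=Profiling
```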

Perhaps what you can do is to have the web UI make the simplest possible backup to an existing local folder, not let the web UI run it, and export it to run manually. If it works, try something more complex. And thank you. Sometimes going through the debug is what it takes to get to the answer. If you have several machines, you could of course pick one for this chase (as long as we’re up for it), and one for testing other backup options.

Having just made a pitch for debug, I’m also suggesting a new job because I don’t want to also be debugging whatever led to the repair. In test mode we have the luxury of trying to pick one issue at a time, e.g. the hang.

Let me see if @JonMikelV has anything to add, at least on OSX (which I don’t run, so can’t say much about).

Hello @hpeyerl, welcome to the forum - and thanks for being willing to debug!

I’ve got a Mac running Duplicati (since 2.0.2.something) and have only ever had issues with my local USB backup when I forgot to attach the drive - my SFTP process works fine.

I don’t recall seeing an export of your job settings, but one of your posts from 7 days ago had something about downloading over 1,000M. Do you have a custom --dblock (Upload volume) size?

The “log in use” error was likely due to trying to include the log file in the backup itself. Are you running as a daemon / root or just the logged in user?

The local test backup suggestion from @ts678 is good, as that can isolate Duplicati from possible network issues. There is a known bug where losing the connection while uploading a file causes stalling similar to what you’re seeing.

Oh, and yes - the --console-log-level setting does do something with the GUI if you use the GUI command-line option. Of course it’s still scrolling through the browser, so I feel it works best with a targeted console log filter level. Note that I believe all the console-log stuff came in AFTER 2.0.3.3 beta (but I might be misremembering).

You remembered correctly, but it’s confusing because it came in at 2.0.3.2 in the canary sequence. How-to.

This reminds me that there’s a chance 2.0.3.11 is better. If so, Beta users can maybe wait for the next Beta.

Checklist of items to be fixed is shrinking. I’m not sure if the repair issue is a regression, or just never great.

I woke up late today so didn’t get any debugging time. Just thought I’d drop a note to say I’ve seen the further instructions and will get to them tomorrow morning.

I’m not sure why we’re trying to find the backup name. I appreciate the information but I’m not sure what to do with it. I only ever have one backup configured.

The last backup I configured that I reported on above was a small 25MB folder and I backed it up to a locally mapped folder on my NAS.

I don’t see anything after the backup. No email reports (though that is configured). But yes, the backup appears to complete and then the backend hangs, so presumably there’s some internal thread that has gotten wedged and the main thread is blocked waiting for it. I’m only guessing at the internal structure of the code, but that’s what it smells like.

@JonMikelV: In the early debugging, I set --dblock to 1G based on something I read in the docs somewhere. For the more recent testing, I lowered it to 100M. And yes, I surmised that the error on duplicati.log was because it was open for write. I didn’t think that error was salient but included it for the sake of completeness. I’m just logged in as myself, and I launch Duplicati either from Finder or, for the sake of debugging, from the shell in /Applications/Duplicati.app/Contents/Resources with “./duplicati”.

I’ll hopefully be back with more results tomorrow morning.

Thanks.

I wondered whether email (or similar) was configured to run because the Duplicati web interface hangs on “Finished!”; that’s also why I’m suggesting any test backup be as simple as possible initially, before going to complexity.

The backup name only matters if it helps with debug. And I could have done the below analysis earlier. Sorry.
Discovering what’s going on (even without names) was the objective, because I didn’t know what you had set up, yet something seemed like it might have been getting in your way and making you wait. I wanted to know. This was also a chance to see if internal ID #s can successfully be mapped into job names using URL editing. The desire arises from time to time, but the current “Server State properties” web output isn’t human-friendly.

proposedSchedule : [{“Item1”:“1”,“Item2”:“2018-09-17T07:00:00Z”}] would be a scheduled job.
schedulerQueueIds : [{“Item1”:4,“Item2”:“1”} says the job is already in the queue to run (backlog).
activeTask : {“Item1”:3,“Item2”:null} seems to say it’s stuck on an internal task that’s not a backup. Unfortunately I haven’t yet found a way to turn task numbers into anything that would actually help debugging.
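To make those fields a bit easier to eyeball, here’s a minimal shell sketch (the values are copied from the Server state properties dumps above; using grep/cut rather than a JSON tool like jq is just an assumption about what’s installed):

```shell
# Values pasted from the "Server state properties" dumps above.
active_task='{"Item1":3,"Item2":null}'
scheduler_queue='[{"Item1":4,"Item2":"1"},{"Item1":6,"Item2":null}]'

# Item2 carries the backup ID; null marks an internal (non-backup) task.
echo "active task backup id:"
printf '%s' "$active_task" | grep -o '"Item2":[^,}]*' | cut -d: -f2

echo "queued backup ids (null = internal task):"
printf '%s' "$scheduler_queue" | grep -o '"Item2":[^,}]*' | cut -d: -f2
```

Here the stuck active task (3) has a null Item2, i.e. no backup ID attached, which matches the “internal task” reading above.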

–dblock-size advice can be found at Choosing sizes in Duplicati although I doubt it’s related to current hangs.

I only configured the email later as an experiment to see if it being unconfigured was the cause of the hanging. I’ll certainly unconfigure it tomorrow but I suspect it’s not the cause of the problem.

[Skip to the embarrassing conclusion by going down to “SOLVED”]

This morning I tried to re-run the backup to NAS folder that I’d done the other day but it complained the database was in need of repair. I tried to ‘repair’ and ‘delete and recreate’ a few times but kept getting errors so eventually I just blew away the ~/.config/Duplicati/ sqlite file corresponding to that backup. Then I exported the job to commandline and ran it in a terminal. It ran successfully a number of times to completion. So I tried ‘run now’ in the GUI and it ran to completion successfully a couple of times. This is something I’d never seen before so that was encouraging. I removed the ‘–send-mail-to=’ option when I did that so that was interesting but I was dubious that it was related.

So then I created a new backup but to my googledrive. Same options as the local-nas backup that was successful, only a different destination. I exported to commandline again and ran it in terminal and it ran successfully to completion:

mono /Applications/Duplicati.app/Contents/Resources/Duplicati.CommandLine.exe backup "googledrive://Backups/test3?authid=ce0f6fb86a37fe5d93e0333c81fc6a52%3Ap_N0m"'!'"7Us7s6Oz9kFnZaR" /Users/hpeyerl/Software/ --backup-name=googledrivetest --dbpath=/Users/hpeyerl/.config/Duplicati/LKPUZURPWS.sqlite --encryption-module=aes --compression-module=zip --dblock-size=100MB --passphrase=mumble --log-level=Information --log-file=duplicati.log --default-filters=OSX --disable-module=console-password-input
Backup started at 9/19/2018 6:22:40 AM
Checking remote backup …
Listing remote folder …
Scanning local files …
228 files need to be examined (25.56 MB)
0 files need to be examined (0 bytes)
removing temp files, as no data needs to be uploaded
Checking remote backup …
Listing remote folder …
removing file listed as Temporary: duplicati-bd49f09fe351e4f75b6beb11ad8d7d056.dblock.zip.aes
removing file listed as Temporary: duplicati-i729e9c4547dd4a39ad31d43e66370272.dindex.zip.aes
Verifying remote backup …
Remote backup verification completed
Downloading file (13.11 KB) …
Downloading file (29.56 KB) …
Downloading file (20.71 MB) …
Duration of backup: 00:00:08
Remote files: 3
Remote size: 20.76 MB
Total remote quota: 1.00 TB
Available remote quota: 353.27 MB
Files added: 0
Files deleted: 0
Files changed: 0
Data uploaded: 0 bytes
Data downloaded: 20.76 MB
Backup completed successfully!

Encouraged, I ran it again but added a --send-mail-to and the job did not complete.

mono /Applications/Duplicati.app/Contents/Resources/Duplicati.CommandLine.exe backup "googledrive://Backups/test3?authid=mumble" /Users/hpeyerl/Software/ --backup-name=googledrivetest --dbpath=/Users/hpeyerl/.config/Duplicati/LKPUZURPWS.sqlite --encryption-module=aes --compression-module=zip --dblock-size=100MB --passphrase=mumble --log-level=Information --log-file=duplicati.log --default-filters=OSX --disable-module=console-password-input --send-mail-to="hpeyerl+duplicati@beer.org"
Backup started at 9/19/2018 6:23:19 AM
Checking remote backup …
Listing remote folder …
Scanning local files …
228 files need to be examined (25.56 MB)
0 files need to be examined (0 bytes)
removing temp files, as no data needs to be uploaded
Checking remote backup …
Listing remote folder …
removing file listed as Temporary: duplicati-b036751bac0534fe9830cb979173bf7d4.dblock.zip.aes
removing file listed as Temporary: duplicati-ic523c59ec8194685944e0069a7ed8997.dindex.zip.aes
Verifying remote backup …
Remote backup verification completed
Downloading file (13.11 KB) …
Downloading file (29.56 KB) …
Downloading file (20.71 MB) …

hang for >10 minutes.

However, while in that state, I then went to the web interface and clicked ‘Run now’ on the same job (with no email configured) and it ran to completion, while the aforementioned command line job was hung. I was able to do that a few times, including CommandLine within the browser (not to be confused with the above command line ones which were done in a terminal session):

Backup started at 9/19/2018 6:37:26 AM
Checking remote backup …
Listing remote folder …
Scanning local files …
224 files need to be examined (15.77 MB)
0 files need to be examined (0 bytes)
removing temp files, as no data needs to be uploaded
Checking remote backup …
Listing remote folder …
removing file listed as Temporary: duplicati-bedc3ac53d1004f1c8178be5ca112a9ff.dblock.zip.aes
removing file listed as Temporary: duplicati-i45dea23f32a94a7dbdc068ea89b015a1.dindex.zip.aes
Verifying remote backup …
Remote backup verification completed
Downloading file (13.11 KB) …
Downloading file (29.56 KB) …
Downloading file (20.71 MB) …
Duration of backup: 00:00:08
Remote files: 3
Remote size: 20.76 MB
Total remote quota: 1.00 TB
Available remote quota: 353.27 MB
Files added: 0
Files deleted: 0
Files changed: 0
Data uploaded: 0 bytes
Data downloaded: 20.76 MB
Backup completed successfully!
Return code: 1

So this is new, because I’ve not been able to do that before. The only real change is the email, which is curious. For the sake of completeness, I re-ran the same backup again in the browser in CommandLine, with the --send-mail-to= option, and curiously it hung in the same way as the run that was hung in the terminal.

SOLVED
After this result, I dug into the mail thing and realized that I hadn’t set up the rest of the options. I do remember that when initially configuring Duplicati, I somehow turned on email status reports, and the default recipient address was some bogus default that I changed to my own. In the back of my mind I wondered how it would know which mail server to use, etc., but assumed it would just use my local mail client to do the sending… This morning I noticed there were additional options, including the SMTP server URL. After setting that, along with my mail server auth info, I was able to re-run the job; it completed successfully and I got an email.

Backup started at 9/19/2018 6:48:38 AM
Checking remote backup …
Listing remote folder …
Scanning local files …
228 files need to be examined (25.56 MB)
0 files need to be examined (0 bytes)
removing temp files, as no data needs to be uploaded
Checking remote backup …
Listing remote folder …
removing file listed as Temporary: duplicati-b1f9f896518c841a08447c8d5f6d95038.dblock.zip.aes
removing file listed as Temporary: duplicati-ib129573459c247b8b74362b0a689d6a3.dindex.zip.aes
Verifying remote backup …
Remote backup verification completed
Downloading file (13.11 KB) …
Downloading file (29.56 KB) …
Downloading file (20.71 MB) …
Email sent successfully using server: smtp://mumble:587/?starttls=when-available
Duration of backup: 00:00:09
Remote files: 3
Remote size: 20.76 MB
Total remote quota: 1.00 TB
Available remote quota: 353.27 MB
Files added: 0
Files deleted: 0
Files changed: 0
Data uploaded: 0 bytes
Data downloaded: 20.76 MB
Backup completed successfully!

We can see that the mail gets sent after the point where the other backups hung. I was able to reproduce the same hang in the web browser by incorrectly configuring the mail server.

That left me wondering how to reproduce what I originally saw when I installed Duplicati, because I don’t remember going to “advanced options” at the time; I thought “send mail” was already there, populated with a bogus address. So I killed the Duplicati process, blew away $HOME/.config/duplicati, and restarted Duplicati… I was in a clean configuration, but I was not able to find the pre-populated default mail recipient that I remembered from when I first installed Duplicati. I also browsed through GitHub looking for the change but didn’t find it.

In summary, I believe an incompletely configured email setup in the options was the entire cause of the problem. It would have been helpful if Duplicati had said something like “Emailing status report” prior to attempting to send the mail, but that’s about it. I guess it would also be useful to have some sanity checking in the email config and/or a timeout. The rest is my fault.
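Until that kind of sanity check exists, a one-liner before enabling reports can at least confirm the SMTP host answers at all (assuming nc/netcat is installed; mail.example.com:587 is a placeholder for your real server):

```shell
# -z: just probe the port without sending data; -w 5: give up after 5 seconds
# instead of hanging, which is exactly the failure mode the backup ran into.
if nc -z -w 5 mail.example.com 587; then
  echo reachable
else
  echo unreachable
fi
```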

Thanks for all of your help. I also now understand better how Duplicati is put together.

Just a brief update to close this thread. Duplicati is now working fine with my full 160GB backup to Google Drive and sending me automated mails every day.

Thanks for the help.

Completely agree! :smiley:

(And glad you figured it out!)