Still displays "Waiting for upload to finish" 12 hours after logging CommitAfterUpload

This is my first attempt to run Duplicati. I installed version 2.0.5.1_beta_2020-01-18 on Linux Mint 19.3 and configured mega.nz as the destination for a backup of approximately 29 GB in 9,000 files. It took a little more than 11 hours to upload all the data files. Now, 12 hours later, it’s still displaying the “Waiting for upload to finish” message, and no more files have been created in the mega.nz folder.

The last two entries in the live log have the same timestamp (to the nearest minute) as the last files created. The log entries are:

  • Starting - CommitAfterUpload
  • CommitAfterUpload took 0:00:00:00.060

That sounds to me as if it’s actually complete. But Duplicati doesn’t seem to know it.

I found this 2-year-old thread indicating that the cure for this quirk with mega.nz is to set --upload-verification-file=false. But I see that false is now the default setting, and I didn’t change it. So that isn’t the answer.

As far as I can tell there is still network traffic passing back and forth. But there are no new log entries or new files since CommitAfterUpload.

Edited to add: No, there isn’t any network traffic. I used a better, per-process monitor. Duplicati isn’t generating any traffic at all. It’s stuck.

I’d appreciate any suggestions. Is there any command I could give that might force it to recognize that it really is complete? Shall I give it another 12 hours? (I’m going to do that anyway.)

TIA.
AZSteve

Others have reported issues with mega.nz recently, such as in this thread.

Seems like mega.nz just isn’t that great with 3rd party applications. I might suggest a different back end, if possible.


Thanks for the reply! I eventually stopped the backup that it thought it was still running, rebooted the computer, and restarted Duplicati. Initially it went right back to the “Waiting for upload to finish” message, and I left it to sit overnight.

This morning I find that it completed a small incremental backup about 30 minutes later. Now it sees two backups in the mega.nz folder. I guess that’s a good result.

The log page for that backup job shows only one entry, for the second (incremental) backup. (I don’t know whether that’s normal, but it seems to me there should be an entry for the first backup too.) It completed with two warnings:

  1. [Warning-Duplicati.Library.Modules.Builtin.CheckMonoSSL-MissingCerts]: No certificates found…
  2. [Warning-Duplicati.Library.Main.Operation.Backup.FileBlockProcessor.FileEntry-PathProcessingFailed]: Failed to process path:…

Those don’t sound fatal. Sometime I might try restoring a couple of inessential files to see whether the backup is worth anything. In the meantime, I’m trying the same backup with Amazon S3. If that works better, I’ll consider ditching mega.nz.

Thanks again. I’ll leave this open for a bit to see if anyone has further insight to contribute.

I don’t have MEGA, but I’m wondering whether you’re on a paid plan or a free one from back when free accounts came with 50 GB.
Free plans have a transfer quota small enough that it could pause a backup of the size you tried.
The library that Duplicati uses appears to retry on its own, and its retry schedule was changed in a newer version.

Reached transfer limit but it isn’t lowering

Free accounts are provided with a transfer allowance (uploads plus downloads) that varies depending on our system utilization.

Change default retry values to complete before user’s demise #135

The default retry settings are exponential and take an exorbitant amount of time to finish. They should be changed to something more reasonable.

v1.8.0

gpailler released this 12 days ago
#135 / #143 - Changed default retry values and allow full customization of the retry delay
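
To put “an exorbitant amount of time” in perspective, here is some rough arithmetic with made-up numbers (I don’t know the library’s actual old defaults): an exponential schedule that starts at one minute and doubles on each retry quietly adds up to many hours, with nothing visible at the Duplicati level while it waits.

```
# Illustrative only: a hypothetical exponential retry schedule.
# The base delay and attempt count are made-up numbers, not
# MegaApiClient's actual defaults.
base_delay_s = 60                      # wait before the first retry
attempts = 10                          # number of retries
delays = [base_delay_s * 2 ** i for i in range(attempts)]

print([d // 60 for d in delays])       # 1, 2, 4, ... 512 minutes
print(f"total wait: {sum(delays) / 3600:.1f} hours")   # about 17 hours
```

A transfer-quota pause plus that kind of library-level retrying could look a lot like a backup that is simply stuck.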


Thanks for that info. I have a 400 GB paid plan that hasn’t come close to exceeding its storage or bandwidth limits.

Does this mean that Duplicati’s own --retry-delay option is likely to be passed through to the mega.nz library now that it supports customizing the retry delay? I’m guessing not, since the current Duplicati beta is several months old and the library change is more recent.

Maybe that should be a feature request for Duplicati?

Just posted in Features. Thanks again, @ts678.

I’d agree with your release-timing-based guess. It’s not even clear exactly what the MEGA problem is.

If you were watching with a very high-precision network monitor (e.g. one that counts individual bytes), you might be able to spot a library-level retry; a Wireshark trace, rather than just a counter, would more likely show any activity.

The default Duplicati --retry-delay of 10 seconds (fixed) isn’t quite right for services like Backblaze B2, which asks for an exponential backoff between retries when it returns the occasional 503 error. Those errors normally just mean a particular system is busy and wants you to go elsewhere, so they’re “normal”, and the MEGA behavior also sounds “normal” (if annoying, given the potentially very long time before things resume).
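
For contrast, here is a generic sketch (not Duplicati’s actual retry code; the names and numbers are made up) of the capped exponential backoff with jitter that B2’s 503 guidance asks for, as opposed to a fixed delay between attempts:

```
import random
import time

class TransientError(Exception):
    """Stands in for a 'busy, go elsewhere' response such as a 503."""

def upload_with_backoff(do_upload, max_attempts=8,
                        base_delay=1.0, max_delay=64.0):
    # Capped exponential backoff with jitter between retries,
    # instead of a fixed 10-second delay.
    for attempt in range(max_attempts):
        try:
            return do_upload()
        except TransientError:
            if attempt == max_attempts - 1:
                raise
            delay = min(max_delay, base_delay * 2 ** attempt)
            time.sleep(delay + random.uniform(0, delay / 2))
```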

You can easily see Duplicati-level retries in the live log at About → Show log → Live → Retry, which would probably be worth doing, if you want to help figure out what’s up with the original hanging issue.

This won’t show retries inside the library, though, because as far as I can tell all Duplicati does is make one upload call into the library and wait for it to return.
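
Conceptually the split looks something like this rough sketch (made-up names, not the actual Duplicati or MegaApiClient code): the backend Put is a single call into the library, and the only retries Duplicati can see or log are the ones wrapped around that call.

```
import time

def backend_put(client, remotename, stream):
    # One blocking call into the MEGA client library (hypothetical
    # method name). Any retrying the library does internally never
    # surfaces here, so the live Retry log stays empty.
    client.upload(stream, remotename)

def put_with_duplicati_retries(put_once, retries=5, delay=10):
    # The retries Duplicati itself logs: they only trigger if the
    # library call raises, and they wait a fixed delay between
    # attempts (the --retry-delay default of 10 seconds).
    for attempt in range(retries + 1):
        try:
            return put_once()
        except Exception:
            if attempt == retries:
                raise
            time.sleep(delay)
```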


After recreating the database I was able to run the MEGA backup again; it needed only minor incremental changes. I watched the live log and also monitored the network usage of the mono-sgen process with Nethogs. Because this was such a short run (9 minutes), it’s easy to see which events were logged from the time the “Waiting for upload to finish” message appeared until completion (about 6 minutes later).

I’ll paste the complete log below. “Waiting for upload to finish” appeared after the event at timestamp 13:21:43. I could see there was still significant upload activity from the process, and it continued until completion. I’m not sure why Duplicati treats the events from 13:21:43 onward as a different phase (i.e. “waiting for upload to finish”) than the earlier ones.

If anyone is interested in sorting it out, here’s the log.

{ "DeletedFiles": 0, "DeletedFolders": 0, "ModifiedFiles": 0, "ExaminedFiles": 8761, "OpenedFiles": 3, "AddedFiles": 3, "SizeOfModifiedFiles": 0, "SizeOfAddedFiles": 78194675, "SizeOfExaminedFiles": 28676855586, "SizeOfOpenedFiles": 78194675, "NotProcessedFiles": 0, "AddedFolders": 0, "TooLargeFiles": 0, "FilesWithError": 0, "ModifiedFolders": 0, "ModifiedSymlinks": 0, "AddedSymlinks": 0, "DeletedSymlinks": 0, "PartialBackup": false, "Dryrun": false, "MainOperation": "Backup", "CompactResults": null, "VacuumResults": null, "DeleteResults": { "DeletedSetsActualLength": 0, "DeletedSets": [], "Dryrun": false, "MainOperation": "Delete", "CompactResults": null, "ParsedResult": "Success", "Version": "2.0.5.1 (2.0.5.1_beta_2020-01-18)", "EndTime": "2020-04-19T20:29:53.707033Z", "BeginTime": "2020-04-19T20:29:52.698191Z", "Duration": "00:00:01.0088420", "MessagesActualLength": 0, "WarningsActualLength": 0, "ErrorsActualLength": 0, "Messages": null, "Warnings": null, "Errors": null, "BackendStatistics": { "RemoteCalls": 10, "BytesUploaded": 78904753, "BytesDownloaded": 53216887, "FilesUploaded": 5, "FilesDownloaded": 3, "FilesDeleted": 0, "FoldersCreated": 0, "RetryAttempts": 0, "UnknownFileSize": 0, "UnknownFileCount": 0, "KnownFileCount": 982, "KnownFileSize": 25555112222, "LastBackupDate": "2020-04-19T13:21:04-07:00", "BackupListCount": 3, "TotalQuotaSpace": 0, "FreeQuotaSpace": 0, "AssignedQuotaSpace": -1, "ReportedQuotaError": false, "ReportedQuotaWarning": false, "MainOperation": "Backup", "ParsedResult": "Success", "Version": "2.0.5.1 (2.0.5.1_beta_2020-01-18)", "EndTime": "0001-01-01T00:00:00", "BeginTime": "2020-04-19T20:21:04.467914Z", "Duration": "00:00:00", "MessagesActualLength": 0, "WarningsActualLength": 0, "ErrorsActualLength": 0, "Messages": null, "Warnings": null, "Errors": null } }, "RepairResults": null, "TestResults": { "MainOperation": "Test", "VerificationsActualLength": 3, "Verifications": [ { "Key": "duplicati-20200416T191502Z.dlist.zip.aes", "Value": [] }, { "Key": "duplicati-i91f77e3a476a4441b31382ece36dadbb.dindex.zip.aes", "Value": [] }, { "Key": "duplicati-b252e06d75cfb45e2a7097d7b30904840.dblock.zip.aes", "Value": [] } ], "ParsedResult": "Success", "Version": "2.0.5.1 (2.0.5.1_beta_2020-01-18)", "EndTime": "2020-04-19T20:30:18.405193Z", "BeginTime": "2020-04-19T20:29:58.123334Z", "Duration": "00:00:20.2818590", "MessagesActualLength": 0, "WarningsActualLength": 0, "ErrorsActualLength": 0, "Messages": null, "Warnings": null, "Errors": null, "BackendStatistics": { "RemoteCalls": 10, "BytesUploaded": 78904753, "BytesDownloaded": 53216887, "FilesUploaded": 5, "FilesDownloaded": 3, "FilesDeleted": 0, "FoldersCreated": 0, "RetryAttempts": 0, "UnknownFileSize": 0, "UnknownFileCount": 0, "KnownFileCount": 982, "KnownFileSize": 25555112222, "LastBackupDate": "2020-04-19T13:21:04-07:00", "BackupListCount": 3, "TotalQuotaSpace": 0, "FreeQuotaSpace": 0, "AssignedQuotaSpace": -1, "ReportedQuotaError": false, "ReportedQuotaWarning": false, "MainOperation": "Backup", "ParsedResult": "Success", "Version": "2.0.5.1 (2.0.5.1_beta_2020-01-18)", "EndTime": "0001-01-01T00:00:00", "BeginTime": "2020-04-19T20:21:04.467914Z", "Duration": "00:00:00", "MessagesActualLength": 0, "WarningsActualLength": 0, "ErrorsActualLength": 0, "Messages": null, "Warnings": null, "Errors": null } }, "ParsedResult": "Warning", "Version": "2.0.5.1 (2.0.5.1_beta_2020-01-18)", "EndTime": "2020-04-19T20:30:18.630867Z", "BeginTime": "2020-04-19T20:21:04.467909Z", "Duration": "00:09:14.1629580", 
"MessagesActualLength": 27, "WarningsActualLength": 1, "ErrorsActualLength": 0, "Messages": [ "2020-04-19 13:21:04 -07 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started", "2020-04-19 13:21:05 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started: ()", "2020-04-19 13:21:10 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed: (977 bytes)", "2020-04-19 13:21:20 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b87d64b6e297f49c8883bcc4eeb385bc4.dblock.zip.aes (49.97 MB)", "2020-04-19 13:21:24 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b57addedad254466ab39233f708a88c3f.dblock.zip.aes (24.46 MB)", "2020-04-19 13:21:25 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-20200419T202104Z.dlist.zip.aes (779.14 KB)", "2020-04-19 13:21:43 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-20200419T202104Z.dlist.zip.aes (779.14 KB)", "2020-04-19 13:24:17 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b57addedad254466ab39233f708a88c3f.dblock.zip.aes (24.46 MB)", "2020-04-19 13:24:17 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-i79568b5f0da74365aa057a6008e8ce60.dindex.zip.aes (21.09 KB)", "2020-04-19 13:24:20 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-i79568b5f0da74365aa057a6008e8ce60.dindex.zip.aes (21.09 KB)", "2020-04-19 13:29:51 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b87d64b6e297f49c8883bcc4eeb385bc4.dblock.zip.aes (49.97 MB)", "2020-04-19 13:29:51 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-i260d80b05e9547419fc1c93b176f0446.dindex.zip.aes (42.64 KB)", "2020-04-19 13:29:52 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-i260d80b05e9547419fc1c93b176f0446.dindex.zip.aes (42.64 KB)", "2020-04-19 13:29:52 -07 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-StartCheck]: Start checking if backups can be removed", "2020-04-19 13:29:52 -07 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-FramesAndIntervals]: Time frames and intervals pairs: 7.00:00:00 / 1.00:00:00, 28.00:00:00 / 7.00:00:00, 365.00:00:00 / 31.00:00:00", "2020-04-19 13:29:52 -07 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-BackupList]: Backups to consider: 4/18/2020 12:38:05 AM, 4/16/2020 12:15:02 PM", "2020-04-19 13:29:52 -07 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-BackupsToDelete]: Backups outside of all time frames and thus getting deleted: ", "2020-04-19 13:29:52 -07 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-AllBackupsToDelete]: All backups to delete: ", "2020-04-19 13:29:53 -07 - [Information-Duplicati.Library.Main.Operation.DeleteHandler-DeleteResults]: No remote filesets were deleted", "2020-04-19 13:29:53 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: 
List - Started: ()" ], "Warnings": [ "2020-04-19 13:21:13 -07 - [Warning-Duplicati.Library.Main.Operation.Backup.FileBlockProcessor.FileEntry-PathProcessingFailed]: Failed to process path: /home/steve/Documents/politics/ja/vigil 2019-07-12/MyKey.key" ], "Errors": [], "BackendStatistics": { "RemoteCalls": 10, "BytesUploaded": 78904753, "BytesDownloaded": 53216887, "FilesUploaded": 5, "FilesDownloaded": 3, "FilesDeleted": 0, "FoldersCreated": 0, "RetryAttempts": 0, "UnknownFileSize": 0, "UnknownFileCount": 0, "KnownFileCount": 982, "KnownFileSize": 25555112222, "LastBackupDate": "2020-04-19T13:21:04-07:00", "BackupListCount": 3, "TotalQuotaSpace": 0, "FreeQuotaSpace": 0, "AssignedQuotaSpace": -1, "ReportedQuotaError": false, "ReportedQuotaWarning": false, "MainOperation": "Backup", "ParsedResult": "Success", "Version": "2.0.5.1 (2.0.5.1_beta_2020-01-18)", "EndTime": "0001-01-01T00:00:00", "BeginTime": "2020-04-19T20:21:04.467914Z", "Duration": "00:00:00", "MessagesActualLength": 0, "WarningsActualLength": 0, "ErrorsActualLength": 0, "Messages": null, "Warnings": null, "Errors": null } }

That log is only relatively complete (compared to the others there), and it’s length-limited and organized by category.
You likely saw more in the live log, and for the tidiest log (though it needs the most setup) there are the --log-file and --log-file-log-level=retry advanced options. If you paste a chunk with ``` above and below it, it will look nicer. However, the log here was also from a success, right? We’re trying to work out where Duplicati is when (and if) it delays…

That is probably when it starts saying “Waiting for upload to finish”, and even in your log you can see it’s still uploading the final few dblock (file data) and dindex (index into the dblocks) files. It then does a directory list to see how the destination looks.
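
As a rough mental model (a sketch with made-up names, not Duplicati’s actual code): finished volumes are handed off to background uploaders, and once everything has been queued the job simply waits for the in-flight transfers to drain. On a slow destination, that waiting phase is exactly the tail end of the largest dblock uploads.

```
from concurrent.futures import ThreadPoolExecutor, wait

def run_backup(volumes, upload):
    # Rough model only: volumes are uploaded in the background, and
    # the "Waiting for upload to finish" status corresponds to the
    # final wait on whatever puts are still in flight.
    with ThreadPoolExecutor(max_workers=4) as pool:
        pending = [pool.submit(upload, vol) for vol in volumes]
        print("Waiting for upload to finish")
        wait(pending)        # the tail of the slowest dblock uploads
    # afterwards: list the remote folder, verify a sample, commit
```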

Right. I thought it might help to identify what Duplicati is trying to do during the “waiting…” interval. Even in this very short backup, that was about 6 minutes.

Well, I surrounded that one with [code] tags instead, which I believe amounts to the same thing. I think the reason there are no line breaks is that I pasted from a Linux system, which uses LF rather than CRLF. In the text editor I pasted the log from, it does have line breaks.

Possibly pasting into an editor and then copying back out would produce the right line endings? Manually changed:

"2020-04-19 13:21:05 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started: ()",
"2020-04-19 13:21:10 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed: (977 bytes)",
"2020-04-19 13:21:20 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b87d64b6e297f49c8883bcc4eeb385bc4.dblock.zip.aes (49.97 MB)",
"2020-04-19 13:21:24 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b57addedad254466ab39233f708a88c3f.dblock.zip.aes (24.46 MB)",
"2020-04-19 13:21:25 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-20200419T202104Z.dlist.zip.aes (779.14 KB)",
"2020-04-19 13:21:43 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-20200419T202104Z.dlist.zip.aes (779.14 KB)",
"2020-04-19 13:24:17 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b57addedad254466ab39233f708a88c3f.dblock.zip.aes (24.46 MB)",
"2020-04-19 13:24:17 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-i79568b5f0da74365aa057a6008e8ce60.dindex.zip.aes (21.09 KB)",
"2020-04-19 13:24:20 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-i79568b5f0da74365aa057a6008e8ce60.dindex.zip.aes (21.09 KB)",
"2020-04-19 13:29:51 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b87d64b6e297f49c8883bcc4eeb385bc4.dblock.zip.aes (49.97 MB)",
"2020-04-19 13:29:51 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-i260d80b05e9547419fc1c93b176f0446.dindex.zip.aes (42.64 KB)",
"2020-04-19 13:29:52 -07 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-i260d80b05e9547419fc1c93b176f0446.dindex.zip.aes (42.64 KB)",

So all this shows is the 5 uploads. I don’t know what your connection speed is, or how fast MEGA is.
Short delays could be caused by network issues (dropped packets, etc.). 12 hours is hard to explain.
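
As an aside: since the exported job result is JSON, pretty-printing it is quicker than fixing line endings by hand. A minimal sketch, assuming the result was saved to a file (the filename is just an example):

```
import json

with open("result.json") as f:                  # example filename
    print(json.dumps(json.load(f), indent=2))   # one field per line
```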

I have tried MEGA (only the free version), but it was significantly slower than any other service I tried (Google Drive, OneDrive, BackBlaze, JottaCloud…), so I gave up on using MEGA.

Try connecting to a free Google Drive account (15 GB) and check if there’s a difference.

Rickard