This is my first attempt to run Duplicati. I installed version 2.0.5.1_beta_2020-01-18 on Linux Mint 19.3 and configured mega.nz as the destination for a backup of approximately 29 GB in 9,000 files. It took a little more than 11 hours to upload all the data files. Now, 12 hours later, it’s still displaying the “Waiting for upload to finish” message, and no more files have been created in the mega.nz folder.
The last two entries in the live log have the same timestamp (to the nearest minute) as the last files created. The log entries are:
```
Starting - CommitAfterUpload
CommitAfterUpload took 0:00:00:00.060
```
That sounds to me as if it’s actually complete. But Duplicati doesn’t seem to know it.
I found this 2-year-old thread indicating that the cure for this quirk with mega.nz is to set --upload-verification-file=false. But I see that false is now the default setting, and I didn’t change it. So that isn’t the answer.
As far as I can tell there is still network traffic passing back and forth. But there are no new log entries or new files since CommitAfterUpload.
Edited to add: No, there isn’t any network traffic. I used a better, per-process monitor. Duplicati isn’t generating any traffic at all. It’s stuck.
I’d appreciate any suggestions. Is there any command I could give that might force it to recognize that it really is complete? Shall I give it another 12 hours? (I’m going to do that anyway.)
Thanks for the reply! I eventually stopped the backup it thought it was still doing, rebooted the computer, and restarted Duplicati. Initially it went right back to the “Waiting for upload to finish” message, and I left it to sit overnight.
This morning I find that it completed a small incremental backup about 30 minutes later. Now it sees two backups in the mega.nz folder. I guess that’s a good result.
The log page for that backup shows only one entry: the second, incremental backup. (I don’t know whether that’s normal, but it seems to me there should be one for the first too.) It completed with two warnings:

```
[Warning-Duplicati.Library.Modules.Builtin.CheckMonoSSL-MissingCerts]: No certificates found…
[Warning-Duplicati.Library.Main.Operation.Backup.FileBlockProcessor.FileEntry-PathProcessingFailed]: Failed to process path:…
```
Those don’t sound fatal. Sometime I might try restoring a couple of inessential files to see whether the backup is worth anything. In the meantime, I’m trying the same backup with Amazon S3. If that works better, I’ll consider ditching mega.nz.
Thanks again. I’ll leave this open for a bit to see if anyone has further insight to contribute.
I don’t have MEGA, but I’m wondering whether you’re on a paid plan or one of the free plans from back when they offered 50 GB.
Free plans have a transfer quota small enough that it could pause a backup of the size you tried.
The library that Duplicati uses appears to retry on its own, and its retry behavior was tweaked in a newer version.
Thanks for that info. I have a 400 GB paid plan that hasn’t come close to exceeding its storage or bandwidth limits.
Does this mean that Duplicati’s core --retry-delay option is likely to get passed to the mega.nz library that newly supports this capability? I’m guessing not, since the current Duplicati beta is several months old and the library change was more recent.
Maybe that should be a feature request for Duplicati?
I’d agree with your release-timing-based guess. It’s not even clear exactly what the MEGA problem is.
If you were watching with a very high-precision network monitor (e.g. one that counts bytes), you might spot a library-level retry; alternatively, a Wireshark trace (rather than a simple counter) would likely show the activity.
The default Duplicati --retry-delay of 10 seconds (fixed) isn’t quite right for services like Backblaze B2, which request an exponential backoff between retries when the occasional 503 error occurs. A 503 normally means a particular system is busy and wants you to go elsewhere. Those errors are “normal,” and the MEGA behavior also sounds “normal” (if annoying, given the potentially very long time before things restart).
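The fixed-versus-exponential distinction can be sketched in Python. Note that `base`, `cap`, and `jitter` below are illustrative parameters for the sketch, not Duplicati options:

```python
import random


def backoff_delays(base=1.0, cap=60.0, attempts=5, jitter=False):
    """Exponential backoff schedule: base * 2**n seconds, capped at `cap`.

    This is the pattern services like B2 ask for on 503s, as opposed
    to a fixed delay (Duplicati's --retry-delay defaults to 10s, the
    same wait for every attempt).
    """
    delays = []
    for n in range(attempts):
        d = min(cap, base * (2 ** n))
        if jitter:
            # Optional "full jitter": randomize to spread clients out.
            d = random.uniform(0, d)
        delays.append(d)
    return delays


print(backoff_delays())  # doubles each attempt: 1, 2, 4, 8, 16 seconds
```

With a fixed 10-second delay, five failed attempts take 50 seconds of waiting no matter how overloaded the service is; the exponential schedule backs off harder the longer the outage lasts, which is what these services want.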
You can easily see Duplicati-level retries in the live log at About --> Show log --> Live --> Retry, which would probably be worth doing, if you want to help figure out what’s up with the original hanging issue.
This won’t show retries inside the library, because I think all Duplicati is doing is the call shown below:
After recreating the database I was able to run the MEGA backup again, requiring only minor incremental changes. I watched the live log and also monitored (with Nethogs) the process running mono-sgen for network usage. In this instance of a very short run (9 minutes), it’s easy to see what events were logged from the time the “Waiting for upload to finish” message displayed until completion (about 6 minutes).
I’ll paste the complete log below. “Waiting for upload to finish” displayed after the event at timestamp 13:21:43. I could see there was still significant upload activity from the process, which continued until completion. I’m not sure why Duplicati considers the events beginning at 13:21:43 to be in a different category (i.e. “waiting for upload to finish”) than previous ones.
If anyone is interested in sorting it out, here’s the log.
It’s only relatively complete (compared to others there) and is length-limited and organized by category.
You likely saw more in the live log, and for the tidiest log (though needing the most setup) there are the --log-file and --log-file-log-level=retry advanced options. If you paste a chunk with ``` above and below it, it’d look nice. However, the log here was also a success, right? We’re trying to work out where it is when/if it delays…
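Once you have a --log-file at retry level, a quick filter pulls out just the retry events. This is a rough sketch: the substring "Retry" matches the log level name, but the exact line format here is an assumption, not Duplicati's actual output:

```python
def retry_lines(log_text):
    """Return lines that look like retry-level events.

    Matching on the bare substring 'Retry' is deliberately loose;
    adjust it once you've seen what your real log lines look like.
    """
    return [ln for ln in log_text.splitlines() if "Retry" in ln]


# Hypothetical log excerpt, purely to show the filter in action.
sample = (
    "2020-02-01 13:21:43 Information: Backend event: Put - Started\n"
    "2020-02-01 13:21:55 Retry: Operation Put attempt 2 of 5\n"
    "2020-02-01 13:22:10 Information: Backend event: Put - Completed\n"
)
print(retry_lines(sample))
```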
is probably when it says “Waiting for upload to finish”, and even in your log you can see it’s uploading final few dblock (file data) and dindex (index to dblock) files. It does a directory list to see how it looks.
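A toy model may clarify what that phase is. Assuming Duplicati queues compressed volumes to a pool of parallel uploaders (a reasonable guess from the log, not confirmed internals; all names below are illustrative), "Waiting for upload to finish" corresponds to the window after the last volume is queued but before the queue drains:

```python
import queue
import threading
import time


def run_backup(volumes, workers=4):
    """Toy uploader: the producer queues volumes quickly, worker
    threads drain the queue at network speed. The gap between the
    last put() and q.join() returning models the 'waiting' phase."""
    q = queue.Queue()
    uploaded = []
    lock = threading.Lock()

    def uploader():
        while True:
            vol = q.get()
            if vol is None:          # sentinel: no more work
                q.task_done()
                return
            time.sleep(0.01)         # stand-in for network transfer time
            with lock:
                uploaded.append(vol)
            q.task_done()

    threads = [threading.Thread(target=uploader) for _ in range(workers)]
    for t in threads:
        t.start()
    for vol in volumes:              # scanning/compressing finishes fast...
        q.put(vol)
    for _ in threads:
        q.put(None)
    q.join()                         # ...then we "wait for upload to finish"
    for t in threads:
        t.join()
    return uploaded


result = run_backup(["dblock-%d" % i for i in range(10)])
print(sorted(result))
```

In this model the UI message is just the producer sitting at `q.join()`; the hang you originally saw would be a worker stuck in a transfer that never completes or errors out.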
Right. I thought it might help to identify what Duplicati is trying to do during the “waiting…” interval. Even in this very short backup, that was about 6 minutes.
Well, I surrounded that one with [code] tags instead, which I believe is the same thing. I think the reason there are no line breaks is that I pasted from a Linux system, which uses LF rather than CRLF. In the text editor from which I pasted the log, it does have line breaks.
So all this shows is the 5 uploads. I don’t know what your connection speed is, or how fast MEGA is.
Short delays could be caused by network issues (dropped packets, etc.). 12 hours is hard to explain.