Sorry for being unable to follow up for a good while as I was on a visit to a parallel universe.
I'll try to answer as best I can now, as I've lost (or can't find) some of the files related to diagnosing this issue.
Right now, what I’ve heard is that “On a certain backup destination” List is “some times” slow. Need more info.
Well, one of my backup destinations is reached over a somewhat unstable VPN connection to a home DSL line on another continent. I know it's unstable because some pings occasionally time out. Aside from that, the connection isn't that bad: after the aforementioned delay in the List() command, the upload proceeds nicely at 300 KBps, which is close to my maximum upload speed, and can run for hours without a noticeable issue.
A yield return appears to be a "pull" mechanism, so a concern I had was whether Duplicati might not be iterating fast enough, causing some SSH operation to time out…
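For what it's worth, the "pull" semantics can be sketched in Python (a stand-in for illustration only, not Duplicati's actual C# code): with a generator, nothing is produced until the consumer asks, so a slow consumer really does delay production.

```python
# A minimal sketch of "pull" iteration. Names here are illustrative,
# not taken from Duplicati's source.
produced = []

def remote_list():
    """Stand-in for a backend List(): entries exist only once pulled."""
    for i in range(3):
        produced.append(i)      # marks the moment an entry is produced
        yield f"file-{i}"

gen = remote_list()
first = next(gen)               # the consumer pulls exactly one entry
assert produced == [0]          # ...so only one has been produced so far

rest = list(gen)                # draining the iterator drives the rest
assert produced == [0, 1, 2]
```

If the real List() worked this way and the consumer stalled, the producer (and any underlying network read) would simply sit idle between pulls, which could plausibly trip a server-side timeout.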
I believe Duplicati's processing is not slow at all; the system appears completely idle when the issue happens.
To be fair, this problem has occurred only rarely of late, but I upgraded to the most recent canary version as of this writing and noticed the issue happen once.
Other jobs run just fine, but their destinations are either local (on the same PC) or reached over SSH to a LAN machine, albeit over the same VPN mentioned above.
What does “debug output” mean?
I was just viewing the debug output from the web interface, yes, using the "Profiling" level.
I once logged it to a file and it was the same thing: no details at all. I've lost the file, but it just said that List() had started, and after that, that List() had completed. Nothing in between.
Ah! It happened again a little while ago, and luckily the newest version of Duplicati keeps the log details:
"Messages": [
"2019-01-19 07:01:11 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started",
"2019-01-19 07:01:48 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started: ()",
"2019-01-19 07:50:50 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed: (41.12 KB)",
"2019-01-19 07:50:51 - [Information-Duplicati.Library.Main.Operation.BackupHandler-SkipUsnForVolume]: Performing full scan for volume \"D:\\\"",
"2019-01-19 07:51:50 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b20f18645a4e84e20b8f78debe81ba9b3.dblock.zip.aes (200.03 KB)",
"2019-01-19 07:52:04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b20f18645a4e84e20b8f78debe81ba9b3.dblock.zip.aes (200.03 KB)",
"2019-01-19 07:52:04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-i74697cb8af004c05991830b6f1c1fc84.dindex.zip.aes (1.12 KB)",
"2019-01-19 07:52:05 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-i74697cb8af004c05991830b6f1c1fc84.dindex.zip.aes (1.12 KB)",
"2019-01-19 07:52:06 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-20190119T040054Z.dlist.zip.aes (12.71 MB)",
"2019-01-19 07:52:56 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-20190119T040054Z.dlist.zip.aes (12.71 MB)",
"2019-01-19 07:52:56 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-StartCheck]: Start checking if backups can be removed",
....
As you can see, about 12 MB were uploaded in roughly 50 seconds, so the connection was fine. I even tested the connection while List() was stalled, and it was working normally.
The job contains some 100,000 files.
Does your network monitor show steady traffic coming from the remote SSH server, or does its transfer finish early, then Duplicati processes results?
It's been a while and I seem to have lost (or just can't find) the screenshots and files related to the issue, but from what I remember, it was a small burst of transfer, then a long silence, then another small transfer, then silence, and so on. Duplicati didn't seem to be doing any processing during the silences (no CPU activity); it looked like it was simply waiting for data to arrive. And as I mentioned in an earlier comment: while the amount transferred is reported in kilobytes, it was actually a few megabytes. This is why I assumed the whole remote file list was being transferred multiple times.
Does the SSH server maintain any logs of its own that may directly show the number of lists, and their times?
I'm not sure I can find that info. I'm not that proficient with Linux.
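In case it helps anyone else reading: if the server happens to run OpenSSH with the internal-sftp subsystem (an assumption about the setup here), the subsystem has its own log level, and raising it makes each session and directory listing show up with timestamps:

```
# /etc/ssh/sshd_config -- assuming an OpenSSH server using internal-sftp.
# Raising the subsystem log level records opens/reads per session:
Subsystem sftp internal-sftp -l INFO

# After restarting sshd, entries typically land in the auth log, e.g.:
#   journalctl -u ssh | grep sftp        (systemd-based distros)
#   grep sftp /var/log/auth.log          (Debian/Ubuntu)
```

Counting the logged sessions around a stalled backup would directly show whether the list really ran more than once.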
Have you timed the directory list with any other sftp tool, such as psftp, WinSCP, or maybe even a Linux sftp?
Yes, I manually opened an SSH console to the remote machine and ran the ls
command just to see how fast it was. It was very fast, and I didn't notice any hiccup listing the 15k+ remote files.
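One caveat worth noting: ls in an interactive SSH shell runs through the shell, while (as far as I understand) Duplicati's SSH backend lists files via the SFTP protocol, which is a different code path on the server. Timing the same listing through sftp itself would isolate that path. A sketch, with a placeholder host and path to substitute:

```shell
# Hypothetical host and path -- substitute your own values.
HOST=user@remotehost
DIR=/home/user/backup

# Build an sftp batch script that lists the directory and exits.
printf 'ls -1 %s\nquit\n' "$DIR" > list.batch

# Time the listing over the actual SFTP subsystem. BatchMode avoids a
# password prompt and ConnectTimeout bounds the attempt; "|| true" only
# keeps the placeholder host from aborting the script.
time sftp -oBatchMode=yes -oConnectTimeout=10 -b list.batch "$HOST" || true
```

If this sftp listing is also fast while Duplicati's List() stalls, that would point the suspicion back at the client side rather than the server.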
Thanks again for your help and follow-up. I'd be happy to help further if I can.