B2 download very slow, and verify fails

I solved my other problem (about B2 upload) by only sending one file at a time.

Now I’m having trouble with downloads: when Duplicati tries to verify a file, that file downloads very slowly - a few kbps, with a burst to ~2mbps every second or two. After a few minutes of trying to download like this, Duplicati stops and retries. After a few retry attempts, it reports:

"ClassName":"System.IO.IOException","Message":"Received an unexpected EOF or 0 bytes from the transport stream."

I’ve tested using a desktop B2 client, and this happily downloads things at my line speed (approx 11mbps). So it appears to be a Duplicati issue.

I haven’t tried doing a restore, but I fear that the same might happen…

Any hints or advice? Happy to try stuff for troubleshooting.

Thanks

Have now tried a test restore, and the same happens.
Quite concerned now, as I effectively have no backups.

Did backup or restore work before (if so, how long, and what changed), or are you still getting started?

“One or more errors occurred” is (I assume) the B2 upload problem. With neither upload nor download working well, I suspect you have network issues, even though some other downloader (which one?) does OK.

Duplicati.CommandLine.BackendTool.exe would be the Duplicati pure downloader (or uploader) which could do a test without the extra encryption and compression loads that actual backup or restore have.
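For example, a raw download test (with no encryption or compression work involved) might look something like the lines below. This is only a sketch - the exact target URL is best copied from Export As Command-line, and the bucket, keys, and file name here are placeholders:

    :: List the remote files, then time a GET of one of them into the current directory
    Duplicati.CommandLine.BackendTool.exe LIST "b2://my-bucket/my-folder?auth-username=MY_B2_ACCOUNT_ID&auth-password=MY_B2_APPLICATION_KEY"
    Duplicati.CommandLine.BackendTool.exe GET "b2://my-bucket/my-folder?auth-username=MY_B2_ACCOUNT_ID&auth-password=MY_B2_APPLICATION_KEY" duplicati-20200101T000000Z.dlist.zip.aes

Timing the GET gives a raw backend download speed to compare against what backup verification sees.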

Have you looked at Task Manager to see if there’s much Disk or CPU loading? Keep your core count in mind: on a four-core system, for example, a single maxed-out core shows as only 25% loading under Performance → CPU.

In addition to testing Duplicati standalone download and upload speed, you could try the Speed Test at DSLReports, with Settings set to single stream to more realistically compare to Duplicati’s network use:

(screenshot: DSLReports Speed Test settings, single-stream option)

Are you WiFi connected? Sometimes a physical connection works better. Any other computers to test?

Duplicati.CommandLine.BackendTester.exe can do an automated test involving upload and download.
This (like BackendTool) requires a URL, similar to what Export As Command-line shows you.
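A sketch of a run, assuming the same placeholder bucket and keys as above. As far as I recall the tester creates and then deletes its own test files, so point it at an empty scratch folder rather than at the real backup destination:

    :: Automated upload/download/verify cycle against an empty scratch folder
    Duplicati.CommandLine.BackendTester.exe "b2://my-bucket/backendtester-scratch?auth-username=MY_B2_ACCOUNT_ID&auth-password=MY_B2_APPLICATION_KEY"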

Unless there’s a CPU or Disk limitation visible in Task Manager, this might turn into network debugging.
What’s your level of system performance or network troubleshooting skill, so I can try to adjust for that?

Thanks for the detailed reply. I’ll try to respond to as many points as I can.

  • It did work before, but I’m not certain whether it has worked since I upgraded to 2.0.5.1. Also, until last week I hadn’t tested since moving into my current apartment, i.e. on my current internet connection.
  • The other downloader is Filezilla Pro, which includes a Backblaze B2 client. That will download from B2 at ~11mbps.
  • Duplicati.CommandLine.BackendTester.exe is a useful tool - thanks for the pointer. When using it, there is no major CPU or disk loading, and it produces the same errors (although not consistently). I’ll post a couple of its outputs at the bottom.
  • The DSLreports tester, set to a single stream, shows about 11-12mbps download and ~800kbps upload.
  • I’ve tested wifi and ethernet, with similar results.
  • No other computers at present (though that may change next week). The machine is a laptop, but given the present lockdown I can’t take it to a different network! I do have a different router / modem that I could try. Though it seems odd that a low-level network problem would affect Duplicati and not anything else?

Re “what’s your level of system performance or network troubleshooting skill”, that’s tough to answer. I guess “high, but not guru”? :wink:

Thanks again - I appreciate your spending the time.

Oh, wow, I am kicking myself. And am somewhat embarrassed.

The reason Duplicati was getting terrible network speeds, and other clients weren’t? I had throttling turned on. I had used the “click to set throttle options” icon in the web UI, with settings as shown in the screenshot. The choice to limit upload to 75KByte/s was deliberate, to avoid saturating my rather small upload bandwidth. There was no throttling set for download, so it still seems odd that download was so slow - this is beyond what the upload limit should cause via TCP return traffic. But, OK.

With throttling turned off, from some (so far brief) testing Duplicati seems to work fine again - although my internet connection becomes unusable for anything else while it’s uploading :wink:

This suggests to me that there is a bug somewhere, in that using the throttling options appears to break things rather than just slow them down - but from my immediate perspective, I have working backups again.

Sorry for wasting your time!

(screenshot: web UI throttle settings, with upload limited to 75KByte/s)

This leads to another question: based on the way things were going wrong with throttling enabled, I may have quite a few corrupted uploads. Is there a command that will verify the whole backup? I realise that this will entail downloading it all, so it’ll have to wait for the start of a new month’s data allowance… (unless there’s any way to check it, eg via hash, without retrieving it?)

--throttle-download ignored, --throttle-upload throttles download too #4115 is fixed, but not in Beta.
Canary is generally too unknown for backups that are important, and is for people willing to test…
Yours is a somewhat special need, but there’s a chance your router can do your throttling instead.

v2.0.5.104-2.0.5.104_canary_2020-03-25

Improved logic around throttle values, thanks @seantempleton

The TEST command with all can verify that everything remote looks as expected, via downloads (unfortunately B2 might charge, unlike for uploads). You can most easily run that in Commandline.
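In the GUI Commandline you would change the Command to test and set the argument to all; run from a terminal instead, it would look roughly like this (use the URL, database path, and passphrase exactly as exported for your job - the values below are placeholders):

    :: Download and verify every remote volume against the local database
    Duplicati.CommandLine.exe test "b2://my-bucket/my-folder?auth-username=MY_B2_ACCOUNT_ID&auth-password=MY_B2_APPLICATION_KEY" all ^
      --dbpath="C:\path\to\job-database.sqlite" --passphrase="my-backup-passphrase"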

A sample restore is always a good idea, even normally. Check the --no-local-blocks Advanced option.
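A rough sketch of such a test restore from the command line, with placeholder paths; --no-local-blocks forces the data to actually come down from B2 instead of being rebuilt from matching blocks in local files:

    :: Restore one file into a scratch folder, downloading its blocks from the backend
    Duplicati.CommandLine.exe restore "b2://my-bucket/my-folder?auth-username=MY_B2_ACCOUNT_ID&auth-password=MY_B2_APPLICATION_KEY" "C:\Data\some-file.docx" ^
      --restore-path="C:\RestoreTest" --no-local-blocks=true ^
      --dbpath="C:\path\to\job-database.sqlite" --passphrase="my-backup-passphrase"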

md5 & sha1 inclusion in verification json #2189 is on the wish list. I can’t think of another fast way.
--upload-verification-file could help if you had direct B2 access, but you don’t. There is the default Verifying backend files after a backup, but it’s small unless raised, e.g. by --backup-test-samples. Sampling is balanced, so if nothing is throwing complaints at you, there’s a chance it’s mostly OK.

B2 pricing (I don’t know if your ISP is also an issue) makes it cheaper to start over, but that loses previous versions that you might care about. There is a folder-listing check for every backup, so any missing files should be seen. I’m not certain whether there’s a size check at that point, but there is one after uploads.

Assuming B2 didn’t drop anything that was once there, and assuming no content corruption that isn’t visible from name and size checks, you might try DB Browser for SQLite to open the job’s local database read-only. Looking at the Remotevolume table and seeing lots of entries in the Verified state might give you some fast assurance.
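If it helps, a query like the one below in DB Browser’s Execute SQL tab summarizes the states in one shot - this assumes the Remotevolume table and State/Size column names as I remember them from the local database schema:

    -- Count remote volumes by state; a healthy backup shows mostly Verified
    SELECT State, COUNT(*) AS volumes, SUM(Size) AS total_bytes
    FROM Remotevolume
    GROUP BY State;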

EDIT: and the caveat on that is that remote files looking as Duplicati thinks they should doesn’t prove that Duplicati’s idea is right. There are still occasionally bugs being found, especially in error handling.

Thanks for all that. You’re right that uploading everything again is cheaper than downloading it again, with B2, but it would also take about 10x longer! And downloading everything in one day would only cost me ~$0.60 :wink: