I tried that, but didn’t see any speed there…
Upload a large file (e.g. 1GB or 5GB) via the web interface and manually time the upload.
For me, my home PC has been uploading to B2 from Duplicati at a rate of around 5-7 GB/hr (on a 50 Mbit FIOS connection, with a reported upload speed to Backblaze of about 25 Mbit).
Damn, that was slow. I uploaded a 200MB file, because 500MB was the max upload size from the web interface. That took a bit over 3 minutes, which gives a speed of around 8 Mbit/s. So I guess it’s not Duplicati that is slow, but Backblaze.
Maybe Duplicati should show real-time upload (backing up) and download (restoring) speeds somewhere?
( …and the throttle settings should be shown as Mbps, not MB/s. Ugh, no one uses MB/s for internet speeds! )
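For anyone converting between the two units: a byte is 8 bits, so a number in MB/s is 8× the same number in Mbit/s. A trivial sketch (hypothetical helper, not part of Duplicati):

```python
def mb_per_s_to_mbps(mb_per_s: float) -> float:
    """Convert megabytes/second to megabits/second (1 byte = 8 bits)."""
    return mb_per_s * 8

# A 3 MB/s throttle uses 24 Mbit/s of the line:
print(mb_per_s_to_mbps(3.0))  # 24.0
```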
I made a ticket with Backblaze and got this answer:
This is due to the latency on the connection.
To upload more quickly you will want to use software that supports multiple threads to upload the data.
That way you can use multiple threads to bolster the upload throughput on the connection.
The 3rd-party applications that you can utilize are here: B2 Cloud Storage - Integrations
Not all integrations will have threading.
The Backblaze Team
Benchmarking different storage providers
So is Duplicati using multiple threads? Or is that something that could be implemented? =)
No, it is using a single upload connection.
Yes, it should be a matter of setting a block size and then going.
But wouldn’t setting the block size make Duplicati upload with more threads? I have a block size of 50MB now, but it just uploads one 50MB block at a time. It doesn’t work with multiple blocks?
I should have been more careful with the word “block”.
Duplicati does not support parallel uploads of files because it is hard to handle failures (an earlier upload fails, but a later one succeeds).
What I meant was that the uploads can be “chunked” where multiple chunks for the same file are uploaded in parallel (as suggested by B2). This requires a change in code.
I don’t really understand the difference here. What’s the difference between uploading files in parallel and splitting a file into multiple chunks?
Uploading multiple chunks should be the same speed as a single large file, unless there is a bottleneck in the B2 service. I can’t see how threading this would make it faster.
With chunking, all (or some) chunks can be uploaded in parallel.
For example, a normal upload would be:

```
put file1
put file2
...
```

For a chunked upload, it would be:

```
put chunk1 | put chunk2 | ...   (file1)
put chunk1 | put chunk2 | ...   (file2)
```

So you have multiple connections to the server.
This would only increase upload speed if the connection is not saturated.
If the user has a 1 Mbps upload connection, Backblaze and Duplicati should be uploading at 1 Mbps.
Uploading using multi-threading will reduce that to 500 kbps per file for 2 files, and so on.
Unless there is a time delay between one upload finishing and the next file starting, I can’t see how multi-threading will increase throughput.
Is there any negotiation on the line before and after the upload? If so, then multi-threading will increase overall throughput.
True! As I understand the problem, the B2 servers limit the upload speed (either intentionally, or due to load), and parallel uploads are meant to spread this out, as you will be hitting different upload servers on their side.
Yes, you need to start and finalize the upload operation.
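To illustrate why hiding that start/finalize time matters, here is a toy model (made-up numbers and a hypothetical function; it crudely assumes the per-upload overhead overlaps evenly across parallel connections):

```python
def effective_mbps(link_mbps, file_mb, overhead_s, connections=1):
    """Toy model: effective throughput when each upload pays `overhead_s`
    of non-transfer (negotiation) time, amortized across parallel connections."""
    transfer_s = (file_mb * 8) / link_mbps              # time the data is on the wire
    per_file_s = transfer_s + overhead_s / connections  # overhead partially hidden
    return (file_mb * 8) / per_file_s

# 25 Mbit/s link, 50 MB volumes, 4 s of negotiation per upload:
print(effective_mbps(25, 50, 4))     # 20.0  (one connection)
print(effective_mbps(25, 50, 4, 4))  # ~23.5 (overhead mostly hidden)
```

Even without any server-side throttling, the single connection loses bandwidth to the per-file negotiation; parallel uploads keep the link busy during those gaps.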
It’s just Backblaze that is slow from Europe I think, I changed to Azure Blob and got much better upload speed.
No, I believe it’s throttled per connection. I’m in Australia and was using CloudBerry Backup until very recently. On one thread I would get between 1.8 and 3.2 Mbit/s. When I set up 10+ threads I could easily max out at 40 Mbit/s.
With Duplicati I see the same ‘one connection’ speed. If parallel uploads can be implemented (at least for B2) I’ll be moving everything over to Duplicati!
I have the same issue with Backblaze.
Is it planned to implement multi-threaded upload?
There is a bounty for this over at GitHub, however I don’t know that it’s actively being worked on yet:
I have just added an additional $10 to that bounty.
I hope we will get this feature quickly.