It’s a question (IMO) of priorities. There is far more that could be done than can be done.
Sometimes one might work off of demand in the forum and in GitHub Issues. Or it might get
personal, as in the “Every good work of software starts by scratching a developer’s personal itch” idea.
As a free effort from volunteers, progress depends on them, and there are very few lately.
I’m here constantly asking for volunteers for code, test, docs, support, etc. It takes a lot…
On chronology, Support parallel uploads for a backend [$381] #2026 was opened in 2016.
Implement parallel uploads to a single backend #3684 was its answer, arriving in 2019.
I’m not sure if there’s an equivalent issue open for downloads. If so, please link from here.
You can read through that to view commentary from some developers not available today.
The original author (who has had other priorities lately; volunteers are permitted that) posted:

> Wow, large refactoring and update.
The point there is that major evolution takes a fair amount of skill and time.
If you find something that does a good job of keeping your 8 Gb pipe full, please post back.
I’m certainly not familiar with all products, especially ones suited to your apparent use case.
For another case in point of how developers contribute things without centralized planning:
Implementing the feature to restore any version of a single file discusses a wish from 2014,
where you can see how someone began work on it (perhaps a personal interest?), had to pause, and came back.
Display versions for a file and allow any version to be restored #4805 is in a queue awaiting
review by one of the scarce developers with permission to review it and bring it into the program.
From what I can see, this is sometimes addressed with a local backup. Also keep a remote one
that will almost certainly transfer more slowly (speed of light…) but is a beneficial safeguard.
Some people call that a “hybrid” backup. Another popular concept is “3-2-1” (three copies, on two different media, one copy off site).
Backblaze has a nice feature for those with slow connections: they’ll fast-ship an 8 TB drive.
The target market for that likely doesn’t have a fast connection, even if the software could use one…
They make some good blog posts, such as:
Server Backup 101: On-premises vs. Cloud-only vs. Hybrid Backup Strategies
You might still need to choose backup software carefully for performance that fits your needs.
Or design around limits. For example, a fast download to local might help speed up a restore.
Or you could have a cloud sync for the latest copy, and let historical restores use something slower.
EDIT:
rclone is good at concurrent file transfers (its `--transfers` option sets how many files move in parallel), and it supports lots of providers, including Azure Blob.
If you happen to be a developer, Azure “looks” like it can parallelize within a single file transfer
(block blobs can be uploaded as separately staged blocks, then committed as a list); see the sketch below.
Some other providers permit such things too, but each would need its own support in Duplicati.
Duplicati’s upload concurrency is at the file level, and almost every provider can manage that…
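To make the Azure part concrete, here is a minimal sketch (not Duplicati code) of a within-file parallel upload using Microsoft’s azure-storage-blob Python SDK. The connection string, container name, and file names are placeholders I made up:

```python
# Minimal sketch: parallel upload of ONE file to Azure Blob Storage.
# Requires: pip install azure-storage-blob
from azure.storage.blob import BlobClient

blob = BlobClient.from_connection_string(
    conn_str="<your-connection-string>",   # placeholder
    container_name="backups",              # assumed container name
    blob_name="volume-0001.zip.aes",       # assumed remote file name
)

with open("volume-0001.zip.aes", "rb") as data:
    # For data larger than the SDK's single-shot threshold, upload_blob()
    # splits the stream into blocks, stages up to max_concurrency of them
    # in parallel, then commits the block list -- i.e. parallelism within
    # a single file transfer rather than across files.
    blob.upload_blob(data, overwrite=True, max_concurrency=8)
```

Under the hood that is Azure’s Put Block / Put Block List pattern. Each provider exposes (or doesn’t expose) something like it in its own way, which is part of why each one would need its own support in Duplicati.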