Maximize Upload Speed Dropbox

Configuration tips for large (200TB) SME cloud backup has my personal thinking on backups that big, especially when lots of files are present. I don’t know if that’s the case for you, but it seems likely. The usual advice of increasing blocksize helps keep the total block count below the generally recommended 1 million when files are large; however, each file also has a small metadata block, so lots of files mean lots of blocks regardless. That leads to slow running from block overload, e.g. in SQL operations and other areas that scale with block count.
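To make the block-count point concrete, here’s a rough sketch of the arithmetic. The helper function, the 100 KB default blocksize, the 5 MB alternative, and the "one metadata block per file" approximation are my assumptions for illustration, not exact Duplicati internals:

```python
# Rough block-count estimate: data blocks plus (approximately)
# one small metadata block per file. Illustrative numbers only.

def estimate_blocks(total_bytes, file_count, blocksize):
    """Return an approximate total block count for a backup."""
    data_blocks = total_bytes // blocksize
    metadata_blocks = file_count  # assumption: roughly one per file
    return data_blocks + metadata_blocks

TB = 10**12

# 30 TB spread over 1 million files, at a 100 KB blocksize
# versus a much larger 5 MB blocksize:
small = estimate_blocks(30 * TB, 1_000_000, 100 * 1024)
large = estimate_blocks(30 * TB, 1_000_000, 5 * 1024 * 1024)
print(f"{small:,} blocks at 100 KB blocksize")
print(f"{large:,} blocks at 5 MB blocksize")
```

Even with the big blocksize, the file count alone keeps you above the 1 million mark here, which is why lots of small files hurt no matter how you tune blocksize.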

That’s not an immediate hit to upload speed, but you can’t expect performance to scale up linearly as the backup grows. It drops sharply.

Restore speed can be far worse. Duplicati does parallel uploads but single-threaded restores. Single-threaded network operations are limited by things like the speed of light (latency), whereas parallelizing can potentially help. You might want to reconsider whether you can tolerate a long restore delay; the other topic involved a business, which would probably be losing revenue while its systems were out.

If you measure 10 Gbit down, that’s typically with a whole lot of parallel connections, and a single connection is far slower. The cloud provider probably couldn’t sustain that rate on one connection either.

I was getting about 80 Mbit/s from OneDrive in a recent test on a 1000 Mbit link (also limited by WiFi). At that rate, 30 TB at 10 Mbyte/second would take 3 million seconds (about 35 days), assuming no other limits. A faster link might not help, as 10 Mbyte/second may already approach the maximum sustained write rate of a hard drive (if one is used).
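The arithmetic above is easy to redo for other sizes and rates. A minimal sketch (the function name and the 10 Mbyte/s figure are just the assumptions from my test):

```python
# Back-of-envelope transfer time at a steady measured rate.
# ~80 Mbit/s observed, i.e. roughly 10 Mbyte/s.

TB = 10**12

def transfer_time(total_bytes, bytes_per_second):
    """Return (seconds, days) to move total_bytes at a steady rate."""
    seconds = total_bytes / bytes_per_second
    return seconds, seconds / 86400  # 86400 seconds per day

secs, days = transfer_time(30 * TB, 10 * 10**6)
print(f"{secs:.0f} seconds (~{days:.0f} days)")  # 3000000 seconds (~35 days)
```

Plugging in 200 TB instead of 30 TB makes the planning problem obvious: the time scales linearly with data size at a fixed rate.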

IMO something at this scale needs a lot of planning and testing, including a test of disaster recovery time. Although 30 TB is not the 200 TB case, it’s still a lot of data, and I’m not sure you can tune your way to a big speed increase.