Really Low Backup Speeds

I am experiencing VERY slow backup speeds. I am backing up to the cloud and am seeing speeds that never surpass 1.5 MB/sec. When backing up hundreds of GB of data, you can imagine how slow this is.

A speed test on my system shows upload speeds over 10 MB/sec. I have no throttle set on my system, nor in the Duplicati software.

Is there any reason why I am seeing such a low speed?

Thanks,

Can you post some details of your backup job?

Was that converting bits to bytes? The top 5 “internet speed test” sites Google found for me all use Mb/sec, where lowercase b is a bit. Uppercase B is a byte (8 bits). Duplicati reports bytes. 1.5 MB/sec is 12 Mb/sec which would be a little surprising because Duplicati usually can’t quite match a multi-threaded speed test…
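
To make the math concrete, here's a tiny Python sketch of the conversion, using the numbers from this thread:

```python
# Speed tests usually report megabits per second (Mb/s); Duplicati reports bytes (MB/s).
# 1 byte = 8 bits, so the same rate looks 8x smaller when expressed in MB/s.
duplicati_MB_per_s = 1.5
print(duplicati_MB_per_s * 8)   # 12.0 Mbit/s -- what a speed test would call this rate

speedtest_result = 10           # if the test actually reported 10 Mbit/s (not MB/s)...
print(speedtest_result / 8)     # ...that's only 1.25 MB/s, close to what Duplicati shows
```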


Well, it's been running for 22 hours and has backed up only 11 GB worth of my data in that time. It still has 117 GB to go. I'm getting 380 KB/s right now.

Not sure what you mean. Is there a config file or something I can post for you?

Your long-term stats show about 1 Mbit/sec and your short-term about 3 Mbit/sec, neither of which is 10 Mbit/sec (which is usually an unrealistic best-case measurement using multiple threads, often to a nearby server).
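
Here's the rough math behind those figures, from the numbers you posted (decimal units, and the 22-hour average includes time spent on things other than uploading, so it's only approximate):

```python
# Long-term average: 11 GB uploaded in 22 hours
gb_done, hours = 11, 22
long_term_mbit = gb_done * 1000 * 8 / (hours * 3600)   # GB -> megabits, hours -> seconds
print(round(long_term_mbit, 2))    # ~1.11 Mbit/sec

# Short-term: the 380 KB/s you are seeing right now
short_term_mbit = 380 * 8 / 1000
print(round(short_term_mbit, 2))   # ~3.04 Mbit/sec
```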

The next step is for you to search for a speed limiter, which might be disk, RAM, CPU, network, etc. Perhaps something will be found; perhaps nothing specific will. Specific reasons require data…
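
If you'd rather measure than guess, here's a minimal Python sketch (assuming the third-party psutil package is installed) that samples CPU, RAM, and disk while the backup runs:

```python
import psutil  # third-party: pip install psutil

# Sample overall resource usage a few times while the backup is running.
# A consistently pegged CPU or saturated disk points at a local bottleneck.
for _ in range(5):
    cpu = psutil.cpu_percent(interval=1)       # % over the last second
    mem = psutil.virtual_memory().percent      # % of RAM in use
    disk = psutil.disk_io_counters()           # cumulative read/write bytes
    print(f"CPU {cpu:5.1f}%  RAM {mem:5.1f}%  "
          f"disk read {disk.read_bytes}  write {disk.write_bytes}")
```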

What OS is this? Windows has some tools, Linux tools vary, and I’m not at all familiar with macOS tools.

What cloud are you going to, and how far away is the datacenter? Latency can be important to the speed.

You can also try to see what very-short-term network speeds look like via Windows Task Manager, e.g. in the graph for your Ethernet or Wi-Fi adapter. The goal is a fairly smooth upload at maximum line rate. Large gaps or extremely erratic upload rates (assuming nothing else on the system is uploading) may be bad.
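
If Task Manager isn't handy (or you're not on Windows), a rough Python equivalent, again assuming psutil is installed, is to print the upload rate once per second and look for gaps or wild swings:

```python
import time
import psutil  # third-party: pip install psutil

# Print upload rate once per second (Ctrl+C to stop). A healthy backup should
# show a fairly steady number near your line rate, not long runs of near-zero.
prev = psutil.net_io_counters().bytes_sent
while True:
    time.sleep(1)
    now = psutil.net_io_counters().bytes_sent
    print(f"upload: {(now - prev) / 1024:8.1f} KB/s")
    prev = now
```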

Logging can provide quite a lot of information, but the tradeoff is big output. One compromise would be a live log view at About --> Show log --> Live --> Retry, which should show files going up, and maybe retries. The best (very long) log is at Profiling level, but it's probably too much to read live rather than in a log file.

The backup log also reports a “BackendStatistics” “RetryAttempts” number, but that only arrives after the backup finishes. Generally the initial backup (is this that?) is best done in increments, and that also eases debugging of issues.
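
If you export that backup log as JSON, something like this sketch could pull the retry count out (the exact nesting is an assumption here, so it just walks the whole structure; backup_log.json is a hypothetical file name):

```python
import json

def find_key(obj, key):
    """Recursively search a parsed JSON structure for the first occurrence of key."""
    if isinstance(obj, dict):
        if key in obj:
            return obj[key]
        for v in obj.values():
            found = find_key(v, key)
            if found is not None:
                return found
    elif isinstance(obj, list):
        for v in obj:
            found = find_key(v, key)
            if found is not None:
                return found
    return None

with open("backup_log.json") as f:   # hypothetical export of the backup log
    data = json.load(f)
stats = find_key(data, "BackendStatistics") or {}
print("RetryAttempts:", stats.get("RetryAttempts"))
```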

Has the backend been ruled out? For example, in my own testing I've seen extremely slow transfers to Mega. Perhaps test another backend.

Another backend might even be local files, which will probably be pretty similar in CPU and memory usage but of course completely exempt from network issues. At the moment we have no information to go on.
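
As a very crude local baseline (not a substitute for an actual Duplicati run to a local folder, since a real backup also compresses, hashes, and encrypts), you could time a plain disk write in Python to see roughly what the disk sustains:

```python
import os
import time
import tempfile

# Write 256 MB of zeros to a temp file and time it.
size_mb = 256
chunk = b"\0" * (1024 * 1024)
start = time.time()
with tempfile.NamedTemporaryFile(delete=False) as f:
    for _ in range(size_mb):
        f.write(chunk)
    f.flush()
    os.fsync(f.fileno())     # make sure the data actually hit the disk
elapsed = time.time() - start
print(f"{size_mb / elapsed:.1f} MB/s to local disk")
os.unlink(f.name)            # clean up the temp file
```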