My goodness - slow backup!

Hello,

I am trying to use Duplicati for a very basic backup scenario: 1TB of data (~200K files) from my Mac mini over SFTP to a remote site (over the internet, 40Mb/s upload) where I have a disk connected to a Raspberry Pi.

Now the problem is: this is SLOW.

It started off pretty OK, but slowed down considerably within a few hours, and now it has been running for daysssss at an average speed of <10Kb/s. The pattern, looking at the network monitor on the Mac here, is very low to no traffic for long stretches, then a small spike of a few MB/s, then another long pause.

I have seen others complain about this, but no solution or analysis.

Oh yes before you ask: the Mac mini is not really doing anything else, and neither is the Raspberry Pi. Raw SFTP to the Pi works flawlessly and speedy. Duplicati uses less than 10% of the CPU time on the Mac mini.
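(A raw-SFTP check like that can be as simple as timing a throwaway transfer; the host and paths below are placeholders:)

```
# Quick raw-SFTP throughput check, outside Duplicati entirely.
# Host and paths are placeholders; adjust to your setup.
dd if=/dev/urandom of=/tmp/testfile.bin bs=1m count=100  # 100 MB test file (macOS dd syntax)
time scp /tmp/testfile.bin pi@remote.example.com:/mnt/backupdisk/
```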

Anyone?
This is kinda frustrating :wink:

I guess the reason is compression. So try switching it off via Options: set zip-compression-level to 0, and if you don't need encryption, set no-encryption to true.
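For reference, those map onto the zip-compression-level and no-encryption advanced options. A sketch of the same thing as a CLI backup; the URL and source path are made up:

```
# Sketch only: a CLI backup with compression and encryption disabled.
# URL and source path are placeholders; on macOS, Duplicati runs under mono.
mono Duplicati.CommandLine.exe backup \
  "ssh://pi@remote.example.com//mnt/backupdisk/duplicati" \
  /Users/me/data \
  --zip-compression-level=0 \
  --no-encryption=true
```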

@renato3
I’d first try looking at About → Show log → Live → set Level to Explicit-only

@avmaksimov
Compression runs on 2 cores by default. It doesn’t seem likely that 2 cores of an M1, said to be a very fast architecture, can’t compress faster than 10Kb(ytes?) per second.

Welcome to the forum @renato3

The symptoms here are kind of mixed. Gaps in uploads can occur if preparation falls behind, because volumes must fill to a size limit before upload (default 50 MB, but Options can set Remote volume size higher).
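(If you script backups, Remote volume size corresponds to the dblock-size option; the value below is just an example:)

```
# Example only: raise the remote volume size from the 50MB default.
# Append to a backup command line, e.g. one from Export As Command-line.
--dblock-size=200MB
```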

The long duration and low system load suggest it’s not a preparation limit, but let’s isolate the issue some.

You could look in your temporary folder at files starting with dup-. How many 50 MB files are queued?
You can also look at About → Show log → Retry to watch Duplicati trying to parallel-upload those files.
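A quick way to check that temp-folder queue on macOS, assuming the default temporary location (adjust if you set tempdir):

```
# List the queued Duplicati temp volumes with sizes, then count them.
# $TMPDIR is the per-user temp folder on macOS; /tmp is the fallback.
ls -lh "${TMPDIR:-/tmp}/"dup-* 2>/dev/null
ls "${TMPDIR:-/tmp}/"dup-* 2>/dev/null | wc -l
```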

For how long? Does that look like 50 MB? Again, About → Show log → Retry shows the uploads.
Log files at profiling level can give an exact per-file upload speed, but it’s a bit cumbersome to set up.
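(For completeness, the two options involved; the log path is a placeholder:)

```
# Write a Profiling-level log to a file; the timestamps around each
# volume upload give an exact per-file speed. Path is a placeholder.
--log-file=/Users/me/duplicati-profiling.log
--log-file-log-level=Profiling
```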

How remote is the site? Duplicati uses SSH.NET for SFTP transfers, and it has a known issue that can make it slow on high-latency links.
Improve SFTP performance on medium/high latency connections #866
SFTP issue with high bandwidth high latency connections is the discussion from the Duplicati forum.

Maybe try the same transfer outside a backup: use Export As Command-line to get the URL, modify it to point at an empty folder, and run Duplicati.CommandLine.BackendTool.exe upload. Or try Duplicati.CommandLine.BackendTester.exe.
Commandline in the GUI can run these, or you can use whatever native command line you’re familiar with.
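A sketch of both, assuming the PUT verb for BackendTool (check its help output) and a URL pointing at an empty folder; everything below is a placeholder:

```
# Stress the backend alone with generated test data (run from the
# Duplicati install folder; macOS runs the .exe files under mono).
mono Duplicati.CommandLine.BackendTester.exe \
  "ssh://pi@remote.example.com//mnt/backupdisk/empty-test"

# Or time one upload of a known-size file with BackendTool.
mono Duplicati.CommandLine.BackendTool.exe PUT \
  "ssh://pi@remote.example.com//mnt/backupdisk/empty-test" \
  /tmp/testfile.bin
```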

I know it’s a little late but this may help someone else.

Have you tried moving the tempdir in settings to an external drive? If your tempdir defaults to a directory on the Raspberry Pi’s internal SD card then it will easily bottleneck the backup.
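(tempdir is an advanced option; the mount point below is made up:)

```
# Move Duplicati's scratch space off the SD card to an external drive.
# The mount point is a placeholder for wherever your drive is mounted.
--tempdir=/mnt/external/duplicati-tmp
```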

Welcome to the forum @digit

Good point about a bottleneck for some cases, but I think the case here was on the Mac mini.

Raspberry Pi 3 micro SD card speed and a more lengthy update below talk about this.

Raspberry Pi microSD card performance comparison - 2019 talks about speed limiters.

Slow random access is apparently a limitation of SD cards as well as mechanical HDDs.

Unlike the destination, which gets sequential writes, the tempdir probably sees more random access.

Thanks for the note!