Tips for diagnosing and speeding up large backups

I’ve been running several backup jobs on a few computers. I had one large backup job that never completed, and have since split it into smaller jobs, but one remains at ~500 GB. When I started the job I got between 1 and 2 MB/s according to the Duplicati interface, but after a short while it slowed to under 1 MB/s. In previous attempts at this large task I have seen it drop as low as 500 KB/s.

I’ve increased the dblock (remote volume) size to 150 MB and the block size to 5 MB. I’m on macOS, backing up to OneDrive. The computer is older, with only 2 cores and 8 GB of RAM, and the source files are on an external hard drive.
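For context, this is roughly what the equivalent command-line invocation would look like with those settings, if I’ve read the docs right; the destination URL and source path below are placeholders, not my real ones:

```bash
# Rough CLI equivalent of my job settings (URL and paths are placeholders).
# --dblock-size sets the remote volume size; --blocksize the dedupe block size.
duplicati-cli backup \
  "onedrivev2://Backups/MyJob?authid=PLACEHOLDER" \
  "/Volumes/ExternalHD/Data/" \
  --dblock-size=150MB \
  --blocksize=5MB
```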

I’m trying to find out where the slowdowns are so that I can speed up these large tasks (instead of having them take days and then stall out). I know the incremental backups will be much faster later. I have been using the macOS Activity Monitor to check CPU and RAM usage; neither seems fully utilized. Are there any settings I should modify, or other diagnostics I could run to find the bottleneck and speed up the process?
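Besides Activity Monitor, I also tried watching the external drive’s throughput from the terminal, something like the following (disk2 is just what diskutil reports for my drive; yours may differ):

```bash
# List attached disks to find the external drive's identifier (e.g. disk2).
diskutil list

# Report that disk's throughput every 5 seconds while Duplicati runs.
# If KB/t and MB/s stay low the whole time, the USB drive itself may be
# the bottleneck rather than CPU, RAM, or the upload.
iostat -d -w 5 disk2
```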

Hello and welcome to the forum!

I notice you are backing up data that is on an external (USB?) hard drive. I know from experience with other backup products that having multiple concurrent readers on a single USB drive can hurt performance. I don’t have experience with Duplicati backing up from a USB drive myself, but perhaps you can adjust the concurrency (threading) options and see if it makes a difference.

Try setting `--concurrency-max-threads` to 1. You might also set `--concurrency-block-hashers` and `--concurrency-compressors` to 1, as in the sketch below.
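If you run the job from the command line (or add these under the job’s advanced options in the UI), it would look something like this; the destination URL and source path are placeholders, and the values are just a starting point to experiment with:

```bash
# Sketch: force single-threaded processing so only one reader touches the
# USB drive at a time. URL and source path below are placeholders.
duplicati-cli backup \
  "onedrivev2://Backups/MyJob?authid=PLACEHOLDER" \
  "/Volumes/ExternalHD/Data/" \
  --concurrency-max-threads=1 \
  --concurrency-block-hashers=1 \
  --concurrency-compressors=1
```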

I could be off base here, but thought I’d give you an idea to try since no one else has responded yet.

Good luck!
