Speed increases with a new backup job

This is more of an FYI, but I just wanted to share my experience.

TL;DR - If you upgraded from an older version and notice longer backup times, stop the automatic runs of the old backup job and create a new job with the same source, pointing to a new folder at the same destination. Speeds will increase quite a bit.

I have been using Duplicati for at least a year now, starting with 2.0.3-something. With the first 2.0.4-something canary update, I had to change some settings and recreate some of the backup jobs. Since then I have been through at least 4 or 5 major version updates, and the backups themselves appeared to be working normally.

Fast-forward to this week. I thought backup performance was pretty well settled: three of my main backups, ranging from 200-800GB each, averaged over an hour per daily run (1h:10m to 1h:45m). These read from various network drives (the source is enterprise SSDs over a gigabit network connection) and store to local RAID drives that average 200MB/s read/write speeds. Then I noticed that one of the larger backups (700GB) was not running; it had been stuck on “reading files” for several days, which also blocked every other backup during that time. I checked permissions, file access, temp files, etc., and nothing on the remote drive could be the cause, so it had to be something with the backup itself. Recreating the database did not help.

(Using example names here.) For testing, I stopped the failing job (drive1 bkup) and created a new backup (drive1 bkup2) to a new destination folder (backup2 drive1) on the same destination drive (\lannas\storebkups\BD2). I was amazed at how much faster it is now: the new initial read and backup took only a few hours, versus several days for the initial backup of this 700GB drive under prior versions of Duplicati. Prior versions would take 24+ hours for a fresh backup over 200GB, averaging 10GB per hour (for my specific setup); the new jobs run MUCH faster, and subsequent backups that previously took over an hour now finish in 20-30 minutes.

As always, those backing up to online/internet locations will see MUCH slower speeds due to routing and network limits, but hopefully it will still be faster than in previous versions. I tend to suggest backing up locally or on your local network, and then using a separate process to copy/xcopy/robocopy/rsync the backup files to your online storage once a week, since that tends to be faster than the built-in Duplicati backup process, which has to deal with a lot of network latency while verifying everything.

So after seeing that, I am going through the rest of my backup jobs and starting new scheduled backups; the ones I have redone so far run quite a bit faster.


This. I’m also doing this for all the larger backup sets; for small ones, it’s not worth it. Of course, the copy to off-site storage is done immediately after the Duplicati backup finishes.

Additionally, do a full restore test from the remote location periodically. Without that, you’ll probably end up with a backup that can’t be restored; that’s the real hazard with Duplicati in general. Always remember to do full restore tests. The built-in verification alone seems almost pointless.
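A restore test like that can be scripted with Duplicati's command-line client. This is only a hedged sketch: the storage URL, restore path, and passphrase are placeholders, and the exact option names should be checked against your installed version's help output:

```shell
# Sketch of a periodic restore test (hypothetical paths/URL).
# Restore everything from the off-site copy into a scratch folder,
# then compare it against the live source.
duplicati-cli restore "ssh://offsite.example.com/backups/drive1-bkup2" "*" \
    --restore-path=/tmp/restore-test \
    --passphrase="your-passphrase"

# Spot-check the restored tree against the original source.
diff -r /tmp/restore-test /mnt/source/drive1 | head
```

Restoring from the *remote* copy, not the local one, is the point: it proves the off-site files are actually usable end to end.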