Taking a long time for 400GB

@Dan, what OS are you running on?

200GB in 9 hours is incredibly slow; that works out to roughly 6.3MB/s, or about 50Mbit/s. I can’t think of any reason why it’d be going that slow. What kind of CPU do you have?
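As a quick sanity check of that estimate (a sketch, treating 200GB as 200×1024MB):

```shell
# Back-of-the-envelope throughput for 200GB transferred in 9 hours.
awk 'BEGIN {
  mb = 200 * 1024                 # 200GB expressed in MB (binary)
  secs = 9 * 3600                 # 9 hours in seconds
  printf "%.1f MB/s (%.0f Mbit/s)\n", mb / secs, mb / secs * 8
}'
# Prints: 6.3 MB/s (51 Mbit/s)
```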

The closest I had to test with was setting up a local backup to an internal 5900RPM SATA disk. 200GB (~18K files) completed an initial backup in a little under 3 hours with mostly default settings. While it was backing up it was maxing out a single core, and using between 100-150MB of RAM.

Of the three non-default things I’m doing, only the first one is likely to have much impact on performance:

  • Tell Windows Defender (or whatever AV you have) to ignore the Duplicati.Server.exe service
  • Run as a service (using SYSTEM account)
  • Set up VSS
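For the first bullet, one way to add the exclusion on Windows 10 is via PowerShell (a sketch; run it elevated, and the path below assumes the default install location):

```shell
# PowerShell sketch: exclude the Duplicati server process from Defender's
# real-time scanning. Adjust the path if Duplicati is installed elsewhere.
Add-MpPreference -ExclusionProcess "C:\Program Files\Duplicati 2\Duplicati.Server.exe"
```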

I don’t think Duplicati is the fastest (I’m very interested in performance after multi-threading gets added), but it is adequately quick, and I haven’t experienced it being infuriatingly slow either. And I second @JonMikelV: I’d trade a good bit of speed for the refreshingly low memory utilization compared to the gigabytes of RAM CrashPlan would chew up.

Same here. :slight_smile:

I’m also curious why some people seem to be getting such slow speeds. I know there are lots of external factors that can reduce speed (good catch on AV) but it would be great if we could identify some of the more common ones.

My initial 200GiB backup to a NAS also took 9 hours, and the second run (with 6GiB of new data) took 22 minutes. There is strong disk activity in the temp folder and 50% CPU load on my i5-6200.

Info on my system and OS:

  • I’m on Windows 10 Pro
  • External HDD is 7200RPM connected via eSATA
  • CPU is Core2Quad Q8300
  • 8GB of RAM

How do I run this as a service? I wasn’t able to find it in my Services list (well, nothing with Duplicati in it, anyway).

And lastly, what is VSS?

This How-To explains how to migrate Duplicati from a user-based to a service-based setup:

This video shows how to register the service after a fresh install:
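In short, the registration itself is a single command run from an elevated prompt in the Duplicati install folder (a sketch; the service then runs under the SYSTEM account):

```shell
# Run from an elevated command prompt in the Duplicati install directory.
Duplicati.WindowsService.exe install
# Start it right away instead of waiting for a reboot
# (the registered service name may vary by build):
net start Duplicati
```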

VSS is a technology that enables backup software to make a copy of files that are in use by another process. More information can be found here:
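As a sketch of how that looks on the command line (the storage URL and source path are placeholders; VSS requires elevated rights or a service install):

```shell
# --snapshot-policy=on makes Duplicati read in-use files through a VSS
# snapshot; "required" would abort the backup instead if no snapshot can be made.
Duplicati.CommandLine.exe backup "file://I:\Backups" "C:\Users\Dan\Documents" --snapshot-policy=on
```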

What volume size are you using, out of curiosity? I’ve talked about this elsewhere – for a local drive backup, I found that the default 50MB volume size is way too small, as Duplicati ends up spending most of its time preparing and compressing the data blocks, instead of spending its time utilizing the upload / local drive bandwidth.

For mine I settled on 2GB volume sizes (NOTE: again, this is for LOCAL backups only; you should not do this for B2), because it ends up with quite a bit less clutter and less file-processing overhead as far as I can tell. I’m not sure what official benchmarks would show, though; I’ll have to try it soon. I’d also suggest trying out 500MB and/or 1GB volume sizes if 2GB sounds too big - even the smallest of these means 10x fewer dblock files than the default size.
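The volume size being discussed here is Duplicati’s --dblock-size option (“Remote volume size” in the UI, default 50MB). A command-line sketch with placeholder paths:

```shell
# Larger remote volumes for a LOCAL destination; don't do this for B2 etc.
Duplicati.CommandLine.exe backup "file://I:\Backups" "F:\Media" --dblock-size=2GB
```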

Yeah, I used the default, which is 50MB. I can retry using a bigger volume size along with the suggestions you provided earlier.


@Dan, what version of Duplicati are you running? @kenkendk recently switched Duplicati to a faster hashing mechanism in the latest Canary builds.

I’m running on

I will attempt to test post-service install and bigger blocks. Will report back.

FWIW, I did a test with a bigger volume size (500MB) and was seeing high writes to disk, sometimes peaking at 100MB/s. However, it fluctuates quite a bit, coming back down below 20MB/s. Previously, I think it was pretty constant at 2xMB/s.

After a short while, I nuked my external HDD, started VSS (it was set to manual startup), and installed Duplicati as a service. I seem to be getting the same performance as I had by simply increasing the volume size.

@drakar2007 @sanderson The transfer is still going… 7+ hours.

Check out this Task Manager screenshot:

Notice C:’s utilization constantly at the top, while I:, my backup drive, is only being written to sporadically.


Got this shot of the Task Manager just now:
C: 100% I: barely anything

Lots of moving parts to keep track of…

There is a performance measurement here:

It suggests setting --zip-compression-level=1 and --no-encryption to get speedups.
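A sketch of a run with both flags applied (paths are placeholders; note that --no-encryption stores the backup unencrypted):

```shell
# Trade compression ratio and encryption for CPU time.
Duplicati.CommandLine.exe backup "file://I:\Backups" "F:\Media" --zip-compression-level=1 --no-encryption=true
```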

The disk usage could be related to the SQLite transactions being written, combined with the temporary files being created at the same time.

@Dan, if you have a second local drive you could point your temp folder there and see if the high disk usage follows it.

It may not be a “fix”, but it could help narrow down the specific cause of your slowness.

I tried the --zip-compression-level=1 setting and did notice more frequent writes to my external HDD. However, we’re still talking 10+ hours, and it still had 100GB+ to go. I do want encryption in there, though.

Unfortunately, I don’t have a spare drive. The other partition (i.e. F:) contains the folder that I’m going to be backing up (movies, pictures). Not sure if it’s wise to put the temp folder on that drive.

Theoretically, if you have “standard Windows exclusions” turned on, then Duplicati will ignore its own temp files even if they’re in a folder being backed up.

As for the wisdom of it, I don’t know if you’d want to leave it there. :slight_smile:

If you have time to test it, we are considering using in-memory storage when building the volumes.

You can set up a RAM disk and then set it as the --tempfolder, which should reveal whether the disk is really the issue:
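For example, with the free ImDisk driver installed, something like this should work (a sketch; the size, drive letter, and paths are placeholders):

```shell
# Create a 2GB RAM disk mounted as R: and format it NTFS (ImDisk syntax).
imdisk -a -s 2G -m R: -p "/fs:ntfs /q /y"
# Point Duplicati's temporary files at the RAM disk.
Duplicati.CommandLine.exe backup "file://I:\Backups" "F:\Media" --tempfolder=R:\
```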

If this shows significant speedup, I will prioritize the changes to get “in-memory” volume support.


I created a 2GB RAM drive and set the temporary files to be written to it. Encryption was also turned off.

I see less activity on C:, and less idle time on my external HDD (J: in this case). Here’s a screenshot of Task Manager showing this.

So, I started it up 3 hours ago, and almost 100GB of the 370GB total is done. That would peg the entire backup at around 10 hours or so, which would be an improvement, right?

Edit: How would I see the RAM drive’s performance? It’s not showing in Task Manager (in this case, R:).

Well… based on your initial estimate of over 10 hours, I would say less than 10 hours is an improvement.

I have not tried it, but there is an article here:

Forgot to update… I let it run overnight and came back to it in the morning with 30GB left. No idea why it didn’t complete sooner based on my initial calculations.