Explain default size for an Upload Volume

I understand from the manual that the default size (50MB) for an Upload Volume is fine for most users. But if there is a setting for the Upload Volume size, there must be cases when it is better to increase or decrease it. So my questions are:

  1. When and why (please explain why that happens) should the default size for an Upload Volume be increased? What are the pros and cons of increasing it?
  2. When and why (please explain why that happens) should the default size for an Upload Volume be decreased? What are the pros and cons of decreasing it?

Hi Vasili, a lot of the pros/cons of increasing/decreasing the volume size are detailed here: Choosing Sizes in Duplicati - Duplicati 2 User's Manual

The only time I’d say reducing the volume size from 50MB is a good idea would be for very tiny backup jobs with lots of very small files, where the total size is guaranteed to never exceed a certain threshold (1GB maybe), and/or when restore operations will be pretty frequent (especially if download bandwidth costs extra).

In that case, the larger number of small volumes would allow specific files to be restored without needing to download as much data from the storage location.
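To make that concrete, here’s a rough sketch of a single-file restore with the command-line client (the b2:// URL, bucket name, and file paths are just placeholders, not from this thread). With smaller volumes, Duplicati only has to fetch the few dblock volumes that actually contain that file’s blocks:

    # Restore a single file from a B2 backup (placeholder destination and paths)
    Duplicati.CommandLine.exe restore "b2://my-bucket/backup" "C:\Users\vasili\Documents\report.docx" --restore-path="C:\Restore"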

In most other cases I actually set mine higher (there’s a command-line sketch after the list below).

  • For my B2 backups I set it to 200-250MB. I don’t go much higher than that since B2 only allows so much downloading per day, and the Duplicati verification process downloads a chunk to verify after each backup operation completes.
  • For backups to my local USB drive, mostly of very large (movie) files, I set it to 2GB per block, since download bandwidth is both fast and free.
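In the job settings this is the upload volume size; on the command line it corresponds to the --dblock-size option. A rough sketch of the two jobs above (the destinations and source paths are placeholders, and B2 credentials are omitted):

    # B2 job with 250MB upload volumes (placeholder bucket and source)
    Duplicati.CommandLine.exe backup "b2://my-bucket/backup" "D:\Data" --dblock-size=250MB

    # Local USB drive job for large media files, with 2GB volumes
    Duplicati.CommandLine.exe backup "file://E:\DuplicatiBackup" "D:\Movies" --dblock-size=2GB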

For my B2 backups I set it to 200-250MB. I don’t go much higher than that since B2 only allows so much downloading per day, and the Duplicati verification process downloads a chunk to verify after each backup operation completes.

If the size is increased, the verification process will need to download more data, right?

Yes - during the verification phase (under default settings), Duplicati chooses and downloads 1 previously-uploaded volume to perform verification on that volume, as a sort of “spot check”. That means every backup cycle would use 200MB (for example) of download bandwidth.

Yes - during the verification phase (under default settings), Duplicati chooses and downloads 1 previously-uploaded volume to perform verification on that volume

During the verification phase, does Duplicati always download only one volume, or can it download several (if the backup is big)?

If I remember correctly, these are things that can be changed in advanced settings (and per backup job). So for example Duplicati can be told to verify X chunks per backup job instead of the default of 1.

@drakar2007 is correct - using the --backup-test-samples parameter you can adjust the number of filesets downloaded to something other than the default of 1. (Setting it to 0 will cause NO testing to happen.)

--backup-test-samples
After a backup is completed, some files are selected for verification on the remote backend. Use this option to change how many. If this value is set to 0 or the option --no-backend-verification is set, no remote files are verified
Default value: “1”
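For example, if I’ve got the syntax right, here’s a backup job that verifies three samples after each run (placeholder destination and source), and the same job with verification switched off entirely:

    # Verify 3 remote filesets after each backup run instead of 1
    Duplicati.CommandLine.exe backup "b2://my-bucket/backup" "D:\Data" --backup-test-samples=3

    # Skip the post-backup verification entirely
    Duplicati.CommandLine.exe backup "b2://my-bucket/backup" "D:\Data" --no-backend-verification=true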


Note that this tests a fileSET, so leaving it at the default of 1 means it will actually download and test 3 files - a dindex, dlist, and dblock file. The dblock is by far the largest (that’s your “Upload volume size”), with dindex and dlist files usually being under 1MB each.

It has been noted that with larger backups / smaller dblock (Upload volume) sizes / less frequent backup runs, it’s possible to be (by default) testing a single fileset per run even if you’re adding MORE than one per backup run. This means that, while useful to have, Duplicati may never “catch up” and actually test EVERY uploaded file.
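If you think a backup has fallen behind, you can also run a verification pass by hand with the test command; if I recall correctly it accepts either a sample count or “all” (which can mean downloading the entire backup, so mind your bandwidth). The destination is a placeholder:

    # Spot-check 10 remote filesets
    Duplicati.CommandLine.exe test "b2://my-bucket/backup" 10

    # Verify every remote file - potentially downloads the whole backup
    Duplicati.CommandLine.exe test "b2://my-bucket/backup" all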

It’s been suggested that the parameter should be able to take a percentage instead of a hard integer. If you feel like chiming in on that, feel free to check out the topic at Should test / verify commands & --backup-test-samples support percentage parameter?.


I assume you mean dblock (Upload volume size) not ACTUAL block (as in file chunk) size… :wink:

LOL, yup. I seem to forget to type “dBlock” like half the time.

Hey guys,

I am new to Duplicati and am currently about to test its performance (using B2 cloud as a target).

Now I have a backup set that has over 1,300,000 files, and the complete size of the set is about 140GB.
I am using the default 50MB “upload volume size” for this backup set, and I have not changed any advanced settings. This backup set contains mainly small (text) files, but it has a few larger files as well (2GB max).

I would like to run the (incremental) backups of this set every hour (which I can easily do with my current solution, another backup program).
However, the hourly backups of this set are taking something like 3-11 hours.

So can I speed things up by changing the upload volume size or dblock size?
And what would you guys recommend for this kind of set?

The upload itself is not taking much time, but the “verifying backend data” and “completing backup” phases seem to take a HUGE amount of time.

Any ideas to speed things up?

I have split up my backup jobs into sets that contain files that change more frequently (jobs for “Pics from 2018”, Documents); those are scheduled daily, but hourly would be possible as well. Slowly changing files (Music, Movies, all pics including 2018) I back up only once a week or even once a month. This has had a huge impact on the duration of the backups.

In addition, for backups that I run only every week or month, I select a larger volume size (e.g. 500MB).
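As a rough sketch of that split (the job destinations, source paths and sizes are just examples), the frequently-run job keeps the default-ish volume size while the slow-changing media job uses much larger volumes:

    # Daily/hourly job for small, frequently changing files - default 50MB volumes
    Duplicati.CommandLine.exe backup "b2://my-bucket/documents" "D:\Documents" --dblock-size=50MB

    # Weekly/monthly job for large, rarely changing media files - bigger volumes
    Duplicati.CommandLine.exe backup "b2://my-bucket/media" "D:\Music" "D:\Movies" --dblock-size=500MB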