Hi all. New Duplicati user here. Awesome software, many thanks for that!
I’ve been reading about the ‘Remote volume size’ setting: you can increase or decrease it, and that can have a positive (or negative) impact on the speed of the backup and restore process. To me it feels like a trial-and-error thing per use case, though. There don’t seem to be specific numbers to abide by, other than the default 50MB.
From the manual:
‘If you have large datasets and a stable connection, it is usually desirable to use a larger volume size. Using volume sizes larger than 1GB usually gives slowdowns as the files are slower to process, but there is no limit applied.’
So. Larger than 50MB. But maybe smaller than 1GB…
So now I’m wondering if someone here has a backup situation similar to mine, and has maybe already figured out the “ultimate setting” for ‘Remote volume size’, for optimal speed during backup and restore.
I have a folder here with 1.5TB of files. The average file size is about 600MB.
Everything needs to be backed up to an off-site SFTP server.
My download and upload speeds are both 200Mbit.
The off-site location has 120Mbit down and 12Mbit up.
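For what it’s worth, here’s the rough back-of-envelope math I did (a quick Python sketch; it assumes the line speeds are the effective throughput and that compression/deduplication don’t shrink the 1.5TB much, which may well be off in practice):

```python
# Rough numbers for my situation (assumptions: link speeds = effective
# throughput, and the full ~1.5 TB ends up on the remote).

DATASET_BYTES = 1.5e12          # ~1.5 TB of source data
BACKUP_MBIT = min(200, 120)     # my upstream vs. the off-site downstream
RESTORE_MBIT = min(12, 200)     # the off-site upstream vs. my downstream

def transfer_hours(size_bytes, mbit_per_s):
    """Hours needed to move size_bytes at the given link speed."""
    return size_bytes * 8 / (mbit_per_s * 1e6) / 3600

print(f"Initial backup: ~{transfer_hours(DATASET_BYTES, BACKUP_MBIT):.0f} h")
print(f"Full restore:   ~{transfer_hours(DATASET_BYTES, RESTORE_MBIT):.0f} h")

# Number of remote volumes (dblock files) at a few candidate sizes.
for volume_mb in (50, 200, 500, 1000):
    count = DATASET_BYTES / (volume_mb * 1e6)
    print(f"{volume_mb:>5} MB volumes -> ~{count:,.0f} remote files")
```

If those numbers are roughly right, the raw transfer time is dominated by the link speeds anyway, so my main question is how much overhead the volume size adds on top of that (and how many files the SFTP server ends up holding: ~30,000 at the default 50MB versus ~3,000 at 500MB).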
It’d be a shame if I used settings that made the whole 1.5TB process take maybe twice as long as needed. So what would you recommend in this regard?