Best practice for backup to a remote NAS

Hi there,
I'm new to Duplicati; it looks like a very fine piece of software.
I run Duplicati on a QNAP device, backing up to a Synology NAS running in a remote location using FTP. What options would you recommend to use? I notice that Duplicati takes ages to verify the backup files, for example.
Backup volume should not be an issue (I’m backing up only office files and such, low volume), but of course latency etc will…

Hello and welcome!

What is the circuit speed at each of your two sites (up/down)?
How much data did you select to back up? Did the backup complete?

Did you change any options from the Duplicati defaults, like dblock-size or blocksize?

Hi there. I set up the backup storage with a Remote Volume Size of 5 GB, but changed that back to 50 MB (the default, I think). At the Duplicati location, I have 200 Mbit/s downstream and 50 Mbit/s upstream; the remote location has 50 Mbit/s downstream and 10 Mbit/s upstream.
The backup source is about 1 TB, most of which has already been backed up (seeded). The change rate is low; as I said, it's mostly Office or PDF files.

A 5 GB volume size would be a killer. When you switched back to 50 MB, did you delete all the backup data and start over? If not, some of the remote volumes are probably still huge 5 GB files. That will cause slowness whenever volumes need to be compacted or verified. (Duplicati verifies one volume at random each time a backup runs, as a safety check.)
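If the old 5 GB volumes are still on the remote, the cleanest fix is usually to wipe the destination folder and start the backup fresh at the default volume size. A minimal sketch using the Duplicati command-line client; the FTP URL, credentials, source path, and passphrase are placeholders, not your actual setup:

```shell
# Re-run the backup with the default 50 MB remote volume size.
# --backup-test-samples controls how many remote volumes are
# downloaded and verified at random after each backup (default 1).
duplicati-cli backup \
  "ftp://synology.example.com/backups/office?auth-username=backupuser&auth-password=secret" \
  /share/office-files \
  --dblock-size=50MB \
  --backup-test-samples=1 \
  --passphrase="my-encryption-passphrase"
```

Raising `--backup-test-samples` occasionally gives you a deeper integrity check, at the cost of downloading more data over that slow uplink.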

My NAS backs up about 600 GB of data using Duplicati, and I do notice that the phase where it looks for compactable volumes takes around an hour. This phase is CPU intensive, and the NAS doesn't have the most powerful processor.
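On a weak NAS CPU you can also move compacting out of the backup run entirely. A sketch, again assuming the command-line client (the URL and paths are placeholders; option names are to the best of my knowledge):

```shell
# Skip the automatic compact scan during scheduled backups...
duplicati-cli backup \
  "ftp://synology.example.com/backups/office" \
  /share/office-files \
  --no-auto-compact

# ...and run compacting manually during off-hours instead.
# --threshold is the percentage of wasted space in a volume
# that triggers a rewrite (default 25).
duplicati-cli compact \
  "ftp://synology.example.com/backups/office" \
  --threshold=25
```

With low-change Office/PDF data, compacting rarely has much to do anyway, so deferring it is usually safe.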

I installed the WebDAV service on my Synology NAS to support incoming backups from elsewhere. It worked well and supports TLS encryption. I would probably use that instead of FTP, but I don't know if it will speed anything up for you.
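Switching backends is mostly a matter of changing the target URL. A hedged sketch: hostname, share, and credentials are placeholders, and I'm assuming Synology's default WebDAV-over-HTTPS port of 5006 (check your WebDAV Server package settings):

```shell
# Same backup, WebDAV over TLS instead of FTP.
duplicati-cli backup \
  "webdav://synology.example.com:5006/backups/office?auth-username=backupuser&auth-password=secret" \
  /share/office-files \
  --use-ssl=true
```

The backup data itself is identical either way; only the transport changes, so don't expect a large speed difference.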
