How to reduce data transfer to a minimum? Duplicati on backend for verification?


After reading a lot here (and learning a lot :-), I have some questions,
because I have to use a (very) slow connection between frontend and backend.
(The initial backup is transferred by external disk from frontend to backend.)

How to reduce the communication and data transfer to a minimum?

Since I have full access to both sides, it should be no security
problem to run Duplicati on the backend (Asustor NAS, so it is already there :slight_smile:).

How can I run the verification on the backend and let the frontend know that everything is fine?
Or at least have the backend compute the hashes for the verification?

(Should I split this into two threads?)

Thank you all for your great work and support!


To start with, you’d want to enable --no-auto-compact (disables re-packing less-than-full archives, which would otherwise require downloading and re-uploading them) and --no-backend-verification (disables comparing the local database against the remote file list at the start of each backup).
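For example, a backup run with both options enabled might look like this (a sketch only; the storage URL and source path are placeholders, not from your setup):

```shell
# Sketch: back up with compacting and remote file-list verification disabled,
# to keep traffic over the slow link to a minimum.
# "ssh://nas.example.local/backups/" and /home/user/data are placeholders.
duplicati-cli backup \
  "ssh://nas.example.local/backups/" \
  /home/user/data \
  --no-auto-compact=true \
  --no-backend-verification=true
```

In the GUI, the same two options can be added under the job's advanced options.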

Destination-side (backend) verification isn’t directly supported at this time, nor is using destination-provided hashes (I believe it has been discussed, but it isn’t on any roadmap yet).

My memory says there was a post somewhere about a destination-side script that could do verification, but I’m not finding it right now, so I might have imagined it… :crazy_face:
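For what it’s worth, Duplicati can upload a duplicati-verification.json file alongside the backup when the --upload-verification-file option is enabled, and the source tree ships a DuplicatiVerify.py helper that checks the remote files against it on the destination. The sketch below only illustrates the general idea of such a script; the flat filename-to-hash manifest layout is an assumption for this example, not Duplicati’s actual verification-file format.

```python
# Sketch of a destination-side verification pass: hash every file named in a
# manifest and compare against the recorded Base64 SHA-256 digests.
# The manifest layout (a JSON object mapping file name -> hash) is assumed
# for illustration; Duplicati's real duplicati-verification.json differs.
import base64
import hashlib
import json
from pathlib import Path


def sha256_b64(path: Path) -> str:
    """Return the Base64-encoded SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return base64.b64encode(h.digest()).decode("ascii")


def verify_folder(manifest_path: Path, data_dir: Path) -> list[str]:
    """Check every file listed in the manifest against the files on disk.

    Returns a list of problem descriptions; an empty list means every
    listed file exists and its hash matches."""
    manifest = json.loads(manifest_path.read_text())
    problems = []
    for name, expected in manifest.items():
        f = data_dir / name
        if not f.is_file():
            problems.append(f"missing: {name}")
        elif sha256_b64(f) != expected:
            problems.append(f"hash mismatch: {name}")
    return problems
```

Run something like this on the NAS against the backup folder; an empty result means all listed files hashed correctly, and only that small yes/no result (not the archives themselves) has to travel back over the slow link to the frontend.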