How to keep local storage small?

I am new to Duplicati 2, though I was using Duplicati 1.

On my notebook, I want to run off-line backups, and when I'm at home and connected to my home network, copy them to my NAS backup server. But I don't want to keep the whole backups locally as well, so I'm looking for a way to keep only the indexes (the dindex and dlist files) locally and remove the actual large static dblock files.
Is there a way to do that?
I looked for an advanced option and found none. I also tried to delete or modify the dblock files, but then Duplicati complained that the local storage was corrupt.
Or maybe there's a completely different approach, e.g. using the asynchronous-upload-folder option?

Or what about the combination of

   no-local-blocks
   no-backend-verification

? It looks promising, but does it really do what I'm aiming for? Does it create correct backups that can be used later for restore? Or does it have some undesirable side-effects?

Not really. The idea is that you have a destination and back up to that; if you meddle with the files, things unfortunately start to break.

--no-local-blocks only applies to restores: it disables copying data from existing local files when restoring, and instead reads everything from the dblock files.

--no-backend-verification effectively disables the check, so you get no errors. It is also a bit dangerous, as you can end up with tons of backups that will not work well when you try to restore.

If you use --no-backend-verification you should also set --no-auto-compact to make sure it does not attempt to read the files you have removed.
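Assuming you drive Duplicati 2 through its command-line client (the executable name `duplicati-cli` and the target URL below are my assumptions; only the two options come from this thread), the combination might be assembled like this:

```python
import subprocess


def build_backup_command(target_url: str, source: str) -> list[str]:
    """Assemble a duplicati-cli backup invocation that skips verifying
    the (possibly emptied) destination and never tries to compact it.

    The executable name and URL scheme are assumptions; the two
    --no-* options are the ones discussed in this thread.
    """
    return [
        "duplicati-cli", "backup", target_url, source,
        "--no-backend-verification=true",  # don't list/check remote files
        "--no-auto-compact=true",          # don't read dblocks you removed
    ]


# To actually run it, something like:
# subprocess.run(build_backup_command("file:///mnt/local-backup", "/home/me"),
#                check=True)
```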

Thank you. I've also verified that a backup run with no-backend-verification actually works even when the target is completely empty; it makes an incremental backup. Everything that is stored in the dlist and dindex files is probably stored in Duplicati's internal database as well.

So I expect my process to look like this:

  1. Run backup on request, with a local directory as target
  2. Periodically copy the created backup files from the local directory to the NAS, so the NAS holds the complete backup.
  3. After successful copy, delete the local files
  4. I have Duplicati installed on the NAS too. Periodically verify the consistency of the whole backup on the NAS, if Duplicati is able to do so. (Is it?)
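Steps 2 and 3 above could be scripted. Here is a minimal Python sketch; the paths and the idea of hash-checking each copy before deleting the local file are my own additions, not Duplicati behaviour:

```python
import hashlib
import shutil
from pathlib import Path


def sha256(path: Path) -> str:
    """Return the hex SHA-256 digest of a file, read in 1 MiB chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()


def move_backup_files(local_dir: str, nas_dir: str) -> list[str]:
    """Copy every backup volume from local_dir to nas_dir, deleting the
    local copy only after the destination hash matches the original.
    Returns the names of the files that were safely moved."""
    moved = []
    dst_root = Path(nas_dir)
    dst_root.mkdir(parents=True, exist_ok=True)
    for src in sorted(Path(local_dir).iterdir()):
        if not src.is_file():
            continue
        dst = dst_root / src.name
        shutil.copy2(src, dst)
        if sha256(dst) == sha256(src):
            src.unlink()          # step 3: remove the verified local copy
            moved.append(src.name)
        else:
            dst.unlink()          # bad copy: keep the local file, retry later
    return moved
```

Run with the NAS share mounted, e.g. `move_backup_files("/var/backups/duplicati", "/mnt/nas/duplicati")` (paths are illustrative).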

Just one similar question: if Duplicati loses its internal database, I can restore it from the NAS backup. What files are necessary to do that? Are the dlist and dindex files sufficient, or are the dblock files needed too?

Yes, the local database is basically a map of the remote data, which is why it works without any files on the destination.

You can use the option --upload-verification-file which will upload an extra file after each backup that contains a full list of what files (including hash and size) it expects there to be on the remote end.

There is a script here that can use this file and verify that the destination is intact: duplicati/Tools/Verification at master · duplicati/duplicati · GitHub
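As a sketch of what such a verification does: compare the expected name, size, and hash of each remote file against what is actually there. The real duplicati-verification.json format is not reproduced here; this assumes a simplified manifest mapping file names to size and hex SHA-256:

```python
import hashlib
import json
from pathlib import Path


def verify_destination(manifest_path: str, dest_dir: str) -> list[str]:
    """Check the files in dest_dir against a manifest of
    {name: {"size": int, "sha256": hex digest}} entries.
    Returns a list of problems found; an empty list means intact.
    (Simplified stand-in for Duplicati's own verification format.)"""
    problems = []
    manifest = json.loads(Path(manifest_path).read_text())
    dest = Path(dest_dir)
    for name, expected in manifest.items():
        f = dest / name
        if not f.is_file():
            problems.append(f"missing: {name}")
            continue
        if f.stat().st_size != expected["size"]:
            problems.append(f"wrong size: {name}")
            continue
        if hashlib.sha256(f.read_bytes()).hexdigest() != expected["sha256"]:
            problems.append(f"hash mismatch: {name}")
    return problems
```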

Yes, the “repair” command can rebuild the database from the remote data. Ideally it will only need the dindex and dlist files, but if data is somehow missing, it will attempt to recover that information from the dblock files (which is a really slow process).