File server backup

Yes, Duplicati uses deduplication and therefore needs to track blocks and hashes. It does this with a local database.

It also creates temporary files, but by default they are not very large. The default “remote volume size” is 50MiB, which works fine for most users, and by default it may create up to 4 of these concurrently.
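If you want to tune those defaults, the relevant command-line options look roughly like this. This is an illustrative sketch, not a tested command: the target URL and source path are placeholders, and you should confirm the option names against `duplicati-cli help` for your version.

```shell
# Sketch only — verify option names with `duplicati-cli help`.
# --dblock-size controls the remote volume size (default 50MB),
# --blocksize the deduplication chunk size (default 100KB),
# --asynchronous-upload-limit how many volumes may be staged at once.
duplicati-cli backup file:///mnt/backup-target /data \
  --dblock-size=50MB \
  --blocksize=100KB \
  --asynchronous-upload-limit=4
```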

Duplicati reads all your data and breaks it into chunks (default 100KiB as mentioned earlier), deduplicates the chunks (so no chunk is ever stored more than once on the back end), and repackages the chunks into volumes (default 50MiB).
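To make the chunk → dedup → volume pipeline concrete, here is a toy sketch of the idea in Python. It is not Duplicati's actual implementation — the `backup` function, the in-memory `seen` dict (standing in for the local database), and the volume list are all simplified stand-ins — but it shows why a duplicated chunk is referenced twice yet stored only once.

```python
import hashlib

CHUNK_SIZE = 100 * 1024          # 100 KiB, Duplicati's default block size
VOLUME_SIZE = 50 * 1024 * 1024   # 50 MiB, Duplicati's default remote volume size

def backup(data: bytes):
    """Toy illustration: split into chunks, dedup by hash, pack into volumes."""
    seen = {}            # block hash -> chunk; stands in for the local database
    volumes = [[]]       # each volume holds unique chunks up to VOLUME_SIZE
    volume_bytes = 0
    blocklist = []       # ordered hashes needed to reassemble the data

    for offset in range(0, len(data), CHUNK_SIZE):
        chunk = data[offset:offset + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        blocklist.append(digest)
        if digest in seen:
            continue                   # duplicate chunk: referenced, not re-stored
        seen[digest] = chunk
        if volume_bytes + len(chunk) > VOLUME_SIZE:
            volumes.append([])         # current volume is full, start a new one
            volume_bytes = 0
        volumes[-1].append(chunk)
        volume_bytes += len(chunk)
    return volumes, blocklist, seen

# 300 KiB of data where the first and third chunks are identical:
data = b"A" * CHUNK_SIZE + b"B" * CHUNK_SIZE + b"A" * CHUNK_SIZE
volumes, blocklist, seen = backup(data)
print(len(blocklist), len(seen))  # 3 chunks referenced, only 2 stored
```

Running it shows 3 entries in the blocklist but only 2 stored blocks — the repeated 100KiB chunk costs nothing extra on the back end, which is the whole point of deduplication.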

You can read more about how Duplicati works here: How the backup process works • Duplicati

I recommend you test it out on small sets of data so you can see how it works before you jump into the deep end and start backing up 12TB of data.