I wonder how this is supposed to work if Duplicati needs huge amounts of local disk space for a remote SFTP backup. In other words: before(!) it even tries to send the backup to the remote SFTP server, it consumes local disk space roughly equal to the size of the backup itself. This behavior can go badly wrong: if the compressed size of the files to back up is larger than the free disk space, the system is driven to zero free space, which could harm the whole system.
I have a 1TB disk and need to back up roughly 470GB. Before I started, 514GB were free. After starting the initial backup, the free disk space shrank continuously; at the end only 145GB were left.
In my opinion, this ordering (1st: build the backup in local storage, 2nd: transfer only after the 1st step is completely done) also increases the total backup time enormously.
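For what it's worth, a few of Duplicati's advanced options look relevant to limiting how much is staged locally before upload. This is only a sketch, not a confirmed fix: the option names below come from the Duplicati 2 advanced-options documentation, but please verify them against your version, and the host/paths shown are placeholders I made up.

```shell
# Hedged sketch, assuming these Duplicati 2 advanced options behave as documented:
# --tempdir:                   stage temporary volume files on a different disk
# --asynchronous-upload-limit: cap how many finished volumes may sit locally
#                              waiting for upload before building pauses
# --dblock-size:               size of each remote volume (smaller may mean a
#                              smaller local staging footprint per volume)
# "ssh://user@host//backups" and the local paths are placeholder examples.
duplicati-cli backup "ssh://user@host//backups" /Users/me/Data \
  --tempdir=/Volumes/Scratch/duplicati-tmp \
  --asynchronous-upload-limit=2 \
  --dblock-size=50MB
```

If the free space really drops by ~370GB during one run, it might also be worth checking whether the temp folder is being cleaned up between volumes.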
Duplicati 2.04 on macOS 10.14.6