Hi, I am just testing out Duplicati, making some small backups of local folders to a network drive. However, it appears that Duplicati first buffers everything on my local drive, taking up loads of space, before copying the backup to the network drive. Is that really what's happening?
Obviously this is not desired, and when doing full backups it cannot work because I do not have enough space. Is there a buffer setting I am not seeing?
P.S. otherwise the program looks pretty sweet, props!
Hello @Mushin and welcome to the forum!
There’s buffering, but the limits are so low that you likely wouldn’t notice unless you increased some sizes. Sometimes people hugely increase “Remote volume size” on step 5 of a job setup from the 50 MB default.
While the 184.108.40.206 beta lacks UI hints for this, the 220.127.116.11 canary explains:
This option does not relate to your maximum backup or file size, nor does it affect deduplication rates. See this page before you change the remote volume size.
In addition to the link above, which covers the size control, there are also options controlling the file count and related behaviors.
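As a rough way to reason about worst-case staging space: the temp usage is approximately the remote volume size multiplied by the number of volumes Duplicati is allowed to queue for upload (the `asynchronous-upload-limit` option; the default of 4 used below is an assumption, so check your version's docs). A minimal sketch of that arithmetic:

```python
# Rough worst-case temp staging space: volume size x queued volume count.
# The 50 MB volume size is Duplicati's default; the upload queue limit of 4
# is an assumption here -- check asynchronous-upload-limit in your version.
def worst_case_temp_bytes(volume_size_mb=50, upload_queue_limit=4):
    return volume_size_mb * 1024 * 1024 * upload_queue_limit

print(worst_case_temp_bytes() // (1024 * 1024), "MB")  # 200 MB with defaults
```

So with the defaults the buffer stays small, but raising the volume size to multiple GB multiplies the local space needed accordingly.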
Thanks for the fast reply!
Depending on what version you are using, there was a bug (fixed in 18.104.22.168) that caused temp files not to be cleaned up.
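If you want to check whether leftover temp files are piling up, here is a quick sketch. The `dup-` filename prefix matches what that bug report describes; the location is an assumption, since Duplicati stages volumes in the system temp directory unless `--tempdir` points elsewhere:

```python
import os
import tempfile

def dup_temp_usage(tempdir=None):
    """Count and sum the sizes of leftover dup-* files in a temp directory.

    tempdir defaults to the system temp dir, which is where Duplicati is
    assumed to stage volumes unless --tempdir is set to something else.
    """
    tempdir = tempdir or tempfile.gettempdir()
    total_bytes = 0
    count = 0
    for name in os.listdir(tempdir):
        if name.startswith("dup-"):
            path = os.path.join(tempdir, name)
            if os.path.isfile(path):
                total_bytes += os.path.getsize(path)
                count += 1
    return count, total_bytes

count, total_bytes = dup_temp_usage()
print(f"{count} dup-* files, {total_bytes} bytes")
```

If that number keeps growing across backups, you are probably hitting the cleanup bug rather than the normal upload buffering.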
Thanks for the reminder about 22.214.171.124, @JonMikelV, though we’ll have to find out what version @Mushin is running, and maybe when that issue was introduced. Duplicati temp dup-xxxx files not being deleted suggests a 126.96.36.199 regression with a 188.8.131.52 fix; however, there seems to be at least some issue in 184.108.40.206 as well (see my update there).