I’ve been using Duplicati for years and I’m very happy with it. So far I’ve been on v1.3.4 (I haven’t moved to 2.0 yet), and I wonder whether v2.0 has improved or optimized the way it does full backups.

My case is simple: I have a huge “C:\Doc” directory with many subdirectories and files, spanning several GB in total. The problem arises when Duplicati runs a full backup against my remote server over a low-speed Internet connection, which takes a very long time. As far as I understand how incremental backups work, full backups are performed to “rebase” the backup chain to a fully known state.

However, what if, among all the stuff in “C:\Doc”, only a few files are added or modified on a regular basis? Is there any way in 2.0 (or maybe in 1.3.4) to prevent Duplicati from re-uploading full copies of data that almost never changes?