I’m currently trying to move the Windows backup of my laptop from Windows File History to Duplicati. (File History forgot one of my target paths during a Windows feature upgrade, so I no longer trust it.)
At first glance it looks great: I like the UI, and the backups are smaller than File History’s thanks to deduplication.
However, it does not seem to be possible to run a backup while the destination folder is unavailable. (File History has an option to cache changed files in a local cache directory and upload them once the destination folder is reachable again.) This happens regularly, since the machine is a laptop that is not always on my home network and not always online. Still, I’d like to be able to restore file versions that were backed up while I was offline.
My question: given Duplicati’s design, would such a feature be possible? I noticed that the destination folder contains many large dblock files (totalling about 100 GB after the initial backup, which is more than the free space on my local drive) and many small dindex and dlist files (totalling about 100 MB). If the dblock files are not read during a backup, could one implement a backend that stores the dindex/dlist files in two places and the dblock files only once (initially on the local disk, moved to remote storage once it is available again)? A rough sketch of what I mean is below. Keeping the small files permanently on my disk would not be a problem; it’s the large files that I’d like to have offloaded.
For restore, it is obviously fine for me to need access to the destination - that is no different from File History.
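To illustrate the idea, here is a minimal sketch in Python. This is purely conceptual and not based on Duplicati’s actual backend interface; the paths, file extensions, and function names are all made up for illustration:

```python
import os
import shutil

# Sketch of the proposed split backend (hypothetical, not Duplicati's API):
# small dindex/dlist files go to both the local spool and the remote folder;
# large dblock files stay in the local spool until the remote is reachable.

LOCAL_SPOOL = r"C:\DuplicatiSpool"        # assumed local cache directory
REMOTE_DIR = r"\\nas\backups\duplicati"   # assumed destination folder


def remote_available() -> bool:
    """Crude reachability check: can we see the destination folder?"""
    return os.path.isdir(REMOTE_DIR)


def put(filename: str, source_path: str) -> None:
    """Store one backup volume according to its type."""
    # Always keep a local copy first.
    shutil.copy2(source_path, os.path.join(LOCAL_SPOOL, filename))

    if filename.endswith((".dindex.zip", ".dlist.zip")):
        # Small metadata files: mirror to the remote too, when possible.
        if remote_available():
            shutil.copy2(source_path, os.path.join(REMOTE_DIR, filename))
    # dblock files stay in the spool; flush_spool() moves them later.


def flush_spool() -> None:
    """Move spooled dblock files to the remote once it is reachable."""
    if not remote_available():
        return
    for name in os.listdir(LOCAL_SPOOL):
        if name.endswith(".dblock.zip"):
            shutil.move(os.path.join(LOCAL_SPOOL, name),
                        os.path.join(REMOTE_DIR, name))
```

The point is just that `put()` never needs to read existing dblock files, so (if my assumption about the backup process holds) the large volumes could sit in a local spool until the destination comes back.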