I don’t know what the right thing to call it is… incremental? Differential? I currently have it set to “Keep All Backups” in step 5 of the configuration. From the description of that option I thought it was the right choice, but my storage space at the destination is climbing like a rocket ship! Maybe I’m not saying this right: I only want the deltas (the changes since the last backup) added to cloud storage, not a sum of full backups. Because of the way the data is output (the backup format), I can’t really tell what I’m ending up with at the destination, just that I’m watching the space used climb by gigabytes daily.
Is this possible with Duplicati? If so, how?
Duplicati already works this way: it only stores the differences between backups. (In fact, there’s no way to turn that feature off; it’s core to its backup engine.)
To reduce storage on the back end, you’ll have to reduce retention. Consider setting retention to Smart.
Smart Retention is the same as setting a custom retention of `7D:1D,4W:1W,12M:1M`. Be aware that it will delete backups over 1 year old.
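To make the rule string concrete, here’s a minimal sketch in Python of how a retention policy like `7D:1D,4W:1W,12M:1M` can be read: each `timeframe:interval` pair means “within this timeframe, keep at most one backup per interval,” and anything older than the longest timeframe is deleted. This is an illustration of the semantics only, not Duplicati’s actual implementation; the function names and the approximate month/year day counts are my own.

```python
from datetime import timedelta

# Rough day counts per unit; Duplicati's real parsing may differ.
UNITS = {"D": 1, "W": 7, "M": 30, "Y": 365}

def parse_duration(s):
    """Turn a string like '7D' or '4W' into a timedelta (approximate)."""
    return timedelta(days=int(s[:-1]) * UNITS[s[-1]])

def parse_policy(policy):
    """Parse 'frame:interval,...' into a list of (frame, interval) pairs."""
    pairs = []
    for rule in policy.split(","):
        frame, interval = rule.split(":")
        pairs.append((parse_duration(frame), parse_duration(interval)))
    return pairs

def prune(ages, policy):
    """Given backup ages as timedeltas, return the ages that are kept.

    Within each timeframe, keep at most one backup per interval;
    backups older than the longest timeframe match no rule and are
    dropped entirely.
    """
    rules = sorted(parse_policy(policy))
    keep = []
    last_kept = {}  # per-rule index: age of the last backup we kept
    for age in sorted(ages):  # newest (smallest age) first
        for i, (frame, interval) in enumerate(rules):
            if age <= frame:
                prev = last_kept.get(i)
                if prev is None or age - prev >= interval:
                    keep.append(age)
                    last_kept[i] = age
                break  # matched a timeframe, so not deleted outright
    return keep

policy = "7D:1D,4W:1W,12M:1M"
ages = [timedelta(days=d) for d in (0, 1, 2, 10, 17, 100, 130, 400)]
kept = prune(ages, policy)
# The 400-day-old backup falls outside every timeframe and is deleted.
```

Under these assumptions, the daily backups in the first week, the roughly weekly ones in the first month, and the roughly monthly ones in the first year all survive, while the 400-day-old backup is removed because no timeframe covers it.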
I was kind of wondering if that should be changed. It might even be an inadvertent leftover from the early days, when a deletion wasn’t implied. You could go digging in the history to see which pieces were done when…
Feature/issue 3008 retention optimizations #3026
`--retention-policy` now deletes backups that fall outside of any time frame. No need to specify `--keep-time` as well.
That’s a really good question. I wasn’t aware that the retention policy behaved differently before!
Thanks. Appreciate the info.