tophee
September 24, 2017, 9:19pm
6
kenkendk:
I notice a keep everything option as well, so with de-dupe for ‘archival’ storage of images this sounds like it won’t use up space but will provide protection for any idiotic behaviour with deletions ongoing?
That is the idea. Each backup will only have the overhead of the actual changed data.
So “keep everything” works fine for large files that are rarely changed. But what about smaller files that change frequently? With “keep everything” enabled, my backup would just keep growing indefinitely, right?
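To put a very rough number on that worry, here is a back-of-the-envelope sketch in Python (the block size, change rate, and schedule are made-up assumptions for illustration, not measurements from Duplicati):

```python
# Rough growth estimate for a "keep everything" backup of one small file
# that changes every hour. All numbers below are made-up assumptions.

BLOCK_SIZE = 100 * 1024          # bytes; Duplicati's default block size is about 100 KB
changed_blocks_per_backup = 5    # assumption: each change touches ~5 blocks
backups_per_day = 24             # hourly schedule
days = 365

growth_bytes = changed_blocks_per_backup * BLOCK_SIZE * backups_per_day * days
print(f"~{growth_bytes / 1024**3:.1f} GiB of extra backend data after one year")
# -> roughly 4.2 GiB for this single file, and it never shrinks,
#    because with "keep everything" nothing is ever pruned.
```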
For anyone else trying to understand how retention works, here are some other relevant topics:
For the question, I think @JonMikelV has the right answer.
To clarify further, Duplicati works with “backup sets” or “snapshots”: a snapshot is “a collection of files and folders as it looked at a specific point in time”.
Each backup makes a “snapshot” of your files. This snapshot includes all files currently on disk. For your example, each snapshot will include both test1.txt and test2.txt.
When you delete older versions, you delete snapshots. For test1.txt that means you will lose the ability to…
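To make the snapshot idea concrete, here is a tiny toy model in Python (purely illustrative, not Duplicati’s actual data structures): each backup records which version of each file existed at that moment, and deleting old snapshots removes access to the versions that only they referenced.

```python
# Toy model of "backup sets"/"snapshots": each snapshot maps file names to
# the content version that existed when that backup ran.

snapshots = [
    {"test1.txt": "v1", "test2.txt": "v1"},   # backup #1
    {"test1.txt": "v2", "test2.txt": "v1"},   # backup #2: only test1.txt changed
    {"test1.txt": "v3", "test2.txt": "v1"},   # backup #3
]

def reachable_versions(snaps):
    """Versions you can still restore from the remaining snapshots."""
    versions = {}
    for snap in snaps:
        for name, ver in snap.items():
            versions.setdefault(name, set()).add(ver)
    return versions

print(reachable_versions(snapshots))        # all three versions of test1.txt
print(reachable_versions(snapshots[-1:]))   # keep only the newest snapshot:
# test1.txt v1 and v2 are no longer restorable, while test2.txt v1 survives
# because the latest snapshot still references it.
```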
Another quick workaround until a better solution becomes available, at the cost of some extra storage space up front (though still better than what you have right now), would be to create two backup jobs with the same source files/folders.
Configure one to run every hour and keep backups until they are older than 7 days, and the other to run once a week and keep backups for a year (add more jobs or tweak the durations to fit your needs).
Of course you don’t benefit from deduplication between the backups but i…
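To get a feel for what that two-job setup retains, here is some quick illustrative arithmetic (the intervals and retention windows are just the example above, nothing Duplicati-specific):

```python
# Restore points each hypothetical job keeps at steady state.
hourly_job = {"interval_hours": 1,   "keep_days": 7}
weekly_job = {"interval_hours": 168, "keep_days": 365}

def versions_kept(job):
    return job["keep_days"] * 24 // job["interval_hours"]

print(versions_kept(hourly_job))   # 168 fine-grained versions covering the last week
print(versions_kept(weekly_job))   # ~52 coarse versions covering the last year
# The trade-off: unchanged data is uploaded and stored once per job,
# since the two jobs do not share a deduplication pool.
```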