As explained, the manual does not cover these options, so of course a missing section can’t make anything clear.
“The retention can be set in 3 ways”
is where the manual ends, but I posted a link to the issue asking that the manual cover this, and your concern.
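For anyone reading along without the manual: as I understand it, the three covered settings amount to keep everything, keep by age (the --keep-time option), and keep by count (--keep-versions), and the undocumented one is the thinning option (--retention-policy, e.g. 1W:1D,4W:1W,12M:1M). Here’s a rough sketch of the selection logic as I model it; the option names are real, but the code is my own simplification, not Duplicati’s actual implementation:

```python
from datetime import datetime, timedelta

def keep_by_count(versions, n):
    """--keep-versions style: keep the n newest backup versions."""
    return sorted(versions, reverse=True)[:n]

def keep_by_age(versions, max_age, now):
    """--keep-time style: keep versions no older than max_age."""
    return [v for v in versions if now - v <= max_age]

def thin(versions, policy, now):
    """--retention-policy style, loosely: within each timeframe, keep roughly
    one version per interval.  policy is [(timeframe, interval), ...] from
    newest outward; versions older than every timeframe are dropped."""
    versions = sorted(versions, reverse=True)
    kept = [versions[0]]                      # newest backup always survives
    for v in versions[1:]:
        age = now - v
        frame = next((f for f in policy if age <= f[0]), None)
        if frame and kept[-1] - v >= frame[1]:
            kept.append(v)
    return kept

now = datetime(2019, 6, 1)
daily = [now - timedelta(days=d) for d in range(365)]
smart = [(timedelta(weeks=1), timedelta(days=1)),
         (timedelta(weeks=4), timedelta(weeks=1)),
         (timedelta(days=365), timedelta(days=30))]
print(len(thin(daily, smart, now)))           # ~22 versions kept out of 365
```

I believe the real --retention-policy also accepts U for an unlimited timeframe, but check the built-in option help rather than trusting my sketch.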
What you quote seems correct and is written for the context I’ll quote below, but it isn’t addressing your context:
Notice how that flows to:
any file deleted from the source will subsequently be deleted from the backup when it reaches the retention policy limit.
(and now that I read it again, this is translating backup versions into file terms, which is maybe a tip for us…)
What I see as the step too far, though possibly inspired by the context it sits in:
It’s talking about fell-off-the-end deletion, not deleted-from-the-middle, so I agree with your point more than with your quote.
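To make that distinction concrete, here’s a toy model (file names and version numbers invented): a file deleted from the source stays restorable only while at least one retained backup version contains it. Falling off the end loses it predictably at the retention limit, which is what the quote describes; thinning can punch a hole in the middle and lose it while versions on both sides survive:

```python
# Toy model: backup version number -> set of files present in the source then.
backups = {
    1: {"a.txt"},
    2: {"a.txt", "b.txt"},   # b.txt existed only around version 2,
    3: {"a.txt"},            # then was deleted from the source
    4: {"a.txt"},
}

def restorable(kept, backups):
    """A file is restorable iff at least one kept version contains it."""
    return set().union(*(backups[v] for v in kept))

# Fell-off-the-end (keep the newest 3): b.txt is still in version 2, and only
# disappears later, when version 2 itself ages past the retention limit.
print(restorable([2, 3, 4], backups))   # {'a.txt', 'b.txt'}

# Thinning that drops version 2 from the middle: b.txt's only snapshot is gone
# even though versions on both sides of it survive: the hole.
print(restorable([1, 3, 4], backups))   # {'a.txt'}
```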
I think pointing it out would be better. The manual needs writing, and the GUI text could be expanded. Any thoughts?
Either way, this needs someone who knows the mechanics of how to do it (and who volunteers to do so…).
Is that a Yep to CrashPlan? Regardless, how do any products that you know of deal with version thinning?
Can anybody thin versions in a way that doesn’t lose deleted files into holes created by the deletions?
Care to explain, especially if it’s on-topic, and maybe even if it’s not? Duplicati can’t totally bend its model.
I thought about maybe (if developers help) moving the last version of a deleted file into a nearby surviving version.
That’s kind of ugly though, having things show up in a version where they weren’t originally backed up.
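For what it’s worth, here’s that idea sketched in the same toy model; this is purely hypothetical, nothing Duplicati actually does, and the graft is exactly the ugliness I mean:

```python
def thin_with_rescue(kept, doomed, backups):
    """Hypothetical rescue pass: before a doomed version is thinned away, copy
    any file whose only remaining snapshot lives there into the nearest
    surviving version, so deleted files don't fall into holes."""
    covered = set().union(*(backups[v] for v in kept))
    for v in doomed:
        orphans = backups[v] - covered
        if orphans:
            nearest = min(kept, key=lambda k: abs(k - v))
            backups[nearest] |= orphans   # files now appear in a version they
            covered |= orphans            # weren't originally backed up in
        del backups[v]
    return backups

backups = {1: {"a.txt"}, 2: {"a.txt", "b.txt"}, 3: {"a.txt"}, 4: {"a.txt"}}
print(thin_with_rescue([1, 3, 4], [2], backups))
# b.txt gets grafted into a neighboring version instead of vanishing
```

A restore browser would then show b.txt in a version where it never existed on the source, which is the part I find hard to stomach.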
I think the motivation for thinning was to get rid of intermediate file versions that are no longer useful.
OK, so you think “one backup” per year is no clue whatsoever that there’s a gap? I guess we disagree.
Regardless, “concise” meant it was brief, and brief leaves the impact for the user to figure out (clue or not).
It’s definitely possible to be too brief, and this one could use expansion, though expansion has limits…
You’re listing the three options the manual covers now, so there’s definitely a blank canvas on which to try to explain this.