Since pretty much all destinations have finite storage, would it make sense to have a “keep until” setting for maximum destination size?
Once reached (approached?), Duplicati would automatically start pruning the oldest revisions until back within the keep size limit.
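The pruning loop described above could look something like this minimal sketch. The `Revision` structure and `prune_to_limit` function are hypothetical illustrations, not actual Duplicati internals.

```python
# Hypothetical sketch of "prune oldest revisions until under the limit".
# Revision and its fields are illustrative, not Duplicati's real data model.
from dataclasses import dataclass

@dataclass
class Revision:
    timestamp: float  # backup time, seconds since epoch
    size: int         # bytes this revision occupies at the destination

def prune_to_limit(revisions, max_bytes):
    """Drop the oldest revisions until the total size fits within max_bytes."""
    kept = sorted(revisions, key=lambda r: r.timestamp)  # oldest first
    while kept and sum(r.size for r in kept) > max_bytes:
        kept.pop(0)  # prune the oldest remaining revision
    return kept
```

In a real implementation each `pop` would trigger deletion and compaction at the destination, which is where the bandwidth concern below comes in.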
Things to consider include:
- Is it a hard limit (if the limit would be hit by the next upload, stop the backup and prune first) or a soft limit (finish the backup even if it goes over, then prune until back under)?
- Should there be a warning notification if the source size exceeds some percentage of the destination limit? (Tough to estimate due to compression variances.) It may be better to set a minimum revisions limit instead (alert if the count drops below X revisions after pruning).
- Status emails (and the UI?) should include a note of the largest number of revisions stored.
- Conflicts with other parameters (e.g. auto-cleanup).
- Side effects could include pruning of deleted files from the archive and excessive bandwidth usage during iterative pruning.
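To make the hard/soft distinction and the minimum-revisions alert concrete, here is a hedged sketch. The function names, parameters, and return strings are all made up for illustration; nothing here reflects an existing Duplicati option.

```python
# Hypothetical decision logic for the hard vs. soft limit question above.
def check_before_upload(current_bytes, upload_bytes, max_bytes, hard_limit):
    """Hard limit: refuse an upload that would push past the cap (prune first).
    Soft limit: allow the upload; pruning happens after the backup finishes."""
    if hard_limit and current_bytes + upload_bytes > max_bytes:
        return "prune-then-retry"
    return "upload"

def min_revisions_warning(revision_count, min_revisions):
    """Alert if pruning has left fewer revisions than the configured minimum."""
    return revision_count < min_revisions
```

A soft limit keeps backups from failing mid-run, at the cost of temporarily exceeding the cap; the minimum-revisions check is what would catch the case where aggressive pruning silently eats retention.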
Of course, there doesn’t have to be any automatic action; it could just be a notification / email saying “hey, your destination is bigger than the preset limit - you should manually prune.”