How to limit backups to a certain total size?

How do I limit my backups to a certain total size so I stay under my cloud storage quota?

Hi @dataman1818 and welcome to the forum.

The work is still in flux, but it looks like you can get warnings if you’re on canary or experimental (not beta) when you get close to the quota, which comes either from your cloud storage provider or, if they don’t report one, from a value you set yourself.

Check this; what it says (which I could be misreading) seems to be progressing through the changelog.

I’m not aware of anything that goes beyond warnings, though the link I referenced does talk about it, as well as the distinction between the provider’s total quota and a quota you might want to set for an individual backup.

If you find a forum topic that fits, feel free to offer input. That’s one good thing about early development. :grin:

I think the position right now is that nobody knows how a hard quota limit should work.
Deduplication makes it virtually impossible to predict how much the backup will grow until the run has finished.

We could simply check before every volume upload, but that would just cancel the backup in the middle of the run without producing a properly finished backup. And it would still consume all the space even though the backup can’t be completed.
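
To make that concrete, here’s a minimal sketch (purely illustrative, not actual Duplicati code; `upload`, `quota`, and the rest are invented names) of why the per-volume check fails in both directions:

```python
# Naive per-volume quota check, as described above. Aborting mid-run
# leaves the already-uploaded volumes consuming space, yet no usable
# restore point exists until the final file list is written.

def upload_volumes(volumes, remote_used, quota):
    for volume in volumes:
        if remote_used + len(volume) > quota:
            raise RuntimeError("quota would be exceeded; backup cancelled mid-run")
        upload(volume)              # hypothetical backend call
        remote_used += len(volume)
    return remote_used

def upload(volume):
    pass  # stand-in for the real upload
```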


That is indeed a problem. Maybe a user could set a soft limit and Duplicati could do a quick check of the remote file sizes recorded in the database. If the total exceeds a user-defined threshold, it could throw a warning? Or add it to the reporting output.

--quota-warning-threshold will already do this, but it won’t help much if your backup goes from below the threshold to full quota within one backup run.

While I haven’t tested this option, its description does state that it’s ignored if the destination doesn’t report the available space. Not sure whether it stacks with the --quota-size switch. So you set them both and it works? I’ll have to test this out.

Yes, the --quota-size parameter will let you use quota warnings with backends that do not report quota size to Duplicati.
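
As I understand the two options (rough sketch, not Duplicati’s actual code, and assuming the threshold means the percentage of quota remaining), they would combine roughly like this:

```python
# Fall back to a manually set --quota-size when the backend reports no
# quota, then warn once the remaining space drops below the
# --quota-warning-threshold percentage.

def check_quota(reported_quota, manual_quota_size, used_bytes, threshold_percent):
    quota = reported_quota or manual_quota_size
    if not quota:
        return None  # no quota known from either source; check is skipped
    free_percent = 100.0 * (quota - used_bytes) / quota
    if free_percent < threshold_percent:
        return f"Warning: only {free_percent:.1f}% of quota remaining"
    return None
```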

Because of the difficulty of knowing how much space a backup will need before it’s finished (now imagine threading, where uploads start before the file scan is finished!), I think a soft limit is a good start.

When a backup finishes and it’s over that limit, a warning is triggered. Future backups abort with a quota-reached error UNLESS a pre-backup destination check is enabled and finds that the destination files got smaller or the quota size setting got bigger.
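
Sketching the proposal (hypothetical; nothing like this exists in Duplicati today, and all names are made up):

```python
# Proposed soft-limit behaviour: warn when a finished backup ends up over
# the limit, then refuse later runs until a destination re-check shows
# the files shrank or the limit was raised.

def after_backup(destination_bytes, soft_limit):
    over = destination_bytes > soft_limit
    if over:
        print("Warning: backup finished over the soft limit")
    return over  # persisted and consulted before the next run

def may_run_backup(previous_run_was_over, pre_check_enabled,
                   destination_bytes, soft_limit):
    if not previous_run_was_over:
        return True
    if pre_check_enabled and destination_bytes <= soft_limit:
        return True  # space was freed or the limit was increased
    raise RuntimeError("quota reached: previous backup exceeded the soft limit")
```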


Yeah, that’s a good idea … although in fairness storage costs are relatively low, and I would imagine most users back up to their own hosted storage, like a NAS. It would still be a ‘nice to have’ :slight_smile:

True.

I don’t recall if it was mentioned here or not, but I’d like to see a size-based retention feature. Back up all versions until size X is reached, then start pruning versions until below X again.

Of course this sort of process is risky: accidentally including a giant video file in your backup could blow far past your size limit and cause pruning of ALL your backups. :frowning:
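
For illustration, size-based pruning might look something like this (hypothetical sketch; the feature doesn’t exist, the names are invented, and a minimum-versions guard is included against exactly that prune-everything scenario):

```python
# Prune oldest versions until the total drops below the limit, but never
# below min_versions. In reality the freed space would be fuzzier: with
# deduplication, deleting a version only frees the blocks that no
# remaining version still references.

def prune_to_size(versions, max_total_size, min_versions=1):
    # versions: oldest-first list of (version_id, size_bytes)
    total = sum(size for _, size in versions)
    while total > max_total_size and len(versions) > min_versions:
        version_id, size = versions.pop(0)  # drop the oldest version
        total -= size
        print(f"pruned version {version_id}; {total} bytes remain")
    return versions
```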
