Best practice for multiple backups to a common cloud?

My Duplicati backups are going to be broken up into several individual jobs running on the same Duplicati server, all backing up to Backblaze B2. My question is whether I should use a single B2 bucket and point each one to a different directory within it, or use a separate bucket for each one. What are the pros and cons of the different approaches?
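For concreteness, the two layouts might look like this as Duplicati destination URLs (bucket and folder names are made up for illustration, and this assumes I have the `b2://bucket/folder` form of Duplicati's B2 backend right):

```text
# Option A: one bucket, each job pointed at its own folder
b2://my-backups/documents
b2://my-backups/photos
b2://my-backups/mail

# Option B: one bucket per job
b2://my-backups-documents
b2://my-backups-photos
b2://my-backups-mail
```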


PMFBI, but would multiple buckets be more resilient to corruption etc.? And would restores be faster, assuming you know which bucket to access?


Please don’t apologise - all input is welcome. And yes, I am leaning towards putting them each in a separate bucket to provide better isolation from potential corruption or loss. You raise an interesting point re knowing which bucket is which - how does one (or even can you) retrieve the backup configuration from a remote repository in the event of a catastrophic loss at the source?

Thanks :smile: With a logical basis for the split, and the buckets named accordingly, it shouldn't be a problem to find a file.

A recently added “pro” of separate buckets is application keys that are not your master key: used right, each job gets a key restricted to its own bucket, so one compromised job can’t touch the others. It’d be helpful to know how well these work in practice, given that Duplicati hasn’t (to my knowledge) changed code for them.
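As a sketch of the per-bucket key idea using the B2 command-line tool (bucket and key names here are placeholders, and the exact subcommand name varies by CLI version):

```shell
# Create an application key that can only touch the "duplicati-docs" bucket.
# Older b2 CLI versions spell this "b2 create-key"; newer ones use "b2 key create".
b2 create-key --bucket duplicati-docs duplicati-docs-key \
    listBuckets,listFiles,readFiles,writeFiles,deleteFiles

# Give Duplicati the returned keyID/applicationKey pair instead of the master
# key; that job then has no access to the other jobs' buckets.
```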

An old “con” that has not yet been removed is that bucket names must be globally unique, and you only get 100 buckets per account.
