Backup storage decisions

Hi, I need backup storage. The data will just sit there, for use if some disaster happens: about 300 GB of initial data plus roughly 5 GB of new data per month.
Regarding price, which do you think is better to use:

  1. Google storage class Archive
  2. Google drive
  3. Amazon Cold storage
  4. Amazon S3
  5. Some other solution

Thanks

Archive class storage is problematic for Duplicati. See previous threads on the issue.

Personally I’d recommend looking at Wasabi or Backblaze B2. Both provide hot storage at prices that are really attractive - closer to archive class pricing in Amazon/Azure/etc.
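To make the comparison concrete, here's a rough sketch of how the monthly storage bill grows with your 300 GB + 5 GB/month pattern. The per-GB prices below are placeholders I made up for illustration, not real quotes; plug in current numbers from each provider's pricing page (and remember egress and transaction fees are extra):

```python
# Rough monthly storage-cost sketch for ~300 GB initial + 5 GB/month growth.
# Per-GB prices are HYPOTHETICAL placeholders -- check each provider's
# current pricing page before deciding.
PRICES_PER_GB_MONTH = {
    "hot-storage-example": 0.005,     # placeholder hot-tier price
    "archive-class-example": 0.0012,  # placeholder archive-tier price
}

def monthly_cost(months_elapsed, price_per_gb, initial_gb=300, growth_gb=5):
    """Storage cost in a given month, ignoring egress/transaction fees."""
    return (initial_gb + growth_gb * months_elapsed) * price_per_gb

for name, price in PRICES_PER_GB_MONTH.items():
    print(f"{name}: ~${monthly_cost(12, price):.2f} in month 12")
```

At this data size the storage line item is small either way; the differences that matter are retrieval fees, minimums, and restore behavior.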

Thanks!
Which do you prefer? Backblaze seems to have been on the market longer, so maybe it is more secure?

I agree about cold storage, and there are a remarkable number of ways to do it. I’m less sure about archive.

Which storage is that? Google Cloud Storage; their announcement, “Put your archive data on ice with a new storage offering,” says:

“The Archive class provides almost instantaneous (milliseconds) access to your data when needed. Access and management is performed through the same consistent set of APIs used by our other Cloud Storage classes, with full integration into Object Lifecycle Management. You get the same experience as our hot storage options.”

which at least for access sounds less awful for Duplicati than archives that need lots of work and time.
Amazon S3 does have a programmatic way to unfreeze cold archives, but Duplicati doesn’t support it.

I don’t use Google Cloud Storage, so I might be wrong, but it has sometimes seemed like more of a billing consideration based on the intended access pattern than a technical hurdle. See Cloud Storage pricing.

Note that Duplicati by default does sampled file download verification, and if your data churns, compacting will do additional downloads and uploads that will cost more if you go with an archive class…
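To put a rough number on that: archive classes typically charge a per-GB retrieval fee on top of storage, so verification and compact downloads aren't free. This back-of-envelope sketch uses a made-up retrieval price and the ~50 MB-per-run download figure mentioned later in this thread; treat both as assumptions:

```python
# Sketch of why verification/compacting downloads matter on archive classes:
# they add a per-GB retrieval fee on top of storage. The fee is a
# HYPOTHETICAL placeholder -- check the provider's pricing page.
RETRIEVAL_FEE_PER_GB = 0.05  # placeholder archive retrieval price

def monthly_retrieval_cost(gb_downloaded_per_day, fee=RETRIEVAL_FEE_PER_GB):
    """Retrieval fees from verification/compact downloads over 30 days."""
    return gb_downloaded_per_day * 30 * fee

# e.g. six backups/day, each downloading ~50 MB for verification:
print(f"${monthly_retrieval_cost(6 * 50 / 1024):.2f}/month in retrieval fees")
```

Small in absolute terms at this scale, but it can rival the archive-tier storage cost itself, which erodes the point of choosing archive.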

While looking at just the storage cost makes such things look cheap, look at the rest of the pricing picture.
Also consider how hard you want to work to learn and administer it. Enterprises may have specialists.

I have used both and they both work well. Wasabi is more “AWS-like” and supports similar JSON bucket policies, if that’s important to you. Backblaze B2 is simpler.

Personally I chose B2 because I like that they can send you your bucket contents on USB if desired.

An advantage to Wasabi is that there are no egress costs.

Thank you for your clarification and help. I think I will go with Backblaze; my only confusion is with their caps. If I back up every 4 hours, I will spend more money on the caps than on the backup space. I started using Backblaze's free tier, and after 3 backups I received these messages:

" Download Bandwidth Cap Reached 100%

You have reached 100% of the free Backblaze Daily Download Bandwidth Cap. To increase your Daily Download Bandwidth Cap or to change your Cap Notification Settings log on to the ‘Caps & Alerts’ page"

" Class B Transactions Cap Reached 100%

You have reached 100% of the free Backblaze Daily Class B Transactions Cap. To increase your Daily Class B Transactions Cap or to change your Cap Notification Settings log on to the ‘Caps & Alerts’ page"

Is it possible to reduce Cap usage with Duplicati?

I’m not familiar with their caps because I’m paying for my storage. Are you going to use less than 10 GB? If not, you’ll need to upgrade to a paid plan, which I believe will remove the caps.

Did you restore? I see that you posted a topic asking for a restore progress indicator.
Depending on how you do the restore, it may download a lot of data from the backup.

Free limits for B2 are 10 GB storage and 1 GB download per day, per the Pricing page.

A backup typically downloads at least 3 files, maybe 50 MB. See your job log value:
Complete log --> BytesDownloaded. Also note FilesDownloaded for the other problem:

This ought to be hard to hit unless you reduced the cap or something else is using B2.
Class B transactions are explained here, and you get 2500 for free. These look like
mainly downloads. You can test what raises the Today value on the Caps & Alerts page.
I’d guess a backup that does 3 downloads would cause 3 of these. You can check.
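A quick sanity check on those caps, using the figures mentioned above (1 GB download/day and 2500 free Class B transactions, ~3 downloads and ~50 MB per backup run; all of these are assumptions worth verifying against your own job logs):

```python
# Sanity check on B2's free daily caps, using ASSUMED figures from this
# thread: 1 GB download/day, 2500 free Class B transactions/day,
# ~3 downloads and ~50 MB per backup run.
FREE_DOWNLOAD_GB = 1.0
FREE_CLASS_B = 2500

runs_per_day = 6                  # a backup every 4 hours
downloads = runs_per_day * 3      # ~3 file downloads per run
gb = runs_per_day * 50 / 1024     # ~50 MB per run

print(downloads, "Class B transactions,", round(gb, 2), "GB downloaded")
# 18 transactions and ~0.29 GB/day -- far below both caps, so hitting them
# after 3 backups suggests something else (e.g. a restore) was downloading.
```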

EDIT 1:

The topic “Trying to understand why I hit my 1GB download daily limit” explains another source
of downloads: Compacting files at the backend, which reduces your space usage.
Whether or not compacting happens a lot depends on how fast your files change…

EDIT 2:

I think that’s right, but you can put the caps back on; otherwise, usage beyond the free tier is paid.
Feel free to sanity-check my Class B transaction theories if you still do B2 backups.
I’m especially puzzled about hitting the Class B transaction cap. I have no stats now.

I’m curious: do you know of backup storage from Sia, Tardigrade (Storj), Sleek, or Filecoin?

I still wonder why I don’t read posts about these services in mainstream magazines for regular end users here in my country. I had to search for quite a while to find them, although they are cheaper and safer. (Compared to my 200 GB Google Cloud plan for $3/month, I save 30%–82% of the cost using Tardigrade. And my backup can’t be handed over to anyone in readable form, which keeps the NSA and whoever else out of my backups.)
edit: Well, Duplicati encrypts anyway, so I guess it isn’t that important that the cloud can encrypt. Still, the whole architecture of the mentioned providers makes it practically impossible to crack your encrypted data.