Newbie: What is the most cost-effective way to back up a 12+ terabyte Windows folder of data daily to AWS Deep Archive?

Thanks for jumping in. I don't have anything on AWS myself, and I personally worry about things like testing backups.

Because this one sounds like emergency-only use, I'm wondering how you'll know it's working well. Downloading everything for a test would cost over $1,000, I think; isn't usual S3 egress around $90/terabyte, which puts 12 TB at roughly $1,080?
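For what it's worth, here's that back-of-envelope math as a runnable line, using only the ~$90/terabyte figure from this thread (an assumption on my part; actual AWS egress pricing varies by region and tier, and Deep Archive adds retrieval fees on top):

```python
# Rough full-restore cost using the thread's ~$90/TB egress estimate.
# EGRESS_PER_TB is an assumption, not an official AWS price.
TERABYTES = 12
EGRESS_PER_TB = 90.0
print(f"Estimated test-restore egress: ~${TERABYTES * EGRESS_PER_TB:,.0f}")  # ~$1,080
```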

Arranging the files so that the dlist and dindex files stay in hot storage would let you test a database Recreate, though a Recreate can still need to download a dblock in some cases. One possible way to split the storage classes is sketched below.
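The split is possible because Duplicati's dblock files are named with a `duplicati-b` prefix, while dindex files start with `duplicati-i` and dlist files with a date, so an S3 lifecycle rule filtered on that prefix can move only the bulk data to Deep Archive. A minimal boto3 sketch, assuming a hypothetical bucket named `my-backup-bucket` and the default `duplicati` filename prefix:

```python
import boto3

BUCKET = "my-backup-bucket"  # hypothetical name; substitute your own bucket

s3 = boto3.client("s3")

# Duplicati volume files look like:
#   duplicati-b<hex>.dblock.zip.aes            (bulk data)
#   duplicati-i<hex>.dindex.zip.aes            (index)
#   duplicati-YYYYMMDDTHHMMSSZ.dlist.zip.aes   (file list)
# Only dblock files begin with "duplicati-b", so a prefix filter
# transitions the big dblocks to Deep Archive while the small
# dlist/dindex files stay in Standard (hot) storage.
s3.put_bucket_lifecycle_configuration(
    Bucket=BUCKET,
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "dblocks-to-deep-archive",
                "Status": "Enabled",
                "Filter": {"Prefix": "duplicati-b"},
                "Transitions": [
                    {"Days": 0, "StorageClass": "DEEP_ARCHIVE"},
                ],
            }
        ]
    },
)
```

With that in place, a Recreate test only downloads hot objects; it's the fallback case, where a damaged or missing dindex forces a dblock download, that would hit Deep Archive.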

You were at one time syncing a local backup on a NAS up to the cloud, where at least the local copy could be tested well. That arrangement would also allow things like version deletes and compacting of wasted space to happen against the local copy (a rough sketch of the sync step follows).
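In case it helps, this is the core of that one-way mirror idea as a minimal boto3 sketch; the local path and bucket name are hypothetical, and a real tool such as rclone also handles deletions and size/hash comparisons:

```python
import boto3
from pathlib import Path

LOCAL_BACKUP = Path("/mnt/nas/duplicati-backup")  # hypothetical NAS path
BUCKET = "my-backup-bucket"                       # hypothetical bucket name

s3 = boto3.client("s3")

# List what is already in the bucket so we only upload new files.
existing = set()
for page in s3.get_paginator("list_objects_v2").paginate(Bucket=BUCKET):
    for obj in page.get("Contents", []):
        existing.add(obj["Key"])

# Upload any local backup volume the bucket doesn't have yet.
# Duplicati's dblock/dindex/dlist files are write-once (compacting
# creates new files rather than modifying old ones), so a name-based
# comparison is usually enough for the upload side.
for path in LOCAL_BACKUP.iterdir():
    if path.is_file() and path.name not in existing:
        s3.upload_file(str(path), BUCKET, path.name)
```

Note that compacting deletes old volumes locally, so a full mirror also needs a delete pass to remove those from the bucket (omitted here for brevity).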

Later on, you moved to backing up directly from Duplicati, and liked it. Has that favorable view worn off since?