S3 remote backup cost planning

I am setting up remote backups and plan to use AWS S3. To estimate my costs, I did a small backup test run. I'm familiar with S3 and the associated costs, and after my test I noticed data egress charges resulting from multiple GETs. I expected storage costs and PUTs, but I wasn't expecting data to be downloaded from my bucket during a backup. I set up some logs and saw that 6 PUTs were executed, matching the 6 dblock.zip files that were uploaded, but that 3 GETs followed and files were downloaded.

I assume this is part of the verification process, but I'm trying to understand how many files would actually be downloaded when I run my full backup, which will be roughly 100x the size of my test.

If this download is not part of verification, is there any other reason why Duplicati would download some files immediately after an upload?

I'm using 1GB remote volume sizes, if that matters.

Welcome to the forum!

By default, Duplicati will verify (download) 1 sample of each type of file (dblock, dlist, and dindex) after each backup operation. This happens once per backup run regardless of how much data was backed up, so a backup 100x the size of your test would still download the same number of files per run.
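
The number of sample sets is controlled by the --backup-test-samples option (1 by default). As a sketch, if you wanted more verification per run you could raise it:

--backup-test-samples = 2

Each extra sample set means up to one more dblock download, so with your 1GB remote volume size each additional set can cost roughly 1GB of egress (the dlist and dindex files in a set are comparatively small).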

Also, Duplicati may decide to do a compact operation in an effort to reduce wasted space on the back end. Compaction involves downloading files, repackaging them, and uploading new files.
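
Compaction kicks in when the proportion of wasted space crosses the --threshold option (25% by default, if I recall correctly). If you'd rather keep compaction but have it run less often, you can raise that instead of disabling it outright:

--threshold = 50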

Either of these operations can be disabled if you really want, but there are downsides: the verification helps spot hidden issues with your back end data, and compaction helps reduce the space used on the back end (lowering costs).

Thanks for the reply.

It looks like the 2 options for this would be:
--no-backend-verification = true
--no-auto-compact = true

Does that look right?

Yep, that looks right.
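
For reference, here's a rough sketch of how they'd look on a command-line run (the storage URL and source path are placeholders, and I've left out the S3 credentials and other connection options):

Duplicati.CommandLine.exe backup "s3://my-bucket/backups" "C:\Data" --no-backend-verification=true --no-auto-compact=true

If you use the web UI instead, the same option names go under the Advanced options of the backup job.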

Instead of using the cloud as a direct backup target, I back up locally and use rclone to sync that backup to the cloud. You might like that option. You can kick off the rclone sync from the --run-script-after script of the relevant backup job.
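
As a sketch, assuming a local backup folder of /backups/duplicati and an rclone remote named s3remote (both placeholders), the script can be a one-liner:

rclone sync /backups/duplicati s3remote:my-backup-bucket

and the backup job points at it with:

--run-script-after = /scripts/sync-to-s3.sh

Since rclone sync makes the remote match the local folder, any files deleted locally by compaction get cleaned up on S3 too.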

That's what I do as well. I back up to a local NAS and sync to the cloud. Verification and compaction are done against the local NAS, so they generate no S3 egress.