I have 15TB of data to back up, with roughly 10% of it changing daily. What would be the best block size and volume (dblock) size settings if the backup is being sent to a cloud store like Wasabi or B2?
Most of the data consists of files that are already internally compressed. Should I turn off compression entirely, or just add those file extensions to the compression exclusion list?
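For context, here is roughly what I was planning to try. The option names are my understanding of the Duplicati CLI (`--blocksize`, `--dblock-size`, `--zip-compression-level`), and the specific values are just guesses on my part, so please correct me if any of this is wrong:

```shell
# Hypothetical sketch of my planned Duplicati backup command.
# Values are guesses for a large dataset, not recommendations I've verified.
duplicati-cli backup \
  "s3://my-bucket/backups?s3-server-name=s3.wasabisys.com" \
  /data \
  --blocksize=1MB \            # larger blocks for a 15TB source?
  --dblock-size=100MB \        # remote volume size for cloud upload
  --zip-compression-level=0    # skip compression if most files are pre-compressed
```

I'm unsure whether setting the compression level to 0 is better than leaving compression on and relying on the extension exclusion list.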
Also, when I tested Duplicati before, it appeared to be opening and reading files that had not changed. Is that by design, or is something in my configuration causing it?