Benchmarking different storage providers

Ouch…somebody used the word “percentile”…brain…losing…focus… (I’m terrible with stats.) :smiley:

My thoughts on your numbers would be:

  1. I always assumed de-dupe was for not saving duplicate blocks between multiple files as well as between versions. That being said, working on encrypted source files will severely constrain de-dupe-ability, since even a single byte change is likely to produce a completely different encrypted file coming out the other end (see the first sketch below the list). So basically you’re probably backing up the ENTIRE file after every change, no matter how minor.
  2. Good to know about the relative stability of Box WebDAV vs. API. I’m not entirely sure I understand your spreadsheet, but did you hit the Box API errors with all the tested tools (it’s Box’s fault) or just Duplicati (it’s “our” fault)?
  3. The inconsistency in cloud provider performance has so many variables (the user’s bandwidth, other local users on the trunk, distance from the cloud provider’s servers, potential cloud provider maintenance, etc., etc.) that I think it’s almost not worth testing. Unless of course you’re testing specifically for cloud provider performance, you’d probably get more comparable numbers by testing the tools (Duplicati, Duplicacy, etc.) against locally attached or network storage, so the transfer-medium side of performance is minimized (see the second sketch below the list).
  4. I know nothing about OneDrive sync - does it provide versioning and the ability to keep locally deleted files, or is it really just a “files on OneDrive match files on local drive” tool?
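
To make point 1 concrete, here’s a minimal Python sketch (it uses the third-party `cryptography` package; the 4 KiB chunk size and the AES-CBC choice are just illustrative assumptions on my part, not what any particular tool does) showing that two encryptions of nearly identical files share essentially no fixed-size blocks, because encryption tools use a fresh random IV every time:

```python
# Sketch: why encrypting BEFORE backup defeats block-level de-dupe.
# Assumes the third-party "cryptography" package (pip install cryptography).
import os
import hashlib
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

CHUNK = 4096  # hypothetical de-dupe block size

def chunk_hashes(data: bytes) -> set:
    """Hash fixed-size chunks the way a simple block-level de-dupe engine might."""
    return {hashlib.sha256(data[i:i + CHUNK]).hexdigest()
            for i in range(0, len(data), CHUNK)}

def encrypt(data: bytes, key: bytes) -> bytes:
    """AES-256-CBC with a random IV, roughly what file-encryption tools do."""
    iv = os.urandom(16)
    enc = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
    return iv + enc.update(data) + enc.finalize()

key = os.urandom(32)
v1 = os.urandom(1024 * 1024)          # 1 MiB "file" (multiple of 16, so no padding needed)
v2 = bytearray(v1)
v2[500] ^= 0xFF                       # flip a single byte
v2 = bytes(v2)

plain_shared = len(chunk_hashes(v1) & chunk_hashes(v2))
cipher_shared = len(chunk_hashes(encrypt(v1, key)) & chunk_hashes(encrypt(v2, key)))
print(f"shared plaintext chunks:  {plain_shared} of {len(chunk_hashes(v1))}")  # all but one
print(f"shared ciphertext chunks: {cipher_shared}")                            # almost certainly 0
```

The plaintext versions share all but one chunk - which is exactly the savings the up-front encryption throws away.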
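And for point 3, the kind of harness I have in mind is nothing fancy - just repeated timed runs against a local target so network jitter drops out and you’re mostly measuring the tool itself. The `duplicati-cli` arguments below are placeholders I haven’t verified, so treat this as a sketch, not a recipe:

```python
# Sketch: time repeated backup runs against LOCAL storage to compare tools.
# The command and paths below are hypothetical placeholders.
import statistics
import subprocess
import time

RUNS = 5
CMD = ["duplicati-cli", "backup", "file:///tmp/backup-target", "/tmp/test-data"]

timings = []
for _ in range(RUNS):
    start = time.perf_counter()
    subprocess.run(CMD, check=True, capture_output=True)  # one full backup run
    timings.append(time.perf_counter() - start)

print(f"median: {statistics.median(timings):.1f}s  "
      f"min: {min(timings):.1f}s  max: {max(timings):.1f}s")
```

Reporting the median rather than the mean keeps one slow outlier run (say, an OS cache flush) from skewing the comparison.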

Thanks for these initial testing numbers!