Question about load testing

Please let me know if any load testing has been conducted. I intend to use it in a production environment with a full dataset of approximately 30 terabytes and daily incremental data of around 10 gigabytes. If such testing has taken place, could you kindly advise where I can find the test scripts, configurations, or related materials?

Hi @byron, welcome to the forum!

I think this dataset is quite a bit larger than what regular users have reported. The average size according to the usage reporter is around 200 GiB (though some overflows make the numbers unreliable).

One common adjustment is to increase the block size, which means fewer blocks to keep track of and generally faster operation. For 30 TiB I would suggest a 2-3 MiB block size, but it depends on your data.
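
For a sense of scale, here is a rough back-of-the-envelope sketch (assuming a 100 KiB default block size, which may differ in your version) of how the number of blocks to track shrinks as the block size grows:

```python
# Rough block counts for a 30 TiB dataset at different block sizes.
# The 100 KiB figure is an assumed default; check your own configuration.

KIB = 1024
MIB = 1024**2
TIB = 1024**4

dataset = 30 * TIB

for label, block_size in [("100 KiB (assumed default)", 100 * KIB),
                          ("1 MiB", 1 * MIB),
                          ("2 MiB", 2 * MIB),
                          ("3 MiB", 3 * MIB)]:
    blocks = dataset // block_size
    print(f"{label:>25}: ~{blocks:,} blocks to track")
```

At 100 KiB that works out to roughly 322 million blocks, versus about 16 million at 2 MiB, which is why the larger block size helps so much with a dataset this big.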

You probably also want larger compressed volumes, again to reduce the total number of remote files to handle. I suggest somewhere around 500 MiB to 1 GiB per volume.
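
As a rough illustration (assuming roughly 1:1 compression and a 50 MiB default volume size, both of which are assumptions and will vary with your data and setup), the remote file count drops sharply with larger volumes:

```python
# Rough remote volume counts for a 30 TiB backup at different
# volume sizes. Assumes ~1:1 compression; real counts depend on
# your compression ratio and deduplication.

MIB = 1024**2
GIB = 1024**3
TIB = 1024**4

dataset = 30 * TIB

for label, volume_size in [("50 MiB (assumed default)", 50 * MIB),
                           ("500 MiB", 500 * MIB),
                           ("1 GiB", 1 * GIB)]:
    volumes = -(-dataset // volume_size)  # ceiling division
    print(f"{label:>24}: ~{volumes:,} remote volumes")
```

That is the difference between managing on the order of 600,000 remote files and around 30,000, which matters both for the local database and for listing operations on the backend.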

See also the general guide on size settings.

I am not aware of any test setups for datasets of this size, but the comparison topic gives an overview of which parameters are effective.