Large volumes broken down into smaller parts

Hi, I’m new to the forum, and I apologize if I’m asking something that has already been answered.
OneDrive imposes a 100GB-per-file limit, and some of our customers have backups larger than that.
I currently use Cobian, but its latest version no longer offers the option to partition backups into smaller zips, so I need to plan ahead.
Does anyone know how to configure this kind of partitioning for larger files?

You set a backup volume size, and the backup is split into files no larger than that. I use 512MB, which is on the large side; the default is 50MB.

So with the default settings, a 200GB backup will produce around 4,000 data files and an equal number of index files, none larger than 50MB.

You don’t have to mess with this; Duplicati manages it automatically.
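For intuition, volume splitting is just fixed-size chunking, and the file count follows directly from the arithmetic. Here is a minimal Python sketch (not Duplicati's actual code, just an illustration of the idea, with scaled-down numbers):

```python
def split_into_volumes(data: bytes, volume_size: int) -> list[bytes]:
    """Split a byte stream into fixed-size volumes (the last may be smaller)."""
    return [data[i:i + volume_size] for i in range(0, len(data), volume_size)]

# Hypothetical 1 MiB "backup" split into 50 KiB volumes.
backup = bytes(1024 * 1024)
volumes = split_into_volumes(backup, 50 * 1024)
print(len(volumes))                   # 21 volumes (ceil of 1 MiB / 50 KiB)
print(max(len(v) for v in volumes))   # 51200 — no volume exceeds 50 KiB

# The same arithmetic at real scale: a 200 GB backup with 50 MB volumes
print((200 * 1024**3) // (50 * 1024**2))  # 4096 data files
```

Every volume stays under the limit you configure, which is why none of the uploaded files can hit the OneDrive 100GB cap.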
