I’ve used Duplicati at a small client with success, and it’s been running smoothly for almost a year now. So, first off, thank you very much for this beautiful piece of software.
The need for backups has grown at another, larger client, though, where things aren’t how they should be right now. I finally got the green light to start backing up their data.
They have a PACS server, so a lot, and I mean A LOT, of small files that together add up to a huge amount of data (probably 11TB, though not on a single drive, which means it will probably be split into different jobs).
I read somewhere that Duplicati’s database can get messy with that many files. Even after splitting into separate jobs, I’ll probably have to upload around 3TB of data per job at some point, while keeping versions. I could probably create a new job each month, but I’d like something more “set it and forget it”.
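For context on what I’ve been reading: the common suggestion for multi-TB sources seems to be raising the block size above the default, since the local database tracks every block. A rough sketch of what I’m imagining per job (the bucket name, source path, and exact values here are just placeholders I made up, not a verified recommendation):

```shell
# Hypothetical per-job invocation: a larger --blocksize (default is 100KB)
# means fewer blocks to track in the local database for a ~3TB source,
# and --dblock-size controls the size of the volumes uploaded to the backend.
duplicati-cli backup "b2://my-bucket/pacs-part1" "/data/pacs/part1" \
  --blocksize=1MB \
  --dblock-size=500MB \
  --passphrase="REPLACE_ME"
```

Note that --blocksize can’t be changed after the first backup of a job, so I’d want to get this right up front.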
Will Duplicati handle this well enough, or will I run into problems? I’d like to think my setup could be as simple as at the first client, but this is a more complex environment in its own right.