I know this is going to sound a bit harsh… but it’s nevertheless the number one question when talking about backup software: how reliable is Duplicati 2.0?
We’re currently deciding which backup software to use on our Windows machines in the future and found that some of our users were using Duplicati 1.3.4 before - so it seemed natural to check out Duplicati 2.0. We’re talking about Windows machines only; on Linux machines we rely on Borg and are very happy with it. However, Borg isn’t available for Windows (or rather requires “hacks” like WSL), so it isn’t on the table. Duplicati 2.0 sounds very, very promising!
However, since Duplicati 2.0 isn’t compatible with Duplicati 1.3.4, we had to move all old Duplicati 1.3.4 backups to a new platform, requiring us to restore all old backups and create new ones in the process. It was a disaster. I’m not going to mince matters here, so please excuse the harsh words, but Duplicati 1.3.4 is the worst backup software I’ve ever seen. Thus I’m pretty skeptical about whether Duplicati 2.0 meets its promises - Duplicati 1.3.4 made a lot of promises, too, but didn’t deliver. But I’m still very open to Duplicati 2.0 - Duplicati 1.3.4 is old and I don’t want to generalize from my experiences. However, because of those experiences I have some very specific questions:
First of all, it was close to impossible to get Duplicati 1.3.4 running at all. I know you guys no longer support Duplicati 1.3.4, and that’s totally fine, but since you decided not to maintain backwards compatibility, I must at least be able to run it somewhere. Backup software is long-term software; restoring a 10-year-old backup must not be a problem. Since I couldn’t run Duplicati 1.3.4 on newer machines, I tried virtual machines. A VM with Windows XP or Windows 7? No chance: Duplicati can’t restore from network drives, and copying 720 GB of compressed backups onto a VHD is not an option. So I tried Ubuntu 14.04. Duplicati’s CLI is rather… unorthodox to use, but after a lot of pain I got it working.
Out of 73 backups of a single machine (16 full backups, 57 incremental backups), just 3 backups (!) restored without any errors. Ignoring errors caused by Duplicati being unable to restore symlinks, there are 20 close-to-fully-restored backups (10 full, 10 incremental). Most errors were caused by corrupt volume files, which not only rendered all subsequent incremental backups useless, but even all following data in the same backup (for example, the first volume of a 22.64 GB full backup was corrupt, rendering the whole (!) full backup as well as the following 4 incremental backups useless). Another very common error was Duplicati failing to restore a diff or snapshot file. Restoring 720 GB of compressed data took about 7 days, running 24/7, even though 35 backups failed somewhere along the way due to corrupted volumes. I mean… Wow… A 27% success rate in >160 hours.
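For transparency, here’s the quick sanity check behind the 27% figure (the counts come straight from my restore logs above):

```python
# Restore statistics from my Duplicati 1.3.4 migration (counts from my logs).
total_backups = 73   # 16 full + 57 incremental
error_free = 3       # restored with no errors at all
usable = 20          # close-to-fully restored when ignoring symlink errors

usable_rate = usable / total_backups * 100
clean_rate = error_free / total_backups * 100
print(f"Usable restores:   {usable_rate:.0f}%")   # ~27%
print(f"Completely clean:  {clean_rate:.0f}%")    # ~4%
```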
To be fair, we’re talking about Duplicati 1.3.4. It’s fairly old software, and what I’ve read about Duplicati 2.0 looks promising. However, given that you guys still don’t consider Duplicati 2.0 stable (according to Wikipedia, you have focused on Duplicati 2.0 since 2013?) and that Duplicati 1.3.4 was a total nightmare, I’m not sure whether I can really trust the promises made on the website. So I want to ask for honest feedback on whether the issues of Duplicati 1.3.4 have been fixed in Duplicati 2.0.
Is Duplicati 2.0 stable? You promote it as the only supported, yet not stable (beta), version of Duplicati. According to Wikipedia, you have focused on Duplicati 2.0 since 2013, so that’s 7 years in which Duplicati 2.0 still hasn’t reached a stable version. What’s the reason for this?
How much of Duplicati 1.3.4’s code base is still present in Duplicati 2.0?
What about long-term support? I must be able to trust Duplicati to restore my backups in 10+ years - either by still supporting old backup formats or by providing a platform that can run an old version of Duplicati 2.0 to restore them. Duplicati 1.3.4 didn’t run well in virtual machines since it couldn’t access files on network shares. What about Duplicati 2.0 - can I run it in a virtual machine with Windows 10, even in 2030? What are the long-term support plans?
What does Duplicati 2.0 do to recover corrupted backups? We’re talking about long-term backups; it’s fairly common that one bit or another flips within 10 years. Is Duplicati 2.0 able to recover from these situations? Or is the data thrown away? If it is thrown away, how much data is lost (the whole corrupted backup plus n following backups? “just” the backup? a volume several hundred MB in size? or just a small chunk of a few KB?)?
How fast is Duplicati 2.0? Recovering 73 backups with a total compressed size of 720 GB taking >160 hours is unacceptable. How much time would Duplicati 2.0 take for this - assuming everything is stored on a common HDD?
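To put that duration into perspective, the effective restore throughput works out to roughly 1.25 MB/s (a rough back-of-the-envelope figure, assuming decimal gigabytes and the full 160 hours):

```python
# Back-of-the-envelope restore throughput for 720 GB in >160 hours.
data_bytes = 720e9        # 720 GB (decimal units)
duration_s = 160 * 3600   # 160 hours, running 24/7

throughput_mb_s = data_bytes / duration_s / 1e6
print(f"Effective throughput: {throughput_mb_s:.2f} MB/s")  # ~1.25 MB/s
```

Even a consumer HDD sustains well over 100 MB/s sequentially, so the bottleneck was clearly the restore process itself, not the disk.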
What issues have you guys experienced in production in the past? This includes both already-fixed and still-unfixed issues. Have you lost backups in the past due to bugs? How hard was it to recover corrupted backups? No software is perfect; I’d rather know how common issues are and how easily they can be solved.
What do you guys do when someone (like me) asks for help with a 10-year-old backup? Tell them it was made with a now-unsupported version? Or try to help with disaster recovery, even though you naturally won’t release a fixed version of such an old milestone?
Thank you for your time and feedback!