Reliability of Duplicati 2.0

To add to that, the main author of all Duplicati versions has had very little time lately, and is thus unlikely to jump in. There was more activity earlier; the GitHub contributors history shows that very few contributors go back to the beginning.

Speaking for myself, 1.3.4 is too old, 2.0 stability is too subjective, and the questioning of honesty put me off…
My usual answer to the question of why progress is not fast (at least on features) is that there are few people available.

As you might know, getting obscure, hard-to-reproduce bugs out of complicated software is very difficult.
At some point you call it stable, but it’s still not perfect. No software is. Best practice is to use multiple backup methods.

1.3.4 is ancient history to most people here. Old news articles discuss some of the changes since 1.3.4.
Block-based storage engine is on the current web site, and it links to a whitepaper describing the new design.

https://github.com/duplicati/duplicati/releases/tag/1.3.4 is 1.3.4. If you want a code comparison, have at it…
There was probably a huge amount of code movement (if not a total rewrite) due to the new core design for 2.0.

I kind of like the suggestion by @Ralf that a paid solution “might” do better for you; however, even very big companies eventually end support. I don’t know whether any of them offer specialized tools, e.g. one just to restore.
Duplicati’s offering there is the Independent restore program, which is still subject to Python and dependency drift.
How the backup process works and How the restore process works (and lower-level docs) cover the designs.

You can certainly put your specific configurations on a virtual machine, test that they work now, then document it. Duplicati 2.0 backups are completely portable between different destination storage types, which will also help.
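As a rough sketch of such a test (the storage URL, passphrase, and paths below are placeholders, and the exact options are worth checking against `Duplicati.CommandLine.exe help restore`), a direct restore on a clean VM, without the original machine’s local database, shows the backup is readable on its own:

```
# Placeholders throughout; verify options with "help restore" before relying on this.
# On Linux/macOS prefix with mono; on Windows run the .exe directly.
mono Duplicati.CommandLine.exe restore \
  "webdav://example.com/backup?auth-username=USER&auth-password=PASS" \
  "/home/me/Documents/important-file.odt" \
  --restore-path="/tmp/restore-test" \
  --no-local-db \
  --passphrase="BACKUP_PASSPHRASE"
```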

Duplicati maintains records of all backup destination file hashes, and has a variety of ways to test some or verify all of the files (the DuplicatiVerify.* scripts do need local access to the destination). So there’s some ability to detect file damage.
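A rough sketch of the CLI side of that (placeholders again, and the option names should be confirmed against the manual): the test command downloads and checks a sample of remote volumes, or all of them, and running backups with --upload-verification-file leaves a hash listing at the destination for the DuplicatiVerify.* scripts to check:

```
# Placeholders; confirm with "Duplicati.CommandLine.exe help test".
# Verify every remote volume (use a number instead of "all" for a spot check):
mono Duplicati.CommandLine.exe test \
  "webdav://example.com/backup?auth-username=USER&auth-password=PASS" all \
  --passphrase="BACKUP_PASSPHRASE" \
  --full-remote-verification

# If backups run with --upload-verification-file, the destination gets a
# duplicati-verification.json listing expected file hashes, which the
# DuplicatiVerify.py / DuplicatiVerify.ps1 scripts can check when they have
# local filesystem access to those destination files.
```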

Disaster Recovery is a lab demo of intentionally corrupting backup files and then using the Duplicati recovery tools, so you can see the progression. Sometimes you just purge some files. If things go way off, there’s a tool that forgives more types of damage, Duplicati.CommandLine.RecoveryTool.exe, but it is meant for emergency restores.
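Roughly, the escalation looks like the sketch below (placeholders again; I’d double-check the exact syntax against each tool’s built-in help and the Disaster Recovery article, and encrypted backups will also need the passphrase option):

```
# Milder damage: find and purge backup entries that reference broken remote files.
mono Duplicati.CommandLine.exe list-broken-files  "webdav://example.com/backup?auth-username=USER&auth-password=PASS" --passphrase="BACKUP_PASSPHRASE"
mono Duplicati.CommandLine.exe purge-broken-files "webdav://example.com/backup?auth-username=USER&auth-password=PASS" --passphrase="BACKUP_PASSPHRASE"

# Emergency path: pull the remote files down, build an index, then restore what can still be read.
mono Duplicati.CommandLine.RecoveryTool.exe download "webdav://example.com/backup?auth-username=USER&auth-password=PASS" /tmp/recovery --passphrase="BACKUP_PASSPHRASE"
mono Duplicati.CommandLine.RecoveryTool.exe index /tmp/recovery
mono Duplicati.CommandLine.RecoveryTool.exe restore /tmp/recovery
```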

You can search the forum for when it was last pulled out, but IIRC that was around the start of 2020, and the Usage statistics for Duplicati show about 3 million backups per month. Not ALL of them complain to the forum. :wink:

Support forums and issue trackers tend to gather issues, although big issues will make people just leave.

Big Comparison - Borg vs Restic vs Arq 5 vs Duplicacy vs Duplicati gives some insights. Try it yourself…

The little-known (and it may stay that way) Duplicati FAQ has a 2014 entry by the lead author which explained:

Q : Is 1.3.x dead?

A : Yes. We do not have the man-power to maintain 1.3.x and develop 2.0. You can continue to use it, but we do not develop on it ourselves, as we have chosen to focus on 2.0. Feel free to checkout the 1.3.x branch if you want to continue the development of that version.

So that’s the history, and the future is unknown. Perhaps some large vendor can give you a credible answer?