Is Duplicati a good solution?

Duplicati has been in a very long Beta. Beta means it's not considered ready for the Stable channel; however, no software is perfect, and I just tested a commercial product that failed ridiculously fast.

Best way to verify backup after each incremental run was a post about products (run two of them) and testing, where I wrote about the different kinds of testing done in various ways, and about the local database, which has pros and cons. If you lose it, you need to rebuild it, which can be subject to surprises, but while it exists it can add speed and serve as a (sometimes noisy) check that the destination files are as they should be.
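To make that concrete, here is a minimal sketch (not an official Duplicati tool) of running the TEST command after each backup finishes and complaining loudly if it fails. It assumes the CLI launcher is named `duplicati-cli`, that the destination URL and passphrase shown are placeholders for your own, and that the `test` command and the `--full-remote-verification` and `--dbpath` options behave as documented in current builds:

```python
#!/usr/bin/env python3
"""Rough post-backup verification sketch (not an official Duplicati tool).

Assumes the CLI launcher is `duplicati-cli` and that the `test` command,
`--full-remote-verification`, and `--dbpath` work as in current builds.
"""
import subprocess
import sys

DEST = "file:///mnt/backup/duplicati"      # placeholder destination URL
SAMPLES = "all"                            # how many sample sets to test
DBPATH = "/path/to/job-database.sqlite"    # placeholder: the job's local database
PASSPHRASE = "changeme"                    # placeholder: better to pass via environment

def verify_backup() -> int:
    # Ask Duplicati to download samples and verify them against the database.
    cmd = [
        "duplicati-cli", "test", DEST, SAMPLES,
        "--full-remote-verification=true",
        f"--dbpath={DBPATH}",
        f"--passphrase={PASSPHRASE}",
    ]
    result = subprocess.run(cmd)
    if result.returncode != 0:
        print("Backup verification FAILED; investigate before the next run.",
              file=sys.stderr)
    return result.returncode

if __name__ == "__main__":
    sys.exit(verify_backup())
```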

That post also describes some of the extreme testing that I run (partly so I can act quickly if an issue arises). What I wonder about is whether any product has enough self-testing, or easy enough user testing, to be counted on 100%. Duplicati doesn't (read the post), but the more effort one invests, the more assurance that all will end well.
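The most convincing easy user test I know of is still an independent restore-and-compare: restore to a scratch directory and hash both trees yourself, outside the backup program. Here is a small sketch of that idea; the two paths are placeholders, and it only checks file contents, not permissions, ownership, or timestamps:

```python
#!/usr/bin/env python3
"""Restore-and-compare sketch: hash a source tree against a restored copy.

Paths are placeholders; this checks file contents only, not metadata.
"""
import hashlib
from pathlib import Path

SOURCE = Path("/data/important")            # placeholder: the original data
RESTORED = Path("/tmp/duplicati-restore")   # placeholder: where you restored to

def sha256_of(path: Path) -> str:
    # Hash the file in 1 MiB chunks so large files don't exhaust memory.
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def compare_trees(source: Path, restored: Path) -> bool:
    ok = True
    for src in source.rglob("*"):
        if not src.is_file():
            continue
        dst = restored / src.relative_to(source)
        if not dst.is_file():
            print(f"MISSING in restore: {src}")
            ok = False
        elif sha256_of(src) != sha256_of(dst):
            print(f"CONTENT MISMATCH: {src}")
            ok = False
    return ok

if __name__ == "__main__":
    print("OK" if compare_trees(SOURCE, RESTORED) else "Problems found")
```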

Good practice for backups says to keep at least two (one offsite), but I'd also suggest using different programs, depending on how important the data is (and maybe more than two backups if the data is extremely important).

I think this is one of the slower areas for finding and fixing bugs, compared to backup itself, since it runs less often. Precise, reproducible steps with small data sets would help, but it's sometimes very hard to nail things down.

Error during compact forgot a dindex file deletion, getting Missing file error next run. #4129 was one I referenced earlier today when someone had that sort of result, and you can see what an effort it took to get it to the point where it's well set up for one of our scarce developers. In terms of developer count, we're low.