Is Duplicati 2 ready for production?

#11

Hello,

Today I run another backup solution, Acronis, on 15 clients in parallel.

I believe that in one more year Duplicati will be stable.

The biggest problem, I think, is in the SQLite database, but I really believe in the developers.

What is left is a little investment on our part; I will donate $30 a month.

If a good share of us did the same, this product would be top notch; there is no perfect product without investment.

Since it is open source, I can only thank the developers.

Anderson

2 Likes

#12

AFAIK, the test command requires the local database. I want to do an actual, real-world, authentic restore test where data is restored from the backup location, not just checked against the local database, but as far as I know the test won't run without the local database, which of course isn't available when you need to do a full restore.

Did I get something wrong? If so, it would be great. But I doubt it.
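
For reference, the kind of invocation I mean is roughly this (a sketch only; the storage URL, passphrase and database path are placeholders, and as far as I know it still needs the local job database):

```
# Test remote volumes against the local job database.
# Storage URL, passphrase and dbpath are placeholders.
Duplicati.CommandLine.exe test "b2://my-bucket/backup" all \
  --dbpath="/path/to/local-job.sqlite" \
  --passphrase="my-secret-passphrase"
```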

0 Likes

#13

Although it takes disk space, a restore with --no-local-db, or from the web UI's "Direct restore from backup files", would be a good test of disaster recovery. The TEST command unfortunately doesn't accept --no-local-db.
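
A disaster-recovery style restore without the local database looks roughly like this (a sketch only; the storage URL, passphrase and target folder are placeholders):

```
# Restore everything from the remote backup without using the local job database.
# Storage URL, passphrase and target folder are placeholders.
Duplicati.CommandLine.exe restore "b2://my-bucket/backup" "*" \
  --no-local-db \
  --restore-path="/tmp/dr-restore-test" \
  --passphrase="my-secret-passphrase"
```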

1 Like

#14

Yep, yet the help text for the test command doesn't state that requirement as clearly as the help for some other commands, like affected, does. That's why I've automated the full restore process. Anyway, it's good to do a full restore, because it also checks the final restored files for validity and so on. The cost of disk space is absolutely negligible in this case.

The automated restore test restores all backups in succession, runs additional validity / corruption checks on the backed-up databases along the way, and then deletes everything and tests the next data batch (see the sketch below). This is important because, even if the backup software is working perfectly, some other failure could cause a situation where the database is technically corrupted.

I would love to be able to run the test without having the local db. Of course I could first rebuild the local db and then run the test, but I don't know whether that's worth it. A full restore does the rebuild anyway and, as mentioned, it's a much better test than Duplicati's internal test.
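
A minimal sketch of what that loop looks like, with placeholder URLs and paths and a hypothetical verification script (the real automation enumerates versions instead of hard-coding them):

```
#!/bin/sh
# Sketch of an automated restore test: restore each version in turn,
# verify the restored files, then wipe the scratch area.
BACKUP_URL="b2://my-bucket/backup"        # placeholder
PASSPHRASE="my-secret-passphrase"         # placeholder
SCRATCH="/srv/restore-test"

# A real script would enumerate versions (e.g. via the list command)
# instead of hard-coding them here.
for VERSION in 0 1 2; do
  rm -rf "$SCRATCH" && mkdir -p "$SCRATCH"
  Duplicati.CommandLine.exe restore "$BACKUP_URL" "*" \
    --no-local-db --version="$VERSION" \
    --restore-path="$SCRATCH" --passphrase="$PASSPHRASE" || exit 1
  # Application-specific validity check of the restored data,
  # e.g. integrity checks on restored database dumps (hypothetical helper).
  ./verify-restored-data.sh "$SCRATCH" || exit 1
done
rm -rf "$SCRATCH"
```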

0 Likes

#15

It's especially much better than the internal test if the backup settings don't raise --backup-test-samples above the default of 1. The topic "Should backup-test-samples be changed to percentage of backup set?" suggests a way to scale that.
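
For illustration, raising the sample count on a backup job could look something like this (storage URL, source path and passphrase are placeholders, and 10 is an arbitrary value):

```
# Verify 10 random remote sample sets after each backup instead of the default 1.
Duplicati.CommandLine.exe backup "b2://my-bucket/backup" /home/user/data \
  --backup-test-samples=10 \
  --passphrase="my-secret-passphrase"
```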

0 Likes

#16

Well, one way could be to test all fresh files plus a random sample of old ones.

Anyway, what if the test shows that something is wrong? There's pretty much nothing you can do about it, because Duplicati still fails with the latest canary. All the repair options are more or less broken: test, repair, list-broken-files and purge-broken-files. The test fails, but there's no way to remedy the situation other than deleting everything and starting the backup set from a clean slate. Restore keeps failing, even though backing up the data deceptively suggests that everything is fine. The program logic is still very seriously lacking, and restores fail.
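
For reference, these are the commands I mean (the storage URL is a placeholder, and passphrase / dbpath options are omitted); in my case none of them fixed the backup set:

```
# The usual recovery sequence for a damaged backup set.
Duplicati.CommandLine.exe test "b2://my-bucket/backup" all
Duplicati.CommandLine.exe repair "b2://my-bucket/backup"
Duplicati.CommandLine.exe list-broken-files "b2://my-bucket/backup"
Duplicati.CommandLine.exe purge-broken-files "b2://my-bucket/backup"
```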

Serious things like this are the exact reason the software shouldn't be used in production. It doesn't work reliably or sanely from a logic / integrity point of view, which makes it very dangerous and deceptive to use: all the effort and resources spent creating backups that can't be restored are absolutely and literally wasted energy.

I'm forced to completely recreate a 264 GB backup set, which was over 100 GB as Duplicati files, because it is unrecoverable due to bad software logic. Very small operations would probably have been enough to fix the backup set, but that logic is missing from the software. I could have tried deleting more files manually, but the outcome would probably have been uncertain. These are exactly the things that shouldn't be handled manually.

0 Likes

#17

Oh boy. All of the complex issues I read about here have me second-guessing using Duplicati for my home DR solution. I want to set it and forget it until I need it, and I don't want to deal with all the esoteric problems that many here have with it. With my luck, when I need to do a restore, it won't work. Maybe I'll go with Cloudberry for now until more bugs are worked out and the product continues to mature.

0 Likes

#18

I have had it running for over a year without issues. I do a test restore from time to time, and all goes perfectly fine. I'm running the beta.

0 Likes

#19

Could you share your options configuration?

0 Likes

#20

I haven’t tried Cloudberry myself but haven’t heard anything bad about it…

Keep in mind that almost all posts here are focused on fixing an issue, so you don’t see anything about the thousands of backups a day where it all works.

Generally, Duplicati either works well and keeps doing so, or it is troublesome from the start and remains so. We still haven't figured out what the triggers are for one outcome vs. the other. :frowning:

Of course there’s nothing that says you can’t run two different backup solutions at the same time! :wink: We just want you to have SOMETHING doing backups because some (many?) of us know how painful data loss can be.

0 Likes

#21

The only issue with Cloudberry is the cost of the software, and that you need to pay for it on each client machine you run it on. It is also limited to 5 TB backups. You can use the free version, but it has no encryption or compression, is limited to 200 GB, and can't be run in a domain. Otherwise it's a very nice app.

0 Likes

#22

I’ve investigated dozens of alternate solutions. The problem usually is:

  1. Bad software
  2. No efficient de-duplication
  3. Lack of proper encryption
  4. Per client pricing
  5. Expensive storage

The second option after Duplicati was actually SOS Online Backup. You only pay for storage. But it's still problematic if you've got databases sized in terabytes and the de-duplication is done on the server side. SOS Online does provide proper client-side "pre-cloud" encryption as an option.

If I had found a perfect alternative to Duplicati, I wouldn't be using Duplicati, because there are serious problems, as mentioned. But on paper (excluding bugs) the software is absolutely awesome; the features are really good.

I wouldn't bother whining about the problems if it weren't worth it. :wink:

0 Likes

#23

That sounds useless… Encryption effectively scrambles the data, ruining any server-side deduplication potential.

0 Likes

#24

Keep in mind that almost all posts here are focused on fixing an issue, so you don’t see anything about the thousands of backups a day where it all works.

I can provide some numbers for that. At duplicati-monitoring.com, we have received more than half a million Duplicati backup reports within the last year. 84% were reporting successful backups:

[pie chart: 84% of received backup reports were successful]

Of course this does not tell you anything about restore problems etc., but it may give you some idea. I guess most failed backups are not due to a Duplicati bug, but due to issues like backing up to a USB drive that is not connected, exceeding the remote storage quota, etc.

2 Likes

#25

Thanks for that pie graph.
I think the ratio of successful backups can be a little bit misleading.

For example, I have a backup job that has worked for more than a year, but last week I had a reboot during a backup.

When I was verifying and purging, I found out that some old backup files, unmodified for a couple of months, also had wrong hashes.

When I was purging those files I ran into a bug (from 2016, still open…) and the DB recreate failed.

So one interrupted backup was enough to completely kill my backup set, and I will probably be forced to start from zero.

I think Duplicati can be a good tool for production, but it must not be the only one. It's still too fragile.

2 Likes

#26

Just to put "my problems" in context: I run 100+ different backup jobs daily, and 1000+ weekly. I also try to test the backups roughly monthly; I've fully automated the backup testing.

I see problems every now and then, and I’ve got automated monitoring and reporting for all of the jobs.

But there are fundamental issues: things which really shouldn't break do break, and the recovery from those issues is at times poor to nonexistent. That simply means the overall reliability isn't good enough for production use. You never know when you won't be able to restore the data anymore, or when a restore will take a month instead of an hour and still fail.

Many of those issues are linked to secondary problems. First something goes slightly amiss, and then the recovery process is bad, and that's what creates the big issue. In production-ready software, both of those recovery paths should work: first, the primary issue should be dealt with efficiently and automatically, and even if that fails, the secondary recovery process, whether automated or manual, should work in some adequate and sane way.

Take file systems or databases as an example. If the system is hard reset, the journal should allow recovery on boot, and even if that fails, there should be a process to bring consistency back, even if it isn't as fast as the journal-based roll forward or backward. Only if the software is really bad does the file system get corrupted during normal operation, without any hard reset. Currently the backup works out mostly OK, but you never know when it will blow up, and at times it has blown up even without a hard reset, which really worries me. As I've mentioned, recovering from that has usually been nearly impossible.

Sure, I've had similar situations with ext4 and NTFS, but in most cases the cause was a totally broken SSD or "cloud storage backend". And it's joyfully rare for a file system to become absolutely and irrecoverably corrupted. We're not there with Duplicati yet.

Anyway, as mentioned before, Duplicati's features are absolutely awesome. If I didn't really like the software, I wouldn't bother complaining about the issues; I would simply choose something different.

0 Likes

#27

It would be nice to be able to recover data even from a corrupt archive, at least to be able to recover all the good parts.
For safety I use Cobian for full monthly backups and Duplicati 2 for the daily ones.

0 Likes

#28

I think data is almost always recoverable via Duplicati.CommandLine.RecoveryTool.exe.
It's not pleasant or fast, but it works.
For example, in my case, when a backup was interrupted and I could not back up, restore, or recreate the DB, I was able to recover the data via that tool without problems or warnings.
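
Roughly, the sequence looks like this (a sketch only; the storage URL, passphrase and local folders are placeholders, and the exact options depend on the backup):

```
# 1) Download and decrypt all remote volumes to a local folder.
Duplicati.CommandLine.RecoveryTool.exe download "b2://my-bucket/backup" /srv/recovery \
  --passphrase="my-secret-passphrase"
# 2) Build an index over the downloaded volumes.
Duplicati.CommandLine.RecoveryTool.exe index /srv/recovery
# 3) Restore the files from the downloaded data.
Duplicati.CommandLine.RecoveryTool.exe restore /srv/recovery --targetpath=/srv/recovered-files
```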

0 Likes

#29

It's also almost always possible to restore "directly from files", as that creates a new, simpler database.

0 Likes

#30

Yes, but that could take weeks or months; that's the problem. Even with a quite small backup, a restore isn't completed in a weekend.

Sometimes restore times are quite critical for production systems. It won't help much to tell the customer that the system will be back in a month. OK, it's a positive thing if the restore works and "nothing is lost", but it's still a huge mess. And it gets even better if, after that month, you find out that the restore actually failed. Ouch!

0 Likes