Is Duplicati a good solution?

Hi Guys,

I have a cloud server at Asphostportal and I’ve been trying to find a good solution for backing up my important files there. I’m having a hard time evaluating the various software options because I don’t know which features I actually need. Their managed services would cost extra, so I plan to back up on my own.

I’m confused by articles saying I need a “syncing” solution and not a “backup” solution, though. Can I use Duplicati as a straight copy-to-NAS (with deduplication), or am I forced to look for non-backup software?

Thank you


It depends on the risks you care about.
If what concerns you is hardware failure, both syncing and backup are viable solutions.

If you are concerned about accidental or malicious deletion of files that you only notice are missing some time later, only backups give you a chance to recover them. When a file is deleted from the source, a sync deletes it from the target at the next synchronisation, while a backup keeps it for as long as your retention policy allows.
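The difference can be sketched as a toy model (illustrative only, not Duplicati code; all names are made up): a sync mirrors the source, so a deletion propagates, while a backup appends snapshots that survive until retention expires.

```python
# Toy model: what happens to a deleted file under "sync" vs
# "backup with retention". Dicts stand in for real storage.

def sync(source: dict, target: dict) -> None:
    """Mirror the source exactly: deletions propagate to the target."""
    target.clear()
    target.update(source)

def backup(source: dict, versions: list) -> None:
    """Append a snapshot; old versions stay until retention removes them."""
    versions.append(dict(source))

source = {"report.docx": "v1"}
mirror: dict = {}
history: list = []

sync(source, mirror)
backup(source, history)

del source["report.docx"]       # accidental (or malicious) deletion

sync(source, mirror)            # next sync: the copy is gone too
backup(source, history)         # next backup: a new, empty snapshot

print("report.docx" in mirror)      # False - the sync lost it
print("report.docx" in history[0])  # True  - the old snapshot still has it
```

So with a sync, the moment you notice the file is missing is usually the moment the last copy is already gone; with a backup, you can still reach back to an earlier snapshot.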

Hi gpatel,

Thanks for your reply. Actually, I’m mostly afraid of a virus or malware, something like that.

What you should NOT do, then, is rely on any online duplicate, that is, a copy that is reachable from the protected system through standard operating-system access, typically a network drive, because the locker software could encrypt the duplicated data just as easily as the original. Backing up through a network share is what some Duplicati users do, and I don’t think it’s a good idea.

What I do is always back up over SFTP. With a NAS, I create a partition on the NAS and don’t enable it for CIFS access (in other words, there is no Windows share). That way the risk is practically non-existent unless a malicious adversary is specifically targeting the system and is competent enough to obtain the backup configuration, password, and so on, which is very unlikely unless your threat model is a state actor or something like that. You can also use cloud storage over SFTP, or any of the other protocols that Duplicati supports.
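The reasoning can be shown with a toy model (purely illustrative, not real malware behaviour in every detail): locker software typically encrypts every path it can reach through normal filesystem access, which includes mounted network shares, but not a copy that exists only behind separate SFTP credentials.

```python
# Toy model: dicts stand in for "what the OS can see as a filesystem"
# vs "a NAS partition reachable only by SFTP login". Paths are made up.

reachable = {                                  # visible to the OS
    "C:/docs/report.docx": "plaintext",
    "Z:/nas-share/report.docx": "plaintext",   # mapped network drive
}
sftp_only = {                                  # needs SFTP credentials,
    "/backup/report.docx": "plaintext",        # never mounted as a drive
}

def locker(filesystem: dict) -> None:
    """Encrypt everything reachable through the filesystem."""
    for path in filesystem:
        filesystem[path] = "ENCRYPTED"

locker(reachable)   # the malware only walks mounted paths

print(reachable["Z:/nas-share/report.docx"])   # ENCRYPTED
print(sftp_only["/backup/report.docx"])        # plaintext
```

The point: a network share is just another folder to the malware, while the SFTP-only copy would require it to separately discover and use the backup credentials.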

There’s still a problem with compacting that could lead to data corruption and a backup that can’t be restored.

I wish I had known this earlier, because this problem really means you can’t trust the backup, which is a bad deal. A disaster-recovery backup should be highly available, and having it covertly corrupted almost completely negates the point of having the backup in the first place.

Repeated restore testing is also a tedious (and potentially expensive and slow) task. But without doing it, taking backups is pointless, because you may find you can’t restore them.
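A basic restore test can be as simple as restoring to a scratch directory and comparing file hashes against the live source. A minimal sketch of that idea (an assumed workflow, not a built-in Duplicati feature; function names are mine):

```python
# Sketch: after restoring a backup into restored_dir, compare SHA-256
# hashes of every source file against its restored counterpart.

import hashlib
from pathlib import Path

def sha256(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_restore(source_dir: Path, restored_dir: Path) -> list:
    """Return relative paths that are missing or differ after restore."""
    bad = []
    for src in sorted(source_dir.rglob("*")):
        if src.is_file():
            rel = src.relative_to(source_dir)
            restored = restored_dir / rel
            if not restored.is_file() or sha256(src) != sha256(restored):
                bad.append(str(rel))
    return bad
```

Note this only catches corruption while the source still matches the backup; for older versions you would have to restore them and spot-check by hand, which is exactly why this gets tedious.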

I just wrote a blog post about this, and I’m only saying this to warn people about the risk. As far as I know, it’s the only real show stopper before I would call it a 1.0 (non-beta!) release. In technical terms, the source of the problem is very small, but its effects can be truly devastating.

Note: as far as I’ve observed, you can avoid this risk by never running compact. But that means your backup data set will grow forever and no data is removed from it (except whole volumes whose blocks are all unused, which can happen if you’re using small files and/or deleting large, non-updated data sets).
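Why skipping compact makes the backend grow forever can be shown with a toy model of a deduplicated block store (illustrative only, not Duplicati’s actual storage format): deleting a backup version only drops references, and without a compact pass the now-unreferenced blocks stay on the remote.

```python
# Toy model of a deduplicated backup store: blocks are stored once,
# versions hold references. Python's hash() stands in for SHA-256.

store = {}      # remote block store: hash -> data
versions = []   # each version is a set of block hashes

def backup(blocks: list) -> None:
    refs = set()
    for b in blocks:
        h = str(hash(b))
        store.setdefault(h, b)   # dedup: store each block once
        refs.add(h)
    versions.append(refs)

def delete_version(i: int) -> None:
    versions.pop(i)              # references gone, blocks remain

def compact() -> None:
    live = set().union(*versions) if versions else set()
    for h in list(store):
        if h not in live:
            del store[h]         # reclaim space for dead blocks

backup([b"A", b"B"])
backup([b"B", b"C"])
delete_version(0)                # block A is now unreferenced
print(len(store))                # 3 - still holding the dead block
compact()
print(len(store))                # 2 - space reclaimed
```

This is the trade-off in the note above: compact is what actually reclaims space, so never running it trades the corruption risk for unbounded growth.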

Duplicati has been in a very long Beta, Beta meaning it’s not considered ready for the Stable channel. However, no software is perfect, and I just tested a commercial product that failed ridiculously fast.

“Best way to verify backup after each incremental run” was a post about products (I run two) and about the different kinds of testing done in various ways, plus the pros and cons of the local database: if you lose it, you need to rebuild it, which can bring surprises, but while it exists it adds speed and serves as a (sometimes noisy) check that the destination files are as they should be.

That post also describes some of the extreme testing that I run (partly to be quick to act if an issue arises). What I wonder is whether any product has enough self-testing, or easy enough user testing, to be counted on 100%. Duplicati doesn’t (read the post), but the more effort one invests, the more assurance that all will end well.

Good practice for backups says keep at least two (one offsite), but I’d also suggest different programs depending on how important the data is (and maybe more than two if the data is extremely important).

I think this is one of the slow spots for finding and fixing bugs compared to backup itself, since compact runs less often. Precise, reproducible steps with small data sets would help, but it’s sometimes very hard to nail down.

“Error during compact forgot a dindex file deletion, getting Missing file error next run” #4129 was an issue I referenced earlier today when someone had that sort of result, and you can see what an effort it took to get it to the point where it’s well set up for a scarce available developer. In terms of developer count, we’re low.

Afaik that’s not a problem with the latest canaries anymore; I haven’t seen it for quite a while. Getting that error was better than not getting it, since it also told me when backups weren’t working, and running repair fixed it. I think I posted several times about that specific issue as well. That problem would also trigger an immediate alert, because the backup won’t run, so I would most likely know if it were happening.
I know some issues take effort; as I’ve said earlier, I’ve got my own (more or less) buggy code bases to work with as well.