Do you people trust Duplicati?

I’ve been using Duplicati for about 1½ years if I remember correctly, and I would really like to like it, but I just can’t. It does the one thing I need a backup software to do: it automatically backs up my data to my FTP server. It’s the first software I tried, and I was very happy to see it was really simple to get configured.

But two things bother me so much that I’m giving up. When it makes backups, it gives some error(s) or warning(s) every time. The first few times after configuring a backup it works, but after a while, errors and warnings every time. And the log (or whatever that is I get after clicking ‘result’ after a backup) gives me a ridiculous amount of data, 100% of which seems to be tech talk with no clear reason for anything.

The backups still seem to finish, and when I go to restore, usually even the latest files are there, so that’s not the biggest problem.

But for me, and as I’ve read, for many others, the real problem is restoring stuff. If I make a nice small test backup with 100 files / 45 MB, the restore works well. But when I try to restore one ******g .jpg from my real backups, it just keeps going and going and going. And it’s not about my connection; the restore doesn’t even transfer data for more than a few seconds every now and then. From what I’ve read, people have had their restores running for a few days, so I’m not alone with this. I’ve read this and that and some more about it, but nothing helps.

Right now I’m testing a restore on my new (= fresh install) laptop, and it managed to finish in half an hour (didn’t expect that!), but again: 1 warning, 5435 errors. And the backup set is just a few months old.

So, I’ve had it. It’s like finding a perfect girlfriend who turns out to be everything else. Very nice, but not something I trust with my 20+ years of (digital) history.

Sorry to hear about your troubles, but it’s perfectly understandable to feel that way given your evidence. I’ve also been using Duplicati for about 1.5 years, but for me the backups I leave alone are working great.

Granted, the same can’t be said about the ones I use for testing when doing forum support - but that’s not unexpected, as I’m usually TRYING to break it the same way it broke for a user. :slight_smile:

Good luck in your search for a new tool! If you find something you like that you feel is feature comparable to Duplicati please stop by to let us know.

Also consider checking back in a while to see if Duplicati is out of beta (or at least stable enough for your needs).

And of course, don’t forget - there’s nothing that says you can’t have two different backup programs running on your system… :wink:

No, and this has been discussed earlier. Just today I started running backup verification again, and found several backups which look OK until you try to do a full restore, and then it fails.

Until these kinds of traps are fixed, the only obvious answer is: absolutely not.

Is Duplicati 2 ready for production? has other anecdotal comments and some statistics. I think it does vary. Generally it’s best practice to create multiple backups using different software, especially if files are critical.

Duplicati is young software and still in beta because problems do occur from time to time. Study the forum. The tech talk Duplicati provides helps with assisted diagnosis now, and may lead to better messages later.

My personal opinion is that Duplicati is better suited for shorter-term protection than historical file recovery. What worries me greatly is people who put their files in Duplicati then delete original files to make space…

My personal take is that I trust it enough to use it, but not in ways that would hurt a lot if something goes wrong. Additionally, I do test restores from time to time (they work), which I recommend for any backup software.

Good luck in your search!

I’ve been using Duplicati with a small-ish company for just shy of a year now. Duplicati is our 3rd-level backup system, as levels 1 and 2 have random issues themselves. Level 1 is the primary critical backup with a commercial solution they’ve used for years; it cost them a lot of money and uses a LOT more storage space than it really should (almost a full backup every day). Level 2 is the same commercial system, but backing up off-site to a Synology box.

I see occasional errors, but it usually ends up being stuff that does not affect the backup/restore process itself. Anytime I do see something like what the original poster is describing, it usually ends up being file permissions, files in use by another process/backup, OS issues, other drive usage that limits file access, or something else affecting it - almost never Duplicati itself (unless it is a documented bug).

Files in use, lots of changed files, and lots of smaller files can all increase backup time; the same goes for restores, depending on what else is in use on the same system.

Some also need to remember that Duplicati can only run one job at a time, so a backup in progress needs to finish before it moves on to the next job (which may be another backup or your restore). Personally, every single time I’ve needed to restore a file or folder during the day (no backups scheduled), it restored within 10-20 minutes (more files = longer restore time).

I kept having issues/errors/warnings, but it all came down to wrong configuration.
Currently I’m using it for incremental backups of 250 GB+ and it works without a single warning on 3 separate networks.

Good to hear we have an opportunity to improve our interface / instructions / help tips! :wink:

I’ve had a similar experience in the past, especially with Windows instances, and have even had a restore of a very valuable file set fail: I could hardly get any data out of it (fortunately the data had also been backed up in two other ways - Code42 and a worst-case-scenario dead-simple 1:1 clone).

I installed Duplicati on my Ubuntu VM on Unraid since it was the only backup application I could find that fulfilled my needs. It’s used to back up a few Minecraft server instances: it shuts the game down before performing the backup (to ensure it’s atomic), then starts it back up with a script post-backup. The scripts also announce on Discord, through a bot, that the server is down for maintenance, and notify players when it’s back up.

At first, I used SMB (Samba on the Unraid host) to perform the backups, but after a while all backups would end up throwing errors, failing, and corrupting. According to some forum posts, the issue is with SMB, a very complex protocol. Mounted Samba shares sound like one of the most likely use cases, so it’s bananas that a backup application has issues with them. But, just for the sake of trying, I switched from SMB to KVM’s “9p” protocol for shares, and - to my great surprise - after creating a new backup it hasn’t thrown a single error, and I’ve been using it successfully for about a year. I even rescued my data at one point by successfully restoring a backup when (sadly) I needed it.
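For anyone wanting to wire up a similar shut-down/start-up pattern: Duplicati exposes script hooks as advanced options on the backup job. A minimal sketch (the script paths are made-up placeholders, not the poster’s actual setup):

```
--run-script-before-required=/opt/minecraft/stop-server.sh
--run-script-after=/opt/minecraft/start-server.sh
```

With the `-required` variant, the backup is aborted if the pre-script fails, which is what you want when the point of the script is to make the backup consistent.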

I use Duplicati to back up 3 computers in my office and all of my family’s computers. I have had to do full restores several times when a family member’s laptop has died. It has never failed me; I always got my files restored.

I also frequently do small directory restores when I need to recover things from a week or two ago. It works consistently for me.

I back up to a local 2 TB USB drive and online to Backblaze.

I trust Duplicati.

Duplicati can deal with the regular crashes I’m having, and even with a half-functional backend connection (one that cannot download). I think the configuration needs to be changed (WebDAV, very confusing for me). Even missing files can be repaired (it only happens with the WebDAV backend sometimes). I don’t know why I shouldn’t trust Duplicati.

For what it’s worth, I use the WebDAV back end on most of my machines at home. WebDAV service is running on a Synology NAS, which then syncs to B2. I also use B2 natively on some other machines.

No problem with either setup. Duplicati is very reliable for me, I do trust it.

I trust Duplicati and it is my mainline for big restores, but I also run another system in parallel for spot (one file/folder) restores.

The DB corruption previously (versions from about 6 months ago) was very concerning, but it seems to be fixed now.

I’d give it another year or so before trusting it more.

How are you doing it? I’ve read about backslashes and forward slashes making a difference; in my setup there are forward slashes. Should there be a leading slash in the path? Are there rules to follow?

Use forward slashes in the WebDAV paths. No leading slash is required but adding one doesn’t break it in my experience. I also use no trailing slash in the path.

Only use backslashes when you’re running Duplicati on Windows and you’re specifying the source data. Even on Windows, use forward slashes on the Destination configuration page (unless, I suppose, you’re saving to a local disk). All the other protocols expect regular forward slashes as far as I know.
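As a concrete sketch, a Duplicati WebDAV destination URL with forward slashes and no trailing slash might look like this (the hostname and path here are made-up examples, not a tested configuration):

```
webdav://nas.example.com/backups/laptop?auth-username=backupuser&use-ssl=true
```

The same forward-slash path appears whether you build the URL by hand or fill in the Server and Path fields on the Destination page.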

Thank you very much, I’m going to try it out. Others got the same problem with rclone, so it’s the cloud provider. And rclone can’t even read the file structure, so I’m better off with Duplicati :sunglasses: The same address refers to the website, so I’m going to try other configurations and let it run if there’s no solution; I’ll have to choose the provider better next time. But now I know there shouldn’t be any backslashes, and it will run fine :wink:

I don’t trust any service. Just follow the “3-2-1” rule of backups (3 copies of your data, on 2 different types of media, with 1 copy off-site)!