CrashPlan is starting to annoy me, so I’ve been looking at alternatives, and so far Duplicati has by far the best versioning settings. My plan is to run a “frequent changes” backup with a small block size for things like browser profiles, a “main” backup for general day-to-day stuff like my user folder, and a “static” backup with a 1GB block size for bigger files.
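For context, this is roughly what I have in mind, written out as Duplicati CLI calls (duplicati-cli here, Duplicati.CommandLine.exe on Windows). The paths, destinations and exact sizes are placeholders, and I’m reading “block size” loosely as a mix of --blocksize and the --dblock-size remote volume size, so treat this as a sketch rather than my actual config:

```sh
# "frequent changes" job: small blocks for stuff like browser profiles
duplicati-cli backup file:///mnt/backup/frequent \
  "/home/me/.mozilla/" "/home/me/.config/chromium/" \
  --backup-name=frequent --blocksize=100kb --dblock-size=50mb

# "main" job: general day-to-day stuff, i.e. the user folder
duplicati-cli backup file:///mnt/backup/main \
  "/home/me/" \
  --backup-name=main --blocksize=1mb --dblock-size=200mb

# "static" job: big, rarely-changing files, large (1GB) remote volumes
duplicati-cli backup file:///mnt/backup/static \
  "/data/media/" \
  --backup-name=static --blocksize=10mb --dblock-size=1gb
```

The idea is that the jobs covering volatile files upload small volumes often, while the static job uses big volumes that rarely get touched.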
In the two days I’ve been trying to get things set up, I’ve had numerous backup jobs simply break. The first time, I crashed the PC by running about four different backup programs at once (while testing them) during a gaming session, and after restarting, Duplicati was unable to continue backing up, even after recreating the database. A few jobs have also died completely when I stopped a backup midway through and tried to resume later. An hour ago I changed the destination path and the job broke entirely; repairing the database so it could back up to the new location broke it beyond repair.
My question is: can I actually rely on this? I’ve run into way too many issues this early on, and it definitely makes me wary.
Looking through my history, here are some of the issues I’ve had that resulted in a broken backup:
One thing that would be nice, by the way, is global exclusion settings; it makes no sense to have to copy the same filters into every job.
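To illustrate, at the moment the same excludes end up copy-pasted into each job, roughly like this (the filter patterns are just examples):

```sh
# the exact same excludes repeated on every single job
duplicati-cli backup file:///mnt/backup/main "/home/me/" \
  --exclude="*/Cache/*" --exclude="*.tmp" --exclude="*/Trash/*"

duplicati-cli backup file:///mnt/backup/static "/data/media/" \
  --exclude="*/Cache/*" --exclude="*.tmp" --exclude="*/Trash/*"
```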
Edit: Since posting this 2h ago, I’ve hit yet another “Found inconsistency in the following files while validating database: , actual size 214016, dbsize 0, blocksetid: 914 . Run repair to fix it.” error, and repair does nothing. Guess I’m deleting and recreating that backup yet again.
Edit 2: 6h after posting, another backup has failed. It claimed the database needed repair, so I ran that, and now I get the message “The database was attempted repaired, but the repair did not complete.” Looks like I’m recreating that backup too.