I am transitioning from CrashPlan and running Duplicati in parallel to gain experience and build up a history of file versions. I use the GUI on Windows and don’t want to learn/use the command line. Above all else I want the confidence that if/when disaster strikes Duplicati will save the day, and TBH some of the problems I read about corrupt databases and restoring make me nervous.
I have set up duplicate backups, encrypted, to B2 and OneDrive, although I intend to use B2 long term as I assume it is designed for the purpose and likely to be faster/more reliable. Is that a reasonable conclusion? I could continue with both, as OneDrive is effectively free.
My data is all on my D: drive, which, as a Windows 10 tinkerer, lets me restore my C: drive from a lean and clean image on a regular basis. CrashPlan handles synchronising the by-then out-of-date database seamlessly and quickly, with no re-uploading of files required. Will Duplicati do the same once freshly installed, i.e. without a database, or with an out-of-date one? I have taken the precaution of regularly backing up the Duplicati sqlite files, so would it be better to copy them back across before allowing Duplicati to resync? I did move the databases, but then Duplicati would not back them up because they were open (VSS doesn’t help?)
I find the logs/reports from Duplicati unfriendly and difficult to read/interpret, and am using “Duplicati Monitoring”, which at least formats them nicely. Is there a local viewer tool available, or perhaps a way to get the data into Excel in a decent format?
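One DIY option: if you can get a backup report as JSON (Duplicati can email or HTTP-POST its results), a few lines of Python will flatten it into a CSV that Excel opens directly. This is only a sketch, and the field names below are assumptions for illustration; match them against the keys your own reports actually contain.

```python
import csv
import io
import json

# Hypothetical report payload -- check your own Duplicati reports
# for the real key names; these are illustrative assumptions.
report_json = """{
  "ParsedResult": "Success",
  "BeginTime": "2024-01-01T02:00:00Z",
  "Duration": "00:12:34",
  "ExaminedFiles": 10234,
  "AddedFiles": 17,
  "SizeOfAddedFiles": 1048576
}"""

# Columns to pull out, in the order Excel should show them.
FIELDS = ["BeginTime", "ParsedResult", "Duration",
          "ExaminedFiles", "AddedFiles", "SizeOfAddedFiles"]

def report_to_csv(raw: str, out) -> None:
    """Write one report as a header line plus one CSV data row,
    ignoring any extra keys in the JSON."""
    data = json.loads(raw)
    writer = csv.DictWriter(out, fieldnames=FIELDS, extrasaction="ignore")
    writer.writeheader()
    writer.writerow({k: data.get(k, "") for k in FIELDS})

buf = io.StringIO()
report_to_csv(report_json, buf)
print(buf.getvalue())
```

In practice you would write the header once and append one row per nightly report, then open the resulting `.csv` in Excel.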
I have been using Duplicati almost a year now on 10 computers and like it a lot. It can be a bit trickier to configure compared to CrashPlan, but it is sooo much more flexible.
Regarding your question about redoing your C: drive - I haven’t done that exactly, but I have done PC migrations where most of the data is moved to a new computer. I uninstall Duplicati from the old machine, install it on the new one, and transfer the sqlite files over. When I start Duplicati on the new machine everything just works perfectly.
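For reference, the copy step can be scripted. Below is a minimal sketch in Python; the `copy_duplicati_databases` helper is my own invention, and the usual Windows database location of `%LOCALAPPDATA%\Duplicati` is an assumption you should verify on your own machine. The demo runs on throwaway temp folders standing in for the old and new machines.

```python
import shutil
import tempfile
from pathlib import Path

def copy_duplicati_databases(old_dir: Path, new_dir: Path) -> list[str]:
    """Copy every Duplicati *.sqlite file from old_dir to new_dir.

    On Windows the databases normally live in %LOCALAPPDATA%\\Duplicati
    (an assumption -- verify the path on your install). Stop the
    Duplicati service/tray icon on both machines first, so no file
    is open mid-copy.
    """
    new_dir.mkdir(parents=True, exist_ok=True)
    copied = []
    for db in sorted(old_dir.glob("*.sqlite")):
        shutil.copy2(db, new_dir / db.name)  # copy2 keeps timestamps
        copied.append(db.name)
    return copied

# Demo on temp folders standing in for the two machines.
with tempfile.TemporaryDirectory() as tmp:
    old = Path(tmp) / "old-machine" / "Duplicati"
    new = Path(tmp) / "new-machine" / "Duplicati"
    old.mkdir(parents=True)
    (old / "Duplicati-server.sqlite").write_bytes(b"dummy")
    (old / "ABCDEFGHIJ.sqlite").write_bytes(b"dummy")
    print(copy_duplicati_databases(old, new))
```

The same effect is achieved by copying the folder in Explorer; the point is simply that the `.sqlite` files are what carry the job state.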
If you didn’t want to or couldn’t move the sqlite files for some reason, you would need to recreate the backup job and then have Duplicati regenerate the local database by reading the remote backup data.
Thanks, that makes me feel a lot more confident. So if the local and remote data was identical, Duplicati would create a new database without uploading or downloading any files. What I would expect/hope of course.
Would a regenerated database pick up all the versioning as well? Is it a reasonably speedy operation for a smallish data set (80 GB)? I’ll keep backing up the sqlite files anyway.
I suppose I could/should cause a “disaster” by deleting a database and job and see what happens.
Well, no. If you didn’t have the sqlite files and asked Duplicati to rebuild the database, it WOULD have to read the remote backup data, but it wouldn’t have to download everything. On the remote end Duplicati stores three types of files: dblock files (which contain the actual file data), plus dindex and dlist files (metadata about the backed-up data). When you need to recreate the local database, only the metadata files are downloaded, and they are significantly smaller.
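To make that concrete, here’s a toy sketch. The three-way dblock/dindex/dlist split is from the explanation above; the file names and sizes in the listing are made up for illustration, so treat the numbers as hypothetical.

```python
def recreate_download_set(remote_files: dict[str, int]) -> tuple[list[str], int, int]:
    """Given remote file names mapped to sizes in bytes, return the
    files a database recreate would fetch (the dindex/dlist metadata),
    their combined size, and the total size of everything stored."""
    metadata = [name for name in sorted(remote_files)
                if ".dindex." in name or ".dlist." in name]
    meta_bytes = sum(remote_files[n] for n in metadata)
    total_bytes = sum(remote_files.values())
    return metadata, meta_bytes, total_bytes

# Made-up listing: the dblocks carry the real data, so they dominate.
listing = {
    "duplicati-b1.dblock.zip.aes": 52_428_800,  # ~50 MB of file data
    "duplicati-b2.dblock.zip.aes": 52_428_800,
    "duplicati-i1.dindex.zip.aes": 40_000,      # small metadata
    "duplicati-i2.dindex.zip.aes": 40_000,
    "duplicati-20240101T000000Z.dlist.zip.aes": 120_000,
}
files, meta, total = recreate_download_set(listing)
print(files)               # only the dindex/dlist files
print(meta, "of", total)   # 200000 of 105057600
```

So in this made-up example a recreate would pull about 200 KB of a roughly 100 MB backup, which is why rebuilds are usually much cheaper than a full download.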
After the local database is recreated then Duplicati knows about all the backed up data, including all versions.
I would recommend testing it to help you feel more confident.
Thank you, that’s a big help. I am becoming more familiar and more confident with and in Duplicati; still a way to go though, but then that’s why I’m running parallel with Crashplan.
I will test recovery both with and without copying back the sqlite files using a fresh install on another pc.
Yeah, I did the same when I first started using Duplicati. But when you do this test on a secondary computer, make sure Duplicati is not active on your main computer. Two computers running Duplicati should never access the same back-end data at the same time.