[Idea] Backup Duplicati database to avoid recreate

What’s the 500 MB in the example? If it’s the entire database, you wouldn’t be sending it to a database, but maybe you’d send it to a remote destination for safekeeping. Assuming that was the idea, the question becomes how to restore the database from its backup, which would be a secondary job run after the main job, because a backup can’t back up its own database while that database is still changing…
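
In case a concrete shape helps, here’s a minimal sketch of that staging step, with made-up paths and a placeholder database filename (yours will differ): copy the main job’s now-idle database into a folder that the secondary job uses as its source. It could be hooked to the main job via the run-script-after option so it fires once that job has finished.

```python
#!/usr/bin/env python3
"""Rough sketch (not a built-in Duplicati feature): after the main job finishes,
stage a copy of its now-idle database so a separate secondary job can back it up.
All paths and the database filename are assumptions -- adjust to your own setup."""
import shutil
from pathlib import Path

PRIMARY_DB = Path.home() / ".config" / "Duplicati" / "ABCDEFGHIJ.sqlite"  # main job's database (name varies)
STAGING = Path.home() / "duplicati-db-staging"                            # source folder of the secondary job

STAGING.mkdir(exist_ok=True)
# The main job is done, so the database is no longer changing and a plain
# file copy gives the secondary job a consistent snapshot to back up.
shutil.copy2(PRIMARY_DB, STAGING / PRIMARY_DB.name)
print(f"Staged {PRIMARY_DB.name} for the secondary backup job")
```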

If you’ve lost the whole original Duplicati, you’d probably bootstrap getting back by either setting up the secondary job again, or just doing Direct restore from backup files, assuming you have the necessary info.

Probably the usual way to reinstall a database is to use the Database management screen, which tells you where the database belongs, so you can either put the restored database there or point Duplicati at the new database file.

Deduplication is space efficient but can slow down the restore of a file that undergoes constant change, because different generations of changes may wind up in different backup files, all of which might need downloading.

A database is a great example of a potentially big file that undergoes constant change scattered throughout, making deduplication less effective. People who have tried this kind of backup have found they upload just about the full database size every backup. That, plus the more reliable Recreate in 2.0.5.1, makes DB backup less attractive.
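
To put rough numbers on that, here’s a toy sketch (illustrative only, with invented sizes): scatter a thousand small page updates across a 100 MiB file and count how many 100 KiB blocks get dirtied and would need re-uploading.

```python
"""Toy illustration of why scattered small writes defeat fixed-block deduplication.
A 100 MiB pretend database takes 1,000 random single-page updates; count how many
100 KiB blocks changed between backups. Sizes and counts are made up."""
import random

BLOCK = 100 * 1024             # Duplicati's default blocksize
FILE_SIZE = 100 * 1024 * 1024  # 100 MiB toy "database"
WRITES = 1000                  # scattered page updates; each dirties the block it lands in

random.seed(0)
dirty = {random.randrange(FILE_SIZE) // BLOCK for _ in range(WRITES)}
total = FILE_SIZE // BLOCK
print(f"{len(dirty)} of {total} blocks changed -> about "
      f"{100 * len(dirty) / total:.0f}% of the file re-uploaded")
```

With those assumptions, well over half the blocks change even though only a small fraction of the file was actually written.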

If you really want to try, I’d suggest a low Backup retention setting to keep the collecting-the-parts issue somewhat under control, but you’ll still endure frequent automatic compacts because of the high churn rate…

Given a suitably large blocksize (100 KiB is too small for a big backup to be fast), ideally only the dlist and dindex files would download. If you get dblock files, that’s a bad sign. Before 2.0.5.1 that was too common.
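
For a rough sense of why blocksize matters so much (my illustrative numbers, not from this thread), the block count the database has to track, and therefore much of the Recreate work, scales with source size divided by blocksize:

```python
"""Back-of-the-envelope sketch: blocksize sets how many blocks the database must
track, which drives database size and Recreate effort. Sizes are assumptions."""

source_bytes = 1 * 1024**4  # pretend the backup protects 1 TiB of data

for blocksize in (100 * 1024, 1024**2, 5 * 1024**2):  # 100 KiB, 1 MiB, 5 MiB
    blocks = source_bytes // blocksize
    print(f"blocksize {blocksize // 1024:>5} KiB -> roughly {blocks:,} blocks to track")
```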

If the progress bar gets past 70%, it’s downloading dblock files. After 90%, it’s downloading all the rest…
About → Show log → Live → Verbose will also show you what you’re downloading, and how far you are.

Doing an occasional test of DB Recreate is a good idea, to make sure it works when you really need it. You can copy off the old database for safety, and your new database will also be a lot smaller than it was.
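
If you want a repeatable way to take that safety copy before a Recreate test, here’s a small sketch; the path and filename are placeholders, and the Recreate itself is still done from the Database screen (or the command-line repair command).

```python
"""Hedged sketch of prepping a Recreate test: keep the old database as a safety
copy, then let Duplicati rebuild a fresh one. Path and name are placeholders."""
import shutil
from datetime import datetime
from pathlib import Path

db = Path.home() / ".config" / "Duplicati" / "ABCDEFGHIJ.sqlite"  # this job's database (name varies)
safety_copy = db.with_name(f"{db.stem}.{datetime.now():%Y%m%d}.bak")

shutil.copy2(db, safety_copy)  # old database kept in case the Recreate goes badly
print(f"Safety copy at {safety_copy}; now run Recreate for this job")
```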

“Why not backung up latest Database?” talks about an exotic DB backup method that eliminates standard Duplicati processing (which, as noted, doesn’t add much for DB backup, and might even make it worse).

I’m not sure how solid the control-file code is, but you could pioneer its use if you want to see how it does. You could keep the usual number of versions of the primary backup, and maybe just two of the database backup, because super-stale databases are nearly useless. As dlist files become obsolete, they’re deleted rather than compacted.