Would it be a good idea if Duplicati would also put the local database (encrypted) on the remote location?

Although it’s not explicitly said, I guess step 3 is how the out-of-sync Duplicati database ended up in place.
Reversing the first two steps might have avoided that, and would have been easier than a manual copy.

Repair command deletes remote files for new backups #3416 may have caused the problems seen; however, as noted there, it’s technically tough to deal with, perhaps especially if a compact has moved things.

Having the repair command act as either a database adjustment (maybe a poor one) or a recreate confuses things. Recreate sometimes being slow (hopefully faster in the usual cases soon) may push people toward repair too.

local DB downgrade will corrupt backup forever (on destination) #3223 is a more direct claim of an issue, reminding me of the “workaround” comment earlier. A manual DB backup is more clearly a workaround (sketched below).
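
For anyone who wants to automate that workaround, here’s a minimal sketch in Python (the database path and destination folder are hypothetical; the real per-job .sqlite path is shown on the job’s Database screen, and copying is only safe while Duplicati isn’t writing to it):

```python
# Minimal sketch of a manual local-database backup (hypothetical paths).
# Duplicati keeps one .sqlite file per job in its data folder; adjust
# DB_PATH to match what your job's Database screen shows.
import shutil
from datetime import datetime
from pathlib import Path

DB_PATH = Path.home() / ".config" / "Duplicati" / "EXAMPLE.sqlite"  # hypothetical
DEST_DIR = Path.home() / "duplicati-db-backups"                     # hypothetical

def backup_local_db() -> Path:
    """Copy the job database to a timestamped file; run after a backup finishes."""
    DEST_DIR.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    target = DEST_DIR / f"{DB_PATH.stem}-{stamp}{DB_PATH.suffix}"
    shutil.copy2(DB_PATH, target)  # only safe while Duplicati is idle
    return target

if __name__ == "__main__":
    print(f"Copied database to {backup_local_db()}")
```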

Moving back to the question in the topic’s title, though, one drawback of DB backups is that the databases sometimes get very large, with changes scattered throughout that hinder deduplication, so backing them up may be slow. Choosing suitable sizes for key settings on big backups can help out, but those same settings would also help recreate, meaning that keeping the DB small enough to back up nicely may reduce the benefit of backing it up at all. :disappointed:
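
To illustrate the deduplication point, here’s a toy sketch (not Duplicati’s actual engine; the block size and data are made up) showing how the same number of changed bytes dirties one block when clustered but many blocks when scattered, which is roughly what scattered SQLite page updates do to a DB backup:

```python
# Toy illustration of why scattered changes hinder block-level deduplication.
# Fixed-size-block hashing over made-up data, not Duplicati's real blocking.
import hashlib

BLOCK = 1024  # hypothetical block size in bytes

def block_hashes(data: bytes) -> list[str]:
    return [hashlib.sha256(data[i:i + BLOCK]).hexdigest()
            for i in range(0, len(data), BLOCK)]

def changed_blocks(old: bytes, new: bytes) -> int:
    return sum(a != b for a, b in zip(block_hashes(old), block_hashes(new)))

base = bytes(1024 * 1024)  # 1 MiB of zeros = 1024 blocks

clustered = bytearray(base)
for i in range(100):
    clustered[i] = 0xFF            # 100 changed bytes inside one block

scattered = bytearray(base)
for i in range(100):
    scattered[i * 10_000] = 0xFF   # 100 changed bytes spread across the file

print("clustered edits dirty", changed_blocks(base, bytes(clustered)), "block(s)")
print("scattered edits dirty", changed_blocks(base, bytes(scattered)), "block(s)")
```

With these numbers the clustered edits dirty 1 block while the scattered edits dirty 100, so almost nothing in the new DB version deduplicates against the old one.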

Or so I think from forum posts. Someone who’s actually been doing such DB backups might chime in with their findings.