Those of us who have created a separate backup job to back up the local databases, and only them, already know how simple such a job is to define. There are some aspects of the setup, however, that might need to be locked into Duplicati itself to ensure the integrity of the backup set. Still, there is no urgency for such a new feature.
This seems like an ideal topic for a crowdsourcing approach: how about sharing snippets of our specific solutions, evaluating their weak points together, and thereby gradually working towards a setup that may eventually find its way into Duplicati?
I’ll begin the evaluation by making a few notes on the merits of having a special backup job for the purpose.
Recently, the internal disk of my Windows laptop broke down, and I had to restore my backup sets without access to any of the local databases. Until then I had not bothered to back up the local databases at all, and as I set out on the lengthy restore operation, I realized that even if I had included the databases in some backup job, I would not have had access to any of them anyway, not until I had completed at least one restore. Therefore, if I want to include the local databases in a backup job, I need to be able to restore that particular backup set quickly, which means the job has to include only the local databases and nothing else. The local database of this metajob itself need not, and should not, be included, as it is the one active while the job runs.
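To illustrate why that quick restore matters: in a disaster, the meta set has to be restored straight from the backend, without any database at hand. A sketch of such a restore might look like the one below; the URL and restore path are just placeholders matching my example job, and I believe --no-local-db is the flag that makes Duplicati work directly from the remote data, but treat the exact options as something to verify rather than gospel.

```shell
mono /usr/lib/duplicati/Duplicati.CommandLine.exe restore \
  file:///media/USER/Duplicati/meta/ \
  --restore-path=/home/USER/.config/Duplicati/ \
  --no-local-db
```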
If I can somehow ensure that the databases are backed up only after successful backups, keeping just one version of the backup set is quite enough; I can't see any reason to go further back in the history of these databases. If I have not done anything else to ensure integrity, however, then keeping a few more versions is obviously safer than just one. In any case, I expect a small number of versions to also speed up the restore operation in a disaster situation.
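One way I can imagine enforcing the "after successful backups only" rule is a --run-script-after hook on the main job: Duplicati reports the job outcome to such scripts in the DUPLICATI__PARSED_RESULT environment variable (Success, Warning, Error, Fatal). The helper below is only a sketch under that assumption; the function name and the echoed markers are invented placeholders, and the echo would in practice be replaced by an invocation of the meta job.

```shell
#!/bin/sh
# Sketch of a --run-script-after hook for the *main* backup job.
# DUPLICATI__PARSED_RESULT is assumed to be exported by Duplicati;
# run_meta_if_success and the markers are hypothetical names.
run_meta_if_success() {
    if [ "$DUPLICATI__PARSED_RESULT" = "Success" ]; then
        # Only now is the local database known to be consistent.
        echo "run-meta"     # placeholder: trigger the meta job here
    else
        echo "skip-meta"    # database may be inconsistent; do nothing
    fi
}

# Duplicati would run this script once the main job finishes:
run_meta_if_success
```

With the hook in place, the meta job itself would not need to be scheduled at all; it fires only when there is a fresh, consistent database to capture.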
Thanks to Duplicati being easy to use, the restore I needed was successful. I have since created a metajob on both of my laptops (Windows and Ubuntu), following the principles above. Next, I could export my job definition as a command line and/or JSON file (with some degree of anonymization) to share here. Someone else could then share their different approach to ensuring that only consistent databases are backed up, and so on.
By sharing some simple, even trivial, templates with each other, we can invite more viewpoints and refine whatever needs refining. At some point down the rosy lane, maybe we could write a polished proposal on what settings such a metajob needs a user to fill in and Duplicati to support, and why integrating the feature into Duplicati would be better than us just sharing templates.
Here is the command line from my Ubuntu machine (with username modified and without password):
mono /usr/lib/duplicati/Duplicati.CommandLine.exe backup \
  file:///media/USER/Duplicati/meta/ \
  /home/USER/.config/Duplicati/ \
  --backup-name=meta \
  --dbpath=/home/USER/.config/Duplicati/meta.sqlite \
  --encryption-module=aes \
  --compression-module=zip \
  --dblock-size=50mb \
  --keep-versions=1 \
  --disable-module=console-password-input \
  --exclude="/home/USER/.config/Duplicati/control_dir_v2/" \
  --exclude="/home/USER/.config/Duplicati/updates/" \
  --exclude="/home/USER/.config/Duplicati/meta.sqlite" \
  --exclude="/home/USER/.config/Duplicati/*.sqlite-journal"
Surely this is no surprise to anyone who has already done it. The job on my Windows machine differs from this one only by having fewer excludes; on Windows, the local folder has no updates subdirectory, nor any journal files during Duplicati runs.
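For comparison, the Windows counterpart might look roughly like the sketch below. The install path, the destination drive letter, and the AppData location are my assumptions for a typical setup, not an export of my actual job; USER again stands in for the real account name, and the flags simply mirror the Ubuntu command above.

```shell
"C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe" backup \
  "file://E:\Duplicati\meta\" \
  "C:\Users\USER\AppData\Local\Duplicati\" \
  --backup-name=meta \
  --dbpath="C:\Users\USER\AppData\Local\Duplicati\meta.sqlite" \
  --encryption-module=aes \
  --compression-module=zip \
  --dblock-size=50mb \
  --keep-versions=1 \
  --disable-module=console-password-input \
  --exclude="C:\Users\USER\AppData\Local\Duplicati\control_dir_v2\" \
  --exclude="C:\Users\USER\AppData\Local\Duplicati\meta.sqlite"
```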