I think it may make sense in a reinstall scenario or an accidental deletion of the job configuration.
It doesn’t have to hold connection info or credentials, both because you already have those and because they might change outside of Duplicati’s control.
I think restoring things like file selection, filters, and the database path could be valuable: you could realistically lose 20 minutes of configuration time if you accidentally remove a local job, and in this case even more recovery time if you don’t know which database was last used.
I second that reinstall or deletion scenario.
It actually would help @jerrac with his problem as he could retrieve the job config from his backup data folder and look at it for the database name.
Having a non-backed-up config file as your only source of the password/URL to a backup destination is a failure in itself (methinks).
So here the case would be:
Go to your backup destination -> retrieve the encrypted job description -> import it in Duplicati -> enter the passphrase for the encrypted job description - ready to go again.
I personally went through this scenario while using Arq and reinstalling it on the same machine (which unfortunately lost its settings).
I too would prefer only the relevant backup job description per backup at the destination - not all of them everywhere. But the file should be encrypted anyway!
And while we are at it - I’d really like to have a way to back up ALL settings (Duplicati itself and the backup jobs) at once. You never know…
A folder BackupConfig could be created in the Duplicati data folder (or anywhere outside any Duplicati folder). As soon as a new backup job is created, it will be automatically exported to a JSON file in this backup folder, encrypted with the passphrase of that job and a timestamp appended to the filename.
Every time a job has been edited, a new backup version will be added. At any time, the BackupConfig folder will contain all versions of all backup jobs that ever existed on that system.
In case of a reinstall on the original host, the only thing the user has to know is the passphrase, but this has to be known in all suggested scenarios. This is enough to recreate the complete backup task.
The BackupConfig folder can be included in the backup source for everyone who likes to get access to the config files from the destination.
The procedure takes some more time, but doesn’t need more information than the other suggestions: URL and credentials of the remote location and the passphrase. Just choose “Direct restore from remote files” and restore the BackupConfig folder. After the restore operation, backup configurations can be imported from the restored files.
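To make the suggestion above concrete, here is a minimal sketch of the automatic export step in Python. Everything in it is illustrative: the field names, the `BackupConfig` location, and the filename scheme are assumptions, not Duplicati's actual export format, and the encryption with the job's passphrase (which the proposal requires) is left as a comment rather than implemented.

```python
import json
import time
from pathlib import Path

def export_job_config(job: dict, backup_dir: Path) -> Path:
    """Write a timestamped JSON snapshot of a backup job's configuration.

    Hypothetical sketch: the 'name' and 'filters' fields are illustrative,
    not Duplicati's real export schema. In the proposal the file would be
    encrypted with the job's passphrase (e.g. in AES Crypt format) before
    being written; that step is omitted here.
    """
    backup_dir.mkdir(parents=True, exist_ok=True)
    stamp = time.strftime("%Y%m%d-%H%M%S")
    # One file per edit; old versions are never overwritten.
    out = backup_dir / f"{job['name']}-{stamp}.json"
    out.write_text(json.dumps(job, indent=2))
    return out

snapshot = export_job_config(
    {"name": "documents", "filters": ["-*.tmp"]},
    Path("BackupConfig"),
)
print(snapshot.name)
```

Because each edit appends a new timestamped file, the folder naturally accumulates every version of every job, which is what makes it safe to include in the backup source itself.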
If my system dies and I’m restoring to a new system, I would want to restore everything exactly the same as it was. Including filters and paths. Even if I had to go manually update drive letters or mount paths, being able to pull that stuff in from the backed up data would be very helpful.
Um, I wouldn’t want credentials stored that way though. I shouldn’t need them stored there anyway, since to access the backed-up data I’d have to already know them.
Thanks everyone for your replies so far. It appears I triggered a really interesting conversation.
My situation started because I was moving the databases to a different drive. They were on my SSD, but it is running low on space. My Duplicati databases were taking up over 5GB. As part of that move I was renaming them from the random names to something based on the job name. For some reason I thought recreating the db was the way to go instead of just renaming the file and going from there… Not my smartest moment. But that is why I lost the random name of the job I’ve been trying to recreate.
@kenkendk Would it be possible to name the SQLite DBs based on the job name from the start?
I just took a peek in one of my databases. Somewhere near the beginning of the file there is a section with (I think) the last added or all filenames with paths in readable format.
If your backup sets differ in which files are backed up, perhaps you can distinguish them this way?
I just brute-opened it in UltraEdit (on Windows). Not really a database viewer, but hex will do.
I tried different DBs and could find at least some distinct filenames with paths. As I have a job for each partition that I back up, I can use this information.
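The hex-editor trick above can be scripted. This is a rough sketch that makes no assumptions about Duplicati's database schema: it just scans the raw file bytes for printable runs containing a path separator, the same thing you'd eyeball in UltraEdit.

```python
import re

def find_paths_in_db(db_path: str, limit: int = 20) -> list[str]:
    """Scan a job database file for readable file paths, as a rough way
    to tell which job a database belongs to. No schema assumptions:
    just printable ASCII runs (8+ chars) containing a path separator."""
    with open(db_path, "rb") as f:
        data = f.read()
    runs = re.findall(rb"[\x20-\x7e]{8,}", data)
    paths = sorted({r.decode() for r in runs if b"/" in r or b"\\" in r})
    return paths[:limit]
```

For example, `find_paths_in_db("XJQWZ.sqlite")` (hypothetical random name) would list a sample of paths stored in that database, which you can match against a job's source selection.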
The easiest way to correlate a db to its job is to rename the db through the GUI.
A great feature would be to save an encrypted job description to the backup destination for adopting/reimporting (sorry @kenkendk, but I still think that would be a great feature for hardening/securing/backing up the backup itself, and for protecting the user if something stupid happens to the local Duplicati configs).
Forums with engaged people (this one goes to @Pectojin!) are great.
Hmm… I’d say, ask if they want to rename the db as well.
For UI, maybe add a database name field under the job name field when a new job is being created. Autofill it as they type the job name, and have a way for them to override it if they want. Same for when it’s being edited. (I’ve seen this kind of workflow used when something needs a “machine name”. It works well.)
If someone is using it from the command line, I would assume they know how to reference the proper db file. I know every time I’ve used the command line, I’ve had to just look up which database file to point at.
I never have created a database from the command line, though. So I’m not sure what that workflow looks like.
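The autofill idea above is easy to sketch: derive a filesystem-safe database name from the job name as the user types, and let them override it. This is purely illustrative (Duplicati currently assigns random names); the naming rule here is an assumption.

```python
import re

def suggested_db_name(job_name: str) -> str:
    """Autofill candidate for a database filename derived from a job
    name, the way the proposed UI field could work. Hypothetical
    naming rule: lowercase, non-alphanumerics collapsed to hyphens."""
    slug = re.sub(r"[^A-Za-z0-9]+", "-", job_name).strip("-").lower()
    return f"{slug}.sqlite"

print(suggested_db_name("My Documents (weekly)"))  # -> my-documents-weekly.sqlite
```

The override field matters because two jobs could slug to the same name, so the UI would still need to check for collisions before accepting the suggestion.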
I recently had a Docker issue that ate my backup config. Luckily it wasn’t very complicated so I was able to make a new job to point to the existing destination, but it did make me think it would have been nice to be able to restore the backup settings from the backup itself…