How to test which database belongs to which job?

I have a backup of all my Duplicati sqlite databases. Unfortunately, I forgot to note down which database belongs to which job.

So I tried just recreating the database from the backup data. That has run into the whole super slow recreate issue. It’s been over a week at this point…

Is there a way to check each sqlite file against a certain job to see if it is the one with the correct data? Maybe point the job at the db, then try to run the job? If it errors, it’s the wrong one?

Since this is an odd post, let me restate what I said. I’m not sure I was clear.

I have BackupJob configured in Duplicati. I removed the .sqlite database for that job. Recreating the database is taking forever. I do have a copy of the entire Duplicati databases folder that contains the old .sqlite file. But it also contains the other jobs’ .sqlite files. I want to somehow figure out which one of the .sqlite files belongs to BackupJob.

Any suggestions?

See: How to test which database belongs to which job? for a link to a tool that made the solution easy to perform.

If your backed-up database files still have their last-modified dates, that would be the easiest way to match them. Otherwise you probably have to guess by size and try running your configuration with each DB.
As far as I know, it would check the DB against the remote files and error out right away if they didn’t match.
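
If you don’t want to wait for a full run, the remote file names each database expects are recorded inside the database itself, so a small script can dump a few of them per file for comparison against a directory listing of each destination. A rough Python sketch (not an official Duplicati tool; it assumes the job databases are plain SQLite with a Remotevolume table, and the folder path is a placeholder):

```python
# Print a few of the remote volume names each candidate database expects,
# so they can be compared against what is actually stored at each
# job's destination.
import glob
import sqlite3

for db_path in sorted(glob.glob("/path/to/db-backup/*.sqlite")):  # placeholder
    try:
        con = sqlite3.connect(db_path)
        rows = con.execute(
            "SELECT Name FROM Remotevolume ORDER BY Name LIMIT 5"
        ).fetchall()
        con.close()
        print(db_path)
        for (name,) in rows:
            print("  ", name)
    except sqlite3.Error as exc:
        print(db_path, "-> could not read:", exc)
```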

Exporting to the command line, or clicking the “Database” link in the advanced section of a backup job, should reveal the name of the associated database for that job.
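
For example, an exported command line looks roughly like this (the URL, source path, and database name are made-up placeholders); the --dbpath option is the part that reveals which .sqlite file the job uses:

```
Duplicati.CommandLine.exe backup "ssh://backup.example.com/myjob" "C:\Users\me\Documents" --dbpath="C:\Users\me\AppData\Local\Duplicati\ABCDEFGHIJ.sqlite" --passphrase="..."
```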

I don’t think he has the original configuration anymore.

A little off topic: this shows the necessity of backing up the backup program itself :wink:
Perhaps we could implement an automatic option to back up all job configurations once in a while…

Maybe we could put the configuration in a separate encrypted file on the target and add an option to recover the configuration.

That’s exactly how ARQ is doing it. This way you could even have the option to import/adopt a complete job/backup from scratch.

That would be pretty cool

As I understand it, he only deleted the database, not the complete backup job.
This method could also be used to find the db names for all other backup jobs, resulting in one remaining db file that should belong to the missing backup job.

Ah, re-reading the original post I think you’re right

@kees-z As I read it, he did not delete the databases. He just doesn’t know which one belongs to which job.

Perhaps we can make a feature request out of it. If you could import the saved/encrypted config from the backup destination, you could look in there for the database name.
This is what I meant by off topic :slight_smile:

For the OP, there is no backup-id or similar, and the database is deliberately not storing the destination information. You can look at the paths being backed up, and try to match them.
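
A few lines of Python can dump a sample of the stored paths from each candidate database for that purpose. A quick sketch rather than an official tool; it assumes a File table or view with a Path column, which may differ between Duplicati versions, and the folder path is a placeholder:

```python
# Show a sample of source paths recorded in each candidate database,
# to match against the source folders of each backup job.
import glob
import sqlite3

for db_path in sorted(glob.glob("/path/to/db-backup/*.sqlite")):  # placeholder
    try:
        con = sqlite3.connect(db_path)
        rows = con.execute(
            "SELECT DISTINCT Path FROM File LIMIT 10"
        ).fetchall()
        con.close()
        print(db_path)
        for (path,) in rows:
            print("  ", path)
    except sqlite3.Error as exc:
        print(db_path, "-> could not read:", exc)
```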

For storing the config remotely, we discussed this in our old Slack channel (sadly not public).

It is technically possible to store it, and there is support in the code for reading and writing extra “control files”, which was meant to store this.

But the most important information to store is the passphrase and server URL (including credentials).

Before you can retrieve the data, you need that exact information. In other words, storing the config would allow you to retrieve information you have already supplied.

You could also restore more, like filters, source paths, settings and the local dbpath. But these relate primarily to the machine where the backup was created, and may or may not make sense on another machine. Since I think you would want to restore the config on another machine (after a crash or similar) I don’t see the benefits. If you have a use-case where it makes sense, I will reconsider.

We also discussed storing all backup configs in every backup. That way you would only need to know one set of information, and get back your entire setup. This would make sense, but you are also exposing your config data on multiple destinations. If you mess up in one (e.g. a weak passphrase) you expose everything (i.e. logon credentials to other machines). The “new machine, different settings” argument also applies here.

We could not find a great way to implement/communicate this feature, so we skipped it.

I think it may make sense in a reinstall scenario or an accidental deletion of the job configuration.

It doesn’t have to hold connection info or credentials, both because you already have those and because they might change outside of Duplicati’s control.

I think restoring things like file selection, filters, and the database path could be interesting, because you could realistically lose 20 minutes of configuration time if you accidentally remove a local job, and in this case even more recovery time if you don’t know which DB was last used.

I second that reinstall or deletion scenario.
It actually would help @jerrac with his problem, as he could retrieve the job config from his backup data folder and look at it for the database name.

Having a non-backed-up config file as your only source of the password/URL for a backup destination is a fail in itself (methinks).

So here the case would be:
Go to your backup destination -> retrieve the encrypted job description -> import it in Duplicati -> enter the passphrase for the encrypted job description -> ready to go again.
I personally used this scenario with ARQ, reinstalling it on the same machine (which unfortunately lost its settings).

I too would prefer only the relevant backup job description per backup at the destination, not all of them everywhere. But the file should be encrypted anyway!

And while we are at it, I’d really like to have a way to back up ALL settings (Duplicati itself and backup jobs) at once. You never know… :thinking:

What about copying configs to a local destination?

A folder BackupConfig could be created in the Duplicati data folder (or anywhere outside any Duplicati folder). As soon as a new backup job is created, it would be automatically exported to a JSON file in this backup folder, encrypted with the passphrase of that job, with a timestamp appended to the filename.
Every time a job has been edited, a new backup version would be added. At any time, the BackupConfig folder would contain all versions of all backup jobs that ever existed on that system.

In case of a reinstall on the original host, the only thing the user has to know is the passphrase, but this has to be known in all suggested scenarios. This is enough to recreate the complete backup task.

The BackupConfig folder can be included in the backup source for everyone who wants access to the config files from the destination.
The procedure takes a bit more time, but doesn’t need more information than the other suggestions: the URL and credentials of the remote location and the passphrase. Just choose “Direct restore from remote files” and restore the BackupConfig folder. After the restore operation, backup configurations can be imported from the restored files.
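
To make the idea concrete, here is a purely illustrative Python sketch of the proposed behavior, not existing Duplicati functionality; the folder location, the plain-JSON output (the proposal would encrypt it), and the hook that calls it are all hypothetical:

```python
# Hypothetical sketch of the proposed BackupConfig behavior: whenever a
# job is created or edited, write a timestamped copy of its exported
# configuration into a local BackupConfig folder.
import json
import time
from pathlib import Path

BACKUP_CONFIG = Path("/path/to/Duplicati/BackupConfig")  # placeholder location

def save_config_version(job_name: str, job_config: dict) -> Path:
    """Write one timestamped version of a job's configuration."""
    BACKUP_CONFIG.mkdir(parents=True, exist_ok=True)
    stamp = time.strftime("%Y%m%d-%H%M%S")
    out_file = BACKUP_CONFIG / f"{job_name}-{stamp}.json"
    # In the proposal this file would be encrypted with the job's own
    # passphrase before being written; plain JSON keeps the sketch short.
    out_file.write_text(json.dumps(job_config, indent=2))
    return out_file

# Example: would be called by Duplicati after every job save.
save_config_version("BackupJob", {"name": "BackupJob", "dbpath": "..."})
```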

Is there a check command I could use? Or would I just point the db path at the sqlite file I’m checking and then run the backup job?

If my system dies and I’m restoring to a new system, I would want to restore everything exactly the same as it was, including filters and paths. Even if I had to manually update drive letters or mount paths, being able to pull that stuff in from the backed-up data would be very helpful.

Um, I wouldn’t want credentials stored that way though. I shouldn’t need them stored there anyway, since to access the backed-up data, I’d have to already know them.

Thanks everyone for your replies so far. It appears I triggered a really interesting conversation. :slight_smile:

My situation started because I was moving the databases to a different drive. They were on my SSD, but it is running low on space. My Duplicati databases were taking up over 5GB. As part of that move I was renaming them from the random names to something based on the job name. For some reason I thought recreating the db was the way to go instead of just renaming the file and going from there… Not my smartest moment. But that is why I lost the random database name for the job I’ve been trying to recreate.

@kenkendk Would it be possible to name the sqlite DBs based on the job name from the start?

That’s what I would do. The backup will first check for the expected files on the remote target and fail out if more or fewer files are found.

I think the names are intentionally pseudo-random to prevent issues with duplicate database names.

However, I think you can move the DB through the web interface; that way you should also be able to set the name.