How to test which database belongs to which job?

That’s exactly how ARQ does it. This way you could even have the option to import/adopt a complete job/backup from scratch.

1 Like

That would be pretty cool

As I understand it, he only deleted the database, not the complete backup job.
This method could also be used to find the db names for all other backup jobs, resulting in one remaining db file that should belong to the missing backup job.

Ah, re-reading the original post I think you’re right

@kees-z As I read it, he did not delete the databases. He just doesn’t know which one belongs to which job.

Perhaps we can make a feature request out of it. If you could import the saved/encrypted config from the backup destination, you could look in there for the database name.
This is what I meant by off topic :slight_smile:

For the OP: there is no backup-id or similar, and the database deliberately does not store the destination information. You can look at the paths being backed up and try to match them.

For storing the config remotely, we discussed this in our old Slack channel (sadly not public).

It is technically possible to store it, and there is support in the code for reading and writing extra “control files”, which was meant to store this.

But the most important information to store is the passphrase and the server URL (including credentials).

Before you can retrieve the data, you need that exact information. In other words, storing the config would allow you to retrieve information you have already supplied.

You could also restore more, like filters, source paths, settings and the local dbpath. But these relate primarily to the machine where the backup was created, and may or may not make sense on another machine. Since I think you would want to restore the config on another machine (after a crash or similar) I don’t see the benefits. If you have a use-case where it makes sense, I will reconsider.

We also discussed storing all backup configs in every backup. That way you would only need to know one set of information, and get back your entire setup. This would make sense, but you are also exposing your config data on multiple destinations. If you mess up in one (e.g. a weak passphrase) you expose everything (i.e. logon credentials to other machines). The “new machine, different settings” argument also applies here.

We could not find a great way to implement/communicate this feature, so we skipped it.

I think it may make sense in a reinstall scenario or an accidental deletion of the job configuration.

It doesn’t have to hold connection info or credentials, both because you already have those and because they might change outside of Duplicati’s control.

I think restoring things like file selection, filters, and the database path could be interesting: you could realistically lose 20 minutes of configuration time if you accidentally remove a local job, and in a case like this, even more recovery time if you don’t know which DB was last used.

1 Like

I second that reinstall or deletion scenario.
It would actually help @jerrac with his problem, as he could retrieve the job config from his backup data folder and look up the database name in it.

Having a non-backed-up config file as your only source of the password/URL to a backup destination is a failure in itself (methinks).

So here the case would be:
Go to your backup destination -> retrieve the encrypted job description -> import it in Duplicati -> enter the passphrase for the encrypted job description - ready to go again.
I personally used this workflow with ARQ when I reinstalled it on the same machine (which had unfortunately lost its settings).

I too would prefer only the relevant backup job description per backup at the destination - not all of them everywhere. But the file should be encrypted in any case!

And while we are at it - I’d really like to have a way to back up ALL settings (Duplicati itself and all backup jobs) at once. You never know… :thinking:

What about copying configs to a local destination?

A folder BackupConfig could be created in the Duplicati data folder (or anywhere outside any Duplicati folder). As soon as a new backup job is created, it would automatically be exported to a JSON file in this backup folder, encrypted with the passphrase of that job and with a timestamp appended to the filename.
Every time a job is edited, a new backup version would be added. At any time, the BackupConfig folder would contain all versions of all backup jobs that ever existed on that system.
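
Just to make that concrete, here’s a rough Python sketch of the export step. It assumes you already have the job configuration as JSON (e.g. from the UI’s export-to-file option), and it uses generic passphrase-based encryption (PBKDF2 plus Fernet from the `cryptography` package) rather than Duplicati’s own AES file format; the folder path and function names are made up for the example.

```python
# Sketch of the proposed BackupConfig folder: after each job creation/edit,
# write a timestamped, passphrase-encrypted copy of the job's JSON config.
# NOTE: uses PBKDF2 + Fernet, NOT Duplicati's own AESCrypt format.
import base64
import hashlib
import json
import os
import time

from cryptography.fernet import Fernet

BACKUP_CONFIG_DIR = os.path.expanduser("~/Duplicati/BackupConfig")  # hypothetical

def derive_key(passphrase: str, salt: bytes) -> bytes:
    raw = hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 200_000)
    return base64.urlsafe_b64encode(raw)  # Fernet expects a base64 32-byte key

def save_job_config(job_name: str, job_config: dict, passphrase: str) -> str:
    os.makedirs(BACKUP_CONFIG_DIR, exist_ok=True)
    salt = os.urandom(16)
    token = Fernet(derive_key(passphrase, salt)).encrypt(
        json.dumps(job_config).encode())
    stamp = time.strftime("%Y%m%d-%H%M%S")
    path = os.path.join(BACKUP_CONFIG_DIR, f"{job_name}-{stamp}.json.enc")
    with open(path, "wb") as f:
        f.write(salt + token)  # prepend the salt so the file is self-contained
    return path
```

Importing it back would be the reverse: read the 16-byte salt, derive the key the same way, decrypt, and feed the JSON to Duplicati’s normal import.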

In case of a reinstall on the original host, the only thing the user has to know is the passphrase, but this has to be known in all suggested scenarios. This is enough to recreate the complete backup task.

The BackupConfig folder can be included in the backup source by anyone who wants access to the config files from the destination.
The procedure takes a bit more time, but doesn’t need more information than the other suggestions: the URL and credentials of the remote location and the passphrase. Just choose “Direct restore from remote files” and restore the BackupConfig folder. After the restore operation, backup configurations can be imported from the restored files.

1 Like

Is there a check command I could use? Or would I just point the db path at the sqlite file I’m checking and then run the backup job?

If my system dies and I’m restoring to a new system, I would want to restore everything exactly the same as it was, including filters and paths. Even if I had to manually update drive letters or mount paths, being able to pull that stuff in from the backed-up data would be very helpful.

Um, I wouldn’t want credentials stored that way though. I shouldn’t need them stored there anyway, since to access the backed-up data I’d have to know them already.

1 Like

Thanks everyone for your replies so far. It appears I triggered a really interesting conversation. :slight_smile:

My situation started because I was moving the databases to a different drive. They were on my SSD, but it is running low on space. My Duplicati databases were taking up over 5GB. As part of that move I was renaming them from the random names to something based on the job name. For some reason I thought recreating the db was the way to go instead of just renaming the file and going from there… Not my smartest moment. But that is why I lost the random name of the job I’ve been trying to recreate.

@kenkendk Would it be possible to name the sqlite db’s based on the job name from the start?

That’s what I would do. The backup will first check for the expected files on the remote target and fail if more or fewer files than expected are found.
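
If the destination is locally mountable (or you can otherwise list its files), you could also compare a candidate database against the destination without running a backup at all. Here’s a rough Python sketch; it assumes the local DB has a `Remotevolume` table with a `Name` column listing the dblock/dindex/dlist files (check your own DB before relying on this), and the script name is made up.

```python
# Sketch: score candidate local databases against a destination folder by
# checking how many of the remote volume names recorded in each DB actually
# exist at the destination. The best-scoring DB should belong to that job.
import os
import sqlite3
import sys

def recorded_volumes(db_path: str) -> set:
    # open read-only so we can't accidentally modify a live Duplicati DB
    con = sqlite3.connect(f"file:{db_path}?mode=ro", uri=True)
    try:
        return {name for (name,) in con.execute("SELECT Name FROM Remotevolume")}
    finally:
        con.close()

if __name__ == "__main__":
    # usage: python match_db.py <destination_dir> <db1.sqlite> [<db2.sqlite> ...]
    dest, *dbs = sys.argv[1:]
    actual = set(os.listdir(dest))
    for db in dbs:
        recorded = recorded_volumes(db)
        overlap = recorded & actual
        print(f"{os.path.basename(db)}: {len(overlap)}/{len(recorded)} recorded files present")
```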

I think they’re intentionally pseudorandom to prevent issues with duplicate database names.

However, I think you can move the DB through the web interface; that way you should also be able to set the name.

I just took a peek in one of my databases. Somewhere near the beginning of the file there is a section with (I think) the most recently added, or maybe all, filenames with paths in readable format.
If your backup sets differ in which files are backed up - perhaps you can distinguish them this way? :wink:

Can we just append those random characters to the name?

How did you take a peek? If this was on one of my Linux machines, I’d just use Adminer, but it’s my Windows computer, so I’m not very aware of what database tools exist for it.

I just brute-opened it in UltraEdit (on Windows). Not really a database viewer, but hex will do :slight_smile:
I tried different dbs and could find at least some distinct filenames with paths. As I have a job for each partition that I back up, I can use this information.

(screenshot: the sqlite file opened in UltraEdit 64-bit)

I use SQLiteBrowser on macOS, which works very well. It looks like they have a Windows version too.

It does all the stuff you’d expect of a DB browser.

2 Likes

@Pectojin: Even better! :slight_smile: They even have a portable version for Windows.

@jerrac: just open the database with this tool and have a look at the table “file”.
The backed-up files are listed there in plain text.
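
If you’d rather skip the GUI, a quick Python sketch does the same lookup. It assumes the `File` table/view has a `Path` column, which matches the “file” table mentioned above; schema details can differ between Duplicati versions.

```python
# Print a few backed-up paths from each candidate database so the jobs
# can be told apart by their source paths.
import sqlite3
import sys

for db in sys.argv[1:]:
    con = sqlite3.connect(f"file:{db}?mode=ro", uri=True)  # read-only, just in case
    try:
        print(db)
        for (path,) in con.execute("SELECT Path FROM File LIMIT 5"):
            print("  ", path)
    finally:
        con.close()
```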

Hope that helps to identify the right database :slight_smile: