I found backup files in config folder

Hi all
I’ve checked my Duplicati config folder and I’ve found a lot of “backup 20190205XYZXYZXYZ.sqlite” files there. What are they? Backups of the job files created after an application upgrade? Can I delete the older ones?
thanks
M.

The .dblock files are the actual backups of your data. The .sqlite files are, I believe, information about the backup jobs. I even have some .backup files with the same names as some of the .sqlite files in my config folder.

There are .sqlite files with ten-character filenames, which I believe hold the details about the backup job itself. The other .sqlite files are probably the file lists, i.e. the local database. I haven’t come across this specifically in the manual that I recall, though. Can anyone confirm or deny?

Since sqlite files are database files, I guess I could browse through them, looking at the tables and contents.

Looks like one of these .sqlite files contains these tables:

Block
Configuration
FilesetEntry
RemoteOperation
BlocklistHash
DeletedBlock
IndexBlockLink
Remotevolume
Blockset
DuplicateBlock
LogData
Version
BlocksetEntry
File
Metadataset
ChangeJournalData
Fileset
Operation

Some of the table contents look like an activity log, for example File and BlocklistHash.
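If anyone else wants to poke around in one of these, here’s a minimal sketch using Python’s built-in sqlite3 module. The filename is just the example from the original post, and it opens the file read-only to be safe; adjust the path to whichever .sqlite copy you’re curious about.

```python
import sqlite3

# Open the per-backup database read-only (filename is just the example from this thread)
con = sqlite3.connect("file:backup 20190205XYZXYZXYZ.sqlite?mode=ro", uri=True)

# List every table; this should roughly match the listing above
tables = [row[0] for row in con.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]
print(tables)

# Peek at a few rows of one table, e.g. the Operation history
for row in con.execute("SELECT * FROM Operation LIMIT 5"):
    print(row)

con.close()
```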

When Duplicati is upgraded (sometimes on Canary upgrades, and almost always on a real Beta upgrade), the database needs its format upgraded, and at that point a backup copy is made in case something goes wrong. Reverting is only really meaningful for a short time, because the destination files soon drift away from what the SQLite DB records.

If the random-letters part matches some other .sqlite file, it’s probably a backup of that one and most likely stale by now. 2.0.5.1 Beta came out in January, and possibly you’re seeing the backup DB from that conversion. Looking at the file date in File Explorer (or whatever) can also confirm that the file is in fact out of service.
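For a quick look at dates without clicking through File Explorer, something like this lists the backup copies with their modification times. Purely a sketch: the config folder path varies by install and OS, so adjust it to your own.

```python
import datetime
import glob
import os

# Adjust this to wherever your Duplicati config folder actually lives
config_dir = os.path.expanduser("~/.config/Duplicati")

for path in sorted(glob.glob(os.path.join(config_dir, "backup *.sqlite"))):
    mtime = datetime.datetime.fromtimestamp(os.path.getmtime(path))
    print(f"{mtime:%Y-%m-%d %H:%M}  {os.path.basename(path)}")
```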

How the backup process works explains the dlist (file list), dblock (data blocks), and dindex (index of dblocks) files that are stored at the configured destination. The local .sqlite DB is a local cache of that, so there’s not much configuration information in it. Configuration for all jobs is in Duplicati-server.sqlite, which has a different table structure. The sample .sqlite table list posted above would be from a per-backup DB.

EDIT:

A slightly stale picture of the per-backup local DB format is at Documentation for the local database format; however, there’s not really enough text there to make full sense of it. Sometimes forum or GitHub posts help.


Does this mean that they can be manually deleted? Will Duplicati ever prune these local database files itself? If it needs to be done manually, is there any recommendation on frequency, which files, age, etc.?

After appropriate checking for recent use (maybe matching them up to the “live” DB and considering staleness), yes, delete.

Frequency is up to you. As mentioned, actually running backups makes a backup DB go stale very quickly.
There might have been some talk of auto-purge, but I can’t find it. When the updated Duplicati seems fine, or testing has gone on for so long that you wouldn’t want to put a stale DB back in service and wrestle it (maybe) into dealing with a backup destination it has never seen in its current form, it’s time to delete…
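If you want to script that cleanup, here’s a heavily hedged sketch along those lines: dry-run by default, the folder path and the 90-day age threshold are just placeholders, and you should still check the names against your live DBs before flipping the switch.

```python
import glob
import os
import time

config_dir = os.path.expanduser("~/.config/Duplicati")  # placeholder; adjust to your install
max_age_days = 90                                        # placeholder threshold
dry_run = True                                           # flip to False only once you're sure

cutoff = time.time() - max_age_days * 24 * 3600
for path in sorted(glob.glob(os.path.join(config_dir, "backup *.sqlite"))):
    if os.path.getmtime(path) < cutoff:
        print(("would delete" if dry_run else "deleting") + f": {path}")
        if not dry_run:
            os.remove(path)
```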

While some backups just make new files, which Repair (if run) may then delete (thus losing newer backup data), Compacting files at the backend will rearrange things to the point where a stale DB will see missing files…

Theoretically, even an accidental delete of a current DB should be recoverable with Recreate, but Recreate doesn’t always work, or can have challenges that make it very slow. The path of the current live DB is shown on the job’s Database page.