Repair does not repair

As shown in the attached image, I am prompted to click Repair, but it doesn’t help.

Is there any procedure I can run to get the backup repaired?

Fedora Linux 39.


Recovering by purging files is possible, but what portion of the total backup is 63 files?

What sort of storage is the destination? Are there other backups sharing it?

Be sure to never point two backups at the same folder, or they will interfere.

If that didn’t happen, can you give some history on how the situation arose?

Thank you for your answer.
The storage destination is a WebDAV-mounted remote drive.
It is the only backup going into this directory.
As I remember it, I’ve had this error since the beginning, or shortly thereafter. That’s about one month.

Is the count stable or climbing? I’m wondering how healthy the remote drive is.

File names can be clues, e.g. watch About → Show log → Live → Warning during Verify files.
The *-broken-files commands work best on missing dblock files, but dindex and dlist files also exist. Which are yours?

Hi, again.

The count is stable.

The warning log gives me something like this:

22 May 2024 16:18: Failed to perform cleanup for missing file:, message: Repair not possible, missing 516 blocks. If you want to continue working with the database, you can use the “list-broken-files” and “purge-broken-files” commands to purge the missing data from the database and the remote storage.

What are dblock, dindex and dlist files? Something I should find locally or remote?


The article How the backup process works explains this, but you can also look at the remote file names.

A dblock file is 50 MB max by default (configurable). A dindex file is the index for a dblock.

A dlist lists files for a backup version (date is in name) and holds assembly directions.
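Putting those together, a destination listing typically follows the default duplicati-* naming pattern. This is a hypothetical example: the date and hex strings are made up, and the .aes suffix appears only when encryption is on.

```
duplicati-20240522T161800Z.dlist.zip.aes      (file list for one backup version)
duplicati-b1f2a....dblock.zip.aes             (a data volume)
duplicati-i1f2a....dindex.zip.aes             (the index for that dblock)
```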

You also have a local database, and an intact one can repair a missing dlist or dindex.

The dblock files hold the actual source data, so losing one means some backed-up files are damaged.

Damaged files are then removed from the backup by purging them.

Source files are broken into blocks for storage, and the blocks are kept in remote volumes called dblock files.
A missing dblock means its blocks are missing, so some source file backups broke and need a purge.
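As a toy illustration of that block mechanism (the sizes here are tiny and made up; a real dblock volume defaults to 50 MB), a file can be cut into fixed-size blocks that are each tracked by hash:

```shell
# Toy sketch of block-based storage; paths and sizes are invented for illustration.
d=$(mktemp -d)
printf 'example source data, split into fixed-size blocks' > "$d/source"
split -b 16 "$d/source" "$d/block."   # cut the file into 16-byte blocks
sha256sum "$d"/block.*                # each block is tracked by its hash
rm -r "$d"
```

If any one of those blocks disappears from the destination, the source file can no longer be reassembled, which is exactly why a missing dblock forces a purge.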

Then maybe the destination is working but something went wrong on the initial backup. Any recollection of it?

If not, maybe just follow the directions to list and purge broken files.

Trying to run on the command line, I get this:

duplicati-cli list-broken-files ~/mounts/\ filer/

Enter encryption passphrase:
The operation ListBrokenFiles has failed with error: Database file does not exist: /home/klaus/.config/Duplicati/BLDLLJHLHS.sqlite => Database file does not exist: /home/klaus/.config/Duplicati/BLDLLJHLHS.sqlite

ErrorID: DatabaseDoesNotExist
Database file does not exist: /home/klaus/.config/Duplicati/BLDLLJHLHS.sqlite

Do I do something wrong?

Yes, however it’s probably not obvious. When you don’t tell CLI which database to use, it invents one.

You need at least --dbpath; however, all this (and the passphrase) is done for you in the GUI Commandline.
Generally you just change the command to the desired one and get rid of unnecessary arguments.
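For example, a sketch only: the database filename below is a placeholder, and the destination URL is a guess based on this thread; the real values are on the job's Database and Destination screens.

```shell
# Hypothetical invocation; substitute your job's real database path and URL.
duplicati-cli list-broken-files "file:///home/klaus/mounts/ filer/" \
  --dbpath=/home/klaus/.config/Duplicati/REALNAME.sqlite
```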

Thank you for your answer.

The reason I fell back to the command line was that none of the commands in the GUI fixed anything. But it is not obvious to me how to run commands using duplicati-cli. I tried this:

duplicati-cli list-broken-files /home/klaus/.config/Duplicati/DM

Enter encryption passphrase:
The operation ListBrokenFiles has failed with error: Database file does not exist: /home/klaus/.config/Duplicati/ZGUAAKMXOZ.sqlite => Database file does not exist: /home/klaus/.config/Duplicat

ErrorID: DatabaseDoesNotExist
Database file does not exist: /home/klaus/.config/Duplicati/ZGUAAKMXOZ.sqlite


Then run them in GUI Commandline, as suggested.

Did you run an actual list-broken-files? I don’t spot that mentioned.

You can’t just toss that onto the line as-is. Maybe you don’t normally use the command line?

How to use the Duplicati Command Line tool

Advanced options with a value are preceded by two dashes (--), followed by an equals sign (=) and the desired value:


Path to the file containing the local cache of the remote file database.

So you put that together with the Local database path (the full path, not just the filename) from the job’s Database screen.

But GUI Commandline is easier, and I still encourage it. Change Command and clear arguments.


Hi, thanks for your patience.

Running from the GUI, I get:

Found 14 commands but expected 1, commands:

“file:///home/klaus/mounts/ filer/”
Return code: 200


“clear arguments” is the part you missed.

Oh, thank you. I need to learn to read all the text.

Now, with clear arguments, I get:

The operation ListBrokenFiles has failed with error: Filters are not supported for this operation => Filters are not supported for this operation

ErrorID: FiltersAreNotSupportedForListBrokenFiles
Filters are not supported for this operation
Return code: 100

I can’t find anything mentioning filters on the page.


They would be from the Filter section of your job’s Source Data screen (step 3).
Here though, the include and exclude filters are separated, not combined.
Click x to remove any you see.

So I took out the exclude *.iso in the second list window. Then it started doing something sensible.

Hopefully I can purge and get a clean run this time.

I purged some files and started a backup.



Thanks for your help - now I can get one task running without errors or warnings.

In another task I get this error:

2024-05-29 09:38:50 +02 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingRemoteHash]: remote file is listed as Uploaded with size 21135360 but should be 52335613, please verify the sha256 hash “5p7L5IePWPBwGm0wfxOb57SCkHLXUZMenoYEicqFQcc=”

I’ve checked with sha256sum and get a different checksum.

How do I repair? It’s a dblock, which I guess is backup data.


sha256sum outputs the 256-bit hash in hex, while the (not very friendly or useful, IMO) message shows it in Base64, so the two encodings have to be converted before they can be compared. A match is unlikely anyway, since the size is already wrong.
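A quick way to compare the two encodings is to convert sha256sum’s hex output to Base64. This is a sketch that uses the string “hello” as a stand-in for the real dblock file; xxd does the hex-to-binary step:

```shell
# Convert a hex SHA-256 digest into the Base64 form Duplicati prints.
printf 'hello' | sha256sum | cut -d' ' -f1 | xxd -r -p | base64
# prints LPJNul+wow4m6DsqxbninhsWHlwfp0JecwQzYpOLmCQ=
```

If the converted value matches the hash in the warning, the file content is intact and only the listed size is wrong; if not, the file itself is damaged.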


The WebDAV remote drive (which I still worry about) only uploaded part of the file, yet declared success.
Duplicati caught it by looking at its directory listing. You can maybe check the file’s length some other way as well.
21135360 is a very binary-even 0x1428000, and such even binary sizes seem common in corruption.
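That hex claim is easy to check from a terminal (the size is the one in the warning above):

```shell
# Show the reported upload size in hex: 21135360 bytes is exactly 0x1428000.
printf '0x%X\n' 21135360   # prints 0x1428000, a suspiciously round binary size
```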

If you really want to, you can try decrypting it to verify it’s dead, but it probably is, so delete it and run
list-broken-files and purge-broken-files for it. Watch out for chronic destination unreliability.