No more patience for Duplicati!

I have been using Duplicati for 2 months and I think the basic concept is very good. But I’m about to uninstall Duplicati! It has never really worked. I was able to make a complete backup (about 230 GB), but the next time I synchronize, I get various error messages again.
I have already found many solutions in the forum, but they have not helped. I am also not a PC specialist!

I have had to delete the entire backup a few times and start again. But that is not the aim of a backup. It also takes over 24 hours to redo the backup. Repairing or deleting/restoring the local database also takes a very, very long time.

The backup target is mounted as a drive in Windows via WebDAV. Sometimes it also loses the connection, and Duplicati then always shows error messages. Despite this, it always says “last successful backup”, although it was not successful!

Error while running photos
Recreated database has missing blocks and 1 broken filelists. Consider using “list-broken-files” and “purge-broken-files” to purge broken data from the remote store and the database

I ran list-broken-files:

Found 6 commands but expected 1, commands:
"file://Z:\Backup\Duplicati\Fotos?auth-username=xxx&auth-password=xxxx"
"D:\xxx"
"D:\yyyyy"
"D:\cccc"
"D:\bbbb"
"D:\nnnn"
Return code: 200

2nd attempt:
The operation Backup has failed with error: The database was attempted repaired, but the repair did not complete. This database may be incomplete and the backup process cannot continue. You may delete the local database and attempt to repair it again. => The database was attempted repaired, but the repair did not complete. This database may be incomplete and the backup process cannot continue. You may delete the local database and attempt to repair it again.

ErrorID: DatabaseRepairInProgress
The database was attempted repaired, but the repair did not complete. This database may be incomplete and the backup process cannot continue. You may delete the local database and attempt to repair it again.
Return code: 100

Same with purge-broken-files

My backup is broken for the umpteenth time! WTF!

In this case, Windows is doing the WebDAV and is responsible for trying to maintain a connection.
Local folder or drive is probably what you chose for Storage Type on the Destination screen, even though it’s not quite true. Sometimes, with virtualized drives, this is necessary, but you could use:

WebDAV directly, since the option exists. Possibly it will keep things going better, or at least have better awareness of what’s going wrong on the connection, compared to an unreliable local folder.

It says the time of the last successful backup, not that the last backup was successful. However, if you see a failed backup and the time updates to match it, then it’s an issue; otherwise it’s likely misread.

and that run probably did nothing because of wrong usage. The LIST-BROKEN-FILES command syntax is:

Duplicati.CommandLine.exe list-broken-files <storage-URL> [<options>]

Your run looks like Using the Command line tools from within the Graphical User Interface where you changed the command from the dropdown. Originally it was probably backup so it would use Commandline arguments for the source folders you chose. Clear those for list-broken-files.
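As a sketch of what a correct run would look like (one line; the Target URL is the one from your output, with credentials masked as you posted them):

```
Duplicati.CommandLine.exe list-broken-files "file://Z:\Backup\Duplicati\Fotos?auth-username=xxx&auth-password=xxxx"
```

In the GUI Commandline screen, that means the command dropdown set to list-broken-files, the Target URL box left as it is, and the Commandline arguments box emptied of the source folders.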

Recovering by purging files has an example of what the command (as one line) should look like.

I suppose you did it correctly this time; otherwise it would have failed the same way as the first time.

You must have run a Repair at some point, and it gave some warnings or errors. Is that correct?

Maybe not? The 2nd attempt might not be list-broken-files, as it says “Backup”. Was it a backup?

Question still holds though. Was there a repair? What sort of warnings or errors did it show you?

If it errors out, it says so, and certain other things won’t work, but what was it complaining about?

Sometimes Recreate can clean things up, but I can’t say much more without knowing messages.

Maybe this is them, although the context of the message wasn’t given, so guessing from the text:

This is the kind of thing that a silently failed upload of a default 50 MB file with dblock in its name would cause. The dblock file contains actual source data, meaning some source file backups are damaged. The purge-broken-files command removes such broken files, after a list-broken-files shows them.

Problem is that the database is needed to do checks, and you don’t have an intact database now.

The root cause of all of this might be the unreliable Windows WebDAV. What is the server, and is there an unreliable Internet connection in between? Are there any other access ways you can try?

If you’re stuck on WebDAV, then at least you could try telling Duplicati that you’re using WebDAV.

As always, messages are needed, preferably right at the initial failure rather than after other work.

Your job log might have caught some, or maybe About → Show log → Stored and go by its times.

Your Target URL could be modified to point at an empty folder and used with Duplicati.CommandLine.BackendTester.exe.
If that doesn’t hold up, then you have a problem somewhere outside of Duplicati, on your Z: drive. Changing Duplicati to use WebDAV directly may or may not help things. It depends on the failure point.
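For reference, a sketch of such a test run (the BackendTest folder name is my assumption; it must be an empty folder, since the tester creates and deletes files in it):

```
Duplicati.CommandLine.BackendTester.exe "file://Z:\Backup\Duplicati\BackendTest"
```

If runs through the Z: drive letter fail but a run against the WebDAV URL directly succeeds, that points the finger at the Windows mount.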

Using Duplicati with a consistently unreliable destination is likely to get in trouble. Please focus on test and repair, or some alternative storage. Duplicati can be no more reliable than your storage…

First of all, thank you for the detailed answer. I probably don’t understand everything (technically).

The online storage has no option other than WebDAV (except via browser).
I have a stable Internet connection. Sometimes Windows shuts down when it takes a long time or updates. Or runs out of battery :(. Or I don’t have time to wait any longer and switch off the laptop. But I think the program must be able to correct this itself!
Yes, I run the Graphical User Interface; I don’t know how else. Why aren’t the fields cleared immediately when a command is selected? I have now deleted the filters and so on, and it seems to run through. But it doesn’t solve the problem!
Here is everything I have tried:

Repair:
The operation Repair has failed with error: The backup storage destination is missing data files. You can either enable --rebuild-missing-dblock-files or run the purge command to remove these files. The following files are missing: duplicati-bcdc73bf0f0904a31b78b1674d333c586.dblock.zip.aes => The backup storage destination is missing data files. You can either enable --rebuild-missing-dblock-files or run the purge command to remove these files. The following files are missing: duplicati-bcdc73bf0f0904a31b78b1674d333c586.dblock.zip.aes

ErrorID: MissingDblockFiles
The backup storage destination is missing data files. You can either enable --rebuild-missing-dblock-files or run the purge command to remove these files. The following files are missing: duplicati-bcdc73bf0f0904a31b78b1674d333c586.dblock.zip.aes
Return code: 100

Backup:
Error while running Fotos neu
Found 2 files that are missing from the remote storage, please run repair

Purge:
Missing file: duplicati-bcdc73bf0f0904a31b78b1674d333c586.dblock.zip.aes
Missing file: duplicati-ibb3819b50c674446b0ee9c3d6837d459.dindex.zip.aes
Found 2 files that are missing from the remote storage, please run repair
The operation PurgeFiles has failed with error: Found 2 files that are missing from the remote storage, please run repair => Found 2 files that are missing from the remote storage, please run repair

ErrorID: MissingRemoteFiles
Found 2 files that are missing from the remote storage, please run repair
Return code: 100

Repair:
The operation Repair has failed with error: The backup storage destination is missing data files. You can either enable --rebuild-missing-dblock-files or run the purge command to remove these files. The following files are missing: duplicati-bcdc73bf0f0904a31b78b1674d333c586.dblock.zip.aes => The backup storage destination is missing data files. You can either enable --rebuild-missing-dblock-files or run the purge command to remove these files. The following files are missing: duplicati-bcdc73bf0f0904a31b78b1674d333c586.dblock.zip.aes

ErrorID: MissingDblockFiles
The backup storage destination is missing data files. You can either enable --rebuild-missing-dblock-files or run the purge command to remove these files. The following files are missing: duplicati-bcdc73bf0f0904a31b78b1674d333c586.dblock.zip.aes
Return code: 100

I created these 2 files and ran repair again.
Repair:
remote file duplicati-bcdc73bf0f0904a31b78b1674d333c586.dblock.zip.aes is listed as Verified with size 0 but should be 52337501, please verify the sha256 hash “X1x5vbHmsOnV63NWqYf8/UN6FHjwH6y0pg188qgEBrs=”
Downloading file duplicati-bcdc73bf0f0904a31b78b1674d333c586.dblock.zip.aes (unknown) …
Downloading file duplicati-bcdc73bf0f0904a31b78b1674d333c586.dblock.zip.aes (unknown) …
Downloading file duplicati-bcdc73bf0f0904a31b78b1674d333c586.dblock.zip.aes (unknown) …
Downloading file duplicati-bcdc73bf0f0904a31b78b1674d333c586.dblock.zip.aes (unknown) …
Downloading file duplicati-bcdc73bf0f0904a31b78b1674d333c586.dblock.zip.aes (unknown) …
Failed to perform verification for file: duplicati-bcdc73bf0f0904a31b78b1674d333c586.dblock.zip.aes, please run verify; message: Invalid header marker => Invalid header marker
Uploading file duplicati-ibb3819b50c674446b0ee9c3d6837d459.dindex.zip.aes (541 bytes) …
Return code: 0

verify:
remote file duplicati-bcdc73bf0f0904a31b78b1674d333c586.dblock.zip.aes is listed as Verified with size 0 but should be 52337501, please verify the sha256 hash “X1x5vbHmsOnV63NWqYf8/UN6FHjwH6y0pg188qgEBrs=”
Downloading file duplicati-20240712T085556Z.dlist.zip.aes (66,40 KB) …
Downloading file duplicati-ic505f57c04434026b4018656d01dde6d.dindex.zip.aes (33,90 KB) …
Downloading file duplicati-b5135592515914655827a464a2b74a15a.dblock.zip.aes (49,95 MB) …
Examined 3 files and found no errors
Return code: 0

run backup again:
2024-07-15 10:21:49 +02 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingRemoteHash]: remote file duplicati-bcdc73bf0f0904a31b78b1674d333c586.dblock.zip.aes is listed as Verified with size 0 but should be 52337501, please verify the sha256 hash “X1x5vbHmsOnV63NWqYf8/UN6FHjwH6y0pg188qgEBrs=”
2024-07-15 10:22:03 +02 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingRemoteHash]: remote file duplicati-bcdc73bf0f0904a31b78b1674d333c586.dblock.zip.aes is listed as Verified with size 0 but should be 52337501, please verify the sha256 hash “X1x5vbHmsOnV63NWqYf8/UN6FHjwH6y0pg188qgEBrs=”

verify
remote file duplicati-bcdc73bf0f0904a31b78b1674d333c586.dblock.zip.aes is listed as Verified with size 0 but should be 52337501, please verify the sha256 hash “X1x5vbHmsOnV63NWqYf8/UN6FHjwH6y0pg188qgEBrs=”
Downloading file duplicati-20240715T075709Z.dlist.zip.aes (1,87 MB) …
Downloading file duplicati-i2b570aefd3e74e389c88b4bdcff78b82.dindex.zip.aes (37,50 KB) …
Downloading file duplicati-b6b61b6eb387d4583b4cbfeef01adc4a8.dblock.zip.aes (49,93 MB) …
Examined 3 files and found no errors
Return code: 0

Repair directly in Duplicati (without the Commandline):
2024-07-15 10:50:56 +02 - [Error-Duplicati.Library.Main.Operation.RepairHandler-RemoteFileVerificationError]: Failed to perform verification for file: duplicati-bcdc73bf0f0904a31b78b1674d333c586.dblock.zip.aes, please run verify; message: Invalid header marker
CryptographicException: Invalid header marker

This is a never ending story!
In the end, I deleted the backup and started again. After a week, the same error again:

2024-07-20 12:44:51 +02 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingRemoteHash]: remote file duplicati-b4d2586fcce284e9fa10e1b504a119bf1.dblock.zip.aes is listed as Uploaded with size 0 but should be 52362253, please verify the sha256 hash “kuYrmTor23d1TqDJrceLby7TVGgG1feRnswrcvxT9uc=”
2024-07-20 12:45:12 +02 - [Warning-Duplicati.Library.Main.Operation.FilelistProcessor-MissingRemoteHash]: remote file duplicati-b4d2586fcce284e9fa10e1b504a119bf1.dblock.zip.aes is listed as Uploaded with size 0 but should be 52362253, please verify the sha256 hash “kuYrmTor23d1TqDJrceLby7TVGgG1feRnswrcvxT9uc=”

So Duplicati is not a program you can trust for your backups!

It should and usually does. It should clean up the half-done stuff, and continue doing backup.

If you doubt this, you can certainly try it on purpose, but it’s not 100% without possible issues.

I’ve killed Duplicati abruptly thousands of times, and some of the findings are in the list above.

I’m not testing with your WebDAV though, and I said some things on it before, and will repeat.

Especially for the options, some are essential or desirable for the command that you are running.
That argument is weaker for the top boxes such as Target URL and Commandline arguments; however, the GUI Commandline experience is similar to what a text-based command line gives.

Using Duplicati from the Command Line

and if you’re trying to match a GUI job, you do Export As Command-line for the backup export, changing the command and what follows it by following the directions for it on the page above.

If you don’t do it that way, there is a good chance you’ll forget something, and have to struggle.
Perhaps GUI support will improve someday, but for now some things need a commandline run.

There are possibly some small development changes that would make life a little easier (because developers are always in short supply), such as giving a link to the user manual page.

This means the local database is not seeing a file it expects. Seeing deeper history of that file is possible in the database, but it would need you to post a link to a database bug report. Let’s wait on that.

The rebuild suggestion is usually not very effective, so let’s hold off and see what else happens.

I was going to ask for About → Show log → Live → Warning for names, but they might be below.

Was that actually The PURGE command? Or was it The PURGE-BROKEN-FILES command?

I think the message intended the latter (probably a bug). Anyway, those might be the two files.
Maybe a developer will stop by to confirm the typo and fix it, and advise on recovery methods.

Recovering by purging files is optional list-broken-files and then a purge-broken-files which deletes the backups of any source file versions that have data in the missing dblock file.

How it went missing is a different question. Duplicati does rely on a destination not losing files. Offhand, I know of one Duplicati bug where an interruption at the wrong time can make it think there’s a missing dindex file (false alarm), but I don’t think I know of a false positive on dblocks.

Especially after your second failure after another week, some upload issue might actually exist.

Do you have the job log from the last Recreate? If so, the Complete log might hold some clues. Perhaps you remember something also? Any messages? Were you watching its progress bar?
The last 10% can become slow if the backup destination is missing blocks and needs a search.
What’s supposed to happen is the dindex files locate everything, but you’re missing one dindex.

To tell more about the history and importance of the missing files would need the DB bug report.

You can’t just toss files into the destination and expect Duplicati to treat them as proper files.
Duplicati checks the backup, and is complaining. This is expected when the destination is wrong.

Well, no, or at least not that you posted anywhere here. It looks like you got a different error, right?

This one looks like a recently uploaded default 50 MB file is now size 0. Can you confirm the size?

I still don’t like your using a Windows drive letter, as mentioned above. Are you still doing that? It’s another place to lose data, and it’s harder for Duplicati to notice if Windows WebDAV is losing files.

The WebDAV Storage Type is probably the safer choice for using WebDAV with Duplicati, to avoid Windows problems.

Do you have any idea now (maybe you’re paying attention?) as to whether there was any interruption of the previous backup? Knowing that would help to figure out if interruptions are causing troubles.

You can check your job logs and also the server log at About → Show log → Stored for their clues.

Having a log-file=<path> at log-file-log-level=information would have captured information on the second failure. For example, we could easily see interrupted backups, and also the filenames that got lost.
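For example, as two advanced options on the job (the path is my assumption; any writable location works):

```
--log-file=C:\Duplicati\fotos-log.txt
--log-file-log-level=Information
```

Set these now, and if the failure happens again in a week, the log will show what led up to it.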

Currently, that chance is lost, but you can still do About → Show log → Live → Warning and push the Verify files button to get the missing file names (you might need to click on the line). If it’s one dblock, it’s easier compared to last time, and you can probably do list-broken-files and purge-broken-files.

Files should not be breaking, and I still suspect that your Z: drive plan is adding extra areas to fail.

By the way, Microsoft also advises against using their WebDAV client. It’s already deprecated:

Deprecated features for Windows client

Inquiry about the Deprecation of WebClient/WebDAV Services, Overall Impact, and Replacement Options is people guessing at why that is.

On the WebDAV server, are there any logs that would have information on the lost/empty files?

Using the WebDAV Redirector shows a default FileSizeLimitInBytes of 50,000,000 (decimal).
You can check whether yours is set larger, but one file that failed to upload was 52,337,501 bytes.
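To make the comparison concrete, here is a quick check of both failed dblock sizes from your logs against that documented default limit:

```python
# Windows WebDAV redirector's documented default FileSizeLimitInBytes.
WEBDAV_DEFAULT_LIMIT = 50_000_000

# Sizes reported in the warnings above (bytes).
failed_dblock_1 = 52_337_501  # first failure's dblock file
failed_dblock_2 = 52_362_253  # second failure's dblock file

for size in (failed_dblock_1, failed_dblock_2):
    print(size, "exceeds limit:", size > WEBDAV_DEFAULT_LIMIT)  # True for both
```

Both files exceed the limit, which fits the pattern in your logs: the roughly 49.9 MB dblock files download fine, while the two just over 50,000,000 bytes ended up as size 0.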

I don’t know what size your normal backups are, but you can figure it out by file dates.
Even if your backup updates are small, I’d have thought initial backup would be large.

Zero-byte files would possibly explain those, if that’s what some clients leave behind on failure. Windows?
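If you want to look for such leftovers yourself, here is a minimal sketch (the destination path in the comment is from this thread and is an assumption about your setup) that lists zero-byte Duplicati volume files:

```python
from pathlib import Path

def find_zero_byte_volumes(dest):
    """Return names of duplicati-* files that are 0 bytes at the destination."""
    return sorted(p.name for p in Path(dest).glob("duplicati-*")
                  if p.stat().st_size == 0)

# Example, run against the mounted drive:
# print(find_zero_byte_volumes(r"Z:\Backup\Duplicati\Fotos"))
```

Any names it prints should match the files Duplicati complains about with “size 0 but should be” warnings.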

WebDAV API Functions is maybe similar to what the drive letter you use has available.
You can see the talk of having to flush local files to the server. What if it’s interrupted?

Webdav upload fast but hangs at 99% until it eventually finishes said that Windows caches uploads.
Personally, I’m not positive, but if you’re not using Duplicati’s WebDAV backend, I’d prefer you did.