Fatal error: Found inconsistency in the following files while validating database

Hello everyone, I recently encountered this fatal error in my backup and am looking for help. My setup:

  • Duplicati 2.1.0.5_stable_2025-03-04 running on Windows 11 (latest).
  • Source is the local computer; destination is a local network server running Ubuntu 24.04.2 LTS.
  • The backup had been running fine (daily) until a week ago.
  • I have not recently modified the backup settings or fileset in any way.
  • The files flagged in the error below are also untouched, and I have confirmed that both files open correctly. In addition, the same files are backed up to a remote (AWS) server, and that backup continues to run normally.

Error message:
Failed: Found inconsistency in the following files while validating database:
D:\D***1.tif, actual size 24701532, dbsize 24599132, blocksetid: 411493
D:\D***2.tif, actual size 24847848, dbsize 24745448, blocksetid: 411521
. Run repair to fix it.
Details: System.IO.InvalidDataException: Found inconsistency in the following files while validating database:
D:\D***1.tif, actual size 24701532, dbsize 24599132, blocksetid: 411493
D:\D***2.tif, actual size 24847848, dbsize 24745448, blocksetid: 411521
. Run repair to fix it.
at Duplicati.Library.Main.Database.LocalDatabase.VerifyConsistency(Int64 blocksize, Int64 hashsize, Boolean verifyfilelists, IDbTransaction transaction)
at Duplicati.Library.Main.Operation.Backup.BackupDatabase.<>c__DisplayClass34_0.<VerifyConsistencyAsync>b__0()
at Duplicati.Library.Main.Operation.Common.SingleRunner.DoRunOnMain[T](Func`1 method)
at Duplicati.Library.Main.Operation.BackupHandler.PreBackupVerify(String backendurl, Options options, BackupResults result)
at Duplicati.Library.Main.Operation.BackupHandler.RunAsync(String[] sources, IFilter filter, CancellationToken token)
at CoCoL.ChannelExtensions.WaitForTaskOrThrow(Task task)
at Duplicati.Library.Main.Operation.BackupHandler.Run(String[] sources, IFilter filter, CancellationToken token)
at Duplicati.Library.Main.Controller.<>c__DisplayClass17_0.<Backup>b__0(BackupResults result)
at Duplicati.Library.Main.Controller.RunAction[T](T result, String& paths, IFilter& filter, Action`1 method)

Here’s what I have tried, with the results. The backup continues to fail with the error above.

  1. Database Repair: [Information-Duplicati.Library.Main.Operation.RepairHandler-DatabaseIsSynchronized]: Destination and database are synchronized, not making any changes
  2. list-broken-files: [Information-Duplicati.Library.Main.Operation.ListBrokenFilesHandler-NoBrokenFilesetsInDatabase]: No broken filesets found in database, checking for missing remote files
    [Information-Duplicati.Library.Main.Operation.ListBrokenFilesHandler-NoMissingFilesFound]: Skipping operation because no files were found to be missing, and no filesets were recorded as broken.
  3. Ran chkdsk /f on the source drive, plus sfc /scannow and dism /online /cleanup-image /restorehealth on the source computer, then restarted.
  4. Restarted the destination server.

I have not yet tried a database rebuild; I am looking to see if there is an intermediate step I should try first. Happy to provide any further info, thanks for reading!
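As a side note, the mismatch the error reports can be checked directly against the local SQLite database. A minimal sketch, assuming the schema implied by the error message (a Blockset table whose Length column is the "dbsize" being compared); the helper name is mine, not part of Duplicati:

```python
import os
import sqlite3

def check_blockset_size(db_path, file_path, blockset_id):
    """Compare a file's on-disk size with the length recorded in
    Duplicati's local database (the 'dbsize' from the error message).
    Returns (actual_size, db_size, match), or None if the blockset
    ID is not in the database."""
    con = sqlite3.connect(db_path)
    try:
        row = con.execute(
            "SELECT Length FROM Blockset WHERE ID = ?", (blockset_id,)
        ).fetchone()
    finally:
        con.close()
    if row is None:
        return None
    db_size = row[0]
    actual_size = os.path.getsize(file_path)
    return actual_size, db_size, actual_size == db_size
```

Interestingly, both files in the error differ from their recorded sizes by exactly 102400 bytes (24701532 − 24599132 = 24847848 − 24745448 = 102400), which suggests the source files grew by the same amount after the database recorded them.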

Additional troubleshooting done now, per tips from @Wim_Jansen in this forum post:

  1. Made a backup of my SQLite DB.
  2. Located the dblock volume containing blockset ID 411493 using the following query (modified slightly from the original post):
     SELECT DISTINCT Remotevolume.Name
     FROM Remotevolume
     INNER JOIN Block ON (Block.VolumeID = Remotevolume.ID)
     INNER JOIN BlocksetEntry ON (Block.ID = BlocksetEntry.BlockID)
     WHERE BlocksetEntry.BlocksetID = 411493;
  3. Moved that ***.dblock.zip file out of the remote folder.
  4. Ran list-broken-files per this post.
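For anyone who wants to script the lookup in step 2 instead of running it in the sqlite3 shell, here is a minimal sketch using Python's sqlite3 module. The function name is mine; the table and column names come from the query above:

```python
import sqlite3

# Same query as above, parameterized on the blockset ID.
QUERY = """
SELECT DISTINCT Remotevolume.Name
FROM Remotevolume
INNER JOIN Block ON (Block.VolumeID = Remotevolume.ID)
INNER JOIN BlocksetEntry ON (Block.ID = BlocksetEntry.BlockID)
WHERE BlocksetEntry.BlocksetID = ?
"""

def dblocks_for_blockset(db_path, blockset_id):
    """Return the names of the remote volumes (dblock files) that
    hold blocks belonging to the given blockset ID."""
    con = sqlite3.connect(db_path)
    try:
        return [name for (name,) in con.execute(QUERY, (blockset_id,))]
    finally:
        con.close()
```

Run it against a copy of the local database, never the live one, since the backup job may have it open.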

Result (note that this ran for almost 4 hours):
2025-07-11 12:00:12 -04 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation ListBrokenFiles has started
2025-07-11 15:56:01 -04 - [Warning-Duplicati.Library.Modules.Builtin.ReportHelper-ReportSubmitError]: Failed to send message: System.NotImplementedException: The method or operation is not implemented.

System.NotImplementedException: The method or operation is not implemented.
at Duplicati.Library.Main.Operation.ListBrokenFilesHandler.MockList`1.System.Collections.IEnumerable.GetEnumerator()
at Duplicati.Library.Utility.Utility.PrintSerializeObject(Object item, TextWriter writer, Func`3 filter, Boolean recurseobjects, Int32 indentation, Int32 collectionlimit, Dictionary`2 visited)
at Duplicati.Library.Utility.Utility.PrintSerializeObject(Object item, TextWriter writer, Func`3 filter, Boolean recurseobjects, Int32 indentation, Int32 collectionlimit, Dictionary`2 visited)
at Duplicati.Library.Utility.Utility.PrintSerializeObject(Object item, StringBuilder sb, Func`3 filter, Boolean recurseobjects, Int32 indentation, Int32 collectionlimit)
at Duplicati.Library.Modules.Builtin.ResultSerialization.DuplicatiFormatSerializer.Serialize(Object result, Exception failException, IEnumerable`1 loglines, Dictionary`2 additional)
at Duplicati.Library.Modules.Builtin.ReportHelper.ReplaceTemplate(String input, Object result, Exception exception, Boolean subjectline, IResultFormatSerializer resultFormatSerializer)
at Duplicati.Library.Modules.Builtin.ReportHelper.ReplaceTemplate(String input, Object result, Exception exception, Boolean subjectline)
at Duplicati.Library.Modules.Builtin.ReportHelper.OnFinish(Object result, Exception exception)

This is an issue that has been fixed since 2.1.0.5.

If you use the latest experimental or canary builds, the repair process can fix the database for you.

This one is a warning that has also been fixed: the operation report fails to send due to a bug in the serializer, but it has no impact on the operation itself.

Thank you to everyone who helped. As I was not ready to trust my backups to an experimental or canary build, I decided to do a database rebuild instead. This also failed, but I eventually discovered an error in a RAM chip. After replacing it, I ran the database rebuild again and everything is back to normal now.

Posting here in case this helps someone. Perhaps the RAM error was unrelated to the original issue, but it may be worth checking just in case.
