The operation Repair has failed with error: The response ended prematurely

I updated to 2.1.0.110 and Duplicati stopped backing up after a couple of days due to missing files (I think - I’ve wiped the db and hence the log).

I tried to run a repair and a purge on the command line but it didn’t help, so I deleted the db and ran a repair to recreate it.

After a couple of days it failed with this error:

The operation Repair has failed with error: The response ended prematurely. (ResponseEnded) => The response ended prematurely. (ResponseEnded)

System.Net.Http.HttpIOException: The response ended prematurely. (ResponseEnded)
   at System.Net.Http.HttpConnection.FillAsync(Boolean async)
   at System.Net.Http.HttpConnection.CopyToContentLengthAsync(Stream destination, Boolean async, UInt64 length, Int32 bufferSize, CancellationToken cancellationToken)
   at System.Net.Http.HttpConnection.ContentLengthReadStream.CompleteCopyToAsync(Task copyTask, CancellationToken cancellationToken)
   at Duplicati.Library.Backend.Backblaze.B2.GetAsync(String remotename, Stream stream, CancellationToken cancellationToken)
   at Duplicati.Library.Backend.Backblaze.B2.GetAsync(String remotename, Stream stream, CancellationToken cancellationToken)
   at Duplicati.Library.Main.Backend.BackendManager.GetOperation.DoGetFileAsync(IBackend backend, CancellationToken cancelToken)
   at Duplicati.Library.Main.Backend.BackendManager.GetOperation.ExecuteAsync(IBackend backend, CancellationToken cancelToken)
   at Duplicati.Library.Main.Backend.BackendManager.Handler.Execute[TResult](PendingOperation`1 op, CancellationToken cancellationToken)
   at Duplicati.Library.Main.Backend.BackendManager.Handler.Execute(PendingOperationBase op, CancellationToken cancellationToken)
   at Duplicati.Library.Main.Backend.BackendManager.Handler.ExecuteWithRetry(PendingOperationBase op, CancellationToken cancellationToken)
   at Duplicati.Library.Main.Backend.BackendManager.GetWithInfoAsync(String remotename, String hash, Int64 size, CancellationToken cancelToken)
   at Duplicati.Library.Main.Backend.BackendManager.GetFilesOverlappedAsync(IEnumerable`1 volumes, CancellationToken cancelToken)+MoveNext()
   at Duplicati.Library.Main.Backend.BackendManager.GetFilesOverlappedAsync(IEnumerable`1 volumes, CancellationToken cancelToken)+System.Threading.Tasks.Sources.IValueTaskSource<System.Boolean>.GetResult()
   at Duplicati.Library.Main.Operation.RecreateDatabaseHandler.DoRun(IBackendManager backendManager, LocalDatabase dbparent, Boolean updating, IFilter filter, NumberedFilterFilelistDelegate filelistfilter, BlockVolumePostProcessor blockprocessor)
   at Duplicati.Library.Main.Operation.RecreateDatabaseHandler.DoRun(IBackendManager backendManager, LocalDatabase dbparent, Boolean updating, IFilter filter, NumberedFilterFilelistDelegate filelistfilter, BlockVolumePostProcessor blockprocessor)
   at Duplicati.Library.Utility.Utility.Await(Task task)
   at Duplicati.Library.Main.Operation.RecreateDatabaseHandler.Run(String path, IBackendManager backendManager, IFilter filter, NumberedFilterFilelistDelegate filelistfilter, BlockVolumePostProcessor blockprocessor)
   at Duplicati.Library.Main.Operation.RepairHandler.RunRepairLocal(IBackendManager backendManager, IFilter filter)
   at Duplicati.Library.Main.Operation.RepairHandler.Run(IBackendManager backendManager, IFilter filter)
   at Duplicati.Library.Main.Controller.<>c__DisplayClass26_0.<Repair>b__0(RepairResults result, IBackendManager backendManager)
   at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`2 method)
   at Duplicati.Library.Main.Controller.RunAction[T](T result, IFilter& filter, Action`2 method)
   at Duplicati.Library.Main.Controller.Repair(IFilter filter)
   at Duplicati.CommandLine.Commands.Repair(TextWriter outwriter, Action`1 setup, List`1 args, Dictionary`2 options, IFilter filter)
   at Duplicati.CommandLine.Program.ParseCommandLine(TextWriter outwriter, Action`1 setup, Boolean& verboseErrors, String[] args)
   at Duplicati.CommandLine.Program.RunCommandLine(TextWriter outwriter, TextWriter errwriter, Action`1 setup, String[] args)

I’m going to try running it again (actually I’m going to switch to 2.1.0.5_stable and try that version), but does anybody know what this means?

The error means that the connection to B2 was closed before the file was transferred fully.
There should be retries in effect, but if there is something flaky, you can set --number-of-retries to something larger than the default 5.
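For example, something along these lines (a sketch only: the binary is named `duplicati-cli` in 2.1.x on Linux/macOS, `Duplicati.CommandLine.exe` on Windows, and the B2 URL is a placeholder for your own backend URL):

```shell
# Hypothetical invocation: raise the retry count and spacing to ride out
# a flaky connection. --number-of-retries defaults to 5; --retry-delay
# accepts a time value such as "30s".
duplicati-cli repair "b2://my-bucket/my-folder" \
  --number-of-retries=10 \
  --retry-delay=30s
```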

How large is this backup?

Source: 511 GB, destination: 616 GB.

Since I re-ran my repair shortly after that message, it’s still going. Database currently at 5.6 GB.

I’m getting lots of “unexpected changes caused by block” messages interspersed with block downloads.

I guess it’s doing what it’s supposed to?

It finished with this error, which is basically where I started:

failed to retrieve file duplicati-bda2b7e33292948e0b3d0816006d03583.dblock.zip.aes => The requested file does not exist
Failed to use information from duplicati-bda2b7e33292948e0b3d0816006d03583.dblock.zip.aes to rebuild database: The requested file does not exist => The requested file does not exist
Found 5 missing volumes; attempting to replace blocks from existing volumes
The operation Repair has failed with error: Recreated database has missing blocks and 5 broken filelists. Consider using "list-broken-files" and "purge-broken-files" to purge broken data from the remote store and the database. => Recreated database has missing blocks and 5 broken filelists. Consider using "list-broken-files" and "purge-broken-files" to purge broken data from the remote store and the database.

ErrorID: DatabaseIsBrokenConsiderPurge
Recreated database has missing blocks and 5 broken filelists. Consider using "list-broken-files" and "purge-broken-files" to purge broken data from the remote store and the database.

I’ll try purge again but it didn’t work before I recreated the database.

Purge remote files gave me this:

  Uploading file duplicati-20250111T003915Z.dlist.zip.aes (80.923 MB) ...
  Deleting file duplicati-20250111T003914Z.dlist.zip.aes  ...
  Uploading file duplicati-20250224T003002Z.dlist.zip.aes (82.672 MB) ...
  Deleting file duplicati-20250224T003001Z.dlist.zip.aes  ...
  Uploading file duplicati-20250226T003001Z.dlist.zip.aes (82.712 MB) ...
  Deleting file duplicati-20250226T003000Z.dlist.zip.aes  ...
  Uploading file duplicati-20250227T003001Z.dlist.zip.aes (82.703 MB) ...
  Deleting file duplicati-20250227T003000Z.dlist.zip.aes  ...
  Uploading file duplicati-20250228T003002Z.dlist.zip.aes (82.699 MB) ...
  Deleting file duplicati-20250228T003001Z.dlist.zip.aes  ...
The operation PurgeBrokenFiles has failed with error: Found 335 file(s) with missing blocklist hashes => Found 335 file(s) with missing blocklist hashes

System.IO.InvalidDataException: Found 335 file(s) with missing blocklist hashes
   at Duplicati.Library.Main.Database.LocalDatabase.VerifyConsistency(Int64 blocksize, Int64 hashsize, Boolean verifyfilelists, IDbTransaction transaction)
   at Duplicati.Library.Main.Operation.PurgeBrokenFilesHandler.Run(IFilter filter)
   at Duplicati.Library.Main.Controller.<>c__DisplayClass38_0.<PurgeBrokenFiles>b__0(PurgeBrokenFilesResults result)
   at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)
   at Duplicati.Library.Main.Controller.RunAction[T](T result, Action`1 method)
   at Duplicati.Library.Main.Controller.PurgeBrokenFiles(IFilter filter)
   at Duplicati.CommandLine.Commands.PurgeBrokenFiles(TextWriter outwriter, Action`1 setup, List`1 args, Dictionary`2 options, IFilter filter)
   at Duplicati.CommandLine.Program.ParseCommandLine(TextWriter outwriter, Action`1 setup, Boolean& verboseErrors, String[] args)
   at Duplicati.CommandLine.Program.RunCommandLine(TextWriter outwriter, TextWriter errwriter, Action`1 setup, String[] args)

Where do I go from here?

I think the slow rebuilding and the “unexpected changes caused by block” messages show that something is wrong with the index files. If the index files do not fully contain the data needed to recreate the database, Duplicati will begin to download the dblock files, which contain the actual data. This operation is of course really time consuming, as it will (eventually) chew through the entire 616 GB of data in an attempt to find what is missing. (If the missing parts are found, it will stop before downloading everything.)

This message is from an internal consistency check of the database that finds files with nothing attached. This looks like a case where the purge-broken-files operation did not succeed.

Can you create a bugreport of the database and share it? Then I can perhaps drag out some clues as to why the purge did not work.
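A bug report can be created from the CLI as well; roughly like this (a sketch, with placeholder URL, database path, and output file name — and note the report scrubs file names and other private data before packaging):

```shell
# Hypothetical example: package the local database into a shareable,
# anonymized bug report file.
duplicati-cli create-report "b2://my-bucket/my-folder" bugreport.zip \
  --dbpath=/path/to/backup.sqlite
```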

Otherwise, I would suggest trying again to run list-broken-files to see if it will reveal which versions are broken. I am guessing that the problem lies with the filesets that were re-uploaded during the purge command. If you can live without those versions, I would suggest deleting them, which should clear up the errors.
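In CLI terms, something like this (the URL and version numbers are placeholders; the version numbers come from the list-broken-files output):

```shell
# First see which backup versions reference broken data...
duplicati-cli list-broken-files "b2://my-bucket/my-folder"

# ...then delete the affected versions by number (placeholders shown).
duplicati-cli delete "b2://my-bucket/my-folder" --version=3,5
```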

If you have the patience for it, you could also try the recreate again, as the updated filelists should have resolved the original problem.
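A recreate is triggered by running repair with no local database present; sketched below with placeholder paths (moving the old database aside rather than deleting it keeps a fallback):

```shell
# Move the old database out of the way, then repair recreates it
# from the remote dlist/dindex (and, if needed, dblock) files.
mv /path/to/backup.sqlite /path/to/backup.sqlite.bak
duplicati-cli repair "b2://my-bucket/my-folder" \
  --dbpath=/path/to/backup.sqlite
```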

If you just want to move on, you can start the backup over. You can still restore from the current data set, but until the data consistency issues are resolved, Duplicati will not make new backups to that destination.

Admittedly, uploading a ~6 GB file somewhere is a pain though.

I did try deleting a partial backup, but it failed due to missing files.

list-broken-files is no longer showing any files.

I’ve re-created to a new destination and that’s working. I don’t think I particularly need anything from the old backup so I’ll hold on to it for a month and then delete it.

Thanks
