How to recover from an interrupted backup

Hello, I’ve been using Duplicati for a long time, but this is the biggest problem I’ve encountered, so I need help.

  • Duplicati version: 2.0.3.2
  • Operating system: Windows 2012 R2
  • Backend: SMB
  • Backup size: source 86 GB, backup 67 GB, 150 versions

The backup job was interrupted by a Windows Update reboot.

The next backup failed with:

Failed: Found 6 file(s) with missing blocklist hashes
System.IO.InvalidDataException: Found 6 file(s) with missing blocklist hashes
   at Duplicati.Library.Main.Database.LocalDatabase.VerifyConsistency(IDbTransaction transaction, Int64 blocksize, Int64 hashsize, Boolean verifyfilelists)
   at Duplicati.Library.Main.Operation.BackupHandler.Run(String[] sources, IFilter filter)
   at Duplicati.Library.Main.Controller.<>c__DisplayClass17_0.<Backup>b__0(BackupResults result)
   at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)
   at Duplicati.Library.Main.Controller.Backup(String[] inputsources, IFilter filter)
   at Duplicati.Server.Runner.Run(IRunnerData data, Boolean fromQueue)

I already tried to fix this:

  1. Repair - either no problem, or:
     Unexpected update count: 0
        at Duplicati.Library.Main.Database.LocalRepairDatabase.FixMissingBlocklistHashes(String blockhashalgorithm, Int64 blocksize)
        at Duplicati.Library.Main.Operation.RepairHandler.RunRepairCommon()
        at Duplicati.Library.Main.Operation.RepairHandler.Run(IFilter filter)
        at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)
        at Duplicati.Library.Main.Controller.Repair(IFilter filter)
        at Duplicati.CommandLine.Commands.Repair(TextWriter outwriter, Action`1 setup, List`1 args, Dictionary`2 options, IFilter filter)
        at Duplicati.CommandLine.Program.RunCommandLine(TextWriter outwriter, TextWriter errwriter, Action`1 setup, String[] args)
     Return code: 100
  2. Verify - no problem
  3. list-broken-files - nothing
  4. purge-broken-files - nothing
  5. Verify "all" with --full-remote-verification=true - I get "Examined 907 files and found no errors", Return code: 0

So I backed up my 4 GB DB and did a database recreate. Recreating was fast - the 4 GB DB rebuilt in 1.5 hours. But I got an error:

The database was attempted repaired, but the repair did not complete. This database may be incomplete and the repair process is not allowed to alter remote files as that could result in data loss.

   at Duplicati.Library.Main.Operation.RepairHandler.RunRepairRemote()
   at Duplicati.Library.Main.Operation.RepairHandler.Run(IFilter filter)
   at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)
   at Duplicati.Library.Main.Controller.Repair(IFilter filter)
   at Duplicati.Server.Runner.Run(IRunnerData data, Boolean fromQueue)

I tried to create a bug report, and because my DB is larger than 4 GB I got:

System.NotSupportedException: Attempted to write a stream that is larger than 4GiB without setting the zip64 option
   at SharpCompress.Writers.Zip.ZipWriter.ZipWritingStream.Write(Byte[] buffer, Int32 offset, Int32 count)
   at Duplicati.Library.Utility.Utility.CopyStream(Stream source, Stream target, Boolean tryRewindSource, Byte[] buf)
   at Duplicati.Library.Main.Operation.CreateBugReportHandler.Run()
   at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)
   at Duplicati.Library.Main.Controller.CreateLogDatabase(String targetpath)
   at Duplicati.Server.Runner.Run(IRunnerData data, Boolean fromQueue)

After recreating the DB I’m still stuck.
Backup fails with:

The database was attempted repaired, but the repair did not complete. This database may be incomplete and the backup process cannot continue. You may delete the local database and attempt to repair it again.

And repair fails with:

The database was attempted repaired, but the repair did not complete. This database may be incomplete and the repair process is not allowed to alter remote files as that could result in data loss.

I can still restore the backup of the DB from before the repair, but what else can I try?
I have no problem with some data loss in the backup, but I don’t want to lose all 150 versions.

Thank you for any ideas.

I tested a GUI Commandline create-report whose output 7-Zip identified as zip64. It had header warnings, but it extracted.

  --zip-compression-zip64 (Boolean): Toggles Zip64 support
    The zip64 format is required for files larger than 4GiB, use this flag to toggle it
    * default value: False

It doesn’t seem to try remote operations (though I’m not sure why it would), but adding --dry-run didn’t hurt.
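In case it helps, here’s roughly the command I ran - a sketch only, reusing my local test job’s storage URL and dbpath (yours will differ); the zip64 flag is the important part:

  Duplicati.CommandLine.exe create-report "file://C:\Duplicati Backups\local test 1\\" C:\temp\bugreport.zip --dbpath="C:\ProgramData\Duplicati\70696975807290758771.sqlite" --zip-compression-zip64=true --dry-run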

What happens to the big file after that isn’t clear to me, either for transfer or for analysis tools and techniques.

Log files set up with --log-file and --log-level options are somewhat more human-friendly. Do you have any?

Another advantage is they’re not in the job database, so are immune from actions that delete the database.
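If you don’t have one yet, it’s just two extra options - a sketch with a placeholder path; these can go in the job’s advanced options or on any Commandline run (Profiling logs more than Information, at the cost of noise):

  --log-file=C:\temp\duplicati-job.log --log-level=Information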

Some messages are stored in the server database, under About --> Show log. Those should still be intact.

In addition to getting this backup recovered, it would be great to better understand how the problems arise.

Did you by any chance save off copies of the job database at an early state in case that helps with history?

That leads up to the idea of saving off a copy of destination files, probably just to whatever backend it’s on.

Once the valuable backup is well backed up, it gives more freedom in attempting to get the original healthy.

Even safer might be to do this as a backend migration (see forum posts), then work on copies not originals.
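For an SMB destination, the safety copy could be as simple as a robocopy - a sketch, with placeholder paths:

  robocopy "\\server\share\duplicati" "D:\duplicati-safety-copy" /E /COPY:DAT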

The idea that I will tentatively propose (but strongly advise you don’t try without good safeguards) is this: if Duplicati was healthy before the interruption (it looks like it does a VerifyConsistency before every backup), then date-based removal of backend files from the failed backup onward (which could just be moves to some folder outside the destination) might give you an earlier view of the backend. Then do the recreate based on that. There’s no guarantee it will work; there might have been some latent issue earlier, or even program bugs.
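As a sketch of that “move, don’t delete” step (paths and the cutoff date are placeholders - use the date of the interrupted backup): robocopy’s /MAXAGE, given a YYYYMMDD value, excludes files older than that date, so combined with /MOV it quarantines only the files from that date forward:

  robocopy "\\server\share\duplicati" "D:\duplicati-quarantine" duplicati-* /MOV /MAXAGE:20180801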

It would be best if it could somehow be verified that no compact was part of the backup. While backups are typically just uploads of new files to the backend, the repackaging from a compact defeats the dating plan. Verification might be from the --log-file log (if it exists) or (less likely due to the interruption?) job database.
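If a --log-file log exists, a crude first check could be a simple search - again a sketch, with a placeholder path:

  findstr /i compact C:\temp\duplicati-job.log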

Any help from the destination? Windows used to keep Previous Versions by VSS, but I think it’s worse now.

You have my ideas now. If nothing else, doing a backup of the backup shouldn’t hurt, and may take a while, which might give the expert people time to consider and suggest (and it might be different from my idea…).

While I wouldn’t suggest this for an inexperienced person, you could also just take this plan and keep trying. Basically, it is just precautions, then a rollback - much as you might have Duplicati do for a source problem.

Or you could just wait for this to play out without rushing. Maybe a smaller new backup job could hold you over.


Hello, thank you for the reply :)

I tested create-report with another, smaller backup - it only allows me to download a zip with the DB and a text file with basic info about the environment - I don’t think I need that anyway.

I have three copies of the database.
Just today I found out that I have a backup of the DB from just before the interrupted backup - so if everything fails, I could probably just restore that DB and be fine.

  1. DB copy from just before the interrupted backup - I expect that restoring this DB will solve my problem.
  2. DB copy from after the interrupted backup
  3. DB copy from after the failed rebuild

My backend is SMB - backup retention is set to keep all backups.
Compact was probably not part of the backup: the backup job takes 5 hours, and the reboot was in the first hour.

I already tried to rebuild the DB both with and without the files from the interrupted backup - the result was the same.
It’s strange, because the last backup was successful.

And I created a second, new backup job, so I’m not in a rush to solve this problem.

I can set up a log file - but which operation would be best to log and analyze? I can replicate every error.

This is the verbose log from a backup run with the "Found 6 file(s) with missing blocklist hashes" error:
https://pastebin.com/2vykMgZq

This is the verbose repair log with the "Unexpected update count: 0" error:
https://pastebin.com/eDcRiwEb

Both of these logs are from the backup job’s settings. Logging to a file from the global settings did not log anything.

Thank you

I tested create-report with another, smaller backup - it only allows me to download a zip with the DB and a text file with basic info about the environment - I don’t think I need that anyway.

I’m confused. Previously you documented a failure when you tried to create a bug report. The --zip-compression-zip64 option was the solution to what that error message stated about the need to set the zip64 option.

Creating a bug report just says “Click the Download button and send the report to the Duplicati development team for further investigation.” which is what you’d do (if requested, I suppose) with the Commandline report. Although the database inside is called log-database.sqlite, it’s your database, sanitized some, e.g. for paths. By the way, I only used Commandline to keep zip64 out of my backups even though it’s supposed to be safe.

Presumably someone expert and armed with the right tools could study the issue in a way an exception can’t. The same seems to hold true for logs. Some stuff is easier, e.g. did it Compact? Someone used to the usual rhythm of the backup could possibly notice an oddity in the log, especially at detailed levels, e.g. Information.

I’m happy that you seem to be in a relatively good position, but I’m not sure how the folks at #1183 are doing.


Thank you, you are right, I was confused :)
I thought that “Create a bug report” just let me download the DB. But in fact the data in the bug report DB is anonymized (file names are replaced).

I successfully created a bug report - so I can send it to someone who is interested in this error.

@JonMikeIV speculated in a #1183 post that a recreate could pick up a problem inside a file on the backend.

Web UI Recreate describes itself as delete and repair. The help for repair doesn’t say exactly which options it takes, but the source implies it takes --time (which had bugs) and --version (which worked, even including ranges):

--version (String): The version to list/restore files
  By default, Duplicati will list and restore files from the most recent backup, use this option to select another item. You may enter multiple values separated with comma, and ranges using -, e.g. "0,2-4,7"

To make sure you can actually spot the problem from the CLI, you’d first try an ample range - e.g. 0-999 ought to bring in your 150 versions and then error. Next, delete the database (e.g. with del in Command Prompt) and narrow…

For a starter command line, you can try exporting your job configuration as a command line, then edit it into:

Usage: repair <storage-URL> [<options>]

  Tries to repair the backup. If no local db is found or the db is empty, the db is re-created with data from the storage. If the db is in place but the remote storage is corrupt, the remote storage gets repaired with local data (if
  available).
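Put together, the narrowing loop might look like this sketch (storage URL and dbpath are placeholders - take them from your exported command line):

  del "C:\ProgramData\Duplicati\JOBDB.sqlite"
  Duplicati.CommandLine.exe repair "file://\\server\share\duplicati" --dbpath="C:\ProgramData\Duplicati\JOBDB.sqlite" --version=0-999
  rem on error, delete the database again and retry with a narrower range,
  rem e.g. --version=0-74, then --version=75-149, until the bad version is isolated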

I haven’t tried a version range on verify/test or other things, partly because it seemed most direct to isolate a repair failure. On the other hand, if it really works with verify/test you might be able to set it and let it labor…

Usage: test <storage-URL> <samples> [<options>]

  Verifies integrity of a backup. A random sample of dlist, dindex, dblock files is downloaded, decrypted and the content is checked against recorded size values and data hashes. <samples> specifies the number of samples to be tested.
  If "all" is specified, all files in the backup will be tested. This is a rolling check, i.e. when executed another time different samples are verified than in the first run. A sample consists of 1 dlist, 1 dindex, 1 dblock.

  --time=<time>
    Checks samples from a specific time.
  --version=<int>
    Checks samples from specific versions. Delimiters are , -
  --full-remote-verification
    Checks the internal structure of each file instead of just verifying the file hash
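If version ranges really do work on test, the set-it-and-let-it-labor run might look like this sketch (placeholders as before; unverified, hence the hedging above):

  Duplicati.CommandLine.exe test "file://\\server\share\duplicati" all --dbpath="C:\ProgramData\Duplicati\JOBDB.sqlite" --full-remote-verification=true --version=0-149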

Or you can wait for more expert guidance. Possibly your log files and/or your database report will be enough.

By the way, thanks for the information you’ve supplied, and if you chase this further, thank you for doing so. Such rare unrecoverable failures need to be prevented somehow, so it seems helpful to examine them well.

Next idea ;) - it seems like a bit of a long shot, and probably should be tried after trying to bisect the repair. The good thing about this idea is that it may be faster, because it only needs one download of the backend files.

There’s a tool whose name is possibly listed wrong there. I have a Duplicati.CommandLine.RecoveryTool.exe whose help led me through download and index. An index run made an index.txt with lines mapping hash to dblock. If it ingested a bad file, there’s a chance it would error, or maybe put something odd-looking into the index file.

One bonus is that it decrypts the backend files, so it gets easier to check files suspected as bad (if you like).
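From what its help led me through, the sequence is roughly this sketch (URL, folders, and passphrase are placeholders; the work folder needs space for the whole backend):

  Duplicati.CommandLine.RecoveryTool.exe download "file://\\server\share\duplicati" D:\recovery-work --passphrase=mysecret
  Duplicati.CommandLine.RecoveryTool.exe index D:\recovery-work
  Duplicati.CommandLine.RecoveryTool.exe restore D:\recovery-work 0 --targetpath=D:\recovery-restore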

Hello, thank you for helping :) I also feel this might be an important problem - I’m starting to rely on Duplicati, and things should not be this fragile.

I already tried test/verify on the storage (in my first post) - I even used "all" and --full-remote-verification=true, which should download all data, decrypt it and check the content.

Do you really think I can pass the --version parameter to repair? I will try that when I can.

Also, thank you for the idea with the RecoveryTool - that may actually find some problem in the backup files.
The only problem is that I have to find a test environment with a lot of free space - the RecoveryTool needs to download the backup files locally.

Fortunately, I’m not in a hurry, so we have a chance of solving it :slight_smile:

My first thought when you reported verify success was: good, but what version? I’d guess it used version 0.
I’m sure you can pass --version to repair:

C:\>"C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe" repair "file://C:\Duplicati Backups\local test 1\\" --dbpath="C:\ProgramData\Duplicati\70696975807290758771.sqlite" --encryption-module= --compression-module=zip --dblock-size=50mb --no-encryption=true --disable-module=console-password-input --version=0-0
  Listing remote folder ...
  Downloading file (812 bytes) ...
  Downloading file (2.89 KB) ...
  Downloading file (610 bytes) ...
  Downloading file (1.18 KB) ...
  Downloading file (650 bytes) ...

C:\>"C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe" repair "file://C:\Duplicati Backups\local test 1\\" --dbpath="C:\ProgramData\Duplicati\70696975807290758771.sqlite" --encryption-module= --compression-module=zip --dblock-size=50mb --no-encryption=true --disable-module=console-password-input --version=0-2
  Listing remote folder ...
  Downloading file (1.42 KB) ...
  Downloading file (717 bytes) ...
  Downloading file (812 bytes) ...
  Downloading file (2.89 KB) ...
  Downloading file (610 bytes) ...
  Downloading file (1.18 KB) ...
  Downloading file (650 bytes) ...
  Downloading file (965 bytes) ...
  Downloading file (1.13 KB) ...
  Downloading file (592.93 KB) ...
  Downloading file (2.16 MB) ...

C:\>

(and my other command prompt has a “del” command to alternate with testing different ranges for the repair)

Although it doesn’t do version ranges directly, so it’s not great for searching, this might help examine any suspected bad file.


Thank you, now we’re getting somewhere!

I was under the impression that verify without parameters would check all versions… but now I don’t know any more.

I ran verify with --version=149, and after that with versions 10 and 0.
I get the same error:

Listing remote folder …
Missing file: duplicati-be9040f5d75f543769ddd42dde42544fd.dblock.zip.aes
Missing file: duplicati-i970bfdb4dcd740feae4fdda0b408ed1f.dindex.zip.aes
Missing file: duplicati-b164e1caa808f4fbf9998ac5cbe46aa86.dblock.zip.aes
Missing file: duplicati-ic73b7685ca6d40a5814340fb3b434c16.dindex.zip.aes
Missing file: duplicati-bff4a003f542946398496638b59908da3.dblock.zip.aes
Found 5 files that are missing from the remote storage, please run repair
Found 5 files that are missing from the remote storage, please run repair
Return code: 100

I don’t know what changed, but now I’m getting this error even when I run verify without any parameters.
In my first post I was getting a "no problem" message.

Then I ran affected for one of the dblock files (duplicati-bff4a003f542946398496638b59908da3.dblock.zip.aes):

A total of 3 file(s) are affected:
D:\xyz\abc6\MSSQL\SUSDB_log.ldf
D:\xyz\abc6\MSSQL\workA.mdf
D:\xyz\abc6\MSSQL\workA_log.ldf
The following related log messages were found:
22.8.2018 10:25:42: MainOperation: Test Verifications: [ Key: duplicati-20180812T200000Z.dlist.zip.aes Va …
24.8.2018 10:05:50: {"Size":31102797,"Hash":"7hFhCpYG1TwqfIEpEJvxtyEzGvImMfFyHJwdMbBqHKw="}
Return code: 0

Strangely, if I run list-broken-files I get:

Finished!
  Listing remote folder ...
Return code: 0
And from the log:
[Information-Duplicati.Library.Main.Operation.ListBrokenFilesHandler-NoMissingFilesFound]: 
No broken filesets found

When I run the default repair:

  1. without any parameters
  2. with --version=149
  3. with --version=0

I get this old error:

System.Exception: Unexpected update count: 0
   at Duplicati.Library.Main.Database.LocalRepairDatabase.FixMissingBlocklistHashes(String blockhashalgorithm, Int64 blocksize)
   at Duplicati.Library.Main.Operation.RepairHandler.RunRepairCommon()
   at Duplicati.Library.Main.Operation.RepairHandler.Run(IFilter filter)
   at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)
   at Duplicati.Library.Main.Controller.Repair(IFilter filter)
   at Duplicati.CommandLine.Commands.Repair(TextWriter outwriter, Action`1 setup, List`1 args, Dictionary`2 options, IFilter filter)
   at Duplicati.CommandLine.Program.RunCommandLine(TextWriter outwriter, TextWriter errwriter, Action`1 setup, String[] args)
Return code: 100

So it looks to me like the problem is not in just one version of the backup?
I will try more testing later this week.
Thanks again!

Are you deleting the database between repairs? That seems like the cleanest test anyway.
Also, I think I was getting some not-quite-what-I-had-hoped-to-cause results without the delete.
For example, I couldn’t force it to get additional versions. I think my test might have misled.
I mentioned below the transcript that another window was running "del" commands. I should have shown that in-line.

Hello,
No, I’m not deleting the DB.

Today I tried Duplicati.CommandLine.RecoveryTool.exe.
I successfully decrypted the files, indexed them and started a restore from the latest version - no error messages so far. So the backup files are probably fine?

EDIT: the whole backup (all 87 GB) was successfully restored via the RecoveryTool - so the backup files really do seem fine.

So maybe the main problem is some bug in repair’s handling of "Unexpected update count: 0"?

Hi,

It may be a little late for you, and I’m not sure if my steps apply to your exact issue as well, but I did a bit of testing with corrupted block/index files on the backup destination. I came up with these steps to return the backup to a usable state and have Duplicati re-back up the data from the corrupted block (a command-line sketch of the main steps follows the list).

  1. sort the remote storage by date/time modified
  2. locate the affected block file (and, via its timestamp, the index file belonging to it) - getting rid of the index file that belongs to the removed block file is important to avoid errors on a future restore done without the database
  3. move both into another folder outside of the backup folder (it can be discarded after a successful procedure)
  4. run list-broken-files (make a note of the affected files, as those are lost and will be backed up again)
  5. run purge-broken-files
  6. run backup again (to re-back up the data that was included in the block file you removed)
  7. run restore on the affected files to check that they now restore properly
    (with --no-local-blocks=true to ensure the backup itself is used and no local files are)
  8. optionally run compact to remove other blocks that are now obsolete
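A minimal command-line sketch of steps 4-7, assuming a command line exported from the job (URL, dbpath, source, and file names are placeholders):

  Duplicati.CommandLine.exe list-broken-files "file://\\server\share\duplicati" --dbpath="C:\ProgramData\Duplicati\JOBDB.sqlite"
  Duplicati.CommandLine.exe purge-broken-files "file://\\server\share\duplicati" --dbpath="C:\ProgramData\Duplicati\JOBDB.sqlite"
  Duplicati.CommandLine.exe backup "file://\\server\share\duplicati" "D:\source" --dbpath="C:\ProgramData\Duplicati\JOBDB.sqlite"
  Duplicati.CommandLine.exe restore "file://\\server\share\duplicati" "D:\source\affected-file.ext" --no-local-blocks=true --restore-path="D:\restore-test" --dbpath="C:\ProgramData\Duplicati\JOBDB.sqlite"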

For corrupted index files, simply delete them and afterwards run a repair.

I hope these steps are helpful to you and others.

All the best,
Rain

Hello @rain, welcome to the forum - and thanks for sharing what worked for you!

I have a few comments about your suggestion, as it should work fine in some instances, but in others it might not be quite as straightforward.

For step 2, be careful using timestamps to find the associated dindex files, as they may not always match the dblock files. During the initial creation of a dblock it should be close, but as the files age it becomes more likely that they have been re-written as part of a compact process (retention policy) or another repair.

Step 6 implies this will re-back up the data that was in the broken dblock file, which is correct if the file parts stored in it haven’t changed since they were backed up. But again, as dblock files age, their contents are more and more likely to belong to older versions of the files.

So during your process you’ll be backing up the current versions of files, not older ones that may have been in the dblock. And of course files that have been deleted since the initial backup will not be restored.

Again, your process looks good and is a great way to get a broken backup working again - I just don’t want people to come at it with incorrect expectations.

@Pectojin, how hard do you think it would be to allow list-broken-files to check the live files and indicate which broken ones can be fully repaired, have lost versions, and no longer exist?

Thanks for the reply:

The only list of "affected" files I have is the one from verifying a specific version. But those files are not in the backup folder. In the past I tried creating them manually and running the commands again, but nothing helped.

Listing remote folder …
Missing file: duplicati-be9040f5d75f543769ddd42dde42544fd.dblock.zip.aes
Missing file: duplicati-i970bfdb4dcd740feae4fdda0b408ed1f.dindex.zip.aes
Missing file: duplicati-b164e1caa808f4fbf9998ac5cbe46aa86.dblock.zip.aes
Missing file: duplicati-ic73b7685ca6d40a5814340fb3b434c16.dindex.zip.aes
Missing file: duplicati-bff4a003f542946398496638b59908da3.dblock.zip.aes
Found 5 files that are missing from the remote storage, please run repair

The list of backup files looks like this: