Fatal error: Detected non-empty blocksets with no associated blocks

This is a serious problem. Here’s a list of all the recent reports (8 including mine):

And it's serious because it is non-recoverable. One person was able to fix it by restoring the Duplicati database, which he had backed up separately. Most people have to delete the backup and start again (which can take days).

I may have a clue as to what causes this: it may have to do with running alternately with and without privs, which changes which files are visible to Duplicati.

I was running version duplicati-2.0.4.5_beta_2018-11-28-x64 on Windows 8.

My sequence of runs was:

  1. Run as user1 with no privs, and attempt to back up user1 and user2's files.
    • did not back up any of user2's files - gave warnings.
    • did not see user1's priv'ed files at all, e.g. C:\Users\USER1\NTUSER.DAT
  2. Run as user1 with privs (Run as administrator), but with --snapshot-policy=off
    • did back up user2's files.
    • could see user1's priv'ed files, e.g. C:\Users\USER1\NTUSER.DAT
    • but threw Failed to process path warnings on them.
  3. Run as user1 with privs (Run as administrator), with --snapshot-policy=on
    • did back up user2's files.
    • could see user1's priv'ed files, e.g. C:\Users\USER1\NTUSER.DAT
    • and no warnings on them. I assume it snapshotted them (VSS) and backed them up.
  4. PC rebooted.
  5. Duplicati restarted via c:\ProgramData\Microsoft\Windows\Start Menu\Programs\Startup, which starts it UN-priv'ed.
  6. So I ran again as user1 with no privs, and got this error: System.Exception: Detected non-empty blocksets with no associated blocks!

So here's my wild guess: that final run that errored was with NO privs, after running WITH privs. So the final, erroring run would see fewer files than the previous one, and somehow that's where the blocksets and blocks get out of sync. Just a guess, but this user suspected something similar.

If I’m wrong, hopefully the sequence might be helpful to someone.

Here’s my stack trace (I manually added newlines before each “at”)

Jan 7, 2019 9:22 PM: Failed while executing "Backup" with id: 1
System.Exception: Detected non-empty blocksets with no associated blocks!
at Duplicati.Library.Main.Database.LocalDatabase.VerifyConsistency(Int64 blocksize, Int64 hashsize, Boolean verifyfilelists, IDbTransaction transaction)
at Duplicati.Library.Main.Operation.Backup.BackupDatabase.<>c__DisplayClass32_0.<VerifyConsistencyAsync>b__0()
at Duplicati.Library.Main.Operation.Common.SingleRunner.<>c__DisplayClass3_0.<RunOnMain>b__0()
at Duplicati.Library.Main.Operation.Common.SingleRunner.<DoRunOnMain>d__2`1.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at Duplicati.Library.Main.Operation.BackupHandler.<RunAsync>d__19.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at CoCoL.ChannelExtensions.WaitForTaskOrThrow(Task task)
at Duplicati.Library.Main.Controller.<>c__DisplayClass13_0.<Backup>b__0(BackupResults result)
at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)
at Duplicati.Library.Main.Controller.Backup(String[] inputsources, IFilter filter)
at Duplicati.Server.Runner.Run(IRunnerData data, Boolean fromQueue)

I missed this post, which says a repair now fixes this problem. I did not try a repair.

I don't understand the internals enough. There are at least three entities: database, blocksets, and blocks. And dlists? And indexes?

I’m not sure what this error is indicating is broken, and what the various repairs do. Is there one I should try? I’ll try to read up on this when I have time.

Duplicati User's Manual in the Articles section says how Backup and Restore work. Unfortunately, there's no equivalent article that talks about Repair (possibly because its behavior varies depending on the need). The REPAIR command doesn't say much, and "Tries" is the key word: sometimes it can repair, sometimes not. Currently a good way to forecast whether it will work is to read other reports (nice list, BTW) and their repair efforts.

My take on the complaint is that this is a Duplicati check on itself, finding an inconsistency between a table that can quickly give a file's length and a different table used to reassemble the file from blocks, BUT with no file names noted. Although I can't say the cause is the same (the cause is unknown, at least to me), this reminds me of a similar error where there ARE blocks noted as belonging to the file, but their lengths don't add up to the file's recorded length.

Here is the code and the very simple SQL behind your error. Not far above it is the code for the similar issue, which produces "Found inconsistency in the following files while validating database" using a little more SQL.
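
To sketch that "lengths don't add up" variant in plain SQL (just the idea, not Duplicati's actual check; I'm assuming a Block table with a Size column and a BlocksetEntry.BlockID pointing at it, which is how I read the schema):

-- blocksets whose recorded Length disagrees with the sum of their block sizes
SELECT B."ID", B."Length", SUM(BL."Size") AS "SumOfBlockSizes"
FROM "Blockset" B
JOIN "BlocksetEntry" BE ON BE."BlocksetID" = B."ID"
JOIN "Block" BL ON BL."ID" = BE."BlockID"
GROUP BY B."ID", B."Length"
HAVING B."Length" <> SUM(BL."Size");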

I was hoping your steps would reproduce the error, but when I tried them they never managed to recreate the database issue. My attempts used two Command Prompt windows of an Administrator user, with just one using Run as administrator, and Duplicati.GUI.TrayIcon.exe manually launched from one or the other. I didn't reboot, but I did restart Duplicati. The backed-up areas also weren't big, and changes were rare. Are you willing to test again, maybe with something simpler?

Maybe a more direct path (if you still have a failing database) is to open it in DB Browser for SQLite and run:

SELECT * FROM "Blockset" WHERE "Length" > 0 AND "ID" NOT IN (SELECT "BlocksetId" FROM "BlocksetEntry");

to see which files it's complaining about. While I know there are privacy issues, sometimes names are needed - not so much by others, but by you, to see where the problem files came from. Having done that, you could get a sanitized-for-names-and-other-things database bug report, post it somewhere, and give a link in the forum. Creating the bug report also reverses the weak encryption put on the database on Windows. If you want a direct view yourself, you'd start the Server or TrayIcon with --unencrypted-database before opening the DB in your favorite SQLite browser.

To decode the SQL some: a Blockset table row corresponds to a file; it has the file's Length and an ID, and that ID also locates the file's blocks in the BlocksetEntry table. Big files have more blocks, but a file with a length and no blocks is the error. How this happens is the question. A simple test case to reproduce issues like this would be very nice to have.
EDIT: To find the actual path from that Blockset table row, you would look its ID up as the BlocksetID in the File table.
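
Putting those two steps together, something like this should also show the path (just a sketch - I'm assuming the File table exposes Path and BlocksetID as described, and Path will come back NULL if the File row is missing too):

-- same orphan check as above, joined out to the File table for the path, if any
SELECT B."ID", B."Length", F."Path"
FROM "Blockset" B
LEFT JOIN "File" F ON F."BlocksetID" = B."ID"
WHERE B."Length" > 0
AND B."ID" NOT IN (SELECT "BlocksetID" FROM "BlocksetEntry");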


@ts678 Thanks for your efforts. Much appreciated!

You said in the other bug:

and above

So if I put those two together and elaborate the error message, I get:

"Found in the DB table blockset entries with non-zero length, meaning there should be blocks in the backup store for these, and they should be listed in the table of blocks (the table of BlocksetEntry items?), but they're not there."

Did I do that right?

If so, then my guess was in the wrong direction. Going from "more files" to "fewer files" (by going from priv'ed to unpriv'ed) should have resulted in something more like actual block entries for which there was no summary blockset entry, not the other way around.
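
For concreteness, I think that reversed situation would show up as orphaned rows on the block side - something like this rough sketch (same tables as above):

-- block entries whose summary Blockset row is missing
SELECT * FROM BlocksetEntry WHERE BlocksetID NOT IN (SELECT ID FROM Blockset)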

Also, understanding more, I don't see how any change in the selected source files could result in this kind of database inconsistency, because aren't we in a database-verification phase before Duplicati even scans the source set? Unless the selected source files somehow affect indexing into these tables? I doubt that.

Deleting versions is promising, but it's not going to happen on its own, because this error comes early, before the actual backup begins. Until I fix this, no more backups (and thus no deletions) will happen. (Although I have it set to infinite versions anyway :wink:)

Here's another theory: while running with privs, Duplicati wrote some internal files that got protected, and when subsequently running with no privs, it could not read those same files. Could that affect what BlocksetEntries are available? It sounds like it's all in one .db file, so I doubt it. Can individual SQL entries be protected, needing privs to access?

Other notes: yes, my backup was big(ish), 20 GB. And hundreds of (mostly useless temp) files change each backup (daily).

I think finding the filenames via the SQL query will be very helpful. I will try when I have access to that machine again.

By my reading, you nailed it.

I’m not that familiar with the code, but it looks like it starts the scanner first. Code for early and final verify.

Not as far as I know, and if total DB access is lost, there would definitely be some noise resulting from that.

@ts678 here are my results. And thanks again for the help!

First some cautions/helps:

First, when using SQLiteDatabaseBrowser you can freely modify the database. I strongly suggest that anyone trying to follow these steps make a copy of the sqlite file before "browsing" (I did so myself). (EDIT: Or use File -> Open database read-only.)

Second, the file was located here for me:

C:\Users\user1\AppData\Local\Temp\887265818...3848570.sqlite

There are 3 .sqlite files there and that's the one - not "backup 2019...052714.sqlite" nor "Duplicati-server.sqlite".

Third, the way the syntax works in SQLiteDatabaseBrowser, you cannot use the quotes. I used the following (EDIT: or maybe it was just the particular variant of quote characters I got when I copy-pasted from this forum in my browser):

SELECT * FROM Blockset WHERE Length > 0 AND ID NOT IN (SELECT BlocksetId FROM BlocksetEntry)

The results were one hit:

ID: 65317
Length: 72184
Hash: ND1F/R+XlQ+HKvEZ65U5JJWY//A0s8VbR4RnNNwgcbs=

Unfortunately, when I looked up that ID as the BlocksetID in the File table, it was not found. This lookup used the filter feature of SQLiteDatabaseBrowser at the top of the column while browsing the data.
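
(For anyone doing the same lookup with SQL instead of the filter box, the equivalent query should be something like the one below; in my case it returns no rows.)

SELECT * FROM File WHERE BlocksetID = 65317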

(EDIT) Or you could use this SQL query from @ArdieX in this post:

SELECT * FROM File WHERE BlocksetID BETWEEN 211995 AND 212020

And change the numerical values to fit your situation.

Now that I re-read the SQL statement, not finding the ID in the File table kind of makes sense. The SQL is looking for BlocksetIDs missing from the BlocksetEntry table, and it has found one. The fact that the same BlocksetID is also not found in the File table is perhaps almost expected, now that I think about it. The error message says "with no associated blocks", and now I see that means "there isn't even an entry at all in the block table".

More information, though:

All the files with BlocksetIDs around this number were in the TEMP dir, and the IDs were not contiguous. Both of these observations may suggest that my ID was there at one point, that its file was deleted on disk, and that it was then deleted from both the File table and the BlocksetEntry table but not from the Blockset table.

And even more hints:

The BlocksetID numerically just before my hit belonged to this file:

C:\Users\user1\AppData\Local\Temp\dup-3fa940f2-ba1f-4489-9ebb-74d11a070186

There are several dup-* files all created in the last few days in that Temp dir and I strongly suspect these are dup-licati files.

So from these hints, one can speculate a sequence of events that are involved in this bug:

  1. Duplicati deletes one of its own temp files
  2. This causes Duplicati to delete that file from the database File table and the BlocksetEntry table but not the Blockset table.

If we’re lucky, that will mean something to somebody :wink:
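
A rough way to check that speculation against the database (same tables as before) is to ask for Blockset rows that neither of the other two tables references; for me that should return just the one row (ID 65317) found above:

-- blocksets referenced by neither BlocksetEntry nor File
SELECT ID, Length FROM Blockset
WHERE ID NOT IN (SELECT BlocksetID FROM BlocksetEntry)
AND ID NOT IN (SELECT BlocksetID FROM File)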

Also this exercise is making me think a database repair might help.

But now I feel like we might be on the trail to identifying a bug and I want to continue that process.


I just experienced the same issue on OSX 2.0.4.5_beta_2018-11-28… a simple database repair seems to have fixed it and I’m back up and running.

Same problem here. I’m using 2.0.4.5_beta_2018-11-28 on Windows 10 Pro 1809.
Repairing the database doesn't help. One time, deleting and recreating the database was successful (until the error occurred again two days later); the other time I had to delete the backup and start again (30 hours of uploading…)

Now I have this error for the third time. If this can’t be solved, I’ll have to look for another backup software. But I’d really like to continue using duplicati, it is a great project!

I’m running duplicati as a service (Duplicati.WindowsService.exe install), so privileges should not be a problem.

Perhaps some of my advanced options are important?
--snapshot-policy = on
--auto-cleanup = true
--retention-policy = 7D:1D,1M:7D,1Y:1M,5Y:1Y
--retry-delay = 60s
--usn-policy = on

Backend is B2.

Keeping copies of the database as one goes through fix attempts (e.g. the repair you might run soon) is always good, but for SQLiteDatabaseBrowser use, I would hope that “Open Database Read Only” works.

Odd about the quotes. Mine accepts double quotes, but not two-in-a-row as one gets from source paste.

Duplicati puts many dup* files in tempdir, and their random names make it hard to tell what's what without opening them (try adding suffixes such as .zip or .aes; sometimes it's even an SQLite database). Possibly this is significant, as active Duplicati temp files would possibly be locked intermittently. Locking was suspected in Locked files seems to cause corrupt backup #3594, where I'm pointing back here and encouraging help.

That's an open invitation to anybody, especially those on this thread: finding a simple way to reproduce this would be wonderful. Setting a higher --log-file-log-level for --log-file may also help, if the problem is random but likely to recur. Logging can also help show whether a prior locking error seems related. Excluding the areas giving issues might also be a workaround, if that's reasonable; sometimes it's not, but temporary files can likely be skipped. That's coming at things from the other direction: one question is how to make the problem happen, the other is how to avoid it. Both can help…

@thomasp Any chance you would have the time to follow the steps I followed, given to me by @ts678?

I.e., downloading DB Browser for SQLite (there's even a portable version), opening up your Duplicati database (read-only), running the SQL query, then looking in the File table?

I suspect it would be helpful to compare your results to mine.

If your BlocksetID is not found (or even if it is), you can leave off the last digit and see the (up to 10) files around it, or leave off the last 2 digits and see the (up to) 100 files around it.
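
(In SQL terms, that filter-box trick is roughly a prefix match on the ID. SQLite will compare the integer as text here, so something like the query below lists the neighborhood of my 65317 hit - assuming the path column in your File table is named Path.)

SELECT BlocksetID, Path FROM File WHERE BlocksetID LIKE '6531%' ORDER BY BlocksetID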

@johnvk Ok, I’ll give it a try, though I’m not a database specialist.

My database is located in C:\WINDOWS\system32\config\systemprofile\AppData\Local\Duplicati because I’m running Duplicati as a service.

Your database query returns 10 hits with these IDs:

218415
218457
218469
218578
218703
218912
218918
219000
219096
219098

There are no corresponding files in the File table.
When I look at files with neighboring IDs, those IDs are non-contiguous.

The files are from these folders (and subfolders):

C:\Users\Thoma\AppData\Roaming\Thunderbird\Profiles
C:\Users\Thoma\AppData\Roaming\HearthstoneDeckTracker
C:\Users\Thoma\AppData\Local\Temp
C:\Users\Thoma\AppData\Local\Microsoft\Windows\Explorer\iconcache_16.db
C:\Users\Thoma\AppData\Local\Google\Chrome\User Data\

This looks like these are files that were open while the backup was running.

@ts678 I have set the logging options (level warning), but as long as the backup isn’t working, I fear this won’t help. Duplicati doesn’t even start the backup at the moment.
The logfile shows (the frames are from a German system; "bei" means "at" and "Das Medium ist schreibgeschützt" means "the medium is write-protected"):

2019-01-12 12:14:32 +01 - [Warning-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess-FileAccessError]: Error reported while accessing file: C:\Users\Thoma\OneDrive\System.IO.IOException: (19) Das Medium ist schreibgeschützt: [\\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy7\Users\Thoma\OneDrive\*]
bei Alphaleonis.Win32.NativeError.ThrowException(UInt32 errorCode, String readPath, String writePath)
bei Alphaleonis.Win32.Filesystem.FindFileSystemEntryInfo.FindFirstFile(String pathLp, WIN32_FIND_DATA& win32FindData)
bei Alphaleonis.Win32.Filesystem.FindFileSystemEntryInfo.<Enumerate>d__3`1.MoveNext()
bei System.Linq.Buffer`1..ctor(IEnumerable`1 source)
bei System.Linq.Enumerable.ToArray[TSource](IEnumerable`1 source)
bei Duplicati.Library.Snapshots.WindowsSnapshot.ListFolders(String localFolderPath)
bei Duplicati.Library.Utility.Utility.<EnumerateFileSystemEntries>d__23.MoveNext()
2019-01-12 12:14:32 +01 - [Warning-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess-FileAccessError]: Error reported while accessing file: C:\Users\Thoma\OneDrive\System.IO.IOException: (19) Das Medium ist schreibgeschützt: [\\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy7\Users\Thoma\OneDrive\*]
bei Alphaleonis.Win32.NativeError.ThrowException(UInt32 errorCode, String readPath, String writePath)
bei Alphaleonis.Win32.Filesystem.FindFileSystemEntryInfo.FindFirstFile(String pathLp, WIN32_FIND_DATA& win32FindData)
bei Alphaleonis.Win32.Filesystem.FindFileSystemEntryInfo.<Enumerate>d__3`1.MoveNext()
bei System.Linq.Buffer`1..ctor(IEnumerable`1 source)
bei System.Linq.Enumerable.ToArray[TSource](IEnumerable`1 source)
bei Duplicati.Library.Snapshots.WindowsSnapshot.ListFiles(String localFolderPath)
bei Duplicati.Library.Utility.Utility.<EnumerateFileSystemEntries>d__23.MoveNext()
2019-01-12 12:14:34 +01 - [Error-Duplicati.Library.Main.Operation.BackupHandler-FatalError]: Fatal error System.Exception: Detected non-empty blocksets with no associated blocks!
bei Duplicati.Library.Main.Database.LocalDatabase.VerifyConsistency(Int64 blocksize, Int64 hashsize, Boolean verifyfilelists, IDbTransaction transaction)
bei Duplicati.Library.Main.Operation.Backup.BackupDatabase.<>c__DisplayClass32_0.<VerifyConsistencyAsync>b__0()
bei Duplicati.Library.Main.Operation.Common.SingleRunner.<>c__DisplayClass3_0.<RunOnMain>b__0()
bei Duplicati.Library.Main.Operation.Common.SingleRunner.<DoRunOnMain>d__2`1.MoveNext()
--- Ende der Stapelüberwachung vom vorhergehenden Ort, an dem die Ausnahme ausgelöst wurde ---
bei System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
bei System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
bei Duplicati.Library.Main.Operation.BackupHandler.<RunAsync>d__19.MoveNext()

Thanks for your help!

Excellent. You and I had very similar results: files in the temp/working dirs that could have been any of:

  1. file open at start of backup
  2. existing file newly opened by another program during backup
  3. existing file newly closed by another program during backup
  4. file created during backup
  5. file deleted during backup

Or some combination of these.

And that causes Duplicati to write inconsistent data to its database. I feel like this is getting closer to a description of the bug circumstances.

Since we’re both running --snapshot-policy=on the open files should not cause an issue, but who knows?

And I guess logging during the detection of the inconsistent database is not very helpful. Ideally, what we want is logging during the insertion of the inconsistent state. If we only knew when lightning was going to hit the clock tower…

I deleted and re-created my backup job. Files are uploading right now (might take a day or two). The logfile is set up - let’s see if the error occurs again.

Initial backup was successful.
The first incremental backup throws the following exception:

2019-01-16 13:35:34 +01 - [Warning-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess-FileAccessError]: Error reported while accessing file: C:\Users\Thoma\OneDrive\System.IO.IOException: (19) Das Medium ist schreibgeschützt: [\\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy10\Users\Thoma\OneDrive\*]
bei Alphaleonis.Win32.NativeError.ThrowException(UInt32 errorCode, String readPath, String writePath)
bei Alphaleonis.Win32.Filesystem.FindFileSystemEntryInfo.FindFirstFile(String pathLp, WIN32_FIND_DATA& win32FindData)
bei Alphaleonis.Win32.Filesystem.FindFileSystemEntryInfo.<Enumerate>d__3`1.MoveNext()
bei System.Linq.Buffer`1..ctor(IEnumerable`1 source)
bei System.Linq.Enumerable.ToArray[TSource](IEnumerable`1 source)
bei Duplicati.Library.Snapshots.WindowsSnapshot.ListFiles(String localFolderPath)
bei Duplicati.Library.Utility.Utility.<EnumerateFileSystemEntries>d__23.MoveNext()
2019-01-16 13:35:47 +01 - [Warning-Duplicati.Library.Main.Operation.Backup.UploadSyntheticFilelist-MissingTemporaryFilelist]: Expected there to be a temporary fileset for synthetic filelist (2, duplicati-be1a9aec8e5684efb86def41732d91480.dblock.zip.aes), but none was found?
2019-01-16 13:37:47 +01 - [Error-Duplicati.Library.Main.Operation.BackupHandler-FatalError]: Fatal error System.IO.PathTooLongException: Der angegebene Pfad und/oder Dateiname ist zu lang. Der vollständig qualifizierte Dateiname muss kürzer als 260 Zeichen und der Pfadname kürzer als 248 Zeichen sein.
bei System.IO.Path.LegacyNormalizePath(String path, Boolean fullCheck, Int32 maxPathLength, Boolean expandShortPaths)
bei System.IO.Path.GetPathRoot(String path)
bei Duplicati.Library.Snapshots.UsnJournalService.IsPathEnumerated(String path)
bei Duplicati.Library.Main.Operation.BackupHandler.<>c__DisplayClass12_0.<RunMainOperation>b__2(String path, Int64 fileSize)
...

The Duplicati GUI tells me that there never was a successful backup.
When I click the restore option, it shows me the full backup and the first incremental backup as possible restore points.

This is the same behaviour as before (including errors). It will work for a few days and then complain about the non-empty blocksets again.

I continue to have this same error with two backup destinations. Deleting the database allows a backup to occur, but then it gets back into this state. The problem is not with privileges for me - Duplicati has only been run as one user. If I delete the database, can I go back to 2.0.3.3, since that version worked fine for me on a different computer? I'd like to confirm whether the bug was introduced somewhere between that version and 2.0.4.5.


@thomasp Did you have your logfile on during the 2nd, incremental backup that errored? Anything useful in there?

That OneDrive folder it complains about is hard to explain. You are still running with --snapshot-policy=on? Still running as a service? On both the initial and the subsequent incremental runs? Then access to OneDrive should work. It looks like it worked the first time and not subsequently.

Notice the full path implies VSS is on (has VolumeShadowCopy in its name):

[\\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy10\Users\Thoma\OneDrive\*]

Why can’t it access this file the 2nd time?

I also note that when you ran the database query on your previous db you did not get files in OneDrive but rather temp files. Also confusing.

Also I noted this in your log: System.IO.PathTooLongException. I don’t see a hint as to what that path was that was too long. Is there anything in the logfile near this?

EDIT: this appears to be a known bug with --usn-policy. See this post.

@kaltiz privs can still be a backdoor issue if the files you're backing up are open and/or changing during the backup. Duplicati provides the advanced option --snapshot-policy=on to access open files and freeze them, but you need privs on Windows and Linux to use it.

Both @kaltiz and @thomasp (again): can you also perform the steps listed in reply 3 and elaborated on in reply 6, and post your results?

Today I yet again fixed one backup with terabytes of data which had this error, using repair. Yet the repair is as uninformative as it can be: it says "listing remote folder" and that's all, with no indication whatsoever of whether something was found and fixed, and so on. But after this, backup works again. I think this is the fourth time this same set has failed (on local disks) for the same reason. Systemic failure. (2.0.4.5 beta 2018-11-28)

I'm beginning to lose track of things.
I didn't have the time to watch Duplicati very closely over the last few days. It complained quite often about the PathTooLongException but didn't tell me which path was too long. The Restore GUI option always offered to restore the initial backup and the last one, but none in between. The backup job itself said there had never been a successful backup.

Today I (not really knowing what I was doing) clicked on "create error report" (or whatever it is labeled in English…) and Duplicati worked for about half an hour. Afterwards, I tried another incremental backup and - it worked! And it says it was successful…

Let’s see how long this is going to work.

I have the log-to-file option enabled, but the logfile is growing and growing. Unfortunately, my computer is a notebook and I connect it to different WLANs all day, so there are quite a lot of warning messages that the Backblaze cloud (my backup target) is unreachable.
If you’re willing to analyze the logfile, I could send it to you (I don’t think it contains private data). I don’t see any new relevant information in it.

Ran into the same issue.

  • Added a new local path to an existing backup through the browser UI.
  • Started backup from the command line.
  • Upload got stuck with no network activity (a cellular connection), so I Ctrl-C’ed it.
  • A subsequent command line backup reported “remote files that are not recorded in local storage”.
  • Ran a database repair through the browser UI.
  • A subsequent command line backup reported “Abort due to constraint violation” and “non-empty blocksets”.
  • Another database repair through the browser UI was successful, but did not fix the problem.
  • A database rebuild through the browser UI was successful.
  • A subsequent command line backup was successful.

Duplicati 2.0.4.5_beta_2018-11-28
Ubuntu 18.10 (cosmic)
4.18.0-13-generic x86_64
Backblaze B2

A log below.


Backup started at 1/22/2019 11:44:52 PM
Checking remote backup …
Listing remote folder …
Scanning local files …
2406 files need to be examined (255.30 MB) (still counting)

Uploading file (49.93 MB) …
6279 files need to be examined (82.16 MB) (still counting)

37346 files need to be examined (1,007.73 MB) (still counting)
39677 files need to be examined (1.04 GB)
Uploading file (52.31 KB) …
^C


Backup started at 1/23/2019 12:16:36 AM
Checking remote backup …
Listing remote folder …
Extra unknown file: duplicati-i3387dfa62f9d47718c172f1a18f70e32.dindex.zip.aes
Found 1 remote files that are not recorded in local storage, please run repair
Fatal error => Found 1 remote files that are not recorded in local storage, please run repair

ErrorID: ExtraRemoteFiles
Found 1 remote files that are not recorded in local storage, please run repair


Backup started at 1/23/2019 12:17:20 AM
Checking remote backup …
Listing remote folder …
Scanning local files …
3085 files need to be examined (257.25 MB) (still counting)
Uploading file (17.98 KB) …
Failed to process path: /mnt/path/file.7z => Abort due to constraint violation
UNIQUE constraint failed: BlocklistHash.BlocksetID, BlocklistHash.Index
5569 files need to be examined (66.55 MB) (still counting)

38604 files need to be examined (1,007.89 MB) (still counting)
39677 files need to be examined (1.04 GB)
Uploading file (49.96 MB) …

Uploading file (49.92 MB) …
Fatal error => Detected non-empty blocksets with no associated blocks!
0 files need to be examined (0 bytes)

System.Exception: Detected non-empty blocksets with no associated blocks!
at Duplicati.Library.Main.Database.LocalDatabase.VerifyConsistency (System.Int64 blocksize, System.Int64 hashsize, System.Boolean verifyfilelists, System.Data.IDbTransaction transaction) [0x0017e] in :0
at Duplicati.Library.Main.Operation.Backup.BackupDatabase+<>c__DisplayClass32_0.<VerifyConsistencyAsync>b__0 () [0x00000] in :0
at Duplicati.Library.Main.Operation.Common.SingleRunner+<>c__DisplayClass3_0.<RunOnMain>b__0 () [0x00000] in :0
at Duplicati.Library.Main.Operation.Common.SingleRunner+<DoRunOnMain>d__2`1[T].MoveNext () [0x000b0] in <c6c6871f516b48f59d88f9d731c3ea4d>:0
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw () [0x0000c] in <8f2c484307284b51944a1a13a14c0266>:0
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Threading.Tasks.Task task) [0x0004e] in <8f2c484307284b51944a1a13a14c0266>:0
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Threading.Tasks.Task task) [0x0002e] in <8f2c484307284b51944a1a13a14c0266>:0
at System.Runtime.CompilerServices.TaskAwaiter.ValidateEnd (System.Threading.Tasks.Task task) [0x0000b] in <8f2c484307284b51944a1a13a14c0266>:0
at System.Runtime.CompilerServices.TaskAwaiter.GetResult () [0x00000] in <8f2c484307284b51944a1a13a14c0266>:0
at Duplicati.Library.Main.Operation.BackupHandler+<RunAsync>d__19.MoveNext () [0x00b1f] in <c6c6871f516b48f59d88f9d731c3ea4d>:0
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw () [0x0000c] in <8f2c484307284b51944a1a13a14c0266>:0
at CoCoL.ChannelExtensions.WaitForTaskOrThrow (System.Threading.Tasks.Task task) [0x00050] in <6973ce2780de4b28aaa2c5ffc59993b1>:0
at Duplicati.Library.Main.Operation.BackupHandler.Run (System.String[] sources, Duplicati.Library.Utility.IFilter filter) [0x00008] in <c6c6871f516b48f59d88f9d731c3ea4d>:0
at Duplicati.Library.Main.Controller+<>c__DisplayClass13_0.<Backup>b__0 (Duplicati.Library.Main.BackupResults result) [0x00035] in <c6c6871f516b48f59d88f9d731c3ea4d>:0
at Duplicati.Library.Main.Controller.RunAction[T] (T result, System.String[]& paths, Duplicati.Library.Utility.IFilter& filter, System.Action`1[T] method) [0x0011d] in :0


I just looked, with sqlitebrowser, at my backup database which was having this issue. The issue appears to be the same as what others are reporting: there were 2 entries in Blockset which were not in the File table. One of the nearby entries was for the Temp directory.

I will note that on my previous computer, which didn't have this issue, I was running a version of Duplicati from earlier in 2018 which had a default "Windows" exclude option, and it was checked. When I installed the latest version of Duplicati on this computer, which is giving me issues, there was no such default "Windows" exclude option. I'm wondering whether the default exclude list in the older version of Duplicati was excluding the Temp directory. Why was that option removed from Duplicati?

I created an exclude filter to exclude the Temp directory and some other paths. I’m having Duplicati recreate the database right now and I’ll see if this fixes the problem or if it comes back.