Fatal error: Detected non-empty blocksets with no associated blocks

@ts678 Thanks for your efforts. Much appreciated!

You said in the other bug:

and above

So if I put those two together and elaborate the error message, I get:

“Found, in the DB table, blockset entries with non-zero length, meaning there should be blocks in the backup store for these, and they should be listed in the table of blocks (table of BlocksetEntry items?), but they’re not there.”

Did I do that right?

If so, then my guess was in the wrong direction. Going from “more files” to “fewer files” (by going from privileged to unprivileged) should have resulted in something more like actual block entries for which there was no summary blockset entry, not the other way around.

Also, understanding more, I don’t see how any change in the source files selected could result in this kind of database inconsistency, because aren’t we in a phase of database verification before Duplicati even scans the source set? Unless the selected source files somehow affect indexing into these tables? I doubt that.

Deleting versions is promising. But it’s not going to happen on its own, because this error comes early, before the actual backup begins. Till I fix this, no more backups (and thus no deletions) will happen. (Although I have it set to infinite versions anyway :wink:)

Here’s another theory: while running with privs, Duplicati wrote some internal files that got protected, and when subsequently running with no privs, it could not read those same files. Could that affect what BlocksetEntries are available? It sounds like it’s all in one .db file, so I doubt it. Can individual SQL entries be protected, needing privs to access?

Other notes. Yes my backup was big(ish): 20GB. And hundreds of (mostly useless temp) files change each backup (daily).

I think finding the filenames via the SQL query will be very helpful. I will try when I have access to that machine again.

By my reading, you nailed it.

I’m not that familiar with the code, but it looks like it starts the scanner first. Code for early and final verify.

Not as far as I know, and if total DB access is lost, there would definitely be some noise resulting from that.

@ts678 here are my results. And thanks again for the help!

First some cautions/helps:

First, when using SQLiteDatabaseBrowser you can freely modify the database. I strongly suggest that anyone trying to follow these steps make a copy of the sqlite file before “browsing” (I did so myself). (EDIT: Or, File -> Open Database Read Only)

Second, the file was located here for me:

C:\Users\user1\AppData\Local\Temp\887265818...3848570.sqlite

There are 3 .sqlite files there and that’s the one, not “backup 2019...052714.sqlite” nor “Duplicati-server.sqlite”.

Third, because of the way quoting works in SQLiteDatabaseBrowser, you cannot use the quote characters as pasted. I used (EDIT: or maybe it was just the particular variant of quote characters I got when I copy-pasted from this forum in my browser):

SELECT * FROM Blockset WHERE Length > 0 AND ID NOT IN (SELECT BlocksetId FROM BlocksetEntry)

The results were one hit:

ID: 65317
Length: 72184
Hash: ND1F/R+XlQ+HKvEZ65U5JJWY//A0s8VbR4RnNNwgcbs=

Unfortunately, when I looked up that ID as the BlocksetID in the File table, it was not found. This lookup was using the filter feature of SQLiteDatabaseBrowser at the top of the column while browsing data.

(EDIT) Or you could use this sql query from @ArdieX in this post:

SELECT * FROM File WHERE BlocksetID BETWEEN 211995 AND 212020

And change the numerical values to fit your situation.

Now that I re-read the SQL statement, not finding the ID in the File table kind of makes sense. The SQL is looking for BlocksetIDs missing from the BlocksetEntry table, and it has found one. The fact that that same BlocksetID is also not found in the File table is perhaps almost expected, now that I think about it. The error message says “with no associated blocks”, and now I see that means “there isn’t even an entry at all in the BlocksetEntry table”.
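If you’d rather script this check than click through the browser, here is a small Python sketch of the same query. The mock schema below is simplified for the demo (the real job database has more tables and columns, and these IDs are made up), but the query itself is the one from this thread:

```python
import sqlite3

def find_orphan_blocksets(conn):
    """Non-empty blocksets with no rows in BlocksetEntry (the check from this thread)."""
    return conn.execute(
        "SELECT ID, Length FROM Blockset "
        "WHERE Length > 0 AND ID NOT IN (SELECT BlocksetID FROM BlocksetEntry)"
    ).fetchall()

# Demo against a tiny in-memory mock (simplified schema, made-up IDs).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE Blockset (ID INTEGER PRIMARY KEY, Length INTEGER);
    CREATE TABLE BlocksetEntry (BlocksetID INTEGER, BlockID INTEGER);
    INSERT INTO Blockset VALUES (65316, 1024), (65317, 72184);
    INSERT INTO BlocksetEntry VALUES (65316, 1);  -- 65317 has no entries: inconsistent
""")
print(find_orphan_blocksets(conn))  # -> [(65317, 72184)]
```

Against a real (copied!) job database you would instead open it read-only, e.g. `sqlite3.connect("file:/path/to/job.sqlite?mode=ro", uri=True)`, which gives the same safety as the browser’s read-only mode.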

More information, though:

All the BlocksetIDs around this number were in the TEMP dir. And the IDs were not contiguous. Both observations may suggest that my ID was there at one point, was deleted on disk, and was deleted from both the File table and the BlocksetEntry table but not from the Blockset table.

And even more hints:

The immediately previous BlocksetID numerically to my hit was this file:

C:\Users\user1\AppData\Local\Temp\dup-3fa940f2-ba1f-4489-9ebb-74d11a070186

There are several dup-* files all created in the last few days in that Temp dir and I strongly suspect these are dup-licati files.

So from these hints, one can speculate a sequence of events that are involved in this bug:

  1. Duplicati deletes one of its own temp files
  2. This causes Duplicati to delete that file from the database File table and the BlocksetEntry table but not the Blockset table.

If we’re lucky, that will mean something to somebody :wink:

Also this exercise is making me think a database repair might help.

But now I feel like we might be on the trail to identifying a bug and I want to continue that process.


I just experienced the same issue on OSX 2.0.4.5_beta_2018-11-28… a simple database repair seems to have fixed it and I’m back up and running.

Same problem here. I’m using 2.0.4.5_beta_2018-11-28 on Windows 10 Pro 1809.
Repairing the database doesn’t help. One time, deleting and recreating the database was successful (until the error occurred again two days later); the other time I had to delete the backup and start again (30 hours of uploading…)

Now I have this error for the third time. If this can’t be solved, I’ll have to look for another backup software. But I’d really like to continue using duplicati, it is a great project!

I’m running duplicati as a service (Duplicati.WindowsService.exe install), so privileges should not be a problem.

Perhaps some of my advanced options are important?
--snapshot-policy = on
--auto-cleanup = true
--retention-policy = 7D:1D,1M:7D,1Y:1M,5Y:1Y
--retry-delay = 60s
--usn-policy = on

Backend is B2.

Keeping copies of the database as one goes through fix attempts (e.g. the repair you might run soon) is always good, but for SQLiteDatabaseBrowser use, I would hope that “Open Database Read Only” works.

Odd about the quotes. Mine accepts double quotes, but not two-in-a-row as one gets from source paste.

Duplicati puts many dup* files in tempdir, and their random names make it hard to tell what’s what without opening them (try adding suffixes such as .zip or .aes; sometimes it’s even an SQLite database). Possibly this is significant, as active Duplicati temp files would possibly be locked intermittently. Locking was suspected in Locked files seems to cause corrupt backup #3594, where I’m pointing back here, and encouraging help.

That’s an open invitation to anybody, especially those on this thread. Finding a simple way to reproduce it would be wonderful. Setting up higher levels of --log-file-log-level for --log-file may also help, if it’s random but likely. Logging can also help show whether a prior locking error seems to be related, and might also suggest a workaround, if excluding the areas giving issues is reasonable. Sometimes it’s not, but temporary files can likely be skipped. That’s coming at things the other way: one is how to make it happen, the other is how to avoid it. Both can help…

@thomasp Any chance you would have the time to follow the steps I followed, given to me by @ts678?

I.e. downloading DB Browser for SQLite (there’s even a portable version), opening up your Duplicati database (read-only), doing the SQL query, then looking in the File table?

I suspect it would be helpful to compare your results to mine.

If your BlocksetID is not found (or even if it is) you can leave off the last digit and see the (up to 10) files around it. Or leave off the last 2 digits and see the (up to) 100 files around it.
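The digit-dropping trick above is just an integer range scan. As a hedged sketch (minimal mock File table with made-up IDs and paths; the real table has more columns), it can be written as a BETWEEN query:

```python
import sqlite3

def files_near(conn, blockset_id, digits_dropped=1):
    """Files whose BlocksetID shares the leading digits of blockset_id.

    Dropping the last digit scans up to 10 IDs; two digits, up to 100."""
    span = 10 ** digits_dropped
    low = (blockset_id // span) * span
    return conn.execute(
        "SELECT BlocksetID, Path FROM File "
        "WHERE BlocksetID BETWEEN ? AND ? ORDER BY BlocksetID",
        (low, low + span - 1),
    ).fetchall()

# Demo with a minimal mock File table (made-up IDs and paths).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE File (BlocksetID INTEGER, Path TEXT)")
conn.executemany("INSERT INTO File VALUES (?, ?)", [
    (65311, r"C:\Users\u\AppData\Local\Temp\a.tmp"),
    (65315, r"C:\Users\u\AppData\Local\Temp\b.tmp"),
    (65399, r"C:\Users\u\Documents\c.txt"),
])
print(files_near(conn, 65317))                    # the (up to 10) IDs 65310-65319
print(files_near(conn, 65317, digits_dropped=2))  # the (up to 100) IDs 65300-65399
```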

@johnvk Ok, I’ll give it a try, though I’m not a database specialist.

My database is located in C:\WINDOWS\system32\config\systemprofile\AppData\Local\Duplicati because I’m running Duplicati as a service.

Your database query returns 10 hits with these IDs:

218415
218457
218469
218578
218703
218912
218918
219000
219096
219098

There are no corresponding files in the File table.
When I look at files with neighboring IDs, those IDs are non-contiguous.

The files are from these folders (and subfolders):

C:\Users\Thoma\AppData\Roaming\Thunderbird\Profiles
C:\Users\Thoma\AppData\Roaming\HearthstoneDeckTracker
C:\Users\Thoma\AppData\Local\Temp
C:\Users\Thoma\AppData\Local\Microsoft\Windows\Explorer\iconcache_16.db
C:\Users\Thoma\AppData\Local\Google\Chrome\User Data\

It looks like these are files that were open while the backup was running.

@ts678 I have set the logging options (level warning), but as long as the backup isn’t working, I fear this won’t help. Duplicati doesn’t even start the backup at the moment.
The logfile shows:

2019-01-12 12:14:32 +01 - [Warning-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess FileAccessError]: Error reported while accessing file: C:\Users\Thoma\OneDrive\System.IO.IOException: (19) Das Medium ist schreibgeschützt: [\\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy7\Users\Thoma\OneDrive\*]
bei Alphaleonis.Win32.NativeError.ThrowException(UInt32 errorCode, String readPath, String writePath)
bei Alphaleonis.Win32.Filesystem.FindFileSystemEntryInfo.FindFirstFile(String pathLp, WIN32_FIND_DATA& win32FindData)
bei Alphaleonis.Win32.Filesystem.FindFileSystemEntryInfo.<Enumerate>d__3`1.MoveNext()
bei System.Linq.Buffer`1..ctor(IEnumerable`1 source)
bei System.Linq.Enumerable.ToArray[TSource](IEnumerable`1 source)
bei Duplicati.Library.Snapshots.WindowsSnapshot.ListFolders(String localFolderPath)
bei Duplicati.Library.Utility.Utility.<EnumerateFileSystemEntries>d__23.MoveNext()
2019-01-12 12:14:32 +01 - [Warning-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess-FileAccessError]: Error reported while accessing file: C:\Users\Thoma\OneDrive\System.IO.IOException: (19) Das Medium ist schreibgeschützt: [\\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy7\Users\Thoma\OneDrive\*]
bei Alphaleonis.Win32.NativeError.ThrowException(UInt32 errorCode, String readPath, String writePath)
bei Alphaleonis.Win32.Filesystem.FindFileSystemEntryInfo.FindFirstFile(String pathLp, WIN32_FIND_DATA& win32FindData)
bei Alphaleonis.Win32.Filesystem.FindFileSystemEntryInfo.<Enumerate>d__3`1.MoveNext()
bei System.Linq.Buffer`1..ctor(IEnumerable`1 source)
bei System.Linq.Enumerable.ToArray[TSource](IEnumerable`1 source)
bei Duplicati.Library.Snapshots.WindowsSnapshot.ListFiles(String localFolderPath)
bei Duplicati.Library.Utility.Utility.<EnumerateFileSystemEntries>d__23.MoveNext()
2019-01-12 12:14:34 +01 - [Error-Duplicati.Library.Main.Operation.BackupHandler-FatalError]: Fatal error System.Exception: Detected non-empty blocksets with no associated blocks!
bei Duplicati.Library.Main.Database.LocalDatabase.VerifyConsistency(Int64 blocksize, Int64 hashsize, Boolean verifyfilelists, IDbTransaction transaction)
bei Duplicati.Library.Main.Operation.Backup.BackupDatabase.<>c__DisplayClass32_0.<VerifyConsistencyAsync>b__0()
bei Duplicati.Library.Main.Operation.Common.SingleRunner.<>c__DisplayClass3_0.<RunOnMain>b__0()
bei Duplicati.Library.Main.Operation.Common.SingleRunner.<DoRunOnMain>d__2`1.MoveNext()
--- Ende der Stapelüberwachung vom vorhergehenden Ort, an dem die Ausnahme ausgelöst wurde ---
bei System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
bei System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
bei Duplicati.Library.Main.Operation.BackupHandler.<RunAsync>d__19.MoveNext()

Thanks for your help!

Excellent. You and I had very similar results. Files in the temp/working dirs that could have been one of:

  1. file open at start of backup
  2. existing file newly opened by another program during backup
  3. existing file newly closed by another program during backup
  4. file created during backup
  5. file deleted during backup

Or some combination.

And that causes Duplicati to write inconsistent data to its database. I feel like this is getting closer to a description of the bug circumstances.

Since we’re both running --snapshot-policy=on the open files should not cause an issue, but who knows?

And I guess a log during the detection of the inconsistent database is not very helpful. Ideally, what we want is a log during the insertion of the inconsistent state. If we only knew when lightning was going to hit the clock tower…

I deleted and re-created my backup job. Files are uploading right now (might take a day or two). The logfile is set up - let’s see if the error occurs again.

Initial backup was successful.
The first incremental backup throws the following exception:

2019-01-16 13:35:34 +01 - [Warning-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess-FileAccessError]: Error reported while accessing file: C:\Users\Thoma\OneDrive\System.IO.IOException: (19) Das Medium ist schreibgeschützt: [\\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy10\Users\Thoma\OneDrive\*]
bei Alphaleonis.Win32.NativeError.ThrowException(UInt32 errorCode, String readPath, String writePath)
bei Alphaleonis.Win32.Filesystem.FindFileSystemEntryInfo.FindFirstFile(String pathLp, WIN32_FIND_DATA& win32FindData)
bei Alphaleonis.Win32.Filesystem.FindFileSystemEntryInfo.<Enumerate>d__3`1.MoveNext()
bei System.Linq.Buffer`1..ctor(IEnumerable`1 source)
bei System.Linq.Enumerable.ToArray[TSource](IEnumerable`1 source)
bei Duplicati.Library.Snapshots.WindowsSnapshot.ListFiles(String localFolderPath)
bei Duplicati.Library.Utility.Utility.<EnumerateFileSystemEntries>d__23.MoveNext()
2019-01-16 13:35:47 +01 - [Warning-Duplicati.Library.Main.Operation.Backup.UploadSyntheticFilelist-MissingTemporaryFilelist]: Expected there to be a temporary fileset for synthetic filelist (2, duplicati-be1a9aec8e5684efb86def41732d91480.dblock.zip.aes), but none was found?
2019-01-16 13:37:47 +01 - [Error-Duplicati.Library.Main.Operation.BackupHandler-FatalError]: Fatal error System.IO.PathTooLongException: Der angegebene Pfad und/oder Dateiname ist zu lang. Der vollständig qualifizierte Dateiname muss kürzer als 260 Zeichen und der Pfadname kürzer als 248 Zeichen sein.
bei System.IO.Path.LegacyNormalizePath(String path, Boolean fullCheck, Int32 maxPathLength, Boolean expandShortPaths)
bei System.IO.Path.GetPathRoot(String path)
bei Duplicati.Library.Snapshots.UsnJournalService.IsPathEnumerated(String path)
bei Duplicati.Library.Main.Operation.BackupHandler.<>c__DisplayClass12_0.<RunMainOperation>b__2(String path, Int64 fileSize)
...

The Duplicati GUI tells me that there never was a successful backup.
When I click the restore option, it shows me the full backup and the first incremental backup as possible restore points.

This is the same behaviour as before (including errors). It will work for a few days and then complain about the non-empty blocksets again.

I continue to have this same error with two backup destinations. Deleting the database allows a backup to occur, but then it gets back to this state. The problem is not with privileges for me – Duplicati has only been run as one user. If I delete the database, can I go back to 2.0.3.3, as that version worked fine for me on a different computer? I’d like to confirm whether the bug was introduced somewhere between that version and 2.0.4.5.


@thomasp Did you have your logfile on during the 2nd, incremental that errored? Anything in there useful?

That OneDrive folder it complains about is hard to explain. You are still running with --snapshot-policy=on? Still running as a service? On both the initial and subsequent incremental runs? Then access to OneDrive should work. It looks like it works the first time but not subsequently.

Notice the full path implies VSS is on (has VolumeShadowCopy in its name):

[\\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy10\Users\Thoma\OneDrive\*]

Why can’t it access this file the 2nd time?

I also note that when you ran the database query on your previous db you did not get files in OneDrive but rather temp files. Also confusing.

Also I noted this in your log: System.IO.PathTooLongException. I don’t see a hint as to which path was too long. Is there anything in the logfile near this?

EDIT: this appears to be a known bug with --usn-policy. See this post

@kaltiz privs can still be a backdoor issue if the files you’re backing up are open and/or changing during the backup. Duplicati provides advanced option --snapshot-policy=on to access open files and freeze them, but you need privs on windows and linux to use it.

Both @kaltiz and @thomasp(again), also, can you perform the steps listed in reply 3 and elaborated on in reply 6 and post your results?

Today I fixed, yet again, a backup with terabytes of data that had this error, using repair. Yet the repair is as uninformative as it can be. It says “listing remote folder” and that’s all. No indication whatsoever whether something was fixed or not, and so on. But after this, backup works again. I think this is the fourth time this same set has failed (on local disks) for the same reason. Systemic failure. (2.0.4.5 beta 2018-11-28)

I’m beginning to lose track.
I didn’t have time to watch Duplicati very closely for the last few days. It complained quite often about the PathTooLongException but didn’t tell me which path was too long. The Restore GUI option always offered to restore the initial backup and the last one, but none in between. The backup job itself said there had never been a successful backup.

Today I (not really knowing what I was doing) clicked on “create error report” (or whatever it is labeled in English…) and Duplicati worked for about half an hour. Afterwards, I tried another incremental backup and - it worked! And it says it was successful…

Let’s see how long this is going to work.

I have the log-to-file-option enabled, but the logfile is growing and growing. Unfortunately, my computer is a notebook and I have it connected to different WLANs all day, so there are quite a lot of warning messages that the backblaze cloud (my backup target) is unreachable.
If you’re willing to analyze the logfile, I could send it to you (I don’t think it contains private data). I don’t see any new relevant information in it.

Ran into the same issue.

  • Added a new local path to an existing backup through the browser UI.
  • Started backup from the command line.
  • Upload got stuck with no network activity (a cellular connection), so I Ctrl-C’ed it.
  • A subsequent command line backup reported “remote files that are not recorded in local storage”.
  • Ran a database repair through the browser UI.
  • A subsequent command line backup reported “Abort due to constraint violation” and “non-empty blocksets”.
  • Another database repair through the browser UI was successful, but did not fix the problem.
  • A database rebuild through the browser UI was successful.
  • A subsequent command line backup was successful.

Duplicati 2.0.4.5_beta_2018-11-28
Ubuntu 18.10 (cosmic)
4.18.0-13-generic x86_64
Backblaze B2

A log below.


Backup started at 1/22/2019 11:44:52 PM
Checking remote backup …
Listing remote folder …
Scanning local files …
2406 files need to be examined (255.30 MB) (still counting)

Uploading file (49.93 MB) …
6279 files need to be examined (82.16 MB) (still counting)

37346 files need to be examined (1,007.73 MB) (still counting)
39677 files need to be examined (1.04 GB)
Uploading file (52.31 KB) …
^C


Backup started at 1/23/2019 12:16:36 AM
Checking remote backup …
Listing remote folder …
Extra unknown file: duplicati-i3387dfa62f9d47718c172f1a18f70e32.dindex.zip.aes
Found 1 remote files that are not recorded in local storage, please run repair
Fatal error => Found 1 remote files that are not recorded in local storage, please run repair

ErrorID: ExtraRemoteFiles
Found 1 remote files that are not recorded in local storage, please run repair


Backup started at 1/23/2019 12:17:20 AM
Checking remote backup …
Listing remote folder …
Scanning local files …
3085 files need to be examined (257.25 MB) (still counting)
Uploading file (17.98 KB) …
Failed to process path: /mnt/path/file.7z => Abort due to constraint violation
UNIQUE constraint failed: BlocklistHash.BlocksetID, BlocklistHash.Index
5569 files need to be examined (66.55 MB) (still counting)

38604 files need to be examined (1,007.89 MB) (still counting)
39677 files need to be examined (1.04 GB)
Uploading file (49.96 MB) …

Uploading file (49.92 MB) …
Fatal error => Detected non-empty blocksets with no associated blocks!
0 files need to be examined (0 bytes)

System.Exception: Detected non-empty blocksets with no associated blocks!
at Duplicati.Library.Main.Database.LocalDatabase.VerifyConsistency (System.Int64 blocksize, System.Int64 hashsize, System.Boolean verifyfilelists, System.Data.IDbTransaction transaction) [0x0017e] in :0
at Duplicati.Library.Main.Operation.Backup.BackupDatabase+<>c__DisplayClass32_0.b__0 () [0x00000] in :0
at Duplicati.Library.Main.Operation.Common.SingleRunner+<>c__DisplayClass3_0.b__0 () [0x00000] in :0
at Duplicati.Library.Main.Operation.Common.SingleRunner+d__21[T].MoveNext () [0x000b0] in <c6c6871f516b48f59d88f9d731c3ea4d>:0
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw () [0x0000c] in <8f2c484307284b51944a1a13a14c0266>:0
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Threading.Tasks.Task task) [0x0004e] in <8f2c484307284b51944a1a13a14c0266>:0
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Threading.Tasks.Task task) [0x0002e] in <8f2c484307284b51944a1a13a14c0266>:0
at System.Runtime.CompilerServices.TaskAwaiter.ValidateEnd (System.Threading.Tasks.Task task) [0x0000b] in <8f2c484307284b51944a1a13a14c0266>:0
at System.Runtime.CompilerServices.TaskAwaiter.GetResult () [0x00000] in <8f2c484307284b51944a1a13a14c0266>:0
at Duplicati.Library.Main.Operation.BackupHandler+<RunAsync>d__19.MoveNext () [0x00b1f] in <c6c6871f516b48f59d88f9d731c3ea4d>:0
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw () [0x0000c] in <8f2c484307284b51944a1a13a14c0266>:0
at CoCoL.ChannelExtensions.WaitForTaskOrThrow (System.Threading.Tasks.Task task) [0x00050] in <6973ce2780de4b28aaa2c5ffc59993b1>:0
at Duplicati.Library.Main.Operation.BackupHandler.Run (System.String[] sources, Duplicati.Library.Utility.IFilter filter) [0x00008] in <c6c6871f516b48f59d88f9d731c3ea4d>:0
at Duplicati.Library.Main.Controller+<>c__DisplayClass13_0.<Backup>b__0 (Duplicati.Library.Main.BackupResults result) [0x00035] in <c6c6871f516b48f59d88f9d731c3ea4d>:0
at Duplicati.Library.Main.Controller.RunAction[T] (T result, System.String[]& paths, Duplicati.Library.Utility.IFilter& filter, System.Action1[T] method) [0x0011d] in :0


I just looked at my backup database which was having this issue with sqlitebrowser. This issue appears to be the same as others reporting – there were 2 entries in Blockset which were not in the File table. One of the nearby entries was for the Temp directory.

I will note that on my previous computer, which didn’t have this issue, I was running a version of Duplicati from earlier in 2018 which had a default “Windows” exclude option, which was checked. When I installed the latest version of Duplicati on this computer which is giving me issues, there was no such option for a default “Windows” exclude option. I’m wondering if the default exclude list in the older version of Duplicati was excluding the Temp directory? Why was that option removed from Duplicati?

I created an exclude filter to exclude the Temp directory and some other paths. I’m having Duplicati recreate the database right now and I’ll see if this fixes the problem or if it comes back.

What --log-file-log-level did you use? I suggest verbose.

If verbose is too much, try Information

What would be helpful (but it’s a lot of steps) is

  1. Starting with a working backup job
  2. Turn on --log-file-log-level (as you have)
  3. experience non-empty blocksets error (well, I hope you don’t, I hope no one does, but if you do)
  4. perform sql query in database browser
  5. Find 5 of 10 files “around” where the failing file was
  6. go back to logfile and search for those files. This is likely from previous runs, not the current that received the error.
  7. hopefully something in the logfile will be revealing
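For step 6, a quick script beats eyeballing a large logfile. Here is a sketch, assuming a plain-text log where each message mentions the full path (the example path in the comment is hypothetical):

```python
def grep_log(log_path, suspect_paths):
    """Return (line_number, line) pairs from the logfile that mention any suspect path."""
    needles = [p.lower() for p in suspect_paths]
    hits = []
    with open(log_path, encoding="utf-8", errors="replace") as f:
        for lineno, line in enumerate(f, 1):
            if any(n in line.lower() for n in needles):
                hits.append((lineno, line.rstrip("\n")))
    return hits

# Usage (hypothetical log and path):
# grep_log("duplicati.log", [r"C:\Users\u\AppData\Local\Temp\dup-3fa9"])
```

The match is case-insensitive because Windows paths are; feed it the handful of paths you found near the failing BlocksetID in step 5.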

@thomasp your PathTooLongException appears to be a known bug with --usn-policy. See this post.

I updated my reply #16 with an edit to this effect.

As a followup, I ended up adding filters to prevent Duplicati from trying to backup both the temp directory as well as the directory where Duplicati’s database was being kept, along with a few other directories containing some Windows files that were constantly changing and didn’t need to be backed up. Then I recreated the database. It’s been running for somewhere around a week and a half without a problem. Seeing as how this error previously kept being generated within a day or two of recreating the database, and that happened several times, I feel pretty confident that these exclude filters are what is fixing it for me and should point the developers toward fixing it in the code.