Unexpected difference in fileset version 4: ...found 6180 entries, but expected 6181

I have been getting this fatal error for weeks. It started on one of my backup sets, and I’ve not changed anything in that configuration AFAIK.

2.0.4.5 (2.0.4.5_beta_2018-11-28)

There are 16 versions of this backup set.
It runs daily.
The most recent successful run was on March 2nd.
The fileset from version 4 (2/24) has the unexpected difference in the number of entries.

Given that this is a moderately large fileset (57GB), I’d love to know what you’d suggest doing to repair it.

 Failed: Unexpected difference in fileset version 4: 2/24/2019 4:03:11 AM (database id: 100), found 6180 entries, but expected 6181
    Details: Duplicati.Library.Interface.UserInformationException: Unexpected difference in fileset version 4: 2/24/2019 4:03:11 AM (database id: 100), found 6180 entries, but expected 6181
      at Duplicati.Library.Main.Database.LocalDatabase.VerifyConsistency (System.Int64 blocksize, System.Int64 hashsize, System.Boolean verifyfilelists, System.Data.IDbTransaction transaction) [0x00370] in <c6c6871f516b48f59d88f9d731c3ea4d>:0

I’m now getting this error on 3 jobs. One of them has logging enabled and I will look over the logs today to see if anything jumps out. It’s a backup of large (multi-GB) tarred gzip files. First it was failing to Dropbox, and now it’s failing to an attached USB drive (same source).

ETA: so far, I have been able to delete the offending backups, but the error has always returned on subsequent job runs eventually.

Same here… there is one backup job on one machine that has this issue every week or so. It’s always the same machine and same backup job. Not sure why. I’m running the latest “beta” version on probably 10 machines.

Almost the same experience here. I have 26 jobs running on one machine. One job started with this failure (I’m pretty sure after I upgraded to 2.0.4.5, but I don’t have details on that), so I copied the job. The copy ran successfully for a couple of months before it failed, and this morning the same job with a USB drive as the destination failed for the first time.

I will admit this issue has really shaken my confidence in Duplicati, because so far the only way to truly recover from it (even temporarily) has been to essentially delete all the backups and start over.

I scanned the information-level logs of the two failing jobs and nothing unusual appears on the days that failed vs any other days.

I went back to the “Restore” option on one job for the failed day in question. This job backs up 6 files every day. On the failed day, one of the files is missing from the “restore” file list (I expand the source folder and the file isn’t there).

So I have two issues:

  1. why is the file missing (maybe an issue with the source, etc. - I can live with this since I run backups every day and I retain a handful)

  2. why doesn’t Duplicati warn of the failure and then keep backing up? I’d rather have a backup set with one missing file that eventually gets deleted based on my retention settings than the situation I have now, which is that a backup hasn’t been run for more than 2 weeks because some random failure happened on February 26.

The only frequently-successful repair that I know of is to delete the fileset version, which is probably what @haus was suggesting (unless somehow it now requires deleting all the versions). What’s your history with the error? That fix might hold you for a while if the error is rare, but it has been too frequent for the two people who have just posted.

Is there any way to fix an “unexpected difference in fileset” error other than deleting? is @drwtsn32’s post; the original poster there was also trying to avoid deleting, but troubleshooting didn’t get very far.

The easiest way to delete a fileset is probably using the job’s GUI Commandline and adjusting that screen.

The BACKUP command syntax gets turned into the DELETE command by removing the source files from the Commandline arguments box and adding --version=4. Change Command at the top to delete, then Run at the bottom.
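
For reference, a minimal sketch of how the adjusted screen might end up for the version 4 case above (the Target URL and the job’s existing advanced options stay as the screen pre-fills them):

    Command:               delete
    Commandline arguments: --version=4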

Getting the error to stay away has been a problem for some people, but I hope we can use their results to try to figure out why it only seems to affect certain backups. There has definitely been a network-problem cause, identified by me here and reproduced by @warwickmm here, leading to possible code fixes below:

Ensure that failed put operations throw exception #3697

The problem is that this is not the only cause. People have seen it on backups to local disk. What’s yours?

Manual compact ran, now unexpected difference in fileset is @haus’s finding that compact without enough temporary file space can cause this. There have also been theories involving things like retention deletions.

I’m open to all kinds of theories, but I rely on people who can see the problem to try to guess what encourages it, and to consider adding logging, posting logs, and even posting database bug reports, which are sanitized versions of the database (e.g. with source pathnames and other data removed to protect privacy). Posting big files here is not possible, so you’d post a link to a cloud storage share, Firefox Send, or similar.

So deleting versions stopped helping even temporarily? I believe deletion can also cause auto-compact, so using --no-auto-compact on the delete command might show whether the delete itself moves the problem into a different fileset, or whether it’s one of the follow-on operations (such as a possible automatic compact) that did that. Ultimately this is the kind of thing that database dumps and the right expert (maybe not me?) can help with.

The “unusual” is sometimes the run before; at least it is in the case where network errors on a put upload cause Duplicati to bail. It does this same sanity test twice, once before a backup and again after (but if it gets an exception of some sort, that might prevent the post-backup sanity test, so it hits the failure on the next run).

This is actually a nice simple backup in a way. Did it say it found 5 entries, but expected 6? If so, then the display might be going by the left number, which I think is more realistic in terms of what can be restored.

We’re not likely to hear from the original designer, however this is one of the self-checks Duplicati does to make sure its various database records are in order. Programs are typically written to rely on things being the way they should be. If things look wrong, sometimes it’s better to stop than to spread the mess further.

Technical details of this internal check are found here. The counts from those SQL queries should be the same. With such a small list, you could even manually compare the file lists with an SQLite browser such as this one.
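
If you want to eyeball the counts yourself, here is a minimal SQL sketch, assuming only the Fileset and FilesetEntry table names that show up in the profiling logs later in this thread. The real “expected” side of the check lives in LocalDatabase.VerifyConsistency and joins more tables, so treat this as a quick way to see per-version entry counts, not as the authoritative query:

    -- One row per backup version: database id, timestamp, and how many
    -- FilesetEntry rows it has. Compare against the "found ... entries"
    -- number in the error message for the failing version.
    SELECT "ID",
           datetime("Timestamp", 'unixepoch') AS "BackupTime",
           (SELECT COUNT(*) FROM "FilesetEntry"
             WHERE "FilesetEntry"."FilesetID" = "Fileset"."ID") AS "EntryCount"
    FROM "Fileset"
    ORDER BY "Timestamp" DESC;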

For those of you who are seeing the error too often, I’d suggest looking in the logs for compact operations just before it. Here is an example of how it looks in job logs. You can also see whether using --no-auto-compact helps, change Backup retention to “Keep all backups”, and (for a network backup) set --number-of-retries higher.
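
As a rough sketch, the first and last of those could be added as advanced options on the job (the retry count here is only an illustration, not a recommendation):

    --no-auto-compact=true
    --number-of-retries=8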

Fileset 0 is the newest, so damage to an older fileset such as 4 (with 16 versions total) seems like it might be caused by some sort of non-backup maintenance work, such as a compact coming through and making wider changes.

Though there are reports of this happening on systems that are always on, it’d still be a good idea to check whether things like sleep or restart tend to encourage this problem to show up.


Thank you for that info. Another version of this same backup job (this time to another local USB drive) failed this morning at ~4am. The error is:

Failed: Unexpected difference in fileset version 6: 3/18/2019 3:43:02 AM (database id: 53), found 11 entries, but expected 12

Details: Duplicati.Library.Interface.UserInformationException: Unexpected difference in fileset version 6: 3/18/2019 3:43:02 AM (database id: 53), found 11 entries, but expected 12

When I look in the logs for this job, it runs a compact operation every single night except 3/17, the night before the failure, which seemed like it might be significant. But then I went back to another version of this job, which backs up to Dropbox and failed on March 7, and it had run a compact on March 6. So maybe nothing useful there.

So now 4 separate versions of this backup job have all failed in the last few weeks. Two backing up to dropbox, and two backing up to local USB drives. If I start logging these jobs to a file, which log level do you suggest I use to get the most info? I tried “verbose” but it doesn’t seem to add much. Is “profiling” a better choice?

I don’t recall all the details of deleting versions (I will see if I posted it anywhere here), but I think I remember one time deleting the offending fileset only to find out that the next one had failed as well, and so on until I ended up deleting all of them. My solution to this unexpected fileset difference issue right now is to delete the database for the job, then delete all files on the remote side, and start the backup as if it was a new job.

If you want a huge amount of output, use --log-file-log-level=profiling and --profile-all-database-queries, which isn’t yet in the manual but was added in Release: 2.0.3.10 (canary) 2018-08-30 as a log option:

Removed some logging details in core performance areas (can be re-enabled with --profile-all-database-queries )

So “better” depends on your goal, but the above is as much logging as I think there is. Rotating or deleting logs would probably be good, and maybe install a large-log-file viewer like glogg, if your editor isn’t enough.
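
For a concrete (hypothetical) example, the job’s advanced options for maximum logging could look something like this; the log path is just a placeholder, so point it at a drive with plenty of free space:

    --log-file=C:\DuplicatiLogs\backup-job.log
    --log-file-log-level=profiling
    --profile-all-database-queries=true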

As a side note, I cranked up my scheduled backup frequency here, and also threw in some other suspect things such as smart retention. I managed to get this problem in the 6:45 run, detected in the 6:55 run. The difference was in fileset 0, so it may or may not be like the non-0 cases. Nevertheless I’m trying to look at it…

Exactly what inspired it isn’t clear, because there are many things going on, including compact and deletion, which are at least expected (whether they’re perfect isn’t known), and also some other oddball errors, shown below:



Failed: Unexpected difference in fileset version 0: 3/25/2019 6:45:00 PM (database id: 99), found 432 entries, but expected 433
Details: Duplicati.Library.Interface.UserInformationException: Unexpected difference in fileset version 0: 3/25/2019 6:45:00 PM (database id: 99), found 432 entries, but expected 433
   at Duplicati.Library.Main.Database.LocalDatabase.VerifyConsistency(Int64 blocksize, Int64 hashsize, Boolean verifyfilelists, IDbTransaction transaction)
   at Duplicati.Library.Main.Operation.Backup.BackupDatabase.<>c__DisplayClass32_0.<VerifyConsistencyAsync>b__0()
   at Duplicati.Library.Main.Operation.Common.SingleRunner.<>c__DisplayClass3_0.<RunOnMain>b__0()
   at Duplicati.Library.Main.Operation.Common.SingleRunner.<DoRunOnMain>d__2`1.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.GetResult()
   at Duplicati.Library.Main.Operation.BackupHandler.<RunAsync>d__19.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at CoCoL.ChannelExtensions.WaitForTaskOrThrow(Task task)
   at Duplicati.Library.Main.Operation.BackupHandler.Run(String[] sources, IFilter filter)
   at Duplicati.Library.Main.Controller.<>c__DisplayClass13_0.<Backup>b__0(BackupResults result)
   at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)

Log data:
2019-03-25 18:55:50 -04 - [Warning-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess-FileAccessError]: Error reported while accessing file: C:\PortableApps\Notepad++Portable\App\Notepad++\Notepad++Portable\
System.UnauthorizedAccessException: (5) Access is denied: [\\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy68\PortableApps\Notepad++Portable\App\Notepad++\Notepad++Portable\]
   at Alphaleonis.Win32.NativeError.ThrowException(UInt32 errorCode, String readPath, String writePath)
   at Alphaleonis.Win32.Filesystem.File.GetAttributesExCore[T](KernelTransaction transaction, String path, PathFormat pathFormat, Boolean returnErrorOnNotFound)
   at Duplicati.Library.Snapshots.SystemIOWindows.GetFileAttributes(String path)
   at Duplicati.Library.Utility.Utility.<EnumerateFileSystemEntries>d__23.MoveNext()
2019-03-25 18:55:53 -04 - [Warning-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess-FileAccessError]: Error reported while accessing file: C:\PortableApps\Notepad++Portable\App\Notepad++\plugins\
Alphaleonis.Win32.Filesystem.DeviceNotReadyException: (21) The device is not ready: [\\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy68\PortableApps\Notepad++Portable\App\Notepad++\plugins\]
   at Alphaleonis.Win32.NativeError.ThrowException(UInt32 errorCode, String readPath, String writePath)
   at Alphaleonis.Win32.Filesystem.File.GetAttributesExCore[T](KernelTransaction transaction, String path, PathFormat pathFormat, Boolean returnErrorOnNotFound)
   at Duplicati.Library.Snapshots.SystemIOWindows.GetFileAttributes(String path)
   at Duplicati.Library.Utility.Utility.<EnumerateFileSystemEntries>d__23.MoveNext()
2019-03-25 18:55:53 -04 - [Warning-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess-FileAccessError]: Error reported while accessing file: C:\PortableApps\Notepad++Portable\App\Notepad++\themes\
Alphaleonis.Win32.Filesystem.DeviceNotReadyException: (21) The device is not ready: [\\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy68\PortableApps\Notepad++Portable\App\Notepad++\themes\]
   at Alphaleonis.Win32.NativeError.ThrowException(UInt32 errorCode, String readPath, String writePath)
   at Alphaleonis.Win32.Filesystem.File.GetAttributesExCore[T](KernelTransaction transaction, String path, PathFormat pathFormat, Boolean returnErrorOnNotFound)
   at Duplicati.Library.Snapshots.SystemIOWindows.GetFileAttributes(String path)
   at Duplicati.Library.Utility.Utility.<EnumerateFileSystemEntries>d__23.MoveNext()
2019-03-25 18:55:53 -04 - [Warning-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess-FileAccessError]: Error reported while accessing file: C:\PortableApps\Notepad++Portable\App\Notepad++\updater\
Alphaleonis.Win32.Filesystem.DeviceNotReadyException: (21) The device is not ready: [\\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy68\PortableApps\Notepad++Portable\App\Notepad++\updater\]
   at Alphaleonis.Win32.NativeError.ThrowException(UInt32 errorCode, String readPath, String writePath)
   at Alphaleonis.Win32.Filesystem.File.GetAttributesExCore[T](KernelTransaction transaction, String path, PathFormat pathFormat, Boolean returnErrorOnNotFound)
   at Duplicati.Library.Snapshots.SystemIOWindows.GetFileAttributes(String path)
   at Duplicati.Library.Utility.Utility.<EnumerateFileSystemEntries>d__23.MoveNext()
2019-03-25 18:55:53 -04 - [Warning-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess-FileAccessError]: Error reported while accessing file: C:\PortableApps\Notepad++Portable\App\Notepad++\
Alphaleonis.Win32.Filesystem.DeviceNotReadyException: (21) The device is not ready: [\\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy68\PortableApps\Notepad++Portable\App\Notepad++\*]
   at Alphaleonis.Win32.NativeError.ThrowException(UInt32 errorCode, String readPath, String writePath)
   at Alphaleonis.Win32.Filesystem.FindFileSystemEntryInfo.FindFirstFile(String pathLp, WIN32_FIND_DATA& win32FindData)
   at Alphaleonis.Win32.Filesystem.FindFileSystemEntryInfo.<Enumerate>d__3`1.MoveNext()
   at System.Linq.Buffer`1..ctor(IEnumerable`1 source)
   at System.Linq.Enumerable.ToArray[TSource](IEnumerable`1 source)
   at Duplicati.Library.Snapshots.WindowsSnapshot.ListFiles(String localFolderPath)
   at Duplicati.Library.Utility.Utility.<EnumerateFileSystemEntries>d__23.MoveNext()
2019-03-25 18:55:54 -04 - [Error-Duplicati.Library.Main.Operation.BackupHandler-FatalError]: Fatal error
Duplicati.Library.Interface.UserInformationException: Unexpected difference in fileset version 0: 3/25/2019 6:45:00 PM (database id: 99), found 432 entries, but expected 433
   at Duplicati.Library.Main.Database.LocalDatabase.VerifyConsistency(Int64 blocksize, Int64 hashsize, Boolean verifyfilelists, IDbTransaction transaction)
   at Duplicati.Library.Main.Operation.Backup.BackupDatabase.<>c__DisplayClass32_0.<VerifyConsistencyAsync>b__0()
   at Duplicati.Library.Main.Operation.Common.SingleRunner.<>c__DisplayClass3_0.<RunOnMain>b__0()
   at Duplicati.Library.Main.Operation.Common.SingleRunner.<DoRunOnMain>d__2`1.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.GetResult()
   at Duplicati.Library.Main.Operation.BackupHandler.<RunAsync>d__19.MoveNext()

Above is the content of the email reporting the error. It’s picked out some things from the 6:55 log, but to see 6:45 I have to use the log files that I set up myself (mainly so analysis of problems that spring up is possible).

@ts678 - thank you so much for the detailed response. It’s good to know that folks are tracking the issue.

In my case, this backup is run by an always-on machine targeting a remote destination (Backblaze B2).

I just ran the delete operation on the offending version which has let the backup start running again normally.

For now, I’m not going to make any of the suggested changes because this is the first time I’ve hit this issue. I’ll keep things running as-is and report back if I hit the issue again.

I also got the same error today, backing up on macOS from a MacBook and an external USB drive to another external USB drive (NTFS).

{"ClassName":"Duplicati.Library.Interface.UserInformationException","Message":"Unexpected difference in fileset version 2: 30.06.2020 10:30:59 (database id: 20), found 1369434 entries, but expected 1369436","Data":null,"InnerException":null,"HelpURL":null,"StackTraceString":" at Duplicati.Library.Main.Database.LocalDatabase.VerifyConsistency (System.Int64 blocksize, System.Int64 hashsize, System.Boolean verifyfilelists, System.Data.IDbTransaction transaction) [0x0036e] in <a5461ff033f34a1a9500e6168b51a3be>:0 \n at Duplicati.Library.Main.Operation.Backup.BackupDatabase+<>c__DisplayClass34_0.<VerifyConsistencyAsync>b__0 () [0x00000] in <a5461ff033f34a1a9500e6168b51a3be>:0 \n at Duplicati.Library.Main.Operation.Common.SingleRunner+<>c__DisplayClass3_0.<RunOnMain>b__0 () [0x00000] in <a5461ff033f34a1a9500e6168b51a3be>:0 \n at Duplicati.Library.Main.Operation.Common.SingleRunner+<DoRunOnMain>d__21[T].MoveNext () [0x000b0] in :0 \n— End of stack trace from previous location where exception was thrown —\n at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw () [0x0000c] in :0 \n at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Threading.Tasks.Task task) [0x0004e] in :0 \n at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Threading.Tasks.Task task) [0x0002e] in :0 \n at System.Runtime.CompilerServices.TaskAwaiter.ValidateEnd (System.Threading.Tasks.Task task) [0x0000b] in :0 \n at System.Runtime.CompilerServices.TaskAwaiter.GetResult () [0x00000] in :0 \n at Duplicati.Library.Main.Operation.BackupHandler+d__20.MoveNext () [0x003cd] in :0 \n— End of stack trace from previous location where exception was thrown —\n at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw () [0x0000c] in :0 \n at CoCoL.ChannelExtensions.WaitForTaskOrThrow (System.Threading.Tasks.Task task) [0x00050] in <9a758ff4db6c48d6b3d4d0e5c2adf6d1>:0 \n at Duplicati.Library.Main.Operation.BackupHandler.Run (System.String sources, Duplicati.Library.Utility.IFilter filter, System.Threading.CancellationToken token) [0x00009] in :0 \n at Duplicati.Library.Main.Controller+<>c__DisplayClass14_0.b__0 (Duplicati.Library.Main.BackupResults result) [0x0004b] in :0 \n at Duplicati.Library.Main.Controller.RunAction[T] (T result, System.String& paths, Duplicati.Library.Utility.IFilter& filter, System.Action1[T] method) [0x0011c] in <a5461ff033f34a1a9500e6168b51a3be>:0 ","RemoteStackTraceString":null,"RemoteStackIndex":0,"ExceptionMethod":null,"HResult":-2146233088,"Source":"mscorlib"}

Could this be caused by some hard disk failure on the target USB drive?

Not likely. It’s a database inconsistency between different ways of figuring out how many files there are.

Could you look at the job log from before the failure (the failure itself might not have made one) for Compact phase info?
That was where a problem like this (fixed, or at least much improved, in 2.0.5.1) was causing this issue.

What Duplicati version? Is this a GUI job or CLI?

It’s the latest canary version 2.0.5.108_canary_2020-07-09 and it’s a GUI job.

I’m not seeing any reference to the Compact phase. The live log (Log Level Verbose) shows the following:

Is there a normal logfile I could have a look at?

  • Jul 28, 2020 8:09 AM: The operation Backup has failed with error: Unexpected difference in fileset version 2: 30.06.2020 10:30:59 (database id: 20), found 1369434 entries, but expected 1369436
  • Jul 28, 2020 8:09 AM: Fatal error
  • Jul 28, 2020 8:09 AM: Including path as no filters matched: /Applications/Utilities/System Information.app/Contents/Resources/ms.lproj/SPInfo.strings
  • Jul 28, 2020 8:09 AM: Including path as no filters matched: /Applications/Utilities/System Information.app/Contents/Resources/ms.lproj/SPApp.strings
  • Jul 28, 2020 8:09 AM: Including path as no filters matched: /Applications/Utilities/System Information.app/Contents/Resources/ms.lproj/Localizable.strings

Afterwards it continues with a lot of other “Including path as no filters matched” messages.

If your backup is still running, the live log is all there is unless you set up your own (not sure what you mean by “normal”). log-file=<path> and log-file-log-level=verbose (or whatever you like) is actually a nicer log, but takes more setup than the live log. If the backup has finished, Viewing the log files of a backup job ordinarily has logs with backup result, but early failure means no result. Viewing the Duplicati Server Logs might help.

Usually the spot that catches this error is early in the backup, as a sanity check before it actually backs up. Does the timing (which I can’t see in the short snippet) seem like it gave the error early on? Live log shows latest activity at the top, so be careful in screen reading. If by “Afterwards”, you mean below, that’s earlier…

I’d have thought it would stop pretty fast after failing the backup, but there’s a similar check after a backup. Compact was the cause of a previous problem like this. Because it’s part of deletion/cleanup, it’s after the backup itself, so after the post-backup check passed, compact broke things, then the next backup noticed during its check before starting. Because it stopped then, inconsistency would “stick” for any further starts.

I’d like to at least have a better idea of whether this was the pre-backup or the post-backup check going off.
The newest backup is 0 (Restore dropdown should also show the numbers), so 2 is recent, but not latest.

Generally the workaround for this has been to delete the affected version, but sometimes the problem moves to a different version. All versions are tested, but the check stops at the first inconsistency, so one doesn’t know how many versions are “broken”. I suppose reporting all of them would be a possible enhancement to aid this.

Deleting versions is pretty easy using Commandline to run the delete with a particular --version. The usual Commandline arguments can be cleared, and you can type --version=2, or set it in Add advanced option.

I guess it’s up to you how to proceed. Ask if needed. Some people have a lot of versions, and losing some wouldn’t matter much. Other people have very few versions, or have other preferences about what to do…

Sorry that it took me so long to answer. First of all thank you very much for your response!

By a normal logfile I meant a file in the filesystem with information like timestamp, severity, thread, message, exceptions, etc.

The log of the backup job does not show anything; the last log that is shown is my last successful backup from July.

The server log gives me the information I’ve provided in my earlier posts.

The live log (set to Profiling) gives me the information indicated in my previous post. By “Afterwards” I meant earlier. What I’m saying is that all I can see in the live log (with the Profiling level selected) is

* Jul 28, 2020 8:09 AM: The operation Backup has failed with error: Unexpected difference in fileset version 2: 30.06.2020 10:30:59 (database id: 20), found 1369434 entries, but expected 1369436
* Jul 28, 2020 8:09 AM: Fatal error
* Jul 28, 2020 8:09 AM: Including path as no filters matched: /Applications/Utilities/System Information.app/Contents/Resources/ms.lproj/SPInfo.strings
* Jul 28, 2020 8:09 AM: Including path as no filters matched: /Applications/Utilities/System Information.app/Contents/Resources/ms.lproj/SPApp.strings
* Jul 28, 2020 8:09 AM: Including path as no filters matched: /Applications/Utilities/System Information.app/Contents/Resources/ms.lproj/Localizable.strings

[… about 100 other rows with the “Including path as no filters matched” message …]

I’ve just retried it with “ExplicitOnly” (is there an overview of what the different levels mean and why there are so many?) and the output is

  • Aug 13, 2020 12:11 PM: The operation Backup has failed with error: Unexpected difference in fileset version 2: 30.06.2020 10:30:59 (database id: 20), found 1369434 entries, but expected 1369436

  • Aug 13, 2020 12:11 PM: Running Backup took 0:00:08:47.591

  • Aug 13, 2020 12:11 PM: ExecuteNonQuery: PRAGMA optimize took 0:00:00:00.000

  • Aug 13, 2020 12:11 PM: Starting - ExecuteNonQuery: PRAGMA optimize

  • Aug 13, 2020 12:11 PM: Fatal error

  • Aug 13, 2020 12:11 PM: Including path as no filters matched: /Applications/OpenRA - Dune 2000.app/Contents/Resources/mods/d2k/weapons/debris.yaml

  • Aug 13, 2020 12:11 PM: ExecuteReader: SELECT “ID”, “Timestamp” FROM “Fileset” ORDER BY “Timestamp” DESC took 0:00:00:00.000

  • Aug 13, 2020 12:11 PM: Starting - ExecuteReader: SELECT “ID”, “Timestamp” FROM “Fileset” ORDER BY “Timestamp” DESC

  • Aug 13, 2020 12:11 PM: ExecuteReader: SELECT “ID”, “Timestamp” FROM “Fileset” ORDER BY “Timestamp” DESC took 0:00:00:00.000

  • Aug 13, 2020 12:11 PM: Starting - ExecuteReader: SELECT “ID”, “Timestamp” FROM “Fileset” ORDER BY “Timestamp” DESC

  • Aug 13, 2020 12:11 PM: ExecuteScalarInt64: SELECT COUNT(*) FROM “FilesetEntry” WHERE “FilesetEntry”.“FilesetID” = 20 took 0:00:00:00.538

  • Aug 13, 2020 12:11 PM: Including path as no filters matched: /Applications/OpenRA - Dune 2000.app/Contents/Resources/mods/d2k/mod.yaml

  • Aug 13, 2020 12:11 PM: Including path as no filters matched: /Applications/OpenRA - Dune 2000.app/Contents/Resources/mods/d2k/missions.yaml

  • Aug 13, 2020 12:11 PM: Including path as no filters matched: /Applications/OpenRA - Dune 2000.app/Contents/Resources/mods/d2k/metrics.yaml

  • Aug 13, 2020 12:11 PM: Including path as no filters matched: /Applications/OpenRA - Dune 2000.app/Contents/Resources/mods/d2k/icon.png
    [… about 100 other rows with the “Including path as no filters matched” message …]

As you can see by the “Running Backup took” message, this happens rather early in the backup process. But at some point the live log just stops and does not show older information. I guess this is because the live log does not show all log entries, just the last x entries. Because of that, I was asking whether there is a real logfile in the filesystem that I could look at or attach here.

I’ve now added the log-file and log-file-log-level options to the backup job and will retry it. By the way, what’s the difference between “log-file-log-level” and “log-level”?

This is the end of my > 300 MB log file. Does this help to get more information about the cause? Is there anything else I can do to track down this issue before deleting all the backup information and starting from scratch (which most likely takes less time than deleting single specific backup versions)?

[...]
2020-08-13 12:26:06 +02 - [Verbose-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess-IncludingPath]: Including path as no filters matched: /Applications/Sequel Pro.app/Contents/Resources/pt.lproj/SaveSPFAccessory.nib
2020-08-13 12:26:06 +02 - [Verbose-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess-IncludingPath]: Including path as no filters matched: /Applications/Sequel Pro.app/Contents/Resources/pt.lproj/UserManagerView.nib
2020-08-13 12:26:06 +02 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteScalarInt64]: ExecuteScalarInt64: SELECT COUNT(*) FROM "FilesetEntry" WHERE "FilesetEntry"."FilesetID" = 20 took 0:00:00:00.665
2020-08-13 12:26:06 +02 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: Starting - ExecuteReader: SELECT "ID", "Timestamp" FROM "Fileset" ORDER BY "Timestamp" DESC
2020-08-13 12:26:06 +02 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: ExecuteReader: SELECT "ID", "Timestamp" FROM "Fileset" ORDER BY "Timestamp" DESC took 0:00:00:00.000
2020-08-13 12:26:06 +02 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: Starting - ExecuteReader: SELECT "ID", "Timestamp" FROM "Fileset" ORDER BY "Timestamp" DESC
2020-08-13 12:26:06 +02 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: ExecuteReader: SELECT "ID", "Timestamp" FROM "Fileset" ORDER BY "Timestamp" DESC took 0:00:00:00.000
2020-08-13 12:26:06 +02 - [Verbose-Duplicati.Library.Main.Operation.Backup.FileEnumerationProcess-IncludingPath]: Including path as no filters matched: /Applications/Sequel Pro.app/Contents/Resources/ja.lproj/AboutPanel.nib
2020-08-13 12:26:06 +02 - [Error-Duplicati.Library.Main.Operation.BackupHandler-FatalError]: Fatal error
Duplicati.Library.Interface.UserInformationException: Unexpected difference in fileset version 2: 30.06.2020 10:30:59 (database id: 20), found 1369434 entries, but expected 1369436
  at Duplicati.Library.Main.Database.LocalDatabase.VerifyConsistency (System.Int64 blocksize, System.Int64 hashsize, System.Boolean verifyfilelists, System.Data.IDbTransaction transaction) [0x0036e] in <9d37c106a6af4e2db9ebdc93583bad34>:0 
  at Duplicati.Library.Main.Operation.Backup.BackupDatabase+<>c__DisplayClass34_0.<VerifyConsistencyAsync>b__0 () [0x00000] in <9d37c106a6af4e2db9ebdc93583bad34>:0 
  at Duplicati.Library.Main.Operation.Common.SingleRunner+<>c__DisplayClass3_0.<RunOnMain>b__0 () [0x00000] in <9d37c106a6af4e2db9ebdc93583bad34>:0 
  at Duplicati.Library.Main.Operation.Common.SingleRunner+<DoRunOnMain>d__2`1[T].MoveNext () [0x000b0] in <9d37c106a6af4e2db9ebdc93583bad34>:0 
--- End of stack trace from previous location where exception was thrown ---
  at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw () [0x0000c] in <dca3b561b8ad4f9fb10141d81b39ff45>:0 
  at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Threading.Tasks.Task task) [0x0004e] in <dca3b561b8ad4f9fb10141d81b39ff45>:0 
  at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Threading.Tasks.Task task) [0x0002e] in <dca3b561b8ad4f9fb10141d81b39ff45>:0 
  at System.Runtime.CompilerServices.TaskAwaiter.ValidateEnd (System.Threading.Tasks.Task task) [0x0000b] in <dca3b561b8ad4f9fb10141d81b39ff45>:0 
  at System.Runtime.CompilerServices.TaskAwaiter.GetResult () [0x00000] in <dca3b561b8ad4f9fb10141d81b39ff45>:0 
  at Duplicati.Library.Main.Operation.BackupHandler+<RunAsync>d__20.MoveNext () [0x003cd] in <9d37c106a6af4e2db9ebdc93583bad34>:0 
2020-08-13 12:26:06 +02 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: PRAGMA optimize
2020-08-13 12:26:06 +02 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: PRAGMA optimize took 0:00:00:00.000
2020-08-13 12:26:06 +02 - [Profiling-Timer.Finished-Duplicati.Library.Main.Controller-RunBackup]: Running Backup took 0:00:09:11.353

Most are likely named to be a bit self-explanatory, but for those that are less so, here’s some info:

--log-file-log-level vs. --log-level (and --log-file-log-filter), and, how many log files are there, and, what are the different logs, and, what is the meaning of the levels

--log-level is deprecated by the logging rewrite. See above. Interestingly, only the CLI help (not the GUI or the manual) says:

  --log-level (Enumeration): Log information level
    [DEPRECATED]: Use the log-file-log-level and console-log-level options
    instead
    Specifies the amount of log information to write into the file specified
    by --log-file
    * values: ExplicitOnly, Profiling, Verbose, Retry, Information, DryRun,
    Warning, Error
    * default value: Warning

and the DEPRECATED text seems like it’s code-generated, including the references to “instead”:

new CommandLineArgument("log-level", CommandLineArgument.ArgumentType.Enumeration, Strings.Options.LoglevelShort, Strings.Options.LoglevelLong, "Warning", null, Enum.GetNames(typeof(Duplicati.Library.Logging.LogMessageType)), Strings.Options.LogLevelDeprecated("log-file-log-level", "console-log-level")),

New log system with filters talked about some of the changes between the old and new log systems.

You’ve found one reason why there are many levels: the amount of data that’s logged varies quite a bit.
2.0.3.10_canary_2018-08-30 made the default Profiling output quite a bit shorter than it used to be, though, due to:

Removed some logging details in core performance areas (can be re-enabled with --profile-all-database-queries )

What’s unfortunately needed to find the original cause is the --profile-all-database-queries log of issue creation. Tripping over an existing issue doesn’t say much about how the issue was created originally.

Reproducible-from-scratch breaks are debuggable, but I don’t think we have such a situation with this.

Creating a bug report can sometimes offer clues, but this too is mostly point-in-time, not a root-cause.
This was mentioned in an earlier post, which also talks about deleting a single version; that worked. Sometimes it does. Other times this problem moves. Only the first version found to fail is reported.

I’m not sure a fresh start is faster. Especially if you set --no-auto-compact, I think the delete should be quite quick, involving some marking in the local database and a remote dlist file deletion. It’s probably even reversible using a file copy of the database and the dlist files (but that’s not tested). If compact runs, that’s not reversible.

If you want to try the delete to see if it helps, set --no-auto-compact (recommended, not required), then go to Commandline, change the Command to delete, and change the Commandline arguments box to --version=2.

Thank you again for your prompt response.
I’ve tried to delete the version as suggested and got the following output:

System.Exception: Unexpected number of deleted filesets 2 vs 3
  at Duplicati.Library.Main.Database.LocalDeleteDatabase+<DropFilesetsFromTable>d__5.MoveNext () [0x0011f] in <9d37c106a6af4e2db9ebdc93583bad34>:0 
  at System.Linq.Buffer`1[TElement]..ctor (System.Collections.Generic.IEnumerable`1[T] source) [0x00087] in <63992662b765477a898ef49cdcc99ee2>:0 
  at System.Linq.Enumerable.ToArray[TSource] (System.Collections.Generic.IEnumerable`1[T] source) [0x00011] in <63992662b765477a898ef49cdcc99ee2>:0 
  at Duplicati.Library.Main.Operation.DeleteHandler.DoRun (Duplicati.Library.Main.Database.LocalDeleteDatabase db, System.Data.IDbTransaction& transaction, System.Boolean hasVerifiedBackend, System.Boolean forceCompact, Duplicati.Library.Main.BackendManager sharedManager) [0x0018d] in <9d37c106a6af4e2db9ebdc93583bad34>:0 
  at Duplicati.Library.Main.Operation.DeleteHandler.Run () [0x00075] in <9d37c106a6af4e2db9ebdc93583bad34>:0 
  at Duplicati.Library.Main.Controller.<Delete>b__17_0 (Duplicati.Library.Main.DeleteResults result) [0x00012] in <9d37c106a6af4e2db9ebdc93583bad34>:0 
  at Duplicati.Library.Main.Controller.RunAction[T] (T result, System.String[]& paths, Duplicati.Library.Utility.IFilter& filter, System.Action`1[T] method) [0x0011c] in <9d37c106a6af4e2db9ebdc93583bad34>:0 
Return code: 100

No errors in the log though

2020-08-14 07:35:26 +02 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Delete has started
2020-08-14 07:35:26 +02 - [Profiling-Timer.Begin-Duplicati.Library.Main.Controller-RunDelete]: Starting - Running Delete
2020-08-14 07:35:26 +02 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteScalarInt64]: Starting - ExecuteScalarInt64: INSERT INTO "Operation" ("Description", "Timestamp") VALUES ("Delete", 1597383326); SELECT last_insert_rowid();
2020-08-14 07:35:26 +02 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteScalarInt64]: ExecuteScalarInt64: INSERT INTO "Operation" ("Description", "Timestamp") VALUES ("Delete", 1597383326); SELECT last_insert_rowid(); took 0:00:00:00.001
2020-08-14 07:35:26 +02 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: Starting - ExecuteReader: SELECT "Key", "Value" FROM "Configuration" 
2020-08-14 07:35:26 +02 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: ExecuteReader: SELECT "Key", "Value" FROM "Configuration"  took 0:00:00:00.000
2020-08-14 07:35:26 +02 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: Starting - ExecuteReader: SELECT "Key", "Value" FROM "Configuration" 
2020-08-14 07:35:26 +02 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: ExecuteReader: SELECT "Key", "Value" FROM "Configuration"  took 0:00:00:00.000
2020-08-14 07:35:26 +02 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteScalarInt64]: Starting - ExecuteScalarInt64: SELECT COUNT(*) FROM "Block" WHERE "Size" > 102400
2020-08-14 07:35:26 +02 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteScalarInt64]: ExecuteScalarInt64: SELECT COUNT(*) FROM "Block" WHERE "Size" > 102400 took 0:00:00:00.003
2020-08-14 07:35:26 +02 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: Starting - ExecuteReader: SELECT DISTINCT "Fileset"."ID", "Fileset"."Timestamp" FROM "Fileset", "RemoteVolume" WHERE "RemoteVolume"."ID" = "Fileset"."VolumeID" AND "Fileset"."ID" IN (SELECT "FilesetID" FROM "FilesetEntry")  AND ("RemoteVolume"."State" = "Uploading" OR "RemoteVolume"."State" = "Temporary")
2020-08-14 07:36:07 +02 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: ExecuteReader: SELECT DISTINCT "Fileset"."ID", "Fileset"."Timestamp" FROM "Fileset", "RemoteVolume" WHERE "RemoteVolume"."ID" = "Fileset"."VolumeID" AND "Fileset"."ID" IN (SELECT "FilesetID" FROM "FilesetEntry")  AND ("RemoteVolume"."State" = "Uploading" OR "RemoteVolume"."State" = "Temporary") took 0:00:00:41.438
2020-08-14 07:36:08 +02 - [Profiling-Timer.Begin-Duplicati.Library.Main.BackendManager-RemoteOperationList]: Starting - RemoteOperationList
2020-08-14 07:36:08 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Started:  ()
2020-08-14 07:36:22 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: List - Completed:  (4,31 KB)
2020-08-14 07:36:22 +02 - [Profiling-Timer.Finished-Duplicati.Library.Main.BackendManager-RemoteOperationList]: RemoteOperationList took 0:00:00:14.876
2020-08-14 07:36:23 +02 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: Starting - ExecuteReader: SELECT "ID", "Timestamp" FROM "Fileset" ORDER BY "Timestamp" DESC
2020-08-14 07:36:23 +02 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: ExecuteReader: SELECT "ID", "Timestamp" FROM "Fileset" ORDER BY "Timestamp" DESC took 0:00:00:00.000
2020-08-14 07:36:23 +02 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: Starting - ExecuteReader: SELECT DISTINCT "Name", "State" FROM "Remotevolume" WHERE "Name" IN (SELECT "Name" FROM "Remotevolume" WHERE "State" IN ("Deleted", "Deleting")) AND NOT "State" IN ("Deleted", "Deleting")
2020-08-14 07:36:23 +02 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: ExecuteReader: SELECT DISTINCT "Name", "State" FROM "Remotevolume" WHERE "Name" IN (SELECT "Name" FROM "Remotevolume" WHERE "State" IN ("Deleted", "Deleting")) AND NOT "State" IN ("Deleted", "Deleting") took 0:00:00:00.124
2020-08-14 07:36:23 +02 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: Starting - ExecuteReader: SELECT "IsFullBackup", "Timestamp" FROM "Fileset" ORDER BY "Timestamp" DESC
2020-08-14 07:36:23 +02 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteReader]: ExecuteReader: SELECT "IsFullBackup", "Timestamp" FROM "Fileset" ORDER BY "Timestamp" DESC took 0:00:00:00.000
2020-08-14 07:36:23 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-StartCheck]: Start checking if backups can be removed
2020-08-14 07:36:23 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-FramesAndIntervals]: Time frames and intervals pairs: 7.00:00:00 / 1.00:00:00, 28.00:00:00 / 7.00:00:00, 365.00:00:00 / 31.00:00:00
2020-08-14 07:36:23 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-BackupList]: Backups to consider: 11.07.2020 13:22:32, 30.06.2020 10:30:59, 20.06.2020 10:20:05, 11.06.2020 11:22:48, 02.05.2020 10:14:15, 29.03.2020 18:43:05, 08.02.2020 08:12:08
2020-08-14 07:36:23 +02 - [Profiling-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-NextTimeAndFrame]: Next time frame and interval pair: 7.00:00:00 / 1.00:00:00
2020-08-14 07:36:23 +02 - [Profiling-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-BackupsInFrame]: Backups in this time frame: 
2020-08-14 07:36:23 +02 - [Profiling-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-NextTimeAndFrame]: Next time frame and interval pair: 28.00:00:00 / 7.00:00:00
2020-08-14 07:36:23 +02 - [Profiling-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-BackupsInFrame]: Backups in this time frame: 
2020-08-14 07:36:23 +02 - [Profiling-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-NextTimeAndFrame]: Next time frame and interval pair: 365.00:00:00 / 31.00:00:00
2020-08-14 07:36:23 +02 - [Profiling-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-BackupsInFrame]: Backups in this time frame: 08.02.2020 08:12:08, 29.03.2020 18:43:05, 02.05.2020 10:14:15, 11.06.2020 11:22:48, 20.06.2020 10:20:05, 30.06.2020 10:30:59, 11.07.2020 13:22:32
2020-08-14 07:36:23 +02 - [Profiling-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-KeepBackups]: Keeping  backup: 08.02.2020 08:12:08
2020-08-14 07:36:23 +02 - [Profiling-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-KeepBackups]: Keeping  backup: 29.03.2020 18:43:05
2020-08-14 07:36:23 +02 - [Profiling-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-KeepBackups]: Keeping  backup: 02.05.2020 10:14:15
2020-08-14 07:36:23 +02 - [Profiling-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-KeepBackups]: Keeping  backup: 11.06.2020 11:22:48
2020-08-14 07:36:23 +02 - [Profiling-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-DeletingBackups]: Deleting backup: 20.06.2020 10:20:05
2020-08-14 07:36:23 +02 - [Profiling-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-DeletingBackups]: Deleting backup: 30.06.2020 10:30:59
2020-08-14 07:36:23 +02 - [Profiling-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-KeepBackups]: Keeping partial backup: 11.07.2020 13:22:32
2020-08-14 07:36:23 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-BackupsToDelete]: Backups outside of all time frames and thus getting deleted: 
2020-08-14 07:36:23 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler:RetentionPolicy-AllBackupsToDelete]: All backups to delete: 30.06.2020 10:30:59, 20.06.2020 10:20:05
2020-08-14 07:36:23 +02 - [Information-Duplicati.Library.Main.Operation.DeleteHandler-DeleteRemoteFileset]: Deleting 3 remote fileset(s) ...
2020-08-14 07:36:23 +02 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: DELETE FROM "Fileset" WHERE "Timestamp" IN (1593505859,1592641205,1593505859) 
2020-08-14 07:36:23 +02 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: DELETE FROM "Fileset" WHERE "Timestamp" IN (1593505859,1592641205,1593505859)  took 0:00:00:00.000
2020-08-14 07:36:23 +02 - [Profiling-Timer.Begin-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: Starting - ExecuteNonQuery: PRAGMA optimize
2020-08-14 07:36:23 +02 - [Profiling-Timer.Finished-Duplicati.Library.Main.Database.ExtensionMethods-ExecuteNonQuery]: ExecuteNonQuery: PRAGMA optimize took 0:00:00:00.000
2020-08-14 07:36:23 +02 - [Profiling-Timer.Finished-Duplicati.Library.Main.Controller-RunDelete]: Running Delete took 0:00:00:57.132

Afterwards I’ve restarted the backup and got the following error again

Aug 14, 2020 7:57 AM: The operation Backup has failed with error: Unexpected difference in fileset version 2: 30.06.2020 10:30:59 (database id: 20), found 1369434 entries, but expected 1369436

I’ve tried to delete id 20 instead of id 2 (might not make sense :slight_smile:)

This time no error

Running commandline entry
Finished!

            
No certificates found, you can install some with one of these commands:
    cert-sync /etc/ssl/certs/ca-certificates.crt #for Debian based systems
    cert-sync /etc/pki/tls/certs/ca-bundle.crt #for RedHat derivatives
    curl -O https://curl.haxx.se/ca/cacert.pem; cert-sync --user cacert.pem; rm cacert.pem #for MacOS
Read more: http://www.mono-project.com/docs/about-mono/releases/3.12.0/#cert-sync
  Listing remote folder ...
  Deleting file duplicati-20200620T082005Z.dlist.zip.aes (57,24 MB) ...
  Deleting file duplicati-20200630T083059Z.dlist.zip.aes (78,03 MB) ...
  Deleting file duplicati-b010dc615551b45068b4558756538550d.dblock.zip.aes (249,93 MB) ...
  Deleting file duplicati-iacaa9da7b30841ad8cbee0c005bd529f.dindex.zip.aes (182,78 KB) ...
  Deleting file duplicati-bbe6d61dccaaf47a4bc2ee61d4f54acca.dblock.zip.aes (249,92 MB) ...
  Deleting file duplicati-i5909c04554ca4a58aa752cfcbeb26ee4.dindex.zip.aes (279,11 KB) ...
  Deleting file duplicati-b9dd9259ca8ce4c47954b09bec69cc95b.dblock.zip.aes (249,91 MB) ...
  Deleting file duplicati-i27350f80f78c49108b5a49f6e97d021d.dindex.zip.aes (267,20 KB) ...
  Deleting file duplicati-b10a12abe87bc4cca80656faab0dcc726.dblock.zip.aes (249,91 MB) ...
  Deleting file duplicati-if80ad6a863124994a73554480840c1a1.dindex.zip.aes (266,56 KB) ...
  Deleting file duplicati-bd0ba08d55e784c80862c9d6c1cdda92b.dblock.zip.aes (249,94 MB) ...
  Deleting file duplicati-i67e9238807bb4e9da8d48a0e67ddc2ca.dindex.zip.aes (366,50 KB) ...
  Deleting file duplicati-bedaa3c139fbc4fb2a68b3136cafc9615.dblock.zip.aes (249,91 MB) ...
  Deleting file duplicati-i8be1ec84806b4530b0239bf208d34bfb.dindex.zip.aes (365,00 KB) ...
These filesets were deleted:
3: 20.06.2020 10:20:05
2: 30.06.2020 10:30:59
Return code: 0

I’ve started another backup and now it fails with version 1 / id 23

Unexpected difference in fileset version 1: 11.07.2020 13:22:32 (database id: 23), found 1368537 entries, but expected 1368539

I’ll try to delete version 1 and retry.

After deleting version 1 and starting a backup, it failed with this error:

Aug 14, 2020 10:27 AM: Failed while executing "Backup" with id: 2
System.TypeLoadException: Failure has occurred while loading a type.
  at Duplicati.Library.Encryption.EncryptionBase.Encrypt (System.IO.Stream input, System.IO.Stream output) [0x00000] in <6df07cc5452c450cb3c3854d714fdd41>:0 
  at Duplicati.Library.Encryption.EncryptionBase.Encrypt (System.String inputfile, System.String outputfile) [0x0000e] in <6df07cc5452c450cb3c3854d714fdd41>:0 
  at Duplicati.Library.Main.Operation.Common.BackendHandler+FileEntryItem.Encrypt (Duplicati.Library.Main.Options options) [0x0003b] in <9d37c106a6af4e2db9ebdc93583bad34>:0 
  at Duplicati.Library.Main.Volumes.VolumeWriterBase.CreateFileEntryForUpload (Duplicati.Library.Main.Options options) [0x00018] in <9d37c106a6af4e2db9ebdc93583bad34>:0 
  at Duplicati.Library.Main.Operation.Backup.DataBlockProcessor+<>c__DisplayClass0_0+<<Run>b__0>d.MoveNext () [0x004aa] in <9d37c106a6af4e2db9ebdc93583bad34>:0 
--- End of stack trace from previous location where exception was thrown ---
  at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw () [0x0000c] in <dca3b561b8ad4f9fb10141d81b39ff45>:0 
  at Duplicati.Library.Main.Operation.Backup.DataBlockProcessor+<>c__DisplayClass0_0+<<Run>b__0>d.MoveNext () [0x006c4] in <9d37c106a6af4e2db9ebdc93583bad34>:0 
--- End of stack trace from previous location where exception was thrown ---
  at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw () [0x0000c] in <dca3b561b8ad4f9fb10141d81b39ff45>:0 
  at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Threading.Tasks.Task task) [0x0004e] in <dca3b561b8ad4f9fb10141d81b39ff45>:0 
  at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Threading.Tasks.Task task) [0x0002e] in <dca3b561b8ad4f9fb10141d81b39ff45>:0 
  at System.Runtime.CompilerServices.TaskAwaiter.ValidateEnd (System.Threading.Tasks.Task task) [0x0000b] in <dca3b561b8ad4f9fb10141d81b39ff45>:0 
  at System.Runtime.CompilerServices.ConfiguredTaskAwaitable+ConfiguredTaskAwaiter.GetResult () [0x00000] in <dca3b561b8ad4f9fb10141d81b39ff45>:0 
  at CoCoL.AutomationExtensions+<RunTask>d__10`1[T].MoveNext () [0x00083] in <9a758ff4db6c48d6b3d4d0e5c2adf6d1>:0 
--- End of stack trace from previous location where exception was thrown ---
  at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw () [0x0000c] in <dca3b561b8ad4f9fb10141d81b39ff45>:0 
  at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Threading.Tasks.Task task) [0x0004e] in <dca3b561b8ad4f9fb10141d81b39ff45>:0 
  at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Threading.Tasks.Task task) [0x0002e] in <dca3b561b8ad4f9fb10141d81b39ff45>:0 
  at System.Runtime.CompilerServices.TaskAwaiter.ValidateEnd (System.Threading.Tasks.Task task) [0x0000b] in <dca3b561b8ad4f9fb10141d81b39ff45>:0 
  at System.Runtime.CompilerServices.ConfiguredTaskAwaitable+ConfiguredTaskAwaiter.GetResult () [0x00000] in <dca3b561b8ad4f9fb10141d81b39ff45>:0 
  at Duplicati.Library.Main.Operation.BackupHandler+<RunMainOperation>d__13.MoveNext () [0x0035f] in <9d37c106a6af4e2db9ebdc93583bad34>:0 
--- End of stack trace from previous location where exception was thrown ---
  at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw () [0x0000c] in <dca3b561b8ad4f9fb10141d81b39ff45>:0 
  at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Threading.Tasks.Task task) [0x0004e] in <dca3b561b8ad4f9fb10141d81b39ff45>:0 
  at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Threading.Tasks.Task task) [0x0002e] in <dca3b561b8ad4f9fb10141d81b39ff45>:0 
  at System.Runtime.CompilerServices.TaskAwaiter.ValidateEnd (System.Threading.Tasks.Task task) [0x0000b] in <dca3b561b8ad4f9fb10141d81b39ff45>:0 
  at System.Runtime.CompilerServices.ConfiguredTaskAwaitable+ConfiguredTaskAwaiter.GetResult () [0x00000] in <dca3b561b8ad4f9fb10141d81b39ff45>:0 
  at Duplicati.Library.Main.Operation.BackupHandler+<RunAsync>d__20.MoveNext () [0x00a16] in <9d37c106a6af4e2db9ebdc93583bad34>:0 
--- End of stack trace from previous location where exception was thrown ---
  at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw () [0x0000c] in <dca3b561b8ad4f9fb10141d81b39ff45>:0 
  at CoCoL.ChannelExtensions.WaitForTaskOrThrow (System.Threading.Tasks.Task task) [0x00050] in <9a758ff4db6c48d6b3d4d0e5c2adf6d1>:0 
  at Duplicati.Library.Main.Operation.BackupHandler.Run (System.String[] sources, Duplicati.Library.Utility.IFilter filter, System.Threading.CancellationToken token) [0x00009] in <9d37c106a6af4e2db9ebdc93583bad34>:0 
  at Duplicati.Library.Main.Controller+<>c__DisplayClass14_0.<Backup>b__0 (Duplicati.Library.Main.BackupResults result) [0x0004b] in <9d37c106a6af4e2db9ebdc93583bad34>:0 
  at Duplicati.Library.Main.Controller.RunAction[T] (T result, System.String[]& paths, Duplicati.Library.Utility.IFilter& filter, System.Action`1[T] method) [0x0011c] in <9d37c106a6af4e2db9ebdc93583bad34>:0 

I think I’ll now delete the database as well as all remote files and start over from scratch

This has not helped. Even after deleting the remote files and the local database, I’m still getting “Failure has occurred while loading a type.” (using the latest Duplicati version 2.0.5.110_canary_2020-08-10).

* Aug 14, 2020 11:27 AM: The operation Backup has failed with error: Failure has occurred while loading a type. 
{"ClassName":"System.TypeLoadException","Message":"Failure has occurred while loading a type.","Data":null,"InnerException":null,"HelpURL":null,"StackTraceString":"  at Duplicati.Library.Encryption.EncryptionBase.Encrypt (System.IO.Stream input, System.IO.Stream output) [0x00000] in <6df07cc5452c450cb3c3854d714fdd41>:0 \n  at Duplicati.Library.Encryption.EncryptionBase.Encrypt (System.String inputfile, System.String outputfile) [0x0000e] in <6df07cc5452c450cb3c3854d714fdd41>:0 \n  at Duplicati.Library.Main.Operation.Common.BackendHandler+FileEntryItem.Encrypt (Duplicati.Library.Main.Options options) [0x0003b] in <9d37c106a6af4e2db9ebdc93583bad34>:0 \n  at Duplicati.Library.Main.Volumes.VolumeWriterBase.CreateFileEntryForUpload (Duplicati.Library.Main.Options options) [0x00018] in <9d37c106a6af4e2db9ebdc93583bad34>:0 \n  at Duplicati.Library.Main.Operation.Backup.DataBlockProcessor+<>c__DisplayClass0_0+<<Run>b__0>d.MoveNext () [0x004aa] in <9d37c106a6af4e2db9ebdc93583bad34>:0 \n--- End of stack trace from previous location where exception was thrown ---\n  at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw () [0x0000c] in <dca3b561b8ad4f9fb10141d81b39ff45>:0 \n  at Duplicati.Library.Main.Operation.Backup.DataBlockProcessor+<>c__DisplayClass0_0+<<Run>b__0>d.MoveNext () [0x006c4] in <9d37c106a6af4e2db9ebdc93583bad34>:0 \n--- End of stack trace from previous location where exception was thrown ---\n  at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw () [0x0000c] in <dca3b561b8ad4f9fb10141d81b39ff45>:0 \n  at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Threading.Tasks.Task task) [0x0004e] in <dca3b561b8ad4f9fb10141d81b39ff45>:0 \n  at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Threading.Tasks.Task task) [0x0002e] in <dca3b561b8ad4f9fb10141d81b39ff45>:0 \n  at System.Runtime.CompilerServices.TaskAwaiter.ValidateEnd (System.Threading.Tasks.Task task) [0x0000b] in <dca3b561b8ad4f9fb10141d81b39ff45>:0 \n  at System.Runtime.CompilerServices.ConfiguredTaskAwaitable+ConfiguredTaskAwaiter.GetResult () [0x00000] in <dca3b561b8ad4f9fb10141d81b39ff45>:0 \n  at CoCoL.AutomationExtensions+<RunTask>d__10`1[T].MoveNext () [0x00083] in <9a758ff4db6c48d6b3d4d0e5c2adf6d1>:0 \n--- End of stack trace from previous location where exception was thrown ---\n  at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw () [0x0000c] in <dca3b561b8ad4f9fb10141d81b39ff45>:0 \n  at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Threading.Tasks.Task task) [0x0004e] in <dca3b561b8ad4f9fb10141d81b39ff45>:0 \n  at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Threading.Tasks.Task task) [0x0002e] in <dca3b561b8ad4f9fb10141d81b39ff45>:0 \n  at System.Runtime.CompilerServices.TaskAwaiter.ValidateEnd (System.Threading.Tasks.Task task) [0x0000b] in <dca3b561b8ad4f9fb10141d81b39ff45>:0 \n  at System.Runtime.CompilerServices.ConfiguredTaskAwaitable+ConfiguredTaskAwaiter.GetResult () [0x00000] in <dca3b561b8ad4f9fb10141d81b39ff45>:0 \n  at Duplicati.Library.Main.Operation.BackupHandler+<RunMainOperation>d__13.MoveNext () [0x0035f] in <9d37c106a6af4e2db9ebdc93583bad34>:0 \n--- End of stack trace from previous location where exception was thrown ---\n  at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw () [0x0000c] in <dca3b561b8ad4f9fb10141d81b39ff45>:0 \n  at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess 
(System.Threading.Tasks.Task task) [0x0004e] in <dca3b561b8ad4f9fb10141d81b39ff45>:0 \n  at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Threading.Tasks.Task task) [0x0002e] in <dca3b561b8ad4f9fb10141d81b39ff45>:0 \n  at System.Runtime.CompilerServices.TaskAwaiter.ValidateEnd (System.Threading.Tasks.Task task) [0x0000b] in <dca3b561b8ad4f9fb10141d81b39ff45>:0 \n  at System.Runtime.CompilerServices.ConfiguredTaskAwaitable+ConfiguredTaskAwaiter.GetResult () [0x00000] in <dca3b561b8ad4f9fb10141d81b39ff45>:0 \n  at Duplicati.Library.Main.Operation.BackupHandler+<RunAsync>d__20.MoveNext () [0x00a16] in <9d37c106a6af4e2db9ebdc93583bad34>:0 \n--- End of stack trace from previous location where exception was thrown ---\n  at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw () [0x0000c] in <dca3b561b8ad4f9fb10141d81b39ff45>:0 \n  at CoCoL.ChannelExtensions.WaitForTaskOrThrow (System.Threading.Tasks.Task task) [0x00050] in <9a758ff4db6c48d6b3d4d0e5c2adf6d1>:0 \n  at Duplicati.Library.Main.Operation.BackupHandler.Run (System.String[] sources, Duplicati.Library.Utility.IFilter filter, System.Threading.CancellationToken token) [0x00009] in <9d37c106a6af4e2db9ebdc93583bad34>:0 \n  at Duplicati.Library.Main.Controller+<>c__DisplayClass14_0.<Backup>b__0 (Duplicati.Library.Main.BackupResults result) [0x0004b] in <9d37c106a6af4e2db9ebdc93583bad34>:0 \n  at Duplicati.Library.Main.Controller.RunAction[T] (T result, System.String[]& paths, Duplicati.Library.Utility.IFilter& filter, System.Action`1[T] method) [0x0011c] in <9d37c106a6af4e2db9ebdc93583bad34>:0 ","RemoteStackTraceString":null,"RemoteStackIndex":0,"ExceptionMethod":null,"HResult":-2146233054,"Source":"mscorlib","TypeLoadClassName":null,"TypeLoadAssemblyName":null,"TypeLoadMessageArg":null,"TypeLoadResourceID":0}
* Aug 14, 2020 11:27 AM: Running Backup took 0:00:10:35.116
* Aug 14, 2020 11:27 AM: ExecuteNonQuery: PRAGMA optimize took 0:00:00:00.000
* Aug 14, 2020 11:27 AM: Starting - ExecuteNonQuery: PRAGMA optimize
* Aug 14, 2020 11:27 AM: Fatal error

I’ve created “Failure has occurred while loading a type.” error during backup on osx as a separate topic for the “Failure has occurred while loading a type.” error, as it is most likely not related to this one.