"Unexpected difference in fileset" on every backup

Failed: Unexpected difference in fileset version 0: 11/2/2021 8:50:14 AM (database id: 1), found 95877 entries, but expected 95878
Details: Duplicati.Library.Interface.UserInformationException: Unexpected difference in fileset version 0: 11/2/2021 8:50:14 AM (database id: 1), found 95877 entries, but expected 95878

What in the world would cause my backups to continue to fail over a SINGLE file difference!?

I have a number of other backups on the same system that work just fine, but this one in particular ALWAYS fails. I have to use the command line tools to delete the old version (version 0) every time.

I would really appreciate some help with this.

Hello and welcome!

Before doing a backup, Duplicati does a sanity check on the local database to make sure everything is valid. What it’s doing here is telling you there is an inconsistency. Since it’s considered fatal, the job doesn’t run. It may appear you are getting this with “every backup” but in reality it just keeps telling you about the problem and not running a backup.
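To illustrate the idea (this is a simplified sketch, not Duplicati's actual implementation), the check boils down to comparing the number of file entries actually found for a backup version against the count the database says should be there, and treating any mismatch as fatal:

```python
# Simplified illustration of the sanity check that produces this error.
# Names here are hypothetical; the real check lives in
# LocalDatabase.VerifyConsistency.

class UnexpectedDifferenceError(Exception):
    pass

def verify_fileset(version, found_entries, expected_entries):
    """Raise if the entry counts for a backup version disagree."""
    if found_entries != expected_entries:
        raise UnexpectedDifferenceError(
            f"Unexpected difference in fileset version {version}: "
            f"found {found_entries} entries, but expected {expected_entries}"
        )

# Mirrors the numbers in the error above: one entry short.
try:
    verify_fileset(0, 95877, 95878)
except UnexpectedDifferenceError as e:
    print(e)
```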

What version of Duplicati are you using? This issue was more prevalent in the past but I haven’t seen it happen with recent versions. Upgrade to 2.0.6.3_beta if you are using an older beta, but note that upgrading doesn’t solve the database inconsistency. It should just stop it from happening again in the future.

To resolve the issue with the database, I would just “delete” the offending backup version:

  • Click your backup job in the web UI to expand options
  • Click the blue “commandline…” link
  • Change the “command” to “delete”
  • Replace the contents of the “commandline arguments” box with “--version=0” (since your error is complaining about version 0)
  • Finally, scroll to the bottom and click the “run delete command now” button
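For reference, the same delete can also be run with Duplicati's standalone CLI. This is a sketch only: the storage URL and database path below are placeholders for your own job's values (on Linux/macOS the command is typically invoked via mono or as duplicati-cli):

```shell
# Placeholders: substitute your job's storage URL and database path.
# --version=0 targets the newest backup version, matching the error above.
Duplicati.CommandLine.exe delete "b2://my-bucket/backup-folder" \
    --version=0 \
    --dbpath="/path/to/job-database.sqlite"
```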

Watch the results and after it is successful try running a backup again.


To resolve the issue with the database, I would just delete the offending backup version…

Like I said, this is what I have to do every time to get anything to back up.

Sorry, I missed the part where you said you’d already been deleting version 0.

What version are you using?

2.0.6.100_canary_2021-08-11

You might want to try a full database recreate. Make a copy of the existing database before doing so, just in case.

I have tried that multiple times. Initially with Backblaze B2 and then with Wasabi. Both produced the same result: Unexpected difference in fileset version 0

The full database recreation completes successfully, then you are able to run one backup, and then the next one gives you the error? If you can reproduce this consistently, then to me it sounds like something may not be written properly to your local database at the end of an otherwise successful backup job. (We may need to get some verbose logging to see…)

Can you reproduce the problem with a brand new backup job where you only have a small amount of data selected for backup?

The full database recreation completes successfully, then you are able to run one backup, and then the next one gives you the error?

That’s correct.

Can you reproduce the problem with a brand new backup job where you only have a small amount of data selected for backup?

This issue was present in a larger backup and we narrowed it down by splitting it into multiple smaller ones. However, I don’t think it’s going to be easy/worth the time to split it again into more small backups to continue narrowing. Right now the backup size in question is 950GB.

Ok, it might be useful to delete version 0 to get the database back into a sane state, then run a backup job with logging options enabled. Perhaps there will be an indication of some database issue at the end. Just a guess, though…
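As a sketch of what I mean by logging options (the paths here are placeholders; the options themselves are Duplicati's --log-file and --log-file-log-level, which can also be added as advanced options on the job in the web UI):

```shell
# Placeholders: substitute your storage URL, source path, and log path.
# Verbose level records per-file detail, which may show where the
# database gets into trouble near the end of the run.
Duplicati.CommandLine.exe backup "b2://my-bucket/backup-folder" /source/path \
    --log-file="/var/log/duplicati-job.log" \
    --log-file-log-level=Verbose
```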

OK, I’m giving that a try now. I’ll let you know how it goes.

I tried working past this previously. One time it worked; another time it didn’t, at least when using only Duplicati’s built-in tools to fix the database.

You might find that you need to start a fresh backup. Technically, you should start a fresh backup at least once per year, if not more often, to protect against issues or, as in this case, to fix them.

To me, since Duplicati puts all files into container volumes, it would be far better to never make such a large backup in the first place. Maybe split your files into backups of about 100-130GB at most by creating jobs that each cover only certain folders. If those folders are likely to grow, start smaller to leave extra room.

Also, if there are large files that don’t change, such as videos, there’s no need to keep backing them up through Duplicati continually; handle those once on the side, or with a one-off manual backup in Duplicati. It depends on the files, though.


I guess we’ll see how the current direction goes. The ideal is to get a reproducible test case, then let the developers put it under a microscope using heavy logging, possibly debuggers attached to program, etc.

After lots of chasing, the last big case of this, which was fortunately reproducible for me to narrow down:

“Unexpected difference in fileset” test case and code clue #3800

but this scenario was fixed in mid-2019. Though things are much better now, maybe there’s another case.

One thing you could look at is whether this one also happens during a compact, e.g. does checking no-auto-compact postpone the error until you manually use the Compact now button? I think you can use Verify files to attempt the same verification that is failing during the automatic one. If you’re lucky enough to get the usual job log, it breaks down the individual pieces of a run (such as compact), but the log may also be skipped on a failed backup.

To add to the fine thoughts from @Xavron (thanks!), Duplicati benefits from a larger blocksize for larger backups, because the default 100 KB creates too many blocks, and causes database operations to slow.
Previous rule-of-thumb was 1 million blocks per backup, but more recent testing here says go a bit higher.

Here are the results of the verbose logging:

2021-11-10 16:54:13 -05 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started
2021-11-10 16:56:35 -05 - [Error-Duplicati.Library.Main.Operation.BackupHandler-FatalError]: Fatal error
Duplicati.Library.Interface.UserInformationException: Unexpected difference in fileset version 0: 11/2/2021 8:50:14 AM (database id: 1), found 95877 entries, but expected 95878
  at Duplicati.Library.Main.Database.LocalDatabase.VerifyConsistency (System.Int64 blocksize, System.Int64 hashsize, System.Boolean verifyfilelists, System.Data.IDbTransaction transaction) [0x0036e] in <1d9349a5d6874088879a75d074c92b62>:0
  at Duplicati.Library.Main.Operation.Backup.BackupDatabase+<>c__DisplayClass34_0.<VerifyConsistencyAsync>b__0 () [0x00000] in <1d9349a5d6874088879a75d074c92b62>:0
  at Duplicati.Library.Main.Operation.Common.SingleRunner+<>c__DisplayClass3_0.<RunOnMain>b__0 () [0x00000] in <1d9349a5d6874088879a75d074c92b62>:0
  at Duplicati.Library.Main.Operation.Common.SingleRunner.DoRunOnMain[T] (System.Func`1[TResult] method) [0x000b0] in <1d9349a5d6874088879a75d074c92b62>:0
  at Duplicati.Library.Main.Operation.BackupHandler.RunAsync (System.String[] sources, Duplicati.Library.Utility.IFilter filter, System.Threading.CancellationToken token) [0x0034d] in <1d9349a5d6874088879a75d074c92b62>:0
2021-11-10 16:56:37 -05 - [Information-Duplicati.Library.Modules.Builtin.SendMail-SendMailComplete]: Email sent successfully using server: smtp://smtp.gmail.com:587?starttls=always
2021-11-11 00:00:00 -05 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started
2021-11-11 00:01:14 -05 - [Error-Duplicati.Library.Main.Operation.BackupHandler-FatalError]: Fatal error
Duplicati.Library.Interface.UserInformationException: Unexpected difference in fileset version 0: 11/2/2021 8:50:14 AM (database id: 1), found 95877 entries, but expected 95878
  at Duplicati.Library.Main.Database.LocalDatabase.VerifyConsistency (System.Int64 blocksize, System.Int64 hashsize, System.Boolean verifyfilelists, System.Data.IDbTransaction transaction) [0x0036e] in <1d9349a5d6874088879a75d074c92b62>:0
  at Duplicati.Library.Main.Operation.Backup.BackupDatabase+<>c__DisplayClass34_0.<VerifyConsistencyAsync>b__0 () [0x00000] in <1d9349a5d6874088879a75d074c92b62>:0
  at Duplicati.Library.Main.Operation.Common.SingleRunner+<>c__DisplayClass3_0.<RunOnMain>b__0 () [0x00000] in <1d9349a5d6874088879a75d074c92b62>:0
  at Duplicati.Library.Main.Operation.Common.SingleRunner.DoRunOnMain[T] (System.Func`1[TResult] method) [0x000b0] in <1d9349a5d6874088879a75d074c92b62>:0
  at Duplicati.Library.Main.Operation.BackupHandler.RunAsync (System.String[] sources, Duplicati.Library.Utility.IFilter filter, System.Threading.CancellationToken token) [0x0034d] in <1d9349a5d6874088879a75d074c92b62>:0
2021-11-11 00:01:16 -05 - [Information-Duplicati.Library.Modules.Builtin.SendMail-SendMailComplete]: Email sent successfully using server: smtp://smtp.gmail.com:587?starttls=always
2021-11-12 00:00:00 -05 - [Information-Duplicati.Library.Main.Controller-StartingOperation]: The operation Backup has started
2021-11-12 00:01:09 -05 - [Error-Duplicati.Library.Main.Operation.BackupHandler-FatalError]: Fatal error
Duplicati.Library.Interface.UserInformationException: Unexpected difference in fileset version 0: 11/2/2021 8:50:14 AM (database id: 1), found 95877 entries, but expected 95878
  at Duplicati.Library.Main.Database.LocalDatabase.VerifyConsistency (System.Int64 blocksize, System.Int64 hashsize, System.Boolean verifyfilelists, System.Data.IDbTransaction transaction) [0x0036e] in <1d9349a5d6874088879a75d074c92b62>:0
  at Duplicati.Library.Main.Operation.Backup.BackupDatabase+<>c__DisplayClass34_0.<VerifyConsistencyAsync>b__0 () [0x00000] in <1d9349a5d6874088879a75d074c92b62>:0
  at Duplicati.Library.Main.Operation.Common.SingleRunner+<>c__DisplayClass3_0.<RunOnMain>b__0 () [0x00000] in <1d9349a5d6874088879a75d074c92b62>:0
  at Duplicati.Library.Main.Operation.Common.SingleRunner.DoRunOnMain[T] (System.Func`1[TResult] method) [0x000b0] in <1d9349a5d6874088879a75d074c92b62>:0
  at Duplicati.Library.Main.Operation.BackupHandler.RunAsync (System.String[] sources, Duplicati.Library.Utility.IFilter filter, System.Threading.CancellationToken token) [0x0034d] in <1d9349a5d6874088879a75d074c92b62>:0
2021-11-12 00:01:11 -05 - [Information-Duplicati.Library.Modules.Builtin.SendMail-SendMailComplete]: Email sent successfully using server: smtp://smtp.gmail.com:587?starttls=always

You might find that you need to start a fresh backup.

I have done this multiple times and still get the same error.

Maybe split your files into about 100-130GB backups at most by creating backups that only deal with certain folders each.

That seems like an unreasonable constraint for any backup client worth its salt. I’m working with a network-attached storage device that is actively used by an office of ~20 workers. Restricting their use of the storage to fit the backup client’s limitations is not an option.

One thing you could look at is whether this one is also in a compact, e.g. does checking no-auto-compact postpone it until you manually use the Compact now button.

I have tried this too. It doesn’t seem to make any difference whether or not the no-auto-compact option is checked.

I think you can use the Verify files to try a verification like you’re having fail under the automatic one. If you’re lucky enough to get the usual job log, it breaks down the individual pieces of a run (such as compact) but it also may be skipped on failed backup.

I have done this as well. The logs always stop after the unexpected difference error. Not very helpful.

Previous rule-of-thumb was 1 million blocks per backup, but more recent testing here says go a bit higher.

Currently, I use the default remote volume size of 50MB. Would you suggest a change? (The office has a dedicated fiber line, so connection speed and reliability are not an issue).