Recurring error - "unexpected difference in fileset"

Hi all,

I would like to get some assistance with a recurring error that I am getting on a backup.

For some reason, out of the blue, I get the dreaded little red box when trying the backup (fileset ~200 MB, running nightly with no issues, keeping 60 versions).

“Unexpected difference in fileset…”
(see screenshot)

Trying to run again results in the same error, so I go to “Advanced”, then “Database”, and “Recreate (delete and repair)”.

After some time - not too long, as it’s not a huge dataset - it finishes. I get another red box, but looking at the error log it doesn’t look too bad. (see two screenshots)

Indeed, after this, “run” on the backup works and goes through with no issues… with the “last successful backup” timestamp updated.

BUT on the next attempted run, I get the same error message as originally - i.e. I haven’t progressed at all!

So I am outta ideas on what to do - any suggestions?

There are many threads about this issue. Usually database recreation is not the best way to solve it; instead, just delete the offending backup version.

Quick instructions:
Click on your backup set in the Web UI, click the Commandline link, then select “delete” from the Command dropdown. Empty the contents of the Commandline Arguments text box. Then scroll to the bottom and pick “version” from the Add Advanced Option dropdown. Enter the version number to delete and click the “Run delete command now” button.
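If you would rather run it from a terminal, a roughly equivalent invocation looks like the sketch below - the storage URL, version number, and database path are placeholders to substitute with your job’s values (the job’s Database page in the Web UI shows the real path):

    Duplicati.CommandLine.exe delete "<storage-url>" --version=3 --dbpath="<path-to-job-db>.sqlite"

Adding --dry-run first lets you see what would be removed without committing to it.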

What version of Duplicati are you using? I assume you are not running the latest beta (2.0.5.1_beta). We think the bug that causes this issue has been resolved. So far we have had no reports of people having this issue on 2.0.5.1. (Note that merely upgrading to 2.0.5.1 won’t resolve an existing “unexpected difference” issue - you still need to delete the offending version.)

“Unexpected difference in fileset” test case and code clue #3800 was the diagnosis of the bug.

Fix database issue where block was not recognized as being used #3801 was the code fix.

v2.0.4.22-2.0.4.22_canary_2019-06-30 was the first release that solved my repeat issue:

Fixed data corruption caused by compacting, thanks @ts678 and @warwickmm

Viewing the log files of a backup job shows how to check whether a compact ran before the error: look for CompactResults entries with their numbers. It might be less “out of the blue” than it seems. Not every compact will hit the problem, but the problem that was fixed was in the compact code.

v2.0.5.1-2.0.5.1_beta_2020-01-18 noted the fix in its release notes by naming its symptom:

Fix for “Unexpected difference in fileset”. #3800

Release notes for all releases are in the Releases category if you want to see fixes and bugs. Of course, not all bugs will make it onto the announcement topic, but regressions sometimes do.

The missing files on Recreate are bad news, and missing dblock files mean potential loss of data. Using the Command line tools from within the Graphical User Interface shows how to run the LIST-BROKEN-FILES command: change the Command dropdown and clear the Commandline arguments, as before. The PURGE-BROKEN-FILES command can then make things consistent again, but it can’t undo the loss.
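Outside the GUI, the same pair of commands can also be run from a terminal - a sketch, again with placeholder storage URL and database path:

    Duplicati.CommandLine.exe list-broken-files "<storage-url>" --dbpath="<path-to-job-db>.sqlite"
    Duplicati.CommandLine.exe purge-broken-files "<storage-url>" --dbpath="<path-to-job-db>.sqlite"

LIST-BROKEN-FILES only reports; PURGE-BROKEN-FILES rewrites the affected versions, so run the list first to see what you would lose.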

Sometimes people prefer to delete the database and the remote files, and start a fresh backup. Doing so on a release such as 2.0.5.1, which has had a lot of backup integrity fixes, is best.
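If you go that route, the sequence is roughly the sketch below (Windows paths, placeholder values; emptying the remote folder is done with whatever tools your storage backend provides):

    REM remove the local job database (the GUI's Database page shows its path)
    del "<path-to-job-db>.sqlite"

    REM after emptying the remote folder, run the first backup from scratch
    Duplicati.CommandLine.exe backup "<storage-url>" "C:\path\to\source" --dbpath="<path-to-job-db>.sqlite"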

Thanks to the step-by-step from drwtsn32. Solved it.

In the process I upgraded to 2.0.5.101, so hopefully I won’t get this problem in the future either.

Thanks all!

Note that in general I wouldn’t recommend running Canary releases for your main production systems as they get bleeding edge changes. That’s why I proposed 2.0.5.1 (beta). That being said, if you understand the risk and want to help test Canary, that’d be great. Thank you for your help!

More than happy to help. A lot of this is more for my own interest/hobby than serious production use (see the other thread where I mention I am doing a face-off between two different storage backends and Duplicati versus another backup system).

That being said, there are times when the hobby becomes a bit of a chore. I run two main groups of backups - a smaller one (a few hundred MB or so) and a larger one (a few hundred GB).

In the past, if I got a red error I couldn’t resolve, I might end up just deleting the whole job (and the database backend) and recreating the whole thing, but this is getting more and more painful for the larger backup.

It’s unfortunate that I can’t run the more stable version side by side with Canary.


Looking at prior reports, I see Canary was being used in June 2019. Maybe it hit the problem below:

dblock put retry corrupts dindex, using old dblock name for index – canary regression #3932

2019-09-30 18:21:34 -04 - [Error-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-MissingFileDetected]: Remote file referenced as duplicati-b53f2fc838c5e4d08b98125d606b36445.dblock.zip by duplicati-i1e699e9790774a88a06f9211e03711d6.dindex.zip, but not found in list, registering a missing remote file

looks a lot like the MissingFileDetected errors seen in the original post. The bug was in Canary from about v2.0.4.16-2.0.4.16_canary_2019-03-28 to v2.0.4.31-2.0.4.31_canary_2019-10-19. The release note said:

Fixed a retry error where uploaded dindex -files would reference non-existing dblock files, thanks @warwickmm

so that fix may well cover what hit you, but it would be hard to say without some deep research. Canary does have its risks, but those who can tolerate some risk and report issues early help keep bugs from moving into later channels like Beta.

I don’t know if you did that this time, but if you did not, the Recreate will probably have the same errors, assuming the problem is actually in the remote files. Newer Duplicati shouldn’t add any further errors…

You can (I think), but it’s awkward.

Duplicati.GUI.TrayIcon.exe is designed to allow multiple copies, probably mainly with multiple users on servers in mind, but one user can also run it multiple times if the databases are kept separate, for example by adding --server-datafolder to one instance to override its usual location in your profile’s AppData. One quirk is that the two somehow collide if opened in the same browser; the workaround is to use a second browser.
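A minimal sketch of how the two instances might be started on Windows - the Canary install location and the separate data folder here are example placeholders, not defaults:

    REM Beta instance, using its normal data folder under AppData
    "C:\Program Files\Duplicati 2\Duplicati.GUI.TrayIcon.exe"

    REM Canary instance, pointed at its own data folder so the databases stay separate
    "C:\DuplicatiCanary\Duplicati.GUI.TrayIcon.exe" --server-datafolder="C:\DuplicatiCanaryData"

And remember the quirk above: open the two web UIs in different browsers.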

The next challenge is that Canary updates need to be kept away from the Beta install, and I’m not sure that happens automatically just from being on a different channel. If not, setting AUTOUPDATER_Duplicati_SKIP_UPDATE=true before starting the Beta one installed from a .msi should keep it from picking up any Canary updates…
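In a Command Prompt that would look something like this (the Program Files path is the typical default install location; adjust to yours):

    set AUTOUPDATER_Duplicati_SKIP_UPDATE=true
    "C:\Program Files\Duplicati 2\Duplicati.GUI.TrayIcon.exe"

The variable only affects programs started from that same Command Prompt window.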

The Duplicati autoupdater puts updates in a different location, so what normally happens is that the Program Files copy starts, then launches the latest update as a child process. That means the Program Files install can launch Canary for one side of the test while running as itself (no child) for the Beta side. The Beta side won’t get the usual autoupdates, but Beta releases are not frequent anyway.

Sorry, I wasn’t too clear - I meant the real nuclear option where I deleted everything (the job in the listing, the database, and the backend storage files) - this always solves the problem, at the cost of the time needed for a from-the-top full backup.

I guess in the future I can poke around the forums a bit more for clues on what to do, but previously my troubleshooting basically went like this (the first two steps are sketched as commands after the list):

  • Try repair
  • Try delete and repair
  • Give up
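For reference, a rough command-line equivalent of those first two steps - a sketch with placeholder storage URL and database path:

    REM "Repair": reconcile the local database with the remote files
    Duplicati.CommandLine.exe repair "<storage-url>" --dbpath="<path-to-job-db>.sqlite"

    REM "Delete and repair": remove the database first; repair then recreates it from the remote
    del "<path-to-job-db>.sqlite"
    Duplicati.CommandLine.exe repair "<storage-url>" --dbpath="<path-to-job-db>.sqlite"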

Anyway, let’s see how Canary goes…

Unfortunately, I have to draw a line on how much awkward handling I can sustain for a hobby (wifey already complains I spend too much time on the computer :slight_smile: ) - I’ll just have to live with the bleeding edge of Canary.

Sounds reasonable, especially given your use of the backup. Without the semi-nuclear start-again, you might see some “missing remote file” complaints again. New Canary can’t fix damage if old Canary made some.