Is there any way to fix an "unexpected difference in fileset" error other than deleting?


#21

Regardless of how the recreate got started (it really shouldn’t be doing it automatically), recreate performance is a known issue.

Much of it seems to be related to inefficient SQL checking the database for already existing records. While I found some time to build some metrics for my specific test case, I haven’t gotten around to actually testing any fixes. :frowning:

What generally seems to happen is the progress bar moves along until about 75%, then seems to stop. Duplicati is actually still working, but each percent of progress takes progressively longer to finish the further along the process is.

As for recreate vs. starting over, it MIGHT be faster to start over, but you’ll likely use upload bandwidth re-pushing the files. You’ll also have a break in the history of previous backups.

One thing I’ve recommended to people who want to start over is to point the job at a new destination folder and leave the “broken” one in place. If needed, a direct restore can still be done from the “broken” destination.

That way you’re not left with NO backup while the new one is “filling”. And of course you can keep the “broken” destination around for as long as you care about the history it might contain.
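In case it’s useful, here’s roughly what a direct restore from the “broken” destination looks like with the command-line tool. This is just a sketch from memory, so double-check the option names against the docs; the destination URL, restore path, and passphrase below are placeholders for your own values:

    # Restore everything ("*") straight from the destination files,
    # building a temporary database instead of using the local one
    Duplicati.CommandLine.exe restore "file:///path/to/broken-destination" "*" \
      --restore-path="/path/to/restore/here" \
      --no-local-db \
      --passphrase="<your passphrase>"

The GUI can do much the same thing via “Restore” → “Direct restore from backup files”, which is probably easier if you only need it once.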


#22

This is just a backup to a local (external USB) drive, so bandwidth is not an issue. At the current rate, it looks like it is pretty stable at 90 days left to recreate the database. From your comment, it sounds like if anything it will slow down rather than speed up as it makes progress, so starting fresh would be much faster (I think the first backup took about two weeks). Since this database issue has happened several times to me, though, I’m starting to think Duplicati just isn’t workable for my current setup and I should just use rsync or rdiff-backup until Duplicati is more stable/more efficient.


#23

I too have started getting this error on one of my backup sets. I’ve not changed anything in that configuration, AFAIK.

2.0.4.5 (2.0.4.5_beta_2018-11-28)

There are 16 versions of this backup set.
It runs daily.
The most recent successful run was on March 2nd.
The fileset from version 4 (2/24) has the unexpected difference in the number of entries.

Given that this is a moderately large fileset (57GB), I’d love to know what you’d suggest doing to repair it.

 Failed: Unexpected difference in fileset version 4: 2/24/2019 4:03:11 AM (database id: 100), found 6180 entries, but expected 6181
    Details: Duplicati.Library.Interface.UserInformationException: Unexpected difference in fileset version 4: 2/24/2019 4:03:11 AM (database id: 100), found 6180 entries, but expected 6181
      at Duplicati.Library.Main.Database.LocalDatabase.VerifyConsistency (System.Int64 blocksize, System.Int64 hashsize, System.Boolean verifyfilelists, System.Data.IDbTransaction transaction) [0x00370] in <c6c6871f516b48f59d88f9d731c3ea4d>:0