Select the job, click ‘Command Line’, select ‘Test’ in the ‘Command’ combo, replace everything in the command line arguments with ‘all’ (without the quotes), then add the advanced option ‘full-block-verification’ and check it.
Note that if you do not set ‘all’ in the command line arguments, Duplicati will by default only check 3 remote files; that is fast, but not exactly reliable on its own.
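(If you prefer a terminal over the GUI, the equivalent should be something like the line below; the storage URL is a placeholder for your actual destination, and your usual passphrase option still applies.)
$ duplicati-cli test <storage-url> all --full-block-verification=true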
It was very fast, which seems strange. It returned the usual error
ErrorID: FilesetDifferences
Unexpected difference in fileset version 8: 29.09.2023 13:00:00 (database id: 5), found 317826 entries, but expected 317891
Return code: 100
Profiling: 2023-11-04 Test full block verification PROFILING - Google Drive
Au temps pour moi (my mistake). Excuse my French.
It turns out that the ‘test’ function does check the database. And it’s not even possible to disable this verification. Sigh, it doesn’t make any sense to me.
Well, if I am the @#!# interim maintainer of this project, I might as well take advantage of it so as not to appear completely hopeless, yes?
Here is a build of current Duplicati with the capability to bypass this check added:
So if you install this build, you can restart the test in the same way, but also add the disable-filelist-consistency-checks advanced option, set it to true, and it will bypass the check and (hopefully) run.
Maybe I should not say it - I feel like the bad Milou tempting poor Captain Haddock with the bottle - but if you add this very same option to your backup job, it will bypass the check and the backup should run. However, it would be bad from a consistency point of view, I hasten to say.
The backup code puts it as below. Would you really want your backup based on an insane database?
Especially when altering data based on records, correct records are essential. The argument is weaker for diagnostic tools, but it’s the old question of whether one stops at the first error, or continues to find others, possibly stumbling a bit because the records were bad. If you run test, you’ll see. Other paths do exist.
Thank you all a lot for your help.
If I wanted that, I would probably be more insane than said database.
It might be end-of-the-day fatigue, but I don’t understand how to install a new build from the link you provided. Very sorry about that…
(start venting) The more I try to fix things, the more I realise I have no idea what I’m doing with my backups at all; I’m just using tools that I have no idea how they work, and at the first problem I’m completely lost. I’m tempted to switch my backup account to an FTP backup protocol, use FreeFileSync to clone my computer online, and forget everything I once attempted to learn about incremental backups. (end venting)
It’s possible that you don’t have a GitHub account. For some reason GitHub doesn’t enable anonymous access to build results by default. I can move it to a public spot.
I’ve been trying to understand how Duplicati works for about one year (on and off). For that, I use an assortment of compilers, debuggers, editors, and many other tools, but I have no idea how these tools work either. Do you really understand how the electric current powering your computer(s) is produced? How your phone works, and how it was produced?
There’s a .deb file in build-results-linux. Unfortunately it’s not a link unless you’re logged into GitHub. It looks exactly the same until you mouse over it or try to click it. I don’t know why GitHub does that…
I’m not sure what progress has been made in finding an alternative download spot, so while we wait…
There was previously an idea of you running a Python script from me to check a dlist file against the DB. Simpler (and more sure for us), and probably only a little bit slower for you, is to download all the dlist files (10) and all the dindex files (11706, but if Cyberduck can sort by name to multi-select, that’d do). That’s less than a GB in total. I installed Linux Mint 21.2 yesterday, and it already came with python3:
$ python3 --version
Python 3.10.12
$ time python3 checker13.py ~/DuplicatiBackups/test1
2 blocks seen in dindex files
0 blocklists seen in dindex files
0 blocks used by dlist files blocklists
0 blocks used by dlist large files data
1 blocks used by dlist small files data
1 blocks used by dlist files metadata
0 blocks unused
0 large blocksets in dlist files
1 small blocksets in dlist files
small file blocksets that are also metadata blocksets: set()
small file blocksets that are also blocklists: set()
real 0m0.157s
user 0m0.069s
sys 0m0.017s
$
All your numbers will be bigger than those, but the good result is seeing no lines about Missing items.
This is just a rough sanity check for smooth database recreate. Actually doing a recreate can be slow, especially past 90% on the GUI progress bar when it downloads all its dblocks in hope of finding data. Trying extra hard may take extra long. The Python script just tries to forecast if a problem is expected.
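For the curious, here is roughly the shape of such a pre-check. This is a stripped-down sketch, not the actual checker13.py, and the file-format details (dindex zips holding vol/* entries with a JSON “blocks” array, dlist zips holding a filelist.json whose entries reference block hashes) are my reading of the destination layout, so treat it as illustrative only:
#!/usr/bin/env python3
# Sketch of a dlist-vs-dindex cross-check (NOT the real checker13.py).
# Usage: python3 sketch.py <folder containing downloaded dlist/dindex zips>
# Assumes unencrypted .zip files and the usual Duplicati layout.
import json
import sys
import zipfile
from pathlib import Path

folder = Path(sys.argv[1])

# Every block hash that some dindex file says is stored in a dblock.
stored = set()
for dindex in folder.glob("*.dindex.zip"):
    with zipfile.ZipFile(dindex) as z:
        for name in z.namelist():
            if name.startswith("vol/"):
                vol = json.loads(z.read(name))
                stored.update(b["hash"] for b in vol.get("blocks", []))

# Every block hash that some dlist file expects to find remotely.
expected = set()
for dlist in folder.glob("*.dlist.zip"):
    with zipfile.ZipFile(dlist) as z:
        for entry in json.loads(z.read("filelist.json")):
            blocklists = entry.get("blocklists") or []
            if blocklists:
                # Large file: its blocklists are themselves blocks.
                expected.update(blocklists)
            elif entry.get("hash"):
                # Small file: the blockset hash is the single block hash.
                expected.add(entry["hash"])
            if entry.get("metahash"):
                # Metadata is normally a single block.
                expected.add(entry["metahash"])

missing = expected - stored
print(len(stored), "blocks seen in dindex files")
print(len(missing), "missing items")
for h in sorted(missing):
    print("Missing:", h)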
You could also just save your current database (in case recreate fails or must be killed), then recreate. Renaming the database will disable the Recreate button (nothing there to delete…), so it’s just Repair.
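If you go that route, the save itself is just a file copy while Duplicati is not running; the filename below is a placeholder, since each job’s database has its own randomly-named .sqlite file (the exact path is shown on the job’s Database page):
$ cp ~/.config/Duplicati/CKOAXKXNLM.sqlite ~/CKOAXKXNLM.sqlite.bak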
If the database recreate works, then normal Duplicati seems more likely to be able to do the desired deep test, to be extra sure that things are OK. Unfortunately, sometimes harmless oddities get flagged.
Alternatively, postpone the deep test (which can be slow), and if recreate seems clean and backup has nothing to complain about, then you’ll get the backup update that you’re doubtlessly wishing would run.
Seeking feedback from you and @gpatel-fr on the path forward. A deep test on the test build would likely add some info, possibly identifying a discrepancy between database and destination. The Python script might find the destination internally consistent, which sort of implies that the database has been damaged (which we already suspect, though the details aren’t clear). If recreate actually runs, we can get another DB bug report for comparison.
If the database recreate gets into long dblock downloads, then the destination was damaged, not just the database.
To be candid, I don’t understand what you are expecting from me; I don’t get what the ‘path’ is. Could you be more explicit? Thanks.
The way forward from here, given diverse goals such as time (including time with no new backup), information gathering to understand the problem, safety in terms of checking all’s good, and so on.
I was pretty explicit about options already, and don’t want to repeat the whole post. Any questions?
EDIT:
At one time I think you proposed a deep test, then a recreate. Deep test hit a snag of check failure, followed by snag of GitHub login. An all-Duplicati option is recreate first, test later. To reduce risk of recreate getting in trouble, pre-test with the Python script can be done, then recreate, then backup, followed by deep test. That possibly gives less direct data than bypassing GitHub, but might be fast, allowing getting back into a routine while we continue to try to figure out what went wrong, and how.
I don’t think a simple recreate will be successful, because those 65 broken files will probably still have entries in their dlist volumes. It will just download all the volumes, find nothing and then fail.
It’s a lot of guesswork without any actual testing, but I guess the theory is that the destination is truly damaged for some backups, maybe all except the first and last, meaning required dblock/dindex files are gone. Assuming the dindex files still describe their dblocks, the quick pre-test with the script could verify such a problem. Omitting some dlist files from the download area could also help confirm which versions are still OK.
EDIT 1:
I would probably just kill it if it starts that. The maybe slightly slow test with Duplicati is to try the recreate, avoiding worries about an unknown Python script and the chore of manually downloading those files. Circling back to where this topic started, except hoping that Swift will get past an initial list this time.
EDIT 2:
The repair-as-recreate actually can do a specific version, I think. Maybe it can do multiple. This would possibly permit recovery of the maybe-good first and last versions of the backup, but I’m not sure how unused blocks (if any) are handled. Perhaps the ideal would be to consider them waste to be compacted.
Direct restore from backup files is a way to test versions one at a time, but each attempt may take a while.
EDIT 3:
I had previously tried to test, against the database only, an alternative theory that the missing files left behind blocklists without a user, or blocks without a user. If I did the SQL right, the tests found no such excess material.
Just now I tested whether, in the Block table, the remote volume reference count was as expected (11706). It was, which leans towards your theory of loss; however, I don’t know of a good way to prove it without a look, either by Duplicati or something manual. Having the DB bug report is great, but its use only goes so far.
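A rough version of that reference-count check, with the schema details from memory (Block.VolumeID pointing at Remotevolume.ID, dblock volumes having Type = 'Blocks'), so a sketch rather than gospel:
#!/usr/bin/env python3
# Sketch of the reference-count check against a copy of the job database.
# Usage: python3 refcount.py <path to job .sqlite file>
import sqlite3
import sys

con = sqlite3.connect(sys.argv[1])
referenced = con.execute(
    "SELECT COUNT(DISTINCT VolumeID) FROM Block").fetchone()[0]
known = con.execute(
    "SELECT COUNT(*) FROM Remotevolume WHERE Type = 'Blocks'").fetchone()[0]
print(referenced, "volumes referenced by blocks;", known, "dblock volumes known")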
I have rebuilt and exported here:
To install, unzip and use apt (sudo apt install …/duplicati_2.0.7.3-1_all.deb).
This version can be used without a GitHub login.
Well, testing for that beats thinking about it.
The version I have built includes the change accelerating this process.
… so back on the prior track, and @titilesoleil can either read or totally ignore my diversion while waiting. The intention was to offer some alternatives that might lead to faster information, and a return to normalcy…
I’m pleased that a nicer build solution was devised, compared to copying the file to some public cloud.
Thanks all!
Hello all,
I hope that you won’t be disappointed in my course of action, but I was getting a bit lost in all the suggestions, so I decided to attempt a new database recreation, hoping that whatever did not work the first time would maybe work now. If this attempt did not work, I would still have the backup and could go back to where we were, so I thought it was worth a try.
And… it worked! The recreation completed, and then I was able to make a backup without fatal errors. I don’t know if it’s reliable though, as I understood from the many things that I learned in this thread (thanks) that when the local database is deleted, Duplicati assumes that the remote backup is fine, and if it’s not then, well, Schadenfreude.
Sooo, after the backup was completed I performed the following command from the GUI tool:
TEST
all
--full-remote-verification
--full-block-verification
It’s been downloading and checking stuff since yesterday, but according to the green bar progression it should complete today.
If this command completes without error, does it mean that everything is fine ?
Already done. Here are the results, which I’m not sure what they mean.
duplicati-i65e21c89978948a780b1bde730945fa7.dindex.zip: 1 errors
Extra: WWBvU7gF3IAQXQdkYtW7VfqIeS78G50LMOBgx9JPRl8=
duplicati-i67c638cbc73c4fd89817b6b1a7686a2d.dindex.zip: 1 errors
Extra: 6+8xkpHzsHYgkJiXDBQsXG0nYk63VAfN4Z2I8pwAYf0=
duplicati-i77cd37af3c01421781ed01373822b3cd.dindex.zip: 1 errors
Extra: hiw3Zpir/vf1iywXxAjacSKchKvHzlU3KLjcAczgrGE=
duplicati-i93a412f43ca848e5a7be53721053fbd6.dindex.zip: 1 errors
Extra: NlAuaIZv1cdrWWfa6nzxgZa9kaszhq7XdovbanX+A7c=
duplicati-i9f5dd8c3f0694dbe90f1c8f9e5e1cb08.dindex.zip: 2 errors
Extra: Dqz/CCV6eKp2hBdhEN+zPxsYJx3qaSMfPmaRQUa8Mks=
Extra: Fki48mXs03u8FksUvHmiGLin/xe5MVh1BcQ5aNTC8mA=
duplicati-ibcbb6783c9d842b7830912db970cc251.dindex.zip: 1 errors
Extra: qiXpeARtaA74dA2Dfm3lvB4qLcYInb2hASVEtTjVP2U=
duplicati-ibd4f149ffa6a4777b22dfd36ab03575e.dindex.zip: 2 errors
Extra: lOlzAqfmd/ssJctATEfiwpWnCVUsv0N+yHWoJvM7roc=
Extra: nsSJE0zvJfnRV4Z01PgoaGSgFE0PCl8dnxEJHp//UL4=
duplicati-ic450fbd21fba444bad5abc6650769dfb.dindex.zip: 1 errors
Extra: Ot5qOy11oibza9U05fRammYkUzSzZ4SjNYsXVw6613Q=
duplicati-ic735b48c995e4390ad9017299763eb30.dindex.zip: 1 errors
Extra: 6MPql7BCFgN/PaYRfjmKErCTMi8Vcw63MBecXUB9DIA=
duplicati-icae0f4af5b0944328bb6c23c917c2d24.dindex.zip: 1 errors
Extra: H3N3s4Pl0uKGGaL4l+NcTNkXxF/hG6ehkbcZm3xV4q4=
duplicati-ice998ae5dcaa4267b5242a6a8c7bb1ff.dindex.zip: 1 errors
Extra: mT1rG2iXGshGrLcAL5/0cLtCtdi2mpWIvVQzY25saik=
duplicati-id13ed4ad86ab4aea8d43ee5dc20b9d47.dindex.zip: 2 errors
Extra: S8oXtGXmT74A1Fg8KhEW2zaJJsZfsL8+x5OGmVcd4dg=
Extra: xR06qbz851jksu5VWRJLqgNyHLeB31xGXoT8kIzm/SQ=
duplicati-id1a07527554d465689c3354d62ce8887.dindex.zip: 1 errors
Extra: X6KvlcMeE7ib5EKPe7yxb4TjuUPfDkT3m7icV75jF7M=
duplicati-id64adb4228ac4df6b76adfcca90fbd87.dindex.zip: 1 errors
Extra: YqElVF6b5aObbvFpw3m8WJc7ZZvwsiVP4xElFsjOAjI=
duplicati-ie21014469ccd44bf979fdfbc8c57388a.dindex.zip: 1 errors
Extra: PCTA1W1/aYV8PYBaR4413pXeEEzT0hCnxjw+4Yztfvo=
duplicati-ief1f4ed59bfb4f1e8fa7de0ded8dc846.dindex.zip: 2 errors
Extra: 3mFJp1femBuNqLU6MYQXAysfUYrTQ+cGur0kVC9YX5U=
Extra: jt3NVO8t7w+lr4QpWBDDPCkKAw58THsHUrYLp0S8CSw=
duplicati-ifbb7bc15c09e420eb6fcd2a94a06445a.dindex.zip: 1 errors
Extra: bqM5oL2lljJEV9vOkct8BKwk0sIhWnnwP9dcHErCKJ4=
duplicati-b093ddaa3c6d04d239e1051609fab162d.dblock.zip: 1 errors
Extra: PCTA1W1/aYV8PYBaR4413pXeEEzT0hCnxjw+4Yztfvo=
duplicati-b20e8459f3cd240bd86ffd30114d968d4.dblock.zip: 2 errors
Extra: 3mFJp1femBuNqLU6MYQXAysfUYrTQ+cGur0kVC9YX5U=
Extra: jt3NVO8t7w+lr4QpWBDDPCkKAw58THsHUrYLp0S8CSw=
duplicati-b21b4276853194de78e9e5341b8878b02.dblock.zip: 1 errors
Extra: 6+8xkpHzsHYgkJiXDBQsXG0nYk63VAfN4Z2I8pwAYf0=
duplicati-b40cc6e82b92846e2a83b03b980d85501.dblock.zip: 2 errors
Extra: lOlzAqfmd/ssJctATEfiwpWnCVUsv0N+yHWoJvM7roc=
Extra: nsSJE0zvJfnRV4Z01PgoaGSgFE0PCl8dnxEJHp//UL4=
duplicati-b48403b9f51534d62bba567f5f178af86.dblock.zip: 1 errors
Extra: H3N3s4Pl0uKGGaL4l+NcTNkXxF/hG6ehkbcZm3xV4q4=
duplicati-b521699ae52da462f9147746af35503e7.dblock.zip: 2 errors
Extra: S8oXtGXmT74A1Fg8KhEW2zaJJsZfsL8+x5OGmVcd4dg=
Extra: xR06qbz851jksu5VWRJLqgNyHLeB31xGXoT8kIzm/SQ=
duplicati-b5b8119f043b14c04bc6c8d3a1d61288d.dblock.zip: 1 errors
Extra: mT1rG2iXGshGrLcAL5/0cLtCtdi2mpWIvVQzY25saik=
duplicati-b63cfcfb14f0e4476b9e80582bead185c.dblock.zip: 2 errors
Extra: Dqz/CCV6eKp2hBdhEN+zPxsYJx3qaSMfPmaRQUa8Mks=
Extra: Fki48mXs03u8FksUvHmiGLin/xe5MVh1BcQ5aNTC8mA=
duplicati-b6b4f89a58f7c45ec9dbd57c50644b25d.dblock.zip: 1 errors
Extra: bqM5oL2lljJEV9vOkct8BKwk0sIhWnnwP9dcHErCKJ4=
duplicati-b71a96fff45654cdbb6c6751367d44d7a.dblock.zip: 1 errors
Extra: 6MPql7BCFgN/PaYRfjmKErCTMi8Vcw63MBecXUB9DIA=
duplicati-b92206f0de0274d35be83e88f4b346d5b.dblock.zip: 1 errors
Extra: NlAuaIZv1cdrWWfa6nzxgZa9kaszhq7XdovbanX+A7c=
duplicati-b940ba82bb0034ed784ed1d32bb80adb7.dblock.zip: 1 errors
Extra: qiXpeARtaA74dA2Dfm3lvB4qLcYInb2hASVEtTjVP2U=
duplicati-b98ae9719a556448eb2a1874bbd097efb.dblock.zip: 1 errors
Extra: X6KvlcMeE7ib5EKPe7yxb4TjuUPfDkT3m7icV75jF7M=
duplicati-ba2c9a7973f434f92815b475bc5f03c6a.dblock.zip: 1 errors
Extra: Ot5qOy11oibza9U05fRammYkUzSzZ4SjNYsXVw6613Q=
duplicati-ba8c9bd98e3054c08a740f1f7c681b091.dblock.zip: 1 errors
Extra: WWBvU7gF3IAQXQdkYtW7VfqIeS78G50LMOBgx9JPRl8=
duplicati-bb6925d190d90421b88a576531c1e6454.dblock.zip: 1 errors
Extra: YqElVF6b5aObbvFpw3m8WJc7ZZvwsiVP4xElFsjOAjI=
duplicati-bc142744f1f9c4bc28146dd758419c0da.dblock.zip: 1 errors
Extra: hiw3Zpir/vf1iywXxAjacSKchKvHzlU3KLjcAczgrGE=
Return code: 3
Great!
Not so great for analysis. Did you save a copy of the database right after the recreate?
That could become another bug report and help to isolate the problem in the other DB.
There might still be some value in doing that now, but it makes the hard search harder.
“Extra” happens because the recycling may add a new block instead of reusing the old one.
“Extra” from this appears in identical pairs, and your listing is that way; I re-sorted it.
These are thought to be harmless. You can probably find them in the DuplicateBlock table if you want to look. The recreate would notice them, and would put the blocks there.
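If you do want to look, a query along these lines should list them (again with the schema from memory: DuplicateBlock.BlockID referencing Block.ID), and the printed hashes should match the “Extra” values above:
#!/usr/bin/env python3
# Sketch: list hashes recorded in DuplicateBlock, to compare against
# the "Extra" lines from the test output. Run on a copy of the database.
import sqlite3
import sys

con = sqlite3.connect(sys.argv[1])
rows = con.execute(
    "SELECT b.Hash FROM DuplicateBlock d JOIN Block b ON b.ID = d.BlockID")
for (h,) in rows:
    print("Duplicate:", h)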
No, sorry, I haven’t thought about that. Is there any use in copying it now, before doing a new backup with it?
Does it mean that I can set up my automatic backups again and go back to being anxious about other random daily things?
Might as well just do a bug report. If you had old DB, it could be temporarily reinserted just for bug report (making sure nothing else runs), but there’s always some risk of mistake when rearranging the database.
Could you expand on the text below? Does that mean you got warnings of something, but it completed?
There isn’t a huge list of checking tools for manual use. I’d predict Repair and list-broken-files will both be happy, but you can certainly try them. Doing actual restores occasionally is also much advised.
Here are the full logs:
Database repair
Backup after repair
Test with full remote and full block verifications