Fail to repair database

I’m having a problem with Duplicati failing to do a database repair.
I just reinstalled my PC and moved from Windows 8 to Windows 10.
I exported my backups but did not keep the database.
On the Windows 10 system I did “import from a file” and then backup -> database -> repair.
One backup ran and finished, but the other ran for more than 8 hours and ended with:

Duplicati.Library.Interface.UserInformationException: The database was attempted repaired, but the repair did not complete. This database may be incomplete and the backup process cannot continue. You may delete the local database and attempt to repair it again.

Running a “Recreate” did not change a thing.

I back up to a NAS on my home network. Both the PC and the NAS are connected by wire.
The source is 170 GB with 22K files. I have a backup with 3 versions.
I’ve installed 2.0.2.19 and it did not help.

What can be done to rebuild the database so I can run the backup?
I will be happy to send log files that can help.

Zohar

I had a similar issue last year.
Try deleting and recreating using the GUI, and if that doesn’t work then try from the command line:

mono Duplicati.CommandLine.exe repair b2://folder?authid=xyz --dbpath=/newdb.sqlite --verbose

Note the above is using B2 storage, you’ll need to change that to whatever you’re using. Refer to the help on the command line here: duplicati/Duplicati/CommandLine/help.txt at master · duplicati/duplicati · GitHub

--dbpath=/newdb.sqlite can be any database name, but NOT the same as any that already exists. (You can rename the DB later if you wish, and edit the DB used for the job in the job settings.)
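
For example, once the repair finishes, something like this would let you swap the new file into place (a rough Windows sketch with made-up names; newdb.sqlite and MEDIA-repaired.sqlite are placeholders, and Duplicati normally keeps job databases under %LOCALAPPDATA%\Duplicati, but check the job’s Database screen for the real location):

move "C:\newdb.sqlite" "%LOCALAPPDATA%\Duplicati\MEDIA-repaired.sqlite"

Then point the job at the renamed file in the job’s Database settings.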

Hi @zohar, welcome to the forum!

Sorry to hear about the database repair issue you’re having - hopefully @Joe’s GUI-based suggestion will work.

If it doesn’t and you decide to try his command line suggestion, you’ll probably want to adjust the sample Linux / Mac based command a little bit to run on Windows. Probably something more like this:
"C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe" repair b2://folder?authid=xyz --dbpath=/newdb.sqlite --verbose

Please let us know how it goes for you!

Did not help :frowning:

I ran the command:
Duplicati.CommandLine.exe repair \\ROUTER-MAIN\data\Backups\Duplicati\media --dbpath=E:\Users\Duplicati\Photo.sqlite --verbose --no-encryption

It then ran for more than 10 hours. From the log file and Task Manager it looks like the program copied all of my 150 GB of dblock.zip files to my PC. According to Task Manager, the speed of the copy process was very jumpy and not constant (when I copy these files to the PC myself I get a constant copy speed).
The network traffic showed up under “Antimalware Service Executable” and not under Duplicati.

When I came home, the end of the log was:

Recreate completed, verifying the database consistency

System.IO.InvalidDataException: Found inconsistency in the following files while validating database:
E:\Users\L\Docs\CBT\BDI.doc, actual size 245760, dbsize 0, blocksetid: 4024
E:\Users\L\Docs\CBT\boys questionaires-1.doc, actual size 114688, dbsize 0, blocksetid: 4026
E:\Users\L\Docs\CBT\pynoos developmental model of ptsd(9-24.pdf, actual size 209277, dbsize 0, blocksetid: 4035
E:\Users\L\Docs\CBT\Documents\pynoos developmental model of ptsd(9-24.pdf, actual size 209277, dbsize 0, blocksetid: 4035
E:\Users\L\Docs\CBT\Documents\pynoos developmental model of ptsd.pdf, actual size 209277, dbsize 0, blocksetid: 4035
... and 29933 more. Run repair to fix it.
   at Duplicati.Library.Main.Database.LocalDatabase.VerifyConsistency(IDbTransaction transaction, Int64 blocksize, Int64 hashsize, Boolean verifyfilelists)
   at Duplicati.Library.Main.Operation.RecreateDatabaseHandler.DoRun(LocalDatabase dbparent, Boolean updating, IFilter filter, NumberedFilterFilelistDelegate filelistfilter, BlockVolumePostProcessor blockprocessor)
   at Duplicati.Library.Main.Operation.RecreateDatabaseHandler.Run(String path, IFilter filter, NumberedFilterFilelistDelegate filelistfilter, BlockVolumePostProcessor blockprocessor)
   at Duplicati.Library.Main.Operation.RepairHandler.RunRepairLocal(IFilter filter)
   at Duplicati.Library.Main.Operation.RepairHandler.Run(IFilter filter)
   at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)
   at Duplicati.Library.Main.Controller.Repair(IFilter filter)
   at Duplicati.CommandLine.Commands.Repair(TextWriter outwriter, Action`1 setup, List`1 args, Dictionary`2 options, IFilter filter)
   at Duplicati.CommandLine.Program.RunCommandLine(TextWriter outwriter, TextWriter errwriter, Action`1 setup, String[] args)

This log is very strange. The backup I’m trying to repair uses a source folder of “E:\Media”.
But the log talks about files from “E:\Users”. Those files are part of my other backup (the one that finished its database repair without any problem). That backup probably has ~29933 files.
The backup files of that second job are stored in another folder: \\ROUTER-MAIN\data\Backups\Duplicati\users

So what’s going on here? What am I missing??

I eventually gave up on database recreation. It was taking 21 days to run and then failing for a different reason each time.
Duplicati has issues with recreating databases; it needed to download EVERYTHING from the remote server in my case, and that was 250 GB.

I logged in to B2 and deleted all the files, then deleted the database file locally and started the upload process again. Then I added a new job that uploads the database file to the remote right after the original backup completes.

I wish Duplicati would upload the database file to the remote location automatically after a backup completes. That way, recreating it locally would, in most cases, no longer be required.
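
In the meantime you can roughly approximate it yourself. Here is a minimal sketch assuming a file/SMB destination like the NAS earlier in this thread (copy-db.bat, the ABCDEFGHIJ.sqlite name, and the db-copy folder are all placeholders you would adjust; for B2, a small second backup job like the one I described is probably the simpler route):

rem copy-db.bat - hypothetical helper that copies the job database next to the backup files
copy /Y "%LOCALAPPDATA%\Duplicati\ABCDEFGHIJ.sqlite" "\\ROUTER-MAIN\data\Backups\Duplicati\db-copy\"

Then attach it to the backup job with the advanced option --run-script-after=C:\scripts\copy-db.bat so it runs each time the backup finishes.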


I’m having the same issue and tried the suggestions here, but have had no luck and haven’t been able to back up my machine. Any suggestions on how I should proceed?

Welcome to the forum @Cassandra

How “same” is it? The original post, or just way too slow? “Too slow” is fixed in at least some situations, but the fix isn’t in a beta yet. Recent canary releases have been pretty good and possibly approaching a beta; however, canary always has the potential for surprises along with a large load of new fixes and features.

If yours is a speed problem and you have a second system not currently running Duplicati, you could see whether it can do a small direct restore without taking forever. That does a DB recreate, but a “partial, temporary” one, which means it can hint at whether a full Recreate would be quick enough (or reveal a problem), and it will probably take less time.
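
If a second system isn’t handy, you could also try the same sort of test from the command line. A rough sketch, reusing the NAS path from earlier in this thread (swap in your own storage URL; the file pattern and restore folder are placeholders, and add --passphrase if the backup is encrypted). The --no-local-db option should make the restore build its own temporary database instead of using the job’s:

"C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe" restore "\\ROUTER-MAIN\data\Backups\Duplicati\media" "*somefile.jpg" --no-local-db --restore-path="C:\temp\restore-test" --no-encryption

If that finishes in a reasonable time, it hints that a full Recreate may be workable too.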

For me, the beta version had problems. Only upgrading to the canary version solved the problem:
v2.0.4.34-2.0.4.34_canary_2019-11-05

The beta version is simply not stable, almost unusable.

I would recommend backporting/cherry-picking some bug fixes from canary, which has proven to work better (like the slow database rebuild fix).

This would actually have been a great cherry-pick fix because it’s one line in one file. I think others vary.

There has been process talk, including making beta updates smoother than when v2.0.4.23-2.0.4.23_beta_2019-07-14 came out (the version number confused people, who thought the code was more advanced than lower-numbered releases that were actually canary builds), but I’m not sure the threshold for an urgent update was discussed.

There’s a “help wanted” sign out for a release manager (and pretty much everything else), because complex release mechanics take work. Even finishing betas is hard, and we’re now stuck with code a year old due to (IMO) a process where canary keeps taking changes instead of getting very selective before a beta.

Discussion: release cycle basically says what I said, as part of a larger conversation. A very long topic (long due to expansion of scope) exists in the Developer category if you’d really like to get into the process discussion.

I think many people want more frequent beta releases in general, and that might avoid the need for backports. Alternatively, maybe backporting is the only way to separate the important-and-working fixes from the unknown ones.
