How to recreate a backup job

#8

Thanks for your reply. I’m trying this out on Windows as it’s easier than messing about in FreeNAS.

I did a fresh install of Duplicati, created a backup of 100 files, and ran it successfully.

Then I uninstalled Duplicati and deleted everything in the Duplicati AppData folder.

Reinstalled Duplicati. This simulates where I am with my “real” backup system.

I have files in a backup, and no “job”.

So I ran the restore, and it restores a file successfully, but it doesn’t appear to create any .sqlite files, except “Duplicati-server.sqlite” in AppData, which gives me a “corrupt database” error if I use it.

#9

@Joe Sounds like you’re running the latest beta release. Try @JonMikelV’s approach …

#10

With the real remote server (B2), I recreated the job and ran it; it complained about unexpected files, so I ran the database repair.

It took 2 hrs to get to approx. 90% complete, and it’s been at 90% now for 19 hrs.

It appears to have crashed. Mono is using 100% CPU.

#11

Hey @Joe, sorry to hear about your troubles. What does the live log show? Have you tried killing the process from the GUI and running it from the command line?

mono Duplicati.CommandLine.exe repair <storage-url>

#12

Thanks for the reply. I chose “stop after current file” from the GUI, and nothing happened after 8 hrs overnight, so I chose “stop now”. Again nothing happened, so I restarted the jail.

There is an error in the log that says “error in worker thread”. Nothing else, unless there is another log somewhere, not just the “show log” page in the GUI?

I’ve started a “delete and recreate database” process this morning from the GUI and will report back later.

The backup is 186 GB, so not huge. It’s quite worrying that it can’t recreate the database. I wonder if I’d be able to restore files if required (!).

The devs should probably take this seriously and at least reply here.

#13

Sorry - somehow I missed your reply. You’re probably way past this, but no - you’ll still have to select your Source files since Duplicati by default doesn’t store a “copy” of the backup job anywhere.

As mentioned in the post below, there is a crash log stored in %LOCALAPPDATA%\Duplicati\crashlog.txt, and you can manually request a log file with the --log-file parameter; otherwise, all logs are stored in the .sqlite database files themselves (which is what the GUI is displaying).

I don’t know anything about jails so I don’t know where that crashlog.txt file would be. I do know that with Docker containers if I’m not careful I can end up storing logs and temp files in the container which often runs out of space - is that a possibility with a jail?
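If it helps, forcing a log into a known-writable location should sidestep the permissions question. A sketch, assuming the jail runs the server via mono; the path below is just an example, not a Duplicati default:

```shell
# Example only: point --log-file at any folder the Duplicati user can write to.
# --log-level controls verbosity (e.g. Information, Profiling).
mono Duplicati.Server.exe \
  --log-file=/var/db/duplicati/duplicati-server.log \
  --log-level=Information
```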

#14

I don’t think it can run out of space in a jail; I haven’t limited the amount of disk it can use.
I’ve searched and cannot find “crashlog.txt”, so it hasn’t been created. It’s probably trying to write somewhere it can’t.
It would be useful if logs were written to a “logs” folder next to the .sqlite and suppress_donation_messages.txt files, as that location definitely has the correct permissions.

Here are the errors I’m seeing:

2 Dec 2017 21:06: Error in worker
System.Threading.ThreadAbortException
at Duplicati.Server.Runner.Run (Duplicati.Server.Runner+IRunnerData data, System.Boolean fromQueue) [0x00712] in :0
at Duplicati.Server.Program+<>c.b__36_4 (Duplicati.Server.Runner+IRunnerData x) [0x00000] in :0
at Duplicati.Library.Utility.WorkerThread`1[Tx].Runner () [0x00191] in <1cb5198b00f34ae59d97ee7fe7a3a16c>:0

2 Dec 2017 21:06: Failed while executing “Backup” with id: 1
System.Threading.ThreadAbortException
at Duplicati.Library.Main.Controller.RunAction[T] (T result, System.String[]& paths, Duplicati.Library.Utility.IFilter& filter, System.Action`1[T] method) [0x0011a] in <118ad25945a24a3991f7b65e7a45ea1e>:0
at Duplicati.Library.Main.Controller.Backup (System.String[] inputsources, Duplicati.Library.Utility.IFilter filter) [0x00068] in <118ad25945a24a3991f7b65e7a45ea1e>:0
at Duplicati.Server.Runner.Run (Duplicati.Server.Runner+IRunnerData data, System.Boolean fromQueue) [0x00454] in :0

2 Dec 2017 21:03: Error in worker
System.Threading.ThreadAbortException
at Duplicati.Server.Runner.Run (Duplicati.Server.Runner+IRunnerData data, System.Boolean fromQueue) [0x00712] in :0
at Duplicati.Server.Program+<>c.b__36_4 (Duplicati.Server.Runner+IRunnerData x) [0x00000] in :0
at Duplicati.Library.Utility.WorkerThread`1[Tx].Runner () [0x00191] in <1cb5198b00f34ae59d97ee7fe7a3a16c>:0

2 Dec 2017 21:03: Failed while executing “Repair” with id: 2
System.Threading.ThreadAbortException
at Duplicati.Library.Main.Controller.RunAction[T] (T result, System.String[]& paths, Duplicati.Library.Utility.IFilter& filter, System.Action`1[T] method) [0x0011a] in <118ad25945a24a3991f7b65e7a45ea1e>:0
at Duplicati.Library.Main.Controller.RunAction[T] (T result, Duplicati.Library.Utility.IFilter& filter, System.Action`1[T] method) [0x00007] in <118ad25945a24a3991f7b65e7a45ea1e>:0
at Duplicati.Library.Main.Controller.Repair (Duplicati.Library.Utility.IFilter filter) [0x0001a] in <118ad25945a24a3991f7b65e7a45ea1e>:0
at Duplicati.Server.Runner.Run (Duplicati.Server.Runner+IRunnerData data, System.Boolean fromQueue) [0x004a9] in :0

Hopefully that means something to someone and this can be fixed…

I can abort the repair by clicking “stop after current file”; that seems to stop the repair, but I don’t want to do that. I need Duplicati to repair the database and proceed with backing up.

The only other option is to delete the remote files and start again, but unless this bug can be found, Duplicati cannot be relied upon, which is a shame :frowning:

Is there anything in any of the recent canary builds that may fix this?

I’m currently running 2.0.2.1_beta_2017-08-01

#15

Unfortunately when it comes to database repair (and potentially mono) errors I defer to others who have some experience with it. :frowning:

#16

OK Jon, thanks.

Anyone else know how to fix this?? It’s a fairly major issue!

#17

I appear to be able to restore files from the B2 backup. The “temporary” database creation succeeds and the file is restored.

Manual recreation fails. Hopefully this will help the devs find and fix this (as I can’t do a backup until it’s fixed).

Any idea how I can capture the temp database and use that instead? I couldn’t see where it was being created on disk; certainly not next to the other .sqlite database files in the config folder.

#18

@Joe, have you tried different versions of mono? Or recreating the FreeNAS jail altogether?

#19

I haven’t tried a different version of mono. The issues started when I had to reinstall Duplicati into a fresh jail.

I have 2 jobs; the other backs up the same files to a different location using FTP, and that database recreated OK.

It’s the B2 database recreate that won’t complete.

I may export the jobs and reinstall again to see if that’s any better.

#20

I tried a complete jail reinstall and run today; here is the error:

10 Dec 2017 15:25: Failed while executing “Repair” with id: 1

System.Threading.ThreadAbortException
at Duplicati.Library.Main.Controller.RunAction[T] (T result, System.String[]& paths, Duplicati.Library.Utility.IFilter& filter, System.Action`1[T] method) [0x0011a] in <118ad25945a24a3991f7b65e7a45ea1e>:0
at Duplicati.Library.Main.Controller.RunAction[T] (T result, Duplicati.Library.Utility.IFilter& filter, System.Action`1[T] method) [0x00007] in <118ad25945a24a3991f7b65e7a45ea1e>:0
at Duplicati.Library.Main.Controller.Repair (Duplicati.Library.Utility.IFilter filter) [0x0001a] in <118ad25945a24a3991f7b65e7a45ea1e>:0
at Duplicati.Server.Runner.Run (Duplicati.Server.Runner+IRunnerData data, System.Boolean fromQueue) [0x004a9] in :0

10 Dec 2017 15:25: Error in worker

System.Threading.ThreadAbortException
at <0x00000 + 0x00000>
at Duplicati.Server.Database.Connection.OverwriteAndUpdateDb[T] (System.Data.IDbTransaction transaction, System.String deleteSql, System.Object[] deleteArgs, System.Collections.Generic.IEnumerable`1[T] values, System.Boolean updateExisting) [0x00012] in <a6c0c2089b9a44ec9be5057a44f12116>:0
at Duplicati.Server.Database.Connection.RegisterNotification (Duplicati.Server.Serialization.NotificationType type, System.String title, System.String message, System.Exception ex, System.String backupid, System.String action, System.Func`3[T1,T2,TResult] conflicthandler) [0x000a8] in :0
at Duplicati.Server.Runner.UpdateMetadataError (Duplicati.Server.Serialization.Interface.IBackup backup, System.Exception ex) [0x000c6] in :0
at Duplicati.Server.Runner.Run (Duplicati.Server.Runner+IRunnerData data, System.Boolean fromQueue) [0x006fb] in :0
at Duplicati.Server.Program+<>c.b__36_4 (Duplicati.Server.Runner+IRunnerData x) [0x00000] in :0
at Duplicati.Library.Utility.WorkerThread`1[Tx].Runner () [0x00191] in <1cb5198b00f34ae59d97ee7fe7a3a16c>:0

#21

I am aware that some users report poor (read: ridiculous) performance when recreating the database. The performance depends on multiple factors, such as disk type, number of files, etc.

I have it as a priority item, but there are only so many hours in a day …

Should you need to get the files out, it is possible to restore data without building the database. There is a tool bundled with Duplicati called Duplicati.CommandLine.RecoveryTool.exe which will do it for you. There is also a Python implementation of it, should you want to restore without Mono installed.
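From memory, the flow is roughly the following; treat the exact arguments as approximate and check the tool’s built-in help (the URL and paths here just mirror the example further down):

```shell
# Sketch of the RecoveryTool flow (verify the options with --help):
# 1) download and decrypt all remote volumes to a local folder
mono Duplicati.CommandLine.RecoveryTool.exe download "b2://folder?authid=xyz" /tmp/recovery
# 2) build an index over the downloaded blocks
mono Duplicati.CommandLine.RecoveryTool.exe index /tmp/recovery
# 3) restore the files from the downloaded data
mono Duplicati.CommandLine.RecoveryTool.exe restore /tmp/recovery --targetpath=/tmp/restored
```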

All the errors mention ThreadAbortException, which usually happens if you force-abort the job. I assume this is what you did.

Can you try to do the database recreate from the command line?

It should be something like this:

mono Duplicati.CommandLine.exe repair b2://folder?authid=xyz --dbpath=/newdb.sqlite --verbose

This way it is easier to diagnose what goes wrong. You just need to make sure that --dbpath points to a non-existing file, and Duplicati will build the database in that file.

#22

OK, I’m running from the command line.

It has downloaded a bunch of files, and here is a snippet of the output:

Downloading file (62.53 KB) …
Downloading file (19.26 KB) …
Downloading file (88.67 KB) …
Downloading file (17.97 KB) …
Downloading file (18.09 KB) …
Downloading file (62.43 KB) …
Downloading file (18.23 KB) …
Downloading file (36.87 KB) …
Downloading file (33.76 KB) …
Downloading file (17.97 KB) …
Downloading file (34.00 KB) …
Processing required 5 blocklist volumes: duplicati-b19ac4d4f8fb04310ac683f3d78f1a59e.dblock.zip.aes, duplicati-b2b07d78f62fc421eac6edabe008b7cd0.dblock.zip.aes, duplicati-b40a708025aff44a6a3d6f01009fd83a7.dblock.zip.aes, duplicati-ba896a6d741804561975ccf4fa0673037.dblock.zip.aes, duplicati-bdcadd8c4aab84469bc8e9729f19421d2.dblock.zip.aes
Downloading file (1.61 MB) …
Downloading file (50.08 MB) …
Downloading file (49.99 MB) …
Downloading file (50.02 MB) …
Downloading file (50.07 MB) …
Processing all of the 3536 volumes for blocklists: duplicati-b000a9e62beba4cda9f9de9b4c063b477.dblock.zip.aes, duplicati-b004f9199370a45fe8705d2a1c579c787.dblock.zip.aes, duplicati-b008edb18c42e4510b5844a3377ef

etc., for a load of .zip.aes files; then it starts downloading again:

Downloading file (49.99 MB) …
Downloading file (49.98 MB) …
Downloading file (49.96 MB) …
Downloading file (49.93 MB) …
Downloading file (49.97 MB) …
Downloading file (49.99 MB) …
Downloading file (49.97 MB) …
Downloading file (49.99 MB) …
Downloading file (49.95 MB) …
Downloading file (49.91 MB) …
Downloading file (49.99 MB) …
Downloading file (49.99 MB) …
Downloading file (49.99 MB) …
Downloading file (49.99 MB) …
Downloading file (50.00 MB) …
Downloading file (49.94 MB) …
Downloading file (49.91 MB) …
Downloading file (49.99 MB) …
Downloading file (49.91 MB) …
Downloading file (49.94 MB) …

and it’s frozen up here. Mono is using varying amounts of CPU, from 90 to 100%, randomly.

#23

I stopped the process, and then this was displayed:

mono_os_mutex_lock: pthread_mutex_lock failed with “Invalid argument” (22)
Abort

Not sure if that’s because I stopped the process or not.

EDIT:

I’ve started this again, this time logging to a file, and will leave it running for a few days to see what happens.

#24

OK, I have checked what’s in the logs.

It’s currently doing this:

Processing all of the 3536 volumes for blocklists: duplicati-b000a9e62beba4cda9f9de9b4c063b477.dblock.zip.aes, duplicati-b004f9199370a45fe8705d2a1c579c787.dblock.zip.aes
etc.
etc., for a bunch of similar lines, then:

Downloading file (49.99 MB) …
Downloading file (49.98 MB) …
Downloading file (49.96 MB) …
Downloading file (49.93 MB) …
Downloading file (49.97 MB) …

etc… over and over.

It’s still logging; it appears to download a file every 5 mins or so, with CPU at 100%.

Above it said “Processing all of the 3536 volumes for blocklists”. Does that mean it is downloading 3536 files at ~50 MB each??? That’s about 176 GB, the whole backup.
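My rough maths, in case I’ve got it wrong:

```shell
# 3536 dblock volumes at roughly 50 MB each
echo "$((3536 * 50)) MB total"   # 176800 MB, i.e. roughly 176 GB
```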

#25

Yes, it is now downloading everything, because it is missing some information that it expected to find in the dindex files, but failed to find for some reason. Since there is no way of knowing which of the dblock files has that information, it just keeps running through them until it finds what it needs.

#26

I figured that’s what it must be doing. Why does it take such a long time to download and process the files? It’s going to take 12 days to download everything at its current speed.
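For reference, the 12-day figure is just one ~50 MB volume every ~5 minutes across all 3536 volumes:

```shell
# 3536 volumes x ~5 minutes per volume, converted to days
echo "$((3536 * 5 / 60 / 24)) days"   # about 12 days
```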

#27

Is there any way to instruct the process to retry files forever?

My internet is not that great at the moment, and the database rebuild keeps aborting when the connection drops…

Operation Get with file duplicati-b2923cce79af542119457b250e895f393.dblock.zip.aes attempt 1 of 5 failed with message: Error: NameResolutionFailure => Error: NameResolutionFailure
Downloading file (15.88 MB) …
Operation Get with file duplicati-b2923cce79af542119457b250e895f393.dblock.zip.aes attempt 2 of 5 failed with message: Error: ConnectFailure (Connection timed out) => Error: ConnectFailure (Connection timed out)
Downloading file (15.88 MB) …
Operation Get with file duplicati-b2923cce79af542119457b250e895f393.dblock.zip.aes attempt 3 of 5 failed with message: Error: ConnectFailure (Connection timed out) => Error: ConnectFailure (Connection timed out)
Downloading file (15.88 MB) …
Operation Get with file duplicati-b2923cce79af542119457b250e895f393.dblock.zip.aes attempt 4 of 5 failed with message: Error: ConnectFailure (Connection timed out) => Error: ConnectFailure (Connection timed out)
Downloading file (15.88 MB) …
Operation Get with file duplicati-b2923cce79af542119457b250e895f393.dblock.zip.aes attempt 5 of 5 failed with message: Error: ConnectFailure (Connection timed out) => Error: ConnectFailure (Connection timed out)
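For what it’s worth, I can’t find a “retry forever” flag, but raising the advanced retry options should ride out short drops. A sketch, reusing the repair command from earlier in the thread; the values are guesses, not recommendations:

```shell
# --number-of-retries (default 5) and --retry-delay (default 10s) are
# standard Duplicati advanced options; the values here are arbitrary.
mono Duplicati.CommandLine.exe repair "b2://folder?authid=xyz" \
  --dbpath=/newdb.sqlite \
  --number-of-retries=50 \
  --retry-delay=30s
```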
