Help restore after hard drive crash

The SSD in my laptop crashed - I put in a new SSD and reinstalled the operating system and Duplicati.
I downloaded the backup files from the cloud and copied them directly to the C: drive (about 200GB).
I pointed the Duplicati GUI to restore from a local folder (where the files had been downloaded from the cloud), gave it the encryption key, and began the restoration process.
For about 5 days Duplicati was slowly progressing through “Running task: Recreating database”, and finally one day it must have finished that part - it was back to the normal screen WITHOUT having restored the files!!

Please help - I have very important files stuck in there. What do I do next?

Did you get to the part where you could list and select the files you wanted to restore?

What version of Duplicati, OS etc.?

There are a few ways to work with it, either with a local database, or without a local database.

If you recreate the local database, you can actually continue making backups after you have restored the files, and you can also make incremental restores etc.

To restore the local database for something as large as what you have, I recommend restoring on the commandline, as that is easier to follow in terms of progress and simple steps.

Simply run this command in a commandline prompt:

"C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe" repair "file://c:\where_data_is" --dbpath="C:\newdb.sqlite" --log-level=profiling --no-encryption

If you have an encrypted copy, remove the --no-encryption part.
Unfortunately, this could run for 5 days again :cry:, but hopefully some of the wait time is cut when the UI is not running.

After the repair has completed, you will now have a database that can be used for restores, similar to this:

"C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe" restore "file://c:\where_data_is" --dbpath="C:\newdb.sqlite" --restore-path="C:\restore_target" --no-encryption

If you prefer not waiting for 5 days, you can try a partial restore, with something like this:

"C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe" restore "file://c:\where_data_is" --no-local-db --restore-path="C:\restore_target" --no-encryption

This should regenerate just enough of the database to recover the last version of your files. If you have a particular folder that you want restored “faster”, you can add it at the end of the command, and the database should only be built for the files you add (wildcards also work).
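For example, to restore just one folder, append it as a filter (the folder path here is a made-up placeholder - substitute your own):

```shell
"C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe" restore "file://c:\where_data_is" --no-local-db --restore-path="C:\restore_target" --no-encryption "C:\Users\me\Documents\*"
```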

Finally, there is the option to not use a database at all, which may or may not be faster. To do this, use the recovery tool as explained here:

Since you have already downloaded the files, you may be able to skip step 1, unless they are encrypted.
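From memory, the remaining steps look roughly like this - this is a sketch, so verify the exact verbs against the tool’s own help output before relying on it:

```shell
REM Sketch only - check "Duplicati.RecoveryTool.exe help" for the exact syntax.
REM The download step is skipped because the files are already in C:\where_data_is.
"C:\Program Files\Duplicati 2\Duplicati.RecoveryTool.exe" index "C:\where_data_is"
"C:\Program Files\Duplicati 2\Duplicati.RecoveryTool.exe" restore "C:\where_data_is"
```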


This might be a good start for #howto

Also, since the archive files are already downloaded, how careful does one need to be to not continue backups into the local file set instead of the remote one?

This sounds like a good reason to have a separate backup task that backs up just the database from the main backup task. Wouldn’t that help in cases like this?


Yes and no.

Yes in that the database restored from your backup would speed up your data restore, but no in that the actual restore would be more difficult due to the additional steps of:

  1. Recreating the database backup job
  2. Restoring the database
  3. Pointing your recreated data backup job (which would have a new database name) to the restored database

What would work really well for a scenario like this would be if the above steps could be automated as part of the Duplicati installer. “Oh crap, my drive died - that’s OK, I’ll just install Duplicati which will ask if I need to restore from an existing backup with (or without) a backed up database.”

This would also work well for the Restore as fast+easy as possible (on new machine)... with db? How? scenario of “I’ve been kidnapped and the ransom requires my backups to be restored” (less likely, but also less depressing than “I’m dead and my tech averse family needs their data back”).

Yeah I figured those additional steps would be needed. But it doesn’t seem so bad. I’d much rather do that than wait possibly DAYS for a local database to be rebuilt! I am thinking of doing local db backups for my larger backups (one is 375GB and another 520GB).

Since having both a local AND remote backup is “the best” solution, it would make sense to leverage each other’s destination. In other words:

  1. Have remote backup run and include local backup’s database
  2. Have local backup run and include remote backup’s database

Of course if your local backup works for your restore, then you don’t really need the remote backup database… but at least this way you’re better covered if the local backup is only partially restorable due to something silly like, oh I don’t know - let’s say your dog’s giant wagging tail knocked the USB drive on the floor. :slight_smile:

I’m on Windows 10 64-bit with an SSD and an Intel i3 CPU.

Here is the procedure I followed:

I installed Duplicati, went to restore, chose “directly restore from configuration”, pointed it to the folder on the drive where I had saved the files from the cloud, typed in the encryption key, and then it started “recreating database - fetching path information”.

After some time it asked me to choose the files I wanted to restore - I selected the important files, chose restore, and got “recreating database - building partial temporary database…”

It made progress very slowly, but eventually after 5 days it finally came to an end - and it never restored any files, it just went back to the home screen!

So the 5 days spent recreating the database were a waste? Can I not use that to continue restoring the files? I’m just scared to start a new process and lose what has already been done.

I’m not worried about doing any backups at the moment, I just need to restore the files. Am I doing something wrong?

Sorry for the delay - hopefully you’ve already gotten this resolved, but in case you haven’t…

I tried a “Directly restore from configuration” process and noticed that in step 2 (Encryption) of the restore, some of my Advanced Options had too many dashes in front of them. Did you happen to notice this at all during your process?


Part of the difficulty described above comes from using Duplicati itself to back up the databases. Since there is no need to save versions (I think), wouldn’t it be better to simply use another tool to save a copy of the [\AppData\Local\Duplicati] folder? Something like FreeFileSync, for example? The FreeFileSync run could be scheduled for soon after Duplicati.

Thank you all for the support - I managed to get the files.
Here is what i did:

After letting the “recreating database” phase run for 5+ days, I went back to the GUI and started the restore all over again, and this time it recovered all the files within 24 hours - I guess because it had already created a database.

In my opinion, this program is not ready to be used as a backup program, due to its lack of features and a very time-consuming restore process.

Thanks for your comment - I guess that’s part of why Duplicati 2 is still in beta and canary versions. And I agree that there are definitely some scenarios (such as large backup sets) where using Duplicati is harder than we’d like it to be, but with input from users like you we can hopefully get those experiences improved.

The time it takes to rebuild the database varies. But I agree that it is not feasible to wait 5 days. I will think again and see if I can come up with some scheme that makes it faster to restore a database.

Yes, this is what happens behind the scenes. If you just go “restore directly”, it will create the database but keep it temporary. If your browser session times out during this time, it will delete the temporary database requiring you to start over, which is why I recommend doing the database rebuild on the commandline if it takes a long time.

I have been doing this for a couple weeks now and it seems to be working well.

I created the separate task in Duplicati to back up the primary task’s database, exported the command line, and saved to a batch file/script. Then the post-backup option was configured on the primary backup task to trigger the script.

On the Windows platform you have to double up the percentage characters for the command line to work correctly.
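As a sketch, the wrapper batch file ends up looking something like this (the destination, database path, and job name are all made up for illustration; the point is that any literal percent sign in the exported command must be doubled inside a .bat file):

```shell
@echo off
REM Hypothetical wrapper that backs up the main job's database.
REM In a .bat file a literal % must be written %% - e.g. a URL-encoded
REM space that exports as %20 becomes %%20 here.
"C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe" backup "file://d:\db-backups" "C:\Users\me\AppData\Local\Duplicati\ABCDEFGHIJ.sqlite"
```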

I would prefer an option to trigger another backup task via the web interface process instead of running the separate command line/batch file to be honest. This way I could see the progress and stats of the database backup task right in the web GUI.

I wonder how “off” things would be if you just used a pre or post script run to copy (or compress) the .sqlite DB to a different file that would be included in the backup.

The only drawback I see is you’d be backing up a “one version stale” database, which would likely require a repair (but not a full recreate) if it were ever needed.
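A minimal sketch of such a copy step (POSIX shell, with made-up paths; on Windows the same idea works as a batch file):

```shell
#!/bin/sh
# Sketch of a pre/post-backup step: copy the job database into a folder
# that the main backup includes, so the copy rides along with the data.
# The paths are hypothetical; the real database lives under
# ~/.config/Duplicati (or %LOCALAPPDATA%\Duplicati on Windows).
copy_db() {
    db="$1"    # Duplicati job database (.sqlite)
    dest="$2"  # folder inside the backup source set
    mkdir -p "$dest" && cp "$db" "$dest/$(basename "$db")"
}

# Demo with a throwaway file standing in for the real database:
tmp=$(mktemp -d)
echo "stub" > "$tmp/job.sqlite"
copy_db "$tmp/job.sqlite" "$tmp/extras"
ls "$tmp/extras"
```

Copying an SQLite file while the job is writing to it could catch it mid-write, so triggering this after the job finishes (e.g. via a run-after script option) seems like the safer moment.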

There have been a few requests for something like this. I think it might make sense to add an additional file type that contains the non-redundant data for a quick database recreate, at the expense of more storage and upload data.

Would much space be saved if only 1 or 2 versions were kept in a “recovery” database?

Not sure, but I would probably store it in a different way, such that the database structure is more clear (i.e. just paste records into the database).

I am not sure that versions contribute to much, but I would need to investigate.

Hello, I am experiencing the same issue. I tried your repair step. I got no output except for it asking for my encryption password, and I have gotten no errors. The CPU is still running at 100%. It has been running for 4 weeks now. Is this normal? What should I do? I still have not been able to recover my backup since January.
