Expected recovery time 2.5 years

Disaster recovery of large backups: The original hard drive is destroyed, along with the Duplicati databases. There are several backups of different directories. The largest backup is 1.5 TB in size.

I have made several attempts to restore the largest backup. I have now set up a separate machine just for recovery. The backup files are now on a local SSD. I started the recovery from the command line. Below is an extract from the log file.

Duplicati takes about 40 minutes for each dblock. With nearly 31000 dblock files, that would be about 2.5 years until this step completes. I hope my hardware lasts that long.
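As a sanity check, the arithmetic behind that estimate (figures rounded from the post):

```python
# Back-of-the-envelope check of the 2.5-year estimate.
dblocks = 31_000          # approximate number of dblock files
minutes_per_dblock = 40   # observed processing time per file

total_minutes = dblocks * minutes_per_dblock
years = total_minutes / 60 / 24 / 365.25
print(f"{years:.2f} years")  # prints "2.36 years"
```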

Is there any way to speed up the process?

If I have to restart the computer during this time, is there a way to continue the recovery or does it start all over again?


2023-10-20 05:40:52 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-b4844e3b283af402dbafd2bebb7f72352.dblock.zip.aes (49,97 MB)
2023-10-20 06:17:56 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b3014411e1a42418d885639d7dad35431.dblock.zip.aes (49,99 MB)
2023-10-20 06:17:56 +02 - [Verbose-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-ProcessingBlocklistVolumes]: Pass 3 of 3, processing blocklist volume 18 of 30986
2023-10-20 06:18:04 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-b3014411e1a42418d885639d7dad35431.dblock.zip.aes (49,99 MB)
2023-10-20 06:55:29 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-bebfe15e488f44ab7aa51b23228c78702.dblock.zip.aes (49,96 MB)
2023-10-20 06:55:29 +02 - [Verbose-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-ProcessingBlocklistVolumes]: Pass 3 of 3, processing blocklist volume 19 of 30986
2023-10-20 06:55:37 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-bebfe15e488f44ab7aa51b23228c78702.dblock.zip.aes (49,96 MB)
2023-10-20 07:33:03 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b37a20674288e4c7f9a4b088104778032.dblock.zip.aes (49,91 MB)
2023-10-20 07:33:03 +02 - [Verbose-Duplicati.Library.Main.Operation.RecreateDatabaseHandler-ProcessingBlocklistVolumes]: Pass 3 of 3, processing blocklist volume 20 of 30986
2023-10-20 07:33:10 +02 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-b37a20674288e4c7f9a4b088104778032.dblock.zip.aes (49,91 MB)


There has been some work done to speed up this kind of situation. It has not yet landed in a released version, but you can download a temporary build here that you can install over your existing Duplicati:

It will not make the huge query itself any faster, but it should run the query only when needed. It is needed when you have badly damaged blocks on your backend, so if you have only one damaged block, it will run only once.

If you don’t have (and don’t want) a GitHub account, I can upload the build somewhere you can get it without a GitHub login. These temporary builds expire in 3 days.


Thanks for your fast solution. I will install it via GitHub.

Ugh. The dreaded pass 3 of database recreation. Your progress bar is probably at 90% at this point, but the remaining 10% is an exhaustive search for data that hasn’t been (and might never be) found.

We’ve been trying to find the cause of this, short of destination damage that we have no control over.

There’s also a Python script that quickly tries to predict whether you’ll have missing data after that long search. You’d need to grab the dlist and dindex files and make decrypted copies if the backup is encrypted.
The script just mimics the first 70% of a regular database recreate, which starts with the dlist and dindex files.
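I don’t have that script handy, but the core idea is roughly: collect the block hashes the dlist files reference, collect the hashes the dindex files account for, and diff the two sets. A minimal sketch (hash extraction from the actual files omitted; `find_missing_blocks` is a hypothetical name):

```python
def find_missing_blocks(required_hashes, indexed_hashes):
    """Return block hashes that the file lists (dlist) reference but
    no index file (dindex) accounts for. These are the blocks a
    database recreate would have to hunt for in the slow pass 3."""
    return set(required_hashes) - set(indexed_hashes)

# Toy data: three blocks needed, only two indexed.
needed = ["h1", "h2", "h3"]
indexed = ["h1", "h2"]
print(find_missing_blocks(needed, indexed))  # prints {'h3'}
```

An empty result would suggest the recreate can finish from dlist and dindex alone, without the exhaustive dblock search.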

Ideally they contain enough information to find everything, but if not, there are 3 passes of dblock search, where pass 3 is the big headache. The new build will probably get fast times on some blocks and slow times on others. A totally different, enhanced approach might work down a small list of missing blocks, but that’s more SQL.

Duplicati.CommandLine.RecoveryTool.exe is useful if damage is bad, but I don’t know that it’s faster:

This tool can be used in very specific situations, where you have to restore data from a corrupted backup. The procedure for recovering from this scenario is covered in Disaster Recovery.

This one is more robust partly because it ignores the index files (so bad ones don’t matter): it works directly from the dblock files (where the data actually lives) to build its own index and then do the restore.

What, if anything, looks busy or idle according to performance monitors? Going down an additional path at the same time would probably not have much impact on pass 3 performance, which will likely need a CPU core’s worth of use for SQLite but may leave the other CPU cores idle.

What OS is this? I know the Windows performance tools better. There’s also a Duplicati option to give SQLite a larger memory cache, which keeps it from thrashing as much in the database. The database should also be on the SSD (only the backup files were mentioned, but they’re not the only possible high-use files).
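For example, a hypothetical invocation (the `CUSTOMSQLITEOPTIONS_DUPLICATI` environment variable has been mentioned on this forum for passing SQLite pragmas, but verify it against your Duplicati version before relying on it):

```shell
# Give SQLite a larger page cache before launching the recreate.
# A negative cache_size value is in KiB, so this is roughly 200 MB.
export CUSTOMSQLITEOPTIONS_DUPLICATI="cache_size=-200000"
duplicati-cli repair <storage-url> --passphrase=<passphrase>
```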


Update: I’m currently trying to recover with the Duplicati build from @gpatel-fr. Pass 3 runs fine: almost 500 block files per hour as long as they are undamaged, instead of 40 minutes per block file. This is huge progress. It will take another 2 or 3 days to finish pass 3 and see whether this method is successful.

It is installed on Linux (Ubuntu). I learned from htop that Duplicati starts a lot of parallel processes, but pass 3 runs only a single process and all other CPU cores are idle. So it is not faster on a separate machine.


Yes, it should finish faster, that’s for sure. Note that this problem occurs when the backend is badly damaged, so it may be necessary to finish the job by removing (purging) the damaged blocks. This has come up before on this forum: the poster had to purge before getting to restore. But at least the database is recreated at that point, so the repair functions are available.
