Help restore after hard drive crash

Hi, sorry for the late response. I had to wait several days for the little data that I had; the CPU and hard drive were very active during this time. I have stopped using the program since this incident. Good luck.


After letting the “recreating database” phase run for 5+ days, I went back to the GUI and started the restore option all over again, and this time it recovered all the files within 24 hours; I guess because it had already created a database.

Yes, this is what happens behind the scenes. If you just go “restore directly”, it will create the database but keep it temporary. If your browser session times out during this time, it will delete the temporary database, requiring you to start over, which is why I recommend doing the database rebuild on the command line if it takes a long time.
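
If it comes to that, a command-line rebuild would look roughly like this (the install path, storage URL, and passphrase below are just placeholders to adapt to your own setup; if you omit --dbpath, Duplicati should use the default database location for the job):

“C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe” repair "<storage-url>" --passphrase=<your-passphrase> --dbpath="<path-to-local-database>.sqlite"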


I have a 2.3 TB backup set which works fine. I decided to back up my database and try to break it. I started a backup and killed Duplicati about halfway through, basically simulating a power failure. I tried restarting the backup; it recommended a database repair, which it said was successful. The backup failed again, suggesting a repair. Rinse and repeat. I decided to recreate the database. It has been running for 12 days (since the latest release) and the database is still not even recreated to half of its original size. Waiting over a month to recreate the database is not feasible at all.


I can confirm this. It seems I will have to wait several weeks too. Especially the last 10% goes painstakingly slowly. Seeing that the whole procedure often doesn’t work, and that no one seems to care about it, makes me question the whole project.

Hi, I’m new to the backup game, fleeing from CrashPlan. If restoring 200 GB takes 5 days with Duplicati, would duplicity, restic, or duplicacy be faster, or is the limitation the decryption, which is bound by the CPU? I’m a novice; thank you for being patient with my ignorance.

How Much Faster will Duplicati be on a Newer CPU? got into that a bit indirectly, seeming to say that with a modern computer-class (as opposed to, perhaps, embedded-NAS-class) processor, something else will be the limit. I commented there that it’s always something (disk, connection, etc.), but you need to find out what…

I think most or all of the programs you mention are of the more current design that tries to deduplicate, so maybe all of them will have to piece files together from blocks. The downside of minimal upload and store can be felt when it comes time to reassemble. If you find that these others solved that, please let us know.

Comments on Duplicacy are here.

Benchmarking different storage providers is even older.

As noted, if you are in a drive-loss situation, Duplicati can be slow to get going unless you backed up the local database (which some people do following the main backup). Not sure which scenario you had in mind…
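
One low-tech way to do that (a sketch only; the paths are examples and your database file name will differ) is to add --run-script-after to the backup job, pointing it at a small batch file that copies the job’s .sqlite database to a second location, for example a batch file containing:

copy "C:\Users\<you>\AppData\Local\Duplicati\<job-database>.sqlite" "E:\DuplicatiDbCopy\" /Y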

If you’re in a huge hurry for a post-disaster restore, some backup vendors will send a drive with your data.

For my usage, I restored 200 GB with Duplicati within 2 hours from GDrive using a 1 Gbps up/down connection. I got slightly faster results when doing this locally (two computers connected via 10 GbE cards): about 50 GB within 30 minutes.

I think as long as your CPU supports AES-NI, you are not bottlenecked by the CPU but rather by something else, regardless of the software you use.

Thanks for this, Ken! One question for you though: you said that you could include a particular folder at the end to restore it faster, but what’s the command for that? For example, if I want to restore a “Documents” folder that was stored on my D: drive, what would I type? I tried this command and it didn’t work: --include [D:\Documents]

My entire command line prompt looked like this: "C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe" restore "file://D:\OneDrive\BACKUPXXXXX" --no-local-db --restore-path="D:\Temp" --include [D:\Documents]

Thank you for the help!

You need a trailing separator and wildcard on the include, something like:

"C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe" restore "file://D:\OneDrive\BACKUPXXXXX" --no-local-db --restore-path="D:\Temp\RestoredDocuments" --include="D:\Documents\*"

Sometimes Windows messes up the \* and thinks it should escape the *, but otherwise it works.


Awesome - thank you!

I’m sorry to revive this old topic, but I have searched the forums and haven’t seen an update on this, so I decided to write.

Ken, I see you had been thinking some time ago about the possibility of storing the database within a distinct file type so that the entire backup set doesn’t need to be downloaded, decrypted, and parsed in order to find out what’s inside it. I thought your idea was brilliant. Is this what the “dlist” files do? I was wondering if maybe by just downloading the dlist files in the directory and assembling them, Duplicati could reconstruct the database.

I am new to Duplicati and really, really like what I have seen so far. This project is simply amazing! I must say, though, that I can understand the concern of the others here who have expressed dismay about waiting 5 days or even multiple weeks just to get a database rebuilt. This is especially problematic when you live in one of the many, many countries around the world in which electricity is unreliable.

Thank you!

Database rebuilds should not take that long, especially with the most recent beta release. (Although there has been one report I recall where all files still needed to be downloaded.)

I have tested database recreations on all of my systems and the longest one takes about 15 minutes.

What version are you using and when was the last time you tested a database recreation?

This is helpful info - thank you! The timeframe I described came from earlier reports within this thread, not from my own testing, and so I wrote to find out if there had been any progress toward making it possible to download just the index information necessary to recreate the database. I figured that including an update for this thread would be helpful for others with a similar question in the future.

So, in Ken’s comment, he spoke of the need for creating a smaller file that the system could use to rebuild the database if the database hadn’t been manually backed up along with the block files. What I was wondering was whether that need had ever been addressed–if so, I’d be curious to know how. E.g., what was the change you referenced as having been implemented in the latest beta?

There has not been any progress on a more efficient database rebuild format, but there have been numerous improvements to the rebuild process.
Part of the very long restore times was due to some data being incorrectly stored/described, causing Duplicati to fall back to the failsafe method of simply downloading every dblock file in an attempt to locate data that was not there.
This issue has been fixed, along with some performance improvements to the database logic.

The dlist files are just a list of what files are in each snapshot (i.e. each backup). Files larger than --blocksize have a “description block” that resides in a dblock file. You need both the dlist file and the “description blocks” before you can recreate the database.

To make database rebuilding faster, these “description blocks” are replicated into the dindex files, which also contain a list of what is inside the dblock files. This makes it possible to recreate the entire local database using only the dlist and dindex files.

In case of errors or missing files, it is also possible to fully ignore the dindex files and only use the dlist and dblock files, but at the cost of downloading all the larger dblock files.
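
To make the three file types concrete, a listing of the backup destination typically looks something like this (names are illustrative only; the timestamp and the random-looking identifiers will differ, and the .aes suffix appears only when encryption is enabled):

duplicati-20230101T120000Z.dlist.zip.aes
duplicati-b1a2b3c4d5....dblock.zip.aes
duplicati-i1a2b3c4d5....dindex.zip.aes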

Edit: There is also a tool that performs this non-database dependent restore, meant to work even if some of the backup files have gone missing:

The last one is included in the Duplicati installation; the first one can be downloaded from the link.
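
For reference, the recovery tool that ships with the installer is Duplicati.CommandLine.RecoveryTool.exe. Roughly, the sequence is to download the remote files, build an index, and then restore, along these lines (I am going from memory, so treat the option names as approximate, use placeholders that match your setup, and check the tool’s built-in help for the exact syntax):

Duplicati.CommandLine.RecoveryTool.exe download "<storage-url>" "D:\Temp\recovery" --passphrase=<your-passphrase>
Duplicati.CommandLine.RecoveryTool.exe index "D:\Temp\recovery"
Duplicati.CommandLine.RecoveryTool.exe restore "D:\Temp\recovery" --targetpath="D:\Temp\RestoredFiles"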

Thank you very much for this explanation. Very helpful.

I have tested some restores, both with and without the database or even the configuration files, both across the local network and from a remote storage provider (80 Mbps download speed), and I can testify that restore times were fast. I am using 100 MB blocks. Looks like this is wrapped up nicely.

Thanks again.