Help restore after hard drive crash

Thank you all for the support - I managed to get the files.
Here is what I did:

After letting the “recreating database” phase run for 5+ days, I went back to the GUI and started the restore option all over again, and this time it recovered all the files within 24 hours - I guess because it had already created a database.

In my opinion, this program is not ready to be used as a backup program due to its lack of features and very time-consuming restore process.

Thanks for your comment - I guess that’s part of why Duplicati 2 is still in beta and canary versions. And I agree that there are definitely some scenarios (such as large backup sets) where using Duplicati is harder than we’d like it to be, but with input from users like you we can hopefully get those experiences improved.

The time it takes to rebuild the database varies, but I agree that it is not feasible to wait 5 days. I will give it some more thought and see if I can come up with a scheme that makes it faster to recreate the database.

Yes, this is what happens behind the scenes. If you just go “restore directly”, it will create the database but keep it temporary. If your browser session times out during this time, the temporary database is deleted, requiring you to start over, which is why I recommend doing the database rebuild on the command line if it takes a long time.
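
For reference, a command-line rebuild would look roughly like this (the storage URL and --dbpath value are placeholders for your own setup; if the file named by --dbpath does not exist, the repair command recreates it from the remote data):

    "C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe" repair <storage-url> --dbpath="C:\Temp\rebuilt.sqlite"

Because this runs in a terminal rather than a browser session, it can safely keep going for days if it has to.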

I have been doing this for a couple weeks now and it seems to be working well.

I created a separate task in Duplicati to back up the primary task’s database, exported it as a command line, and saved that to a batch file/script. Then I configured the post-backup option on the primary backup task to trigger the script.

On Windows you have to double up the percent characters (% becomes %%) in the batch file for the command line to work correctly.
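
In case it helps anyone setting this up, here is a minimal sketch of what the batch file might look like; the destination URL, database path, and passphrase are invented for illustration (the real line comes from your own exported command line):

    rem backup-db.bat - backs up the primary job's local database
    rem note the doubled %% so cmd.exe does not treat % as a variable reference
    "C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe" backup "file://E:\DbBackup" "C:\Users\me\AppData\Local\Duplicati\ABCDEFGHIJ.sqlite" --passphrase="s3cret%%pw" --backup-name="DB backup"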

To be honest, I would prefer an option to trigger another backup task via the web interface instead of running a separate command line/batch file. That way I could see the progress and stats of the database backup task right in the web GUI.

I wonder how “off” things would be if you just used a pre- or post-backup script to copy (or compress) the .sqlite DB to a different file that would be included in the backup.

The only drawback I see is that you’d be backing up a “one version stale” database, which would likely require a repair (but not a full recreate) if it were ever needed.
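
As a rough sketch of that idea - the --run-script-before option is real, but the paths here are invented - the script could be as simple as a one-line copy:

    rem copy-db.bat - attached to the main job via --run-script-before
    rem copies the previous run's database into a folder inside the backup source
    copy /Y "C:\Users\me\AppData\Local\Duplicati\ABCDEFGHIJ.sqlite" "D:\Data\duplicati-db-copy.sqlite"

Since the copy is taken before the backup runs, what lands in the backup is the previous run’s state, which is exactly the “one version stale” caveat above.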

There have been a few requests for something like this. I think it might make sense to add an additional file type that contains the non-redundant data for a quick database recreate, at the expense of more storage and upload data.

Would much space be saved if only 1 or 2 versions were kept in a “recovery” database?

Not sure, but I would probably store it in a different way, so that the database structure is clearer (i.e., records that can be inserted straight into the database).

I am not sure that versions contribute much, but I would need to investigate.

Hello, I am experiencing the same issue. I tried your repair step and got no output except a prompt for my encryption password. I have gotten no errors, the CPU is still running at 100%, and it has now been running for 4 weeks. Is this normal? What should I do? I still have not been able to recover my backup since January.

Hi, sorry for the late response. I had to wait several days even for the little data that I had - the CPU and hard drive were very active during this time. I have stopped using the program since this incident. Good luck.

Try the steps quoted above: restart the restore once the “recreating database” phase has finished (the second attempt completed within 24 hours because the database already existed), or, better, run the database rebuild on the command line so a browser timeout cannot delete the temporary database partway through.

I have a 2.3 TB backup set which works fine. I decided to back up my database and then try to break things: I started a backup and killed Duplicati about halfway through, basically simulating a power failure. When I tried restarting the backup, it recommended a database repair, which it reported as successful. The backup then failed again, suggesting another repair. Rinse and repeat. So I decided to recreate the database. It has been running for 12 days (since the latest release) and the database is still not even recreated to half of its original size. Waiting over a month to recreate the database is not feasible at all.

I can confirm this. It seems I will have to wait several weeks too. Especially the last 10% goes painstakingly slowly. Seeing that the whole procedure often doesn’t work, and that no one seems to care about it, makes me question the whole project.

Hi, I’m new to the backup game, fleeing from CrashPlan. If restoring 200 GB takes 5 days with Duplicati, would duplicity, restic, or duplicacy be faster, or is the limitation the decryption, which is bound by the CPU? I’m a novice; thank you for being patient with my ignorance.

The topic How Much Faster will Duplicati be on a Newer CPU? got into that a bit indirectly, seeming to say that with a modern computer-class (as opposed to, perhaps, embedded-NAS-class) processor, something else will be the limit. I’d comment that it’s always something (disk, connection, etc.), but you need to find out what…

I think most or all of the programs you mention are of the more current design that deduplicates, so all of them may have to piece files together from blocks when restoring. The downside of minimal upload and storage can be felt when it comes time to reassemble. If you find that these others have solved that, please let us know.

Comments on Duplicacy are here.

Benchmarking different storage providers is even older.

As noted, if you are in a drive-loss situation, Duplicati can get going slowly unless you backed up the local database (which some people do following the main backup). Not sure which scenario you had in mind…

If you’re in a huge hurry for a post-disaster restore, some backup vendors will send a drive with your data.

For my usage, I restored 200 GB with Duplicati within 2 hours from GDrive using a 1 Gbps up/down connection. I got slightly faster results doing this locally (two computers connected via 10 GbE cards): about 50 GB within 30 minutes.

I think as long as your CPU supports AES-NI, you are not bottlenecked by the CPU but rather by something else, regardless of the software you use.
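
One rough way to sanity-check that on your own machine (assuming OpenSSL is installed; Duplicati’s default encryption is AES-256) is to benchmark raw AES throughput and compare it against your network and disk speeds:

    openssl speed -evp aes-256-cbc

If the throughput reported there is far above your connection speed, decryption is unlikely to be the bottleneck.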

Thanks for this Ken! One question for you though: you said that you could include a particular folder at the end to restore it faster - but what’s the command for that? For example, if I want to restore a “Documents” folder that was stored on my D: drive, what would I type? I tried this command and it didn’t work: --include [D:\Documents]

My entire command line looked like this: "C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe" restore "file://D:\OneDrive\BACKUPXXXXX" --no-local-db --restore-path="D:\Temp" --include [D:\Documents]

Thank you for the help!

You need a trailing slash on the include, something like:

"C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe" restore "file://D:\OneDrive\BACKUPXXXXX" --no-local-db --restore-path="D:\Temp\RestoredDocuments" --include="D:\Documents\*"

Sometimes the Windows shell mangles the \* and treats the backslash as escaping the *, but otherwise it works.

Awesome - thank you!

I’m sorry to revive this old topic, but I have searched the forums and haven’t seen an update on this, so I decided to write.

Ken, I see you had been thinking some time ago about the possibility of storing the database within a distinct file type so that the entire backup set doesn’t need to be downloaded, decrypted, and parsed in order to find out what’s inside it. I thought your idea was brilliant. Is this what the “dlist” files do? I was wondering if maybe by just downloading the dlist files in the directory and assembling them, Duplicati could reconstruct the database.

I am new to Duplicati and really, really like what I have seen so far. This project is simply amazing! Although I must say that I can understand the concern of the others here who have expressed dismay about waiting 5 days or even multiple weeks just to get a database rebuilt. This is especially problematic when you live in one of the many, many countries around the world in which electricity is unreliable.

Thank you!

Database rebuilds should not take that long, especially with the most recent beta release. (Although there has been one report I recall where all files still needed to be downloaded.)

I have tested database recreations on all of my systems and the longest one takes about 15 minutes.

What version are you using and when was the last time you tested a database recreation?