Restore testing with weird results

Hello,

First of all, I’m very happy with Duplicati and want to thank all contributors for making this awesome software!

I recently did a restore test, recovering a bunch of photos just as a test.
(I have a remote storage at Jottacloud.)

I did a restore and, although it took a while, I got all my files. Still, I received a bunch of error messages about being unable to find a specific file in the remote storage location. Should I be worried about this?

I mean, I got all my files back (checked for disk size and number of files, didn’t check each photo individually), but the errors are making me uncomfortable and concerned about the reliability.

Sadly I can’t show the exact error, because I reinstalled my PC and I’m currently recreating the database after the reinstall.

After this has finished I will retry a restore and see what happens.

Has anyone else experienced a similar issue?

Thanks in advance for the help.

Welcome to the forum @Kliko

That might not be a reliable test. Duplicati does do a verification of every restored file, I believe.
Errors on that should be cause for worry, and you should check whatever files were mentioned.
Actual error messages will help. Right now this is a guessing game.

How old is the backup, and what Duplicati do you use, e.g. Canary, or 2.0.4.5/23, 2.0.5.1 Beta?

Thank you for your quick reply.

Ok, good to know that my check is not reliable!
After I have rebuilt the database, I will try a recovery and check for errors.

It didn’t specifically give errors about a file/photo, but about a backup block (.aes).
Otherwise I would have checked the specific files that Duplicati threw an error on.

I’m running the latest beta version of Duplicati (I never use Canary) and I run a backup daily.

I’ll keep you posted!

Just tried a restore (of today’s backup version). Rebuilding the database had just finished, and there were no Duplicati updates between the database rebuild and the restore.

I have the following results.

Version:
"Version": "2.0.5.1 (2.0.5.1_beta_2020-01-18)",

On screen I get:
"Your files and folders have been restored successfully."
I think this is weird and shouldn’t be the case. Right? :face_with_raised_eyebrow:

Although I also get 3 errors:
2020-02-12 17:17:47 +01 - [Error-Duplicati.Library.Main.Operation.RestoreHandler-PatchingFailed]: Failed to patch with remote file: "duplicati-b1c7c8ecd19de49699a13ee8ddf45d4e5.dblock.zip.aes", message: Failed to decrypt data (invalid passphrase?): Invalid password or corrupted data

The other 2 errors are the same, only different files.

I can provide the complete log if needed.

I checked all files (photos) and didn’t find any missing or corrupted ones. Even hidden files like thumbs.db and Thumbnail.info are there.
Just to be sure I again checked the number of files and the size on disk, and those were also exactly the same (although this is not a reliable check :slight_smile: ).
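
If it helps for next time, here is the kind of stronger check I have in mind: a small Python sketch (my own, nothing built into Duplicati) that compares SHA-256 hashes of every file in the original and restored folders. The folder paths are just placeholders.

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """SHA-256 hex digest of a file, read in 1 MiB chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def tree_hashes(root: Path) -> dict:
    """Map each file's path (relative to root) to its SHA-256 digest."""
    return {str(p.relative_to(root)): sha256_of(p)
            for p in root.rglob("*") if p.is_file()}

# Placeholder folders -- adjust to the real source and restore locations.
source = tree_hashes(Path(r"D:\Photos"))
restored = tree_hashes(Path(r"E:\RestoreTest\Photos"))

for rel, digest in source.items():
    if rel not in restored:
        print("missing from restore:", rel)
    elif restored[rel] != digest:
        print("content differs:", rel)
for rel in restored:
    if rel not in source:
        print("extra file in restore:", rel)
```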

Any thoughts?

Seems like an incorrect status. I don’t know how the ending status gets passed to the web UI.

Unable to restore file to program folder #2003 is an open issue with the bad status as one part of it.

Using the Command line tools from within the Graphical User Interface to run The AFFECTED command can tell you what those files mean in terms of damage to the backup. You’d replace the Commandline arguments with duplicati-b1c7c8ecd19de49699a13ee8ddf45d4e5.dblock.zip.aes etc.
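
If the GUI route feels awkward, the same thing can be scripted. Below is only a rough sketch using Python’s subprocess to call Duplicati.CommandLine.exe; the install path, storage URL, and database path are placeholders you would fill in (the GUI Commandline screen fills these in for you, which is why it’s usually easier).

```python
import subprocess

# Placeholder values -- copy the real storage URL and dbpath from the
# GUI Commandline screen for your backup job.
result = subprocess.run(
    [
        r"C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe",
        "affected",
        "jottacloud://backup-folder?authid=PLACEHOLDER",
        "duplicati-b1c7c8ecd19de49699a13ee8ddf45d4e5.dblock.zip.aes",
        r"--dbpath=C:\path\to\local-database.sqlite",
    ],
    capture_output=True,
    text=True,
)
print(result.stdout)
```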

Inventory of files that are going to be corrupted shows the output from a lab demonstration that intentionally corrupts files. In your case, of course, files are not supposed to be corrupted, but it sometimes happens, especially before 2.0.5.1 and if you had used throttling to limit upload rate.

Are they all dblock files? Those are hard to replace. The dlist and dindex files can be recreated.
You can see what bad files look like, e.g. creation date. I’d feel worse if 2.0.5.1 made a bad one.
What storage type is the backup on? Sometimes they fail. You could also download one and try to open it.

AES Crypt can decrypt them, or if you prefer a CLI without an install, use Duplicati’s SharpAESCrypt.exe.
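
If scripting is easier than installing AES Crypt, a test decrypt can also be done with the third-party pyAesCrypt package, which reads the same AES Crypt file format. Just a sketch; the passphrase and the downloaded file name are placeholders.

```python
# pip install pyAesCrypt  -- third-party implementation of the AES Crypt format
import pyAesCrypt

volume = "duplicati-b1c7c8ecd19de49699a13ee8ddf45d4e5.dblock.zip.aes"  # downloaded copy
passphrase = "your-backup-passphrase"  # placeholder

try:
    # Decrypt to the same name minus ".aes"; 64 KiB buffer as in the pyAesCrypt docs.
    pyAesCrypt.decryptFile(volume, volume[:-4], passphrase, 64 * 1024)
    print("decrypted OK -- passphrase and file contents look fine")
except ValueError as err:
    # pyAesCrypt raises ValueError for a wrong password or corrupted data,
    # much like the message Duplicati showed.
    print("decryption failed:", err)
```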

Thank you for your elaborate answer. I will try to investigate further, hopefully sometime this weekend.
Duplicati is currently running a new backup, so I can’t do any other tests yet.

What I can answer right now is that I’ve never used upload throttling.

That gets one suspect out of the way. A quick glance at Jottacloud reports also looked clean (unlike some other storage types). Actually studying the damage could be done with a look in the database plus a close look at the file itself.
The database has the expected length and hash in the RemoteVolumeTable. The hash work is a bit more indirect…

--upload-verification-file can avoid viewing the DB, e.g. in DB Browser for SQLite, if that’s any easier.
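
If you do go the database route, here’s a rough sketch of the comparison I mean. It assumes the table is named Remotevolume with Name/Size/Hash columns and that the stored hash is a base64-encoded SHA-256, so adjust if your database differs; the paths are placeholders.

```python
import base64
import hashlib
import os
import sqlite3

db_path = r"C:\Users\you\AppData\Local\Duplicati\XXXXXXXXXX.sqlite"  # placeholder
volume = "duplicati-b1c7c8ecd19de49699a13ee8ddf45d4e5.dblock.zip.aes"
local_copy = volume  # placeholder path to a downloaded copy of the remote volume

# Assumed schema: a Remotevolume table with Name, Size, and Hash columns.
con = sqlite3.connect(db_path)
row = con.execute(
    "SELECT Size, Hash FROM Remotevolume WHERE Name = ?", (volume,)
).fetchone()
con.close()
if row is None:
    raise SystemExit("volume not found in the local database")
expected_size, expected_hash = row

actual_size = os.path.getsize(local_copy)
with open(local_copy, "rb") as f:
    actual_hash = base64.b64encode(hashlib.sha256(f.read()).digest()).decode()

print("size:", expected_size, "vs", actual_size)
print("hash:", expected_hash, "vs", actual_hash)
```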