Restore testing with weird results

Hello,

First of all, I’m very happy with Duplicati and want to thank all contributors for making this awesome software!

I recently did a restore test: I wanted to recover a bunch of photos, just as a test.
(I have a remote storage at Jottacloud.)

I did a restore and, although it took a while, I got all my files. Still, I received a bunch of error messages about being unable to find a specific file in the remote storage location. Should I be worried about this?

I mean, I got all my files back (checked for disk size and number of files, didn’t check each photo individually), but the errors are making me uncomfortable and concerned about the reliability.

Sadly I can’t show the exact error, because I reinstalled my PC and I’m currently recreating the database because of the reinstall.

After this has finished I will retry a restore and see what happens.

Does anyone else experience a similar issue?

Thanks in advance for the help.

Welcome to the forum @Kliko

That might not be a reliable test. Duplicati does do a verification of every restored file, I believe.
Errors on that should be cause for worry, and you should check whatever files were mentioned.
Actual error messages will help. Right now this is a guessing game.

How old is the backup, and which Duplicati do you use, e.g. Canary, or 2.0.4.5/23, 2.0.5.1 Beta?

Thank you for your quick reply.

Ok, good to know that my check is not reliable!
After I have rebuilt the database, I will try a recovery and check for errors.

It didn’t specifically give errors for a file/photo, but for a back-up block (.aes).
Otherwise I would have checked the specific files that Duplicati threw an error on.

I’m running the latest beta version of Duplicati (I never use Canary) and I run a back-up daily.

I’ll keep you posted!

Just tried a restore, as of today. Rebuilding the database had just finished, and there were no Duplicati updates between rebuilding the database and the restore.

I have the following results.

Version:
"Version": "2.0.5.1 (2.0.5.1_beta_2020-01-18)",

On screen I get:
"Your files and folders have been restored successfully."
I think this is weird and shouldn’t be the case. Right? :face_with_raised_eyebrow:

Although I also get 3 errors:
2020-02-12 17:17:47 +01 - [Error-Duplicati.Library.Main.Operation.RestoreHandler-PatchingFailed]: Failed to patch with remote file: "duplicati-b1c7c8ecd19de49699a13ee8ddf45d4e5.dblock.zip.aes", message: Failed to decrypt data (invalid passphrase?): Invalid password or corrupted data

The other 2 errors are the same, only different files.

I can provide the complete log if needed.

I checked all files (photos) and I didn’t miss any or find corrupted ones. Even hidden files like thumbs.db and Thumbnail.info are there.
Just to be sure, I again checked the number of files and the size on disk, and those were also exactly the same. (Although this is not a reliable check :slight_smile: )

Any thoughts?

Seems like an incorrect status. I don’t know how the ending status is passed to the web UI, but the message is:

Unable to restore file to program folder #2003 is an open issue with the bad status as one part of it.

Using the Command line tools from within the Graphical User Interface to run The AFFECTED command can tell you what those files mean in terms of damage to the backup. You’d set the Commandline arguments to duplicati-b1c7c8ecd19de49699a13ee8ddf45d4e5.dblock.zip.aes etc.
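
For reference, a roughly equivalent standalone run would look like the line below (a sketch: the jottacloud:// URL shape and the authid value are placeholders for whatever your actual Target URL is):

Duplicati.CommandLine.exe affected "jottacloud://Backup/Photos?authid=PLACEHOLDER" duplicati-b1c7c8ecd19de49699a13ee8ddf45d4e5.dblock.zip.aes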

Inventory of files that are going to be corrupted shows the output from a lab demonstration that intentionally corrupts files. In your case, of course, files are not supposed to be corrupted, but it sometimes happens, especially before 2.0.5.1 and if you had used throttling to limit upload rate.

Are they all dblock files? Those are hard to replace. The dlist and dindex files can be recreated.
You can see what bad files look like, e.g. creation date. I’d feel worse if 2.0.5.1 made a bad one.
What storage type is the backup on? Sometimes they fail. You could also download one and try to open it.

AES Crypt can decrypt, or if you prefer a CLI without an install, use Duplicati’s SharpAESCrypt.exe.
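
For example, decrypting one suspect file by hand might look like this (d means decrypt; the paths are placeholders, and I’m going from memory on the argument order, so check the usage text SharpAESCrypt.exe prints when run without arguments):

SharpAESCrypt.exe d &lt;passphrase&gt; duplicati-b1c7c8ecd19de49699a13ee8ddf45d4e5.dblock.zip.aes decrypted.zip

If the correct passphrase still produces a decryption error, the file itself is damaged.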

Thank you for your elaborate answer. I will try to investigate further, hopefully sometime this weekend.
Duplicati is currently running a new back-up, so I can’t do any other tests yet.

What I can answer right now is that I’ve never used upload throttling.

That gets one suspect out of the way. A quick glance at Jottacloud also looked clean (unlike some other storage types). Actually studying the damage could be done with a look in the database plus a close view of the file.
Database has expected length and hash in RemoteVolumeTable. Hash work is a bit more indirect…

--upload-verification-file can avoid viewing the DB, e.g. in DB Browser for SQLite, if that’s any easier.
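
For the hash work, here is a minimal sketch (assuming, as I recall, that Duplicati stores remote volume hashes as Base64-encoded SHA-256) that fingerprints a downloaded file so you can compare it to the Size and Hash columns:

import base64, hashlib, sys

# Print the size and Base64 SHA-256 of a downloaded dblock file,
# in the format I believe the RemoteVolumeTable uses.
digest, size = hashlib.sha256(), 0
with open(sys.argv[1], "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):
        digest.update(chunk)
        size += len(chunk)
print("Size:", size)
print("Hash:", base64.b64encode(digest.digest()).decode())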

Hello,

Sorry for my late reply. I wasn’t home much lately, so I didn’t have much time to do some testing.

I’ve created a new back-up from scratch with version 2.0.5.1, so there can be no issue from a previous version of Duplicati.
I did a back-up of a bunch of data and performed a partial restore of 1 folder containing a bunch of videos (Pixar shorts). I again got a bunch of error messages like this one:

2020-02-27 14:43:37 +01 - [Error-Duplicati.Library.Main.Operation.RestoreHandler-PatchingFailed]: Failed to patch with remote file: "duplicati-b444164b12f344a6ca10e8b37b7d31960.dblock.zip.aes", message: Failed to decrypt data (invalid passphrase?): Invalid password or corrupted data

Then I tried to do the following with a command line execution like you mentioned:
Command: affected
Target URL: didn’t change this, seems ok
Command line arguments: removed all locations that were there by default and changed it to duplicati-b444164b12f344a6ca10e8b37b7d31960.dblock.zip.aes
The rest I left at its defaults.

I got the following result from commandline:

Finished!

The following filesets are affected:
0 : 27/02/2020 02:10:47
1 : 26/02/2020 02:03:01
2 : 25/02/2020 02:03:08
3 : 24/02/2020 02:03:09
4 : 23/02/2020 02:03:02
5 : 22/02/2020 02:03:08
6 : 21/02/2020 02:03:21
7 : 20/02/2020 02:03:15
8 : 19/02/2020 02:03:08
9 : 18/02/2020 02:03:17
10 : 17/02/2020 02:03:11
11 : 16/02/2020 02:03:05
12 : 15/02/2020 03:59:38
13 : 15/02/2020 03:29:08
14 : 13/02/2020 16:23:30

A total of 5 file(s) are affected:
G:\Folder1\Folder1_1\File1.idx
G:\Folder1\Folder1_1\File2.sub
G:\Folder1\Folder1_2\File3.mkv
G:\Pixar Shorts
G:\Pixar Shorts\PixarShotVideo1.mkv

The following related log messages were found:
15/02/2020 01:39:16: {"Size":52396413,"Hash":"HashCodeHere="}
27/02/2020 14:06:12: {"RestoredFiles":19,"SizeOfRestoredFiles":21200378619,"RestoredFolders":0,"RestoredSymlinks":0, …

Return code: 0

I’ve anonymized some parts of the log, like folder- and filenames and the hash. (Not sure if this is sensitive data.)

Looking at the result, it seems that multiple files are affected by the corruption of this data block. I only restored the Pixar Shorts folder, so only PixarShotVideo1.mkv should be affected. I haven’t checked the full video yet, but at least the file plays and I skipped through it without issues.

Return code: 0 typically means no error, right? At least, that’s what I always do in my own code. :slight_smile:

Update:
While checking some of the other .aes blocks that threw an error, I noticed that the affected command always shows multiple affected files. Some videos are 1 GB in size, so you would expect blocks (block size set to 50 MB) to affect only 1 file (if it were a ‘random’ issue).

But I only get errors on blocks that affect multiple files. I’m not sure whether Duplicati always puts parts of multiple files in 1 block; if it does, that would explain my observation. But if it doesn’t, this information might be interesting for this potential bug.

The exact timeline isn’t clear, but if the from-scratch backup was made shortly before the post on Feb 27, having the backup mention Feb 13 files and Feb 15 logs is odd. That wouldn’t happen on a fully fresh start.

What’s Jottacloud’s time for duplicati-b444164b12f344a6ca10e8b37b7d31960.dblock.zip.aes?
How far back do the backup dates go on the Restore from selector before you continue with Restore?

Other debug hints were given earlier but are somewhat difficult. Easiest might be to use Jottacloud UI. Duplicati’s Destination screen takes a Folder path. If Jottacloud has a web UI, how do its files look?

It certainly can, but it doesn’t have to. By default, blocks are 100 KB and are packed into 50 MB dblock files, which then get repacked from time to time as blocks inside them are no longer used by a backup.
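
To put rough numbers on it: a 1 GB video at the default 100 KB blocksize is about 10,000 blocks, and a 50 MB dblock file holds about 500 of them, so that one video alone would fill on the order of 20 dblock files (ignoring compression, which does little for video).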

Do you really mean --blocksize, or do you mean you took the default 50MB --dblock-size? They’re different.

Choosing sizes in Duplicati

Having said that, it does seem odd that a large video would get its blocks in multiple dblock files unless editing or tagging is happening that might be doing small changes (only changes are uploaded). This is also possible to examine in the database, but that would mean going into the database to look at tables.

EDIT:

“get its blocks in multiple dblock files” isn’t quite the right way to say that. It’s more like I’d have thought a large video would crank out dblock files just for the video, leaving no room for other files to put blocks.
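
If you ever want that look, below is a sketch of the lookup (run it against a copy of the job database; the table and column names are from my memory of Duplicati’s schema and may vary between versions, so inspect the schema first):

import sqlite3, sys

# List the files whose blocks are stored in one remote dblock volume,
# i.e. roughly what the "affected" command computes.
con = sqlite3.connect(sys.argv[1])  # path to a copy of the job database
rows = con.execute(
    "SELECT DISTINCT File.Path FROM Remotevolume "
    "JOIN Block ON Block.VolumeID = Remotevolume.ID "
    "JOIN BlocksetEntry ON BlocksetEntry.BlockID = Block.ID "
    "JOIN File ON File.BlocksetID = BlocksetEntry.BlocksetID "
    "WHERE Remotevolume.Name = ?", (sys.argv[2],))  # e.g. duplicati-b...dblock.zip.aes
for (path,) in rows:
    print(path)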

Hello,

The exact timeline isn’t clear, but if the from-scratch backup was made shortly before the post on Feb 27, having the backup mention Feb 13 files and Feb 15 logs is odd. That wouldn’t happen on a fully fresh start.

Thanks for the quick reply.
Maybe I wasn’t clear enough about the “fresh backup”.
The first back-up was indeed on Feb 13.
It was a fresh back-up then, and I don’t recall updating Duplicati since.

What’s Jottacloud’s time for duplicati-b444164b12f344a6ca10e8b37b7d31960.dblock.zip.aes?

Feb 15 2020 02:39 (no previous versions)

How far back do the backup dates go on the Restore from selector before you continue with Restore?

Feb 15 2020 16:23 (Back-up number 15)

Other debug hints were given earlier but are somewhat difficult. Easiest might be to use Jottacloud UI.

I can find the file. It is 50MB and hasn’t been changed since the date I mentioned before in this post.

Do you really mean --blocksize, or do you mean you took the default 50MB --dblock-size? They’re different.

I mean the block size mentioned in the “general settings” of the back-up configuration, so I guess this is the dblock size. I left these values at their defaults.

“new back-up from scratch” wasn’t what it sounded like to me, and Verifying backend files on Feb 27 decided to test a Feb 15 file, and found it unexpected. If you like, the verification sample size can be raised with --backup-test-samples, either by a little (to see what else shows up) or a lot (to test everything now).

There’s not currently a way to check files against an expected hash value other than downloading them. One thing that can also hide problems is the --no-local-blocks default favoring restore speed over a good test.
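
For instance, the two advanced options could be added on the job’s Options screen like this (the values are only illustrations):

--backup-test-samples=1000
--no-local-blocks=true

The first makes the after-backup verification test many more volumes than the default of 1; the second makes a restore actually download blocks from the destination instead of reusing matching data already on local disk.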

Your restore test seems to have stated it restored 21 GB of files, but it might not have downloaded it all… Would Jottacloud and your Internet connection have been able to manage that in the time the restore took?
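
As a sanity check on that: 21 GB is roughly 170 gigabits, so even a steady 100 Mbit/s line needs about half an hour of downloading (your speed will differ; the arithmetic is just an illustration). A restore that finished much faster than your line allows probably pulled blocks locally rather than from Jottacloud.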

If you like, you could also start a truly new backup from scratch, i.e. Export (advised anyway for safety), Import, pick a different folder (important) on Destination, and let it go. For a backup as large as this one, starting with a subset of the full backup might be a better plan, otherwise it might tie things up awhile…

It’s kind of your call on what you’d prefer. Other debugging is certainly possible. More ideas were given.
Please also look at Commandline and the home page throttle button to be extra certain no throttle is in use. People have been surprised before, and that’s easier to explain than Jottacloud, which I don’t even use.