Failed to process dindex

Hi folks,

I just noticed (obviously a couple of weeks late) that I had an error during the random validation of a dindex file that was created back in 2021 as part of my initial backup (so it’s really big!). Below is the info from the backup’s log; About → Show Log only goes back one day.

I was able to download that file and decrypt it just fine.

Given this error, is downloading/decrypting enough to be comfortable that things are ok and it was just some transient transfer error or something like that?

Is there any way to manually validate that particular dindex file again? I found ways to run sample validations from a given time (although the format of the time parameter doesn’t seem to be documented anywhere) or for a whole backup, but not for one specific file.

Thanks
L

  • 2024-05-30 18:57:53 +00 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-i131e55c650cb44d9b158e0a9b1236af2.dindex.zip.aes
    "Verifications": [
      {
        "Key": "duplicati-20240530T180002Z.dlist.zip.aes",
        "Value": []
      },
      {
        "Key": "duplicati-i131e55c650cb44d9b158e0a9b1236af2.dindex.zip.aes",
        "Value": [
          {
            "Key": "Error",
            "Value": "We encountered an internal error.  Please retry the operation again later."
          }
        ]
      },
      {
        "Key": "duplicati-ba5b96fce6660462b8acf3edf91f9e69c.dblock.zip.aes",
        "Value": []
      }
    ],

It shouldn’t be really big unless its associated dblock is really big because the Remote volume size was raised. The default is 50 MB, which limits the dblock size, which in turn limits the dindex size. Do you have a big dblock file there?

After decrypting, you can look in the .zip file to see which dblock name is in the vol folder, then check that file’s size.
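Roughly like this, assuming the decrypted dindex was saved as dindex.zip (the filename is just a placeholder):

  # The entry under vol/ is named after the dblock this index describes.
  unzip -l dindex.zip 'vol/*'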

Probably. Looking up the message, it appears to be a Wasabi error message (you use Wasabi, right?) that some people see. Wasabi support might be able to comment on whether it’s of long-term concern. They might ask what you used for the download, and I don’t know what you used for your download-and-decrypt. I’m guessing it wasn’t Duplicati, although Duplicati.CommandLine.BackendTool.exe could have downloaded the specific file.
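If you want to redo the download with Duplicati’s own tooling, something like this should work (untested sketch; the bucket, folder, and credentials are placeholders for your Wasabi settings):

  # GET downloads the named remote file into the current directory.
  Duplicati.CommandLine.BackendTool.exe GET "s3://my-bucket/my-folder?s3-server-name=s3.wasabisys.com&auth-username=ACCESS_KEY&auth-password=SECRET_KEY" duplicati-i131e55c650cb44d9b158e0a9b1236af2.dindex.zip.aes

  # Decrypt it with the bundled SharpAESCrypt tool (d = decrypt).
  Duplicati.CommandLine.SharpAESCrypt.exe d <passphrase> duplicati-i131e55c650cb44d9b158e0a9b1236af2.dindex.zip.aes dindex.zip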

It’s pretty technical. You could probably edit the database, setting the Remotevolume table’s VerificationCount to 0 for that file, to persuade the semi-random file chooser to pick it again. I take it you don’t want to also validate all the other files.
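For example, with the sqlite3 CLI against the job’s local database (back the database up first and make sure Duplicati isn’t running; the database path is a placeholder):

  # Reset the count so the semi-random sampler favors this file again.
  sqlite3 /path/to/JOBDB.sqlite "UPDATE Remotevolume SET VerificationCount = 0 WHERE Name = 'duplicati-i131e55c650cb44d9b158e0a9b1236af2.dindex.zip.aes';"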

You might be able to validate all the other files except the dblock files, if this new variation works for the Test command too (example after the help text):

  --full-remote-verification (Enumeration): Activates in-depth verification of
    files
    After a backup is completed, some (dblock, dindex, dlist) files from the
    remote backend are selected for verification. Use this option to turn on
    full verification, which will decrypt the files and examine the insides
    of each volume, instead of simply verifying the external hash. If the
    option --no-backend-verification is set, no remote files are verified.
    This option is automatically set when the verification is performed
    directly. ListAndIndexes is like True but only dlist and index volumes
    are handled.
    * values: True, False, ListAndIndexes
    * default value: False
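Something like this, if the test command honors that value too (untested sketch; the URL and passphrase are placeholders, and “all” tests every volume rather than a sample):

  # Test every remote volume, fully verifying dlist and dindex insides,
  # while dblock files only get their external hash checked.
  Duplicati.CommandLine.exe test "s3://my-bucket/my-folder?s3-server-name=s3.wasabisys.com&auth-username=ACCESS_KEY&auth-password=SECRET_KEY" all --full-remote-verification=ListAndIndexes --passphrase=<passphrase>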

Thanks!

Btw, when I said “it’s big” I meant the backup, not the dindex. That dindex was from what was likely the initial backup of about 1-2 TB. I should have been clearer. And yes, I’m using Wasabi. I’d never run into this before.

L

Whatever it was would have had to last for at least a while, due to the below: with the defaults, the failing download gets the initial attempt plus five retries, spaced 10 seconds apart, so the error persisted for something close to a minute.

number-of-retries

--number-of-retries = 5
If an upload or download fails, Duplicati will retry a number of times before failing. Use this to handle unstable network connections better.

retry-delay

--retry-delay = 10s
After a failed transmission, Duplicati will wait a short period before attempting again. This is useful if the network drops out occasionally during transmissions.
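If Wasabi keeps doing this, both can be raised as ordinary advanced options on the backup. A sketch with illustrative values (URL, source path, and passphrase are placeholders):

  # Tolerate longer outages: more retries, longer pause between attempts.
  Duplicati.CommandLine.exe backup "s3://my-bucket/my-folder?s3-server-name=s3.wasabisys.com&auth-username=ACCESS_KEY&auth-password=SECRET_KEY" /source/data --number-of-retries=8 --retry-delay=30s --passphrase=<passphrase>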