FailedToProcessFile Error

Hi,

I use OneDrive as a backup destination and recently had to upgrade the protocol, after I had been receiving 410 (Gone) errors. It now works fine, EXCEPT for the following error, which I am getting constantly:

2019-01-19 14:32:56 -05 - [Error-Duplicati.Library.Main.Operation.TestHandler-FailedToProcessFile]: Failed to process file duplicati-b50cd61d415fe4bcc932171d549ce12d7.dblock.zip.aes

The file name that fails is NOT always the same.

I am not sure if this is related to the change in protocol but I think it is because I also use a network drive as a destination and that backup schedule seems to be working well, without errors.

Any ideas?

Thank you.

If that error message is from the job backup summary, the detail that's needed is missing. If it's from a reporting option such as email, see whether any further log details appear at the bottom. If not, you could try watching it live using About → Show log → Live, use the job's Commandline option, or set the job's --log-file option (a sketch of that follows the quoted example below). An example of the kind of extra detail that helps:

Backups suddenly failing - 4 machines now

System.Exception: The file duplicati-20181122T130004Z.dlist.zip.aes was downloaded and had size 2097152 but the size was expected to be 2124781 at

and then more, but this is one of those errors (and there are others) where one more line would really help…
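If you go the --log-file route, something like the following pair of Advanced options on the job should capture those extra lines (the path is a placeholder, and Retry is my suggested level because it logs per-attempt detail):

    --log-file=C:\Duplicati\logs\job.log
    --log-file-log-level=Retry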

Verifying backend files is the step, run right after the backup, that this error comes from. The failing name changes because the files to verify are picked at random.

The TEST command runs the same operation, should it ever become necessary to survey your files heavily by hand. If you want a direct test, you can convert the arguments shown on the job's GUI Commandline screen to run this command.
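As a sketch of that direct test (the storage URL, database path, and passphrase here are placeholders, not your actual values), verifying 10 sample sets from the command line might look like:

    Duplicati.CommandLine.exe test "onedrivev2://BackupFolder?authid=XXXXXXXX" 10 --dbpath="C:\Users\me\ABCDEFGHIJ.sqlite" --passphrase="your-passphrase" --full-result

The job's Commandline screen shows the real URL and options, so copying from there is easier than assembling this by hand.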

Thank you so much for your kind reply.

I retried the backup and watched the live log. Below are the last few log lines, which show the same download failing on all 5 retry attempts.

  • Jan 20, 2019 10:12 AM: The operation Backup has completed

  • Jan 20, 2019 10:12 AM: Failed to process file duplicati-b955f8bada9e14a1f92a50b1bb278d3e9.dblock.zip.aes

  • Jan 20, 2019 10:12 AM: Backend event: Get - Failed: duplicati-b955f8bada9e14a1f92a50b1bb278d3e9.dblock.zip.aes (500.00 MB)

  • Jan 20, 2019 10:12 AM: Operation Get with file duplicati-b955f8bada9e14a1f92a50b1bb278d3e9.dblock.zip.aes attempt 5 of 5 failed with message: A task was canceled.

  • Jan 20, 2019 10:10 AM: Backend event: Get - Started: duplicati-b955f8bada9e14a1f92a50b1bb278d3e9.dblock.zip.aes (500.00 MB)

  • Jan 20, 2019 10:10 AM: Backend event: Get - Retrying: duplicati-b955f8bada9e14a1f92a50b1bb278d3e9.dblock.zip.aes (500.00 MB)

  • Jan 20, 2019 10:10 AM: Operation Get with file duplicati-b955f8bada9e14a1f92a50b1bb278d3e9.dblock.zip.aes attempt 4 of 5 failed with message: A task was canceled.

  • Jan 20, 2019 10:08 AM: Backend event: Get - Started: duplicati-b955f8bada9e14a1f92a50b1bb278d3e9.dblock.zip.aes (500.00 MB)

  • Jan 20, 2019 10:08 AM: Backend event: Get - Retrying: duplicati-b955f8bada9e14a1f92a50b1bb278d3e9.dblock.zip.aes (500.00 MB)

  • Jan 20, 2019 10:08 AM: Operation Get with file duplicati-b955f8bada9e14a1f92a50b1bb278d3e9.dblock.zip.aes attempt 3 of 5 failed with message: A task was canceled.

  • Jan 20, 2019 10:06 AM: Backend event: Get - Started: duplicati-b955f8bada9e14a1f92a50b1bb278d3e9.dblock.zip.aes (500.00 MB)

  • Jan 20, 2019 10:06 AM: Starting - RemoteOperationGet

  • Jan 20, 2019 10:06 AM: Backend event: Get - Retrying: duplicati-b955f8bada9e14a1f92a50b1bb278d3e9.dblock.zip.aes (500.00 MB)

  • Jan 20, 2019 10:06 AM: Operation Get with file duplicati-b955f8bada9e14a1f92a50b1bb278d3e9.dblock.zip.aes attempt 2 of 5 failed with message: A task was canceled.

  • Jan 20, 2019 10:06 AM: RemoteOperationGet took 0:00:01:40.006

  • Jan 20, 2019 10:04 AM: Backend event: Get - Started: duplicati-b955f8bada9e14a1f92a50b1bb278d3e9.dblock.zip.aes (500.00 MB)

  • Jan 20, 2019 10:04 AM: Starting - RemoteOperationGet

  • Jan 20, 2019 10:04 AM: Backend event: Get - Retrying: duplicati-b955f8bada9e14a1f92a50b1bb278d3e9.dblock.zip.aes (500.00 MB)

  • Jan 20, 2019 10:04 AM: Operation Get with file duplicati-b955f8bada9e14a1f92a50b1bb278d3e9.dblock.zip.aes attempt 1 of 5 failed with message: A task was canceled.

  • Jan 20, 2019 10:04 AM: RemoteOperationGet took 0:00:01:40.003

  • Jan 20, 2019 10:03 AM: Backend event: Get - Started: duplicati-b955f8bada9e14a1f92a50b1bb278d3e9.dblock.zip.aes (500.00 MB)

  • Jan 20, 2019 10:03 AM: Starting - RemoteOperationGet

  • Jan 20, 2019 10:03 AM: ExecuteNonQuery: UPDATE "RemoteVolume" SET "VerificationCount" = MAX(1, CASE WHEN "VerificationCount" <= 0 THEN (SELECT MAX("VerificationCount") FROM "RemoteVolume") ELSE "VerificationCount" + 1 END) WHERE "Name" = "duplicati-i924f3d0b21254c5587d5b4e2ff50b698.dindex.zip.aes" took 0:00:00:00.015

Any clues?

Thanks,

The routine clue there is that it tried 5 times (the default --number-of-retries) for the same file, and never got it.

Though the live log (unlike the --log-file log) doesn't show seconds, the minutes imply the attempts took a while.

The most suspicious to-the-second values are the 1 minute 40 second (1:40) RemoteOperationGet times, because there looks to be a default timeout in the .NET Framework HTTP API that's documented as 100 seconds.

Exactly how that timeout is measured seems to be the subject of some debate. The --http-operation-timeout documentation for Duplicati sides with it being the total allowed transfer time. With your oversized 500 MB dblock files (the --dblock-size default is 50 MB, though there are certainly valid reasons to set 500 MB), the download may simply be too slow to finish in time; whether that's due to Microsoft, your connection, or your computer doesn't really matter.
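A quick back-of-the-envelope check: fitting a 500 MB download into a 100-second budget requires 500 MB / 100 s = 5 MB/s, roughly 40 Mbit/s of sustained throughput from OneDrive, and anything slower means the timeout fires before the file arrives. That matches the ~1:40 attempt durations in your log.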

Could you try editing the job Options to add the Advanced option --http-operation-timeout, select Minutes, and try 5 (to be generous)? The UI seems a little buggy; for example, Custom on the dropdown doesn't work.
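If the dropdown gives you trouble, the same setting can also be written as text, for example on the Commandline screen (assuming Duplicati's usual timespan syntax, where 5m means five minutes):

    --http-operation-timeout=5m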


That was the issue. I changed the timeout to 10 minutes and the backup worked well. I think I will also reduce the dblock size a bit. I started with 1 GB because I was concerned about too large a number of files on OneDrive, then reduced to 500 MB, then to 250 MB. Now I will stick with 100 MB.
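For the record, the file-count arithmetic behind that choice: 1 TB of backed-up data at 1 GB per dblock is only about 1,000 remote files, while 100 MB per dblock gives about 10,000. The smaller volumes mean more files, but each verification download is much smaller and far less likely to hit a timeout.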

Thanks for the help!