New clue for the Fatal Failed: 400 - bad_request: Sha1 did not match data received


#1

I am a delphi/php/asp programmer. I just spent a few days troubleshooting this error and finally found a major clue that hopefully helps the programmers fix it.

I was getting this error:
Fatal Failed: 400 - bad_request: Sha1 did not match data received

When I remove the throttle-upload option, the problem goes away!

When I set throttle-upload to 25KB/s, the problem happens every time.

I'm backing up to B2, AES encrypted, 50MB vol size, keep 30 months, excluding hidden, system, temp, and files larger than 10KB. 607,585 files, 109GB total, Win10 (64) Pro, i3, 4GB RAM, slow DSL (512kbit/s up, 3kbit/s down, but no packet loss), Duplicati 2.0.2.15 running as a service, CLRVersion: 4.0.30319.42000. Anything else you might need to know?

I tried setting the environment variable FH_LIBRARY=Managed; it didn't help.
I also tried Duplicati 2.0.2.1; that didn't help either.


#2

Hi @jshipp, welcome to the forum - and thanks for the investigation!

I’m not sure how throttle uploads would result in Sha1 errors, but if it’s repeatable for you that’s a good way to help debug stuff.

Can you post your job exported “As command-line” in both the working and failing versions (with personal data, passwords, etc. scrubbed of course)?


#3

WORKING:

"C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe" backup "b2://duplicati-lamar2/ServerE?auth-username=XXXXXXXXXXXX&auth-password=XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX" "E:\Data\" --snapshot-policy=Auto --send-http-url=http://duplicati.XXXXXX.com --send-http-any-operation=true --send-http-message-parameter-name=message --send-http-level=all --send-http-message="Duplicati %OPERATIONNAME% report for %backup-name% %PARSEDRESULT% %RESULT%" --backup-name="Lamar ServerE to Backblaze" --dbpath="C:\Program Files\Duplicati 2\data\JRXJWJOBUE.sqlite" --encryption-module=aes --compression-module=zip --dblock-size=50MB --keep-time=30M --passphrase="XXXXXXXXXXXXXXXXX" --skip-files-larger-than=10KB --exclude-files-attributes="temporary,system,hidden" --disable-module=console-password-input

NOT WORKING:

"C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe" backup "b2://duplicati-lamar2/ServerE?auth-username=XXXXXXXXXXXX&auth-password=XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX" "E:\Data\" --snapshot-policy=Auto --send-http-url=http://duplicati.XXXXXX.com --send-http-any-operation=true --send-http-message-parameter-name=message --send-http-level=all --send-http-message="Duplicati %OPERATIONNAME% report for %backup-name% %PARSEDRESULT% %RESULT%" --backup-name="Lamar ServerE to Backblaze" --dbpath="C:\Program Files\Duplicati 2\data\JRXJWJOBUE.sqlite" --encryption-module=aes --compression-module=zip --dblock-size=50MB --keep-time=30M --passphrase="XXXXXXXXXXXXXXXXX" --skip-files-larger-than=10KB --throttle-upload=25KB --exclude-files-attributes="temporary,system,hidden" --disable-module=console-password-input

Using the global throttle (next to the pause button at the top of the GUI) also results in the job failing. I don’t see anything different in the command line when I set the throttle like that.

thanks


#4

Thanks! I edited your post to make the command stand out a bit more and bold / italicize the difference.

I did some tests with a local destination backup (see below); the end result is that something is going on even with a local destination, but I couldn't reliably replicate it.

If possible, can you do a test of --throttle-upload=1MB and see if it works for you?


I tried running a local backup with --throttle-upload=25KB and got a "Warning while running task xxxx" window with "Got 0 warning(s)" in it.

I then got 5 retries each on the following - both with Hash mismatch on file ... errors:

  • Get with file duplicati-20180308T213015Z.dlist.zip
  • Get with file duplicati-i6735436af81a45b5bbd68ef999a63514.dindex.zip

Eventually the log summary showed:

Errors: [ Failed to process file duplicati-20180308T213015Z.dlist.zip => Hash mismatch on file “C:\Users\Me\AppData\Local\Temp\dup-27b3c118-2f9b-4c7c-8909-167304175f1e”, recorded hash: DY0vKz/ezWIYCcQgMyptJiTv0+pGm3DSI8fkkfwwF9g=, actual hash F+sho+hzvVuCcTUlN5ysiPvrNbvCOXG1bv6MXJVpeR4=, Failed to process file duplicati-i6735436af81a45b5bbd68ef999a63514.dindex.zip => Hash mismatch on file “C:\Users\Me\AppData\Local\Temp\dup-ca215f22-fc0f-43ed-a11b-24f1ffff7330”, recorded hash: td/lx+agGcBffag2DR2REyqQJLfGTHb3NvbuEIxqiWI=, actual hash wQzF4oSFE0MmDQh7Wuapv2Z1ckzsMfjslDap8D15qB4= ]
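The mismatch in that log is easy to picture: the recorded and actual hashes are 44-character base64 strings (i.e. 32 bytes, consistent with SHA-256), and a truncated or corrupted download hashes differently from the volume that was uploaded. A toy illustration (this is my own sketch, not Duplicati's code):

```python
import base64
import hashlib

def b64_hash(data: bytes) -> str:
    """Base64-encoded SHA-256 digest, like the hashes shown in the log."""
    return base64.b64encode(hashlib.sha256(data).digest()).decode()

volume = b"pretend this is a dblock volume" * 100
recorded = b64_hash(volume)        # hash stored when the volume was uploaded
actual = b64_hash(volume[:1500])   # hash of a truncated download

# Any truncation changes the digest, producing "Hash mismatch on file ..."
print(recorded == actual)  # False
```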


Here are some more results:

  • --throttle-upload=25000 -> ERRORS (aka ~25KB but in bytes)
  • --throttle-upload=50KB -> ERRORS
  • --throttle-upload=1024KB -> ERRORS 50% of the time (aka 1MB but in KB)
  • --throttle-upload=2048KB -> no errors (aka 2MB but in KB)
  • --throttle-upload=5120KB -> no errors (aka 5MB but in KB)
  • --throttle-upload=10240KB -> no errors (aka 10MB but in KB)
  • --throttle-upload=1048576KB -> no errors (aka 1GB but in KB)
  • --throttle-upload=1MB -> no errors
  • --throttle-upload=25GB -> no errors
  • --throttle-upload=25TB -> no errors

Note that in all cases the errors came during the "Verifying …" step - so your backups are working just fine; it's the verification step that's failing.


#5

If possible, can you do a test of --throttle-upload=1MB and see if it works for you?

That worked! Now I’m really curious what this means…


#6

Just to add a little more info… I had 6 computers with this error. ALL of them had the upload throttling set to 25 or 26 KB/s. I removed that setting and now they all work.


#7

Thanks for the confirmation!

At this point I’m curious if somebody like @kenkendk, @kees-z, or @Pectojin might have any thoughts on why an upload throttle parameter might cause SHA1 errors to become more likely to occur…


#8

That is a very strange error, as the upload throttle should not affect the download at all.

As I understand this, the files are stored correctly on the remote end, but the downloads fail?

In that case I think it is this piece that fails:

I have seen a similar error where the decryptor is incorrectly implemented. Most implementations wait until the stream returns zero bytes, and treat this as end-of-stream. But some implementations use number of bytes read < requested bytes as the end-of-stream marker.

Once the throttle is active, it can return less than what is requested, which can trigger this bug.
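That end-of-stream pitfall can be shown with a short sketch (Python for brevity; the `ThrottledReader` here is hypothetical, not Duplicati's code). A throttled stream may legally return fewer bytes than requested, so a reader that treats a short read as end-of-stream truncates the data, while one that reads until it gets zero bytes receives everything:

```python
import io

class ThrottledReader:
    """Wraps a stream and satisfies each read only partially,
    mimicking a throttle that limits how much one call may return."""
    def __init__(self, stream, chunk):
        self.stream = stream
        self.chunk = chunk

    def read(self, n):
        return self.stream.read(min(n, self.chunk))

def read_all_buggy(r, bufsize=256):
    # WRONG: treats "bytes read < bytes requested" as end-of-stream.
    out = b""
    while True:
        block = r.read(bufsize)
        out += block
        if len(block) < bufsize:
            break
    return out

def read_all_correct(r, bufsize=256):
    # RIGHT: only a zero-byte read means end-of-stream.
    out = b""
    while True:
        block = r.read(bufsize)
        if not block:
            break
        out += block
    return out

data = b"x" * 1000
print(len(read_all_buggy(ThrottledReader(io.BytesIO(data), 100))))    # 100 - truncated
print(len(read_all_correct(ThrottledReader(io.BytesIO(data), 100))))  # 1000 - complete
```

With no throttle, reads usually come back full, so the buggy reader happens to work - which is exactly why the bug only surfaces once throttling makes short reads common.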

And I noticed that it actually does swap the upload/download speed here:

The signature is ThrottledStream(stream, readspeed, writespeed) meaning that the speed we read from the remote (aka download) is limited by the upload throttle.
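A minimal sketch of that argument swap (the names mirror the signature described above but the code is illustrative, not Duplicati's): if the caller passes the upload limit into the readspeed slot, then reads from the remote - i.e. downloads - get capped at the upload rate:

```python
class ThrottledStream:
    # Hypothetical mirror of the signature described above:
    # readspeed limits reads (downloads), writespeed limits writes (uploads).
    def __init__(self, stream, readspeed, writespeed):
        self.stream = stream
        self.readspeed = readspeed
        self.writespeed = writespeed

upload_limit = 25_000   # bytes/s, as in --throttle-upload=25KB
download_limit = 0      # 0 = unlimited

# Buggy call described in the thread: arguments swapped, so downloads
# are throttled by the upload setting.
buggy = ThrottledStream(None, upload_limit, download_limit)
print(buggy.readspeed)   # 25000 - downloads wrongly capped at the upload rate

# Intended call: download limit governs reads, upload limit governs writes.
fixed = ThrottledStream(None, download_limit, upload_limit)
print(fixed.readspeed)   # 0 - downloads unlimited, as configured
```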

I will fix the swapping of the speeds, but that just pushes the problem to those that have the download throttle set up.
I have completely rewritten that part, so it will soon be fully fixed, but the change is too big to include in the experimental/beta releases for now.

Would it make sense to disable the pipelining feature for the beta release?


#9

There is a related error here, which I think is the same problem but triggering a similar bug, but in the crypto library instead of the hashing library:


#10

I was about to ask if it might resolve During restore: arbitrary bandwith application only on first dblock.zip.aes restoring · Issue #2890 · duplicati/duplicati · GitHub, but I see you're a step ahead of me :smile:

I think I remember seeing the swapped upload/download limit on a ticket or forum post somewhere else too, but I can’t find it right now.


#11

Just by a second :smile:

I remember that too, so it must be somewhere


#12

That’s like half the battle. Figuring out who to tell when something is fixed :smile:


#13

Thanks to @kenkendk & @Pectojin for checking into this! Can either of you (@jshipp & @Omnicef) test putting your throttle back in (which should cause the error) but also adding --disable-piped-streaming=true, to see if kenkendk's theory about this being related to the "Out of bounds error on restore" ticket is correct?

Unfortunately, my test machine is busy doing a database recreate test which will likely take another day or two so I don’t plan on testing this until then…

--disable-piped-streaming (default: false)
Use this option to disable multithreaded handling of up- and downloads, that can significantly speed up backend operations depending on the hardware you’re running on and the transfer rate of your backend.
Default value: “false”

Oh, and do either of you have a third-party throttle app you could try? That might narrow down if the issue really is low speed or something specific to the Duplicati implementation of it…


Are you suggesting changing the default to NOT enabled or to remove the feature entirely (which could cause “unrecognized parameter” messages for those currently using it)?

Either way, I’m inclined to say leave it as is since it seems there are more people trying to make Duplicati run faster than those trying to make it run slower. Until an actual fix can be devised it might make more sense to add comments suggesting people stay away from “extremely low” values as they might introduce errors.


#14

OK, I added --disable-piped-streaming=true to the advanced options of the job, and it didn't appear to help. It's a shame I can't throttle from the router, but this particular computer isn't lucky enough to have a router that fancy. Suggest a third-party throttling utility and I'll try it.

System.Exception: 400 - bad_request: Sha1 did not match data received
at Duplicati.Library.Main.BackendManager.WaitForEmpty(LocalDatabase db, IDbTransaction transation)
at Duplicati.Library.Main.Operation.BackupHandler.Run(String[] sources, IFilter filter)
at Duplicati.Library.Main.Controller.<>c__DisplayClass17_0.b__0(BackupResults result)
at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)
at Duplicati.Library.Main.Controller.Backup(String[] inputsources, IFilter filter)
at Duplicati.Server.Runner.Run(IRunnerData data, Boolean fromQueue)


#15

Suggest a 3rd party throttling utility and i’ll try it.

Many firewalls, even software-based ones such as ipfw, can do bandwidth throttling, so you may already be just a couple of configuration changes away.
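For example, with ipfw's dummynet traffic shaper it might look something like this (a sketch only - exact syntax varies by ipfw version, it needs root, and the 25KByte/s figure just mirrors the throttle discussed above):

```shell
# Create a dummynet pipe limited to roughly 25 KB/s.
ipfw pipe 1 config bw 25KByte/s

# Push outbound traffic from this host through the pipe.
# Narrow the match (e.g. to the backup destination's address/port) as needed.
ipfw add 100 pipe 1 ip from me to any out
```

Throttling outside Duplicati like this would show whether the failure follows the low transfer rate itself or Duplicati's own throttle implementation.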