Failed while executing Backup to Wasabi

I am new to Duplicati / Wasabi and have it installed successfully on 2 machines with small amounts of data backed up.
On an Ubuntu 16.04 machine (my music server with > 300 GB) the backup keeps failing - it looks like Duplicati is sending a bad request (as far as Wasabi is concerned).

I am running Duplicati - 2.0.2.12_canary_2017-10-20

Here is the error:

Amazon.S3.AmazonS3Exception: The request body terminated unexpectedly ---> Amazon.Runtime.Internal.HttpErrorResponseException: The remote server returned an error: (400) Bad Request. ---> System.Net.WebException: The remote server returned an error: (400) Bad Request.
 at System.Net.HttpWebRequest.EndGetResponse (IAsyncResult asyncResult) <0x41dc2e10 + 0x001a7> in <filename unknown>:0
 at System.Net.HttpWebRequest.GetResponse () <0x41f21e60 + 0x0005a> in <filename unknown>:0
 at Amazon.Runtime.Internal.HttpRequest.GetResponse () <0x41f21b00 + 0x00043> in <filename unknown>:0 --- End of inner exception stack trace ---
 at Amazon.Runtime.Internal.HttpRequest.GetResponse () <0x41f21b00 + 0x00313> in <filename unknown>:0
 at Amazon.Runtime.Internal.HttpHandler`1[TRequestContent].InvokeSync (IExecutionContext executionContext) <0x41f1dcd0 + 0x0023b> in <filename unknown>:0
 at Amazon.Runtime.Internal.PipelineHandler.InvokeSync (IExecutionContext executionContext) <0x41f0cd40 + 0x00034> in <filename unknown>:0
 at Amazon.Runtime.Internal.RedirectHandler.InvokeSync (IExecutionContext executionContext) <0x41f1dc50 + 0x00023> in <filename unknown>:0
 at Amazon.Runtime.Internal.PipelineHandler.InvokeSync (IExecutionContext executionContext) <0x41f0cd40 + 0x00034> in <filename unknown>:0
 at Amazon.Runtime.Internal.Unmarshaller.InvokeSync (IExecutionContext executionContext) <0x41f1dba0 + 0x00017> in <filename unknown>:0
 at Amazon.Runtime.Internal.PipelineHandler.InvokeSync (IExecutionContext executionContext) <0x41f0cd40 + 0x00034> in <filename unknown>:0
 at Amazon.S3.Internal.AmazonS3ResponseHandler.InvokeSync (IExecutionContext executionContext) <0x41f1db30 + 0x00017> in <filename unknown>:0
 at Amazon.Runtime.Internal.PipelineHandler.InvokeSync (IExecutionContext executionContext) <0x41f0cd40 + 0x00034> in <filename unknown>:0
 at Amazon.Runtime.Internal.ErrorHandler.InvokeSync (IExecutionContext executionContext) <0x41f1da40 + 0x00027> in <filename unknown>:0 --- End of inner exception stack trace ---
 at Duplicati.Library.Main.Operation.BackupHandler.HandleFilesystemEntry (ISnapshotService snapshot, Duplicati.Library.Main.BackendManager backend, System.String path, FileAttributes attributes) <0x41f3daa0 + 0x01fff> in <filename unknown>:0
 at Duplicati.Library.Main.Operation.BackupHandler.RunMainOperation (ISnapshotService snapshot, Duplicati.Library.Main.BackendManager backend) <0x41f3d100 + 0x00647> in <filename unknown>:0
 at Duplicati.Library.Main.Operation.BackupHandler.Run (System.String[] sources, IFilter filter) <0x41ecec10 + 0x0181f> in <filename unknown>:0 

I have tried repairing and rebuilding the database several times with no luck. Duplicati did run without problems when I uploaded a smaller data set (about 500 MB).

@paulearley, I updated your post to make the error message easier to read (I just started a new line before each “at…”). :slight_smile:

This sounds very similar to the following, which (to date) does not seem to have been “solved”:

That being said, I’m not familiar with Wasabi, but the error message mentions Amazon (so I’m INCORRECTLY assuming it’s a reseller of some kind), and Amazon has been known to throw this error when trying to use Glacier storage…

They are not reselling (AFAIK) but simply have an S3-compatible API (like Minio and others do).

@paulearley Have you perhaps set the volume size to a very large value, and this trips up Wasabi? Or is your clock perhaps out of sync?

Hi Folks - this is Jim @ Wasabi. Thanks for the comments so far. Two comments from the Wasabi perspective:

  1. Wasabi offers an S3-compatible API (we are not reselling S3)

  2. We are looking into the problem reported here and I’ll update this thread once we understand the root cause better.

Thanks,
Jim

@JonMikelV The volume size is 50MB (the default), and my clock is set in Ubuntu from NIST, so it should be OK. Also, I am in the same time zone as the Wasabi server (in case that could have tripped things up).
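
For what it’s worth, here is a rough sketch of how I could double-check the clock offset against the server (Python, using the requests library; s3.wasabisys.com is assumed to be the endpoint in use):

    # Rough sketch: compare the local clock against the Date header returned by
    # the storage endpoint. A large offset can cause signed S3 requests to fail.
    from datetime import datetime, timezone
    from email.utils import parsedate_to_datetime

    import requests

    resp = requests.head("https://s3.wasabisys.com", timeout=10)
    server_time = parsedate_to_datetime(resp.headers["Date"])
    local_time = datetime.now(timezone.utc)
    offset = abs((local_time - server_time).total_seconds())
    print(f"Clock offset vs. server: {offset:.1f} seconds")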

The odd thing is that the error is an HTTP 400, which makes it seem like a request was constructed improperly.

Finally, I “fixed” the problem by avoiding it: I switched to Backblaze B2, which is working fine from the same machine. I can run another test to help other Wasabi users if there are further suggestions or if I stumble across something.

THANKS @wasabi-jim. I would like to reconsider Wasabi if I can get this to work! Let me know if you would like me to test something.

Thanks for the replies - and the “fix”. :wink:

Hopefully other Wasabi users can chime in about their experiences so far (and keep an eye out for some good news from @wasabi-jim).

Hi Folks - we have not yet been able to reproduce this problem in our own labs despite many variations and attempts (the Duplicati + Wasabi combo is working well for us across a wide range of file sizes and config options). We are still working with Paul from this thread, and I’m in touch with another Duplicati user from outside of this thread. If there are other affected users, could you drop me a note at support@wasabi.com so I can review your config and backup scenario? I really appreciate everyone’s help in isolating this problem. Thanks, Jim

Hi Folks - we now understand the root cause of this problem and are planning a service update on 19 Nov 2017 to address it. We are sorry for the inconvenience, and I want to thank the Duplicati community for helping resolve the issue. Thanks, Jim

Thanks for the update @wasabi-jim!

Are there any specific failure scenarios we can test after the 19th to help confirm the issue is resolved?

Let me talk with my QA colleagues to better understand their repro scenario and I’ll update this thread once I know more - thx

Hi Folks - Wasabi did deploy a service update on 19 Nov 2017, and all of our testing shows that the problem has been resolved. We regret the inconvenience; please let us know at support@wasabi.com (or in this thread) if you have any further problems. Thanks, Jim

I would like to know if anyone has a tutorial on how to configure Duplicati with Wasabi. I’m from Brazil, and I found Wasabi’s pricing interesting, along with the promise of good upload and download speeds, but I would like to test the transfer rates here in Brazil.

In Duplicati I saw there is a direct option for Backblaze, but I don’t know how to configure Wasabi correctly. I saw that it is S3-compatible, but I have no idea how to set up the URL, login, and so on.

Another question about Wasabi: does anyone know if there is a way to create separate buckets so that each one can only be accessed by a single user with its own key?

You can use Wasabi as an S3 destination. Just choose Wasabi from the list of possible destinations.

I think @wasabi-jim can answer that?

Hi Folks - yes, you can use Duplicati with Wasabi. Details are here:

If you want to have a separate bucket for each user you can do that. Just drop us a note at support at wasabi.com and we can help you with the details. Thanks, Jim
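
For anyone curious, the usual approach with an S3-compatible service is a per-user policy that scopes an access key to a single bucket. Below is a generic sketch in Python (not Wasabi-specific guidance; the bucket name is a placeholder) that just builds and prints such a policy document:

    # Generic sketch of a per-user policy for an S3-compatible service: it grants
    # one user's key access to a single bucket and nothing else. The bucket name
    # is a placeholder; attaching the policy to a user is done through the
    # provider's console or IAM-compatible API.
    import json

    bucket = "client-one-backups"  # placeholder
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "SingleBucketOnly",
                "Effect": "Allow",
                "Action": [
                    "s3:ListBucket",
                    "s3:GetObject",
                    "s3:PutObject",
                    "s3:DeleteObject",
                ],
                "Resource": [
                    f"arn:aws:s3:::{bucket}",
                    f"arn:aws:s3:::{bucket}/*",
                ],
            }
        ],
    }
    print(json.dumps(policy, indent=2))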

Strange, I went to Destination => S3 Compatible and there are only these options, nothing for Wasabi.
Duplicati - 2.0.2.1_beta_2017-08-01

[screenshot of the S3 Compatible server list, with no Wasabi entry]

Thanks a lot for the help. I intend to do the tests this week, and if everything goes well I plan to start migrating some clients’ backups to your system.

Yes, Wasabi was added in 2.0.2.5 IIRC. You can still use it in 2.0.2.1, but you need to enter the URL as a “custom server url” (it is s3.wasabisys.com) and set the advanced option --s3-ext-forcepathstyle=true.
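
For anyone who wants to see what those two settings correspond to outside of Duplicati, here is a minimal Python sketch using boto3 (this is not Duplicati’s own code; the bucket name and credentials are placeholders) that talks to the same endpoint with path-style addressing:

    # A minimal sketch (not Duplicati's internal code) of what "custom server url"
    # and --s3-ext-forcepathstyle amount to for a plain S3 client.
    import boto3
    from botocore.client import Config

    s3 = boto3.client(
        "s3",
        endpoint_url="https://s3.wasabisys.com",         # the custom server url
        aws_access_key_id="YOUR_WASABI_ACCESS_KEY",      # placeholder
        aws_secret_access_key="YOUR_WASABI_SECRET_KEY",  # placeholder
        config=Config(s3={"addressing_style": "path"}),  # force path-style requests
    )

    # List a few objects to confirm the endpoint and addressing style work.
    for obj in s3.list_objects_v2(Bucket="my-duplicati-bucket").get("Contents", []):
        print(obj["Key"], obj["Size"])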

In the latest canary build, these things are done automatically.

I always try to use the most stable version of the tool to avoid unknown bugs. I’m waiting for the new beta to come out so I can upgrade; in the meantime I’ll do the tests that way.
Thank you very much for the feedback - I know how much time you spend giving this kind of support.