Unable to read data from the transport connection: The socket has been shut down

Hi,
I'm trying to use an S3 bucket to store my backup, and this error shows up when I attempt to start the backup:

One or more errors occurred. (Unable to read data from the transport connection: The socket has been shut down. (Unable to read data from the transport connection: The socket has been shut down.) (One or more errors occurred. (Unable to read data from the transport connection: The socket has been shut down.)))

Nothing else appears in the logs.

When I test the connection with the settings I've entered for my S3 bucket, it is successful, but when I run the backup I get the error…

I have tried creating the S3 bucket manually and letting Duplicati create it automatically; either way I get the same error when I launch the backup
(the bucket does get created, though).

I'm running Duplicati from the linuxserver.io Docker image linuxserver/duplicati:latest.
I am using PUID and PGID 0.

I don't know where to look to troubleshoot this…
Any help will be appreciated.
Thank you.

aws s3 authorizations with duplicati backup : Unable to read data from the transport connection: The socket has been shut down on Stack Overflow continues with more data but no answer yet. Any S3 policy experts?

I am running into the exact same error.
Duplicati via docker-compose.yaml (linuxserver.io)

UI check for connection successful (S3 AWS)
S3 Access Key uses “full root” permissions
Backup Creation Success
Upload fails with exception

System:

APIVersion : 1
PasswordPlaceholder : **********
ServerVersion : 2.0.7.1
ServerVersionName : - 2.0.7.1_beta_2023-05-25
ServerVersionType : Beta
StartedBy : Server
BaseVersionName : 2.0.7.1_beta_2023-05-25
DefaultUpdateChannel : Beta
DefaultUsageReportLevel : Information
ServerTime : 2023-11-17T21:01:14.04272+01:00
OSType : Linux
DirectorySeparator : /
PathSeparator : :
CaseSensitiveFilesystem : true
MonoVersion : 6.12.0.200
MachineName : bddadf7fe4ad
UserName : root
NewLine :
CLRVersion : 4.0.30319.42000
CLROSInfo : {"Platform":"Unix","ServicePack":"","Version":"5.15.0.84","VersionString":"Unix 5.15.0.84"}

Exception:

#0) System.AggregateException: Unable to read data from the transport connection: The socket has been shut down. (Unable to read data from the transport connection: The socket has been shut down.) (One or more errors occurred. (Unable to read data from the transport connection: The socket has been shut down.)) ---> System.IO.IOException: Unable to read data from the transport connection: The socket has been shut down. ---> System.Net.Sockets.SocketException: The socket has been shut down
   --- End of inner exception stack trace ---
  at System.Net.Sockets.Socket+AwaitableSocketAsyncEventArgs.ThrowException (System.Net.Sockets.SocketError error) [0x00007] in <a8a996a78a804d888710c9e2575d78c8>:0 
  at System.Net.Sockets.Socket+AwaitableSocketAsyncEventArgs.System.Threading.Tasks.Sources.IValueTaskSource.GetResult (System.Int16 token) [0x0001f] in <a8a996a78a804d888710c9e2575d78c8>:0 
  at System.Threading.Tasks.ValueTask+ValueTaskSourceAsTask+<>c.<.cctor>b__4_0 (System.Object state) [0x00030] in <d636f104d58046fd9b195699bcb1a744>:0 
--- End of stack trace from previous location where exception was thrown ---


What is backup creation? Just defining the job? Then its initial run fails?

That's probably testing a list. If you can get in with a CLI and find your Duplicati (which image?), try
Duplicati.CommandLine.BackendTool.exe to see what else works (or doesn't). For an automatic tester, Duplicati.CommandLine.BackendTester.exe exists. For either, you can get a URL from a job export. Modify it to point to an empty folder.
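To sketch it out (bucket name, region, credentials, and the test folder here are placeholders; on Linux you may need to prefix the commands with mono, as discussed further down):

Duplicati.CommandLine.BackendTool.exe LIST "s3://MY_BUCKET/testfolder/?s3-server-name=s3.amazonaws.com&s3-location-constraint=eu-central-1&s3-client=aws&auth-username=UNAME&auth-password=PWKEY"

Duplicati.CommandLine.BackendTester.exe "s3://MY_BUCKET/testfolder/?s3-server-name=s3.amazonaws.com&s3-location-constraint=eu-central-1&s3-client=aws&auth-username=UNAME&auth-password=PWKEY"

The first command only lists the remote folder; the tester uploads, downloads, verifies, and deletes its own test files, which is why the folder should be empty.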

Switching to the other S3 client library (the screenshot showed the s3-client option) is another thing you could try if you haven't. Your stack trace looks like you're on the Amazon AWS SDK.

I created a job and manually started the first backup. The backup seems to be created locally and added to the database, but it fails when Duplicati tries to upload to the S3 bucket.

I am running Linux, so no .exe for me, but I can't find the CLI script inside the Docker container.
Changing to the Minio SDK does not work because Duplicati switches back to the Amazon AWS SDK on save.

I installed Duplicati 2 on Windows and successfully executed the Target Path that my Linux Duplicati instance displayed:

.\Duplicati.CommandLine.exe backup "s3://MY_BUCKET/?s3-server-name=s3.amazonaws.com&s3-location-constraint=eu-central-1&s3-storage-class=STANDARD&s3-client=aws&auth-username=UNAME&auth-password=PWKEY" "D:\a"

But anyway, the exported target path for S3 looks OK as far as I can tell, and via the Windows CLI it does upload a file to AWS S3. The thing is, I am using GitHub - kartoza/docker-pg-backup (a cron job that will back up databases running in a Docker postgres container) to back up my PostgreSQL database to AWS S3 with s3cmd (Amazon S3 Tools: Command Line S3 Client and S3 Backup for Windows, Linux: s3cmd, s3express), and this works fine with the same credentials, just with a different bucket name, on the same Linux system.
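(For reference, the kind of s3cmd call that succeeds there is roughly s3cmd put mydb_backup.dump s3://MY_OTHER_BUCKET/ — the bucket name is made up here, and the credentials come from the s3cmd configuration.)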

There is no local backup created prior to upload, although it does take a while before uploading begins.

Not true. You’re assuming .exe means Windows. .NET programs (invented by Microsoft) use it as well.

Assemblies in .NET

Assemblies are implemented as .exe or .dll files.

Depending on how your Linux is set up, it might not recognize the extension. In that case, run mono followed by the file.
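For example, where the earlier suggestion was Duplicati.CommandLine.BackendTool.exe, on this image you would run something like (the URL placeholder being whatever your job export produced):

mono Duplicati.CommandLine.BackendTool.exe LIST "TARGET_URL_FROM_EXPORT"

from the directory where the Duplicati assemblies are installed.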

It’s seemingly an open issue, though not well understood:

You could open a GUI Commandline and try to pick the option manually, then run the backup. Option:

  --s3-client (String): Specifies the S3 client library to use
    Set either to aws or minio. Then either the AWS SDK or Minio SDK will be used to communicate with S3 services.
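As a rough sketch (source folder, passphrase, and the S3 URL are placeholders based on the target path quoted earlier), the manually picked option would end up on the command line roughly like this:

mono Duplicati.CommandLine.exe backup "s3://MY_BUCKET/?s3-server-name=s3.amazonaws.com&s3-location-constraint=eu-central-1&s3-storage-class=STANDARD&auth-username=UNAME&auth-password=PWKEY" /source/path --s3-client=minio --passphrase=SECRET

In the GUI Commandline screen, the equivalent is adding --s3-client=minio as an extra argument before running the backup command.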

Using what? I’m trying to get you to do it with Duplicati’s CLI tools, as it’s Duplicati having the problem.

You left the Path field blank? That would be after the BUCKET/ and before the ? if you were using one.
If there was actually a Path and it was redacted into MY_BUCKET, check the slash count per this post:
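To illustrate (made-up names), root-of-bucket versus a subfolder path look like this, with the Path sitting between the bucket name and the ?:

s3://MY_BUCKET/?s3-server-name=s3.amazonaws.com&auth-username=UNAME&auth-password=PWKEY

s3://MY_BUCKET/some/folder/?s3-server-name=s3.amazonaws.com&auth-username=UNAME&auth-password=PWKEY

An extra or missing slash in that middle part changes which prefix Duplicati reads and writes.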

EDIT:

The s3-client option survives the Save if it’s done on Options screen 5 instead of on Destination screen.
This is reminiscent of disappearing Destination Advanced options, even though it’s not in that section…

Fix bug that removes advanced target options when editing backups #4972 aims to fix that cited issue. Possibly it will also fix the s3-client option.

Hey, yeah, sorry, I might not have explained correctly what I did.
I executed Duplicati.CommandLine.exe on Windows, which was successful.
Now I have also executed Duplicati.CommandLine.exe via mono on Linux inside the Docker container, and this also ran successfully:

mono Duplicati.CommandLine.exe backup "s3://MY_BUCKET_NAME/?s3-server-name=s3.amazonaws.com&s3-location-constraint=eu-central-1&s3-storage-class=STANDARD&s3-client=aws&auth-username=UNAME&auth-password=PASS" ./t --allow-missing-source
Backup started at 11/18/2023 12:34:11 AM

Enter encryption passphrase: 

Confirm encryption passphrase: 
Checking remote backup ...
  Listing remote folder ...
Scanning local files ...
  0 files need to be examined (0 bytes)
  Uploading file (829 bytes) ...
  Uploading file (877 bytes) ...
  Uploading file (877 bytes) ...
Compacting remote backup ...
Checking remote backup ...
  Listing remote folder ...
Verifying remote backup ...
Remote backup verification completed
  Downloading file (877 bytes) ...
  Downloading file (877 bytes) ...
  Downloading file (829 bytes) ...
  Duration of backup: 00:00:17
  Remote files: 3
  Remote size: 2.52 KB
  Total remote quota: 0 bytes
  Available remote quota: 0 bytes
  Files added: 0
  Files deleted: 0
  Files changed: 0
  Data uploaded: 2.52 KB
  Data downloaded: 2.52 KB
Backup completed successfully!

Correct, I store directly into the root of the bucket, no subdirectory.

So I don't really see a point in using anything other than the AWS SDK. I will give it a try tomorrow, but I think the problem is somewhere else. The TARGET_PATH the UI produces does work on Linux and Windows via the Duplicati CLI.

I don't see how #4972 fixes this bug, because the initial setup seems to be correct, and as far as I can see the AWS SDK should normally work just like it does with the CLI from Duplicati.

I will do more testing tomorrow. Thank you for your super fast replies!

I don't think that's a correct description of the problem. The option change is saved and is taken into account; however, when going back to edit the option again, the backup job always appears with the AWS option selected. When this problem occurs, if one doesn't save, the Minio option is still active.

Unclear what bug that is. It should fix what it claims to fix, and possibly also the revert to the AWS SDK.
What it claims to fix is the disappearing Advanced option on screen 2. The workaround is to use screen 5.
We don't know if minio is needed anyway, as it seems you now have the AWS SDK working OK sometimes.

Confirming that the problem behaves as described just above, on Windows. One can see minio in an Export As Command-line or in the GUI Commandline. I think that's the same as the problem it tries to fix.

I'm now unclear what works and what doesn't. Is it that Duplicati.CommandLine.exe backup works, but the GUI gets the socket error? That would seem rather odd, but I have no S3, so I can't investigate.

Yes, to be continued. Especially with help from developers, perhaps we can figure out what’s going on.

Correct. The command line works with the connection string the GUI generated, using duplicati-cli.
Only the scheduled backup using the GUI fails.

Looks like a problem with async tasks, maybe.

It would help to be sure that that is the only difference. On the other hand, if something else differs, it could be the true reason. I'd expect that in the CLI you did just a minimal check, while in the GUI you tried to upload a whole bunch of data. That's a very different proposition; in fact, you can run into specific problems when dealing with a real-life workload. The first thing that comes to mind is the infamous 5 GB limit.

I don't have a complicated setup. I use the default settings and add my credentials for AWS S3. I just selected some Docker Compose YAML files; nothing special, and these are just a few KB each.

I can't see how this basic setup should be an issue. I don't really know what to do or test to help the devs find the problem.