The stream does not support concurrent IO read or write operations

I have Duplicati working on one of my machines, but the other, a Windows Server 2016 machine, shows the following error every time the job runs from a schedule.

Failed: The stream does not support concurrent IO read or write operations.
Details: System.NotSupportedException: The stream does not support concurrent IO read or write operations.
at Duplicati.Library.Main.BackendManager.WaitForComplete(LocalDatabase db, IDbTransaction transation)
at Duplicati.Library.Main.Operation.BackupHandler.Run(String[] sources, IFilter filter)
at Duplicati.Library.Main.Controller.<>c__DisplayClass16_0.b__0(BackupResults result)
at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)

I have searched the forum and am unable to locate anything that assists me in troubleshooting this event. Can anyone provide any suggestions on this error?

Thanks in advance!

Hi @JimL243, welcome to the forum!

Can you provide a few more details such as:

  • what version of Duplicati are you running?
  • to what destination are you backing up?
  • what was happening when the error occurred?

That last item is basically asking whether you were doing something like cancelling an in-progress job, which MIGHT have been the cause of a similar "stream does not support concurrent IO read or write operations" error mentioned in another topic on this forum.

Hi!
We are running Duplicati 2.0.2.1_beta x64
We are backing up to an S3 bucket at wasabi.com
This occurs every time we perform a scheduled backup.

Thanks again for your help and looking into this.

-Jim

Thanks for the info.

Since the other topic showing this error involves actual S3 as the destination, it's possible there's either a bug in Duplicati's use of the S3 (and compatible) APIs or that the APIs themselves have a failure scenario.

My GUESS is that if we look more deeply into your logs we’ll find something similar to the other topic where there are two different operations on the same file name (in their case a rename AND a put) going on at almost the same time.
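If you're able to run the job from the command line, one way to get that kind of detail is to write a log file at a verbose level (the destination URL and paths below are just placeholders for your own values):

    Duplicati.CommandLine.exe backup <destination-url> "<source-folder>" --log-file=C:\temp\duplicati-debug.log --log-level=Profiling

The Profiling level is very chatty, but it records each backend operation with a timestamp, which should make it easy to spot two operations hitting the same file name back to back.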

If Duplicati sends these requests to the S3 (compatible) destination too quickly, the destination may not be done processing the first one when the second one comes in. If we can verify in your logs that this is what's going on, then there may be a way to slow down those requests.
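Just as a hedged suggestion (I haven't confirmed that request pacing is the problem here, and these are generic advanced options rather than a known fix), you could try adding something like the following to the job:

    --synchronous-upload=true
    --throttle-upload=2MB

The --synchronous-upload option tells Duplicati to wait for each upload to finish before moving on, and --throttle-upload caps the upload bandwidth; both should space out the requests going to the destination.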

Another guess is that this job worked fine for a while and THEN started throwing this error…

If it happens with every run (which is good for debugging), then you might also want to try running your job with the --no-auto-compact option enabled. This turns off the compacting step that downloads multiple small / sparsely used archives and re-compresses them into fewer full-sized / fully utilized ones (a step that would potentially have renames and puts close to each other).
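In the GUI that's just an advanced option on the job; from the command line it would look something like this (destination and source are placeholders):

    Duplicati.CommandLine.exe backup <destination-url> "<source-folder>" --no-auto-compact=true

Keep in mind that with auto-compact disabled, small leftover volumes will accumulate at the destination over time, so treat this as a diagnostic step rather than a permanent setting.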