HELP Backup to Azure fails at "Waiting for upload"

I have a backup configured to Azure Blob Storage that used to work well, but a few days ago I received this error by email:

Failed: The request was aborted: The request was canceled.
Details: Microsoft.WindowsAzure.Storage.StorageException: The request was aborted: The request was canceled. ---> System.Net.WebException: The request was aborted: The request was canceled.
   at System.Net.ConnectStream.InternalWrite(Boolean async, Byte[] buffer, Int32 offset, Int32 size, AsyncCallback callback, Object state)
   at System.Net.ConnectStream.Write(Byte[] buffer, Int32 offset, Int32 size)
   at Microsoft.WindowsAzure.Storage.Core.ByteCountingStream.Write(Byte[] buffer, Int32 offset, Int32 count)
   at Microsoft.WindowsAzure.Storage.Core.Util.StreamExtensions.WriteToSync[T](Stream stream, Stream toStream, Nullable`1 copyLength, Nullable`1 maxLength, Boolean calculateMd5, Boolean syncRead, ExecutionState`1 executionState, StreamDescriptor streamCopyState)
   at Microsoft.WindowsAzure.Storage.Core.Executor.Executor.ExecuteSync[T](RESTCommand`1 cmd, IRetryPolicy policy, OperationContext operationContext)
   --- End of inner exception stack trace ---
   at Duplicati.Library.Main.Operation.BackupHandler.HandleFilesystemEntry(ISnapshotService snapshot, BackendManager backend, String path, FileAttributes attributes)
   at Duplicati.Library.Main.Operation.BackupHandler.RunMainOperation(ISnapshotService snapshot, BackendManager backend)
   at Duplicati.Library.Main.Operation.BackupHandler.Run(String[] sources, IFilter filter)
   at Duplicati.Library.Main.Controller.<>c__DisplayClass16_0.<Backup>b__0(BackupResults result)
   at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)
Request Information

I then deleted the backup and started it from scratch. But again, after uploading 40GB (the full backup size), I got the same error at the “Waiting for upload” stage.

I tested with a new backup and it works fine with small files. When I added a “big” file (100MB or more), it gave the same error. On this “test backup”, I solved the issue by changing the “block size” from 50MB to 20MB. But on the 40GB “production backup” this trick doesn’t work.

I noticed that at the end of the backup, Duplicati tries to upload an 80MB file; it then shows the error message, tries to rename the “block”, and the backup fails.


Why is Duplicati uploading an 84MB file if my settings are configured for a 20MB block size?

How can I solve this, or what more information can I provide?


The final file is the “file list”, and if you have a lot of files, it can be large. There is no support for splitting this list across multiple zip files, unfortunately.
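A rough sketch of why that last file can exceed the configured block size: the file list holds one entry per backed-up file, so its size scales with the number of files, not with the dblock size. The entry fields, paths, and hash placeholders below are made up for illustration; this is not Duplicati’s actual dlist format.

```python
import io
import json
import zipfile

def filelist_size(num_files: int) -> int:
    """Build an in-memory zip containing a JSON file list (one entry per
    backed-up file) and return the compressed size in bytes."""
    entries = [
        {
            "path": f"/data/dir{i // 100}/file{i}.bin",  # hypothetical paths
            "hash": "0" * 44,   # placeholder for a base64-encoded hash
            "size": 1024,
            "time": "2017-01-01T00:00:00Z",
        }
        for i in range(num_files)
    ]
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as z:
        z.writestr("filelist.json", json.dumps(entries))
    return buf.tell()

# The list grows with file count; no dblock setting appears anywhere above.
print(filelist_size(1_000), filelist_size(100_000))
```

This is also why splitting one backup into several smaller ones shrinks the final upload: each backup then lists only its own subset of files.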

Thanks for the answer!

Is there a workaround for this? I think it happens because of my slow internet connection. The strange part is that this backup had been configured for months and always worked great.

If we solve the problem with “block sizes” larger than 20MB, I think the backup will work again.

Can I give you more information or anything else?


Not really, but if you split your backup into multiple backups, each one will have a smaller file list.

I split the backup and it works! Thanks.


I have a backup of only about 1GB with dblock-size set to 20MB, and I get exactly the same exception in 3 out of 5 backup attempts. I think it is caused by the slow upload speed of my ADSL line, and most probably some timeout in the Azure SDK is triggering it.

I have read somewhere on the Internet that this can be tuned by changing the BlobRequestOptions (MaximumExecutionTime, ServerTimeout) of Microsoft.WindowsAzure.Storage.Blob, but Duplicati does not expose these settings.

I am thinking of adding this option to Duplicati myself and running some tests, but I wanted to check here whether somebody has already found another solution to this error.

Thank you

To verify your suggestion, you could set up rclone against your backend and change Duplicati to use rclone. I believe rclone does expose those advanced settings.

If that’s the case (and you have the issue proposed above), then somebody with an Azure account should be able to replicate the problem by setting the dblock size to your final “file list” size and then using the built-in throttle feature (or an external tool) to simulate a slow connection.

If you’re using a newer canary or experimental version, I believe there are now http-timeout and http-retry advanced parameters that might help work around this issue.