System.Net.WebException: The remote server returned an error: (451)

My Duplicati install just finished a 1.27 TB backup that I had compressed into 100 MB files, resulting in 26,803 files on the destination. When the backup finished, it ‘failed’.

When I did a retry, I got this error:

~~~
Fatal error
System.Net.WebException: The remote server returned an error: (451) Local error in processing.
   at Duplicati.Library.Main.BackendManager.List()
   at Duplicati.Library.Main.Operation.FilelistProcessor.RemoteListAnalysis(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, String protectedfile)
   at Duplicati.Library.Main.Operation.FilelistProcessor.VerifyRemoteList(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, String protectedfile)
   at Duplicati.Library.Main.Operation.BackupHandler.PreBackupVerify(BackendManager backend, String protectedfile)
   at Duplicati.Library.Main.Operation.BackupHandler.Run(String[] sources, IFilter filter)
~~~

I’ve also noticed that when I go to the backup and run ‘Test connection’, I get a similar problem: it sits there for a while and then returns the error:

Failed to connect: The remote server returned an error: (451) Local error in processing.

So I’m assuming that when it does its connection test, it’s counting all 26,803 files and then timing out.

I’m thinking one way to fix the problem would be to redo the backup with larger files, but I was hoping not to have to do that. Is there a way I can extend the timeout?

I’m not convinced it’s a timeout causing the issue (timeouts don’t usually manifest during the “Test connection” call), but if you are using a Canary version of Duplicati, this post mentions a --http-operation-timeout parameter that sets the number of minutes before the operation times out (the default is 10); you could try raising it and see if the error goes away.
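
For reference, the option is added like any other advanced option, either in the backup’s Advanced options list or appended to a command-line invocation. A minimal sketch, assuming the value is given as a timespan such as 20m (the value here is illustrative, not from this thread; check the option’s help text for the exact format):

~~~
--http-operation-timeout=20m
~~~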

And if you’re using Backblaze (B2) as your destination, you could look at the --b2-page-size parameter to lower the number of files returned in each request (I’m not sure what the default is for that).

Actually, since we know the default timeout is 10 minutes, you should be able to check the logs and see whether this step is taking a full 10 minutes before it errors out…

Oh, and I edited your post to make the error message a little easier to read (just add ~~~ before and after the text).

I’m not using a Canary version, and my destination is a local NAS running an FTP site.

I did a couple more tests.

  1. I ran the test connection and it looks like it times out at 3 minutes.
  2. I changed the directory of the test connection to an empty directory and the test passed fine, so it definitely has something to do with the content.
  3. I also tested connecting to the FTP server via FileZilla, and while it did take a bit of time to list the directory, it did work, which makes it look like a Duplicati issue rather than an FTP issue (see the timing sketch below).
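
As a cross-check outside of both Duplicati and FileZilla, the raw listing can be timed with a few lines of Python. A minimal sketch, assuming plain (non-TLS) FTP, with placeholder host, credentials, and path:

~~~
# Time a plain FTP directory listing, independent of Duplicati.
# Host, credentials, and path below are placeholders.
import time
from ftplib import FTP

ftp = FTP("nas.example.local", timeout=600)   # generous socket timeout, in seconds
ftp.login("USER", "PASSWORD")
ftp.cwd("/backups/duplicati")                 # the folder holding the 26,803 files

start = time.time()
names = ftp.nlst()                            # NLST: bare file-name listing
print(f"{len(names)} entries listed in {time.time() - start:.1f} s")
ftp.quit()
~~~

If this also takes several minutes, the server-side listing itself is slow and raising the timeout is the right lever; if it comes back quickly, the problem is more likely on the Duplicati side of the FTP conversation.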

Should I upgrade to the latest Canary build and try increasing the --http-operation-timeout parameter?

Unless you really want to upgrade to a Canary version, you could try a portable install (or install on another machine) and just try the “Test connection” step with --http-operation-timeout increased.

If the problem still manifests then we know that’s not the cause.

I have Duplicati installed on one of my NAS units. When I do a ‘Test connection’ to the same FTP server with a test directory, it connects fine. If I do a test connection to the backup directory with the files in it, I get the “Failed to connect: The remote server returned an error: (451) Local error in processing.” error.

I upgraded to the latest Canary build (2017-10-20), and when I do a test connection I get this error:

Failed to connect: The remote server returned an error: (450) File unavailable (e.g., file busy).

I think that is a problem with the FTP server. Due to the way the FTP protocol was designed, it is very susceptible to denial-of-service attacks. For this reason most FTP servers limit the size of the “list files” response, and I think this is what you are seeing.

We have discussed using a “split folder” approach where files are stored in subfolders, but this has not been implemented yet.

If you cannot tweak the parameters of the FTP server, you need to either use another protocol (SFTP or WebDAV, for instance) or increase the volume size (fewer but larger files). For the last option, you need to consult the FTP server documentation to figure out what maximum listing size it permits, and make sure you stay under that.
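
To put rough numbers on the last option: Duplicati uploads one dindex file alongside every dblock volume (plus a dlist file per backup version), so the remote file count scales inversely with the volume size (the “Remote volume size” / --dblock-size setting). A back-of-the-envelope sketch, assuming only that dblock/dindex pairing and ignoring dlist files:

~~~
# Rough estimate of the remote file count for a given volume size.
# Assumes one dindex file per dblock volume and ignores dlist files.
def remote_file_count(source_bytes: int, volume_bytes: int) -> int:
    dblocks = -(-source_bytes // volume_bytes)   # ceiling division
    return 2 * dblocks                           # dblock + matching dindex

MB = 1000**2
source_bytes = 1_270_000_000_000                 # the 1.27 TB backup from this thread

for size_mb in (100, 500, 2000):
    n = remote_file_count(source_bytes, size_mb * MB)
    print(f"{size_mb:>5} MB volumes -> ~{n:,} remote files")

# Output (roughly):
#   100 MB volumes -> ~25,400 remote files   (same ballpark as the 26,803 seen here)
#   500 MB volumes -> ~5,080 remote files
#  2000 MB volumes -> ~1,270 remote files
~~~

Even a 500 MB volume size would shrink the listing by a factor of five, which may be enough to stay under whatever limit the FTP server enforces.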

I can list the FTP directory with other software without issue, so I don’t think it’s a list limit.
BUT! I just changed the destination to SFTP (SSH) and the test connection worked! So I’m retrying the backup and we’ll see how we go!


Good call! I always go for SFTP first if it’s an option, just on general security principles.