My Duplicati install finished a 1.27 TB backup that I had compressed into 100 MB volumes, resulting in 26,803 files. When the backup finished, it ‘failed’.
When I did a retry I got this error:
~~~
System.Net.WebException: The remote server returned an error: (451) Local error in processing.
at Duplicati.Library.Main.Operation.FilelistProcessor.RemoteListAnalysis(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, String protectedfile)
at Duplicati.Library.Main.Operation.FilelistProcessor.VerifyRemoteList(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, String protectedfile)
at Duplicati.Library.Main.Operation.BackupHandler.PreBackupVerify(BackendManager backend, String protectedfile)
at Duplicati.Library.Main.Operation.BackupHandler.Run(String sources, IFilter filter)
~~~
I’ve also noticed that when I go to the backup and do a ‘TEST CONNECTION’ I get a similar problem: it sits there for a while, then returns the error:
~~~
Failed to connect: The remote server returned an error: (451) Local error in processing.
~~~
So I’m assuming that when it does its connection test it is listing all 26,803 files and then timing out.
I’m thinking one way to fix the problem would be to redo the backup with larger volumes, but I was hoping not to have to do that. Is there a way I can extend the timeout?
I’m not convinced it’s a timeout causing the issue (they don’t usually manifest during the “test connection” call), but if you are using a Canary version of Duplicati, this post mentions an --http-operation-timeout parameter that sets the number of minutes before it times out (default is 10); you could raise it and see if the error goes away.
And if you’re using Backblaze (B2) as your destination, you could look at the --b2-page-size parameter to lower the number of files returned in each request (I’m not sure what the default is for that).
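Just as a sketch of how those two options might be combined on the command line (the bucket path and values here are purely illustrative, and --http-operation-timeout is a Canary-only option per the post above, so check `duplicati-cli help advanced` on your build for the exact names and value format):

```shell
# Illustrative only: raise the HTTP operation timeout and shrink B2 list pages.
# Verify both option names and value formats against your build's
# `duplicati-cli help advanced` output before relying on this.
duplicati-cli backup "b2://my-bucket/my-folder" /source \
    --http-operation-timeout=30m \
    --b2-page-size=500
```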
Actually, since we know the default timeout is 10 minutes, you should be able to view the logs and see whether this step takes 10 minutes before it errors out…
Oh, and I edited your post to make the error message a little easier to read (just add ~~~ before and after the text).
I have Duplicati installed on one of my NASes. When I do a ‘test connection’ to the same FTP server with a test directory, it connects fine. If I do a test connection to the backup directory with the files in it, I get the “Failed to connect: The remote server returned an error: (451) Local error in processing.” error.
I think that is a problem with the FTP server. Due to the way the FTP protocol was designed, it is very susceptible to denial-of-service attacks. For this reason most FTP servers limit the size of the “list files” response, and I think this is what you are seeing.
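One way to check whether the server’s “list files” response is the problem is to request the listing outside Duplicati, e.g. with curl (host, credentials, and path below are placeholders):

```shell
# Placeholders: substitute your own host, user, and backup path.
# The trailing slash makes curl issue a directory LIST; if the server
# limits listing size, this should fail with a similar 4xx reply.
curl --user myuser:mypass "ftp://ftp.example.com/backup-folder/" -o listing.txt
wc -l listing.txt   # a full listing should be roughly 26,803 lines
```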
We have discussed using a “split folder” approach where files are stored in subfolders, but this has not been implemented yet.
If you cannot tweak the parameters of the FTP server, you need to either use another protocol (SFTP, or WebDAV for instance) or you can increase the volume size (fewer but larger files). For the last option, you need to consult the FTP server documentation to figure out what maximum size it permits, and make sure you stay under that.
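For the larger-volumes option, the upload volume size is controlled by Duplicati’s --dblock-size option; going from 100 MB to 1 GB volumes would cut the file count from 26,803 to roughly 2,700. A sketch, with an illustrative destination and size:

```shell
# Illustrative: switch to 1 GB volumes (~2,700 files instead of 26,803).
# As I understand it, --dblock-size only affects newly uploaded volumes;
# existing ones keep their old size until they are compacted.
duplicati-cli backup "ftp://ftp.example.com/backup-folder" /source \
    --dblock-size=1GB
```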
I can list the FTP directory with other software without issue, so I don’t think it’s a list limit.
BUT! I just changed it to SFTP (SSH) and the test connection worked! So I’m retrying the backup and we’ll see how it goes!