Found [lots of] files that are missing from the remote storage (FTP problem solved!)

My backup with lots of files and gigabytes was constantly failing with missing files, but the files were on the server. So I looked in the FTP logs (pure-ftpd in my case).
The last command from Duplicati before it failed was mlsd /my_backup.
So I connected from the command line with an FTP client, ran mlsd /my_backup, and the response was: 226 Output truncated to 10000 matches. Truncated, you say?! I connected via SSH and counted 10915 files on that server. So the FTP server was messing up the backup results.
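If you want to reproduce the comparison, here is a rough sketch (the host, credentials, and paths are placeholders; curl sends NLST rather than MLSD, so details may differ, but if the server truncates the listing the two counts won't match):

```
# Count what the FTP server actually returns (curl uses NLST with --list-only)
curl --list-only ftp://user:password@ftp.example.com/my_backup/ | wc -l

# Count the files directly on the server over SSH for comparison
ssh user@ftp.example.com 'find /srv/ftp/my_backup -maxdepth 1 -type f | wc -l'
```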
In the case of pure-ftpd, I found there is a LimitRecursion 10000 8 setting in pure-ftpd.conf, which should be set to a much higher number. After setting it to a million files, the backup happily finished without errors.
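For reference, the change in pure-ftpd.conf looks roughly like this (one million is just a value that is comfortably above my file count; pick whatever exceeds yours):

```
# Default: list at most 10000 entries, 8 levels deep
# LimitRecursion 10000 8

# Raised so the backup folder can be listed completely
LimitRecursion 1000000 8
```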

So if you are backing up to an FTP server (or another storage service), check whether it has a file-listing limit.

To the devs: you should implement a check for the truncation message at the end of the file listing and throw an error.

@Kowach Thanks for the information.

The 226 return code, per the RFC, indicates that the data connection will be closed and the operation was successful, so it's not an error code.

Detecting this would need to be a regex match on this message, which is specific to pure-ftpd, but indeed it breaks the functionality and is potentially hard to diagnose; great catch and thanks for sharing. I've created an issue.
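For illustration only, a minimal Python/ftplib sketch of that kind of check might look like the following (the regex, host, credentials, and path are placeholders, and Duplicati's actual C# implementation will differ):

```python
import re
from ftplib import FTP

# Matches the pure-ftpd truncation notice, e.g. "226 Output truncated to 10000 matches"
TRUNCATION_RE = re.compile(r"truncated to \d+ matches", re.IGNORECASE)

def list_backup_files(host, user, password, path):
    """List remote files via MLSD and fail loudly if the server truncated the output."""
    with FTP(host) as ftp:
        ftp.login(user, password)
        names = []
        # retrlines returns the final server reply (the 226 line) after the transfer
        final_reply = ftp.retrlines(f"MLSD {path}", names.append)
        if TRUNCATION_RE.search(final_reply):
            # 226 is a success code, so the truncation can only be spotted
            # in the human-readable text of the reply
            raise RuntimeError(f"Remote file listing is incomplete: {final_reply!r}")
        return names
```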


@Kowach FYI there is already a PR for this detection that should make its way into Canary/Stable soon enough.

Thanks again for reporting.

If a pure-ftpd server is detected, a warning is issued when executing a server file listing.

The parameter --ignore-pureftpd-limit-issue allows the warning to be suppressed.

In the event of an actual truncation, it will throw an exception, and that cannot be suppressed, as an incomplete listing can lead to misleading backup/restore operations.
