Found 33 files that are missing from the remote storage, please run repair

Agreed.

Agreed, unfortunately. So SFTP is a different access method to the very same folder of files that FTP fails on?
There was also a repair thrown in, which muddies this up some. Do you have a lot of files on the destination?
Some FTP servers are known to limit the size of directory listings, which might explain why many files aren’t seen.

You can probably try something like Verify files, then go to the job → Show log → Remote to look.
Expand the list item for the FTP destination, and use browser search for some file that is getting a false Missing.
If it doesn’t show up in the listing, that’s why Duplicati thinks the file is now missing.

If Duplicati’s list is not seeing files that are there, you could also test using another FTP program.
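
For a quick check outside of any GUI client, curl can request a plain FTP name listing; a minimal sketch, with host, credentials, and path as placeholders:

curl -s --list-only "ftp://user:password@server/backup/" > curl-listing.txt

Count the lines and search the file for one of the names Duplicati reports as missing; if the count stops at a suspiciously round number, the server is probably truncating the listing.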

Something must have changed. If your computer was upgraded and the backup is using the default FTP driver, it is relying on .NET to access FTP. Maybe that’s what has changed. You could try switching to Alternate FTP to test this idea.
General note: when you change the driver for a backup, remove the backend advanced options before switching (note them down if you want to be able to switch back, or better, do a JSON export of your backup if you haven’t already).

Yeah, it’s the same folder and permissions that I’m using for both. Same profile and database too; just the destination type changes. Both point to the same place.

When it runs and says certain files are missing using FTP, I can see those files on the server, and the permissions look correct for FTP access.

I tried using the alternative one, but I’ve found it’s tricky to get working with SSL.

Just tried again and it errors using SFTP too, so something isn’t quite right regardless of the access method.

So if I just try and run the original job I get this

Duplicati.Library.Interface.RemoteListVerificationException: Found 42 files that are missing from the remote storage, please run repair
at Duplicati.Library.Main.Operation.FilelistProcessor.VerifyRemoteList(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, IEnumerable`1 protectedFiles)
at Duplicati.Library.Main.Operation.BackupHandler.PostBackupVerification(String currentFilelistVolume)
at Duplicati.Library.Main.Operation.BackupHandler.<RunAsync>d__20.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at CoCoL.ChannelExtensions.WaitForTaskOrThrow(Task task)
at Duplicati.Library.Main.Controller.<>c__DisplayClass14_0.<Backup>b__0(BackupResults result)
at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)
at Duplicati.Library.Main.Controller.Backup(String[] inputsources, IFilter filter)
at Duplicati.Server.Runner.Run(IRunnerData data, Boolean fromQueue)

The first failed log that I can see for the job says this

Duplicati.Library.Interface.RemoteListVerificationException: Found 30 files that are missing from the remote storage, please run repair
at Duplicati.Library.Main.Operation.FilelistProcessor.VerifyRemoteList(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, IEnumerable`1 protectedFiles)
at Duplicati.Library.Main.Operation.BackupHandler.PostBackupVerification(String currentFilelistVolume)
at Duplicati.Library.Main.Operation.BackupHandler.<RunAsync>d__20.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at CoCoL.ChannelExtensions.WaitForTaskOrThrow(Task task)
at Duplicati.Library.Main.Controller.<>c__DisplayClass14_0.<Backup>b__0(BackupResults result)
at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)
at Duplicati.Library.Main.Controller.Backup(String[] inputsources, IFilter filter)
at Duplicati.Server.Runner.Run(IRunnerData data, Boolean fromQueue)

Running the job and monitoring the live log output, I can see the missing files are in the format of

duplicati-ifb16dfdf3cef4325b4feef91ac03ca5c.dindex.zip.aes

Well, Duplicati has a debugging tool for backends called Duplicati.CommandLine.BackendTool.exe. You run it by exporting the job as a command line, grabbing the connection string, then going to the Duplicati directory and starting

Duplicati.CommandLine.BackendTool.exe list (connection string)

then comparing with what you can find on the server by other means. Since it’s FTP, you can always use a free FTP client such as FileZilla or WinSCP. This list feature is exactly what the web UI does to find out whether files referenced in the database are missing on the backend.
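
For a large destination it’s easier to capture the output and search it; a rough sketch, assuming a Windows install, with the connection string as a placeholder:

Duplicati.CommandLine.BackendTool.exe list "ftp://user@server/backup/?auth-password=..." > listing.txt
findstr "duplicati-ifb16dfdf3cef4325b4feef91ac03ca5c.dindex.zip.aes" listing.txt

If findstr prints nothing, the server never returned that file in the listing.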

So, using that command, I’ve outputted the listing to a text file due to the number of files. If I search that file for a file that Duplicati says is missing, it isn’t there.

If I use FileZilla to browse to the same folder, using the same credentials, and search for the file, it is there.

I may have had a breakthrough. I’m just testing.

So I think it’s fixed.

The issue is FTP, specifically the Pure-FTPd server: by default (I assume) it limits recursive listings to 10,000 files. There are more files than that in this set now.

How FileZilla was able to see it and Duplicati wasn’t, I don’t know, as I would have thought that limit applied everywhere.

What I’ve done is change the limit in Pure-FTPd to 20,000 (in the main config file and/or by creating a file called “LimitRecursion” containing “20000 8” in the /etc/pure-ftpd/conf directory; the first number is the maximum files returned in a listing, the second the maximum recursion depth).
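
For reference, on a Debian-style install where Pure-FTPd reads one setting per file from that directory, the change looks roughly like this (the restart command is an assumption; use whatever your init system provides):

echo "20000 8" > /etc/pure-ftpd/conf/LimitRecursion
systemctl restart pure-ftpd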

Once I’d done this and restarted the server, I re-ran the tool and was able to see the previously missing file.

I ran a repair on the database and then ran a backup, which was successful.
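
For anyone doing this from a terminal instead of the web UI, the rough equivalent is (a sketch; the storage URL and database path are placeholders taken from the job’s command-line export):

Duplicati.CommandLine.exe repair "ftp://user@server/backup/?auth-password=..." --dbpath="C:\path\to\job.sqlite"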

I’ll keep an eye on it, but thanks so much for your help. And if this was the issue, hopefully it’s a quick fix to change the FTP server setting, as it wasn’t anything to do with Duplicati.


FTP only lists a max 2000 files, is it possible to use subfolders? #1957

Some Microsoft cloud products give you an empty list once you’re past 5000 files.
That’s less confusing than a truncated listing, but it can be very alarming when you get the error.

The recent Microsoft Graph technology seems to prevent this problem.
As older technology phases out, the problem with Microsoft decreases.

Logic and usecase for remote subfolders

Thanks for the specific directions. Another option is to slowly ramp up the Remote volume size option, causing compact to repackage smaller dblock files into larger ones, which helps reduce the total file count.
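
In the web UI that’s the Remote volume size field on the Options page; on the command line it corresponds to the --dblock-size advanced option, e.g. (the value is just an illustration):

--dblock-size=200MB

Note that a larger size only applies to newly written volumes; compact consolidates the older, smaller ones over time.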

I think I’m going to do this for all my backups that use FTP.

I found this surprising, so I installed Pure-FTPd 1.0.50 and connected to it using FileZilla 3.66, and could not reproduce this; FileZilla was not able to bypass the server restriction.

I wasn’t sure either. Maybe my FileZilla didn’t list more than the 10,000 limit either, and the ‘missing’ file just happened to be among the entries it did return? Not sure, but I think the server limit was still being respected.

When I ran the Duplicati.CommandLine.BackendTool.exe list originally, it was definitely limited to 10,000 files until I made the change.

I was hoping that there was some secret way to call the FTP list function past that limit - after all, Alternate FTP has specific code for some FTP servers - but it doesn’t seem so.

I have no doubt about that; it’s how it is supposed to work.