Backup location via FTP on AVM FritzBox fails

This is my first time actually using Duplicati.

I’m trying to set up a 2.5" HDD that’s hooked up to an AVM FRITZ!Box as the backup location and copy the files via FTP.

Unfortunately, the initial backup fails after a couple of minutes. According to the logs, the connection failed.

Is there a way to get more details about this? Is there some kind of ftp time-out causing this problem?

System.Net.WebException: Unable to connect to the remote server.

   at Duplicati.Library.Main.Operation.BackupHandler.HandleFilesystemEntry(ISnapshotService snapshot, BackendManager backend, String path, FileAttributes attributes)

   at Duplicati.Library.Main.Operation.BackupHandler.RunMainOperation(ISnapshotService snapshot, BackendManager backend)

   at Duplicati.Library.Main.Operation.BackupHandler.Run(String[] sources, IFilter filter)

   at Duplicati.Library.Main.Controller.<>c__DisplayClass17_0.<Backup>b__0(BackupResults result)

   at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)

   at Duplicati.Library.Main.Controller.Backup(String[] inputsources, IFilter filter)

   at Duplicati.Server.Runner.Run(IRunnerData data, Boolean fromQueue)

Hi @TOMillr

I don’t think there’s detailed logging specifically for FTP (unfortunately, the same is generally true for the other backends), but there is detailed generic logging available about things like Get and Put operations and their transfer rates.

A log level of Information is enough to see file actions. You can also use the WebUI at About -> Show log -> Live, where you can choose whatever level you like. To see the transfer rates for individual files (rather than aggregates), use the Profiling level.

You can also search for similar problems with FTP and/or the FRITZ!Box. There’s a long discussion covering both here.

Debugging failures at the protocol level may require expert help plus netstat or a packet capture, which may be more effort than you want to take on.

If you create a very small test backup to the FTP destination does it work? If so, how much more source data can you add before it fails and does cutting back allow it to start working again?
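If it helps to check the FTP destination outside Duplicati first, here is a minimal Python sketch (the host, user, and password are placeholders you would fill in for your FRITZ!Box) that confirms whether a plain FTP login and a tiny upload succeed at all:

```python
import io
from ftplib import FTP, all_errors

def ftp_upload_test(host, user, password, timeout=10):
    """Log in to the FTP server and upload a tiny test file.
    Returns None on success, or the error message on failure."""
    try:
        with FTP() as ftp:
            ftp.connect(host, 21, timeout=timeout)
            ftp.login(user, password)
            # STOR a few bytes so the data channel is exercised too
            ftp.storbinary("STOR duplicati-test.txt", io.BytesIO(b"hello"))
        return None
    except all_errors as exc:
        return str(exc)
```

If this succeeds but Duplicati still fails, the problem is more likely size- or timing-related than basic connectivity.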

You might also want to try setting --no-connection-reuse=true just to see if that makes a difference.

Duplicati will attempt to perform multiple operations on a single connection, as this avoids repeated login attempts and thus speeds up the process. This option can be used to ensure that each operation is performed on a separate connection.
Default value: “false”
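To make the effect of that option concrete, here is a toy sketch (not Duplicati’s actual code) contrasting one login reused for a whole batch of transfers with a fresh login per transfer, which is what --no-connection-reuse=true forces:

```python
from typing import Callable, Iterable

def upload_all(make_conn: Callable, files: Iterable[str], reuse: bool) -> int:
    """Upload files either over one shared connection (reuse=True) or
    over a fresh connection per file (reuse=False).
    Returns the number of logins performed, to show the difference."""
    logins = 0
    if reuse:
        conn = make_conn()          # one login for the whole batch
        logins += 1
        for f in files:
            conn.put(f)
        conn.quit()
    else:
        for f in files:
            conn = make_conn()      # re-login before every transfer
            logins += 1
            conn.put(f)
            conn.quit()             # drop the connection again
    return logins
```

Reconnecting per operation is slower, but it can sidestep servers or routers that silently drop idle or long-lived control connections.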

There is also an FTP backend called “FTP (Alternative)” on the selector that uses a third-party library instead of the one included in .NET or Mono. You could try that. For the “FTP” storage type, also look at the Advanced options called ftp-regular and ftp-passive. They are mutually exclusive, but sometimes one FTP mode works better than the other; passive mode is more router/firewall friendly. I thought it was the Duplicati default, but the Advanced options list disagrees…
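If you want to test both transfer modes outside Duplicati, Python’s ftplib exposes the same choice via set_pasv. This sketch (host and credentials are again placeholders) tries a directory listing in passive mode first, then active mode:

```python
from ftplib import FTP, all_errors

def working_ftp_mode(host, user, password, timeout=10):
    """Try a directory listing in passive mode, then active (PORT) mode.
    Returns "passive", "active", or None if neither data channel works."""
    for passive in (True, False):
        try:
            with FTP() as ftp:
                ftp.connect(host, 21, timeout=timeout)
                ftp.login(user, password)
                ftp.set_pasv(passive)  # False switches to active mode
                # LIST forces a data connection, which is where the
                # passive/active difference actually shows up
                ftp.retrlines("LIST", lambda line: None)
            return "passive" if passive else "active"
        except all_errors:
            continue
    return None
```

Whichever mode this reports as working is the one to mirror in Duplicati via ftp-passive or ftp-regular.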

Is all of this happening on a local area network, as opposed to across the Internet (which has security worries)?