Slow Internet, backup stalls, no errors

Running current beta 2.0.2.1
Win10, all updates
Uploading to FTP site

I have one computer that can’t seem to complete a backup. I’ve restarted it several times, tried delete/repair, but it always seems to get stuck on something and I don’t know where to look.

This site is extremely slow - it is 5/1 Mbps DSL only, so it takes a long time for any zip segment to get transferred.
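
To put that in perspective: at roughly 1 Mbps upstream, a single 50 MB segment works out to about 50 × 8 / 1 ≈ 400 seconds, so close to 7 minutes per upload before any protocol overhead.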

It starts to upload the first 50 MB set, but once that completes, the FTP connection closes and there are no further updates or errors from the CLI:

Backup started at 2018-02-13 7:56:11 PM
Checking remote backup …
Listing remote folder …
Re-creating missing index file for duplicati-bff7d80856ff7488584f8cb82768254cb.dblock.zip.aes
Scanning local files …
Uploading file (781 bytes) …
1504 files need to be examined (787.70 MB)
1457 files need to be examined (745.49 MB)
1231 files need to be examined (348.92 MB)
Uploading file (49.92 MB) …
126 files need to be examined (311.33 MB)
118 files need to be examined (255.39 MB)

I’ve killed/restarted it 3 times and it is always stuck at 118 files.

Here is the output when running with --verbose:

Checking file for changes C:\Users\User\Downloads\Install\Msi\ControlCenter4.msi, new: True, timestamp changed: True, size changed: True, metadatachanged: True, 2016-11-22 10:10:29 AM vs 0001-01-01 12:00:00 AM
118 files need to be examined (616.45 MB)

edit: Well, I ended up deleting a bunch of files, wiping the db completely, deleting the FTP folder, and starting from scratch. That got it working, but I still don’t know what it was hung up on in the first place or how to troubleshoot it…

Glad to hear it’s working… less glad that it took a nuking to make it behave.

It looked like you were stuck repairing the database for some reason. The main-menu About -> Show Log -> Live page with Profiling selected might have provided more detail.

If you prefer text files, adding --log-file=<path> and --log-level=Profiling to the job would have created a text file with the same information the Live log with Profiling would have had.
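
For example, appended to an exported command line (the log path here is just a placeholder), they would look something like this:

Duplicati.CommandLine.exe backup <target-url> <source-folders> --log-file=C:\Temp\duplicati-profiling.log --log-level=Profiling

The same two options can also be added as advanced options in the job settings if you prefer the GUI.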

I was in the UI Live log section but nothing ever came up, at any log level… not sure why, though.

OK, so I am now on the second computer at the site and it’s also stalling, but this time there is information in the Profiling log:

Feb 13, 2018 10:28 PM: UsageReporter failed
Feb 13, 2018 10:28 PM: UsageReporter failed
Feb 13, 2018 10:28 PM: UsageReporter failed
Feb 13, 2018 10:28 PM: UsageReporter failed
Feb 13, 2018 10:24 PM: CommitAddBlockToOutputFlush took 00:00:00.023
Feb 13, 2018 10:24 PM: Starting - CommitAddBlockToOutputFlush
Feb 13, 2018 10:24 PM: ExecuteReader: SELECT DISTINCT “Hash”, “Size” FROM “Block” WHERE “VolumeID” = ? took 00:00:00.000
Feb 13, 2018 10:24 PM: Starting - ExecuteReader: SELECT DISTINCT “Hash”, “Size” FROM “Block” WHERE “VolumeID” = ?
Feb 13, 2018 10:24 PM: ExecuteScalarInt64: INSERT INTO “Block” (“Hash”, “VolumeID”, “Size”) VALUES (?, ?, ?); SELECT last_insert_rowid(); took 00:00:00.000
Feb 13, 2018 10:24 PM: Starting - ExecuteScalarInt64: INSERT INTO “Block” (“Hash”, “VolumeID”, “Size”) VALUES (?, ?, ?); SELECT last_insert_rowid();
Feb 13, 2018 10:24 PM: ExecuteScalarInt64: SELECT “ID” FROM “Block” WHERE “Hash” = ? AND “Size” = ? took 00:00:00.000

10:24 PM is when this computer stalled at 228 files (2.55 GB) to go. The FTP transfer continued until the 50 MB segment completed, but now it’s stuck at 52,346,650 bytes and won’t close the transfer.

edit: I killed the Duplicati.server.exe task and started it again. The log shows it stalling on the same line - Feb 13, 2018 10:46 PM: CommitAddBlockToOutputFlush took 00:00:00.013

If it makes a difference, I am running Duplicati as a service.

Found this in the General log:

Feb 13, 2018 10:46 PM: Message
removing file listed as Temporary: duplicati-20180214T062437Z.dlist.zip.aes
Feb 13, 2018 10:46 PM: Message
Uploading filelist from previous interrupted backup
Feb 13, 2018 10:46 PM: Message
scheduling missing file for deletion, currently listed as Uploading: duplicati-i88da95bd766849968cbe9f76a3920d99.dindex.zip.aes
Feb 13, 2018 10:46 PM: Message
scheduling missing file for deletion, currently listed as Uploading: duplicati-b2c31324000d04acbb180f860588a6dee.dblock.zip.aes
Feb 13, 2018 10:46 PM: Message
scheduling missing file for deletion, currently listed as Uploading: duplicati-i302ec13a2add4e79b5feab8e889e5efd.dindex.zip.aes
Feb 13, 2018 10:46 PM: Message
scheduling missing file for deletion, currently listed as Uploading: duplicati-b4dd86ad9d84045ba83c589404ebe0a99.dblock.zip.aes
Feb 13, 2018 10:46 PM: Message
scheduling missing file for deletion, currently listed as Uploading: duplicati-i162815fca59b40ae9f0061f38c719fee.dindex.zip.aes
Feb 13, 2018 10:46 PM: Retry
Operation Delete with file duplicati-b3070991a4e244773be1dfcd5ffed995e.dblock.zip.aes attempt 1 of 5 failed with message: The remote server returned an error: (500) Syntax error, command unrecognized.
System.Net.WebException: The remote server returned an error: (500) Syntax error, command unrecognized.
at Duplicati.Library.Utility.AsyncHttpRequest.AsyncWrapper.GetResponseOrStream()
at Duplicati.Library.Utility.AsyncHttpRequest.GetResponse()
at Duplicati.Library.Backend.FTP.Delete(String remotename)
at Duplicati.Library.Main.BackendManager.DoDelete(FileEntryItem item)
at Duplicati.Library.Main.BackendManager.ThreadRun()
Feb 13, 2018 10:46 PM: Message
removing incomplete remote file listed as Uploading: duplicati-b3070991a4e244773be1dfcd5ffed995e.dblock.zip.aes

This is a UI problem in Chrome that is fixed in the latest canary build. If you open the page in IE or Firefox you should see the logs :slight_smile:

OK, I updated this client to the 2.0.2.19 canary build, but got the same result…

The Profiling live log shows up now, but doesn’t report anything:

Feb 14, 2018 6:20 PM: Server has started and is listening on 127.0.0.1, port 8200

Status is stuck at 64 files (2.00 GB) to go.
The FTP server receives the first 50 MB segment, but then nothing after that.

EDIT: The same thing happens when running from the command line:

Backend cleanup finished, retrying verification
Listing remote folder …
Expected there to be a temporary fileset for synthetic filelist (5, duplicati-20180214T064558Z.dlist.zip.aes), but none was found?
Scanning local files …
307 files need to be examined (2.89 GB)
170 files need to be examined (2.48 GB)
101 files need to be examined (2.30 GB)
Uploading file (49.94 MB) …
67 files need to be examined (2.10 GB)
64 files need to be examined (2.03 GB)

It stalls here; FTP receives the first 50 MB segment, then nothing after that.

EDIT2: OK, so I tried lowering the segment size from 50 MB to 10 MB. Currently it’s uploading its fourth 10 MB segment and hasn’t stalled yet.
Is there something that this slow internet is exacerbating? Maybe a process is timing out waiting for the upload to finish, causing everything to stall?
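
For anyone following along, the only thing I changed was the remote volume size, i.e. --dblock-size=10MB instead of the --dblock-size=50MB I had before. Rough math: at my 1 Mbps upstream a 50 MB segment needs around 400 seconds to upload, while a 10 MB one finishes in about 80, so if something is quietly timing out in that window it would explain the difference.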

That’s possible, but I would expect a timeout message in that case.

I’m not sure if it will help, but now that you’re on the latest canary you could try using the --http-operation-timeout parameter, setting it to something longer than the default of 10m.

OK, I stopped the backup, added --http-operation-timeout=20m, changed the block size back to 50 MB, and will report back…

No dice… I’m running from the command line too, so I’m not sure if HTTP even comes into play here.
I notice that after the file finishes uploading, the FTP connection stays open for a while before disconnecting. When it’s operating normally, the connection drops almost immediately after the file finishes uploading.

EDIT: Even at a 20 MB block size, it is stalling.

Have you tried using the aFTP backend? It works better with some servers.
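
If I remember correctly, on an exported command line that is just a change of URL scheme, e.g. the ftps://backup.server.com//Test?… target becoming aftp://backup.server.com//Test?… with the same credentials; the alternative backend has its own SSL/TLS options, so those may need adjusting for an SSL-enabled server.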


D’oh! I got my protocols confused. The HTTP timeout parameter probably won’t apply to an FTP connection. It’s more for WebDAV, S3, and other HTTP-based protocols.

@kenkendk I have tried the alternative backend before (by accident), and it didn’t even connect. I’m using FileZilla FTP server with SSL enabled.

@JonMikelV That’s my thought too. Watching it, it seems like there is a process that watches the FTP connection and, when it sees a block has finished uploading, disconnects and continues on. But if the upload takes too long, the block finishes uploading yet no command is sent to disconnect and continue. Instead, the FTP connection sits idle (and eventually times out) while the Duplicati job sits there, stalled.

I’m not sure if this will apply to FTP or not, but you could try using the --no-connection-reuse parameter.

no-connection-reuse (default false)
Duplicati will attempt to perform multiple operations on a single connection, as this avoids repeated login attempts and thus speeds up the process. This option can be used to ensure that each operation is performed on a separate connection.

Hm, I’ll give that a try. Are you sure the default for that is false? All the behaviour I’ve seen from the FTP server side is Duplicati clients repeatedly disconnecting/reconnecting for every block being uploaded.

Pretty sure. If I look at the setting in the GUI it comes in NOT checked (false), and if I export the job to a command line I see --no-connection-reuse=false as well.

It can get confusing when negatively named parameters are set to false. If we negate both sides of the equals it could be read as “connection-reuse=true”, which matches what you’re seeing in the FTP logs.

Same result with --no-connection-reuse=true.
This is the command I’m using to test with (exported from the GUI job and pruned for security):

"C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe" backup "ftps://backup.server.com//Test?auth-username=abc&auth-password=xxx" "C:\Quickbooks\" "C:\Users\" --send-mail-url=smtps://mail.server:465 --send-mail-any-operation=true --send-mail-subject="Duplicati %OPERATIONNAME% (%PARSEDRESULT%) report for %backup-name%" --send-mail-to="" --send-mail-username="" --send-mail-password="" --send-mail-from="Duplicati Backup Report <>" --auto-cleanup=true --snapshot-policy=auto --backup-name="Test-Server" --dbpath="C:\WINDOWS\system32\config\systemprofile\AppData\Local\Duplicati\TETCDIQGSY.sqlite" --encryption-module=aes --compression-module=zip --dblock-size=50MB --passphrase="" --keep-versions=3 --exclude-files-attributes="hidden,system,temporary" --disable-module=console-password-input --no-connection-reuse=true --log-file=C:\Dell\log.txt --log-level=Profiling

That’s too bad, because I’m not sure what else to try beyond checking the FTP logs and the Duplicati logs I see you’re writing. And even then, the issue may not be logged…

I suppose a TCP packet monitoring tool might be useful, but I’m not sure.
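
Something like Wireshark (or its command-line sibling tshark) capturing the FTP control traffic on the client would at least show whether the disconnect command ever gets sent and which side goes quiet first. A minimal sketch, assuming explicit FTPS on the default control port 21 (implicit FTPS usually uses 990) and a Windows capture interface named "Ethernet":

tshark -i "Ethernet" -f "tcp port 21" -w duplicati-ftp.pcap

With SSL enabled the commands themselves are encrypted, but the packet timing and which end closes the connection can still be telling.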

Ah, well, it seems fine uploading in 10 MB blocks, so I’ll leave it at that. Thanks for taking the time to offer suggestions.

You’re welcome - thanks to you for using Duplicati! …And for the patience to wait for my replies, as well as for figuring out the block-size workaround yourself! :smiley:

It is good to see strong support for a product. I think it is a good product as it stands; it just needs some time spent on polish and robustness (and maybe more detailed logging, lol!).

There are other minor quirks I am sorting out right now, but I will search for or create a new topic if it comes to that.
cheers!
