Backup process gets stuck on a random file

Version: Duplicati - 2.0.5.111_canary_2020-09-26 (some previous versions had the same malfunction).
A scheduled backup starts and then gets stuck in the middle of the process while backing up files. Sometimes it's the same file, sometimes another.
If I press Stop after the current file, the process gets stuck on Stopping: "Stopping after the current file: …"
If I stop the Windows service ("Duplicati service"), the Duplicati.Server.exe process keeps running, with the thread count oscillating: 19-20-19-20-19-20-19-20, or 31-32-31-32-31-32-31-32 in my last case.
Nothing can stop the process except Task Manager's Terminate process.
When I terminate the process, restart the Windows service, and start the backup job again, Duplicati gets stuck at the same file or close to it.
The job backs up to WebDAV, and the operation timeout is 2 minutes. Yesterday I spent more than 2 hours looking at the progress bar stuck on a small file.
If I disconnect from the internet and reconnect, the process does not time out or show any error.
I haven't had a successful backup for a week :frowning:
The last command in the remote log is:
put duplicati-b930f5532c34d461a9d4165fd01448c37.dblock.zip.aes
{"Size":104829453,"Hash":"xRJHsE3kVCDMZKKVrDVg+SQgFZTMrVdcShfa01TFLVM="}
The timestamp is 40 minutes ago.

In the general log I can see only the previous job, because of the Task Manager kill, I think.
One more try. Stuck on the same file.

Job: 18177 files (28.75 GB) to go at 7.77 KB/s
Nothing changes but the speed (it keeps going down):
Job: 18177 files (28.75 GB) to go at 2.90 KB/s

I'm trying to make a bug report via the job's commands:
Reporting:
Show log … Create bug report …
I click and nothing happens (the job is still in progress).

What a magic file that breaks the program!

Setting “snapshot-policy” = “required” allowed me to complete the backup procedure, finally!

Update: it didn't help much after all; the job just gets stuck on another file.
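
For reference, the same option can be set on the command line. A minimal sketch, assuming a default install path; the WebDAV URL, credentials, and source folder are placeholders, not my actual values:

```
# A sketch: run from an elevated PowerShell prompt (VSS snapshots need admin rights).
# The install path, WebDAV URL, credentials, and source folder are placeholders.
& "C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe" backup `
    "webdav://example.com/backup?auth-username=user&auth-password=pass" `
    "C:\Data" `
    --snapshot-policy=required
```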

Seemingly declining speed is sometimes due to problems getting files uploaded. How are those going?

Please see the job log's Complete log → BackendStatistics → RetryAttempts for a job that finished.

About → Show log → Live → Retry shows retries, or set log-file=<path> with log-file-log-level=Retry.
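
In the job's Advanced options that would look something like this (the log path is just an example):

```
--log-file=C:\DuplicatiLogs\backup.log
--log-file-log-level=Retry
```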

This is another sign that there could be something on the network side that’s causing backup flow to stop.
asynchronous-upload-limit will queue only so many finished volume files before it considers the upload queue too big to add more, so file processing pauses until an upload completes.
If you have lots of Temp space, I guess you could set this limit much larger to see if the stuck spot moves.
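
For example, as an advanced option (20 is just a guess at "much larger"; the default is 4, I believe):

```
--asynchronous-upload-limit=20
```

At the default 50 MB volume size that allows roughly 1 GB of queued volumes in Temp, so check the space is really there.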

Another sign is if you get a number of files (default size 50 MB) in your Temp folder waiting to be uploaded.
For some reason the dir /od dup-* command in Command Prompt shows ones in progress as empty.
You could try looking in %TEMP% anyway, or maybe Explorer can give you an accurate view of files there.
PowerShell shows current sizes, and you could run ls $env:TEMP\dup-* | sort -Property LastWriteTime
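
A slightly fuller sketch of that, adding current sizes so you can spot a volume that has stopped growing:

```
# List Duplicati's temporary volume files, oldest first, with current sizes
Get-ChildItem $env:TEMP\dup-* |
    Sort-Object LastWriteTime |
    Format-Table Name, Length, LastWriteTime
```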

Yet another sign that it's not one specific file: was the stuck file consistent across runs originally? It's not now.

Do you have any other destinations available to test, to see whether this looks like a source problem or a destination problem?