Upload Throttle not working

I’m on 2.0.5.1_beta_2020-01-18, Windows 10, backing up to an S3-compatible bucket, and I’m still experiencing big issues.

Restarting the machine and setting both upload and download caps during the warm-up pause period does nothing: the upload cap is set to 50 KB/s (<0.5 Mbps), but the network monitor still shows Duplicati averaging 1.5 Mbps over the hour I’ve been testing it.

Then, when I try to stop it to get my limited bandwidth back, “Stop Now” doesn’t work; it displays “Stopping after the current file” instead (and no, I didn’t hit the wrong button), even after multiple attempts to use “Stop Now”. After attempting this, things get worse: the progress bar on the current backup job stops progressing entirely, even though the overall progress bar shows a rate of a little over 100 KB/s and continues to fluctuate. Sometimes the main progress rate disappears, but after an F5 and a quick refresh it reappears. Nothing stops it; the only way I’ve been able to stop it is to literally kill the process in Task Manager.

I’ve seen mentions above that multiple upload threads may each be limited individually without respecting the overall limit as a group, but I can’t find any setting that would limit the thread count to 1.

I’d be happy to provide any other info you need to diagnose this, but I’d also appreciate any suggestions for a temporary workaround. One of my backup tasks has been failing since the beginning of April and is quite out of date now.

Thanks,
Kevin.


EDIT: I see a number of Canary versions with no mention of throttling in the release notes, but even so I’ll give duplicati-2.0.5.107_canary_2020-05-26-x64 a try and see how it goes.


EDIT 2: No joy; the same issues persist, including having to kill the process to stop whatever is going on, since “Stop Now” still doesn’t work. These hard stops keep corrupting my databases, forcing me to continually rebuild them.

Is this exact sequence required? Is this the GUI control at the top of the page? Duplicati restarts don’t reset that.

What happens if you just leave it set and then back up? For me, setting upload throttling there works fine for OneDrive and Google Drive. I don’t have S3 to try, but I can’t think of any reason it would work differently.

Here are my results at 10 KByte/second. One difference is that Google Drive has --asynchronous-concurrent-upload-limit set to 1 instead of the default 4. I’m not familiar with the code details, but I suspect that when the parallel upload code was added, each upload was throttled to the stated speed individually, so several running at once could exceed the specified limit.

Possibly the math isn’t right yet; however, you can certainly throttle even lower to see if you can get any…

OneDrive sample dblock upload:
2020-06-03 07:16:03 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-b308c3beeed464101a7b6eaba42aa8340.dblock.zip.aes (27.08 MB)
2020-06-03 08:03:27 -04 - [Profiling-Duplicati.Library.Main.Operation.Backup.BackendUploader-UploadSpeed]: Uploaded 27.08 MB in 00:47:23.4198389, 9.75 KB/s
2020-06-03 08:03:27 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-b308c3beeed464101a7b6eaba42aa8340.dblock.zip.aes (27.08 MB)

Google Drive sample dblock upload:
2020-06-03 09:33:40 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Started: duplicati-bb55d347b41d1491482c43ac7c5c8c958.dblock.zip.aes (36.82 MB)
2020-06-03 10:35:08 -04 - [Profiling-Duplicati.Library.Main.Operation.Backup.BackendUploader-UploadSpeed]: Uploaded 36.82 MB in 01:01:27.9430744, 10.22 KB/s
2020-06-03 10:35:10 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Put - Completed: duplicati-bb55d347b41d1491482c43ac7c5c8c958.dblock.zip.aes (36.82 MB)
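
If you want to try that same combination outside the GUI, something like the line below should exercise it. This is only a sketch: the destination URL and source path are placeholders, and I’m assuming the stock Duplicati.CommandLine.exe with the two options discussed above; adjust for your S3 target.

Duplicati.CommandLine.exe backup "<your-destination-URL>" "C:\Data" --throttle-upload=10KB --asynchronous-concurrent-upload-limit=1

With the upload limit at 1 there is only a single stream for the throttle to govern, so any overshoot you still see can’t come from parallel uploads each taking their own 10 KB/s allowance.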

About → Show log → Live → Retry would give you a live log of major events such as file uploads, and that should be enough to see whether uploads are moving too fast. The profiling log can do the math for you, but it’s probably not worth it for a first attempt; profiling logs are huge. A smaller alternative would be a filtered log file.
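
If you’d rather capture that to a file than watch the live log, a sketch of the options I have in mind (the path is a placeholder, and I’m assuming the standard log-file options here) would be:

--log-file="C:\duplicati-throttle-test.log" --log-file-log-level=Profiling

Information level is enough for the BackendEvent lines; Profiling adds the UploadSpeed/DownloadSpeed math like my samples above, at the cost of a much bigger file. I believe --log-file-log-filter can narrow it further if the size gets out of hand.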

How is network usage monitored? I watched in Task Manager. Not much else is typically uploading, and Duplicati certainly wasn’t blasting. I even watched packets on Wireshark, and saw the data dribbling out destined for the only Google (or any remote) destination Duplicati had a connection ESTABLISHED with.

There have definitely been some bugs where the throttling direction was confused, but 2.0.5.107 should be fine.
--throttle-download and --throttle-upload on the job are alternate ways of throttling. They can also be set in Settings in Duplicati as a global option. I don’t recall which one wins if the three spots I mentioned don’t agree.
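
For reference, the option form is the same whether it goes in the job’s advanced options, the global Settings, or on the command line (the values here are just illustrative):

--throttle-upload=50KB
--throttle-download=100KB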

I think this is just a messaging bug where a message is reused for a different situation. “Stop now” is not instant, but it is closer to instant than “Stop after current file”, where “file” means the current source file (as seen in the GUI). There is a long pipeline between seeing a file and actually getting everything processed and uploaded.

Stay as close to defaults as possible for now. Maybe you mean --asynchronous-concurrent-upload-limit as shown above, but as also demonstrated, throttling did well with either 1 or 4 upload threads, at least in my testing.

You can certainly test with a small, newly added backup to a local folder. Throttling should work there as well.
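
A minimal local test might look like the following; the paths are placeholders, and --no-encryption just keeps the test simple. Timing the run against the source size tells you whether the throttle is biting:

Duplicati.CommandLine.exe backup "file://C:\TestTarget" "C:\TestSource" --throttle-upload=10KB --no-encryption=true

At 10 KB/s even a small source set should visibly crawl; if it finishes almost instantly, the throttle isn’t being applied on the file backend.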

Upload results posted above. Download results at 100 KByte/sec:

OneDrive
2020-06-03 18:21:09 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-b8572d4ae76034e8b8abfe4417ec35fa9.dblock.zip.aes (36.22 MB)
2020-06-03 18:27:26 -04 - [Profiling-Duplicati.Library.Main.BackendManager-DownloadSpeed]: Downloaded 36.22 MB in 00:06:17.0884884, 98.35 KB/s
2020-06-03 18:27:26 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-b8572d4ae76034e8b8abfe4417ec35fa9.dblock.zip.aes (36.22 MB)

Google Drive
2020-06-03 14:42:51 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Started: duplicati-bcb1775dde5a14746851f048ebbf987b9.dblock.zip.aes (22.77 MB)
2020-06-03 14:46:51 -04 - [Profiling-Duplicati.Library.Main.BackendManager-DownloadSpeed]: Downloaded 22.77 MB in 00:04:00.5378011, 96.92 KB/s
2020-06-03 14:46:51 -04 - [Information-Duplicati.Library.Main.BasicResults-BackendEvent]: Backend event: Get - Completed: duplicati-bcb1775dde5a14746851f048ebbf987b9.dblock.zip.aes (22.77 MB)

This was on 2.0.5.106, I believe, and confirms that the 2.0.5.104 throttles control their correct direction.
There’s still a mystery about the new report, so I hope you can troubleshoot and figure out the problem.

sounded similar, so I linked here. I wrote up some other evidence that upload throttling often seems to work.
There are no open issues with throttle in their title that sound like this. There’s one about throttle working when the backup goes to a local folder, but you can use that to your advantage and create Steps to reproduce.

Hey @ts678, sorry for not getting back sooner; I hope to dig into this tonight and report back with more information. I’m on a limited (1 Mbit up) connection here and wanted to give it a chance to get through a full backup at least once, since it hasn’t completed successfully in the last two weeks. Alas, after two days of not being able to do anything else on the internet because the throttling (set at 25 KB/s, then dropped to 5 KB/s) was not being adhered to, I finally had to kill the process again.

I’ll try your suggestions one at a time to see what has an effect, and I’ll try to collect logs this time to provide some insight. I appreciate the help and will do my best to help sort this out!

Cheers,
Kevin.

Yes, in order to eliminate any lingering setting changes or anything else, I was attempting to do it after a reboot. I’ve since found I don’t have to, as the throttle limits stay in place after a reboot. I’ve set it to 10 KB/s up, so I can see Duplicati working, and that’s not so little that it could be mistaken for other traffic.

I was eager to try this, so I’ve set it now, and it looks promising: on my first attempt the 10 KB/s limit is actually being enforced. If I remove that setting, usage spikes again as if no limit is set at all. It feels like a sanity check around throttle/threads isn’t working as expected; I think that’s what you alluded to here as well.

Same. I watch the overall “Performance” graph for that network device, as well as the Duplicati entry under “Processes”, to see what it’s using specifically. As noted above, when I set the upload thread limit to 1, it averages out at my expected 10 KB/s (~0.1 Mbps), but without the thread limit it was averaging over 1.0 Mbps, as if no limit was set at all.

Fair enough. I think this would be helped by a bit more information than the short stopping message in the status bar at the top. I assume it’s more akin to “Stop after current chunk” than “Stop instantly”, so it would be ideal to see some kind of chunk progress indicator, to set the expectation that it will indeed end. If I watch in Task Manager, I see 0% CPU, etc. for the process after trying to stop, so it feels frozen. More feedback would allay this.

I’m rebuilding all of my DBs to give them a fresh shot now that I’ve limited upload threads to 1, and I’ll report back how it goes once I’ve seen some progress.

Thanks again!