Alternatively, consider QoS. I tag all backup traffic with a certain QoS value, and my router puts that traffic in a lower-priority bucket. The backup can consume all my upload bandwidth when it's available, but as soon as I do something else, that other traffic takes priority over the backup traffic.
This was my suggestion in another thread. How did you identify “backup traffic” so it could be tagged? I’m familiar with how to tag things on my router, but I’m a bit at a loss on how to identify backup traffic. I’m using Backblaze B2, so it’s outgoing https traffic coming from 3 or so devices on my network, but two of them could be on one of a couple of different VLANs.
Is there a way to classify just the backup traffic that doesn’t involve a specific set of source and destination IPs?
I found a possibility if you have only one instance of Duplicati: NetLimiter 4 for Windows.
Unfortunately it’s not freeware, but it fits my needs! The Pro version costs about $30; maybe the Lite version (about $20) can also do the limiting.
You have many possibilities. You can limit the internet speed of the Duplicati process (Duplicati.GUI.TrayIcon.exe). You can set these limits manually or with rules (e.g. enable the limit at 23:00 and disable it at 06:00), or limit by target (e.g. 220.127.116.11 for the Deutsche Telekom cloud), or set priorities (I have not tried this yet), or set limits after reaching upload/download quotas (also not tried).
Here is an example of a limit set just for Duplicati internet traffic:
And in the process view you can see the upload limit of 80 KB/sec:
And here a limit is set for a specific cloud server, no matter whether the traffic comes from Duplicati or a different process:
If you have more than one PC with Duplicati, a possible solution would be to redirect the traffic of all Duplicati PCs through a proxy that can throttle different users, e.g. CCProxy (costs about $4/year).
Thanks for that suggestion, it looks really useful!
Not that I’m offering anything like this, but would an advanced parameter like this do what you want? For example:
--throttle-schedule=80KB@09:00-17:00 would limit bandwidth to 80KB from 9AM to 5PM, otherwise run at the current global (or manually throttled) speed
--throttle-schedule=80KB@09:00-17:00&128KB@17:00-00:00 would limit bandwidth to 80KB from 9AM to 5PM, 128KB from 5PM to midnight, then whatever the global (or manually throttled) setting is from midnight to 9AM
Overlapping times would likely use the FIRST time and speed specified. Note that times are likely approximate, such that a speed change won’t start until the next upload begins. In other words, if a file started at 8:59 AM takes 15 minutes to upload, the new speed may not kick in until 9:14 AM.
Note that in this concept there is no day-of-week adjustment possible.
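Since this is only a concept, here’s a minimal sketch of how such a schedule string might be parsed and looked up. The option name, syntax, and “first match wins” rule are all assumptions taken from the description above, not anything Duplicati actually implements:

```python
from datetime import time

def parse_throttle_schedule(spec):
    """Parse e.g. '80KB@09:00-17:00&128KB@17:00-00:00' into a list of
    (speed, start, end) tuples. Hypothetical syntax, not a real option."""
    rules = []
    for part in spec.split("&"):
        speed, window = part.split("@")
        start_s, end_s = window.split("-")
        start = time(*map(int, start_s.split(":")))
        end = time(*map(int, end_s.split(":")))
        rules.append((speed, start, end))
    return rules

def active_limit(rules, now):
    """Return the speed of the FIRST matching window (overlaps resolve to
    the first rule), or None to fall back to the global/manual throttle.
    A window ending at 00:00 is treated as running until midnight."""
    for speed, start, end in rules:
        if end == time(0, 0):
            if now >= start:
                return speed
        elif start <= now < end:
            return speed
    return None
```

With the two-window example above, a lookup at 10:30 returns “80KB”, at 18:00 returns “128KB”, and at 03:00 returns None (global setting applies), matching the behavior described.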
Edit: It looks like there are some GitHub requests for this type of feature as well. Maybe a bounty could be put together…
My initial question was much simpler! I just asked whether the GUI could remember the last value. Your suggestion would be a possibility, but I think it’s much more complex than “a little GUI improvement” (remembering it in a cookie or something like that).
I was hoping there would be something on the Windows command line that I could script to pause the entire server and set the throttle limits, not just on the individual backup jobs. Something like this:
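For what it’s worth, the Duplicati web UI talks to the server over a local REST API, so a script could in principle do the same. The sketch below only builds the request URL; the port and the `/api/v1/serverstate/...` path are assumptions (check your browser’s network tab for the real calls, or use a community tool like duplicati-client instead):

```python
import urllib.request

# Default Duplicati web UI address -- an assumption; adjust for your setup.
BASE = "http://localhost:8200"

def serverstate_url(action: str) -> str:
    """Build the URL for a server-wide state change ('pause' or 'resume').
    The endpoint path is an assumption, not documented API."""
    if action not in ("pause", "resume"):
        raise ValueError(f"unsupported action: {action}")
    return f"{BASE}/api/v1/serverstate/{action}"

def post(url: str) -> int:
    """POST to the server; returns the HTTP status code."""
    req = urllib.request.Request(url, data=b"", method="POST")
    with urllib.request.urlopen(req) as resp:
        return resp.status

if __name__ == "__main__":
    post(serverstate_url("pause"))  # pause the whole server, not one job
```

Changing the throttle limit would presumably go through the server settings endpoint in the same way, but I haven’t verified that path, so it’s omitted here.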
Now that I understand what you want to do, I can definitely say…I have no idea.
I know other users have used QoS (Quality of Service) settings on their routers / gateways (Throttle Bandwidth After Backup Has Started?) or third party apps (see NetLimiter above) that throttle specific traffic or network connections, but I’m betting that @Pectojin will have a thought or two on possibly doing what you’re describing.