Pause is not Pausing

Hi, I am in the middle of my first backup with Duplicati. I am running on Windows 10. My backup target is a machine on my local network via SFTP.

The first backup is rather large (1 TB). The backup has been going several days, as expected.

The backup process is creating some source machine slowdowns, which, again, is fine.

However, I do have occasional tasks that need the full attention of the source machine, so I attempted to pause the backup. Pausing does not seem to actually stop the backup, though.

The taskbar icon changes to show the pause symbol. The local server web interface shows the Resume option on the left and the un-pause icon at the top. However, the progress bars keep marching along.

I have checked the network traffic, and I can see the 50 MB uploads still moving across the network. I can see the new files appear on the target as well.

I thought that perhaps there was a queue of files waiting to upload, so I checked my temp directory, and I see only the 4 staging files that are set up by the asynchronous-upload-limit parameter.
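For readers unfamiliar with that option: asynchronous-upload-limit caps how many volume files Duplicati will stage ahead of the uploader. A minimal Python sketch of that kind of bounded staging (class and method names are mine for illustration, not Duplicati's):

```python
import threading

class StagingQueue:
    """Sketch of a bounded staging area: at most `limit` volume files
    may sit in the temp directory awaiting upload, similar in spirit
    to Duplicati's asynchronous-upload-limit option."""

    def __init__(self, limit=4):
        self._slots = threading.Semaphore(limit)

    def stage(self, volume):
        # Blocks the producer once `limit` files are already waiting.
        self._slots.acquire()
        return volume  # in reality: write the dblock file to the temp dir

    def uploaded(self, volume):
        # An upload finished; free a staging slot so packing can continue.
        self._slots.release()
```

This is why only 4 files ever appear in the temp directory: the packing side blocks as soon as the staging slots are full.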

I have examined the log, and new files are being added to the backup.

I thought perhaps the pause needed time to be recognized, so I left it paused for 24 hours to see what would happen, and the backup continued.

I have searched the forum and found the post "Pause does not really pause", but this seems to be a different problem.

Any idea why I can’t pause my backup?

Thanks in advance for your help.


I’ll take some time and try to reproduce the issue.

Well, after several days it finally paused, after it had uploaded almost the entire backup. It left something like 1% to do, and when I un-paused it, it took just a couple of minutes to finish.

It is almost as if the pause will only stop the backup between different stages of the process, which is unfortunate, as the upload stage is the longest and most intensive.

Is this the intended behavior, or is this a bug?

I’ve been in this community for a short time, so take my opinion with a grain of salt.

As far as I know, this is not intended behavior, but that’s the current implementation.

While debugging a related problem, I found that once a dblock upload starts, there is nothing implemented to stop it until it finishes. That’s problem #1. Furthermore, I can’t seem to find any rendezvous points that check the control state to decide whether the current operation should pause, stop, or proceed (process or upload more files). That’s probably why Pause is not working.
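To make that idea concrete, here is a hypothetical sketch (not Duplicati's actual code; all names are illustrative) of what such a rendezvous point could look like inside a chunked upload loop:

```python
import threading

class TaskControl:
    """Illustrative pause/stop control, loosely modeled on the
    rendezvous idea described above (not Duplicati's real API)."""

    def __init__(self):
        self._resume = threading.Event()
        self._resume.set()            # start in the running state
        self.stopped = False

    def pause(self):
        self._resume.clear()

    def resume(self):
        self._resume.set()

    def stop(self):
        self.stopped = True
        self._resume.set()            # wake any waiter so it can exit

    def rendezvous(self):
        """Block here while paused; return False if we should stop."""
        self._resume.wait()
        return not self.stopped

def upload_dblock(data, control, chunk_size=4):
    """Upload in chunks, checking the control state between chunks.
    Without the rendezvous call, a started upload can never pause."""
    sent = 0
    while sent < len(data):
        if not control.rendezvous():
            return sent               # stopped mid-upload
        chunk = data[sent:sent + chunk_size]
        # in reality: send `chunk` to the backend here
        sent += len(chunk)
    return sent
```

The point is simply that the check has to sit inside the transfer loop; a check done only between stages gives exactly the behavior reported above, where the pause takes effect only after the upload stage finishes.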

If you have time and are interested, here is another problem that may have the same root causes (no checking of control state):

Ctrl + F for " Apparently it hangs on AzureBlobWrapper , therefore I may have been wrong saying it happened on any backend, will test on a random one later to be sure." to find the comment where the relevant part of the discussion starts.

I can’t explain the whole scheme, but the below seems to reflect the sort of design you described:

Yes, I had that in mind when I wrote. That TaskControlRendevouz is largely used in the Restore operation; however, I couldn’t find a single check (maybe one or two, I don’t remember) of the current state in the backup operation.

So the behavior I am experiencing is typical. Good, I guess.

But as stated, this is not the intended behavior.

Should I file a bug/feature request? Where do I do that?

That’s up to you. It’s best to at least get it on the record as a request. You can open it in the forum in the Features category instead of the Support category, or I can change the category of THIS topic if you prefer.

You could also jump into the “Pause does not really pause” feature request you found, with an update. The source seems to have several more TaskControlRendevouz calls in it now, but the rewriting was heavy enough that I can’t easily tell where the chunks of code I saw before went, so it might be a false alarm. Someone motivated enough could probably set up the exact same test on two versions for better proof.

Duplicati issues on GitHub is another place people put enhancement requests, but a different login is required.

I am having the same problem. First backup, taking a long time. I want to use the bandwidth/processing for other tasks during some hours of the day, but I can’t, since pause does not work.

I am also experiencing the pause button not working. I am not sure why it should be a feature request, though; it used to work quite well in previous versions. Was that a bug that got fixed?!

There was a major rewrite to add concurrency in hopes of increasing performance; that might have caused the changes I cited earlier, as lots of code got moved, and possibly some got left behind…

I think there are reports in the forum of issues arising from pause or stop, so maybe the change also sidestepped some bugs. You probably won’t find active forum participants who were close enough to the plans to give you details.

From a forum point of view, it might not be a Feature, but it’s not exactly Support either. It ought to be in Issues, which can track issues better. Perhaps it already is. Maybe someone can look and see if anyone has good ideas.

This issue has been reported on the project’s GitHub:

As I wrote over there:

I’ve also noticed that changing the ‘throttle’ settings has no effect on an upload that’s currently in progress.

Presumably this is a bug and not an intentional removal of a feature. Either way, it represents a significant regression in capability. For those with large datasets to backup but relatively limited bandwidth on the uplink (like my residential connection), this is a serious problem. I require the ability to pause a large upload (which may take a day or more), or at the very least change the throttle, in order to make bandwidth available for other users and services on the network, whilst minimising backup time as much as possible. Flexible, real-time management of the upload process state and bandwidth is an absolutely essential capability of a backup solution, at least for my use-case.
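For what it’s worth, the throttle half of this is conceptually simple: the upload loop has to consult the current limit on every chunk rather than capture it once when the upload starts. A hypothetical sketch (all names are mine, not Duplicati’s):

```python
import time

class Throttle:
    """A mutable rate limit that a UI could change at any time."""

    def __init__(self, bytes_per_sec):
        self.bytes_per_sec = bytes_per_sec

    def delay_for(self, nbytes):
        if self.bytes_per_sec <= 0:   # treat 0 (or less) as unlimited
            return 0.0
        return nbytes / self.bytes_per_sec

def throttled_upload(chunks, throttle, send, sleep=time.sleep):
    """Send chunks, re-reading the throttle each iteration so a
    settings change takes effect on an in-progress upload."""
    for chunk in chunks:
        send(chunk)
        sleep(throttle.delay_for(len(chunk)))
```

If instead the loop snapshots `bytes_per_sec` before the first chunk, you get exactly the behavior described above: changing the setting has no effect until the current upload completes.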

I’m left with the options of either limiting available bandwidth at the network level (proxy, switch, router etc), or downgrading to a previous version if and until this is fixed.

I hope this issue is addressed soon, as it is a potential show-stopper of a bug for my use-case, which would be a crying shame, as Duplicati is a fantastic project that’s served me very well since I began using it early last year. I’m hugely grateful to the devs who’ve worked so hard to make this freely available.

There might be another workaround (which I have not tested; would you like to?) of using the Duplicati rclone storage type with the rclone --bwlimit option, which allows a timetable. For your case, slow would be the alternative to truly paused. The rclone backend seems not to be a streaming backend, so you may lose some progress info, and throttling would be set by rclone instead of Duplicati. Supported storage types appear to be quite extensive, but any configuration issues are probably best brought to the rclone forum.
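As an illustration of the timetable syntax (the remote name, path, times, and rates here are made up; check the rclone docs for your version):

```shell
# Throttle rclone to 512 KiB/s during the day and run unthrottled
# overnight. Times and rates are examples only.
rclone sync /source remote:backup --bwlimit "08:00,512K 23:00,off"
```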

There’s also a toggle of the limiter with SIGUSR2, and some sort of bandwidth remote control available.
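On Linux/macOS that toggle is a one-liner against a running rclone process (it has no effect on Windows, which lacks POSIX signals):

```shell
# SIGUSR2 toggles rclone's bandwidth limiter on and off
# without interrupting the transfer in progress.
kill -USR2 "$(pgrep rclone)"
```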

How to Limit Data Usage and Internet Bandwidth in Windows and similar specialty tools might also help.

While I can see pause getting fixed because it’s a regression, better bandwidth controls seem a feature.

Thanks for the suggestions. I may look into the rclone backend when I’ve a bit more free time, but limiting bandwidth at the network layer will suffice for now in my case. My backup box is dedicated to that purpose, so although inconvenient, it’s feasible to isolate and throttle at the switch, at least in the short term.

FYI I’m having this problem too. It just won’t pause (this is my first initial backup). I had to kill the program.

Any update on this? I have the same problem in my Docker container. No pausing and no upload limiting. shows no pause change since
shows a couple of open issues which you could look at.

job pausing/stopping doesn’t work in #3565 has updates since its mention above.
Fix pause and resume, and check for cancel while uploading #3712 seems itself paused. I’m not sure why.
“pause” doesn’t stop the upload #1088 is an older issue.

Improved logic around throttle values, thanks @seantempleton

is the release note for

Fix bandwidth throttling inconsistency #4127

which fixes confusion

–throttle-download ignored, --throttle-upload throttles download too #4115

where settings were applied the wrong way around; however, the problem in (what release is yours?) would not result in no upload throttling, but in download throttling when you wanted just upload throttling.

Basically, upload throttling seems to be working for most people. Further proof is that it has caused issues like timeouts (needs settings adjustment) and data corruption (a bug that has since been fixed). On the other hand, there’s this:

Upload Throttle not working
which has some troubleshooting ideas and is currently waiting on results from the person reporting.
Perhaps you can follow up on some of the ideas; otherwise I’m not sure any progress can be made…

Now that “stop after current file” and “stop now” are supposedly working in the canary channel, what are the use cases for pausing a running backup?

Any solution to this? Perhaps the non-working “pause” button should be removed from the interface in favor of the existing “stop” functionality?

My inclination is to remove it, but I wanted to see first whether there were use cases I wasn’t aware of. Since it’s not working, it’s safer to remove it and thus prevent users from simply killing the process, which could leave the database in a bad state.