Pause is not Pausing

Hi, I am in the middle of my first backup with Duplicati. I am running 2.0.4.5_beta_2018-11-28 on Windows 10. My backup target is a machine on my local network via SFTP.

The first backup is rather large (1 TB). The backup has been going several days, as expected.

The backup process is creating some slowdowns on the source machine, which again is fine.

However, I occasionally have tasks that need the full attention of the source machine, so I attempted to pause the backup. Unfortunately, this does not seem to actually pause the backup.

The taskbar icon changes to show the pause symbol. The local server web interface shows the Resume option on the left and the un-pause icon at the top. However, the progress bars keep marching along.

I have checked network traffic, and I can see the 50MB uploads continue moving across the network. I can see the new files appear on the target as well.

I thought that perhaps there was a queue of files waiting to upload, so I checked my temp directory, and I see only the 4 staging files that are set up by the asynchronous-upload-limit parameter.

I have examined the log, and new files are being added to the backup.

I thought perhaps the pause needed time to be recognized, so I left it paused for 24 hours to see what would happen, and the backup continued.

I have searched the forum and found the post "Pause does not really pause", but this seems to be a different problem.

Any idea why I can't pause my backup?

Thanks in advance for your help.


I'll take some time and try to reproduce the issue.

Well, after several days it finally paused, but only after it had uploaded practically the entire backup. There was something like 1% left to do, and when I un-paused it, it took just a couple of minutes to finish.

It is almost like the pause will only stop the backup between different stages of the process, which is unfortunate as the upload stage is the longest and most intensive.

Is this the intended behavior, or is this a bug?

I've only been in this community for a short time, so take my opinion with a grain of salt.

As far as I know, this is not intended behavior, but that's the current implementation.

While debugging a related problem, I found that once a dblock upload starts, there is nothing implemented to stop it until it finishes. That's problem #1. Furthermore, I can't seem to find any rendezvous points that check the control state to decide whether to pause/stop or proceed with the current operation (process or upload more files). That's probably why Pause is not working.
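
To illustrate what I mean by a rendezvous point, here is a rough sketch I made up (invented names, not Duplicati's actual code): the upload loop would need something along these lines for Pause/Stop to take effect between dblock uploads.

    using System.Collections.Generic;
    using System.Threading;

    // Hypothetical sketch only - these are NOT Duplicati's real types or APIs.
    enum RequestedState { Run, Pause, Stop }

    class PauseStopControl
    {
        private readonly ManualResetEventSlim _resume = new ManualResetEventSlim(true);
        private volatile RequestedState _state = RequestedState.Run;

        public void Pause()  { _state = RequestedState.Pause; _resume.Reset(); }
        public void Resume() { _state = RequestedState.Run;   _resume.Set();   }
        public void Stop()   { _state = RequestedState.Stop;  _resume.Set();   }

        // The "rendezvous": block while paused, then report whether to continue.
        public RequestedState Rendezvous()
        {
            _resume.Wait();
            return _state;
        }
    }

    class VolumeUploader
    {
        public void UploadAll(PauseStopControl control, IEnumerable<string> volumes)
        {
            foreach (var volume in volumes)
            {
                // Without a check like this between dblock uploads, a Pause
                // pressed mid-stage is ignored until the whole stage finishes.
                if (control.Rendezvous() == RequestedState.Stop)
                    break;

                UploadVolume(volume); // stand-in for the real backend upload
            }
        }

        private void UploadVolume(string volume) { /* upload one 50MB dblock volume */ }
    }

The real TaskControlRendevouz serves that role elsewhere; the point is simply that unless the backup/upload path calls into it regularly, pressing Pause has nothing to act on until the current stage ends.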

If you have time and are interested, here is another problem that may have the same root causes (no checking of control state):

Ctrl + F for " Apparently it hangs on AzureBlobWrapper , therefore I may have been wrong saying it happened on any backend, will test on a random one later to be sure." to find the comment where the relevant part of the discussion starts.

I can't explain the whole scheme, but the below seems to reflect the sort of design that you described:

Yes, I had that in mind when I wrote. This TaskControlRendevouz is largely used in the Restore operation; however, I could hardly find any checks (maybe one or two, I don't remember) of the current state in the backup operation.

So, the behavior I am experiencing is typical. Good, I guess.

But as stated, this is not the intended behavior.

Should I file a bug/feature request? Where do I do that?

That's up to you. It's best to at least get it on the record as a request, and you can open it in the forum in the Features category instead of the Support category, or I can change the category on THIS article if you prefer.

You could also jump into the "Pause does not really pause" feature request you found, with a 2.0.4.5 update. The source seems to have had several more TaskControlRendevouz calls in 2.0.3.3, but the rewriting for 2.0.4.5 was heavy enough that I can't easily tell where the chunks of code I saw before went, so it might be a false alarm. Someone who's motivated enough could probably set up the exact same test on two versions for better proof.

Duplicati's issues on GitHub are another place people put enhancement requests, but a different login is required.

I am having the same problem. First backup, taking a long time. I want to use the bandwidth/processing for other tasks during some hours of the day, but it won't let me, since pause does not work.

I am also experiencing the pause button not working. I am not sure why it should be a feature request though - it used to work quite well in previous versions. Was that a bug that got fixed?!

There was a major rewrite in 2.0.3.6 to add concurrency in hopes of increasing performance. That might have caused the changes I cited earlier, as lots of code got moved, and possibly some got left behind...

I think there are reports in the forum of issues from pause or stop, so maybe the change also avoided some bugs. You probably won't find active forum participants who were close enough to the plans to give you details.

From a forum point of view, it might not be a Feature, but it's not exactly Support either. It ought to be in Issues, which can track issues better. Perhaps it already is. Maybe someone can see if anyone has good ideas.

This issue has been reported on the project's GitHub:

As I wrote over there:

I've also noticed that changing the 'throttle' settings has no effect on an upload that's currently in progress.

Presumably this is a bug and not an intentional removal of a feature. Either way, it represents a significant regression in capability. For those with large datasets to back up but relatively limited bandwidth on the uplink (like my residential connection), this is a serious problem. I require the ability to pause a large upload (which may take a day or more), or at the very least change the throttle, in order to make bandwidth available for other users and services on the network, whilst minimising backup time as much as possible. Flexible, real-time management of the upload process state and bandwidth is an absolutely essential capability of a backup solution, at least for my use-case.

I'm left with the options of either limiting available bandwidth at the network level (proxy, switch, router, etc.) or downgrading to a previous version until this is fixed.

I hope this issue is addressed soon, as this is a potential show-stopper of a bug as far as my use-case goes, which would be a crying shame, as Duplicati is a fantastic project that's served me very well since I began using it early last year. Hugely grateful to the devs who've worked so hard to make this freely available.

There might be another workaround (which I have not tested; would you like to?) of using the Duplicati rclone storage type with the rclone --bwlimit option, which allows a timetable. For your case, slow would be the alternative to truly paused. The rclone backend seems not to be a streaming backend, so you may lose some progress info, and throttling would be set by rclone instead of Duplicati. Supported storage types appear to be quite extensive, but any configuration issues are probably best brought to the rclone forum.
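
For reference, rclone's --bwlimit accepts a timetable like the one below. The schedule, remote name, and paths are just placeholder examples, and I haven't tested passing this through Duplicati's rclone backend options, so treat it as a sketch:

    # Throttle hard during working hours, loosen in the evening, run unthrottled overnight.
    rclone copy /source remote:backups --bwlimit "08:00,512K 18:00,4M 23:00,off"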

There's also a toggle of rclone's limiter with SIGUSR2, and some sort of bandwidth remote control available.
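
Both of those are rclone's own mechanisms rather than Duplicati's, roughly like this (again untested here; the remote-control form needs rclone to have been started with --rc enabled):

    # Send SIGUSR2 to toggle rclone's bandwidth limiter on/off for a running transfer (Unix-like systems).
    kill -USR2 "$(pidof rclone)"

    # Or, with the remote control enabled (--rc), change the limit on the fly:
    rclone rc core/bwlimit rate=1M
    rclone rc core/bwlimit rate=off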

"How to Limit Data Usage and Internet Bandwidth in Windows" and similar specialty tools might also help.

While I can see pause getting fixed because it's a regression, better bandwidth controls seem like a feature request.

Thanks for the suggestions. I may look into the rclone backend when I've a bit more free time, but limiting bandwidth at the network layer will suffice for now in my case. My backup box is dedicated to that purpose, so although inconvenient, it's feasible to isolate and throttle at the switch, at least in the short term.

FYI, I'm having this problem too. It just won't pause (this is my initial backup). I had to kill the program.

Any update on this? I have the same problem in my Docker container. No pausing and no upload limiting.

https://github.com/duplicati/duplicati/releases shows no pause change since 2.0.3.11_canary_2018-09-05.
https://github.com/duplicati/duplicati/issues?q=is%3Aissue+is%3Aopen+pause+in%3Atitle shows a couple of open issues which you could look at.

job pausing/stopping doesn't work in 2.0.4.5_beta_2018-11-28 #3565 has updates since its mention above.
Fix pause and resume, and check for cancel while uploading #3712 seems itself paused. I'm not sure why.
"pause" doesn't stop the upload #1088 is an older issue.

2.0.5.104_canary_2020-03-25

Improved logic around throttle values, thanks @seantempleton

is the release note for

Fix bandwidth throttling inconsistency #4127

which fixes confusion

--throttle-download ignored, --throttle-upload throttles download too #4115

where settings were applied in the wrong way; however, the problem in 2.0.5.1 (what release is yours?) would not result in no upload throttling, but in download throttling when you wanted just upload throttling.

Basically, upload throttling seems to be working for most people. Further proof is that it has caused issues like timeouts (needs settings adjustment) and data corruption (bug is fixed). On the other hand, there's this:

Upload Throttle not working
which has some troubleshooting ideas and is currently waiting to hear results from the person reporting.
Perhaps you can follow up on some of the ideas; otherwise, I'm not sure if any progress can be made...

Now that "stop after current file" and "stop now" are supposedly working in the canary channel, what are the use cases for pausing a running backup?

Any solution to this? Perhaps the non-working "pause" button should be removed from the interface in favor of the existing "stop" functionality?

My inclination is to remove it, but I wanted to see first if there were some use cases that I wasn't aware of. Since it's not working, it's safer to remove it to prevent users from simply killing the process, which could leave the database in a bad state.