Pause is not Pausing

#1

Hi, I am in the middle of my first backup with Duplicati. I am running 2.0.4.5_beta_2018-11-28 on Windows 10. My backup target is a machine on my local network via SFTP.

The first backup is rather large (1 TB). The backup has been going several days, as expected.

The backup process is creating some source machine slowdowns, again which is fine.

However, I do have occasional tasks that need full attention of the source machine, so I attempted to pause the backup. However, this does not seem to actually pause the backup.

The taskbar icon changes to show the pause symbol. The local server web interface shows the Resume option on the left and the un-pause icon at the top. However, the progress bars keep marching along.

I have checked network traffic, and I can see the 50MB uploads continue moving across the network. I can see the new files appear on the target as well.

I thought that perhaps there was a queue of files waiting to upload, so I checked my temp directory, and I see only the 4 staging files that are set up by the asynchronous-upload-limit parameter.
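For anyone curious why only a handful of files ever sit in the temp directory: a bounded queue between the volume packer and the uploader would produce exactly this behavior. The sketch below is purely illustrative (not Duplicati's actual code, and the file names are made up); it just shows how a queue capped at asynchronous-upload-limit keeps at most 4 volumes staged at once, no matter how large the backup is.

```python
import queue
import threading

# Hypothetical sketch, not Duplicati's implementation: a producer packs
# volumes while a bounded queue caps how many finished volumes can wait
# in the temp directory at once.
ASYNC_UPLOAD_LIMIT = 4  # mirrors the queue depth observed above

staging = queue.Queue(maxsize=ASYNC_UPLOAD_LIMIT)

def pack_volumes(volume_names):
    """Producer: blocks whenever 4 volumes are already staged."""
    for name in volume_names:
        staging.put(name)   # blocks while the queue is full
    staging.put(None)       # sentinel: no more volumes coming

def upload_volumes(uploaded):
    """Consumer: drains staged volumes one at a time."""
    while True:
        name = staging.get()
        if name is None:
            break
        uploaded.append(name)

uploaded = []
worker = threading.Thread(target=upload_volumes, args=(uploaded,))
worker.start()
pack_volumes([f"duplicati-b{i}.dblock.zip" for i in range(10)])
worker.join()
print(len(uploaded))  # 10 volumes uploaded, never more than 4 staged
```

So an empty-ish temp directory doesn't mean the backup has stopped; it only means the staging queue is being drained as fast as it is filled.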

I have examined the log, and new files are being added to the backup.

I thought perhaps the pause needed time to be recognized, so I left it paused for 24 hours to see what would happen, and the backup continued.

I have searched the forum, and have found the post Pause does not really pause, but this seems to be a different problem.

Any idea why I can’t pause my backup?

Thanks in advance for your help.

#2

I’ll take some time and try to reproduce the issue.

#3

Well, after several days it finally paused - after it uploaded the entire backup. It had something like 1% left to do, and when I un-paused it, it took just a couple of minutes to finish.

It is almost like the pause will only stop the backup between different stages of the process, which is unfortunate as the upload stage is the longest and most intensive.

Is this the intended behavior, or is this a bug?

#4

I’ve been in this community for a short time, so take my opinion with a grain of salt.

As far as I know, this is not intended behavior, but that’s the current implementation.

While debugging a related problem, I found that once a dblock upload starts, there is nothing implemented to stop it until it finishes. That’s problem #1. Furthermore, I can’t seem to find any rendezvous points that check the control state to decide whether to pause/stop or proceed with the current operation (process or upload more files). That’s probably why Pause is not working.
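To make the "rendezvous point" idea concrete, here is a minimal sketch (hypothetical names, not Duplicati's actual API) of what such a checkpoint could look like. The key is that the uploader calls the rendezvous between every chunk, so a pause takes effect mid-file instead of only between stages of the operation:

```python
import threading

# Hypothetical sketch of a pause/stop rendezvous; the class and method
# names are illustrative, not Duplicati's real ones.
class TaskControl:
    def __init__(self):
        self._resume = threading.Event()
        self._resume.set()          # start in the running state
        self.stopped = False

    def pause(self):
        self._resume.clear()

    def resume(self):
        self._resume.set()

    def rendezvous(self):
        """Block while paused; return False if the task was stopped."""
        self._resume.wait()
        return not self.stopped

def upload_dblock(control, chunks, sent):
    # Calling rendezvous() per chunk is what lets a pause interrupt a
    # long-running upload instead of waiting for the file to finish.
    for chunk in chunks:
        if not control.rendezvous():
            return False            # stopped: abandon the upload
        sent.append(chunk)          # stand-in for sending over SFTP
    return True

control = TaskControl()
sent = []
ok = upload_dblock(control, [b"a" * 4, b"b" * 4], sent)
print(ok, len(sent))  # True 2
```

Without such a check inside the per-chunk loop, the earliest a pause can be honored is after the current dblock (or the whole upload stage) completes, which matches the behavior reported above.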

If you have time and are interested, here is another problem that may have the same root causes (no checking of control state):

Ctrl + F for " Apparently it hangs on AzureBlobWrapper , therefore I may have been wrong saying it happened on any backend, will test on a random one later to be sure." to find the comment in which the relevant part of the discussion is started.

#5

I can’t explain the whole scheme, but the below seems to reflect the sort of design that you described:

Pause does not really pause
#6

Yes, I had that in mind when I wrote. This TaskControlRendevouz is used extensively in the Restore operation; however, I couldn’t find more than one or two checks (if that, I don’t remember) of the current state in the backup operation.

#7

So, the behavior I am experiencing is typical - good to know, I guess.

But as stated, this is not the intended behavior.

Should I file a bug/feature request? Where do I do that?

#8

That’s up to you. It’s best to at least get it on the record as a request, and you can open it in the forum in the Features category instead of the Support category, or I can change the category on THIS article if you prefer.

You could also jump into the “Pause does not really pause” feature request you found, with a 2.0.4.5 update. Source seems to have had several more TaskControlRendevouz calls in 2.0.3.3, but the rewriting for 2.0.4.5 was heavy enough that I can’t easily tell where chunks of code I saw before went, so it might be a false alarm. Someone who’s motivated enough could probably set up the exact same test on two versions for better proof.

Duplicati issues in GitHub is another place people put enhancement requests, but a different login is required.

#9

I am having the same problem. First backup, taking a long time. I want to use the bandwidth/processing for other tasks during some hours of the day, but it won’t let me, since pause does not work.

#10

I am also experiencing the pause button not working. I am not sure why it should be a feature request though - it used to work quite well in previous versions. Was that a bug that got fixed?!

#11

There was a major rewrite in 2.0.3.6 to add concurrency in hopes of increasing performance. That might have caused the changes I cited earlier, as lots of code got moved, and possibly some got left behind…

I think there are reports in the forum of issues from pause or stop, so maybe it also avoided some bugs. You probably won’t find active forum participants who were close enough to the plans to give you details.

From a forum point of view, it might not be a Feature, but it’s not exactly Support either. It ought to be in GitHub Issues, which can track such problems better. Perhaps it already is. Maybe someone can see if anyone has good ideas.

#12

This issue has been reported on the project’s Github:

As I wrote over there:

I’ve also noticed that changing the ‘throttle’ settings has no effect on an upload that’s currently in progress.

Presumably this is a bug and not an intentional removal of a feature. Either way, it represents a significant regression in capability. For those with large datasets to back up but relatively limited bandwidth on the uplink (like my residential connection), this is a serious problem. I require the ability to pause a large upload (which may take a day or more), or at the very least change the throttle, in order to make bandwidth available for other users and services on the network, whilst minimising backup time as much as possible. Flexible, real-time management of the upload process state and bandwidth is an absolutely essential capability of a backup solution, at least for my use-case.
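The throttle symptom fits the same pattern as the pause problem: if the limit is only read when an upload begins, a mid-transfer change can't take effect. A minimal sketch (hypothetical names, not Duplicati's code) of a throttle that re-reads its rate before every chunk, so a change applies to an in-progress upload:

```python
import time

# Hypothetical sketch: the rate is consulted per chunk, so changing
# rate_bps at any time affects the transfer already in progress.
class LiveThrottle:
    def __init__(self, rate_bps):
        self.rate_bps = rate_bps    # may be changed from another thread

    def pace(self, nbytes):
        """Sleep long enough to keep nbytes under the current rate."""
        if self.rate_bps:           # 0 or None means unlimited
            time.sleep(nbytes / self.rate_bps)

def upload(data, throttle, chunk_size=4):
    sent = 0
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        throttle.pace(len(chunk))   # limit is re-read on every chunk
        sent += len(chunk)          # stand-in for the actual send
    return sent

throttle = LiveThrottle(rate_bps=1_000_000)
print(upload(b"x" * 16, throttle))  # 16
throttle.rate_bps = 0               # lifting the limit mid-session
```

If the throttle were instead captured once at the start of each upload, a settings change would only be noticed when the next file begins, which matches the behavior described here.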

I’m left with the options of either limiting available bandwidth at the network level (proxy, switch, router etc.), or downgrading to a previous version unless and until this is fixed.

I hope this issue is addressed soon, as this is a potential show-stopper of a bug for my use-case, which would be a crying shame, as Duplicati is a fantastic project that’s served me very well since I began using it early last year. Hugely grateful to the devs who’ve worked so hard to make this freely available.

#13

There might be another workaround (which I have not tested – would you like to?) of using the Duplicati rclone storage type with the rclone --bwlimit option, which allows a timetable. For your case, slow would be the alternative to truly paused. The rclone backend seems not to be a streaming backend, so you may lose some progress info, and throttling would be set by rclone instead of Duplicati. Supported storage types appear to be quite extensive, but any configuration issues are probably best brought to the rclone forum.

There’s also a toggle of the limiter with SIGUSR2, and some sort of bandwidth remote control available.

How to Limit Data Usage and Internet Bandwidth in Windows and similar specialty tools might also help.

While I can see pause getting fixed because it’s a regression, better bandwidth controls seem a feature.

#14

Thanks for the suggestions. I may look into the rclone backend when I’ve a bit more free time, but limiting bandwidth at the network layer will suffice for now in my case. My backup box is dedicated to that purpose, so although inconvenient, it’s feasible to isolate and throttle at the switch, at least in the short term.

#15

FYI I’m having this problem too. It just won’t pause (this is my first initial backup). I had to kill the program.