Stop after Upload

I have been using Duplicati successfully for some time for smaller source backups.
I started backing up a 1,4TB folder, which is still in progress. For testing purposes I run the backup to a local disk, because the process takes quite long and the other backups are on hold.
I wanted to stop my 1,4TB backup process to allow the others to run through; after that I wanted to work on the 1,4TB backup again.

I never really had the need to stop a backup until now. I tested the “Stop Now” option on another machine quite successfully, sometimes ending up with a broken local database, but that is repairable most of the time.

I didn’t want that to happen with the 1,4TB backup, so I used the option that made more sense: “Stop after Upload”. An hour later the backup is still in progress and hasn’t stopped, even though the status says it is stopping after upload.


Version: Duplicati -
Upload Volume Size: 250MB
Blocksize: 500KB


I had the same problem, and the only workaround I found was restarting the PC in order to cancel the job.

But what if that breaks my initial backup?
It took me days to get the first 600GB placed in the encrypted zip files.
Starting over with this would be a nightmare.

Good point. If I remember correctly, on my side after starting the backup again it resumed the job - but I wouldn’t put my hand in the fire for that :wink:

Hi @herbert, welcome to the forum!

It’s possible Duplicati is still trying to gather together and process enough blocks to fill up a 250MB volume.

If you look at your destination, do you see any duplicati-*.dblock.* files created AFTER the time you clicked “Stop after Upload”?
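To put a rough number on that, here is a back-of-the-envelope sketch using the 250MB volume size and 500KB blocksize posted above. This is just arithmetic for illustration, not anything from Duplicati’s actual code:

```python
# Rough arithmetic: how many 500KB blocks must accumulate before a
# 250MB dblock volume is full and ready to upload (values taken from
# the settings posted above; illustrative only, not Duplicati internals).
volume_size = 250 * 1024 * 1024   # 250MB upload volume size, in bytes
block_size = 500 * 1024           # 500KB blocksize, in bytes

blocks_per_volume = volume_size // block_size
print(blocks_per_volume)  # 512 blocks before one volume can be flushed
```

So several hundred blocks have to be gathered before each volume even becomes eligible for upload, which could explain a noticeable delay between clicking stop and the upload actually finishing.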

It is backing up RAW picture files of 20MB each, which do not get compressed.
After doing a “Stop after Upload” and waiting 5 hours, I have gathered another 10GB of additional data in the backup directory :wink:
Therefore I assume it should have been able to stop.

I also tried to shut down the QNAP app using the AppCenter, which did work. After restarting it, it immediately picked up where it left off - doing the backup of the 1,4TB. This was not really what I intended, but it is nice to know that a “shutdown” of the system does not affect the backup itself :slight_smile:

I have now stopped the process, because I needed to move the already existing backup to another location to go forward. I am now moving the data and hope that after restarting the backup job with the new destination everything works out.

But for now the Stop after Upload just doesn’t work at all for me.

I need to add something to this

Today I tested another backup on the same machine using the same duplicati instance.
The initial backup was already done previously, so this was just an incremental run, backing up data via SFTP to a remote location.
“Stop after Upload” worked as expected on this job.

Good to know - perhaps it’s an issue that only crops up during initial backups (or maybe when the database hasn’t been fully populated yet)…

I need to cross-reference this post

because that is where I am now, and I’m hoping to get some tips on how to go on from there.

The good thing is, the other backup jobs on the same machine are working; I am only missing the 1,4TB backup for now - still a pain.

I think we need to move toward a better UI for the backup process. With that, you would be able to see exactly which files are in the queue and why. Once we know that, we can tweak the process to work better.

I don’t have a good explanation for why it would keep producing 10GB more data after pressing stop (unless you have a volume size > 5GB).


Great, good to know.
As long as I can recover from a reboot of the machine, a restart of the service, or a forced stop instead, everything is fine - even if not perfect :slight_smile:


I just wanted to add something, even though the issue is resolved on my side. I have summed up the steps it took me to get a working backup again.

If you want, I could also add that information somewhere else in the project, but then I would like some information on where to start contributing to Duplicati.

Thanks for sharing what worked for you! I don’t recall hearing much about database lock issues so I’m wondering if this might be something that is happening due to your environment (in other words, running on a QNAP).

As an Open Source project, we’d love the help! :+1:

If you’re interested in contributing to the code, you can work on it over at GitHub - duplicati/duplicati: Store securely encrypted backups in the cloud!.

If you want to help out other users, that can be done here on the forum either reactively (just post a reply!) or proactively, such as by creating a #howto guide.

Same problem here, and I ended up killing the Duplicati process. Which makes me feel bad.


Also using on a QNAP device?


I’m seeing the same thing here.

I have a large backup that I’ve just changed a large number of files in. I set it running overnight, and planned to stop it today to allow my regular ‘daily’ backups to run. I clicked ‘Stop after upload’ about 6 or 7 hours ago now, and it’s still backing up.

Running on Debian if it helps.


No. I don’t even know what that is… Windows client, Duplicati, backup to Google Drive, latest Canary .14 - “Stop after upload” does nothing, but a full stop interrupts instantly.

I’ll try to check this myself, but after choosing “Stop after upload”, check the job log’s Remote tab and see if 2 more uploads start.

My GUESS is that it’s an issue with the multi-threading updates from a few versions ago. Something like:

  • thread one starts an upload
  • thread two starts a compress
  • stop after upload requested
  • thread one finishes upload
  • stop request rejected because thread two is still busy (when instead it should be left in the process queue so every thread can respond to it)

If that’s the case, then running Duplicati single-threaded MIGHT not exhibit the issue.

Of course that’s all just a theory, I haven’t looked at the Stop code yet. :slight_smile:
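The guessed race in the bullets above can be sketched as a toy example. All names and logic here are purely illustrative assumptions, not Duplicati’s actual implementation - the point is only the difference between a stop request that one thread consumes and discards, versus a sticky flag every thread can still observe:

```python
import threading

# BAD (the guessed bug): the stop request is modeled as a one-shot
# message that a single thread consumes; if that thread rejects it,
# the request is lost and no other thread ever sees it.
class OneShotStop:
    def __init__(self):
        self._lock = threading.Lock()
        self._pending = False

    def request(self):
        with self._lock:
            self._pending = True

    def consume(self):
        # Pops the request: a second thread checking later sees nothing.
        with self._lock:
            pending, self._pending = self._pending, False
            return pending

oneshot = OneShotStop()
oneshot.request()
print(oneshot.consume())  # True  - thread one sees the request...
print(oneshot.consume())  # False - ...thread two has already missed it

# BETTER (the suggested fix): a sticky flag that stays set, so every
# worker thread can notice the stop request at its own pace.
sticky_stop = threading.Event()
sticky_stop.set()
print(sticky_stop.is_set())  # True - thread one sees it
print(sticky_stop.is_set())  # True - thread two still sees it
```

With the sticky flag, a busy thread that cannot act on the request immediately simply checks again after finishing its current work, instead of the request being dropped.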



Any news on this? I just installed Duplicati today, and with initial backups (many TB), not being able to stop after upload makes the program unusable for me…

Many thanks in advance; it’s looking great apart from this!


Any news? I’m currently reading:

" [Local] Bilder : Starting backup …"

It’s a 200 GByte backup, and I cannot stop the “Starting backup” phase without killing the Duplicati process entirely. It feels like it insists on counting files, or whatever it does…
Even if it decided to stop the process at some point, there could still be subsequent jobs waiting…
So killing duplicati.exe makes me feel uncomfortable.

-> The inability to see or edit the current backup queue is still a big issue IMO.
Probably a lot of work for you hard-working guys, I guess :slight_smile: