I had the same problem, and the only workaround I found was restarting the PC to cancel the job.
But what if that breaks my initial backup?
It took me days to get the first 600 GB placed into the encrypted zip files.
Starting over with this would be a nightmare.
Good point. If I remember correctly, on my side the job resumed after starting the backup again - but I wouldn’t swear to it.
Hi @herbert, welcome to the forum!
It’s possible Duplicati is still trying to gather together and process enough blocks to fill up a 250MB volume.
If you look at your destination, do you see any duplicati-*.dblock.* files created AFTER the time you clicked “Stop after Upload”?
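In case it helps, here’s a rough sketch of how you could check that, for example from a shell on the NAS - just illustrative Python, where the destination path and the timestamp are placeholders you’d replace with your own values:

```python
# Rough illustration only: list dblock volumes in the destination folder that were
# created after the moment "Stop after Upload" was clicked. The folder path and the
# timestamp below are placeholders -- adjust them for your own setup.
import glob
import os
from datetime import datetime

destination = "/share/backup/duplicati"        # hypothetical destination folder
stop_clicked = datetime(2019, 1, 15, 20, 30)   # when you pressed "Stop after Upload"

for path in sorted(glob.glob(os.path.join(destination, "duplicati-*.dblock.*"))):
    mtime = datetime.fromtimestamp(os.path.getmtime(path))
    if mtime > stop_clicked:
        print(f"{mtime:%Y-%m-%d %H:%M}  {path}")
```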
It is backing up RAW picture files of about 20 MB each, which do not get compressed.
After doing a “Stop after Upload” and waiting 5 hours, another 10 GB of data has accumulated in the backup directory.
Therefore I assume it should have been able to stop.
I also tried to shut down the QNAP app using the AppCenter, which did work. After restarting it, it immediately continued where it left off - backing up the 1.4 TB. This was not really what I intended, but it is still nice to know that a “shutdown” of the system does not affect the backup itself.
I have now stopped the process because I needed to move the already existing backup to another location to go forward. I am moving the data now, and I hope that everything works out after restarting the backup job with the new destination.
But for now the Stop after Upload just doesn’t work at all for me.
I need to add something to this.
Today I tested another backup on the same machine, using the same Duplicati instance.
The initial backup had already been done previously; this was just an incremental run, backing up data via SFTP to a remote location.
Doing a “Stop after upload” on this job worked as expected.
Good to know - perhaps it’s an issue that only crops up during initial backups (or maybe when the database hasn’t been fully populated yet)…
I need to cross-reference this post,
because that is where I am now, and I am hoping to get some tips on how to go on from there.
The good thing is that the other backup jobs on the same machine are working; I am only waiting for the 1.4 TB backup to finish - still a pain.
I think we need to move toward a better UI for the backup process. With that, you would be able to see exactly which files are in the queue and why. Once we know that, we can tweak the process to work better.
I don’t have a good explanation for why it would keep producing 10 GB more data after pressing stop (unless you have a volume size > 5 GB).
Great, good to know.
As long as I can recover from a reboot of the machine, a restart of the service, or a forced stop instead, everything is fine - even if not perfect.
I just wanted to add something, even though the issue is resolved on my side. I have summed up the steps it took me to get a working backup again.
If you want, I could also add that information somewhere else in the project, but then I would like some pointers on where to start contributing something to Duplicati.
Thanks for sharing what worked for you! I don’t recall hearing much about database lock issues so I’m wondering if this might be something that is happening due to your environment (in other words, running on a QNAP).
As an Open Source project, we’d love the help!
If you’re interested in contributing to the code, you can work on it over at GitHub (duplicati/duplicati: Store securely encrypted backups in the cloud!).
If you want to help out other users, that can be done here on the forum, either reactively (just post a reply!) or proactively, such as by creating a #howto guide.
Same problem here; I end up killing the Duplicati process, which makes me feel bad.
Also using 18.104.22.168 on a QNAP device?
I’m seeing the same thing here.
I have a large backup that I’ve just changed a large number of files in. I set it running overnight, and planned to stop it today to allow my regular ‘daily’ backups to run. I clicked ‘Stop after upload’ about 6 or 7 hours ago now, and it’s still backing up.
Running 2.0.4.5_beta_2018-11-28 on Debian, if it helps.
No. Don’t even know what that is… Windows client, Duplicati, Backup to Google Drive. Latest Canary .14 - “Stop after upload” does nothing, but a full stop instantly interrupts.
I’ll try to check this myself, but after choosing “Stop after upload”, check the job log’s Remote tab and see if two more uploads start.
My GUESS is it’s an issue with the multi-threading updates a few versions ago. Something like:
- thread one starts an upload
- thread two starts a compress
- stop after upload requested
- thread one finishes upload
- stop request rejected because thread two is still busy (when instead it should be left in the process queue so every thread can respond to it)
If that’s the case, then using Duplicati with single threads MIGHT not exhibit the issue.
Of course that’s all just a theory, I haven’t looked at the Stop code yet.
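To make that guess a bit more concrete, here is a tiny toy sketch in plain Python (my own simplification, nothing to do with Duplicati’s actual code) of how a one-shot stop request can get rejected instead of being left in place for every worker to react to:

```python
# Toy illustration of the guess above (plain Python, not Duplicati's actual code).
# A one-shot "stop after upload" request is rejected because one worker is still
# busy; leaving the request in place would let every worker react at its next
# safe point instead.
import threading
import time

busy_workers = {"upload": True, "compress": True}
stop_flag = threading.Event()
lock = threading.Lock()

def request_stop_after_upload():
    with lock:
        if any(busy_workers.values()):
            print("stop request rejected: a worker is still busy")  # the suspected bug
        else:
            stop_flag.set()

def worker(name, seconds):
    time.sleep(seconds)                  # simulate the upload / compress step
    with lock:
        busy_workers[name] = False
    if stop_flag.is_set():
        print(f"{name} thread stops cleanly")
    else:
        print(f"{name} thread keeps going and queues the next volume")

threads = [threading.Thread(target=worker, args=("upload", 1)),
           threading.Thread(target=worker, args=("compress", 3))]
for t in threads:
    t.start()

time.sleep(2)                            # user clicks "Stop after upload"
request_stop_after_upload()              # rejected, because compress is still busy
for t in threads:
    t.join()
```

In this toy version both workers just keep producing volumes, which is roughly the behavior people are describing here.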
Any news on this? I have just installed Duplicati today, and with initial backups (many TB), not being able to stop after upload makes the program unusable for me…
Many thanks in advance - it’s looking great apart from this!
Any news? I’m currently reading:
" [Local] Bilder : Starting backup …"
It’s a 200 GB backup, and I cannot stop the “Starting backup” phase at all without killing the Duplicati process. It feels like it insists on counting files, or whatever it is doing…
Even if it decided to stop the process at some point, there could still be subsequent jobs waiting…
So, killing duplicati.exe makes me feel uncomfortable.
-> The inability to see or edit the current backup queue is still a big thing IMO.
Probably a lot of work for you hard-working guys, I guess.
Thanks for your, and @mbc9’s, interest. Unfortunately, I don’t think much progress has been made on this.
As an “annoying but (likely) doesn’t actually break anything” issue, this is probably lower on the priority list than you would like.
That being said, having more details might help narrow down what needs to be done.
For example, it sounds like @Tapio wants to be able to stop during the file scanning process (at the beginning of the process), while @mbc9 wants to be able to stop after the current upload (in the middle of the process).
Did I get that right?
I ask because “stop after upload” uses a different stop process than “stop now”.
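To illustrate what I mean by a different stop process, here is a rough mental model as a small Python sketch - my own simplification, not Duplicati’s actual implementation: “stop now” tries to abort the run as soon as possible, while “stop after upload” only sets a flag that is checked once the volume currently being uploaded has finished.

```python
# Illustrative sketch of the two stop modes as described above (not real Duplicati code).
import threading
import time

stop_after_upload = threading.Event()   # cooperative: checked between volumes
stop_now = threading.Event()            # hard: checked before every step

def backup_loop(volumes):
    for volume in volumes:
        if stop_now.is_set():
            print("stop now: aborting immediately, current volume is discarded")
            return
        print(f"uploading {volume} ...")
        time.sleep(1)                    # simulate the upload
        if stop_after_upload.is_set():
            print("stop after upload: finishing this volume, then stopping cleanly")
            return

worker = threading.Thread(target=backup_loop,
                          args=(["dblock-1", "dblock-2", "dblock-3"],))
worker.start()
time.sleep(1.5)
stop_after_upload.set()                  # graceful request, honored at the next volume boundary
worker.join()
```

So a “stop now” request takes effect mid-run, while “stop after upload” only takes effect at the next safe point - which is why the two can behave so differently.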