I have been using Duplicati successfully for some time for smaller source backups.
I started backing up a 1.4 TB folder, which is still in progress. For testing purposes I am running the backup to a local disk; because the process takes quite a long time, the other backups are on hold.
I wanted to stop the 1.4 TB backup process to let the others run through, and afterwards continue working on the 1.4 TB backup.
I have never really needed to stop a backup until now. I tested the “Stop Now” option on another machine with reasonable success — sometimes it leaves a broken local database, but that is repairable most of the time.
I didn’t want that to happen to the 1.4 TB backup, so I used the option that made more sense: “Stop after Upload”. After an hour the backup was still in progress and hadn’t stopped, even though the status said it was stopping after upload.
It is backing up RAW picture files of about 20 MB each, which do not compress.
After triggering “Stop after Upload” and waiting five hours, another 10 GB of additional data had accumulated in the backup directory.
Therefore I assume it should have been able to stop by now.
I also tried shutting down the QNAP app via the App Center, which did work. After restarting it, it immediately continued where it had left off with the 1.4 TB backup. That was not really what I intended, but it is good to know that a shutdown of the system does not affect the backup itself.
I have now stopped the process because I needed to move the existing backup to another location before going forward. I am currently moving the data, and I hope everything works out after restarting the backup job with the new destination.
But for now, “Stop after Upload” simply does not work for me at all.
Today I tested another backup on the same machine using the same Duplicati instance.
The initial backup had already completed previously, so this was just an incremental run, backing up data via SFTP to a remote location.
“Stop after Upload” on this job worked as expected.
I think we need to move toward a better UI for the backup process. With that, you would be able to see exactly which files are in the queue and why. Once we know that, we can tweak the process to work better.
I don’t have a good explanation for why it would keep producing 10 GB of additional data after pressing stop (unless you have a volume size larger than 5 GB).
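As a rough sanity check of that reasoning: once “Stop after Upload” is requested, the data still to be written should be bounded by the volume currently being built plus the volumes already queued for upload. The sketch below uses Duplicati’s documented defaults (`--dblock-size` of 50 MB and `--asynchronous-upload-limit` of 4) — if the job overrides these options, substitute your own values.

```python
# Back-of-envelope bound on data written after "Stop after Upload".
# Values are Duplicati 2.0.x defaults (assumptions for this job --
# check the job's advanced options for overrides).
dblock_size_mb = 50          # --dblock-size: remote volume size
async_upload_limit = 4       # --asynchronous-upload-limit: queued volumes

# Worst case: one volume being filled plus a full upload queue.
expected_residual_mb = (async_upload_limit + 1) * dblock_size_mb

observed_residual_mb = 10 * 1024  # the ~10 GB reported above

print(f"expected upper bound: ~{expected_residual_mb} MB")
print(f"observed: {observed_residual_mb} MB, "
      f"about {observed_residual_mb // expected_residual_mb}x the bound")
```

With default settings the stop request should leave only a few hundred megabytes in flight, so the observed 10 GB is roughly 40 times over that bound — which suggests the stop request was not honored rather than the queue simply draining slowly.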
Thanks for sharing what worked for you! I don’t recall hearing much about database lock issues, so I wonder whether this is something specific to your environment (in other words, running on a QNAP).
I have a large backup in which I’ve just changed a large number of files. I set it running overnight and planned to stop it today to allow my regular daily backups to run. I clicked “Stop after upload” about 6 or 7 hours ago, and it’s still backing up.
Running 2.0.4.5_beta_2018-11-28 on Debian, if it helps.
It’s a 200 GB backup, and I cannot stop the “Starting backup” phase at all without killing the Duplicati process. It feels like it insists on counting files, or whatever it does…
Even if it decided to stop the process at some point, there could still be subsequent jobs waiting…
So killing duplicati.exe makes me feel uncomfortable.
-> The inability to see or edit the current backup queue is still a big issue, IMO.
Probably a lot of work for you hard-working guys, I guess.