I have been using Duplicati successfully for some time for smaller source backups.
I started backing up a 1.4TB folder, which is still in progress. For testing purposes I am running the backup to a local disk; because the process takes quite long, the other backups are on hold.
I wanted to stop my 1.4TB backup process to allow the others to run through, and after that I wanted to continue working on the 1.4TB backup.
I never really had the need to stop a backup until now. I tested the “Stop Now” option on another machine quite successfully, sometimes ending up with a broken local database, but that is repairable most of the time (see the sketch at the end of this post).
I didn’t want that to happen to the 1.4TB backup, so I used the option that made more sense: “Stop after Upload”. After an hour the backup is still in progress and hasn’t stopped, even though the status says it is stopping after upload.
But what if that breaks my initial backup?
It took me days to get the first 600GB placed into the encrypted zip files.
Starting over with this would be a nightmare.
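For completeness, here is roughly how such a database repair can be run from the command line. This is a minimal sketch, assuming the Linux wrapper script duplicati-cli and placeholder destination, passphrase, and database paths; adjust all of these to your actual setup:

```
# Hedged sketch: attempt to repair/recreate the local database for a
# backup job whose destination is a local folder (placeholder path).
duplicati-cli repair "file:///share/backup-target" \
  --passphrase="your-passphrase-here" \
  --dbpath=/path/to/local-database.sqlite   # only if the job uses a custom DB path
```

The same operation is available in the web UI via the job’s Database → Repair button.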
@JonMikelV
It is backing up RAW picture files of about 20MB each, which do not get compressed.
After doing a “Stop after Upload” and waiting 5 hours, another 10GB of data has accumulated in the backup directory.
Therefore I assume it should have been able to stop.
@Jorg_Nestler
I also tried shutting down the QNAP app using the AppCenter, which did work. I restarted it and it immediately picked up where it left off, continuing the backup of the 1.4TB. This was not really what I intended, but it is nice to know that a “shutdown” of the system does not affect the backup itself.
I have now stopped the process, because I needed to move the already existing backup to another location in order to go forward. I am now moving the data. I hope everything works out after restarting the backup job with the new destination.
But for now, “Stop after Upload” just doesn’t work at all for me.
Today I tested another backup on the same machine using the same Duplicati instance.
The initial backup had already been done previously, so this was just an incremental run, backing up data via SFTP to a remote location.
Doing a “Stop after Upload” on this job worked as expected.
I think we need to move on to building a better UI for the backup process. With that, you would be able to see exactly which files are in the queue and why. Once we know that, we can tweak the process to work better.
I don’t have a good explanation for why it would keep producing 10GB more data after you pressed stop (unless you have a volume size > 5GB; see the sketch below).
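For reference, the upload volume size is controlled by the dblock-size advanced option, which defaults to 50MB. A minimal sketch with placeholder paths and passphrase (not the actual job from this thread):

```
# Hedged sketch: set the upload volume ("dblock") size explicitly.
# The default is 50MB; "Stop after Upload" should only need to finish
# the volumes already being built/uploaded, so the extra data written
# after stopping would normally be on the order of the volume size.
duplicati-cli backup "file:///mnt/backup-target" /source/photos \
  --passphrase="your-passphrase-here" \
  --dblock-size=50MB
```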
Great, good to know.
As long as I can recover from a reboot of the machine, a restart of the service, or a forced stop instead, everything is fine, even if not perfect.
I just wanted to add something, even though the issue is resolved on my side. I summed up the steps it took me to get a working backup again.
If you want, I could also add that information somewhere else in the project, but then I would like some information on where to start contributing something to Duplicati.
Thanks for sharing what worked for you! I don’t recall hearing much about database lock issues, so I’m wondering if this might be something that is happening due to your environment (in other words, running on a QNAP).
If you want to help out other users, that can be done here on the forum either reactively (just post a reply!) or proactively, such as by creating a How-To guide.
I have a large backup in which I’ve just changed a large number of files. I set it running overnight and planned to stop it today to allow my regular “daily” backups to run. I clicked “Stop after upload” about 6 or 7 hours ago now, and it’s still backing up.
Running 2.0.4.5_beta_2018-11-28 on Debian if it helps.
No, I don’t even know what that is… Windows client, Duplicati, backup to Google Drive, latest canary .14. “Stop after upload” does nothing, but a full stop interrupts instantly.
Any news on this? I just installed Duplicati today, and with initial backups (many TB), not being able to stop after upload makes the program unusable for me…
Many thanks in advance; it’s looking great apart from this!
It’s a 200GB backup, and I cannot stop the “Starting backup” phase at all without killing the Duplicati process. It feels like it insists on counting files, or whatever it does…
Even if it decided to stop the process at some point, there could still be subsequent jobs waiting…
So, killing duplicati.exe makes me feel uncomfortable.
-> The inability to see or edit the current backup queue is still a big issue, IMO.
Probably a lot of work for you hard-working guys, I guess.