I have 2 backup profiles defined:
- Files: every hour, smart retention.
- Image (a dd image generated by a pre-backup script): every 24 hours, smart retention.
Today, after the image backup completed, Duplicati has been “deleting unwanted files” for hours. I can confirm the process is not hung, because I can see activity in the live logs, but this activity is blocking my hourly file backup.
Can this deleting of unwanted files be made non-blocking?
This “image backup” profile has only 2 files to back up: one is about 23 GB, and the other is just a few KB (basically a sha256 sum of the image file).
The loops already run on a non-blocking background thread; you can still use the Duplicati UI (e.g. go to Settings) while a backup runs. They just run backups one at a time.
But there is a known issue where a loop gets stuck after some unknown number of errors or issues: the job still shows activity, and any loops that can still run keep running, but the backup never exits. So either your delete phase really was incredibly slow, or you hit this. A lot of people run into it and keep reporting it in one form or another without realizing. It's possible that's what happened here and nothing was actually being deleted anymore.
You can check the number of files on the target machine or server, refresh, and see whether it changes over 10–30 minutes. If it does, the backup is indeed still running. Or check in some other way.
It might be interesting to run multiple backups at once, or to cancel a running backup so another can run on schedule. But that would mean higher resource usage and possibly conflicts or other issues. With the way the Duplicati code is set up, I'm not sure it would be easy to do. It might be, but when I think of all the complexity involved, it feels like a nightmare.
You could work around this by changing how you split your backups, depending on what you're including, or by running multiple backup applications so that even if one is still running, another still backs up.
As for scheduling: with large enough backups or other variables (e.g. an online connection slowing down versus a local one), jobs may never reliably run on time, and your method or expectations should be adjusted accordingly.
Those are all the related points I can see; I'm simply laying out the possibilities. Personally, I'd expect years to go by before Duplicati changes here, if ever. But I don't know what they will work on or when. In theory I might get to it before anyone else, and even that would be a while, if I ever do.
If you mean having it not block other jobs from running: the GUI only runs one backup at a time.
The exception is sneaking in CLI jobs alongside the GUI, in which case avoiding conflicts is your job.
If the (simplified) pattern is dblock downloads, one dblock upload, then dblock deletes, that's compacting: cleaning up wasted space at the backend caused by deletion of old backup versions.
Compact actually has some tunings, e.g. you can make it run more often but do less work each run.
no-auto-compact can turn it off entirely, but wasted space will accumulate. Can the destination stand that?
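Concretely, the relevant advanced options on the job might look like this (a sketch; the values are illustrative, not recommendations for your setup):

```shell
# Compact-related advanced options (values are examples only):
--threshold=10            # compact when a volume has at least 10% wasted space
                          # (default 25); lower = compact more often, less work per run
--small-file-size=20MB    # volumes below this size count as "small" and can be merged
--small-file-max-count=50 # compact once this many small volumes accumulate
--no-auto-compact=true    # or: skip compacting during backup entirely
```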
How long does the image backup take, and how much does it upload (see Complete log in job logs)?
What OS is this? You might be able to launch a somewhat independent CLI job to do the hours of compacting.
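On Linux, such a standalone compact might look roughly like this (a sketch; the target URL, dbpath, and passphrase are placeholders and must match the GUI job's values, taken from its Export As Command-line output, and it must not run while the GUI job itself is running):

```shell
# Run compact as its own CLI job, e.g. at a quiet time of day.
# URL, dbpath, and passphrase below are placeholders.
duplicati-cli compact "ssh://backup.example.com//backups/image" \
  --dbpath=/root/.config/Duplicati/IMAGEJOB.sqlite \
  --passphrase="..." \
  --threshold=10
```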
You could also just turn the entire image backup+compact into a CLI job by Export As Command-line.
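The exported command can then be scheduled outside the GUI, e.g. via cron (a sketch; the backup line below only shows the general shape, and the real command, URL, dbpath, and passphrase come from Export As Command-line on your job):

```shell
# /etc/cron.d/duplicati-image -- example: run the image backup daily at 03:00.
# Replace the command below with the one produced by Export As Command-line.
0 3 * * * root duplicati-cli backup "ssh://backup.example.com//backups/image" /backups/image \
  --dbpath=/root/.config/Duplicati/IMAGEJOB.sqlite --passphrase="..." \
  --disable-module=console-password-input
```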
You could also run two copies of the Duplicati GUI, but use a different browser for each one.
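Running a second instance means giving it its own port and data folder so the two servers don't collide, roughly like this (a sketch; the port and folder are placeholders):

```shell
# First instance: default settings, typically http://localhost:8200
# Second instance: separate port and separate config database.
duplicati-server --webservice-port=8300 \
  --server-datafolder=/var/lib/duplicati-second
```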