This issue has been reported on the project’s GitHub:
Opened 26 Dec 2018, 08:50 PM UTC
- **Duplicati version**: 2.0.4.5_beta_2018-11-28
- **Operating system**: OS X High Sierra 10.13.6
## Description
Aborting a backup job doesn't work. This may be specific to the scanning stage, or it may be related to the server and the tray/web GUI running from separate plists (a workaround for High Sierra permissions problems).
The job in question was run with the following options (passphrase and folder-exclusion statements removed); a roughly equivalent CLI invocation is sketched after the list:
--full-remote-verification=true
--use-block-cache=true
--thread-priority=idle
--full-block-verification=true
--backup-name=homedir
--dbpath=/var/root/.config/Duplicati/86847676818883747767.sqlite
--encryption-module=aes
--compression-module=zip
--dblock-size=100MB
--retention-policy=1W:1D,4W:1W,12M:1M
--aes-set-threadlevel=4
--exclude-files-attributes=temporary
--disable-module=console-password-input
--dry-run=true
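For reference, a roughly equivalent command-line invocation is sketched below; the destination URL and source folder are placeholders (the real job was started from the server/web UI), and the option values are copied from the list above:
```
# Sketch of the same job run via the CLI under mono; <target-url> and
# <source-folder> are placeholders, all other values are from the job above.
mono Duplicati.CommandLine.exe backup "<target-url>" "<source-folder>" \
    --backup-name=homedir \
    --dbpath=/var/root/.config/Duplicati/86847676818883747767.sqlite \
    --encryption-module=aes \
    --compression-module=zip \
    --dblock-size=100MB \
    --retention-policy=1W:1D,4W:1W,12M:1M \
    --full-remote-verification=true \
    --full-block-verification=true \
    --use-block-cache=true \
    --thread-priority=idle \
    --aes-set-threadlevel=4 \
    --exclude-files-attributes=temporary \
    --disable-module=console-password-input \
    --dry-run=true
```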
## Steps to reproduce
1. Use separate plists for server and tray icon
2. Start a backup
3. During the scanning stage, click the (X) in the web UI and select the immediate stop option
4. Click pause icon in web UI
Stopping the job requires force-quitting the mono process.
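A minimal sketch of that force-quit step, assuming the server runs under mono with "Duplicati.Server" in its command line (check the output of ps before killing anything):
```
# List the Duplicati-related mono processes; the exact names depend on how
# the server and tray agent were launched from their respective plists.
ps aux | grep -i "[d]uplicati"

# Force-quit the server process hosting the stuck backup (assumes the match
# pattern only hits the server, not the tray icon).
sudo pkill -9 -f Duplicati.Server
```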
On a related note (I will probably open a separate bug for this), my whole reason for stopping the job was that use-block-cache doesn't appear to work; I'm seeing a massive amount of write activity. **In dry-run mode with use-block-cache enabled, Duplicati essentially re-writes the entire contents of my SSD, which is incredibly bad for an SSD.**
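To put numbers on that write activity, the file-system calls made by the backup process can be watched with macOS's built-in fs_usage while the dry run is scanning; the PID below is a placeholder for whichever mono process hosts the job:
```
# Trace file-system activity (reads/writes) of the backup process.
sudo fs_usage -w -f filesys <mono-pid>

# Coarser alternative: whole-disk throughput, sampled every 5 seconds.
iostat -d -w 5
```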
## Debug log
```
Backup started at 12/26/2018 3:32:23 PM
Checking remote backup ...
Listing remote folder ...
Scanning local files ...
74690 files need to be examined (5.53 GB) (still counting)
80923 files need to be examined (5.85 GB) (still counting)
86682 files need to be examined (5.98 GB) (still counting)
96166 files need to be examined (8.52 GB) (still counting)
107703 files need to be examined (11.26 GB) (still counting)
120194 files need to be examined (36.15 GB) (still counting)
130064 files need to be examined (45.56 GB) (still counting)
140289 files need to be examined (56.31 GB) (still counting)
150345 files need to be examined (57.19 GB) (still counting)
161269 files need to be examined (58.02 GB) (still counting)
168300 files need to be examined (57.83 GB) (still counting)
176195 files need to be examined (58.15 GB) (still counting)
184556 files need to be examined (134.90 GB) (still counting)
194669 files need to be examined (134.83 GB) (still counting)
202100 files need to be examined (134.65 GB) (still counting)
212829 files need to be examined (134.63 GB) (still counting)
221705 files need to be examined (134.48 GB) (still counting)
System.Threading.ThreadAbortException: Thread was being aborted.
at Duplicati.Library.Main.Controller.RunAction[T] (T result, System.String[]& paths, Duplicati.Library.Utility.IFilter& filter, System.Action`1[T] method) [0x00239] in <c6c6871f516b48f59d88f9d731c3ea4d>:0
at Duplicati.Library.Main.Controller.Backup (System.String[] inputsources, Duplicati.Library.Utility.IFilter filter) [0x00068] in <c6c6871f516b48f59d88f9d731c3ea4d>:0
at Duplicati.CommandLine.Commands.Backup (System.IO.TextWriter outwriter, System.Action`1[T] setup, System.Collections.Generic.List`1[T] args, System.Collections.Generic.Dictionary`2[TKey,TValue] options, Duplicati.Library.Utility.IFilter filter) [0x00119] in <04206d56f2084515b87543fcb90a7e00>:0
at (wrapper delegate-invoke) System.Func`6[System.IO.TextWriter,System.Action`1[Duplicati.Library.Main.Controller],System.Collections.Generic.List`1[System.String],System.Collections.Generic.Dictionary`2[System.String,System.String],Duplicati.Library.Utility.IFilter,System.Int32].invoke_TResult_T1_T2_T3_T4_T5(System.IO.TextWriter,System.Action`1<Duplicati.Library.Main.Controller>,System.Collections.Generic.List`1<string>,System.Collections.Generic.Dictionary`2<string, string>,Duplicati.Library.Utility.IFilter)
at Duplicati.CommandLine.Program.ParseCommandLine (System.IO.TextWriter outwriter, System.Action`1[T] setup, System.Boolean& verboseErrors, System.String[] args) [0x00313] in <04206d56f2084515b87543fcb90a7e00>:0
at Duplicati.CommandLine.Program.RunCommandLine (System.IO.TextWriter outwriter, System.IO.TextWriter errwriter, System.Action`1[T] setup, System.String[] args) [0x00002] in <04206d56f2084515b87543fcb90a7e00>:0
System.Threading.ThreadAbortException: Thread was being aborted.
at Duplicati.CommandLine.Program.RunCommandLine (System.IO.TextWriter outwriter, System.IO.TextWriter errwriter, System.Action`1[T] setup, System.String[] args) [0x000bb] in <04206d56f2084515b87543fcb90a7e00>:0
at Duplicati.Server.WebServer.RESTMethods.CommandLine+<>c__DisplayClass4_0.<POST>b__0 (Duplicati.Library.Main.IMessageSink sink) [0x00030] in <fe28905ee30b422e8d475f1cfdb85515>:0
```
As I wrote over there:
I’ve also noticed that changing the ‘throttle’ settings has no effect on an upload that’s currently in progress.
Presumably this is a bug and not an intentional removal of a feature. Either way, it represents a significant regression in capability. For those with large datasets to back up but relatively limited uplink bandwidth (like my residential connection), this is a serious problem. I need the ability to pause a large upload (which may take a day or more), or at the very least to change the throttle, in order to free up bandwidth for other users and services on the network whilst keeping the overall backup time as short as possible. Flexible, real-time management of the upload's state and bandwidth is an essential capability of a backup solution, at least for my use case.
I’m left with the options of either limiting available bandwidth at the network level (proxy, switch, router, etc.) or downgrading to a previous version unless and until this is fixed.
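One possible interim workaround, sketched below on the assumption that Duplicati's --throttle-upload advanced option is honoured when the job starts (even though changing the throttle mid-upload has no effect), is to set the cap before the upload begins:
```
# Assumes the --throttle-upload advanced option is applied at job start;
# <target-url> and <source-folder> are placeholders as before.
mono Duplicati.CommandLine.exe backup "<target-url>" "<source-folder>" \
    --throttle-upload=500KB
```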
I hope this issue is addressed soon, as it is a potential show-stopper for my use case, which would be a crying shame: Duplicati is a fantastic project that has served me very well since I began using it early last year. I'm hugely grateful to the devs who've worked so hard to make this freely available.