One way to start is to use Export As Command-line in the GUI until you get the hang of what's required. CLI jobs run independently, which can give you parallelism. The GUI serializes jobs, which makes me wonder why the original post works so hard to make sure jobs of variable duration don't step on each other.
Do you have an upload link that's much faster than your drives? If not, perhaps skip parallelism and just create a batch of jobs with settings for the various categories of files, then let them run until done. Folders without changes will be nearly instant (if the timestamp didn't change, there's no need to even scan the file).
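As a rough sketch of what that could look like, here are two exported jobs dropped into one script. The paths, target URLs, and backup names are made up for illustration; the actual commands from Export As Command-line will carry your real settings and options:

```shell
#!/bin/sh
# Each line below stands in for what "Export As Command-line" produces
# for one GUI job. (duplicati-cli is the CLI entry point on Linux
# packages; Windows installs use Duplicati.CommandLine.exe instead.)

duplicati-cli backup "file:///mnt/backup/docs"   /home/me/docs   --backup-name="Docs"
duplicati-cli backup "file:///mnt/backup/photos" /home/me/photos --backup-name="Photos"
```

Running them from one script keeps them sequential; launching each in its own terminal (or backgrounding with `&`) gets you the parallelism mentioned above.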
As a side note, your backup sounds large, so Choosing sizes in Duplicati might be worth reading, since Duplicati can slow down when tracking large numbers of small blocks. The default blocksize is 100KB.
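If you do decide to raise the block size, it's a per-job setting, and as far as I know it has to be chosen before the job's first backup run, since all existing blocks are keyed to it. A hypothetical example (the path and target URL are made up, and 1MB is just one plausible choice for a large backup, not a recommendation):

```shell
# Larger blocks mean fewer blocks for the local database to track;
# set --blocksize before the first run of the job.
duplicati-cli backup "file:///mnt/backup/media" /home/me/media --blocksize=1MB
```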