How to keep runs in order with duplicati_client

I used to schedule each job to start one second after the previous one, which kept the different backup jobs running in the order I wanted. Now I am trying to run the same jobs with duplicati_client, but the jobs do not seem to run in the order in which I issue the duplicati_client commands.

For example, if I run a script like this:

duplicati_client login || exit
duplicati_client run 1
duplicati_client run 2
duplicati_client run 3
duplicati_client run 4
duplicati_client logout

and then check the results in the web UI after all of the jobs have finished, I find that the jobs did not end in the order in which I wanted them to run.

Before I start writing extensive run-before and run-after scripts that use a lock file, I decided to ask which part of Duplicati actually ensures that the jobs do not run in parallel, and what might be the preferred way to define the order when I use duplicati_client?
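To make the lock-file idea concrete, here is a minimal sketch of what I had in mind for the run-before/run-after scripts. The lock path and the use of mkdir as a mutex are my own choices, not anything Duplicati-specific: the run-before script would call acquire_lock, the run-after script release_lock.

```shell
#!/bin/sh
# Sketch of the lock-file idea: mkdir is atomic, so it can serve
# as a mutex between a run-before and a run-after script.
# The lock path is an arbitrary choice, not a Duplicati convention.
LOCKDIR=${LOCKDIR:-/tmp/duplicati-job.lock}

acquire_lock () {
    # Exactly one caller can create the directory; everyone else
    # polls until the current holder removes it.
    until mkdir "$LOCKDIR" 2>/dev/null ; do
        sleep 5
    done
}

release_lock () {
    rmdir "$LOCKDIR"
}
```

This would only prevent overlap, though, not enforce a particular order, which is why I would rather not go down this road.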

In repeated tests, job #4 runs before job #3, which is not acceptable to me.

So for a temporary solution, I came up with this:

#!/bin/sh
# Grab the line with the last start time so we can detect a change.
check () {
    duplicati_client get backup "$1" | grep Started
}
duplicati_client login || exit
for job in 1 2 3 4
do
    was="$(check "$job")"
    echo "Job $job"
    duplicati_client run "$job"
    # Poll until the Started line changes, i.e. the job has begun.
    while [ "$was" = "$(check "$job")" ] ; do sleep 5 ; done
done
duplicati_client logout

I am not satisfied with it, though.
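One obvious weakness is that the loop can hang forever if a job never starts. A variant of the same approach with a per-job timeout could look like this; the "Started" field and the 5-second poll interval are carried over from my script, and the 10-minute limit is an arbitrary value I picked:

```shell
#!/bin/sh
# Variant of the polling loop with a timeout per job.
check () {
    duplicati_client get backup "$1" | grep Started
}

# Wait until the job's Started line differs from $2,
# giving up after 600 seconds.
wait_for_change () {
    job=$1 was=$2 waited=0
    while [ "$was" = "$(check "$job")" ] ; do
        sleep 5
        waited=$((waited + 5))
        [ "$waited" -ge 600 ] && return 1
    done
}

if command -v duplicati_client >/dev/null 2>&1 ; then
    duplicati_client login || exit 1
    for job in 1 2 3 4 ; do
        was=$(check "$job")
        echo "Job $job"
        duplicati_client run "$job"
        wait_for_change "$job" "$was" || echo "Job $job timed out" >&2
    done
    duplicati_client logout
fi
```

It still feels like a workaround rather than a proper solution.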

I suppose the unpredictability of the running order of the jobs is not a bug but a feature. I wonder if there is a reason why duplicati_client cannot ensure the order of the jobs?