Parallel backup plans

Can we have multiple backup plans running at the same time?

No, but if you schedule multiple backups one minute after each other, the jobs will be queued and processed in sequence.

@dr_state what is your goal? Do you want two copies of your backup data? If so I might recommend a different approach. I accomplish this by having one backup job back up to my local NAS, then on my local NAS it synchronizes to B2. (I’m using Synology Cloud Sync for that, but tools like rclone also work.)

The nice thing about this approach is that Duplicati doesn’t have to do the backup work twice. If my local NAS fails, I can easily repoint the backup job to B2 storage and restore from there if needed.
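The NAS-to-B2 leg of that setup can be a single sync command. A minimal sketch using rclone (the remote name `b2remote`, the bucket, and the paths below are all hypothetical placeholders; you would set up the remote first with `rclone config`):

```shell
# Mirror the folder Duplicati backs up to on the NAS into a B2 bucket.
# "b2remote" must already be configured via `rclone config`;
# bucket name and paths are illustrative only.
rclone sync /volume1/duplicati-backups b2remote:my-backup-bucket/duplicati-backups
```

Running this on a schedule (cron, Synology Task Scheduler, etc.) keeps the cloud copy current without Duplicati doing the backup work twice.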

What I have done, because I have a large amount of data to back up, is split the backup into smaller backup plans on the same server/same drive, and they all back up to one storage destination.

I hope the backup destinations are unique in some way. Can be as simple as using a different subfolder name for each backup job. You need to avoid mixing the backup files from more than one backup job in the exact same location.

(Technically you CAN do it by using unique file prefixes on your backup configs but I’d recommend using separate folders.)
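To make the separate-folders idea concrete, here is a small sketch (bucket and job names are hypothetical) that derives one destination subfolder per job, so the backup files from different jobs never land in the same place:

```shell
# One subfolder per backup job under the same bucket, so the
# dblock/dindex/dlist files from different jobs never mix.
BASE="b2://my-backup-bucket/duplicati"
for JOB in documents photos mail; do
  DEST="${BASE}/${JOB}"
  echo "Job '${JOB}' -> ${DEST}"
done
```

Each Duplicati job then gets its own `DEST` as the storage URL.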

Sorry to tag onto this,
but is it possible yet to run multiple backups at the same time using the service?

I understand from other posts that it isn't possible with the GUI, but what about the service?

Currently we have a load of different backup jobs which all back up different folders to different offsite locations (required by work, sadly, so I can't change this), and sometimes the backup of one folder can take 15 minutes before the next backup starts.

It also doesn't help that the backups are set to run offsite every hour (again, sadly required by work).

Welcome to the forum @si458

No. If you look at Task Manager you'll see that the service is just the server run as a Windows service.
Duplicati components explains the different ways that the functionalities get packaged.

I’m not sure starting lots of work at once actually finishes any faster, but it can stress a computer.
If you’re determined to do that, you can use Task Scheduler to launch several Duplicati.CommandLine runs.

There won’t be a GUI, but you can set up a job there and then use Export As Command-line.
Although there’s a potential for time collisions, you might be able to blend GUI and command-line usage.
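As a rough sketch of the Task Scheduler approach: each line comes from a job's Export As Command-line output, pasted into one script that the scheduler launches. All paths, storage URLs, and the passphrase placeholder below are hypothetical; use exactly what your own export produces:

```shell
# Run two exported backup jobs back to back from one scheduled script.
# (On Windows this would be Duplicati.CommandLine.exe in a .bat file.)
DUPLICATI="duplicati-cli"   # placeholder path to the command-line binary

# Values below are illustrative; copy the real ones from Export As Command-line.
"$DUPLICATI" backup "b2://bucket/job-docs" "/data/docs" \
    --dbpath="/root/.config/Duplicati/JOBDOCS.sqlite" --passphrase="..."

"$DUPLICATI" backup "b2://bucket/job-photos" "/data/photos" \
    --dbpath="/root/.config/Duplicati/JOBPHOTOS.sqlite" --passphrase="..."
```

Run sequentially like this, the jobs can't collide; launching them from separate scheduled tasks instead would run them in parallel, with the stress caveat above.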

I don’t know the exact requirement you’re under, but it sounds like the work doesn’t fit in the available time.
Perhaps you need a faster something-or-other (drives, CPUs, networking, and so on).

If your job logs show that compacting is the occasional slow spot, it can be tuned a bit.
You can also trade off speed versus storage use a little, e.g. by reducing compression.
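For example, these advanced options can be appended to a backup job (the values shown are illustrative, not recommendations; check the option defaults in the Duplicati docs before changing them):

```shell
# Options appended to a Duplicati backup command for the speed/storage trade-off:
--zip-compression-level=1    # lower = faster backups but larger backup files
--threshold=25               # percent of wasted space in remote volumes before compacting
--no-auto-compact=true       # skip compacting during backup; run the compact command separately
```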

Hi @ts678
That's a real pain that Duplicati still can't handle multiple backups at once :frowning:
I'm also not keen on getting Task Scheduler to kick off the backups 5 minutes apart when I can't guarantee each backup will finish within the 5-minute window.
I'd rather have an all-in-one solution…
Time to look for another backup solution then.
Thanks anyway