Parallel backup plans

Can we have multiple backup plans running at the same time?

No, but if you schedule multiple backups a minute apart, the jobs will be queued and processed in sequence.

@dr_state what is your goal? Do you want two copies of your backup data? If so, I might recommend a different approach. I accomplish this by having one backup job back up to my local NAS, and then having the NAS synchronize that data to B2. (I’m using Synology Cloud Sync for that, but tools like rclone also work; see the sketch below.)

The nice thing about this approach is that Duplicati doesn’t have to do the backup work twice. If my local NAS fails, I can easily repoint the backup job to B2 storage and restore from there if needed.
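If you don’t have a Synology, a minimal rclone sketch of the same one-way sync might look like this (the remote name `b2remote`, the bucket, and the paths are placeholders for illustration):

```
# One-way sync of the Duplicati destination folder on the NAS to a B2 bucket.
# "b2remote" is a hypothetical remote set up beforehand with `rclone config`.
rclone sync /volume1/backups/duplicati b2remote:my-backup-bucket/duplicati
```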

What I have done, because I have a large amount of data to back up, is split the backup into several smaller backup plans on the same server/same drive, and they all back up to one storage destination.

I hope the backup destinations are unique in some way. It can be as simple as using a different subfolder name for each backup job. You need to avoid mixing the backup files from more than one backup job in the exact same location.

(Technically you CAN do it by using unique file prefixes in your backup configs, but I’d recommend separate folders.)
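As a rough sketch of both options (paths are made up, and you’d normally set this in each job’s destination settings rather than by hand):

```
REM Preferred: each job writes to its own subfolder on the destination.
REM (Encryption, passphrase, and other options omitted for brevity.)
Duplicati.CommandLine.exe backup "file://D:\Backups\documents" "C:\Users\me\Documents"
Duplicati.CommandLine.exe backup "file://D:\Backups\photos" "C:\Users\me\Photos"

REM Workable but not recommended: shared folder, unique file prefixes.
Duplicati.CommandLine.exe backup "file://D:\Backups\shared" "C:\Users\me\Documents" --prefix=docs
Duplicati.CommandLine.exe backup "file://D:\Backups\shared" "C:\Users\me\Photos" --prefix=photos
```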

Sorry to tag onto this,
but is it possible yet to run multiple backups at the same time using the service?

I understand from other posts that this isn’t possible with the GUI, but is it possible with the service?

Currently we have a load of different backup jobs which all back up different folders to different offsite locations (sadly required by work, so I can’t change this), and sometimes the backup of one folder can take 15 minutes before the next backup starts.

It also doesn’t help that we have the offsite backups set to run every hour (again, sadly required by work).

Welcome to the forum @si458

No. If you look at Task Manager, you’ll see that the service is just the server run as a service.
Duplicati components explains the different ways the functionality gets packaged.

I’m not sure starting lots of work at once gets you to the end any faster, but it can stress a computer.
If you’re determined to do that, you can use Task Scheduler to run several Duplicati.CommandLine jobs.

There won’t be a GUI for those runs, but you can set up a job there and then use Export As Command-line.
Although there’s a potential for time collisions, you might be able to blend GUI and command-line usage.
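As a rough sketch (task names, times, and paths are made up; the actual backup command in each .bat file would come from your job’s Export As Command-line output):

```
REM Register two hourly tasks, staggered a couple of minutes apart.
schtasks /Create /TN "Duplicati Job A" /SC HOURLY /ST 09:00 /TR "C:\Scripts\backup-job-a.bat"
schtasks /Create /TN "Duplicati Job B" /SC HOURLY /ST 09:02 /TR "C:\Scripts\backup-job-b.bat"
```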

I don’t know the exact requirement you’re under, but it sounds like the work doesn’t fit in the available time.
Perhaps you need a faster something-or-other (drives, CPUs, networking, and so on).

If your job logs show that compacting is the occasional slow spot, it can be tuned a bit.
You can also trade off speed versus storage use a little, e.g. by reducing compression.
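As a sketch of the sort of tuning meant here (the values are only illustrations, and `<storage-url>` and `<source>` are placeholders; check the option documentation for your version):

```
REM Lighter compression trades storage space for speed (zip levels run 0-9):
Duplicati.CommandLine.exe backup <storage-url> <source> --zip-compression-level=1

REM Raise the wasted-space threshold (a percentage) so compacting runs less often:
Duplicati.CommandLine.exe backup <storage-url> <source> --threshold=50

REM Or disable automatic compacting and run it manually during quiet hours:
Duplicati.CommandLine.exe backup <storage-url> <source> --no-auto-compact=true
Duplicati.CommandLine.exe compact <storage-url>
```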

Hi @ts678,
That’s a real pain that Duplicati still can’t handle multiple backups at once :frowning:
I’m also not keen on getting Task Scheduler to kick off the multiple backups 5 minutes apart when I can’t guarantee the backups will finish within the 5-minute window.
I’d rather have an all-in-one solution…
Time to look for another backup solution then.
Thanks anyway.

Also very disappointed that parallel backup is not supported.

In my case I need to back up to a local disk and to the cloud.
The cloud connection is slow. When a new batch of data is added, it can take a week to upload it to the cloud.
It’s OK for me to have a local-only backup for a week (while it is uploading to the cloud), but it’s not OK to have no backup at all (Duplicati will not run the local backup for a week while the cloud backup is uploading).

My preference is to use Duplicati to do a local backup only (to my NAS), and then to use a separate tool to sync it to the cloud. (See my earlier post above.)

Even if Duplicati added parallelism to the web scheduler engine I wouldn’t change my approach.

Note that you CAN do parallelism with Duplicati if you are willing to forgo the web UI and use the command line.

Yes, I can use separate tools and the command line… but in that case, what do I need Duplicati with its friendly UI for? :roll_eyes:

Could you please provide more details on your solution? You use Duplicati to back up locally to the NAS with encryption, and then just sync the files to the cloud (with some other tool)?
Is there any risk that if the backup on the NAS gets corrupted, so will the one in the cloud?
I thought independent local and cloud backups would be safer.

And what is the way to do parallelism with the Duplicati command line? Just schedule tasks with Windows Task Scheduler? Will Duplicati handle everything OK if I run parallel tasks this way?

Yep. In my case my NAS is a Synology, and I use their “Cloud Sync” package to do a one-way sync to S3.

I use S3 versioning. If some malware or accident deletes/corrupts my Duplicati data, I would still have a good version of the files in S3. But on top of that I also use filesystem snapshots on my Synology NAS. If I got malware I would roll back to a previous snapshot.
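For what it’s worth, S3 object versioning can be enabled with the AWS CLI like this (the bucket name is a placeholder):

```
# Enable object versioning on the bucket receiving the synced Duplicati files.
aws s3api put-bucket-versioning --bucket my-backup-bucket --versioning-configuration Status=Enabled
```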

The only potential issue I can think of is if you try to start two backups at exactly the same time and they are both configured to create VSS snapshots. From memory, Windows doesn’t like being asked to create two snapshots at exactly the same time; one will fail. I’m not sure if that still applies to more modern versions of Windows, but the workaround was to just stagger the snapshot creation a little. A minute or so is sufficient.
