Windows CLI Script to Create B2 Backups with Hostname as Folder Name?

Hey All,

NOTE: We are only ever going to deploy and manage through CLI (restores will be done via GUI), so please no GUI suggestions as this is for a mass deployment.

So I am in the process of assessing Duplicati for file and folder backups of users on Windows desktops in our environment. We are piping backups directly to B2. Thus far we have the following, which connects, and starts backing up:

Duplicati.CommandLine.exe backup "b2://Backups-Folder/?auth-username=key&auth-password=pass" "C:\Users\" "C:\Program Files\Duplicati 2\data\" --backup-name=Duplicati.CommandLine --encryption-module=aes --compression-module=zip --dblock-size=50mb --passphrase="password" --retention-policy="1W:1D,4W:1W,12M:1M" --exclude-files-attributes="system,temporary,hidden" --disable-module=console-password-input --allow-missing-source

The issue, of course, is that each machine just dumps files into "Backups-Folder". What would be amazing is if we could have each machine backing up into its own folder (preferably with the machine's hostname as the folder name).

Does anyone know if this is possible?


Update as follows:


cd "C:\Program Files\Duplicati 2"
Duplicati.CommandLine.exe backup "b2://Backups/%COMPUTERNAME%?auth-username=key&auth-password=password" "C:\Users\" "C:\Program Files\Duplicati 2\data\" --backup-name=Duplicati.CommandLine --encryption-module=aes --compression-module=zip --dblock-size=50mb --passphrase="password" --retention-policy="1W:1D,4W:1W,12M:1M" --exclude-files-attributes="system,temporary,hidden" --disable-module=console-password-input --allow-missing-source

Glad you figured it out! But note if you set up more than one backup job on a machine, you will need to store them separately on B2. Don’t ever target more than one backup job (whether it’s the same computer or different computers) to the exact same location.

Also, don't back up your Duplicati data folder. You may have trouble with the job-specific SQLite files if you try to back them up, and it is pointless anyway: those databases can be rebuilt from the remote storage in the case of data loss.
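For example, the same command with the data folder simply dropped from the source list (a sketch based on the command above; the bucket name and credentials are placeholders):

```
cd "C:\Program Files\Duplicati 2"
Duplicati.CommandLine.exe backup "b2://Backups/%COMPUTERNAME%?auth-username=key&auth-password=password" "C:\Users\" --backup-name=Duplicati.CommandLine --encryption-module=aes --compression-module=zip --dblock-size=50mb --passphrase="password" --retention-policy="1W:1D,4W:1W,12M:1M" --exclude-files-attributes="system,temporary,hidden" --disable-module=console-password-input --allow-missing-source
```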

The plan looks like all users share access to read (or write) each other's data. Not sure if that's a worry. Dodging it might be kind of painful, e.g. if each user had to be given their own credentials to Backblaze.
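One way to dodge it, if you go down that road, would be a per-machine B2 application key restricted to that machine's folder prefix. A sketch using the Backblaze `b2` CLI (the key name and prefix are illustrative; depending on your CLI version the subcommand is `create-key` or `key create`):

```
b2 create-key --bucket Backups --namePrefix "%COMPUTERNAME%/" "key-%COMPUTERNAME%" listBuckets,listFiles,readFiles,writeFiles,deleteFiles
```

Each machine would then use its own key ID and secret in the `b2://` URL, so it can only touch its own folder.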

That exact job would be pushed repetitively to the same machine every, say, 4 hours. Each machine will have the same script pushed to it, meaning each machine's job is the same job over and over.
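If the RMM can't drive the cadence itself, the same 4-hour schedule could be set up locally with Task Scheduler (a sketch; the task name and script path are placeholders):

```
schtasks /Create /TN "DuplicatiBackup" /TR "powershell -ExecutionPolicy Bypass -File C:\Scripts\duplicati-backup.ps1" /SC HOURLY /MO 4 /RU SYSTEM
```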

So, the plan is to port this working script into PowerShell now, where I will define separate passphrases for each job/machine.
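A minimal sketch of what that PowerShell wrapper could look like (assumptions flagged in the comments; reading the passphrase from a local file is purely illustrative, in practice it would come from the RMM's per-device secret store):

```powershell
# Illustrative sketch: build the Duplicati backup command with the hostname
# as the remote folder and a per-machine passphrase.
$hostname = $env:COMPUTERNAME

# Placeholder source for the passphrase; swap in your RMM's secret mechanism.
$passphrase = (Get-Content "C:\ProgramData\Backup\passphrase.txt" -Raw).Trim()

& "C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe" backup `
    "b2://Backups/$hostname?auth-username=key&auth-password=password" `
    "C:\Users\" `
    --backup-name=Duplicati.CommandLine `
    --encryption-module=aes --compression-module=zip --dblock-size=50mb `
    --passphrase="$passphrase" `
    --retention-policy="1W:1D,4W:1W,12M:1M" `
    --exclude-files-attributes="system,temporary,hidden" `
    --disable-module=console-password-input --allow-missing-source
```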

Given we are purely pushing this from an RMM, the only way a user could access another user's data is if they captured the script on the fly and noted the details, which is hugely unlikely, but PowerShell will remove this risk if I can get it to work.