Backing up to multiple USB or external drives?

I am running into problems when multiple USB drives or external drives are being switched out. It gives errors, and I am assuming the indexes cannot be found because multiple devices are being used… The client would like to take local backups off site.


Hello and welcome to the forum!

It sounds like you have a client who is backing up a single set of data to more than one USB device (I assume so they can keep one local and one offsite), is that correct?

@pdphillips Duplicati doesn’t provide functionality for removable media. You could get it to work with some switches that would disable the local database, etc., but that would only be a workaround. Duplicati is designed to back up offsite to a storage medium that doesn’t change.

JonMikeIV: Yes, that is the case…

samw: Bummer, was hoping to use it for multiple media libraries as well…

Well, technically Duplicati works fine with removable media when it’s connected but it can get a bit chatty about “errors” when the media is not connected. Though this can be abated a bit by adjusting --send-mail-level settings.

I have a Mac that has daily backups scheduled to two destinations - a remote “always on” destination and a local USB drive that is only connected once a week. The daily USB backup complains about the missing destination with every run, but the backup itself runs fine when the drive is connected.

If I cared enough to add a --run-script-before task to check whether or not the destination was connected at the time of the run then I could get rid of that error too.
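For anyone wanting to try that, here’s a minimal sketch of such a pre-check. The mount-point path is hypothetical, and the exit-code convention (0 = proceed, 1 = skip the run quietly) is the one described in Duplicati’s bundled run-script example, so double-check it against your version before relying on it.

```shell
#!/bin/sh
# Sketch of a pre-backup check for use with --run-script-before-required.
# drive_attached succeeds (exit status 0) when the given mount point exists.
drive_attached() {
    [ -d "$1" ]
}

# In the real script you would end with something like the line below,
# using your drive's actual mount point (the path here is hypothetical):
#
#   drive_attached "/Volumes/BackupDrive" || exit 1
```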

Duplicati can work in this scenario, but you’ll need two different jobs (pointing to the same source) to make it happen. The reason for this is that Duplicati keeps a local database of what’s been backed up to the destination so if you do a week’s worth of backups to USB drive A then swap in USB drive B Duplicati will complain that a bunch of destination files are missing (which is what you mentioned in the original post).

If you’re working on Windows then you also have the potential “Windows drive letters” issue to deal with, since Windows can change the drive letter of the USB drive when it’s attached if the previously used drive letter is already in use.

To get around this you can use the --alternate-destination-marker parameter to designate a “magic filename” that Duplicati will use to identify the correct target drive. So you could use something like --alternate-destination-marker=Duplicati-Backup-Drive-A.txt.

You then set --alternate-target-paths to be drive-variable. So if you were placing your backups in a G:\Duplicati-Backups folder you could set something like --alternate-target-paths=*:\Duplicati-Backups.

This setup would tell Duplicati to look through every drive until it finds the Duplicati-Backup-Drive-A.txt file, then run the backup into the \Duplicati-Backups folder of that drive.

If you set up each backup job with a different --alternate-destination-marker file, then each job will only back up to its “designated” destination drive.
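To make the mechanism concrete, here’s a rough Python sketch of the scan this implies: try each candidate folder and pick the first one containing the marker file. The folder layout and marker filename are just the examples from above, not anything taken from Duplicati’s actual code.

```python
import os
import tempfile

def find_marked_target(candidate_paths, marker_name):
    """Return the first candidate folder containing the marker file, or None.

    Rough sketch of what the --alternate-target-paths /
    --alternate-destination-marker pair does: expand the wildcard into
    candidate folders, then pick the one holding the marker file.
    """
    for path in candidate_paths:
        if os.path.isfile(os.path.join(path, marker_name)):
            return path
    return None

# Demo with temporary folders standing in for drive letters E:, F:, G:
root = tempfile.mkdtemp()
drives = [os.path.join(root, d, "Duplicati-Backups") for d in ("E", "F", "G")]
for d in drives:
    os.makedirs(d)

# Only the "F" folder carries the marker, so it is the one selected.
open(os.path.join(drives[1], "Duplicati-Backup-Drive-A.txt"), "w").close()

selected = find_marked_target(drives, "Duplicati-Backup-Drive-A.txt")
print(selected)  # the "F" folder, the only one containing the marker file
```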

@JonMikelV Excellent clarification Jon. Another way I worked around an issue like this was basically the following:

  • Manually changed the assigned drive letters of my portable drives in Windows to X: and Y:. This way Windows will remember the drives by their GUIDs.

  • Created two backup jobs with the same sources and the two different destinations

  • Changed the block size to 1MiB and hash to MD5 to avoid local database bloat (Optional)

  • Ran the two jobs ad hoc to create the initial backups

  • Scheduled the two jobs using the Windows Task Scheduler, which allowed me the flexibility to schedule the backups on alternate weeks
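As a sketch of that scheduling step: with schtasks, /SC WEEKLY /MOD 2 runs a task every second week, and giving the second task a start date one week later makes the two alternate. The task names, install path, and the elided backup command line are hypothetical; export the actual command from your Duplicati job first.

```shell
:: Run the "drive A" job every second Saturday at 22:00.
schtasks /Create /TN "Duplicati USB A" /SC WEEKLY /MOD 2 /D SAT /ST 22:00 ^
  /TR "\"C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe\" backup ..."

:: Same for the "drive B" job, but with a start date (/SD) one week later
:: so the two jobs run on alternating weeks. Note that the /SD date format
:: follows the system locale.
schtasks /Create /TN "Duplicati USB B" /SC WEEKLY /MOD 2 /D SAT /ST 22:00 ^
  /SD 01/13/2024 ^
  /TR "\"C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe\" backup ..."
```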

Basically worked like a charm, but eventually I just built a cheap remote NAS box and started backing up using SFTP. I seeded my backed-up data first and then only ran the changes over the internet to avoid waiting for weeks for the first job to finish.

Nice plan when using Windows Scheduled Tasks. :slight_smile:

Out of curiosity, what did you use for your remote NAS and would you be interested in making a #howto for CrashPlan orphans that are finding cloud options too pricey for remote storage?

@JonMikelV I use a FreeNAS box that I just set up in the office and do my home backups directly to it. Not really sure whether a how-to guide is required, as my setup is really simple: I just point my job to the external IP of my FTP server and supply the login credentials. I really like the product and was thinking of creating YouTube videos showing how to install and configure it, etc. But I suspect they may already be around?

FreeNAS? Yes, I think there are a few. :wink:

Of course that doesn’t mean that you couldn’t do better… :smiley:

Thanks for sharing!

7 posts were split to a new topic: “process cannot access the file” error on Quickbooks files

I created two backup jobs too.
Same source and two different destinations.
I use two HDs on alternate weeks (on Monday of the first week I connect HD “HD10211” to the NAS and leave it until Saturday of the same week, then I disconnect it. On Monday of the second week I connect a different HD, which I renamed “HD10211”, until Saturday of that second week. On the third Monday, I connect the first HD10211 to the NAS again). I start Duplicati in the first week from the default account and in the second week from another account (Guest), so each job points to only one drive.
Unfortunately this error appears in the third week, when I connect the first HD again.

Hey @MQ1,

I’m not sure why you are running the jobs from different accounts, or what renaming the drives means. Do you mean changing the volume label? If so, Duplicati doesn’t check volume labels at all. What are you trying to achieve?


I renamed both external drives (the job destinations) to the same name, HD10211, and I use the drives on alternate weeks.
Duplicati checks both drives, but the error that I sent you appears and Duplicati doesn’t back up the files from the source.
Can you help me?
Kind regards.

Readers here probably lack history, but I think this is the continuation of a somewhat-misplaced discussion where I advised against using the same paths on two drives, as it opens the way to oddly confusing errors. Multiple accounts were also described there, although I still don’t fully understand why that’s an advantage.

Looking closely at the errors (which might be in the post just below the red box) would turn up a file with a date, letting one know which files are extra. They might even be leftovers never deleted from prior drive uses; however, errors should then have shown up in the first-week job on the first-week drive. If that drive was put away exactly as intended, and never touched by the second-week job, I don’t see why it would suddenly start giving this error. If the drives got confused, though, and this was actually the second-week drive, I would expect exactly such errors about extra files.

Close checking of file dates on the drives against backup dates in the two jobs might help find the problem.