How to use the same external hard drive for Windows and Linux?

Hello, I’m trying to make my Duplicati backup work on both Windows and Linux. My setup is currently as follows:

  • I have a computer with both Linux and Windows partitions installed on one hard drive.
  • On a secondary hard drive, I have a common folder for documents, images, videos, etcetera.
  • That hard drive is shared by both systems: on Windows, the Pictures, Videos, Documents, Music and Downloads folders are redirected to the corresponding folders on the secondary hard drive; on Linux, the drive is mounted on boot to a fixed mount point, /mnt/windows, and the /home/me/{Pictures,Music,...} folders are symbolic links into it (see the sketch after this list).
  • On the Windows side, Duplicati is already configured to use a third, external hard drive connected through USB, in order to make a backup of several folders, such as the common drive mentioned above, and a few others such as the Steam folder.

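For reference, the Linux side of that arrangement looks roughly like this (a simplified sketch of my actual setup; the UUID and mount options are placeholders):

```bash
# /etc/fstab entry: mount the shared NTFS drive at boot (UUID is a placeholder)
# UUID=XXXX-XXXX  /mnt/windows  ntfs-3g  defaults,uid=1000,gid=1000  0  0

# Replace the default home folders with symlinks into the shared drive
for d in Pictures Music Videos Documents Downloads; do
    rmdir "$HOME/$d"                    # only succeeds if the folder is empty
    ln -s "/mnt/windows/$d" "$HOME/$d"  # point the folder at the shared copy
done
```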
What I’m currently stuck on is the step where I configure the Linux side to use that same external hard drive for backups. Since the folder names are different (although they share the same content), I can’t figure out how to make a common backup without having to create a duplicate backup job. I already tried exporting the backup script, but since it uses the Windows names, I can’t tell whether switching the folder names will break anything in the process – is it possible to do what I’m describing?

I don’t know if what you are trying to do is at all possible, but if it is, I would export the job from Windows as a .json file, not as a script. I would import the .json file on Linux (choose “Add backup” in the main menu, and on the first page of the import, choose “Import from a file”), and then edit the paths in the web UI. I imagine you can also edit some other details, such as the schedule and even the name of the job – which of course you don’t want to change, but it should be possible.
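If you would rather fix the paths before importing, a plain text substitution might do it (a rough sketch; I’m assuming the exported .json simply contains the Windows source paths as strings, with backslashes doubled as JSON requires, so inspect the file first – the file names here are made up):

```bash
# Rewrite a Windows-style source path in the exported job file before
# importing it on Linux. Assumes the path appears as a plain JSON string
# ("C:\\Users\\me\\Pictures"); check your export, the layout may differ.
sed -e 's|C:\\\\Users\\\\me\\\\Pictures|/mnt/windows/Pictures|g' \
    windows-job.json > linux-job.json
```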

I would try the import with a test job, not the actual one. I would run the backup on both hosts at least once, and if they succeed, I would check whether a restore shows me duplicate or single files – or complains about the paths.

But before doing that test, I would wait for a real expert’s answer here.

Oh, crap, forget what I said in my first response. The only way to get it working, if at all, is to re-create the database on Linux from the existing target location (the external drive). For that you don’t need to export and import anything. Instead you launch a restore from the main menu. I can’t remember if there is an option to stop the processing right after the local database has been created, and not proceed to an actual restore.

I sure wish I could start all over again.

After some serious thought, I have to combine some aspects of my previous attempts and add some new ones. For one thing, the job-specific local .sqlite database only contains information about the files and folders already backed up; the job settings that you can export and import as a file are stored in Duplicati-server.sqlite.

So if the local database uses a representation that does not involve paths in OS-specific string form, then you could perhaps place the database in a shared location (whether on the internal or the external disk is up to you). You can make the move in the web UI. Then you export the new settings to a .json file, log in to the other OS, import the settings, and modify the paths in them. In the ideal case, the backup then works on both OSes.

I don’t know if this is actually the case, because I don’t know how NTFS represents paths internally. On Linux, the internal reference is by inode, and I hope NTFS has something similar. If not, the local database cannot be shared.

Even then it might be possible to re-create the database on Linux as I explained earlier. That is very inconvenient, however, because every time you have run the backup on one OS and then want to run it on the other, you have to repair the local database on that other OS.
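If you do end up with that workflow, the command-line repair is one way to rebuild the local database from the files on the external drive (a sketch only; the destination URL, database path and passphrase are placeholders for yours):

```bash
# Rebuild the job's local database from the backup files on the USB drive.
# All values below are placeholders; adjust them to your own setup.
duplicati-cli repair "file:///media/usb-backup/duplicati" \
    --dbpath="$HOME/.config/Duplicati/backup-job.sqlite" \
    --passphrase="your-backup-passphrase"
```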

If you want to try it out, be sure not to use the existing backup; create a separate test job instead. And do not define any schedule for it; just run it manually.

The database keeps Windows paths in Windows format (including the drive letter). Metadata is also different, although finer points (e.g. NTFS ACLs) might not matter. This definitely seems an awkward arrangement.

In terms of requirements, I take it that this is dual-boot, that files are changed from both OSes, and that backups are wanted quickly enough that one OS can’t be declared the backup “owner” while the other waits?

Waiting can be smoothed over somewhat by directly copying files that are newer than Duplicati’s last run, but that needs either a smart copy tool or some helper such as the Linux find command (see the sketch below).
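Something along these lines, perhaps (a rough sketch; the marker file and paths are hypothetical, and you would have to touch the marker after each successful Duplicati run):

```bash
# Hypothetical helper: copy files changed since the last backup run into a
# holding folder on the external drive, preserving their directory structure.
MARKER="$HOME/.last-duplicati-run"
find /mnt/windows/Documents -type f -newer "$MARKER" \
     -exec cp --parents -t /media/usb-backup/pending {} +
touch "$MARKER"  # update the marker once the copy has succeeded
```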

There are also ways to make Linux run more like Windows or vice versa (Windows Subsystem for Linux) however I’m not sure how solid they are. I do see some reports of being able to sort of run mono on WSL.

Just doing redundant backups is not all that bad; in fact, some people consider it best practice, although it usually also involves different backup software and a different destination – where are you if you drop the external drive?

Redundant backups, even to the same destination, can help you if Duplicati ever has issues (it can happen…).

The other challenge is that your backups are not the same. The Windows one has a “few others” as well. Even if you somehow manage to share a database (which I doubt is possible), this would confuse things.

This is what I do too. I have two external disks for backups, one for each laptop. I copy my most precious files from Windows to Ubuntu’s Samba share, and they get backed up from both laptops. I haven’t seen any problem in this duplication of the backups. Yes, I know I should also back up to cloud, but I’m still undecided on which to use.


You might have been able to pull this off if you were dual-booting OSes of the same type (Windows/Windows, Linux/Linux, etc.). In that case, and if your source data had the same paths on both OSes, you could manage it by carefully placing the Duplicati sqlite databases.

But a Linux/Windows dual-boot setup is a showstopper: the Duplicati database uses different path styles for the two operating systems.

What you CAN do is run an independent Duplicati install on each system. You could target the external USB storage as your backup destination on both, BUT you’d have to target different folders – you cannot share the exact same backup destination folder between two or more backup jobs (see the sketch below).
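For example, the two jobs could target sibling folders on the same USB drive (a sketch; the URLs, source paths and options are placeholders):

```bash
# Linux job writes to its own folder on the shared USB drive...
duplicati-cli backup "file:///media/usb-backup/duplicati-linux" \
    /mnt/windows/Documents /mnt/windows/Pictures \
    --passphrase="your-backup-passphrase"

# ...while the Windows job targets a sibling folder, e.g.
#   Duplicati.CommandLine.exe backup "file://E:\duplicati-windows" ...
```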

The downside to this approach is that you’d have to make changes twice – if there is a new folder you want to include in the backups, for instance. Also, your ability to restore would be limited to the versions backed up by the Duplicati instance in the currently running OS. If you wanted to restore something backed up by the other, you’d have to reboot into that OS.