Changing the Duplicati server (+ OS) without losing the remote data

I have been using Duplicati on a Windows VM for quite some time. Since I now use the VM exclusively for Duplicati, I wanted to switch to running Duplicati in Docker on a Linux platform (Unraid). Of course I want to keep the existing backups, so I exported the config files and imported them into the Duplicati container. When starting the existing backups I now get an error with the following description:

“The backup contains files that belong to another operating system. Proceeding with a backup would cause the database to contain paths from two different operation systems, which is not supported. To proceed without losing remote data, delete all filesets and make sure the --no-auto-compact option is set, then run the backup again to re-use the existing data on the remote store.”

I have read in other posts (some of them many years old) that Duplicati does not handle a change of operating system well.
Has a solution for this problem appeared in the meantime?

EDIT:
I have now carried out another test. If I run Duplicati on the new server not in a Docker container but again in a (completely new) Windows VM, the backup works fine and simply keeps running with the old settings and the old remote data.

So the change of operating system is the main cause.

Welcome to the forum @public1

The topic Changing source OS would be worth reading then. It’s from last month and adds a new technique.
What did you think of the other solutions you saw while reading the older posts? Are any of them workable?

EDIT:

I’m not clear what Duplicati has already and what it now wishes to back up, as opposed to restoring, which is easy.
When one starts dealing with VMs, containers, and hosts, it’s hard to tell what the exact situation is.

EDIT 2:

You also quoted one solution, which you might recognize from some of your reading. Need more?
There are several technically more difficult methods, which involve editing dlist files or databases.
The folder arrangement of the new backup versus the old backup could also influence which way is best…

EDIT 3:

Continuing with the thought of “how different is the new layout?”, you’ll need to tell Duplicati where backup Source files are now. If you didn’t change anything, it would still expect to use drive letters.
This is a somewhat separate topic from what to do with the Destination side, but it’s necessary too.
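
As a rough illustration only (the image name, paths, and port below are guesses at a typical Unraid setup, not necessarily yours), the container’s volume mapping is what decides what the new Source path looks like:

```sh
# Hypothetical Unraid/Docker example - adjust names, paths, and port to your own setup.
docker run -d --name=duplicati \
  -p 8200:8200 \
  -v /mnt/user/appdata/duplicati:/config \
  -v /mnt/user/data:/source \
  linuxserver/duplicati
```

With a mapping like that, a job Source that used to be something like C:\Data would need to be changed to /source (or a subfolder of it) in the imported job, or the new backup has nothing to scan.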

Hello @public1

Duplicati is a backup tool, not a migration tool. The best way to migrate is to use OS tools, that is, restore under the same system that was used to back up, and then migrate the files to the other computer with its different operating system.

The prompt “To proceed without losing remote data, delete all filesets and make sure the --no-auto-compact option is set, then run the backup again to re-use the existing data on the remote store.” does not say that it will allow you to keep your backups, but that it could allow you to keep your data.

The important difference is that the backup history lets you know that a file was deleted or created at some specific date, and lets you restore a given version. Keeping your data is much more limited: the backup will save the files as they are on the date when you run it, and Duplicati’s deduplication will make that faster, that is, it will see that the data already exists on the backend and not send it again. But for a file that is currently deleted on your system, yet existed in the past and still has data allocated on the backend, this procedure gives you no way to restore it. What you will get is ONE (1) fileset after this initial backup. The words ‘without losing data’ are a bit excessive IMO; it depends on what one calls ‘data’.
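
To make the mechanics concrete, here is a rough command line sketch of what that prompt amounts to. It is untested, the storage URL is a placeholder, and --allow-full-removal is my assumption for the ‘delete all filesets’ step, so check duplicati-cli help delete and keep a record of what is on the remote before trying anything:

```sh
# Rough sketch only - placeholder URL, not a tested recipe.
URL="webdavs://example.com/backup?auth-username=user&auth-password=secret"

# 1. See which filesets (versions) currently exist on the remote:
duplicati-cli list "$URL"

# 2. Delete every fileset while keeping the dblock files, by disabling
#    compacting. --allow-full-removal should be needed because Duplicati
#    normally refuses to delete the last remaining fileset. If the 0-9999
#    range is not accepted as "all versions", list the versions explicitly.
duplicati-cli delete "$URL" --version=0-9999 \
  --no-auto-compact=true --allow-full-removal=true

# 3. Run the backup again with the new (Linux) source path; deduplication
#    should reuse the existing blocks instead of re-uploading everything.
duplicati-cli backup "$URL" /source/data --no-auto-compact=true
```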

To keep the backups (the history) would be a much more involved process - and is indeed not supported.

We’re still guessing at the current situation and the goal. Maybe the source is on the host, so no migration is needed.

One clue is in the topic title, “without losing the remote data”, which of course doesn’t get lost by itself.
If it’s desired for its old versions, it can be kept and restored, except you’ll need to tell the restore which folder to put the files in.
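
Roughly like this (placeholder URL and folder, and by default it pulls the newest version, so adjust as needed):

```sh
# Sketch only - restores everything from the old backup into a chosen folder
# on the new machine, instead of the original C:\ locations.
duplicati-cli restore "webdavs://example.com/backup?auth-username=user&auth-password=secret" "*" \
  --restore-path=/mnt/user/restore-test
```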

If, instead, the goal is to avoid a big and possibly slow upload, the quoted solution will work; however,
deleting all old versions might not be desired. One also needs to set a special option to allow it.

Depending on remote storage cost and upload speed, keeping both the old and the new backup may also be an option.

The tricky route, which has been tried but is somewhat experimental, is editing the dlist files or the database…
If you experiment, work with someone and keep copies of the files you change, so that you can go back.