Changing source OS

I’m in a similar position. I want to create a backup using my windows PC then export it to a NAS to continue later, but it doesn’t like the source changes.

Still no way to fix this?
or an easy way to change the paths in the database?

OK, so I’ve done some tests with renaming the paths inside the filelist.json file (inside the dlist file).

My initial backup with the Windows machine was from source \\backup\test\, which ends up looking like \\\backup\test\. When moving that over to the NAS, the path translates to /data/backup/test/, which is quite easy to find and replace.

What I found was that I then had to find and replace any subdirectories, which was easy enough by searching for “\” and replacing it with “/”.

This all seems to work great until you want to go the other way…

If you have a backup that is /data/backup/test/ and you want to rename it to \\\backup\test\, that is easy enough. But it’s when you start looking at the subdirectories that it breaks down: you can’t just search for “/” and replace it with “\”, because there are hashes in the file that use the “/” character, and you can end up accidentally renaming part of a hash.
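To illustrate the hazard with a made-up entry (both the path and the hash value below are fictitious), a blind separator swap rewrites the hash along with the path:

```python
# A blind "/" -> "\" replace corrupts more than paths: base64-style
# hashes can legitimately contain "/". Values here are made up.
entry = '{"path": "/data/backup/test/a.txt", "hash": "Qm/9xZ=="}'
mangled = entry.replace("/", "\\")
print(mangled)  # the hash now reads "Qm\9xZ==" - corrupted
```

This is why the replacement has to be limited to the path fields, not applied to the whole file.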

Anyone have any suggestions on fixing this file path? or somehow reindexing?

Let’s make sure I’m understanding the situation correctly.

You run a backup of source files on Windows (meaning all the paths are backslash Windows style).

Then you export the backup job and move it and the source files to a Linux environment, import the backup job and change the source and destination paths (now forward slash based) as appropriate for the new OS.

When you run the moved backup, it complains because it can’t find any files in filelist.json (inside the dlist file), since the source slashes (and root paths) have changed?

Correct, I get the error:

“The backup contains files that belong to another operating system. Proceeding with a backup would cause the database to contain paths from two different operation systems, which is not supported. To proceed without losing remote data, delete all filesets and make sure the --no-auto-compact option is set, then run the backup again to re-use the existing data on the remote store.”

But there is no way to ‘delete the filesets’.

Thanks for the confirmation.

This is speculation on my part (hopefully @kenkendk or somebody else can confirm) but it sounds like you need to purposefully cripple the backup so that a rebuild of data is triggered.

DO NOT TRY THESE until it’s confirmed by somebody else, but my guess is you’d need to do something LIKE:

  1. manually connect to your destination
  2. delete (or better yet, rename) the dlist file
  3. add the --no-auto-compact Advanced parameter to your backup (otherwise I think it will delete your backups, as it will see every backup file as wasted space since it’s not referenced in the dlist file)
  4. run your backup, at which point I expect it will:
    a. realize there is no remote dlist file
    b. scan all your local files to build new contents
    c. upload the new dlist file
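If the destination is a plain folder (e.g. a mounted NAS share), step 2 could be scripted. A sketch, assuming a local-folder destination and Duplicati’s usual `*.dlist.zip` / `*.dlist.zip.aes` file naming:

```python
from pathlib import Path

def rename_dlists(dest: Path) -> list:
    """Rename every dlist file in a local-folder destination to *.bak so
    Duplicati no longer sees it, while keeping a copy to restore later."""
    renamed = []
    for f in sorted(dest.glob("*.dlist.zip*")):  # covers .zip and .zip.aes
        target = f.with_name(f.name + ".bak")
        f.rename(target)
        renamed.append(target)
    return renamed

# Usage (the destination path is a placeholder):
# rename_dlists(Path("/mnt/nas/duplicati-dest"))
```

Renaming instead of deleting means the original dlist files can be put back if the experiment goes wrong.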

If you brought your local Windows sqlite files over to Linux rather than exporting/importing the backup job, then something like this MIGHT be needed to get the paths in the local database structured correctly:

  1. export your backup
  2. delete the backup but DO NOT “Delete remote files”
  3. import the backup
  4. run the backup - at which point it will:
    a. realize it has no local data files
    b. download remote files to re-populate the local data files
    c. run the backup and NOT upload very much data, as it will see the content already exists at the destination

Again, DO NOT do either of these until somebody can confirm either of them are the correct process.

OK, so I’ve tried your first suggestion, but no success…

here were my steps, let me know if I misunderstood.

  1. Made windows backup
  2. Exported backup
  3. Imported into linux
  4. Renamed DLIST files manually on the destination
  5. Added --no-auto-compact parameter
  6. Ran backup - Error no list files present
  7. Attempted to run repair - Error
  8. Attempted to delete and run repair - Recreated
  9. Ran backup - Errored.

So it doesn’t look like this method works.

I’m not quite understanding your second method, are you saying that you need to redownload the whole backup again?

There is a check in the source that prevents you from continuing if you have no remote dlist files. You need to create one, but it can be empty (i.e. no files) and put it in the remote folder to “fool” Duplicati into thinking that it is a valid backup. Then you can run the repair, which will build the local database, but with no files.
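A sketch of creating that “empty” dlist: a zip whose filelist.json is an empty JSON array. The filename pattern follows Duplicati’s usual naming, but the exact timestamp, and whether Duplicati also requires other entries (such as a manifest) inside the zip, are assumptions worth verifying first:

```python
import zipfile

def make_empty_dlist(path: str) -> None:
    """Write a dlist zip containing an empty filelist.json (no files).
    Assumption: filelist.json alone is enough for Duplicati to accept it."""
    with zipfile.ZipFile(path, "w") as z:
        z.writestr("filelist.json", "[]")  # empty JSON array = zero files

# Usage: the timestamp in the name is a placeholder.
# make_empty_dlist("duplicati-20240101T000000Z.dlist.zip")
```

The file then goes into the remote folder before running the repair.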

The next backup should then add your filenames to the new dlist file, and you can use the commandline to DELETE the old/fake/empty version (version 1).

For the manual fixing of the dlist file, you can use something like Python to load the JSON, manipulate it, and then write it out again as JSON:
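A minimal sketch of that approach (the field name "path" and the prefix mapping are assumptions about your filelist.json):

```python
import json

# Assumed prefixes, matching the paths discussed earlier in this thread.
OLD_PREFIX = "/data/backup/test/"
NEW_PREFIX = "\\\\backup\\test\\"  # i.e. \\backup\test\

def convert_path(p: str) -> str:
    """Swap the Linux root for the Windows root, then flip separators."""
    if p.startswith(OLD_PREFIX):
        p = NEW_PREFIX + p[len(OLD_PREFIX):]
    return p.replace("/", "\\")

def convert_filelist(text: str) -> str:
    """Rewrite only "path" fields; hashes containing "/" are untouched."""
    entries = json.loads(text)
    for entry in entries:
        if "path" in entry:
            entry["path"] = convert_path(entry["path"])
    return json.dumps(entries)

# Example with a made-up entry:
sample = '[{"path": "/data/backup/test/docs/a.txt", "hash": "ab/cd=="}]'
print(convert_filelist(sample))
```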

This will allow you to only do the replace on the path fields instead of a simple search-n-replace.

OK, well the fake dlist file sounds like a much easier way to go than making a script. So once you run the repair, it will create a proper dlist file?

No, “repair” will rebuild the local database.

The next backup you run will create the correct dlist file, and re-use the dblock data.

Ah ok! so it won’t actually redo the backup. That’s good to know.