Migrating a backup (problem with "PoC" & how to do it properly)

Important: Don't invest too much time in the error messages; this is primarily about the proper way to do the migration rather than about solving the errors.

Hi,
I’ve searched other topics and the how-tos and asked Copilot, but all indicate that migrating from one target to another should be straightforward.
I need to migrate my current three backup jobs, which all save to Mega, as Mega keeps locking my account on a regular basis, and that’s really annoying.

Just today, I created a new backup config and thought I would try the migration process, as the data currently backed up by it is unimportant.

  • Initially, the backup went to Google Drive and worked.
  • I downloaded the whole folder from there into my Downloads folder.
  • I changed the backup's configuration and ran the backup. No problem; it worked.
  • Then I deleted the files in Google Drive, uploaded the current local backup from my Downloads folder back to Google Drive, and changed the configuration back.
  • After that, I got the error message:
    > Duplicati.Library.Interface.RemoteListVerificationException: Found 24 files that are missing from the remote storage, please run repair
Backup error message details:

Duplicati.Library.Interface.RemoteListVerificationException: Found 24 files that are missing from the remote storage, please run repair
at Duplicati.Library.Main.Operation.FilelistProcessor.VerifyRemoteList(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, IEnumerable`1 protectedFiles)
at Duplicati.Library.Main.Operation.BackupHandler.PreBackupVerify(BackendManager backend, String protectedfile)
at Duplicati.Library.Main.Operation.BackupHandler.<RunAsync>d__20.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at CoCoL.ChannelExtensions.WaitForTaskOrThrow(Task task)
at Duplicati.Library.Main.Controller.<>c__DisplayClass14_0.<Backup>b__0(BackupResults result)
at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)
at Duplicati.Library.Main.Controller.Backup(String[] inputsources, IFilter filter)
at Duplicati.Server.Runner.Run(IRunnerData data, Boolean fromQueue)

As expected and experienced before, the repair does nothing but suggest the use of --rebuild-missing-dblock-files.

Repair error message details:

Duplicati.Library.Interface.UserInformationException: The backup storage destination is missing data files. You can either enable --rebuild-missing-dblock-files or run the purge command to remove these files. The following files are missing: duplicati-b633345ca1ee044b49be8621d71055de9.dblock.zip.aes, duplicati-b2743fa7ed5aa4047b9f3cc19ba3b9835.dblock.zip.aes, duplicati-bb44a093e5ef74d89820884626df1475c.dblock.zip.aes, duplicati-b91c15f08845d425eba63b1b3da7addec.dblock.zip.aes, duplicati-b77304f19f5bc44f2b22888626336fa7d.dblock.zip.aes, duplicati-bfd61547914bf4375817ed5441f82a413.dblock.zip.aes, duplicati-b7fbea02bae8b4be1979fc8e82fec59cf.dblock.zip.aes, duplicati-ba042b1c94b7040fba78cb6e5d43eca87.dblock.zip.aes, duplicati-b86f083df30a74321a7cae1ec239bebbb.dblock.zip.aes, duplicati-b1862906c9a3449f3b7b3c75e97bd1901.dblock.zip.aes, duplicati-bf5d86421581640e9951b22dd9b9066fc.dblock.zip.aes
at Duplicati.Library.Main.Operation.RepairHandler.RunRepairRemote()
at Duplicati.Library.Main.Operation.RepairHandler.Run(IFilter filter)
at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)
at Duplicati.Library.Main.Controller.Repair(IFilter filter)
at Duplicati.Server.Runner.Run(IRunnerData data, Boolean fromQueue)

After trying it a few times, strangely, the number of missing files came down from 24 to 11. Unfortunately, the number doesn't go any lower when performing more repairs/backups.
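To see which of the reported files are actually absent from a manually copied folder, a small cross-check can help. This is a minimal sketch I'm adding for illustration; the function name and paths are made up, and it just matches the duplicati-*.zip.aes names from the error text against a local directory listing:

```python
import re
from pathlib import Path

def find_absent(error_text: str, local_dir: str) -> list[str]:
    """Extract duplicati-*.zip.aes file names from a repair error
    message and return those not present in local_dir."""
    reported = re.findall(r"duplicati-\S+?\.zip\.aes", error_text)
    local = {p.name for p in Path(local_dir).iterdir()}
    return sorted(set(reported) - local)
```

Pasting the repair error text and pointing it at the local copy of the backup would show whether the "missing" files were never copied or whether the remote listing is the problem.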

The question now isn’t how to solve this problem (the data is not important) but how to do the migration properly.
Most likely, the three backups, going back to 2022, are supposed to be migrated to Google Drive.
From what I’ve read, it should be as easy as downloading the data (the .zip.aes files) from Mega, uploading it to Google Drive, and then changing the backup target in the backup job. But my experience doesn’t really match that.
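When the transfer goes through a local folder like this, it may be worth verifying that every file survived the round trip before repointing the job. A hypothetical sketch (function name and paths are my own placeholders) that compares the downloaded copy against the folder staged for upload, by name and size:

```python
from pathlib import Path

def compare_backup_folders(src: str, dst: str) -> dict:
    """Compare file names and sizes between the downloaded backup
    copy (src) and the folder staged for upload (dst)."""
    src_files = {p.name: p.stat().st_size for p in Path(src).iterdir() if p.is_file()}
    dst_files = {p.name: p.stat().st_size for p in Path(dst).iterdir() if p.is_file()}
    return {
        # files present in the source copy but absent from the staging folder
        "missing": sorted(set(src_files) - set(dst_files)),
        # files present in both but with differing sizes (truncated transfer?)
        "size_mismatch": sorted(
            n for n in src_files.keys() & dst_files.keys()
            if src_files[n] != dst_files[n]
        ),
    }
```

An empty result on both keys at least rules out an incomplete download or upload as the cause of the "missing files" error.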

Google Drive is a special case. Duplicati can only see its own files unless you get a full-access AuthID at https://duplicati-oauth-handler.appspot.com/. Google may remove this ability someday.

EDIT:

Enhancing security controls for Google Drive third-party apps was the removal plan. It hasn't been carried out yet.

Safest plan is to only put files on Google Drive with Duplicati, as then they should always work.


> Duplicati can only see its own files unless you get AuthID

Yes, did that. I checked the connection; it worked and was able to create the folder for me. So that's not the problem.

> Safest plan is to only put files on Google Drive with Duplicati, as then they should always work.

Do you mean simply changing the target destination and running a backup, or do I misunderstand you?
Wouldn't that result in Duplicati (correctly) stating that parts of the backup are missing?

EDIT: Tried that; as expected, the result was:
> Found 24 files that are missing from the remote storage, please run repair

So the question is: how do I get the backed-up .zip.aes files from where they currently are (Mega) to somewhere else (Google Drive, in this case)?

I didn’t say it was. That was done with Duplicati, right? If below was not, then that’s the problem:

How was the upload done? Duplicati can only see files it uploads unless you use a full-access login.

As described, most "somewhere else" destinations should just work. Google Drive is a harder special case.

  1. Yes, that seems to be the cause of the problem. No, I didn't use Duplicati; I simply took the files that were stored locally and drag-and-drop uploaded them to Google Drive manually.
  2. That also explains why there already was a "folder A" in Google Drive but Duplicati insisted on creating a new "folder A".
  3. It doesn't really explain why the number of missing files dropped from 24 to 11, but let's disregard that one.

Maybe I didn't look closely enough at your reply on my phone, but I didn't see your link to the page when reading your answer for the first time yesterday. Anyway, I found the same link here:

That method worked, but as both you now and the post back then said, that method could be pulled by Google at any time.

That was mentioned in the other thread as well (see below), but unfortunately that method seems to be a bit above my level. I found another thread where someone had to deal with the same problem.

So, since the full-access granting currently works (and since the Google Drive account is solely for backup purposes, no data apart from Duplicati's will be there, so there's no concern about Duplicati messing with other files), I'll stick with the full-access workaround for now.

Thank you very much

It also exposes another Google Drive oddity, which is that duplicate names are fine.
This is rather unusual for file systems, but in Google Drive the name is only an attribute.

I can’t predict what Google will do. They certainly missed their original cutoff target.
Maybe someday they’ll provide a way to “give” files to some user such as Duplicati.
Their original plan involved a file picker, but I don’t know if it can do a folder of files.