All duplicati files "missing" after NAS to Google Cloud change

So I finally decided to no longer keep my backup locally on a NAS (Windows mounted drive), but rather in the cloud.

So I copied the files from the NAS folder to Google using rclone. This is a 2 GB backup.
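For reference, the copy was done with something like the following sketch. The remote name `gdrive` and both paths are assumptions (placeholders), not the actual ones used:

```shell
# Hypothetical rclone copy from the mounted NAS folder to Google Drive.
# "gdrive" is a placeholder remote configured beforehand via `rclone config`;
# adjust the source and destination paths to your own setup.
rclone copy /mnt/nas/backup/duplicati gdrive:backup/duplicati --progress
```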

When I change the backup config from “local drive” to Google Cloud, everything looks good. I make an AuthID, copy it, test it. The weird thing is that it asks to create the folder, although it is already there (/backup/duplicati). When I look at the drive, I now have two “duplicati” folders under “/backup”. So I moved all the files to the new one and ran verify. It still says that 44 files (that’s how many there are) are missing.

I’m pretty sure I’m missing something permission-related on Google Drive, but I can’t pinpoint it.

(Or is there a better/different way to change backup target?)

Thanks in advance.

Welcome to the forum @xrmb

All duplicati files “missing” after NAS to Google Cloud change

First, what is the exact one you chose? Choices are Google Cloud Storage and Google Drive.

Rclone supports both too, and these are not at all the same thing. Your issue “sounds” like what a
Google Drive user can run into when copying files in. Plus “look at the drive” sounds like a “Drive”.

If it’s Google Drive, what does the “Created” information for a file that you copied in say created that file?

Duplicati OAuth Handler can currently (Google announced they’ll cut it off) give a full access login.

The limited login, where Duplicati can only see files that it (not Rclone) created, is the GUI default.
This is probably the safer one long-term, but you can use full access to poke around and see the effect.

AFAIK you did it just right for everything but Google Drive, which is going to be an awkward copy-in.
You can look to see if Google Drive has a way to change the creator. It didn’t use to, but it needs that.

Duplicati.CommandLine.BackendTool.exe can copy files in as created by Duplicati, so copy with that.
xargs might help. For the destination URL, you can use the URL taken from GUI Export As Command-line.
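A minimal sketch of the copy-in, assuming the backup files sit in the current directory and the destination URL came from Export As Command-line. The URL (including the `authid` value) and the file name patterns are placeholders, not the poster’s actual values:

```shell
# Re-upload each local backup file through Duplicati's backend tool so that
# Google Drive records Duplicati (not rclone) as the file's creator.
# URL is a placeholder; take the real one from GUI Export As Command-line.
URL="googledrive://backup/duplicati?authid=PLACEHOLDER"
for f in duplicati-*.dlist.zip.aes duplicati-*.dblock.zip.aes duplicati-*.dindex.zip.aes; do
  Duplicati.CommandLine.BackendTool.exe put "$URL" "$f"
done
```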

While looking at who created what, you might find that the new folder was created by Duplicati but its contents were created by rclone (however it lists them). If so, Test connection should see the new folder, but verify (which reads a file list, which you can also test with BackendTool) will not see the files.
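You can check what that file list actually returns with the same tool. Again the URL is a placeholder, assuming a Google Drive destination:

```shell
# List the files visible to Duplicati at the destination. With a limited-access
# login, files created by rclone will not appear in this output.
Duplicati.CommandLine.BackendTool.exe list "googledrive://backup/duplicati?authid=PLACEHOLDER"
```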

Another unusual Google Drive behavior. Duplicate names are fine. You can create these in GUI too.
There are rare cases where this confuses Duplicati, but it doesn’t seem to get reported very often…

Looks like you cracked the case.

I used “Google Drive”. The Google Drive website no longer shows who “Created” it; it used to say “Duplicati” or “rclone”.

I don’t have the option for “full access”; the popup only gives me the “limited” one.

So maybe I have been cut off already. I’ll investigate the command-line tools. It’s no big deal copying the 2 GB again, or starting the backup from scratch. I still have the history locally if I ever need it.

I guess what Google is doing here is making the drive more secure and preventing rogue apps.

You probably just clicked AuthID on the Destination screen, and got the default for what you chose. Duplicati OAuth Handler is the link I gave earlier. Click it to see all your options.

Duplicati OAuth Handler

This service is intended for Duplicati users that want to use one of the following OAuth enabled providers:

but of course some won’t be Google Drive at all. Command line users do a manual pick-then-copy.
I suspect you may be doing one too, not using the convenient (but limited) AuthID popup any more.