So I finally decided to stop keeping my backup locally on a NAS (a Windows-mounted drive) and move it to the cloud instead.
So I copied the files from the NAS folder to Google Drive using Rclone. It's a 2 GB backup.
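For reference, the copy was done roughly like this (the remote name `gdrive:` and the local path are just examples standing in for whatever was set up in `rclone config`):

```shell
# Copy the local Duplicati backup folder up to Google Drive.
# "gdrive" is an example remote name from `rclone config`;
# the source path is likewise illustrative, not the real one.
rclone copy /mnt/nas/duplicati-backup gdrive:backup/duplicati --progress
```

`rclone copy` only adds or updates files at the destination, so re-running it after a partial transfer is safe.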
When I change the backup config from "local drive" to Google Drive, everything looks good: I create an AuthID, copy it in, and test it. The weird thing is that it asks to create the folder, although it is already there (/backup/duplicati). When I look at the drive, I now have two "duplicati" folders under "/backup". So I moved all the files into the new one and ran verify. It still says that 44 files (that's how many there are) are missing.
I’m pretty sure I’m missing something permission related on google drive, but I can’t pinpoint it.
(Or is there a better/different way to change backup target?)
The limited login, where Duplicati can only see files that it (not Rclone) created, is the GUI default.
This is probably the safer option long-term, but you can use full access to poke around and see the effect.
AFAIK you did everything just right, except for Google Drive, where this is going to be an awkward copy-in.
You can check whether Google Drive has a way to change a file's creator. It didn't use to, but that's what this situation needs.
While looking at who created what, you might find that the new folder was created by Duplicati but its contents were created by Rclone (however Google Drive lists that). If so, Test connection will see the new folder, but verify (which reads a file list; you can also test that with BackendTool) will not see the files.
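You can compare the two views of the destination from a shell. This is a sketch: the remote name, folder path, and AuthID are placeholders, and the BackendTool invocation assumes the stock Duplicati install layout:

```shell
# What Rclone (full-access login) sees at the destination:
rclone lsf gdrive:backup/duplicati

# What Duplicati's backend sees, i.e. the same file list verify reads.
# BackendTool ships with Duplicati; replace AUTHID with the real AuthID.
Duplicati.CommandLine.BackendTool.exe list "googledrive://backup/duplicati?authid=AUTHID"
```

If the first listing shows the 44 files and the second comes back empty, that confirms the limited login cannot see Rclone-created files.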
Another unusual Google Drive behavior: duplicate names are fine. You can create these in the GUI too.
There are rare cases where this confuses Duplicati, but it doesn’t seem to get reported very often…
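If the duplicate folders need cleaning up, Rclone has a command aimed at exactly this Google Drive quirk (again, `gdrive:` is an example remote name):

```shell
# Google Drive allows several entries with the same name in one folder.
# dedupe can merge or rename them; --dry-run just reports what it would do.
rclone dedupe --dry-run gdrive:backup
```

Run it without `--dry-run` once the report looks right; by default it prompts interactively for each duplicate.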
So maybe I have been cut off already. I'll investigate the command-line tools. It's no big deal to copy the 2 GB again, or to start the backup from scratch. I still have the history locally if I ever need it.
I guess what Google is doing here is making the drive more secure and preventing rogue apps.