I noticed the tray icon still appeared, so I clicked it, and a new Duplicati tab opened in Firefox, which took me straight to the GUI home screen and didn't complain about invalid authentication. So I used that tab going forward and closed the original one, which had prompted me for a password.
I selected the options for restoring to my new laptop's home directory, and it ran fine for the most part, except that at the end I got a variety of warnings and errors (see log: duplicati_403.zip). Two things stand out:
Failed to apply metadata
Failed to patch with remote file and Failed to retrieve file due to 403 errors
The former is a warning; I'm not too concerned about it, but it would still be nice if it could be fixed. The latter is an error, and it's what really concerns me.
I’ve come across these threads related to my issue:
The first link suggests permission issues, but I'm using my own personal Google Drive and I can't find a place to change permissions. Google Drive already lists me as the owner of the files that failed to be retrieved.
The second link suggests several possibilities:
Some folks had luck restoring manually, as opposed to automatically. But I'm getting these 403 errors despite starting the restore manually from the GUI, using the latest stable Duplicati version.
There was something in the thread about a 750 GB limit imposed by Google. That seems to be an upload limitation. My backups are fine; it's the restore that is problematic. In any case, my backup is much smaller than 750 GB:
I just tried restoring the same way once again, and the exact same files produced the 403 error (see second log: duplicati_403_part2.zip (2.9 KB)).
This is interesting. If it were simply a matter of intermittent networking issues or Google throttling the rate at which it services API requests (as some postulated in the second link), then it seems a very big coincidence that the exact same files would produce the 403 error…
Is there something wrong with these particular files that Duplicati couldn't retrieve?
Although I didn't compare the entire list of failed files between the two runs (and it's not in the log either, due to the limit of 20 error entries), the fact that the first three I looked at were the same suggests something (unfortunately) other than a random timing result. Do you see anything the failing files have in common? Can you successfully download them in any way? One Duplicati test is the BackendTool with a get request; you can get a suitable URL from Export As Command-line.
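As a sketch (BACKUP_PATH, the authid, and the file name are placeholders; substitute the real URL from Export As Command-line and one of the failing file names from your log):

duplicati-backend-tool get "googledrive://BACKUP_PATH?authid=YOUR_AUTHID" duplicati-bXXXXXXXX.dblock.zip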
If that works, then the question is why it doesn't work in a restore. If it fails, can the Drive web UI give you the file?
Google Drive reportedly can throttle with a 403 on retrieval, but the second test suggests it's not that.
Google Drive by default only gives Duplicati access to files it uploaded; other files are usually invisible to Duplicati, so unless you have set a Duplicati option like no-backend-verification, every backup should complain about files it thinks should be there but that the file listing somehow can't see.
If you want to try broader access (until Google takes it away, as threatened for years), you can use the OAuth Server Cloud service to get a Google Drive (full access) login, but it shouldn't behave differently if all the files were uploaded by Duplicati. The Google web UI no longer displays which application created a file; it did before. An alternative test is to do a list in the BackendTool: AFAIK, if the file shows up there, you're not hitting the Drive “invisible file” problem.
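Roughly like this, using the same placeholder URL as before (list takes no file argument):

duplicati-backend-tool list "googledrive://BACKUP_PATH?authid=YOUR_AUTHID"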
Thanks for your help. The failed files seem to have the following in common: when I search for one of them in Google Drive, the file shows up AS WELL AS a second file which doesn't have the same name. Is Google Drive hallucinating, or is there really a connection between the two files? Example:
I could download both of the above files from the Google Drive web UI (after Google asked me to confirm my intentions via a pop-up, since the files were too big to scan for viruses). Is it possible Duplicati is getting hung up because it can't programmatically interact with that pop-up?
I tried using the backend tool to get the file and got the 403 error. I ran:

duplicati-backend-tool get "googledrive://BACKUP_PATH?authid=BLAH_BLAH_BLAH" duplicati-b535832ef508c4c4e997f62b71825ff85.dblock.zip
and got:
Command failed: The remote server returned an error: (403) Forbidden.
System.Net.WebException: The remote server returned an error: (403) Forbidden.
at Duplicati.Library.Utility.AsyncHttpRequest.AsyncWrapper.GetResponseOrStream()
at Duplicati.Library.Utility.AsyncHttpRequest.GetResponse()
at Duplicati.Library.Backend.GoogleDrive.GoogleDrive.GetAsync(String remotename, Stream stream, CancellationToken cancelToken)
at Duplicati.Library.Backend.GoogleDrive.GoogleDrive.GetAsync(String remotename, String filename, CancellationToken cancelToken)
at Duplicati.Library.Utility.Utility.Await(Task task)
at Duplicati.CommandLine.BackendTool.Program.Main(String[] _args)
I tried listing and the file is found. I ran:

duplicati-backend-tool list "googledrive://BACKUP_PATH?authid=BLAH_BLAH_BLAH" | grep duplicati-b535832ef508c4c4e997f62b71825ff85.dblock.zip
and got the output:

duplicati-b535832ef508c4c4e997f62b71825ff85.dblock.zip File 2/8/2025 7:13:08 PM 49.232 MB
It might be hallucinating, but it looks like you can steady it by not searching file contents.
Putting title: in front of the file name returned reasonable results based on the file name alone.
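For example, a Drive web UI search like this (using one of your failing file names as an illustration) matched only on the name:

title:duplicati-b535832ef508c4c4e997f62b71825ff85.dblock.zip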
Originally I thought you were surprised by duplicate names, which Google Drive supports.
What pop-up? After the initial auth issue was solved, I see no mention of a pop-up during restore.
Google search finds some mentions of “This app is blocked”, dating back many years.
When exactly did yours say that? I know they’re wanting to kill off the full-access login.
That argues against it being a login-type issue, so I'm not sure why they give 403 on access.
I think there’s a more detailed error at the protocol level that Duplicati doesn’t show us.
The linked “403 errors” topic has an example of what we could be missing that might reveal their reason.
Maybe the developer will have some other ideas. Anyway, it's good to see the file exists…
If need be, the Google Drive files can be moved to some other storage that works with them.
I suppose, as a long shot, you could see if setting read-write-timeout longer (or to 0) helps this. Symptoms of the too-short default timeout have been different so far, but maybe this is a first…
You could also pick a random (but not known-bad) file from the set and try the BackendTool get. Seeing it work for some files but not for others would prove you did it right, but that something stops specific files from downloading through Duplicati. For a full list of errors, you can set log-file=<path> together with log-file-log-level=error; that avoids the 20-entry truncation limit in the default log.
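As a rough sketch combining the timeout idea above (placeholders as in your get test; both are regular Duplicati advanced options, and 0 disables the timeout):

duplicati-cli restore "googledrive://BACKUP_PATH?authid=YOUR_AUTHID" --read-write-timeout=0 --log-file=/tmp/duplicati-restore.log --log-file-log-level=error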
Do you ever use rclone? If so, I wonder if it can get all the files? Duplicati can also use rclone as a backend.
EDIT 1:
rclone copyto can do individual files. That would be an easier place to start any access tests.
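For example, assuming you've configured an rclone remote named gdrive (the remote name is hypothetical) that points at the backup folder:

rclone copyto gdrive:BACKUP_PATH/duplicati-b535832ef508c4c4e997f62b71825ff85.dblock.zip ./duplicati-b535832ef508c4c4e997f62b71825ff85.dblock.zip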
EDIT 2:
If you can get it to fail, rclone's logging has a lot of levels. Maybe some will reveal what Google dislikes.
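For instance, -vv gives debug-level output and --dump headers shows the HTTP requests and responses (both are standard rclone flags):

rclone -vv --dump headers copyto gdrive:BACKUP_PATH/duplicati-b535832ef508c4c4e997f62b71825ff85.dblock.zip ./test-copy.dblock.zip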
The problem is that Google has implemented this in a way that does not work well for third-party tools. The logic is that only files Duplicati has created are accessible to Duplicati, which is exactly what we want: Duplicati cannot use your documents, pictures, or spreadsheets anyway, so it is better to shield them from access.
However, some things can cause the permission to be lost. I don’t know exactly what causes this, but many users have reported that files are suddenly inaccessible.
The problematic part is that Google provides no way to change the permissions, so you can never grant Duplicati access to files once the permission is lost. I have spent a long time trying to communicate this problem to Google, but they have other priorities.
Sadly, my attempts to fix things have caused Google to retract the original permissions, so it is no longer possible to obtain full access. Attempting to resolve this with Google has so far only resulted in an endless loop of the same email responses.
If you do not have space to store the remote data locally, you can set up your own OAuth server, which will let you grant full access to up to 100 accounts in “Test mode”; that should be enough for your personal needs. It is quite a hassle, but I have documented the steps here:
Maybe we should investigate if rclone has found a workaround for the permission issue?
Do you recall if they’re still visible? These are.
The usual behavior is someone copies files in without Duplicati, then Duplicati can’t see them.
Or they first created a Destination folder manually, and then Duplicati created a second one…
The latter is because Google Drive allows duplicate names. That's unusual and can surprise people.
They are still visible, and they still work fine, Duplicati is just denied access to them.
You can trigger this by de-authorizing Duplicati from your account and then re-adding it. This process fully drops all permissions, giving Duplicati a 403 for all files. I guess renaming a file back and forth, or moving it out and back, may also trigger the permission loss for a single file, but I cannot imagine anyone doing that on purpose.
Yes, that will certainly break things. I assumed the OP had not touched the files.
I just remembered another, perhaps easier, way to fix things:
1. Download the files that you cannot access with Duplicati from your Google Drive.
2. Delete them from your Google Drive.
3. Use the duplicati-backend-tool to upload each file.
For step (3), copy your destination URL; let's say it is googledrive://folder1/folder2?authid=abc.
Then invoke the duplicati-backend-tool like this:
duplicati-backend-tool \
PUT googledrive://folder1/folder2?authid=abc \
./b535832ef508c4c4e997f62b71825ff85.dblock.zip
This will re-upload the file and, in the process, set the permissions correctly. Repeat for all the files that give a 403.
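If many files are affected, a small shell loop saves typing; failed-files.txt here is a hypothetical list with one file name per line:

while read -r f; do
  duplicati-backend-tool PUT "googledrive://folder1/folder2?authid=abc" "./$f"
done < failed-files.txt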
Thanks for your help @ts678 and @kenkendk. After saving a local copy of one of the problematic zips and deleting the version on Google Drive, I tried to re-upload:
duplicati-backend-tool PUT "googledrive://BACKUP_PATH?authid=BLAH_BLAH_BLAH" duplicati-b535832ef508c4c4e997f62b71825ff85.dblock.zip
I ran that command from the folder into which I downloaded the problematic zip. But then I got this error:
Command failed: Cannot access a disposed object.
Object name: 'System.Net.HttpWebResponse'.
System.ObjectDisposedException: Cannot access a disposed object.
Object name: 'System.Net.HttpWebResponse'.
at System.Net.HttpWebResponse.get_StatusCode()
at Duplicati.Library.Backend.GoogleServices.GoogleCommon.ChunkedUploadAsync[T](OAuthHelper oauth, String uploaduri, Stream stream, CancellationToken cancelToken)
at Duplicati.Library.Backend.GoogleServices.GoogleCommon.ChunkedUploadWithResumeAsync[TRequest,TResponse](OAuthHelper oauth, TRequest requestdata, String url, Stream stream, CancellationToken cancelToken, String method)
at Duplicati.Library.Backend.GoogleDrive.GoogleDrive.PutAsync(String remotename, Stream stream, CancellationToken cancelToken)
at Duplicati.Library.Backend.GoogleDrive.GoogleDrive.PutAsync(String remotename, String filename, CancellationToken cancelToken)
at Duplicati.Library.Utility.Utility.Await(Task task)
at Duplicati.CommandLine.BackendTool.Program.Main(String[] _args)
Not sure why this would happen. The upload code used by that tool is exactly the same code used by the regular Duplicati binaries.
Is it perhaps a weird error message because the file exists somehow?
Could you try renaming it to something different, like myfile-1234.dblock.zip, and then uploading it?
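Roughly like this, reusing your URL and file name from before:

mv duplicati-b535832ef508c4c4e997f62b71825ff85.dblock.zip myfile-1234.dblock.zip
duplicati-backend-tool PUT "googledrive://BACKUP_PATH?authid=BLAH_BLAH_BLAH" myfile-1234.dblock.zip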
I don't have an explanation for it. The code used to upload from the backend tool is the same code that Duplicati's backup uses, so if one works, the other should too.
Our continuous integration tests run on each new commit to ensure the Google Drive integration is working as expected, and it has not reported the error you see, so I do not have any guesses as to what is causing this.
Is it an option to try this on another machine? If you have access to a Windows machine, for instance, the error message may at least be different there.
I would also suggest trying with Duplicati 2.0.8.1:
However, on Linux/macOS it requires Mono to be installed, which is quite intrusive and not well maintained at the moment. If you are able to use Docker, that might be an option as well.
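A minimal sketch of the Docker route, assuming the official duplicati/duplicati image has a tag for that release (the tag name below is an assumption; check Docker Hub for the exact one, and 8200 is the web UI port):

docker run --rm -p 8200:8200 duplicati/duplicati:2.0.8.1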
Thanks. Out of curiosity, why do you suggest Duplicati 2.0.8.1? I'm currently running Duplicati 2.1.0.5_stable_2025-03-04, which is very recent. Is the older version you suggested supposed to be more stable than the latest one?