Failed: The remote server returned an error: (403) Forbidden Google Drive

Hi guys,
I’ll add myself to the list of people hitting this error.

I have a personal Google account (not a business one) with 2 TB of storage, of which about 860 GB is used.
I have 8 backups running to Google Drive and only one fails (the biggest one).
Even if I copied all files from scratch, my total backup would be 170 GB, so it doesn’t exceed the 750 GB daily limit.
All of them work except that one.

Knowing that I’m not the only one gives me comfort!

If I run the backup manually, I receive confirmation of success, but the upload finishes in a few minutes at a speed of 314 bytes/s.
The backup that runs daily is 38 GB, so that message cannot be telling the truth.

It seems that you need to compile the application with tracing enabled:

Compile your code with tracing enabled. See How to: Compile Conditionally with Trace and Debug for more information about the compiler switches required to enable tracing.
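For reference, what that boils down to is roughly the following (a minimal sketch for a plain .NET console program, not Duplicati itself): Trace calls only survive when the assembly is compiled with the TRACE constant defined, e.g. csc /define:TRACE, or <DefineConstants>TRACE</DefineConstants> in the project file.

```csharp
// Minimal sketch, not Duplicati code: Trace.* calls are stripped unless the
// assembly is compiled with the TRACE constant defined (csc /define:TRACE,
// or <DefineConstants>TRACE</DefineConstants> in the .csproj).
using System.Diagnostics;

class TraceDemo
{
    static void Main()
    {
        Trace.Listeners.Add(new TextWriterTraceListener("trace.log"));
        Trace.AutoFlush = true;
        Trace.WriteLine("Tracing is enabled in this build.");
    }
}
```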

That’s probably beyond what I can do at the moment. I might just have a look at trying to track down where the HTTPS calls to Google Drive are handled in the code and put a console debug output of the body of any 403 responses.

I have a feeling the problem is API rate limiting. I’m using a new Google account set up specifically to hold backups (as Google One is relatively cheap for 2 TB of storage), and I have a hunch the allowed API rate might be lower than expected. The 403 errors I’m seeing seem to occur around enumerating file lists, which is what leads me to believe it’s an API rate error, not a data volume problem.

I’m seeing 403 Forbidden errors now after uploading only 35 GB of data, so I’m pretty certain the error isn’t related to the volume of data, but potentially to the volume of API calls.

If anyone knows the codebase well enough to point me to the line where HTTP(S) requests are dispatched, I’ll see if I can hack it to log the response body.

You just need to edit an existing config file in a Duplicati folder per How to: Configure network tracing.
It’s a bit more than copy-and-paste-at-the-bottom, because the new content goes inside the original <configuration> element.
This will produce a big file, but it’s a view from the application side of the TLS encryption, which is helpful.
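The shape of what that page describes is roughly this (a sketch based on Microsoft’s documented System.Net tracing example; the <system.diagnostics> section goes inside the existing <configuration> element of the Duplicati .exe.config, and the network.log path is just a placeholder):

```xml
<configuration>
  <system.diagnostics>
    <sources>
      <source name="System.Net" tracemode="includehex" maxdatasize="1024">
        <listeners>
          <add name="System.Net" />
        </listeners>
      </source>
    </sources>
    <switches>
      <add name="System.Net" value="Verbose" />
    </switches>
    <sharedListeners>
      <add name="System.Net"
           type="System.Diagnostics.TextWriterTraceListener"
           initializeData="network.log" />
    </sharedListeners>
    <trace autoflush="true" />
  </system.diagnostics>
</configuration>
```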

I’m not very familiar with this, but I think you want to read HttpWebRequest.GetResponse Method and

https://github.com/duplicati/duplicati/blob/master/Duplicati/Library/Backend/GoogleServices/GoogleCommon.cs has a couple of spots that might wind up in GetResponse() with your error details.
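As a rough illustration of the kind of hack being discussed (a hypothetical sketch with made-up names, not the actual call sites in GoogleCommon.cs), the body of a 403 can be read out of the WebException before rethrowing:

```csharp
// Debugging sketch only: wrap GetResponse() and dump the body of any error
// response (e.g. a 403) to the console before rethrowing.
using System;
using System.IO;
using System.Net;

static class ResponseDebug
{
    public static HttpWebResponse GetResponseWithLogging(HttpWebRequest request)
    {
        try
        {
            return (HttpWebResponse)request.GetResponse();
        }
        catch (WebException ex) when (ex.Response is HttpWebResponse errorResponse)
        {
            using (var reader = new StreamReader(errorResponse.GetResponseStream()))
                Console.WriteLine($"HTTP {(int)errorResponse.StatusCode}: {reader.ReadToEnd()}");
            throw;
        }
    }
}
```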

This is really helpful, I’ll give both those suggestions a go. I’m still getting 403 Forbidden errors, but now at longer intervals, which makes me more confident that the problem is some sort of opaque rate management that Google is applying to Google Drive API calls.


Hi there, first time posting, thanks for this thread and all of the others as I try to make sense of my 403 errors.

My setup:
Duplicati on Ubuntu
Team Drive in Google Drive (educational and unlimited), ~15 TB backed up over the past year.

Same as OP and others: all of a sudden, backups are failing at the (final) verification step (the GET fails).
The backup file uploads go through fine. I waited 24 hours to see if there was some kind of ban, but it’s been 36 hours now, and I uploaded nowhere near 750 GB.
OAuth verification (either limited or full) tests fine.

To test, I had Duplicati create a folder in the Team Drive, and that all worked fine, uploads and everything. BUT verification fails, as do any attempts to restore.

Any suggestions would be appreciated.
I’ve made no changes to the Team Drive or permissions.
I can upload manually to the drive.

Edit: Just saw that when I try to download a Duplicati file manually I get:

There’s no way that I downloaded 10 TB myself, but perhaps other people on other Team Drives associated with the account creator did?

Either way, looks like I’ll continue to wait it out.

Hi folks. I was receiving the exact same error message, which made the automated backups useless. That was the case until I removed my password protection from Duplicati. Since then my unattended backups have been working flawlessly. Try it.

Welcome to the forum @superjacktr

There are various passwords, e.g. the GUI can have one. Are you talking about the one on backup data?


I have a password (i.e. encryption) on my backup data because I don’t trust Google (and whoever can make them hand over data) all that much.
Other people might care less. It’s an interesting finding, but my own finding is that I get the 403s more or less at random.
Thanks for the suggestion; time will tell whether it keeps working for you (and I’m not at all sure how it would be related).

is one of mine where Google Drive 403 retries exhausted my retry limit. I think Duplicati puts up with this pretty well during the main backup, but the compact that might run afterwards doesn’t seem to tolerate any sort of error.

Anybody who has a reliable way of getting the 403 (e.g. from a test backup designed to troubleshoot this) could help the project a lot by writing up reliable steps to reproduce it and filing them as an issue to be chased.

I’ve sometimes wondered if the 403 error is tied in with an OAuth 2 access token that Google has expired.
Access Token Lifetime gives the concepts. How we get along with OAuth says how Duplicati handles this.

When I first saw this response, I thought it was to a currently active topic which was seeing the 403 errors.

Sorry I should have been explicit. Yes it is the User Interface password under the settings.

This sounds even stranger, then… I don’t use one, but I still get intermittent 403 errors from Google Drive.
I had one for a short time. I guess I can set one again and see whether it makes the 403 errors worse, to help debug.

One known problem with the user interface password is that browsers used to autofill it into the wrong places.
This typically showed up on Restore, though, because the GUI password got put into the encryption password field.

2.0.5.112_canary_2021-01-20 put the fix in Canary, and it should be in the 2.0.6.x Beta releases.
It’s possible that there are some other fields to make autofill-proof, if that turns out to be the issue.

Fixed password manager autofill, thanks @drwtsn32x

Very similar issue here.
I have four backups and one of them suddenly stopped working. For me it doesn’t make any difference whether it’s started automatically or manually. All four backups use the same AuthID and back up to the same Google Team Drive. I already renewed the AuthID (even with full access), but it didn’t help.

The backup that’s failing is by far the biggest (~800 GB, currently at 500 GB uploaded, as it hasn’t had a complete run yet). The second biggest is 85 GB.

One thing I noticed is that the “check connection” function takes about 15 minutes to tell me that the check was successful, while before, when I initially set the backup up, it took about 5 seconds. It takes that long for all backups, not just the failing one.

I just set the number of retries to 20. I’ll check if that helps.

Log:

System.Net.WebException: The remote server returned an error: (403) Forbidden.
  at System.Net.HttpWebRequest.GetResponseFromData (System.Net.WebResponseStream stream, System.Threading.CancellationToken cancellationToken) [0x00146] in <9c6e2cb7ddd8473fa420642ddcf7ce48>:0 
  at System.Net.HttpWebRequest.RunWithTimeoutWorker[T] (System.Threading.Tasks.Task`1[TResult] workerTask, System.Int32 timeout, System.Action abort, System.Func`1[TResult] aborted, System.Threading.CancellationTokenSource cts) [0x000f8] in <9c6e2cb7ddd8473fa420642ddcf7ce48>:0 
  at Duplicati.Library.Main.BackendManager.List () [0x00049] in <e60bc008dd1b454d861cfacbdd3760b9>:0 
  at Duplicati.Library.Main.Operation.FilelistProcessor.RemoteListAnalysis (Duplicati.Library.Main.BackendManager backend, Duplicati.Library.Main.Options options, Duplicati.Library.Main.Database.LocalDatabase database, Duplicati.Library.Main.IBackendWriter log, System.Collections.Generic.IEnumerable`1[T] protectedFiles) [0x0000d] in <e60bc008dd1b454d861cfacbdd3760b9>:0 
  at Duplicati.Library.Main.Operation.FilelistProcessor.VerifyRemoteList (Duplicati.Library.Main.BackendManager backend, Duplicati.Library.Main.Options options, Duplicati.Library.Main.Database.LocalDatabase database, Duplicati.Library.Main.IBackendWriter log, System.Collections.Generic.IEnumerable`1[T] protectedFiles) [0x00000] in <e60bc008dd1b454d861cfacbdd3760b9>:0 
  at Duplicati.Library.Main.Operation.BackupHandler.PreBackupVerify (Duplicati.Library.Main.BackendManager backend, System.String protectedfile) [0x0011d] in <e60bc008dd1b454d861cfacbdd3760b9>:0 
  at Duplicati.Library.Main.Operation.BackupHandler.RunAsync (System.String[] sources, Duplicati.Library.Utility.IFilter filter, System.Threading.CancellationToken token) [0x01048] in <e60bc008dd1b454d861cfacbdd3760b9>:0 
  at CoCoL.ChannelExtensions.WaitForTaskOrThrow (System.Threading.Tasks.Task task) [0x00050] in <9a758ff4db6c48d6b3d4d0e5c2adf6d1>:0 
  at Duplicati.Library.Main.Operation.BackupHandler.Run (System.String[] sources, Duplicati.Library.Utility.IFilter filter, System.Threading.CancellationToken token) [0x00009] in <e60bc008dd1b454d861cfacbdd3760b9>:0 
  at Duplicati.Library.Main.Controller+<>c__DisplayClass14_0.<Backup>b__0 (Duplicati.Library.Main.BackupResults result) [0x0004b] in <e60bc008dd1b454d861cfacbdd3760b9>:0 
  at Duplicati.Library.Main.Controller.RunAction[T] (T result, System.String[]& paths, Duplicati.Library.Utility.IFilter& filter, System.Action`1[T] method) [0x0026f] in <e60bc008dd1b454d861cfacbdd3760b9>:0 
  at Duplicati.Library.Main.Controller.Backup (System.String[] inputsources, Duplicati.Library.Utility.IFilter filter) [0x00074] in <e60bc008dd1b454d861cfacbdd3760b9>:0 
  at Duplicati.Server.Runner.Run (Duplicati.Server.Runner+IRunnerData data, System.Boolean fromQueue) [0x00349] in <156011ea63b34859b4073abdbf0b1573>:0

I’m running Duplicati - 2.0.6.3_beta_2021-06-17 on Ubuntu Server 20.04

I haven’t personally had one never come back, but it has sometimes exceeded the retries I chose…

Implement exponential backoff for backend errors #4661 was recently done by a volunteer. Thanks!
Scheduled Backup Failure: Google Drive: (403) Forbidden was cited, and Google suggests backoff.
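For anyone curious what backoff amounts to, here is a minimal sketch of the idea (not the actual #4661 implementation): each retry waits roughly the base delay times two to the power of the attempt number, with a little jitter, up to a cap.

```csharp
// Minimal sketch of exponential backoff with jitter; not the #4661 code.
using System;
using System.Threading;

static class Backoff
{
    static readonly Random Jitter = new Random();

    // attempt: 0 for the first retry, 1 for the second, and so on.
    public static void Wait(int attempt, TimeSpan baseDelay, TimeSpan maxDelay)
    {
        var millis = Math.Min(
            baseDelay.TotalMilliseconds * Math.Pow(2, attempt),
            maxDelay.TotalMilliseconds);
        millis += Jitter.Next(0, 250); // small random jitter to avoid synchronized retries
        Thread.Sleep(TimeSpan.FromMilliseconds(millis));
    }
}
```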

If you have a fast system and upload link, it’s possible your 800 GB pushed past the 750 GB daily limit.
That would mean even 20 retries won’t get past it, and you’ll have to wait for Google to unlock things…

That’s unlikely. My internet at home isn’t fast enough to upload 750 GB in one day. It might be that my Google account is also used on another server with more upload speed, but I’m pretty sure it isn’t. Also, my other three backups to the same Google Drive have never had a problem.

I suppose it’s possible for Google to have localized problems that might explain why your systems are behaving differently. What appears to you as a unified folder structure is probably spread all over their data centers.

If it doesn’t come back by itself shortly, trying to troubleshoot a bit lower is possible but somewhat complex.

Export As Command-line can get you a URL to use with Duplicati.CommandLine.BackendTool.exe to see whether a simple operation like list or get fares any better or gives any further information about the 403.
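Roughly like this, as a hedged example (the install path and URL are placeholders; take the real googledrive://… URL, including its authid, from the export):

```
# Placeholders only: adjust the install path, and paste the exported URL.
mono /usr/lib/duplicati/Duplicati.CommandLine.BackendTool.exe list "googledrive://backup-folder?authid=..."
# To fetch a single remote file by name:
mono /usr/lib/duplicati/Duplicati.CommandLine.BackendTool.exe get "googledrive://backup-folder?authid=..." <remote-file-name>
```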

Being on Ubuntu, you can’t use .NET Framework network tracing, which is hard but gives the best visibility.

Looks like setting the number of retries to 20 helped. The backup started the next time I tried.


BTW, I am also using Google Drive as a backend (currently backing up 4.4 TB), and I also faced this error last December during my initial runs. For me, using 2.0.6.3_beta_2021-06-17, the following combination was already sufficient:
--number-of-retries 10
--retry-delay 20

Once there is a stable beta that includes the backoff mechanism instead of a constant --retry-delay, switching to it should make such a high --number-of-retries unnecessary.
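For anyone wanting to try the same settings, here is a hedged example of how they might be passed on a command-line run (binary name, URL, and source path are placeholders for a typical Linux install; in the GUI they go under the backup’s advanced options):

```
duplicati-cli backup "googledrive://backup-folder?authid=..." /home/user/data \
    --number-of-retries=10 --retry-delay=20s
```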


I have just enhanced error reporting for Google Drive errors by unpacking and displaying the HTTP response. I was receiving 403 errors, and the detail behind them was “This file has been identified as malware or spam and cannot be downloaded”. It appears that Google checks files before allowing them to be downloaded. To bypass this error, the Google Drive API has a query string parameter called “acknowledgeAbuse” that must be set if a file has been flagged as spam/malware (the flag can’t be set generally; it can only be set for files flagged as spam/malware). I have modified the code to add this flag when that error is detected, but only if a new Google Drive Duplicati option called “acknowledge-abuse” is also enabled.
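To illustrate the Drive API mechanism involved (a sketch only, using the v3 download URL; not the actual patch or Duplicati’s internal helpers):

```csharp
// Sketch only: add Google Drive's documented acknowledgeAbuse flag to a
// download URL, but only when the user has opted in and the previous 403
// body reported the file as malware/spam.
using System;

static class DriveDownloadUrl
{
    public static string Build(string fileId, string lastErrorBody, bool acknowledgeAbuseAllowed)
    {
        var url = $"https://www.googleapis.com/drive/v3/files/{Uri.EscapeDataString(fileId)}?alt=media";
        bool flaggedAsAbusive = lastErrorBody != null
            && lastErrorBody.Contains("identified as malware or spam");
        if (acknowledgeAbuseAllowed && flaggedAsAbusive)
            url += "&acknowledgeAbuse=true";
        return url;
    }
}
```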

The code is complete… if I haven’t fixed the issue above for everyone, then at least the improved error reporting will get them closer.

Where do I find detail on how to put this code into an early release branch?


This sounds like the better display of returned errors (beyond the status code) that I’ve long been wanting. Although you’re after Google Drive errors (and 403 has annoyed me too), is it a general HTTP helper?

It’s done using a GitHub pull request, but I don’t do them myself, so I can mostly refer you to public information.
How to join in the development of Duplicati?
Proposing changes to your work with pull requests (GitHub’s documentation, but other info is around)

Maybe someone who is more familiar with this will stop by to help, but that’s the basics of how it goes.
After the pull request gets merged into the master branch by somebody, the next step is a Canary release.

These tend to happen when enough changes build up or an emergency happens (like fixing Dropbox).
Ideally they would happen more often, but there is a need for someone to volunteer to do the releases.
Duplicati seeks volunteers in all areas such as forum, manual, test, and most importantly development.

Regardless, thanks for the code!

EDIT:

Actually, my 403 errors were on uploads, so possibly that’s a different problem. Regardless, any extra information that can be displayed would be helpful (and hopefully not too long or containing private info).

It’s definitely a better display of errors, as it returns the error information that Google Drive responds with, so you have some hope of correcting the issue. It’s not a general HTTP helper, as it’s applied slightly differently in different circumstances. Also, once errors have been surfaced, developers will be able to code mitigations for them, as I did when Google Drive told me that a particular file contains spam or malware.

Thank you for the information on how to get my changes back into Duplicati. Reading the docs, they’d like me to fork the repository; I can then comment, commit, and push my changes to the fork for someone to include in a Canary release when they get to it. FYI, a Canary release was created yesterday; hopefully another will be created soon after I push my changes.

Before I went to bed last night, I thought I should probably extend my work across all web methods. I have now been through all the web methods of the Google services and added the improved logging. That should give you the information you need the next time you get an error. Once you have an error message, we can work together to determine the cause and fix the code if needed.

That was kind of an odd one: noted as a “preliminary build”, unsigned, with just a few files, and no autoupdate (probably the best plan given the other omissions). The long version of its backstory is here.

Plans for next release? was another contributor (thank you both) hoping for a Canary soon. The current problem is that the release manager has been unavailable, but there’s an opening for a volunteer.
You saw the style of a typical release note. Beyond that, some amount of Git expertise is required.

So yes, I’m hoping we can keep releases flowing, but the future of them is kind of murky right now.
I see your pull request (thanks!). Maybe someone can get it into master before next “real” Canary.