Failed: The remote server returned an error: (403) Forbidden Google Drive

Apologies for the delayed response, but I have been away from home for the last couple of days (which has actually proved something, I think…)

OS - Win 10_1909
Dup - 2.0.5.1_beta_2020-01-18
Google Drive - Google Cloud – G Suite

Yes, I did have to do a start-from-scratch rebuild of my machine, but thanks to the backups on Duplicati I didn't lose a bit of data.

Once I restored my data I rebuilt the backup jobs in Duplicati, but didn't run them for a couple of days, until I was certain all was well with my machine.

Two backups are set up with different AuthIDs, and both pass Test Connection (one runs at 2am and the other at 3:45am).

The error from the 2am job is as follows:

System.Net.WebException: The remote server returned an error: (403) Forbidden.

at Duplicati.Library.Main.BackendManager.List()
at Duplicati.Library.Main.Operation.FilelistProcessor.RemoteListAnalysis(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, String protectedfile)
at Duplicati.Library.Main.Operation.FilelistProcessor.VerifyRemoteList(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, String protectedfile)
at Duplicati.Library.Main.Operation.BackupHandler.PreBackupVerify(BackendManager backend, String protectedfile)
at Duplicati.Library.Main.Operation.BackupHandler.d__20.MoveNext()
— End of stack trace from previous location where exception was thrown —
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at CoCoL.ChannelExtensions.WaitForTaskOrThrow(Task task)
at Duplicati.Library.Main.Controller.<>c__DisplayClass14_0.b__0(BackupResults result)
at Duplicati.Library.Main.Controller.RunAction[T](T result, String& paths, IFilter& filter, Action`1 method)
at Duplicati.Library.Main.Controller.Backup(String inputsources, IFilter filter)
at Duplicati.Server.Runner.Run(IRunnerData data, Boolean fromQueue)

No error from the other one. Backup worked.

As previously said, the job worked when run manually (though it gave some "no metadata" warnings; more on that later) and backed up OK.

So I left the thing alone while I was away and, guess what, it backed up successfully (since 2am on 7 Oct 20), automatically, with no changes or intervention from me! It has worked ever since.

That suggests it was an issue on Google's side?

Now back to the 'Warning' issues I had. The warning was of the form:
[Warning-Duplicati.Library.Main.Operation.Backup.MetadataGenerator.Metadata-MetadataProcessFailed]: Failed to process metadata for xx file name - no metadata present.

The issue was that, for some strange reason, I was not the owner of the file. Find the file, right-click > Properties > Security; if it says you aren't the owner, change yourself to the owner, or add yourself to a group that has access (via the Advanced button at the bottom of the Security tab, from memory). A scripted version of the same fix is sketched below.
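For anyone who prefers to script that fix, here is a minimal sketch in .NET of the same take-ownership change (not Duplicati code; it assumes an elevated process, and the path comes from the warning message):

using System;
using System.IO;
using System.Security.AccessControl;
using System.Security.Principal;

class TakeOwnership
{
    static void Main(string[] args)
    {
        string path = args[0]; // the file named in the metadata warning

        // SID of the account running this program
        SecurityIdentifier me = WindowsIdentity.GetCurrent().User;

        // Make ourselves the owner (needs admin / take-ownership privilege)
        FileSecurity security = File.GetAccessControl(path);
        security.SetOwner(me);

        // Also grant ourselves full control so Duplicati can read the metadata
        security.AddAccessRule(new FileSystemAccessRule(
            me, FileSystemRights.FullControl, AccessControlType.Allow));

        File.SetAccessControl(path, security);
        Console.WriteLine("Took ownership of " + path);
    }
}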

Hope that helps someone!

lcsneil

I don't really understand what finally solved the issue… Did it just coincidentally start working again? I get the same error as you all the time, but, as I mentioned, only for one backup job.
I saw that I was running an older version of Duplicati, so I immediately updated to Duplicati 2.0.5.1_beta_2020-01-18 and set the malfunctioning job up from scratch. Unfortunately, the situation is the same as before.

A little update from my side: after the update to Duplicati 2.0.5.1_beta_2020-01-18 changed nothing, I upgraded my system to Win 10 2004, and now the backup job is working again. So the combination of updating Duplicati and Windows did the trick for me… Thank you guys for your support.

A few weeks ago, my backup started showing this error.

If I run the backup manually it completes successfully, and the file listing also completes successfully.

The scheduled backup started failing daily. I have two backups configured; one runs and the other does not.

Can someone give me a tip on how to solve this?

I am having the same issue as others. If I run it manually it works every time. The scheduled run fails with the below:

System.Net.WebException: The remote server returned an error: (403) Forbidden.
at Duplicati.Library.Main.BackendManager.List()
at Duplicati.Library.Main.Operation.FilelistProcessor.RemoteListAnalysis(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, String protectedfile)
at Duplicati.Library.Main.Operation.FilelistProcessor.VerifyRemoteList(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, String protectedfile)
at Duplicati.Library.Main.Operation.BackupHandler.PreBackupVerify(BackendManager backend, String protectedfile)
at Duplicati.Library.Main.Operation.BackupHandler.d__20.MoveNext()
— End of stack trace from previous location where exception was thrown —
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at CoCoL.ChannelExtensions.WaitForTaskOrThrow(Task task)
at Duplicati.Library.Main.Controller.<>c__DisplayClass14_0.b__0(BackupResults result)
at Duplicati.Library.Main.Controller.RunAction[T](T result, String& paths, IFilter& filter, Action`1 method)
at Duplicati.Library.Main.Controller.Backup(String inputsources, IFilter filter)
at Duplicati.Server.Runner.Run(IRunnerData data, Boolean fromQueue)

If I regenerate the AuthID it will run the schedule once successfully and fail every time after that. I am running version 2.0.5.1_beta_2020-01-18 on Windows Server 2019 Version 1809.

Same problem on OSX with 2.0.5.1_beta_2020-01-18

System.Net.WebException: The remote server returned an error: (403) Forbidden.
at System.Net.HttpWebRequest.GetResponseFromData (System.Net.WebResponseStream stream, System.Threading.CancellationToken cancellationToken) [0x00146] in <4605a50878ff458c98a8b66722729be6>:0
at System.Net.HttpWebRequest.RunWithTimeoutWorker[T] (System.Threading.Tasks.Task`1[TResult] workerTask, System.Int32 timeout, System.Action abort, System.Func`1[TResult] aborted, System.Threading.CancellationTokenSource cts) [0x000f8] in <4605a50878ff458c98a8b66722729be6>:0
at Duplicati.Library.Main.AsyncDownloader+AsyncDownloaderEnumerator+AsyncDownloadedFile.get_TempFile () [0x00008] in <8f1de655bd1240739a78684d845cecc8>:0
at Duplicati.Library.Main.Operation.CompactHandler.DoCompact (Duplicati.Library.Main.Database.LocalDeleteDatabase db, System.Boolean hasVerifiedBackend, System.Data.IDbTransaction& transaction, Duplicati.Library.Main.BackendManager sharedBackend) [0x00264] in <8f1de655bd1240739a78684d845cecc8>:0
at Duplicati.Library.Main.Operation.DeleteHandler.DoRun (Duplicati.Library.Main.Database.LocalDeleteDatabase db, System.Data.IDbTransaction& transaction, System.Boolean hasVerifiedBacked, System.Boolean forceCompact, Duplicati.Library.Main.BackendManager sharedManager) [0x00397] in <8f1de655bd1240739a78684d845cecc8>:0
at Duplicati.Library.Main.Operation.BackupHandler.CompactIfRequired (Duplicati.Library.Main.BackendManager backend, System.Int64 lastVolumeSize) [0x000a5] in <8f1de655bd1240739a78684d845cecc8>:0
at Duplicati.Library.Main.Operation.BackupHandler.RunAsync (System.String sources, Duplicati.Library.Utility.IFilter filter, System.Threading.CancellationToken token) [0x01033] in <8f1de655bd1240739a78684d845cecc8>:0
at CoCoL.ChannelExtensions.WaitForTaskOrThrow (System.Threading.Tasks.Task task) [0x00050] in <9a758ff4db6c48d6b3d4d0e5c2adf6d1>:0
at Duplicati.Library.Main.Operation.BackupHandler.Run (System.String sources, Duplicati.Library.Utility.IFilter filter, System.Threading.CancellationToken token) [0x00009] in <8f1de655bd1240739a78684d845cecc8>:0
at Duplicati.Library.Main.Controller+<>c__DisplayClass14_0.b__0 (Duplicati.Library.Main.BackupResults result) [0x0004b] in <8f1de655bd1240739a78684d845cecc8>:0
at Duplicati.Library.Main.Controller.RunAction[T] (T result, System.String& paths, Duplicati.Library.Utility.IFilter& filter, System.Action`1[T] method) [0x0026f] in <8f1de655bd1240739a78684d845cecc8>:0
at Duplicati.Library.Main.Controller.Backup (System.String inputsources, Duplicati.Library.Utility.IFilter filter) [0x00074] in <8f1de655bd1240739a78684d845cecc8>:0
at Duplicati.Server.Runner.Run (Duplicati.Server.Runner+IRunnerData data, System.Boolean fromQueue) [0x00349] in :0

I’m having the same issues with a completely new setup. I’m guessing it has to do with the 750 GB per day limit that Google Drive seems to have on uploads, because my Google Drive now has 750 GB of 2 TB filled (although I did have ~15 GB filled before starting the backup…).


Will wait and see tomorrow how far the upload has come. (https://www.reddit.com/r/DataHoarder/comments/8khwve/google_drive_750gb_daily_upload_quota_what_time/)

All backups have now succeeded, taking up a total of 1.1 TB, so I’m quite certain that the 750 GB limit was the issue.

Time to test recovery!

A good test would be either a DR-type recovery to a different system or, if restoring on the same one, setting no-local-blocks; otherwise you might recover the whole backup from source file blocks (which would, admittedly, avoid the 750 GB issue but wouldn't really test the remote data). An example command is sketched below.
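For example, something like this (a sketch only: the storage URL, AuthID, and paths are made up, but restore, --restore-path, and --no-local-blocks are real Duplicati options):

Duplicati.CommandLine.exe restore "googledrive://backup-folder?authid=XXXX" "*" --restore-path="C:\RestoreTest" --no-local-blocks=true

That forces every restored block to come down from Google Drive instead of being rebuilt from the local source files.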

So out of the blue this has started happening again to my unattended backups. It started about 7-8 days ago: I would get a 403 Forbidden when backing up to Google Drive automatically around 3am-5am. I even left it a couple of days to see if the 750 GB limit was the issue, but no.
However, if I run the backup manually, it works.

So the sequence in question is this one
[Data !]
Last successful backup: Today at 08:21 (took 00:14:22) [Note this was a manual one]

Next scheduled run: Tomorrow at 03:00
Source: 254.81 GB
Backup: 167.29 GB / 3 Versions

The error it throws is:
System.Net.WebException: The remote server returned an error: (403) Forbidden.
at Duplicati.Library.Main.BackendManager.List()
at Duplicati.Library.Main.Operation.FilelistProcessor.RemoteListAnalysis(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, String protectedfile)
at Duplicati.Library.Main.Operation.FilelistProcessor.VerifyRemoteList(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, String protectedfile)
at Duplicati.Library.Main.Operation.BackupHandler.PreBackupVerify(BackendManager backend, String protectedfile)
at Duplicati.Library.Main.Operation.BackupHandler.d__20.MoveNext()
— End of stack trace from previous location where exception was thrown —
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at CoCoL.ChannelExtensions.WaitForTaskOrThrow(Task task)
at Duplicati.Library.Main.Controller.<>c__DisplayClass14_0.b__0(BackupResults result)
at Duplicati.Library.Main.Controller.RunAction[T](T result, String& paths, IFilter& filter, Action`1 method)
at Duplicati.Library.Main.Controller.Backup(String inputsources, IFilter filter)
at Duplicati.Server.Runner.Run(IRunnerData data, Boolean fromQueue)

Any ideas?

lcsneil


I’m also experiencing 403 Forbidden responses in the logs of a backup job, and so far only 98GB has been uploaded. Like most people I’m seeing

(Inner Exception #0) System.Net.WebException: The remote server returned an error: (403) Forbidden.

Is there a way of capturing the actual body of the 403 response? Google Drive provides more information within the JSON structured body of the response. For example:

{
  "error": {
    "errors": [
      {
        "domain": "usageLimits",
        "reason": "userRateLimitExceeded",
        "message": "User Rate Limit Exceeded"
      }
    ],
    "code": 403,
    "message": "User Rate Limit Exceeded"
  }
}

That would help me debug the true cause of the problem.

Network Tracing in the .NET Framework might satisfy “any”, but I haven’t used it in a long time.
If you try, be careful what you post because there might be some sensitive protocol or payload.

There's not much logging at the low levels. I don't know if it's a privacy worry, or if it just wasn't done.
Maybe some generic code could be added to help the many HTTP backends, but what about the rest?

Hi guys,
I'll add myself to this error thread as well.

I have a personal Google account, not business, with 2 TB of storage space, of which about 860 GB is occupied.
I have 8 backups running to Drive and only one fails (the biggest one).
Even assuming I copied all the files from scratch, my total backup would be 170 GB, so it doesn't exceed the 750 GB daily limit.
All of them work except that one.

Knowing that I’m not the only one gives me comfort!

If I run the backup manually, I receive confirmation of success, but the upload completes in a few minutes at a speed of 314 bytes/s.
The backup I upload daily is 38 GB, so the success message cannot be telling the truth.

It seems that you need to compile the application with tracing enabled:

Compile your code with tracing enabled. See How to: Compile Conditionally with Trace and Debug for more information about the compiler switches required to enable tracing.

That's probably beyond what I can do at the moment. I might just have a look at tracking down where the HTTPS calls to Google Drive are handled in the code and adding console debug output of the body of any 403 responses.

I have a feeling the problem is API rate limiting. I'm using a new Google account set up specifically to hold backups (as Google One is relatively cheap for 2 TB of storage), and I have a hunch the allowed API rate might be lower than expected. The 403 errors I'm seeing seem to occur around enumerating file lists, which leads me to believe it's an API rate error, not a data volume problem.

I’m seeing 403 Forbidden errors now after uploading only 35GB of data, so I’m pretty certain the error isn’t related to volume of data, but potentially volume of API calls.
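For what it's worth, Google's API docs say to handle userRateLimitExceeded 403s by retrying with exponential backoff, along these lines (a generic sketch, not Duplicati's code, which has its own retry settings such as number-of-retries):

using System;
using System.Net;
using System.Threading;

static class Backoff
{
    // Retry a request on 403, sleeping 2^attempt seconds plus jitter between tries.
    public static T WithBackoff<T>(Func<T> doRequest, int maxAttempts = 5)
    {
        var rng = new Random();
        for (int attempt = 0; ; attempt++)
        {
            try
            {
                return doRequest();
            }
            catch (WebException ex) when (ex.Response is HttpWebResponse r
                                          && (int)r.StatusCode == 403
                                          && attempt < maxAttempts - 1)
            {
                Thread.Sleep(TimeSpan.FromSeconds(Math.Pow(2, attempt))
                             + TimeSpan.FromMilliseconds(rng.Next(1000)));
            }
        }
    }
}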

If anyone knows the codebase well enough to point me to the line where HTTP(S) requests are dispatched, I’ll see if I can hack it to log the body response.

You just need to edit an existing config file in a Duplicati folder per How to: Configure network tracing.
It's a bit more than copy-and-paste at the bottom, because the new content goes inside the original <configuration> element.
This will make a big file, but it’s a view on the application side of the TLS encryption, which is helpful.
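The relevant section, trimmed from Microsoft's example, looks like this (merge it inside the <configuration> element already present in the .exe.config; the log file name is arbitrary):

<configuration>
  <system.diagnostics>
    <sources>
      <source name="System.Net" tracemode="includehex" maxdatasize="1024">
        <listeners>
          <add name="System.Net" />
        </listeners>
      </source>
    </sources>
    <switches>
      <add name="System.Net" value="Verbose" />
    </switches>
    <sharedListeners>
      <add name="System.Net"
           type="System.Diagnostics.TextWriterTraceListener"
           initializeData="network.log" />
    </sharedListeners>
    <trace autoflush="true" />
  </system.diagnostics>
</configuration>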

I’m not very familiar with this, but I think you want to read HttpWebRequest.GetResponse Method and

https://github.com/duplicati/duplicati/blob/master/Duplicati/Library/Backend/GoogleServices/GoogleCommon.cs has a couple of spots that might wind up in GetResponse() with your error details.
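If you do end up hacking in that console output, the shape of the change is simple; a sketch with placeholder names (not the actual GoogleCommon.cs code):

using System;
using System.IO;
using System.Net;

static class DebugHttp
{
    // Wrap a GetResponse() call site: on a 403, dump the JSON error body
    // Google attaches to the response, then rethrow so normal handling runs.
    public static WebResponse GetResponseLogging403(HttpWebRequest request)
    {
        try
        {
            return request.GetResponse();
        }
        catch (WebException ex) when (ex.Response is HttpWebResponse err
                                      && (int)err.StatusCode == 403)
        {
            using (var reader = new StreamReader(err.GetResponseStream()))
                Console.Error.WriteLine("403 body: " + reader.ReadToEnd());
            throw;
        }
    }
}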

This is really helpful, I’ll give both those suggestions a go. I’m still getting 403 Forbidden errors, but now at longer intervals, which makes me more confident that the problem is some sort of opaque rate management that Google is applying to Google Drive API calls.


Hi there, first time posting, thanks for this thread and all of the others as I try to make sense of my 403 errors.

My setup:
Duplicati on Ubuntu
Team Drive in Google Drive (educational and unlimited), ~15 TB backed up over the past year.

Same as the OP and others: all of a sudden, backups are failing on the (final) verification part (the GET fails).
The backup file uploads go through fine. I waited 24 hours to see if there was some kind of ban, but it's been 36 hours and I uploaded nowhere near 750 GB.
OAuth verification (either limited or full) tests fine.

To test, I had Duplicati create a folder in the Team Drive, and that all worked fine, uploads and everything. BUT verification fails, as does any attempt to restore.

Any suggestions would be appreciated.
I’ve made no changes to the Team Drive or permissions.
I can upload manually to the drive.

Edit: Just saw that when I try to download a Duplicati file manually I get:

There's no way that I downloaded 10 TB, but perhaps other people on other Team Drives associated with the account creator did?

Either way, looks like I’ll continue to wait it out.

Hi folks. I was receiving the exact same error message, and it made the automated backups useless. That was the case until I removed my password protection from Duplicati. Since then my unattended backups have been working flawlessly. Try it.

Welcome to the forum @superjacktr

There are various passwords, e.g. the GUI can have one. Are you talking about the one on backup data?

[screenshot: the backup job's encryption passphrase settings]

I have this because I don’t trust Google (and whoever can make them hand over their data) all that much.
Other people might care less. It’s an interesting finding, but my finding is I get the 403s kind of at random.
Thanks for the suggestion, and time will tell whether it keeps working for you (and I’m not at all sure how).

One of my own topics is a case where Google Drive 403 retries exhausted my retry limit. I think Duplicati puts up with this pretty well in the main backup, but the compact that might run later doesn't seem to tolerate any sort of error.

Anybody who has a reliable way of getting the 403 (e.g. from a test backup designed to troubleshoot this) could help the project a lot by writing up reliable steps to reproduce, and filing it as an Issue to be chased.

I’ve sometimes wondered if the 403 error is tied in with an OAuth 2 access token that Google has expired.
Access Token Lifetime gives the concepts. How we get along with OAuth says how Duplicati handles this.

When I first saw this response, I thought it was to a currently active topic which was seeing the 403 errors.