Scheduled Backup Failure: Google Drive: (403) Forbidden

I’ve searched the forum, and while there are other topics regarding Google Drive backup failures, I think my issue is different: Manual (run now) Google Drive backups work, but scheduled backups fail.

I have two scheduled backups: local and cloud. The local one is fine. No issues. Of seven scheduled cloud backup attempts, only one has been successful, which is weird because manually starting the cloud backup always works.

Welcome to the forum @jbrockerville

Failed: The remote server returned an error: (403) Forbidden Google Drive
Google One / Google Drive > Error 403 when scheduled but ok when run manually?
has some things that worked (or didn’t) for other people, but for the most part this is mysterious.
It also seems to be Google Drive specific, and mine still gets random 403s, sometimes too many in a row to get past the retries.

Are the local and cloud systems similar, with an installed Duplicati going to the same Google Drive account?

When I said I have two backups, local and cloud, I meant the destinations are local and cloud. Both sources are local files.

  • “Local”
    • Local-set-of-files backed up to a separate local HDD
  • “Cloud”
    • Local-set-of-files backed up to Google Drive

The scheduled “cloud” backup always fails. But I can run it manually. Very strange.

Thanks for clarifying. I hope something in the forum topics I cited will lead to a cure or some understanding.

Nope. Seems random. Please point to a specific post for the solution.

Sep 28 - Success
Sep 29 - Failure
Sep 30 - Failure
Oct 1 - Failure
Oct 2 - Success
Oct 3 - Success
Oct 4 - Failure
Oct 5 - Failure

All failures are the same:

System.Net.WebException: The remote server returned an error: (403) Forbidden.
   at Duplicati.Library.Main.BackendManager.List()
   at Duplicati.Library.Main.Operation.FilelistProcessor.RemoteListAnalysis(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, IEnumerable`1 protectedFiles)
   at Duplicati.Library.Main.Operation.FilelistProcessor.VerifyRemoteList(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, IEnumerable`1 protectedFiles)
   at Duplicati.Library.Main.Operation.BackupHandler.PreBackupVerify(BackendManager backend, String protectedfile)
   at Duplicati.Library.Main.Operation.BackupHandler.<RunAsync>d__20.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at CoCoL.ChannelExtensions.WaitForTaskOrThrow(Task task)
   at Duplicati.Library.Main.Controller.<>c__DisplayClass14_0.<Backup>b__0(BackupResults result)
   at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)
   at Duplicati.Library.Main.Controller.Backup(String[] inputsources, IFilter filter)
   at Duplicati.Server.Runner.Run(IRunnerData data, Boolean fromQueue)

I have my backup going to Google Drive set to the default number-of-retries now. The fatal-failure mail told me this:

Failed: The remote server returned an error: (403) Forbidden.
Details: System.Net.WebException: The remote server returned an error: (403) Forbidden.
   at Duplicati.Library.Main.BackendManager.List()
   at Duplicati.Library.Main.Operation.FilelistProcessor.RemoteListAnalysis(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, IEnumerable`1 protectedFiles)
   at Duplicati.Library.Main.Operation.FilelistProcessor.VerifyRemoteList(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, IEnumerable`1 protectedFiles)
   at Duplicati.Library.Main.Operation.BackupHandler.PreBackupVerify(BackendManager backend, String protectedfile)
   at Duplicati.Library.Main.Operation.BackupHandler.<RunAsync>d__20.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at CoCoL.ChannelExtensions.WaitForTaskOrThrow(Task task)
   at Duplicati.Library.Main.Controller.<>c__DisplayClass14_0.<Backup>b__0(BackupResults result)
   at Duplicati.Library.Main.Controller.RunAction[T](T result, String[]& paths, IFilter& filter, Action`1 method)

Log data:
2021-10-04 07:01:50 -04 - [Error-Duplicati.Library.Main.Operation.BackupHandler-FatalError]: Fatal error
System.Net.WebException: The remote server returned an error: (403) Forbidden.
   at Duplicati.Library.Main.BackendManager.List()
   at Duplicati.Library.Main.Operation.FilelistProcessor.RemoteListAnalysis(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, IEnumerable`1 protectedFiles)
   at Duplicati.Library.Main.Operation.FilelistProcessor.VerifyRemoteList(BackendManager backend, Options options, LocalDatabase database, IBackendWriter log, IEnumerable`1 protectedFiles)
   at Duplicati.Library.Main.Operation.BackupHandler.PreBackupVerify(BackendManager backend, String protectedfile)
   at Duplicati.Library.Main.Operation.BackupHandler.<RunAsync>d__20.MoveNext()

You can try raising your retries as a workaround. I kept mine low so I can look for any odd breakage when retries run out. I don’t know why Google Drive does this, but it did it for a while, long enough to get past all 5 of its retries.

This backup is hourly, and it takes a lot of backups before I have one fail, even on a low retry count. Yours does this more often. There has been talk of trying to gather additional information from failure responses (sometimes, I guess, there’s more detail), but until a developer can do that (developer volunteers are few), Network Tracing in the .NET Framework is one way to get a clear-text view of everything. That includes private information, so be careful about what you post if you’re willing to give this a try to see what more Google said.

After a failure and some more minutes waiting, does it then run? For me, Google eventually gets over it…

I enabled debug logging and set retries to 5. Thanks for that suggestion.

What do you mean by this? Like a scheduled run fails and then a manual run? Manual always succeeds. We’ll see about the auto retries later on. :slight_smile:

5 is the default. If it’s not enough, you can try more. You could also lengthen retry-delay if you like.

Yes, that’s going to make it hard. Probably the easiest way to test the 403 timing without going manual is increasing the above values. A very smart run-script-before would be a lot of work for a similar effect.

If the problem is long-lasting (we don’t know until the settings changes above are tried and results seen),
Resolve errors shows some reasons why a 403 is returned. You can see that some additional details were possibly sent to say more about the problem, but I don’t think Duplicati can currently show them.
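For reference, when Google does include details, a Drive API 403 body is JSON carrying a machine-readable reason code alongside the message. A minimal sketch of pulling that reason out (the sample body below is hypothetical, modeled on the Resolve errors page, not something Duplicati logged):

```python
import json

def extract_403_reason(body: str) -> str:
    """Pull the first machine-readable 'reason' out of a Drive API error body."""
    data = json.loads(body)
    errors = data.get("error", {}).get("errors", [])
    return errors[0].get("reason", "unknown") if errors else "unknown"

# Hypothetical example body, shaped like the "Resolve errors" documentation shows.
sample_body = """
{
  "error": {
    "errors": [
      {
        "domain": "usageLimits",
        "reason": "userRateLimitExceeded",
        "message": "User Rate Limit Exceeded"
      }
    ],
    "code": 403,
    "message": "User Rate Limit Exceeded"
  }
}
"""

print(extract_403_reason(sample_body))  # userRateLimitExceeded
```

If network tracing ever captures a failing response, a reason string like this is exactly the kind of clue to look for near the 403.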

The hope with network tracing was that the resulting text log would let you find a 403 and some details.
To avoid a huge tracing log, the problem must happen often and soon, almost like what you are seeing.

Not having a lot of luck. Only about 1 in 5 scheduled attempts succeeds. I’m not referring to retries. If a scheduled backup fails, it fails all 5 retries. I upped the retry count to 10 and added a delay of 10s. We’ll see if that does anything.

Your wording isn’t clear to me. Is network tracing in? Can I enable it somehow to tease out other information besides the 403 code? Or did you mean you need a dev to add in the tracing module to gather more info on this sort of problem?

Please click on the prior link (below again)

This is some general information. You can probably skim most of it on the way to the important step:

Configure network tracing describes merging those configuration lines into the existing configuration file for whatever you want traced. For example, Duplicati.CommandLine.BackendTool.exe.config already exists. Make a backup copy, then edit the original file: you’ll see <configuration> at the top and </configuration> at the bottom, and you add in the rest of the lines from the linked page between them (without repeating those two lines). I’ve done it both near the top and near the bottom. I also change the output file to a full path that I like. When the program runs, it writes a (possibly very large, depending on what’s run) log file. There should be a 403 in there. Any clues nearby?
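For reference, the lines being merged look roughly like this (adapted from Microsoft’s Network Tracing configuration page; the exact page content is authoritative), placed between <configuration> and </configuration>:

```xml
<system.diagnostics>
  <sources>
    <source name="System.Net" tracemode="includehex" maxdatasize="1024">
      <listeners>
        <add name="System.Net" />
      </listeners>
    </source>
  </sources>
  <switches>
    <add name="System.Net" value="Verbose" />
  </switches>
  <sharedListeners>
    <!-- initializeData is the output file; I change it to a full path
         of my choosing (the path below is just a placeholder example) -->
    <add name="System.Net"
         type="System.Diagnostics.TextWriterTraceListener"
         initializeData="C:\temp\network.log" />
  </sharedListeners>
  <trace autoflush="true" />
</system.diagnostics>
```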

The format is hard to read, but I think this is the 200 (a normal status, unlike 403) I got from a list request:

System.Net Information: 0 : [14712] Connection#49044892 - Received status line: Version=1.1, StatusCode=200, StatusDescription=OK.

You’re getting a 403 on a list, so it’s a similar test. Maybe it will fail for you and let us see some details.

EDIT:

Use GUI job Export As Command-line to get an appropriate Target URL to pass to the command line tools.

Hmm… I added the network tracing config to Duplicati.CommandLine.BackendTool.exe.config as you suggested, but I saw no more network entries in the log than before. However, when I did the export, its output was:

"C:\Program Files\Duplicati 2\Duplicati.CommandLine.exe" backup "googledrive://Duplicati?authid=###" "include_folders" --backup-name="Irreplaceable (Cloud)" --dbpath="C:\Users\brock\AppData\Local\Duplicati\EZPSKDYYOJ.sqlite" --encryption-module= --compression-module=zip --dblock-size=64MB --retention-policy="1W:1D,4W:1W,12M:1M" --no-encryption=true --debug-output=true --number-of-retries=10 --debug-retry-errors=true --log-file="C:\Duplicati.log" --log-file-log-level=Verbose --retry-delay=10s --exclude-files-attributes="hidden,system" --disable-module=console-password-input --exclude="exclude_folders"

Perhaps the network tracing config should go in Duplicati.CommandLine.exe.config instead? Is there any particular command line flag I need to set in conjunction with the network tracing config?

Interestingly though, the retry delay might have helped. Instead of failing all the retries, it actually succeeded on the 5th retry last night. I don’t know what it is about 4am that’s so sketchy for this. I’ve tried setting the next scheduled run to be like 15 mins from “now” and it always succeeds. A manual run always succeeds. Ugh.

means changing the (non-Duplicati) log line that Microsoft wrote as initializeData="network.log", although leaving it as a relative path might mean you put a new network.log file in the Duplicati folder.
Extra files there break updates (which validate their folder contents), so it’s good to log using a full path.

It would be nice if the error were either a solid fail or an almost-always-works (which are also difficult).
The failure here is solid enough that it would be nice if we can catch what’s coming back when it fails.

This, on the other hand, makes me worry whether any command line tool will be able to cause it to fail.
Duplicati.CommandLine.BackendTester.exe is another one to try with or without the network tracing on.
The destination URL needs to be modified to point to an empty folder.

Maybe. Whatever will fail (or at least retry) is a potential place to enable the network tracing to look at it.
The smaller the output file, the easier it may be. If you get huge ones, EditPad Lite can open such files.

Do you actually have this working? I cannot get network trace logs to work. Enabling Network Tracing says code might need to be compiled with this tracing enabled. Does the publicly released 2.0.6.3_beta_2021-06-17 binary have tracing enabled?

Yes, and the quoted StatusCode=200 line is a direct copy-and-paste from the file. This is with completely unaltered Duplicati except for the config file change. One possible difference: if you have a Duplicati update situation, the version under C:\Program Files\Duplicati 2 checks for the latest update and starts it, meaning the .config change may need to be in the update folder. The problem is that the update folder’s contents are validated, and altering files there will get that update ignored. This can be fixed by a Duplicati install, which updates the original copy.

You can look at About → System info to see whether your ServerVersion is higher than your BaseVersion. Task Manager Properties shows the full path. If they’re the same for both Duplicati, then ignore this theory.

If they’re different, then you can actually cd to the folder in updates to run from there instead, however it’s awkward because your changes will get in the way of a regular Duplicati launch. You’d have to undo them.

Le sigh. They are not different. Server and base are both 2.0.6.3_beta_2021-06-17. There is no updates folder. I can’t get tracing to enable whatsoever. I appreciate the help, and I’d like to return the favor if I can, because it now fails 5-7 times before it succeeds every night, but I don’t know what else to do.

is pretty much what you were attempting for a while with Visual Studio. I wonder if @Rational can help?

Let me try it in a more “normal” install. Mine are in an unusual location and disable the parent/child doings.

Dropbox: Error reading JObject from JsonReader. Path '', line 0, position 0
are some directions from the original Duplicati author, but it doesn’t merge .config files like I’ve been doing.

I skimmed through the post and I think the issue is completely different from the solution I shared. The OP gets a 403, specifically in the List() call.

If the OP can debug Duplicati, they should break on the response code and check the response body for details. The Google Drive API sends a 403 for the following reasons:

  • The daily limit was exceeded.
  • The user rate limit was exceeded.
  • The project rate limit was exceeded.
  • The sharing rate limit was exceeded.
  • The user hasn’t granted your app rights to a file.
  • The user doesn’t have sufficient permissions for a file.
  • Your app can’t be used within the authenticated user’s domain.
  • Number of items in a folder was exceeded.

Here is the doc link: Resolve errors  |  Google Drive API  |  Google Developers

If the OP can’t debug, then check which cause is more likely:

  • Check the number of files.
  • Check that the app has permission (maybe by regenerating a new API auth key).
  • Also check that you remove “user” and “pass” from the Google API URL (you can do that with the GUI: configuration > edit > destination > 3 dots > copy URL configuration, modify it, then import it).
  • Make sure you don’t actually exceed limits: set the concurrent upload limit to 1, set the retry limit to 100 or so, and set the retry time delay to 60 seconds.
  • You didn’t change the root folder it uploads to, right? It only has permissions for the folders/files it creates…
  • Lastly, try with a small test backup in a new folder, and see if it works.
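On the rate-limit causes in that list, Google’s error documentation recommends retrying with exponential backoff plus jitter rather than a fixed delay (Duplicati’s retry-delay is a fixed interval, as I understand it). A rough sketch of that suggested delay schedule, just to illustrate the idea:

```python
import random

def backoff_delays(retries, base=1.0, cap=64.0):
    """Delay schedule for retrying rate-limit 403s: exponential growth
    (1s, 2s, 4s, 8s, ...) capped at `cap` seconds, plus up to 1 second
    of random jitter so parallel clients don't all retry in lockstep."""
    return [min(cap, base * (2 ** attempt)) + random.random()
            for attempt in range(retries)]

# e.g. 5 retries: roughly 1s, 2s, 4s, 8s, 16s, each plus a little jitter
print(backoff_delays(5))
```

With a fixed delay instead, the closest approximation is simply a longer retry-delay and a higher retry count, which is what was tried above.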

Hope this helps :smile:

Ok. So let’s get something cleared up here. I’ve been trying to enable tracing (via the configs) for the officially released build, i.e. https://updates.duplicati.com/beta/duplicati-2.0.6.3_beta_2021-06-17-x64.msi. Yes or no: should that work?