It should work AFAIK, but I’m not Microsoft. If you want a better answer, wait for the testing I mentioned.
So far, though:
It worked for me whichever way I merged it into the .config file (at the top or at the bottom). Are you using a merged file?
Specifically, the nearest Canary build, installed from duplicati-2.0.5.114_canary_2021-03-10.zip, but the .msi contains the same files (it just adds a fancier mechanism for the GUI installer).
Windows 10 Professional Version 21H2. Attached is my file. I added the Microsoft part at the top for this test:
This is weird, but a fresh install wouldn’t work for me at C:\Program Files\Duplicati 2, so I copied the whole folder to make C:\Program Files\Duplicati 2 - Copy, which also wouldn’t work, and then copied that to make C:\tmp\Duplicati 2 - Copy, which happily made my log… You could also check whether copying the folder helps with logging.
I only installed the modified .config file once, before doing any copying, and I used the same list command on each try.
Before trying that, I was doing folder compares and not finding any noteworthy differences compared to the installed copy.
EDIT:
The weird behavior for me seems to be partly due to C:\ProgramData\Duplicati\updates\2.0.6.100 being present; the Duplicati parent process saw it and ran it as its update, which bypassed my modified .config file.
Setting AUTOUPDATER_Duplicati_SKIP_UPDATE=true is one way to stay single-process, which is more predictable.
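If you go that route, a minimal sketch of what I mean (assuming the C:\Program Files\Duplicati 2 install path from above; run from a Command Prompt so the variable is set before the TrayIcon starts):

```
set AUTOUPDATER_Duplicati_SKIP_UPDATE=true
"C:\Program Files\Duplicati 2\Duplicati.GUI.TrayIcon.exe"
```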
Unless you have a Duplicati around that’s newer than your 2.0.6.3, you won’t be hitting this sort of problem.
I did discover some interesting behaviors that need more exploring, though, such as an updates folder being created inside the Duplicati folder (permissions willing, which might be what changed when I moved the folder).
EDIT 2:
If you decide not to set that environment variable, you can see which program each Duplicati process is running via right-click → Properties in Task Manager, or in Sysinternals Process Monitor, which gives a nice display if you mouse over the lines.
Huh… The network trace log is generated when you run the exported command line, but not when you click “Run now” in the GUI. Duplicati.CommandLine.exe doesn’t seem to be executed from Duplicati.GUI.TrayIcon.exe. I moved the trace config into Duplicati.GUI.TrayIcon.exe.config, restarted Duplicati.GUI.TrayIcon.exe and, voilà, trace log. I guess the GUI doesn’t launch Duplicati.CommandLine.exe as an external process.
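For reference, the section being moved is roughly the standard System.Net tracing block that Microsoft documents. The following is only a sketch of that shape, not the exact file from this thread; the “NetTrace” listener name and network.log path are placeholders:

```
<!-- Merge inside the existing <configuration> element of
     Duplicati.GUI.TrayIcon.exe.config; don't replace the rest of the file. -->
<system.diagnostics>
  <sources>
    <!-- Trace HTTP/socket activity from the System.Net classes -->
    <source name="System.Net" tracemode="includehex" maxdatasize="1024">
      <listeners>
        <add name="NetTrace"/>
      </listeners>
    </source>
  </sources>
  <switches>
    <add name="System.Net" value="Verbose"/>
  </switches>
  <sharedListeners>
    <!-- Writes the (very large) trace to a text file next to the .exe -->
    <add name="NetTrace"
         type="System.Diagnostics.TextWriterTraceListener"
         initializeData="network.log"/>
  </sharedListeners>
  <trace autoflush="true"/>
</system.diagnostics>
```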
Those look the most like yours that I can find among the documented 403 errors. Unfortunately, the only one you have influence over is “Retry failed requests to resolve errors”, and that influence may amount to no more than setting a reasonably long retry-delay. Duplicati doesn’t do exponential backoff on retries; it simply retries after the fixed delay set above.
Beyond what Google has documented, some web searches might find causes that they didn’t name.
Can you get this 403 problem on a small backup with nothing at all changed on the Options screen?
@ts678 Thank you for all your help, but I think this is where I consider this resolved. Adding the retry and retry-delay options has fixed my issue. I’m going to open a new issue on the GitHub page (or add to an existing one if a similar one exists) and include the things I found.
So, for anyone simply looking for the answer (or at least the answer for my situation), here it is…
Error:
From the Duplicati GUI log: About -> Show log -> the "Failed while executing" entry...
System.Net.WebException: The remote server returned an error: (403) Forbidden.
Specific error:
From enabling network tracing (read back through the thread to see how this was enabled for scheduled backups run via the GUI, if you’re curious; be advised that the log file is massive, so you may need other tools to handle a text file that big, or figure out how to write sequential trace logs).
"code": 403,
"message": "Rate Limit Exceeded"
Solution:
Add the following advanced options in the GUI to your Google Drive backup config (an equivalent exported command line is sketched after the list):
--retry-delay = 10s
After a failed transmission, Duplicati will wait a short period before attempting again. This is useful if the network drops out occasionally during transmissions.
--number-of-retries = 5
If an upload or download fails, Duplicati will retry a number of times before failing. Use this to handle unstable network connections better.
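For completeness, the same pair of options can also be tacked onto the backup’s exported command line. This is only a sketch; the Google Drive URL, authid, and source path below are placeholders, not values from this thread:

```
Duplicati.CommandLine.exe backup "googledrive://Backups/MyPC?authid=PLACEHOLDER" "C:\Users\me\Documents" --number-of-retries=5 --retry-delay=10s
```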
is something that you did raise beyond the default. Possibly the default is too low, but making it too high invites annoyingly long delays in situations that are going to keep failing for a long time (as opposed to recovering).