Unauthorized error from request https://graph.microsoft.com/ OneDrive

Hello,

I set up Duplicati on Unraid and keep getting the following error.

OneDrive is set up correctly on the Microsoft side; the backup starts fine and then, after a couple of minutes, stops with the error below. Very weird.

I have re-created the [AuthID] a couple of times already, but I still get the same error!

Duplicati.Library.Backend.MicrosoftGraph.MicrosoftGraphException: Unauthorized: Unauthorized error from request https://graph.microsoft.com/
System.Net.HttpWebResponse
{
  "error": {
    "code": "InvalidAuthenticationToken",
    "message": "Unable to initialize RPS",
    "innerError": {
      "date": "2021-10-06T02:17:38",
      "request-id": "3fxxxxxxx-xxxxxx-xxxxxx-xxxxx1-837xxxxxxxxxx",
      "client-request-id": "3fxxxxxxx-xxxxxx-xxxxxx-xxxxx1-837xxxxxxxxxx"
    }
  }
}

Thanks.

I would like to add that, just to rule out the Microsoft OneDrive side, I ran a ~70GB backup with a different app, "Duplicacy", and it completed just fine with no errors.
I really like Duplicati, but sadly it is not working with OneDrive! :slightly_frowning_face:

So it looks like there is some kind of Duplicati bug; maybe the devs would like to investigate more?

A Google search for that error shows how widespread and poorly understood it is across various software, including Arq Backup and rclone. rclone had a pretty bad case of it and worked around it by doing more retries:

Random “InvalidAuthenticationToken: Unable to initialize RPS” have started appearing #1489
Work around for “Unable to initialize RPS” errors from OneDrive #5270

This is the first Duplicati report of it that I can find in the forum or in Duplicati's GitHub Issues. Some answers on the Internet wonder if it's related to the specific account, or maybe even to what equipment is used with it.

As a workaround, you can increase number-of-retries and/or retry-delay to see if Duplicati can ride past the glitches.
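For illustration only (the values below are made up, not tuned recommendations), those are advanced options you add to the job, something like:

--number-of-retries=10
--retry-delay=30s

In the GUI they go in the job's Advanced options list; from the command line they are just appended to the backup command.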

Googleapi: Error 403: User rate limit exceeded., userRateLimitExceeded, Maximum number of retries reached (backoff: 64, attempts: 15) in the Duplicacy forum shows Duplicacy applying pretty hefty retry handling to Google's API.

The OneDrive retry code is possibly here, and it is possibly also more persistent than Duplicati's (whose retry count you can set).
From that code chunk, it looks like you might be able to see some Duplicacy log entries about any retry activity.

Duplicati retries can be seen at Retry level (e.g. About → Show log → Live → Retry), or in the job log's Complete log section, which shows a line like "RetryAttempts": 5, (that example is from my new Backblaze backup).
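If it helps to know where to look, the Complete log is JSON and, if I recall the layout correctly, the counter sits inside the backend statistics, roughly this shape (the 5 is just from my run, yours will differ):

"BackendStatistics": {
  ...
  "RetryAttempts": 5,
  ...
}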

I can see retries with Duplicacy. Let me run a backup once again and try to increase Duplicati's retries, similar to Duplicacy's.

I'm not going to try to talk Duplicacy details, but for Duplicati, see what is retrying. Normally a backup does a list to sanity-check the destination files, then goes into lots of put operations to upload its dblock and dindex files.

Time per file varies according to network speed and the value set for Remote volume size (default 50 MB), which is the limit on dblock file size. You might also see if you can get a small, very-default backup going.
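Remote volume size on the Options screen corresponds to the dblock-size option, so a smaller test run could, purely as an example, set something like:

--dblock-size=10MB

Smaller volumes mean more, smaller uploads, which also makes any retry activity cheaper to watch.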

If need be, there are some specialized tools like Duplicati.CommandLine.BackendTester.exe for testing. Export As Command-line can give you the Target URL, which you can edit to point at an empty folder for the test.
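As a rough sketch of such a test (the folder name and authid below are placeholders; take the real URL from your own Export As Command-line output and just change the path to an empty folder):

Duplicati.CommandLine.BackendTester.exe "onedrivev2://Duplicati-backend-test?authid=<your-authid>"

The tester uploads, downloads, verifies, and deletes its own sample files against that URL, so point it at a folder with nothing you care about.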

Thanks, will do. Currently running a new backup and monitoring to see if there are any retries or errors.