OneDrive for Business - Failed to connect

Sorry about the delay.

That’s right, when I click “Test Connection”.

Unfortunately, I don’t have an OD4B account to test with.

It’s possible @kenkendk or a user of one of the ~27k OneDrive for Business backups that ran in December might have some suggestions…

However, it should be noted that the first two weeks of 2018 have shown a ~50% drop in OD4B backups compared to the last 5 weeks of 2017, so it is possible something has changed in how things work for SOME users.

I can provide an account for testing.

I tried with Ubuntu 14.04 and v2.0.2.15-2.0.2.15_experimental_2018-01-03 today, and got the same error.

If you can build Duplicati yourself, I’ve made this pull request:

It adds rclone as a backend. Rclone supports syncing with OD4B, and my tests against OD4B were successful.

Rclone uses the OD4B API, which only works with accounts whose domain has been verified by an admin.

Many education accounts that were signed up for online have not had their domain verified yet.
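If you want to check whether your account (and its domain verification status) works with rclone at all before building Duplicati with the patch, here is a minimal sketch. It assumes you have already created an rclone remote for OD4B with `rclone config`; the remote name `od4b` is just an example:

```python
import subprocess

# Assumes an rclone remote named "od4b" was already configured interactively
# with `rclone config` (the remote name is only an example).
REMOTE = "od4b:"

def check_remote(remote: str) -> bool:
    """Try to list the top-level folders of the remote; success suggests the
    account/domain is usable through rclone's OneDrive backend."""
    result = subprocess.run(
        ["rclone", "lsd", remote],
        capture_output=True,
        text=True,
    )
    if result.returncode == 0:
        print("Remote reachable. Top-level folders:")
        print(result.stdout)
        return True
    print("rclone could not list the remote:")
    print(result.stderr)
    return False

if __name__ == "__main__":
    check_remote(REMOTE)
```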

https://github.com/ncw/rclone/pull/1577#issuecomment-354557585

I see a reply there in which a user says he can use his account with Duplicati.

Sorry, I can’t test with an educational account.

I tried some more, and when I try to do a backup, I get some more information from Fiddler…

Error: 917656; Access+denied.+Before+opening+files+in+this+location%2c+you+must+first+browse+to+the+web+site+and+select+the+option+to+login+automatically.
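For readability, that message is just percent-encoded; decoding it (a trivial sketch using Python’s standard library) gives the plain-text error:

```python
from urllib.parse import unquote_plus

raw = ("Access+denied.+Before+opening+files+in+this+location%2c+you+must+first+"
       "browse+to+the+web+site+and+select+the+option+to+login+automatically.")

# unquote_plus() turns '+' into spaces and decodes %xx escapes.
print(unquote_plus(raw))
# Access denied. Before opening files in this location, you must first browse
# to the web site and select the option to login automatically.
```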

Edit: I didn’t get this error in the Linux Docker client, only in the Windows version. Linux gives me no useful information at all.

And when I googled that, it seems more people have the same problem.

JoWe

Same here. After spending some time figuring out how to connect (using an edu OneDrive account here), it connected and wrote the files, but it can’t read, so it freezes at the “Verifying” step “forever”.

Found an open issue on GitHub. And I saw that @kenkendk added a “UserAgent” header fix to the canary version. Does that mean this bug is fixed, but OneDrive for Business (provided by Education, in my case) won’t work with the current Duplicati beta release?

I believe this is the exact same issue reported here.

Edit: I’m on macOS and testing CloudMounter to mount OneDrive (for Business) as a network drive, so I can set up Duplicati using it as a local folder (the folder location is kinda hidden). It seems to be working fine; backup jobs finish quickly, mainly because there is no waiting for the upload stage, as Duplicati just writes files to the local path and CloudMounter automatically uploads them. I’m just wondering how much additional local disk space this will use, because I believe that when using an official online storage provider’s API, Duplicati won’t write new files until the last written file is uploaded, so it’s somewhat sequential. For now, I’ll be testing/using OD4B like this as redundancy, but keeping the main backups on Google Drive just for safety.

Are you using a version of Duplicati that’s newer than the Jan. 18, 2018 commit with the “UserAgent” header change?
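For context, that “Access denied… browse to the web site” message is the kind of thing SharePoint Online returns when a request doesn’t look like it comes from a browser, which is presumably what the “UserAgent” change was meant to address (that’s an assumption on my part). If you want to see whether the header makes any difference against your own OD4B WebDAV URL, here is a rough sketch: the URL is a placeholder, and real SharePoint Online authentication (FedAuth/rtFa cookies) is left out, so both requests are expected to be rejected; the point is only to compare the two responses.

```python
import requests  # third-party: pip install requests

# Placeholder URL: replace with your own OneDrive for Business / SharePoint
# document library address. Authentication is omitted on purpose.
URL = "https://contoso-my.sharepoint.com/personal/user_contoso_com/Documents/"

BROWSER_UA = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
              "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/63.0 Safari/537.36")

for extra in ({}, {"User-Agent": BROWSER_UA}):
    resp = requests.request("PROPFIND", URL, headers={"Depth": "0", **extra})
    label = extra.get("User-Agent", "default (python-requests)")
    print(f"UA={label!r} -> HTTP {resp.status_code}")
    # A response body containing "Before opening files in this location"
    # reproduces the error captured in Fiddler above.
```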

As for the local disk space impact when using CloudMounter, you might need to contact them directly. Their website says cloud files aren’t stored locally, but I didn’t find anything covering what happens when the file STARTS locally.

The best case (disk space wise) would be that the local file is MOVED to the cloud, but it’s unclear how long it takes for that to happen after the file is done being written.

So the worst-case scenario is that local disk usage would be the full backup size (as reported in Duplicati), while the best case would be temporary local disk usage of maybe a few dblock (archive Volume size) files, eventually going to zero.
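To put rough numbers on that, here is a back-of-the-envelope sketch. The figures are assumptions: 50MB is Duplicati’s default Volume (dblock) size, 4 pending volumes matches the default asynchronous-upload-limit (assuming that limit also applies to a local-folder destination), and 100GB is just an example backup size.

```python
# Back-of-the-envelope estimate of temporary local disk usage when the
# "remote" is really a CloudMounter folder on the local disk.

dblock_size_mb = 50       # Volume size (--dblock-size), Duplicati default
pending_volumes = 4       # --asynchronous-upload-limit (assumed default)
backup_size_gb = 100      # example total backup size reported by Duplicati

best_case_mb = dblock_size_mb * pending_volumes   # files moved to the cloud quickly
worst_case_gb = backup_size_gb                    # nothing offloaded until the job ends

print(f"Best case temporary local usage : ~{best_case_mb} MB")
print(f"Worst case temporary local usage: ~{worst_case_gb} GB")
```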

Note that CloudMounter mentions encrypting files on their way to the cloud. Be careful with this setting: if you ever decide to switch back to a direct WebDAV or API connection, you may end up with the CloudMounter-encrypted files, which Duplicati wouldn’t be able to use (unless you go back to using CloudMounter).

I’m using the latest beta (You are currently running Duplicati - 2.0.2.1_beta_2017-08-01).

Exactly. So I couldn’t, for example, set up a backup job of my 1TB NAS using a MacBook Air, otherwise Duplicati would start processing all blocks and just sending them to the “network folder”, and the MacBook would eventually run out of space. Perhaps if there is a special/advanced Duplicati setting that limits the maximum size for each job run, I could use that as a “quick hack”.

Yep, I disabled that, thanks for mentioning it.