Hello everyone, I keep getting a 403 error when I back up to Google Drive. I've read a few posts online, and they all seem to say Duplicati blames Google, which is odd, because the same library I'm trying to back up has always worked fine on my QNAP with HBS3. So I really think this is on Duplicati's side rather than Google's.
It may not be easy to tell which side is at fault. I'm not sure whether another log entry would help further, either. You can try enabling more verbose logging, or check the other logs Duplicati keeps, depending on what you're looking at and whether Duplicati records anything else that would help.
In my own code, I've seen problems that people blamed on, e.g., Google Drive, yet reworking the code solved them; even we developers can misunderstand things. Sometimes a rework also happens to work around a genuine issue in Google Drive. Actually pinning down the true cause can get really time consuming.
So I'm not confident enough to guess either way past 50%, or to call it a "likely" Drive issue in any sense. Drive also seems to use 403 for a number of different conditions, which doesn't help.
There's also material like the following link, which suggests a 403 can sometimes be fixed by signing out and back in, and I believe that applies to official Drive use (I think I had that right the first time). This is how you can fix HTTP 403 error on Google Drive. I should mention, though, that help sites like these sometimes repeat advice that is totally unhelpful and will never fix the problem.
It can be a real pain, lol. I've even had a crash in Google's Docs app on Android for two years. They never fix it. I've even sent them the crash log. Nothing. Paste and randomly crash
What Google actually calls for on some of its 403 errors is exponential backoff on retry attempts. 126.96.36.199_canary_2022-03-13 actually added that, but Canary releases are too unproven for many users.
If you want to venture into the latest fixes (and the latest bugs), it's available, but not via autoupdate.
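For anyone curious what "exponential backoff on retry" means in practice, here is a minimal sketch (in Python, purely illustrative; the `do_upload` callable and its `status` attribute are made-up names, not Duplicati's actual code):

```python
import random
import time

def upload_with_backoff(do_upload, max_retries=5, base_delay=1.0):
    """Retry an upload with exponential backoff plus jitter.

    `do_upload` is a hypothetical callable that raises an exception
    carrying a `status` attribute of 403 when the server throttles us.
    """
    for attempt in range(max_retries):
        try:
            return do_upload()
        except Exception as err:
            status = getattr(err, "status", None)
            if status != 403 or attempt == max_retries - 1:
                raise  # not retryable, or out of attempts
            # wait base * 2^attempt, plus up to 1s of random jitter
            time.sleep(base_delay * (2 ** attempt) + random.random())
```

The jitter matters: without it, many clients that got throttled at the same moment would all retry at the same moment and get throttled again.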
That refers to a new pull request which has not been processed yet. Any volunteers to process these?
Although volunteers are needed in all areas (including the forum), we have been getting new pull requests.
I'm also not quite sure whether the rework will give us details of a 403 on upload. Also, is yours on an upload?
The original fix was for download, but my Google Drive 403s have been on uploads, when I check.
A simple way to keep a long-term record is log-file=&lt;path&gt; with log-file-log-level=retry. In the log, a put is an upload and a get is a download.
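If you collect such a log over time, a small script can tally whether the 403s are hitting puts or gets. This is just a sketch: the exact line format of a retry-level log entry is an assumption here, so it only looks for "403" together with the operation keyword on the same line:

```python
def classify_403s(log_path):
    """Count 403 errors by operation in a retry-level log file.

    Assumption: retry entries mention '403' and the operation name
    ('put' for upload, 'get' for download) on the same line.
    """
    counts = {"put": 0, "get": 0}
    with open(log_path) as fh:
        for line in fh:
            low = line.lower()
            if "403" not in low:
                continue
            for op in counts:
                if op in low:
                    counts[op] += 1
    return counts
```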
It's difficult to get a view of the network traffic because it's encrypted. Windows network tracing is one way, but it's a bit hard to set up, and you have to be careful not to accidentally post your secrets…
That's a good point, and another way of looking into it. I like it.
It might be more precise to say that it doesn't for the time being. Again, it's not always black and white. It definitely could be Duplicati, so don't get me wrong; it's not as if Duplicati has no issues.
It's just not an easy either-or call without (possibly) many hours of heavy debugging in the code, or perhaps the approach ts678 mentioned, if looking at the HTTPS connection works out.
The fix handles all HTTP methods, including PUT and POST (I assume by "pull" you mean "put"). I've only tested it on a GET, in the sense that a GET is the only operation I've had issues with to test against. It will provide you with the 403 details returned from Google Drive (if it works correctly for a PUT) :-).
BTW… since implementing the fix, Duplicati has successfully performed 539 backups (I have it backing up every 15 minutes for testing purposes).
The only “pull” that I can find is “pull request”, but I think your pull request answer gives me hope that
Care to do some? What Duplicati needs is more people testing more things: maybe even exceptions, possibly including network errors (real or artificial), random hard kills during backup, all sorts of fun.
To be most helpful, testers should be willing to collect and provide logs, database info, etc. for debug.
Goal now is to find some developer to take it from steps-to-reproduce into a fix. We need developers.
Ignoring that for a moment, the next best thing is good test cases with data, and maybe an analysis…
Sorry, that was just me misreading the forum post in my hurry.
I agree with your sentiment here, but I'm time-poor. I see that Duplicati uses NUnit for a few tests. NUnit is old now, and xUnit has largely replaced it. Unit tests are probably only useful for a small part of Duplicati's functionality; I suspect we'd be better off with integration tests… but can we add integration tests to Duplicati?
One of the systems I'm currently working on has around 200 xUnit integration tests; they take about 90 seconds to run for over 1,000 operations hitting the database… nice! The integration tests target the back end of the system, since they call a RESTful API. I don't have time to look through the code at the moment… do you know how the UI talks to the back end, and could we automate the back end from xUnit?
The next issue is that the area I just modified is the Google Drive handler… we'd need to work out how to mock the connection to Google Drive. What's more, we don't know how Google Drive will behave in certain circumstances… I suppose we can only write tests for what we do know.
So we'd only be establishing automated testing to confirm that a code change doesn't break something (which is fine). We couldn't use automated tests to check something like the new code I've just added, because we don't yet know how Google Drive behaves when a 403 occurs for, say, a PUT operation, until we actually get one.
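As a rough illustration of the mocking idea (sketched in Python rather than C#/xUnit, with entirely hypothetical class and function names, and a 403 body merely modeled on the shape Drive's error documentation shows), a stub can stand in for the Drive backend and return a canned 403 so the error-reporting path can be exercised without the real service:

```python
import json

class DriveError(Exception):
    """Hypothetical error type carrying the HTTP status and response body."""
    def __init__(self, status, body):
        super().__init__(f"HTTP {status}")
        self.status, self.body = status, body

class FakeDriveBackend:
    """Stub standing in for a Google Drive client; every PUT fails
    with a canned rate-limit-style 403 body."""
    def put(self, name, data):
        body = {"error": {"code": 403,
                          "errors": [{"reason": "userRateLimitExceeded"}]}}
        raise DriveError(403, json.dumps(body))

def describe_403(err):
    """Hypothetical error-reporting helper: pull the 'reason' fields
    out of the JSON body so a log line says more than just '403'."""
    detail = json.loads(err.body)
    reasons = [e.get("reason") for e in detail["error"]["errors"]]
    return f"403 from Drive: {', '.join(reasons)}"
```

A test against the stub can then assert that the reported message contains the reason string, which at least protects the reporting code from regressions, even if it can't prove what the real service sends.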
Side note: I'm hoping that, with all the backups I'm doing, my Google Drive will run out of space… it's getting close, and hopefully it will fall over on a PUT or POST call, which will test the error-reporting change I made in that area.
@Hinako This should be possible, as all of the handlers are projects in their own right and can be compiled separately. I could probably hand you the Google Drive handler with the enhanced error-reporting code in it. From memory, I think it's about six files that you update. If it doesn't work, you can just revert to the originals. Let me know if you're interested and I'll compile the libraries for you and make them available from my website.
The graph from ncw is interesting. I wonder whether Duplicati is teetering on a rate limit, with 403 as the hint. Resolve a 403 error: Project rate limit exceeded (if the new error detail shows that) may be configurable; however, if that were the one hurting me, I'd expect other complaints at the same time, and I find none.
@warwickmm @ts678 I doubt Google Drive is having problems with rate limiting. I'm hitting Google Drive every 15 minutes from a server connected directly to an internet backbone… I'm hitting it pretty quickly. The 403 error I received doesn't appear in Google's 403 error list (here), so we're just guessing until the enhanced error handling is in place.
Uploads (up to 4 at a time by default) start as each remote volume is completed and ready to go.
--asynchronous-concurrent-upload-limit (Integer): The number of concurrent uploads allowed
  When performing asynchronous uploads, the maximum number of concurrent
  uploads allowed. Set to zero to disable the limit.
  * default value: 4
The default Remote volume size (Options screen) is 50 MB, so it may take a while.
The initial backup will find that everything needs uploading. Later ones only upload changes.