Hi, thank you very much for the Duplicati software, it’s really easy to use and to understand. I’m using more than 20 computers in different places, so with different WAN IPs, to send just a few system files to my professional Google Drive, on a shared Drive. FWIW, all clients use the same full-access token.
I had a lot of errors that I didn’t understand, but I managed them well. One of them was System.Net.WebException: The remote server returned an error: (403) Forbidden.
I’ve read Google’s API guide for the HTTP 403 errors it returns.
I’ve made a list of the possible causes you can run into with Google Drive:
Authorization of the account
There are 3 authorization levels:
Manager: can do everything.
Content manager: can upload, delete and modify files.
Contributor: can only upload or modify files, but not delete them.
For my backup I saw that Contributor isn’t a good option because Duplicati can’t delete the files, so it doesn’t work, which is expected. Content manager can do almost everything, but when the backup reaches the version retention limit and Duplicati tries to delete the older files it has to remove, it can’t. So I used Manager, because it can do everything; a quick check like the sketch below shows what the account is actually allowed to do.
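To check this outside Duplicati, here is a minimal sketch (assuming google-api-python-client and an already-built OAuth2 credentials object; the shared drive ID is a placeholder) that asks the Drive API what the authorized account may do on the shared drive:

```python
# Minimal sketch: ask the Drive v3 API what the current account may do on a
# shared drive. Assumes google-api-python-client is installed and `creds` is a
# valid OAuth2 credentials object; the drive ID is a placeholder.
from googleapiclient.discovery import build

def check_shared_drive_rights(creds, shared_drive_id):
    service = build("drive", "v3", credentials=creds)
    drive = service.drives().get(
        driveId=shared_drive_id,
        fields="name,capabilities",
    ).execute()
    caps = drive.get("capabilities", {})
    # Duplicati needs to upload new files, replace them and delete old versions.
    print("drive:           ", drive.get("name"))
    print("can add files:   ", caps.get("canAddChildren"))
    print("can edit files:  ", caps.get("canEdit"))
    print("can delete files:", caps.get("canDeleteChildren"))
```

If the account only has Contributor rights, the delete capability should come back false, which would match the behaviour described above.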
AuthID
To be sure, I have set up a full-access AuthID, not a limited-access one, so Duplicati has everything it needs to do its job.
Send limitation
I’ve read that Google Drive seems to dislike it when a lot of IPs using the same token send a lot of files in the same hour. So I increased the retries with --number-of-retries=XX --retry-delay=XXs, and I don’t have this problem anymore.
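Those two options basically make Duplicati wait and try again. Just to illustrate the principle (Duplicati’s --retry-delay is a fixed delay; Google’s documentation generally recommends exponential backoff with jitter), here is a rough sketch where upload_once is a stand-in for whatever actually sends a file and returns the HTTP status code:

```python
# Rough sketch of the retry idea behind --number-of-retries / --retry-delay:
# back off (with a bit of jitter) whenever the API answers with a retryable
# status such as 403 rate limiting or 429. upload_once is a placeholder.
import random
import time

RETRYABLE = {403, 429, 500, 503}

def upload_with_backoff(upload_once, max_retries=8, base_delay=1.0):
    status = None
    for attempt in range(max_retries):
        status = upload_once()  # assumed to return the HTTP status code
        if status not in RETRYABLE:
            return status       # success, or an error that retrying won't fix
        time.sleep(base_delay * (2 ** attempt) + random.uniform(0, 1))
    return status               # still failing after all retries
```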
API limitation
Google Drive also has API limits; here is the problem you can encounter with a 403 Forbidden error: Google drive API Limit
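To see which limit is actually being hit, the JSON body of the 403 response is the interesting part. Here is a hedged sketch (the access token and file ID are placeholders; it uses the public Drive v3 REST endpoint and the requests library) that prints the reason field Google sends back:

```python
# Sketch: repeat a delete against the Drive v3 REST endpoint and print the
# JSON error details, to tell a rate-limit 403 apart from a permission 403.
# The access token and file ID are placeholders.
import requests

def explain_403(access_token, file_id):
    resp = requests.delete(
        f"https://www.googleapis.com/drive/v3/files/{file_id}",
        headers={"Authorization": f"Bearer {access_token}"},
        params={"supportsAllDrives": "true"},
    )
    if resp.status_code != 403:
        print("status:", resp.status_code)
        return
    for detail in resp.json().get("error", {}).get("errors", []):
        # Typical reasons: rateLimitExceeded, userRateLimitExceeded,
        # insufficientFilePermissions, storageQuotaExceeded, ...
        print(detail.get("reason"), "-", detail.get("message"))
```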
To sum up
I think I have to use Manager privileges, but I wonder if it’s the right solution. I also wonder how to view detailed responses from Google Drive, i.e. the exact HTTP JSON response (should I create a log file, and at which information level?).
This may be related to this PR.
It has fallen through the cracks: the original project author has stepped back from the maintainer tasks, the PR author did not change the PR in response to the original author’s objections, and I have been addressing more pressing needs and also just don’t use Google services.
I don’t know if Google now offers a general setting to disable this ‘feature’ instead of the code change in the PR; it may be interesting to check that out.
Thanks for your quick response.
Actually, I’ve seen that if you give administrator privileges instead of content manager, Duplicati can delete only the files it is the author of, but when the account on the shared drive has content manager privileges, Duplicati can’t even delete its own files. It seems that with content manager, Duplicati’s files end up attributed to another person, so it can upload and modify them, but when it comes to deleting, it can’t do anything.
I think the solution is to give Duplicati administrator rights on your shared drive.
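To verify this on a single backup file, here is a small sketch (again google-api-python-client, with placeholder IDs and a credentials object assumed) that shows who last touched the file and whether the current account may delete it:

```python
# Small sketch: for one file on the shared drive, show who last modified it
# and whether the current account is allowed to delete it. Placeholder IDs;
# assumes google-api-python-client and a valid OAuth2 credentials object.
from googleapiclient.discovery import build

def who_can_delete(creds, file_id):
    service = build("drive", "v3", credentials=creds)
    info = service.files().get(
        fileId=file_id,
        supportsAllDrives=True,
        fields="name,lastModifyingUser(displayName),capabilities(canDelete)",
    ).execute()
    print("file:            ", info.get("name"))
    print("last modified by:", info.get("lastModifyingUser", {}).get("displayName"))
    print("can delete:      ", info.get("capabilities", {}).get("canDelete"))
```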
Also I’ve seen that the default password to open the database is
Duplicati_Key_42
(source: Clear text password stored in Duplicati-server.sqlite - #17 by kenkendk)
Because I want to know whether, for example, someone who wanted to mess everything up could get that password, decrypt the database, get the credentials for any repo, and then do whatever they want.
If that’s the case, I think I have a feature request x)
You are asking several different questions, and answering them all is difficult.
The gist of my answer was that this ‘forbidden’ failure may be unrelated to the rights you give to Duplicati and may happen even with full rights.
If your computer gets hacked, the only way to protect your backups is to have a copy of your backup that the hacked computer can’t touch. If the computer itself is making the copy of the backup automatically, that’s just not possible. These are general security questions, not Duplicati-related.
The Duplicati database encryption is a hack that should not be trusted.
I see no such problem myself but to be candid I have never bothered with the whole database encryption matter and I am not sure of its current status.
--server-encryption-key
This option sets the encryption key used to scramble the local settings database. This option can also be set with the environment variable DUPLICATI_DB_KEY. Use the option --unencrypted-database to disable the database scrambling.
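For illustration only, here is a minimal sketch of the two ways just quoted for supplying the key when starting the server from a script; "duplicati-server" is a placeholder for whatever the server executable is called on your system:

```python
# Illustration only: start the server with the database key supplied either
# through the DUPLICATI_DB_KEY environment variable or via the command-line
# option quoted above. "duplicati-server" is a placeholder binary name.
import os
import subprocess

key = "my-long-random-passphrase"

# Variant 1: environment variable
subprocess.run(["duplicati-server"], env=dict(os.environ, DUPLICATI_DB_KEY=key))

# Variant 2: command-line option (equivalent)
# subprocess.run(["duplicati-server", f"--server-encryption-key={key}"])
```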
I’m not sure whether you’re arguing for or against database content scrambling (changing it needs a Duplicati restart), but in any case it’s quite configurable. I turn mine off because I want to look in the database occasionally. Typical users would probably be better off leaving it on, as it safeguards the database content a bit.
Non-Windows users probably don’t have this scrambling at all; their SQLite builds tend not to support it. Future Duplicati versions might not have it either, as it has been removed by the DB authors, so some alternative solution will have to be found if this sort of data protection is going to continue.