My name is Adrian. I want to add a user to G Suite via the Google Admin SDK using Python 3.
Here is my problem: I have this code:
import os
import json
import pickle

from googleapiclient.discovery import build
from google_auth_oauthlib.flow import InstalledAppFlow
from google.auth.transport.requests import Request

SCOPES = ['https://www.googleapis.com/auth/admin.directory.user']

def createUserConnection():
    creds = None
    # The file token.pickle stores the user's access and refresh tokens, and is
    # created automatically when the authorization flow completes for the first
    # time.
    if os.path.exists('token.pickle'):
        with open('token.pickle', 'rb') as token:
            creds = pickle.load(token, encoding='Latin-1')
    # If there are no (valid) credentials available, let the user log in.
    if not creds or not creds.valid:
        if creds and creds.expired and creds.refresh_token:
            creds.refresh(Request())
        else:
            flow = InstalledAppFlow.from_client_secrets_file(
                'credentials.json', SCOPES)
            creds = flow.run_local_server(port=0)
        # Save the credentials for the next run
        with open('token.pickle', 'wb') as token:
            pickle.dump(creds, token)
    createUserConnection.service = build('admin', 'directory_v1', credentials=creds)
def addUser(name, familyName, usermail):
    print('Adding user ' + usermail + ' to the G suite')
    # json definition
    userInfo = json.dumps({
        "name": {
            "givenName": name,
            "familyName": familyName
        },
        "kind": "admin#directory#user",
        "primaryEmail": usermail,
        "password": "Welcome1234",
        "changePasswordAtNextLogin": True
    })
    createUserConnection.service.users().insert(body=userInfo).execute()

if __name__ == '__main__':
    createUserConnection()
    addUser("bla", "bla", "blabla@24i.com")
When I run it with python3 it returns an error:
File "/Users/adrianbardossy/Downloads/google_accounts/python3.7/lib/python3.7/site-packages/googleapiclient/http.py", line 856, in execute
    raise HttpError(resp, content, uri=self.uri)
googleapiclient.errors.HttpError: <HttpError 400 when requesting https://www.googleapis.com/admin/directory/v1/users?alt=json returned "Invalid Input: primary_user_email">
I tried to fix it by passing just the username blabla instead of blabla@24i.com, but I get the same error. I based the call on the documentation for the insert method here: https://developers.google.com/resources/api-libraries/documentation/admin/directory_v1/python/latest/admin_directory_v1.users.html. Can you help me resolve the issue?
Adrian
This is caused by passing a JSON-encoded string as the body argument to service.users().insert(). It should just be a plain Python dict.
Instead of
userInfo = json.dumps({
"name": ...
use
userInfo = {
"name": ...
I was also stuck on this for a long time, since the docs say the body should be a JSON object, which I interpreted to mean "a JSON object encoded as a string", but I discovered it should really just be a Python object/dict.
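For completeness, here is a minimal sketch of how the addUser function from the question would look with a plain dict body (same field values as in the question; it assumes createUserConnection() has already been called):

def addUser(name, familyName, usermail):
    print('Adding user ' + usermail + ' to the G suite')
    # Plain Python dict -- the client library serializes it to JSON itself
    userInfo = {
        "name": {
            "givenName": name,
            "familyName": familyName
        },
        "kind": "admin#directory#user",
        "primaryEmail": usermail,
        "password": "Welcome1234",
        "changePasswordAtNextLogin": True
    }
    createUserConnection.service.users().insert(body=userInfo).execute()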
I had the same error message, Invalid Input: primary_user_email, using the JavaScript library, and from what I understand it means the API couldn't find the primaryEmail property in your body. In other words: your JSON is malformed.
Here's my take on this issue. I use the JavaScript terminology, but it should transfer to Python as well:
If you're using the gapi.client.directory.insert method of adding a user, you need to encapsulate your body in a resource property, so your JSON becomes:
{
    "resource": {
        "name": {
            "givenName": "name",
            "familyName": "familyName"
        },
        "kind": "admin#directory#user",
        "primaryEmail": "usermail",
        "password": "password",
        "changePasswordAtNextLogin": true
    }
}
However, if you're using the gapi.client.request method of adding a user, the library will encapsulate the JSON for you, so your body should look like this:
{
    "name": {
        "givenName": "name",
        "familyName": "familyName"
    },
    "kind": "admin#directory#user",
    "primaryEmail": "usermail",
    "password": "password",
    "changePasswordAtNextLogin": true
}
Short answer for the OP: you're using the insert method, so try encapsulating your JSON in a resource property.
Hope this helps people who stumble on this error in the future.
I know this is old, but if you haven't solved it yet: I simply passed a dict into the body field instead of doing any conversion to JSON:
def get_user_inf(service):
    '''
    This initiates the collection of user data.
    '''
    user_fname = str(input("What is the new hire's first name? : "))
    user_lname = str(input("What is the new hire's last name? : "))
    user_email = str(input("What is the new hire's email? : "))
    new_user = {
        "name": {
            "givenName": user_fname,
            "fullName": user_fname + " " + user_lname,
            "familyName": user_lname,
        },
        "primaryEmail": user_fname + '.' + user_lname + "@hometownticketing.com",
        "recoveryEmail": user_email,
        "password": "Temp42!!!",
        "changePasswordAtNextLogin": True,
    }
    user = service.users().insert(body=new_user).execute()
    return user
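A brief usage sketch for the function above (creds is assumed to come from whatever OAuth flow you already run, e.g. the createUserConnection from the question, and get_user_inf returns the inserted user resource):

from googleapiclient.discovery import build

service = build('admin', 'directory_v1', credentials=creds)  # creds from your OAuth flow
created = get_user_inf(service)
print('Created user:', created.get('primaryEmail'))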
I am currently working on a Python program that queries the public GitHub API to get a GitHub user's email address. The response is a huge list containing a lot of dictionaries.
My code so far
import requests
import json

# username = ''
username = 'FamousBern'
base_url = 'https://api.github.com/users/{}/events/public'
url = base_url.format(username)

try:
    res = requests.get(url)
    r = json.loads(res.text)
    # print(r)              # List slicing
    print(type(r))          # List that has a lot of dictionaries
    for i in r:
        if 'payload' in i:
            print(i['payload'][6])
    # matches = []
    # for match in r:
    #     if 'author' in match:
    #         matches.append(match)
    # print(matches)
    # print(r[18:])
except Exception as e:
    print(e)

# data = res.json()
# print(data)
# print(type(data))
# email = data['author']
# print(email)
By manually accessing this URL in the Chrome browser I get the following:
[
    {
        "id": "15069094667",
        "type": "PushEvent",
        "actor": {
            "id": 32365949,
            "login": "FamousBern",
            "display_login": "FamousBern",
            "gravatar_id": "",
            "url": "https://api.github.com/users/FamousBern",
            "avatar_url": "https://avatars.githubusercontent.com/u/32365949?"
        },
        "repo": {
            "id": 332684394,
            "name": "FamousBern/FamousBern",
            "url": "https://api.github.com/repos/FamousBern/FamousBern"
        },
        "payload": {
            "push_id": 6475329882,
            "size": 1,
            "distinct_size": 1,
            "ref": "refs/heads/main",
            "head": "f9c165226201c19fd6a6acd34f4ecb7a151f74b3",
            "before": "8b1a9ac283ba41391fbf1168937e70c2c8590a79",
            "commits": [
                {
                    "sha": "f9c165226201c19fd6a6acd34f4ecb7a151f74b3",
                    "author": {
                        "email": "bernardberbell@gmail.com",
                        "name": "FamousBern"
                    },
                    "message": "Changed input functionality",
                    "distinct": true,
                    "url": "https://api.github.com/repos/FamousBern/FamousBern/commits/f9c165226201c19fd6a6acd34f4ecb7a151f74b3"
                }
            ]
        },
The JSON object is huge as well, so I just sliced it here. I am interested in getting the email address in the author dictionary.
You're attempting to index into a dict with i['payload'][6], which will raise an error.
My preferred way of checking for key membership in nested dicts is to use the get method with an empty dict as the default.
import requests
import json

username = 'FamousBern'
base_url = 'https://api.github.com/users/{}/events/public'
url = base_url.format(username)

res = requests.get(url)
r = json.loads(res.text)

# for each dict in the list
for event in r:
    # using .get() means you can chain .get()s for nested dicts
    # and they won't fail even if the key doesn't exist
    commits = event.get('payload', dict()).get('commits', list())
    # also using .get() with an empty list default means
    # you can always iterate over commits
    for commit in commits:
        # email = commit.get('author', dict()).get('email', None)
        # is also an option if you're not sure if those keys will exist
        email = commit['author']['email']
        print(email)
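Since the public events feed usually contains many pushes by the same author, a small variation on the loop above (just a sketch, not part of the original answer) can collect the distinct addresses into a set:

unique_emails = set()
for event in r:
    for commit in event.get('payload', {}).get('commits', []):
        email = commit.get('author', {}).get('email')
        if email:
            unique_emails.add(email)
print(unique_emails)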
The following code:
import requests
import json
import msal

config = {
    "authority": "https://login.microsoftonline.com/<My tenant ID>",
    "client_id": "<My client ID>",
    "client_secret": "<My secret>",
    "scope": ["https://graph.microsoft.com/.default"],
}

app = msal.ConfidentialClientApplication(
    config["client_id"],
    authority=config["authority"],
    client_credential=config["client_secret"])

result = app.acquire_token_silent(config["scope"], account=None)
if not result:
    result = app.acquire_token_for_client(scopes=config["scope"])
bearerToken = result['access_token']

url = "https://<My org ID>.<My org region>.dynamics.com/api/data/v9.1/workflows"
headers = {
    "Accept": "application/json",
    "Content-type": "application/json",
    "Authorization": "Bearer " + bearerToken,
}
response = requests.request("GET", url, headers=headers)
response
Is producing the following output:
<Response [401]>
The expected output is like this:
{
    "@odata.context": "https://org00000000.crm0.dynamics.com/api/data/v9.1/$metadata#workflows",
    "value": [{
        "@odata.etag": "W/\"12116760\"",
        "category": 5,
        "statecode": 0,
        "workflowidunique": "00000000-0000-0000-0000-000000000001",
        "workflowid": "00000000-0000-0000-0000-000000000002",
        "createdon": "2018-11-15T19:45:51Z",
        "_ownerid_value": "00000000-0000-0000-0000-000000000003",
        "modifiedon": "2018-11-15T19:45:51Z",
        "ismanaged": false,
        "name": "Sample flow",
        "_modifiedby_value": "00000000-0000-0000-0000-000000000003",
        "_createdby_value": "00000000-0000-0000-0000-000000000003",
        "type": 1,
        "description": "This flow updates some data in Common Data Service.",
        "clientdata": "{\"properties\":{\"connectionReferences\":{\"shared_commondataservice\":{\"source\":\"NotSpecified\",\"id\":\"/providers/Microsoft.PowerApps/apis/shared_commondataservice\",\"tier\":\"NotSpecified\"}},\"definition\":{...}},\"schemaVersion\":\"1.0.0.0\"}"
    }]
}
...as shown in the Microsoft documentation that appears here: https://learn.microsoft.com/en-us/power-automate/web-api
Previously I:
1. Registered the app in Azure and generated a secret key, following the procedure shown in this link: https://learn.microsoft.com/en-us/powerapps/developer/data-platform/walkthrough-register-app-azure-active-directory#create-an-application-registration
2. Created an app role as described here: https://learn.microsoft.com/en-us/power-platform/admin/database-security#minimum-privileges-to-run-an-app
3. Created a Dataverse app user, linked to the app created in 1. and the role created in 2., as described here: https://learn.microsoft.com/en-us/powerapps/developer/data-platform/authenticate-oauth#manually-create-a-dataverse-application-user
Why is this not working?
I finally got a solution thanks to the Microsoft support team.
It was the scope; the correct value is:
"scope": ["https://<My org ID>.<My org region>.dynamics.com/.default"],
This is a self-answered question, so you won't find many details here.
After a few hours of Googling I found the solution you can see below.
simple_salesforce
from simple_salesforce import Salesforce

def custom_field_create():
    """
    Based on https://salesforce.stackexchange.com/a/212747/65221
    Examples of field types can be found in the column "Data Type" of the Salesforce front end,
    on the page where you can create/edit/delete fields for your selected object.
    NOTE: the case of "type" is important. For example the type "DateTime"
    must be exactly "DateTime" and not something like "datetime".
    """
    email = 'your_email'
    password = 'your_password'
    security_token = 'your_token'

    object_api_name = 'contact'        # replace with your object name
    field_api_name = 'Activity_Time'   # replace with your field name
    field_label = 'Activity Time'      # replace with your field label

    sf = Salesforce(username=email, password=password, security_token=security_token)

    url = 'tooling/sobjects/CustomField/'
    payload = {
        "Metadata": {
            "type": "Text",
            "inlineHelpText": "",
            "precision": None,
            "label": f"{field_label}",
            "length": 90,
            "required": False
        },
        "FullName": f"{object_api_name}.{field_api_name}__c"
    }
    result = sf.restful(url, method='POST', json=payload)
    print('result:', result)
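A minimal call of the function above (a sketch; it assumes you have replaced the credentials and the object/field names inside the function with real values, and it creates a Text custom field Activity_Time__c on the contact object):

if __name__ == '__main__':
    custom_field_create()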
I'm looking for a Python 3 example of how to get an access token so I can import a CSV file from GCS into Cloud SQL from a Google Cloud Function.
Because the call is made from a Cloud Function, my expectation was that the service account it runs under (or the service account of the Cloud SQL instance) would already be authorized once it was granted access, but that doesn't appear to be the case.
Response HTTP Response Body: {
    "error": {
        "code": 401,
        "message": "Request is missing required authentication credential. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.",
        "errors": [
            {
                "message": "Login Required.",
                "domain": "global",
                "reason": "required",
                "location": "Authorization",
                "locationType": "header"
            }
        ],
        "status": "UNAUTHENTICATED"
    }
}
Below is the code; I'm curious if anyone has some sample code showing how I can get it to authenticate.
response = requests.post(
    url="https://www.googleapis.com/sql/v1beta4/projects/redacted-project/instances/redacted-instance/import",
    headers={
        "Content-Type": "application/json; charset=utf-8"
    },
    data=json.dumps({
        "importContext": {
            "fileType": "CSV",
            "csvImportOptions": {
                "table": "service_data"
            },
            "uri": "gs://redacted-bucket/log/" + blob.name + "",
            "database": "redacted-db"
        }
    })
)
print('Response HTTP Status Code: {status_code}'.format(status_code=response.status_code))
print('Response HTTP Response Body: {content}'.format(content=response.content))
You should use the google-api-python-client to construct a service for this API instead of trying to make a request directly. This will allow it to pick up the default service account for the Cloud Function:
from googleapiclient.discovery import build
service = build('sqladmin', 'v1beta4')
...
More details here: https://github.com/googleapis/google-api-python-client/blob/master/docs/start.md
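A minimal sketch of what the full call could then look like with that service object (just a sketch: the project, instance, bucket, table and database names mirror the redacted placeholders from the question, and blob is the GCS blob object from the question's trigger):

from googleapiclient.discovery import build

# Inside a Cloud Function this picks up the function's default service account credentials
service = build('sqladmin', 'v1beta4')

body = {
    "importContext": {
        "fileType": "CSV",
        "csvImportOptions": {"table": "service_data"},
        "uri": "gs://redacted-bucket/log/" + blob.name,
        "database": "redacted-db",
    }
}

operation = service.instances().import_(
    project='redacted-project',
    instance='redacted-instance',
    body=body,
).execute()
print(operation)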
1. From your Google Cloud Function, get auth tokens by querying the metadata server, assuming that your Cloud Function runs under the default service account (the App Engine default service account, which has the Editor role).
import requests
import json

METADATA_URL = 'http://metadata.google.internal/computeMetadata/v1/'
METADATA_HEADERS = {'Metadata-Flavor': 'Google'}
SERVICE_ACCOUNT = 'default'

def import_table(request):
    url = '{}instance/service-accounts/{}/token'.format(
        METADATA_URL, SERVICE_ACCOUNT)

    # Request an access token from the metadata server.
    r = requests.get(url, headers=METADATA_HEADERS)
    r.raise_for_status()

    # Extract the access token from the response.
    access_token = r.json()["access_token"]

    body = json.dumps({'importContext': {'fileType': 'CSV',
                                         'csvImportOptions': {'table': 'your_table'},
                                         'uri': 'gs://temprun/your_dump_file',
                                         'database': 'your_database'}})

    response = requests.post(
        url="https://www.googleapis.com/sql/v1beta4/projects/your_project/instances/your_sql_instance/import",
        headers={"Content-Type": "application/json; charset=utf-8",
                 "Authorization": "Bearer {}".format(access_token)},
        data=body)

    return str(response)
2. Using the client library google-api-python-client:
def import_table(request):
    from googleapiclient.discovery import build
    service = build('sqladmin', 'v1beta4')

    body = {'importContext': {'fileType': 'CSV',
                              'csvImportOptions': {'table': 'your_table'},
                              'uri': 'gs://temprun/your_dump_file',
                              'database': 'your_database'}}

    service.instances().import_(project='your_project', instance='your_instance', body=body).execute()

    return "Table was imported"
If successful, the response body contains an instance of Operation.
{'kind': 'sql#operation',
 'targetLink': 'https://sqladmin.googleapis.com/sql/v1beta4/projects/your-project/instances/instance',
 'status': 'PENDING',
 'user': 'youraccount',
 'insertTime': '2020-03-18T09:02:55.437Z',
 'operationType': 'IMPORT',
 'importContext': {'uri': 'gs://yourbucket/dumpfile',
                   'database': 'yourdatabase',
                   'kind': 'sql#importContext',
                   'fileType': 'CSV',
                   'csvImportOptions': {'table': 'sql-table'}},
 'name': 'cdcd53d4-96fe-41cf-aee4-12cf6ec6394e',
 'targetId': 'instance_name',
 'selfLink': 'https://sqladmin.googleapis.com/sql/v1beta4/projects/project/operations/cdcd53d4-96fe-41cf-aee4-12cf6ec6394e',
 'targetProject': 'your-project'}
From within Google Cloud Functions you can get auth tokens by querying the metadata server.
There is an easier option, however: use the Cloud SQL Client Library. This will automatically get auth tokens for you.
Both of these options will authenticate with the PROJECT_ID@appspot.gserviceaccount.com service account. You may need to grant that account permissions if you are doing cross-project calls etc.
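If you prefer to see those default credentials explicitly, here is a minimal sketch using Application Default Credentials (an illustration under assumptions: inside a Cloud Function google.auth.default() resolves to the function's runtime service account, and 'your_instance' is a placeholder):

import google.auth
from googleapiclient.discovery import build

# Resolves to the Cloud Function's service account at runtime
credentials, project_id = google.auth.default(
    scopes=['https://www.googleapis.com/auth/sqlservice.admin'])

service = build('sqladmin', 'v1beta4', credentials=credentials)
instance = service.instances().get(project=project_id, instance='your_instance').execute()
print(instance['state'])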
I'm testing Azure schema extensions with a simple Groovy script against the Microsoft Graph API. I first query for a list of schema extensions, and this seems to work (though I am not sure where the extensions in the response have been set; I assume they are there by default). I then try to POST a schema extension, but it fails with this error message:
[error: [
code:InternalServerError,
message:Object reference not set to an instance of an object.,
innerError:[request-id:xxxxx-xxxx-xxx-xxxxxx, date:2018-05-14T00:46:00]]]
This is the code for the GET query and the response:
def uriTestGET = "https://graph.microsoft.com/v1.0/schemaExtensions?"
def httpTestGET = new HTTPBuilder(uriTestGET)
httpTestGET.ignoreSSLIssues()
httpTestGET.request(GET, JSON) { req ->
    httpTestGET.parser.'application/json'
    headers.'Authorization' = 'Bearer ' + AzureToken
    headers.'Content-Type' = 'application/json'
    response.failure = { resp, json ->
        println "GET Failure. GROUP: ${resp.statusLine}"
        println(json)
    }
    response.success = { resp, json ->
        println "GET Success. GROUP: ${resp.statusLine}"
        println(json)
    }
}
Response
[
    @odata.context: https://graph.microsoft.com/v1.0/$metadata#schemaExtensions,
    @odata.nextLink: https://graph.microsoft.com/v1.0/schemaExtensions?$skiptoken=XXXXXXXXXX,
    value: [[
        id: adatumisv_exo2,
        description: sample desccription,
        targetTypes: [Message],
        status: Available,
        owner: xxxxxx-xxxx-xxxx-xxxx,
        properties: [
            [name: p1, type: String],
            [name: p2, type: String]]],
    [id: circuitid_globals,
        description: Circuit ID Graph Global Schema,
        targetTypes: [Group, User], .. etc
And the following POST request:
def uriTestPOST = "https://graph.microsoft.com/v1.0/schemaExtensions?"
def httpTestPOST = new HTTPBuilder(uriTestPOST)
httpTestPOST.ignoreSSLIssues()
httpTestPOST.request(POST, JSON) { req ->
    httpTestPOST.parser.'application/json'
    headers.'Authorization' = 'Bearer ' + AzureToken
    headers.'Content-Type' = 'application/json'
    body = [
        "id": "TestExtension",
        "description": "Test to add user object schema extension",
        "status": "Available",
        "targetTypes": ["user"]
    ]
    response.failure = { resp, json ->
        println "POST Failure. GROUP: ${resp.statusLine}"
        println(json)
    }
    response.success = { resp, json ->
        println "POST Success. GROUP: ${resp.statusLine}"
        println(json)
    }
}
This gets the response:
[error:[
code:InternalServerError,
message:Object reference not set to an instance of an object.,
innerError:[request-id:xxxx-xxxx-xxxx-xxxx, date:2018-05-14T00:46:00]]]
The updated body looks like this:
body = [
    "id": "TestExtension",
    "description": "Test to add user object schema extension",
    // "status": "Available",
    "targetTypes": ["User"],
    "properties": [["name": "ExtensionProperty", "type": "String"]]
]
And this is the new error message:
[error:[
code:Authorization_RequestDenied,
message:Insufficient privileges to complete the operation.,
innerError:[request-id:xxxx-xxxx-xxxx-xxxx, date:2018-05-14T05:09:41]]]
I've decoded the token and it shows that the following roles are included:
"roles": [
"User.ReadWrite.All",
"Directory.ReadWrite.All",
"User.Invite.All" ]
I've been adding additional permissions to get this to work; as far as I can see, these are already greater privileges than should be required.
The system automatically chooses between using Application or Delegated permissions based on the OAuth Grant you've chosen:
Client Credentials Grant = Application
Authorization Code Grant = Delegated
Implicit Grant = Delegated
This is because you need an actual user to authenticate if you want them to delegate your application to act on their behalf. Without an authenticated user there isn't anyone to delegate permissions, so you need to operate under Application scopes.
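For reference, a minimal Python sketch of the client credentials grant (the flow that yields Application permissions), reusing the msal pattern from the Dynamics question earlier; the tenant and client values are placeholders, and the relevant schema-extension permission still has to be granted and admin-consented on the app registration:

import msal
import requests

app = msal.ConfidentialClientApplication(
    "<client id>",
    authority="https://login.microsoftonline.com/<tenant id>",
    client_credential="<client secret>")

# Client credentials grant -> app-only token carrying Application permissions
result = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])

headers = {"Authorization": "Bearer " + result["access_token"],
           "Content-Type": "application/json"}
body = {
    "id": "TestExtension",
    "description": "Test to add user object schema extension",
    "targetTypes": ["User"],
    "properties": [{"name": "ExtensionProperty", "type": "String"}],
}
resp = requests.post("https://graph.microsoft.com/v1.0/schemaExtensions",
                     headers=headers, json=body)
print(resp.status_code, resp.json())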