The following code:
import requests
import json
import msal
config = {
"authority": "https://login.microsoftonline.com/<My tenant ID>",
"client_id": "<My client ID>",
"client_secret": "<My secret>",
"scope": ["https://graph.microsoft.com/.default"],
}
app = msal.ConfidentialClientApplication(
config["client_id"],
authority=config["authority"],
client_credential=config["client_secret"] )
result = app.acquire_token_silent(config["scope"], account=None)
if not result:
result = app.acquire_token_for_client(scopes=config["scope"])
bearerToken = result['access_token']
url = "https://<My org ID>.<My org region>.dynamics.com/api/data/v9.1/workflows"
headers = {
"Accept": "application/json",
"Content-type": "application/json",
"Authorization": "Bearer "+bearerToken,
}
response = requests.request("GET", url, headers = headers)
response
Is producing the following output:
<Response [401]>
The expected output is like this:
{
"#odata.context": "https://org00000000.crm0.dynamics.com/api/data/v9.1/$metadata#workflows",
"value": [{
"#odata.etag": "W/\"12116760\"",
"category": 5,
"statecode": 0,
"workflowidunique": "00000000-0000-0000-0000-000000000001",
"workflowid" : "00000000-0000-0000-0000-000000000002",
"createdon": "2018-11-15T19:45:51Z",
"_ownerid_value": "00000000-0000-0000-0000-000000000003",
"modifiedon": "2018-11-15T19:45:51Z",
"ismanaged": false,
"name": "Sample flow",
"_modifiedby_value": "00000000-0000-0000-0000-000000000003",
"_createdby_value": "00000000-0000-0000-0000-000000000003",
"type": 1,
"description": "This flow updates some data in Common Data Service.",
"clientdata": "{\"properties\":{\"connectionReferences\":{\"shared_commondataservice\":{\"source\":\"NotSpecified\",\"id\":\"/providers/Microsoft.PowerApps/apis/shared_commondataservice\",\"tier\":\"NotSpecified\"}},\"definition\":{...}},\"schemaVersion\":\"1.0.0.0\"}"
}]
}
...as shown in the Microsoft documentation that appears here: https://learn.microsoft.com/en-us/power-automate/web-api
Previously I:
1. Registered the app in Azure and generated a secret key, as indicated in the procedure shown in this link: https://learn.microsoft.com/en-us/powerapps/developer/data-platform/walkthrough-register-app-azure-active-directory#create-an-application-registration
2. Created an app role as described here: https://learn.microsoft.com/en-us/power-platform/admin/database-security#minimum-privileges-to-run-an-app
3. Created a Dataverse app user, linked to the app created in 1. and the role created in 2., as described here: https://learn.microsoft.com/en-us/powerapps/developer/data-platform/authenticate-oauth#manually-create-a-dataverse-application-user
Why is this not working?
Finally got a solution, thanks to the Microsoft support team.
It was the scope, whose correct content is:
"scope": ["https://<My org ID>.<My org region>.dynamics.com/.default"],
Here's my current code:
import json
import requests
def createPage(database_id, page_id, headers, url):
newPageData = {
"parent": {
"database_id": database_id,
"page_id": page_id,
},
"properties": {
"Name": {"title": {"text": "HI THERE"}},
},
}
data = json.dumps(newPageData)
res = requests.request("POST", url, headers=headers, data=data)
print(res.status_code)
print(res.text)
database_id = "ea28de8e9cca4f62b4c4da3522869d03"
page_id = "697fd88570b3420aaa928fa28d0bf230"
url = "https://api.notion.com/v1/databases/"
key = "KEY"
payload = {}
headers = {
"Authorization": f"Bearer {key}",
"accept": "application/json",
"Notion-Version": "2021-05-11",
"content-type": "application/json",
}
createPage(database_id, page_id, headers, url)
But every time I run this, it seems like I keep getting new databases within the page. This is before running the script:
This is after running the script:
I would like it to be like this after running the script:
How can that be achieved?
It looks like you're calling the API URL that creates a new database, not the one that creates a new page.
This URL, https://api.notion.com/v1/databases/, is for creating new databases, not for creating pages.
In order to create a new page within a database, use the following URL:
https://api.notion.com/v1/pages
Where you'll need to provide the previously created database id, among other identifiers.
More detailed documentation can be found here:
https://developers.notion.com/reference/post-page
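For illustration, a minimal sketch of that call (untested; it reuses the headers and database_id from the question and assumes "Name" is the database's title property) could look like this:
import json
import requests

def create_page(database_id, headers):
    # POST to /v1/pages, with the target database as the parent.
    url = "https://api.notion.com/v1/pages"
    new_page_data = {
        "parent": {"database_id": database_id},
        "properties": {
            # A title property takes a list of rich text objects.
            "Name": {"title": [{"text": {"content": "HI THERE"}}]},
        },
    }
    res = requests.post(url, headers=headers, data=json.dumps(new_page_data))
    print(res.status_code)
    print(res.text)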
I have an existing azure-pipelines.yml file in my branch. I want to reference this file via the Azure REST API and have Azure create the CI pipeline. I need to do it in Python code.
I have tried something like the code below, but I keep getting an error related to 203. It seems to be the "203 Non-Authoritative Information" issue that is returned when attempting to perform any action (GET/POST/etc.) through the Azure DevOps API.
My main focus is creating pipelines by code. Any existing/working examples would be helpful.
import requests
import json
api_url = "https://dev.azure.com/DevOps/Ops/_apis/pipelines?api-version=6.0-preview.1"
json_data = {
"folder": "/",
"name": "My Pipeline",
"configuration": {
"type": "yaml",
"path": "/Boot/{{ project_name }}/pipelines/azure-pipelines.yaml",
"repository": {
"name": "Boot",
"type": "azureReposGit"
}
}
}
headers = {"Content-Type":"application/json"}
response = requests.post(api_url, data = json.dumps(json_data), headers=headers)
#print(response.json())
print(response.status_code)
Here is a Python demo for you:
import requests
import json
def create_pipeline_basedon_yaml(Organization, Project, Repository, Yaml_File, Pipeline_Folder, Pipeline_Name, Personal_Access_Token):
##########get repo id##########
url_repoapi = "https://dev.azure.com/"+Organization+"/"+Project+"/_apis/git/repositories/"+Repository+"?api-version=4.1"
payload_repoapi={}
headers_repoapi = {
'Authorization': 'Basic '+Personal_Access_Token,
}
response_repoapi = requests.request("GET", url_repoapi, headers=headers_repoapi, data=payload_repoapi)
repo_id = response_repoapi.json()['id']
##########create pipeline##########
url_pipelineapi = "https://dev.azure.com/"+Organization+"/"+Project+"/_apis/pipelines?api-version=6.0-preview.1"
payload_pipelineapi = json.dumps({
"configuration": {
"path": Yaml_File,
"repository": {
"id": repo_id,
"type": "azureReposGit"
},
"type": "yaml"
},
"folder": Pipeline_Folder,
"name": Pipeline_Name
})
headers_pipelineapi = {
'Authorization': 'Basic '+Personal_Access_Token,
'Content-Type': 'application/json'}
requests.request("POST", url_pipelineapi, headers=headers_pipelineapi, data=payload_pipelineapi)
Organization = "xxx"
Project = "xxx"
Repository = "xxx"
Yaml_File = "xxx.yml"
Pipeline_Folder = "test_folder"
Pipeline_Name = "Pipeline_basedon_yaml"
Personal_Access_Token = "xxx"
create_pipeline_basedon_yaml(Organization, Project, Repository, Yaml_File, Pipeline_Folder, Pipeline_Name, Personal_Access_Token)
I can successfully create the pipeline based on the specific yaml file:
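One note on the demo: it passes Personal_Access_Token straight into a Basic Authorization header, so it assumes the value is already Base64-encoded. If you are starting from a raw PAT, a small sketch like the following (with a placeholder token) builds the expected value first:
import base64

raw_pat = "xxx"  # raw Azure DevOps Personal Access Token (placeholder)
# Basic auth against Azure DevOps uses base64("<user>:<PAT>"); the user part may be empty.
Personal_Access_Token = base64.b64encode((":" + raw_pat).encode("ascii")).decode("ascii")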
I have created the below role in the app registration manifest:
"appRoles": [
{
"allowedMemberTypes": [
"User"
],
"displayName": "Student",
"id": "d1c2ade8-98f8-45fd-aa4a-6d06b947c66f",
"isEnabled": true,
"description": "Student",
"value": "Student"
}
],
Now I am using the appRoleAssignment API to assign a role to the user. I am following this documentation. On that page, it says that we need to use the below API with this JSON body:
POST https://graph.microsoft.com/v1.0/servicePrincipals/{id}/appRoleAssignments
Content-Type: application/json
Content-Length: 110
{
"principalId": "principalId-value",
"resourceId": "resourceId-value",
"appRoleId": "appRoleId-value"
}
I am unable to understand what I should use for principalId, resourceId and appRoleId. That page says:
principalId: The id of the client service principal to which you are assigning the app role.
resourceId: The id of the resource servicePrincipal (the API) which has defined the app role (the application permission).
appRoleId: The id of the appRole (defined on the resource service principal) to assign to the client service principal.
But what I understood is that principalId is the ID of the user in Active Directory to whom I want to assign the role,
which in my case is the ObjectId in the photo below:
Is this correct?
resourceId is the tenant ID, and appRoleId is the ID I used while creating the app role above, which is d1c2ade8-98f8-45fd-aa4a-6d06b947c66f.
Putting it all together, if I make a request in Python:
import http.client
import json

token = get_token()
headers = {'Authorization': 'Bearer ' + token, 'Content-Type': 'application/json'}
user_data = {
"principalId": "1bc79085-12qw-4fad-8da8-647f4b4b2927",
"resourceId": "c01b6482-3ccd-4533-8c98-a7c5e8067cc8",
"appRoleId": "d1c2ade8-98f8-45fd-aa4a-6d06b947c66f"
}
j_data = json.dumps(user_data)
conn = http.client.HTTPSConnection('graph.microsoft.com')
conn.request("POST", "/v1.0/servicePrincipals/1bc79085-12qw-4fad-8da8-647f4b4b2927/appRoleAssignments", j_data, headers)
response = conn.getresponse()
rdata = response.read()
I am getting the below response:
{
"error": {
"code": "Request_ResourceNotFound",
"message": "Resource '1bc79085-12qw-4fad-8da8-647f4b4b2927' does not exist or one of its queried reference-property objects are not present.",
"innerError": {
"date": "2020-10-26T05:16:35",
"request-id": "1c87a140-7bc9-499d-82dd-bc1dcb54e075",
"client-request-id": "1c87a140-7bc9-499d-82dd-bc1dcb54e075"
}
}
}
Can anyone please help me debug this? Thanks.
EDIT:
Error:
{
"error": {
"code": "Request_ResourceNotFound",
"message": "Resource '261eda4b-6eee-45ba-a176-259960603409' does not exist or one of its queried reference-property objects are not present.",
"innerError": {
"date": "2020-10-26T07:09:38",
"request-id": "8dc2ea73-63e5-45b5-8127-445df777c1e1",
"client-request-id": "8dc2ea73-63e5-45b5-8127-445df777c1e1"
}
}
}
Json:
{
"principalId": "f923e078-ca9d-4611-a80e-bebb712ad7d1",
"resourceId": "261eda4b-6eee-45ba-a176-259960603409",
"appRoleId": "d1c2ade8-98f8-45fd-aa4a-6d06b947c66f"
}
Post URL: https://graph.microsoft.com/v1.0/servicePrincipals/261eda4b-6eee-45ba-a176-259960603409/appRoleAssignments
GET Url to get the object id: https://graph.microsoft.com/v1.0/serviceprincipals?$select=id&$filter=displayName eq '{useracces}'
POST https://graph.microsoft.com/v1.0/servicePrincipals/{id}/appRoleAssignedTo
Content-Type: application/json
Content-Length: 110
{
"principalId": "principalId-value",
"resourceId": "resourceId-value",
"appRoleId": "appRoleId-value"
}
In this example, {id} and {resourceId-value} would both be the object id of the resource service principal, which is the enterprise app associated with the Azure AD app you have created appRoles in.
And {principalId-value} would be the object id of the user.
{appRoleId-value} is the id of the app role you created in manifest.
UPDATE:
The steps you used to get the object ID of the service principal are correct.
If you want to get it using Graph API, you can do it like this:
GET https://graph.microsoft.com/v1.0/serviceprincipals?$select=id&$filter=displayName eq '{app name}'
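With those pieces in place, a requests-based sketch of the corrected assignment call might look roughly like this; all IDs are placeholders, and the access token is assumed to have an appropriate permission such as AppRoleAssignment.ReadWrite.All:
import json
import requests

resource_sp_id = "<object ID of the resource service principal (enterprise app)>"
user_object_id = "<object ID of the user>"
app_role_id = "d1c2ade8-98f8-45fd-aa4a-6d06b947c66f"  # the app role ID from the manifest above
access_token = "<Microsoft Graph access token>"

url = "https://graph.microsoft.com/v1.0/servicePrincipals/" + resource_sp_id + "/appRoleAssignedTo"
headers = {"Authorization": "Bearer " + access_token, "Content-Type": "application/json"}
body = {
    "principalId": user_object_id,  # the user receiving the role
    "resourceId": resource_sp_id,   # the service principal that defines the role
    "appRoleId": app_role_id,
}
response = requests.post(url, headers=headers, data=json.dumps(body))
print(response.status_code, response.text)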
I'm looking for a Python 3 example of how to get an access token so I can import a CSV file from GCS into Cloud SQL from a Google Cloud Function.
Since it runs from a Cloud Function, the expectation was that the service account it runs under, or the service account of the Cloud SQL instance, would have access once granted, but that's not the case.
Response HTTP Response Body: {
"error": {
"code": 401,
"message": "Request is missing required authentication credential. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.",
"errors": [
{
"message": "Login Required.",
"domain": "global",
"reason": "required",
"location": "Authorization",
"locationType": "header"
}
],
"status": "UNAUTHENTICATED"
}
}
Below is the code; I'm curious if anyone has sample code showing how I can get it to authenticate.
response = requests.post(
url="https://www.googleapis.com/sql/v1beta4/projects/redacted-project/instances/redacted-instance/import",
headers={"Content-Type": "application/json; charset=utf-8"
},
data=json.dumps({
"importContext": {
"fileType": "CSV",
"csvImportOptions": {
"table": "service_data"
},
"uri": "gs://redacted-bucket/log/" + blob.name + "",
"database": "redacted-db"
}
})
)
print('Response HTTP Status Code: {status_code}'.format(status_code=response.status_code))
print('Response HTTP Response Body: {content}'.format(content=response.content))
You should use the google-api-python-client to construct a service for this API instead of trying to make a request directly. This will allow it to pick up the default service account for the Cloud Function:
from googleapiclient.discovery import build
service = build('sqladmin', 'v1beta4')
...
More details here: https://github.com/googleapis/google-api-python-client/blob/master/docs/start.md
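As a small illustration of that approach (assuming application default credentials are available in the Cloud Function runtime and that 'sqladmin' is the discovery name of the Cloud SQL Admin API), the explicit form would look roughly like this:
import google.auth
from googleapiclient.discovery import build

# Pick up the runtime's default service account credentials explicitly.
credentials, project_id = google.auth.default()
service = build('sqladmin', 'v1beta4', credentials=credentials)
# service.instances().import_(...) can then be called with an importContext body.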
1. From your Google Cloud Function, get auth tokens by querying the metadata server. This assumes that your Cloud Function runs under the default service account, which is the App Engine default service account and has the Editor role.
import requests
import json
METADATA_URL = 'http://metadata.google.internal/computeMetadata/v1/'
METADATA_HEADERS = {'Metadata-Flavor': 'Google'}
SERVICE_ACCOUNT = 'default'
def import_table(request):
url = '{}instance/service-accounts/{}/token'.format(
METADATA_URL, SERVICE_ACCOUNT)
# Request an access token from the metadata server.
r = requests.get(url, headers=METADATA_HEADERS)
r.raise_for_status()
# Extract the access token from the response.
access_token = r.json()["access_token"]
body = json.dumps({'importContext': {'fileType': 'CSV',
'csvImportOptions': {'table': 'your_table'},
'uri': 'gs://temprun/your_dump_file',
'database': 'your_database'}})
response = requests.post(
url="https://www.googleapis.com/sql/v1beta4/projects/your_project/instances/your_sql_instance/import",
headers={"Content-Type": "application/json; charset=utf-8",
"Authorization": "Bearer {}".format(access_token)
},
data=body)
return str(response)
2. Using the client library google-api-python-client:
def import_table(request):
from googleapiclient.discovery import build
service = build('sqladmin', 'v1beta4')
body = {'importContext': {'fileType': 'CSV',
'csvImportOptions': {'table': 'your_table'},
'uri': 'gs://temprun/your_dump_file',
'database': 'your_database'}}
service.instances().import_(project='your_project', instance='your_instance', body=body).execute()
return "Table was imported"
If successful, the response body contains an instance of Operation.
{'kind': 'sql#operation',
'targetLink': 'https://sqladmin.googleapis.com/sql/v1beta4/projects/your-project/instances/instance',
'status': 'PENDING',
'user': 'youraccount',
'insertTime': '2020-03-18T09:02:55.437Z',
'operationType': 'IMPORT',
'importContext': {'uri': 'gs://yourbucket/dumpfile',
'database': 'yourdatabase',
'kind': 'sql#importContext',
'fileType': 'CSV',
'csvImportOptions': {'table': 'sql-table'}},
'name': 'cdcd53d4-96fe-41cf-aee4-12cf6ec6394e',
'targetId': 'instance_name',
'selfLink': 'https://sqladmin.googleapis.com/sql/v1beta4/projects/project/operations/cdcd53d4-96fe-41cf-aee4-12cf6ec6394e',
'targetProject': 'your-project'}
From within Google Cloud Functions you can get auth tokens by querying the metadata server.
There is an easier option, however: use the Cloud SQL Client Library. This will automatically get auth tokens for you.
Both of these options will authenticate with the PROJECT_ID@appspot.gserviceaccount.com service account. You may need to grant that account permissions if you are doing cross-project calls, etc.
I have registered an application in AzureAD: AnalysisService
It has the following IDs:
Application (client) ID: ID1
Directory (tenant) ID: ID2
and I have defined the following permission for it:
My aim is to scale my Azure Analysis Services instance up and down in a Logic App with the following ID:
Subscription ID: ID3
In the Logic App I have the following request:
{
"uri": "https://management.azure.com/subscriptions/**ID3**/resourceGroups/ServerName/providers/Microsoft.AnalysisServices/servers/Model?api-version=2017-08-01",
"method": "PATCH",
"authentication": {
"tenant": "ID2",
"audience": "https://management.core.windows.net",
"clientId": "ID1",
"secret": "*sanitized*",
"type": "ActiveDirectoryOAuth"
},
"body": {
"sku": {
"capacity": 1,
"name": "S4",
"tier": "Standard"
},
"tags": {
"testKey": "testValue"
}
}
}
After sending this request I get the following error:
{
"statusCode": 403,
"headers": {
"Pragma": "no-cache",
"x-ms-failure-cause": "gateway",
"x-ms-request-id": "xxxxxx-4dea-xxx-xxxx-xxx",
"x-ms-correlation-request-id": "xxxxxxxx-4dea-xxxx-xxxx-5dea12ba0cca",
"x-ms-routing-request-id": "WESTEUROPE:20190211T181536Z:xxxxxx-4dea-4fa8-bccd-xxxxxx",
"Strict-Transport-Security": "max-age=31536000; includeSubDomains",
"X-Content-Type-Options": "nosniff",
"Connection": "close",
"Cache-Control": "no-cache",
"Date": "Mon, 11 Feb 2019 18:15:35 GMT",
"Content-Length": "413",
"Content-Type": "application/json; charset=utf-8",
"Expires": "-1"
},
"body": {
"error": {
"code": "AuthorizationFailed",
"message": "Client \"ID4\" with the object ID \"ID4\" has no permission to run the action \"Microsoft.AnalysisServices/servers/write\" over \"/subscriptions/ID3/resourceGroups/ServerName/providers/Microsoft.AnalysisServices/servers/ModelName\"."
}
}
}
What more should I do to solve this problem?
UPDATE
I have granted my service principal (which represents the Azure AD application) the following specific permissions over my Analysis Services instance:
I have also checked it in Management Studio as described here, and I can see the service principal there as well:
But I still get the same error message.
Is ID4 the object ID of my Logic App? Should I also add my Logic App to the IAM of my Analysis Services instance?
You should grant your service principal (which represents the Azure AD application) the specific permission Microsoft.AnalysisServices/servers/write over your Analysis Services instance: /subscriptions/ID3/resourceGroups/ServerName/providers/Microsoft.AnalysisServices/servers/ModelName. Alternatively, you can grant that permission at the resource group or subscription level.
Here's how you do it: https://learn.microsoft.com/en-us/azure/role-based-access-control/role-assignments-portal.
In short: go to the resource, click IAM on the left, click + Add role assignment at the top of the blade, and pick your role and identity. This particular permission falls under something like Analysis Services Contributor.
PS: you can always create a custom role to follow the least-privilege principle: https://learn.microsoft.com/en-us/azure/role-based-access-control/custom-roles
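If you would rather do the role assignment in code than in the portal, a rough sketch against the ARM REST API could look like the following; the subscription, resource group, server and object IDs are the placeholders used above, the role definition GUID is left as a placeholder (for example the built-in Contributor role or a custom role), and a valid ARM access token is assumed:
import json
import uuid
import requests

# Placeholders only; replace with your real values.
subscription_id = "ID3"
scope = ("/subscriptions/" + subscription_id
         + "/resourceGroups/ServerName/providers/Microsoft.AnalysisServices/servers/ModelName")
principal_object_id = "ID4"                      # object ID of the service principal from the error message
role_definition_guid = "<role definition GUID>"
access_token = "<ARM access token>"              # token for https://management.azure.com/

assignment_name = str(uuid.uuid4())  # role assignment names are GUIDs
url = ("https://management.azure.com" + scope
       + "/providers/Microsoft.Authorization/roleAssignments/" + assignment_name
       + "?api-version=2015-07-01")
body = {
    "properties": {
        "roleDefinitionId": "/subscriptions/" + subscription_id
                            + "/providers/Microsoft.Authorization/roleDefinitions/" + role_definition_guid,
        "principalId": principal_object_id,
    }
}
response = requests.put(url,
                        headers={"Authorization": "Bearer " + access_token,
                                 "Content-Type": "application/json"},
                        data=json.dumps(body))
print(response.status_code, response.text)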