Azure Databricks PAT token creation for an Azure Service Principal

I could not add an Azure AD Service Principal to Azure Databricks through the portal, but I was able to add my Service Principal with the help of the Databricks API endpoints. How can I create a PAT token for my Service Principal?

You can use the service principal to create an Azure Active Directory token and use that to authenticate to Databricks.
To create an AAD token:
curl -X POST -H 'Content-Type: application/x-www-form-urlencoded' \
-d 'grant_type=client_credentials&client_id=<client-id>&resource=2ff814a6-3304-4ab8-85cb-cd0e6f879c1d&client_secret=<application-secret>' \
https://login.microsoftonline.com/<tenant-id>/oauth2/token
Replace <client-id> and <application-secret> with the application ID and secret of your service principal, and <tenant-id> with your tenant ID.
The response will include the value of the access token:
{
"access_token": "<token value>"
}
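If you are scripting this, one way to capture the returned token into a variable is shown below. This is just a convenience sketch and assumes jq is installed:
# Hedged convenience: store the AAD token in a shell variable for later calls (assumes jq)
export AAD_TOKEN=$(curl -s -X POST -H 'Content-Type: application/x-www-form-urlencoded' \
-d 'grant_type=client_credentials&client_id=<client-id>&resource=2ff814a6-3304-4ab8-85cb-cd0e6f879c1d&client_secret=<application-secret>' \
https://login.microsoftonline.com/<tenant-id>/oauth2/token | jq -r .access_token)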
Since you have already added the service principal to the Databricks workspace, you can now directly use the generated token to invoke the Databricks REST endpoints as the service principal:
curl -X GET \
-H 'Authorization: Bearer <token-value>' \
https://<databricks-instance>/api/2.0/clusters/list
You can also create additional tokens for the service principal using the Databricks Token API:
curl -X POST -H 'Authorization: Bearer <token-value>' \
--data '{ "comment": "This is an example token", "lifetime_seconds": 7776000 }' \
https://<databricks-instance>/api/2.0/token/create
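If you only need the new PAT value for scripting, the same call can be piped through jq. This is a minimal sketch, assuming the usual token/create response shape (a token_value field plus token_info metadata):
# Hedged sketch: extract only the generated PAT from the token/create response
curl -s -X POST -H 'Authorization: Bearer <token-value>' \
--data '{ "comment": "This is an example token", "lifetime_seconds": 7776000 }' \
https://<databricks-instance>/api/2.0/token/create | jq -r .token_value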
More details are available here.

Note: You add the Azure AD service principal to a workspace using the SCIM API.
Unfortunately, you cannot create an Azure Databricks token programmatically.
You’ll use an Azure Databricks personal access token (PAT) to authenticate against the Databricks REST API. To create a PAT that can be used to make API requests:
Go to your Azure Databricks workspace.
Click the user icon in the top-right corner of the screen and click User Settings.
Click Access Tokens > Generate New Token.
Copy and save the token value.
Even when creating tokens via the API, the initial authentication is the same as for all of the Azure Databricks API endpoints: you must first authenticate as described in Authentication.
For more details, refer to Tutorial: Run a job with an Azure service principal.

After adding the service principal to Databricks, you can use curl to create a Databricks token/PAT for the service principal.
curl -X POST \
${DATABRICKS_HOST}/api/2.0/token-management/on-behalf-of/tokens \
--header "Content-type: application/json" \
--header "Authorization: Bearer ${DATABRICKS_TOKEN}" \
--data @create-service-principal-token.json \
| jq .
https://docs.databricks.com/dev-tools/service-principals.html#step-2-create-the-databricks-access-token-for-the-databricks-service-principal
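The create-service-principal-token.json referenced in the command is not shown above. As a hedged sketch of its usual shape (the service principal's application ID, a comment, and a lifetime), it could be created like this:
# Hypothetical payload file for the on-behalf-of token request; adjust the values
cat > create-service-principal-token.json <<'EOF'
{
  "application_id": "<application-id>",
  "comment": "PAT for the service principal",
  "lifetime_seconds": 7776000
}
EOF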

Related

Azure IAM: Trigger external security challenge in OAuth2 with curl

I'm trying to obtain the access_token for my user in Azure IAM via OAuth2. I've built a curl command this way:
curl \
-H "Content-Type: application/x-www-form-urlencoded" \
-d "scope=openid" \
-d "response_type=id_token+access_token" \
-d "grant_type=password" \
-d "client_id=${MY_APP_ID}" \
-d "username=${MY_USER}" \
-d "password=${MY_PASS}' \
'https://login.microsoftonline.com/${MY_TENANT_ID}/oauth2/v2.0/token'
However I'm getting:
{"error":"invalid_grant","error_description":"AADSTS50158: External security challenge not satisfied. User will be redirected to another page or authentication provider to satisfy additional authentication challenges...
We use MFA, however the curl call is not triggering it. What's the proper way to get the access_token in the MFA-backed OAuth2 flow?
I tried to reproduce the same in my environment and got the following results.
I created a user and enabled MFA.
I tried to generate the tokens in Postman using the ROPC flow with the below parameters and got a similar error:
https://login.microsoftonline.com/TenantID/oauth2/v2.0/token
client_id:clientID
scope:openid
grant_type:password
username:username
password:password
Note that the ROPC grant type doesn't support MFA-enabled users; such requests will be blocked instead. Refer to this MsDoc.
To get the access_token in an MFA-backed OAuth2 flow, you can make use of the Authorization Code Flow like below:
Make use of the below endpoint to generate the auth code:
https://login.microsoftonline.com/TenantID/oauth2/v2.0/authorize?
&client_id=ClientID
&response_type=code
&redirect_uri=RedirectURI
&response_mode=query
&scope=openid
&state=12345
I signed in with the MFA-enabled user to generate the code.
I then generated the tokens by making use of the below parameters:
https://login.microsoftonline.com/TenantID/oauth2/v2.0/token
client_id:ClientID
client_secret:ClientSecret
scope:openid
grant_type:authorization_code
redirect_uri:RedirectURI
code:code
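For reference, the same token exchange outside Postman might look roughly like this curl call (all values are placeholders):
# Hedged example: exchange the authorization code for tokens (placeholder values)
curl -X POST "https://login.microsoftonline.com/<TenantID>/oauth2/v2.0/token" \
-H "Content-Type: application/x-www-form-urlencoded" \
-d "client_id=<ClientID>" \
-d "client_secret=<ClientSecret>" \
-d "scope=openid" \
-d "grant_type=authorization_code" \
-d "redirect_uri=<RedirectURI>" \
-d "code=<code>"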
You can also make use of the Implicit grant flow for MFA-enabled users. Refer to this MsDoc.

Dialogflow Bearer Token Analysis

How do I get a bearer token for a Dialogflow v2beta1 API call?
I want to integrate the Dialogflow APIs, but I can't even test the APIs in Postman without a bearer token. For testing I have generated an API key for my agent in the GCP project, but I didn't find any solution for getting a bearer token.
POST https://dialogflow.googleapis.com/v2beta1/[PARENT]/intents?key=[YOUR_API_KEY] HTTP/1.1
Authorization: Bearer [YOUR_ACCESS_TOKEN]
Accept: application/json
Content-Type: application/json
I guess you already have a Service Account with proper permissions on the project/product/resource. If not, you can find a guide on how to create it in Creating and managing service accounts.
Regarding the Bearer Token, you should read about it in Authenticating as a service account.
If you have a Service Account with proper access and its key.json, you can use a Bearer token.
In the GCP console you can print the default token using this command:
### for default SA
$ gcloud auth application-default print-access-token
### for other SA
$ gcloud auth print-access-token SA_NAME@PROJECT_ID.iam.gserviceaccount.com
More details can be found in these docs.
A request for the default SA should look like this:
curl -X POST "https://dialogflow.googleapis.com/v2beta1/projects/<project-id>/agent:train" \
-H "Authorization: Bearer $(gcloud auth application-default print-access-token)"
For a one-time request as a specific service account, use the below example:
curl -X POST "https://dialogflow.googleapis.com/v2beta1/projects/<project-id>/agent:train" \
-H "Authorization: Bearer $(gcloud auth print-access-token <YourSAaccount>)"
An SA account might look like: <SAname>@<projectID>.iam.gserviceaccount.com
Please keep in mind that this SA must be active. You can activate it using the following command:
$ gcloud auth activate-service-account SA_NAME@PROJECT_ID.iam.gserviceaccount.com --key-file=/path/to/SAkey/key.json
### or using just the key
$ gcloud auth activate-service-account --key-file=/path/to/SAkey/key.json
The command for listing active service accounts is:
$ gcloud auth list
I chose a random POST method from the Dialogflow API.
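If you want to hit the intents endpoint from the original question instead, here is a hedged sketch; the project ID and the minimal request body with just a displayName are assumptions on my part:
# Hedged sketch: create an intent using the default service account's bearer token
curl -X POST "https://dialogflow.googleapis.com/v2beta1/projects/<project-id>/agent/intents" \
-H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
-H "Content-Type: application/json" \
-d '{"displayName": "test-intent"}'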

How to connect to Azure Databricks' Hive using SQLAlchemy from a third-party app using a service principal?

I want to connect Superset to Databricks for querying the tables. Superset uses SQLAlchemy to connect to databases, which requires a PAT (Personal Access Token) for access.
It is possible to connect and run queries when I use the PAT I generated on my account through the Databricks web UI, but I do not want to use my personal token in a production environment. Even so, I was not able to find how to generate a PAT-like token for a Service Principal.
The working SQLAlchemy URI looks like this:
databricks+pyhive://token:XXXXXXXXXX@aaa-111111111111.1.azuredatabricks.net:443/default?http_path=sql%2Fprotocolv1%qqq%wwwwwwwwwww1%eeeeeeee-1111111-foobar00
After checking the Azure docs, there are two ways to run queries between Databricks and another service:
Create a PAT for a Service Principal to be associated with Superset.
Create a user AD account for Superset.
For the first and preferred method, I was able to advance, but I was not able to generate the Service Principal's PAT:
I was able to register an app on Azure's AD.
So I got the tenant ID, client ID and create a secret for the registered app.
With this info, I was able to curl Azure and receive a JWT token for that app.
But all the tokens referred to in the docs are OAuth2 JWT tokens, which do not seem to work with the SQLAlchemy URI.
I know it's possible to generate a PAT for a Service Principal, since the documentation mentions how to read, update and delete a Service Principal's PAT. But it has no information on how to create a PAT for a Service Principal.
I prefer to avoid using the second method (creating an AD user for Superset) since I am not allowed to create/manage users for the AD.
In summary, I have a working SQLAlchemy URI, but I want to use a generated token, associated with a Service Principal, instead of using my PAT. But I can't find how to generate that token (I only found documentation on how to generate OAUTH2 tokens).
You can create a PAT for a service principal as follows (examples are taken from the docs; do export DATABRICKS_HOST="https://hostname" before executing):
Add the service principal to the Databricks workspace using the SCIM API (doc):
curl -X POST "$DATABRICKS_HOST/api/2.0/preview/scim/v2/ServicePrincipals" \
--header 'Content-Type: application/scim+json' \
--header 'Authorization: Bearer <personal-access-token>' \
--data-raw '{
  "schemas": [
    "urn:ietf:params:scim:schemas:core:2.0:ServicePrincipal"
  ],
  "applicationId": "<application-id>",
  "displayName": "test-sp",
  "entitlements": [
    {
      "value": "allow-cluster-create"
    }
  ]
}'
Get an AAD token for the service principal (doc; another option is to use az cli):
export DATABRICKS_TOKEN=$(curl -X POST -H 'Content-Type: application/x-www-form-urlencoded' \
-d 'grant_type=client_credentials&client_id=<client-id>&resource=2ff814a6-3304-4ab8-85cb-cd0e6f879c1d&client_secret=<application-secret>' \
https://login.microsoftonline.com/<tenant-id>/oauth2/token | jq -r .access_token)
Generate the Databricks token using the AAD token (doc):
curl -s -n -X POST "$DATABRICKS_HOST/api/2.0/token/create" --data-raw '{
"lifetime_seconds": 100,
"comment": "token for superset"
}' -H "Authorization: Bearer $DATABRICKS_TOKEN"

Access token invalid after configuring Microsoft Azure Active Directory for Snowflake External OAuth

I was trying to configure Microsoft Azure AD for External OAuth as per the Snowflake tutorial: https://docs.snowflake.com/en/user-guide/oauth-azure.html
The configuration steps went ahead without a hitch and I was able to use the final step (https://docs.snowflake.com/en/user-guide/oauth-azure.html#testing-procedure) to obtain the access token from AAD.
However, when I tried to use the access token with Snowflake through the JDBC driver, I got the error: "net.snowflake.client.jdbc.SnowflakeSQLException: Invalid OAuth access token."
The Snowflake integration created is of the form:
create security integration ext_oauth_azure_ad
type = external_oauth
enabled = true
external_oauth_type = azure
external_oauth_issuer = '<issuer-url>'
external_oauth_jws_keys_url = '<keys-url>/discovery/v2.0/keys'
external_oauth_audience_list = ('https://<app-id-uri>')
external_oauth_token_user_mapping_claim = 'upn'
external_oauth_snowflake_user_mapping_attribute = 'login_name'
external_oauth_any_role_mode = 'ENABLE';
I tried playing around with this config by changing the external_oauth_token_user_mapping_claim to email, since that was the attribute in the decoded JWT access token that matched the login_name, but to no avail.
The scope provided in AD is session:role-any, which should allow any role.
Not sure how to proceed after this.
Edit:
The command used to obtain access token is:
curl -X POST -H "Content-Type: application/x-www-form-urlencoded;charset=UTF-8" --data-urlencode "client_id=<ad-client-id>" --data-urlencode "client_secret=<ad-client-secret>" --data-urlencode "username=<ad-user-email>" --data-urlencode "password=<my-password>" --data-urlencode "grant_type=password" --data-urlencode "scope=<scope-as-in-ad>" 'https://login.microsoftonline.com/<tenant-id>/oauth2/v2.0/token'
Update:
Tried using the command:
select system$verify_external_oauth_token('<access_token>');
to validate if the token was valid in Snowflake and obtained the result:
Token Validation finished.{"Validation Result":"Failed","Failure Reason":"EXTERNAL_OAUTH_INVALID_SIGNATURE"}
This is strange because I have added the correct issuer based on the configuration step (entityId from the Federation metadata document).
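One way to narrow down an EXTERNAL_OAUTH_INVALID_SIGNATURE or claim-mapping problem is to decode the token payload and compare iss, aud and the mapping claim against the security integration settings. A hedged shell sketch follows; it assumes jq is available, and the upn/email claims may or may not be present in your token:
# Hedged diagnostic: decode the JWT payload (second dot-separated segment) and inspect claims
TOKEN="<access_token>"
payload=$(printf '%s' "$TOKEN" | cut -d '.' -f2 | tr '_-' '/+')
# Pad base64url to a multiple of 4 before decoding
while [ $(( ${#payload} % 4 )) -ne 0 ]; do payload="${payload}="; done
printf '%s' "$payload" | base64 -d | jq '{iss, aud, upn, email}'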

Authorization failing with 401 while trying to access Azure REST APIs

I'm trying to call a built-in Azure API with bearer token generation. The bearer token is generated using "https://login.microsoftonline.com/{tenantID}/oauth2/token", and with this token I'm trying to access the get-device API from IoT Hub. The headers I am providing for the REST API call are Content-Type and Authorization (with the bearer token). But it is returning an error message as below:
"Message":"ErrorCode:IotHubUnauthorized;3cc43d2f-def7-4a3e-a2ue-eb367467ab90 is not valid"
Can anyone please help me in solving this?
To connect to your IoT Hub's service API, you need a shared access token, not an OAuth2 token. You can generate the token you need to set in your header through the az CLI:
az iot hub generate-sas-token -n {iothub_name}
If you like a more visual approach, you can use the Device Explorer. You can simply enter your IoT Hub connection string with the service connect or iothubowner rights and generate the token.
You can then use the service endpoints of your IoT Hub, here's an example curl request:
curl --request GET \
https://<hub-name>.azure-devices.net/devices?api-version=2018-06-30 \
--header "Authorization: SharedAccessSignature sr=<hub-name>.azure-devices.net&sig=KSobATNRdkFtd999999990v7NYU4hitkTA3ts%3D&se=1626508840&skn=iothubowner"
