ApplicationsOperations object construction - azure

I want to automate application creation in Azure with Python; my goal is to execute it with AWS Lambda.
I have found the ApplicationsOperations class, but I don't understand how to use it.
The client parameter is fine with a GraphRbacManagementClient object,
but I don't know how to construct the config, serializer and deserializer parameters.
Does anyone here have a code sample for ApplicationsOperations?

You don't use it directly; you create a GraphRbacManagementClient and use its "applications" attribute:
https://learn.microsoft.com/en-us/python/api/overview/azure/graph-rbac?view=azure-python
from azure.graphrbac import GraphRbacManagementClient
from azure.common.credentials import UserPassCredentials

credentials = UserPassCredentials(
    'user@domain.com',  # Your user
    'my_password',      # Your password
    resource="https://graph.windows.net"
)

tenant_id = "myad.onmicrosoft.com"

graphrbac_client = GraphRbacManagementClient(
    credentials,
    tenant_id
)

apps = list(graphrbac_client.applications.list(
    filter="displayName eq 'pytest_app'"
))
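Since the goal is to automate application creation, the same client exposes a create operation through that attribute; a minimal sketch, assuming the models from azure-graphrbac (the display name and identifier URI are placeholders):

from azure.graphrbac.models import ApplicationCreateParameters

# Hedged sketch: create an AAD application through the same "applications"
# attribute. The display name and identifier URI are placeholder values.
app = graphrbac_client.applications.create(
    ApplicationCreateParameters(
        available_to_other_tenants=False,
        display_name="pytest_app",
        identifier_uris=["http://pytest_app.org"]
    )
)
print(app.object_id)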

Query on Microsoft Graph API Python

I want to pull emails from a client's inbox via the Graph API using Python.
I started with a tutorial and successfully experimented on my personal inbox.
My problem:
Every time, my code generates an authorization URL.
I have to browse to it (using the webbrowser library), sign in with my credentials, and copy-paste the authorization code to generate an access token.
That is a lot of manual work every time.
Question:
Is there a way to automate the whole process of token generation?
Such that my client only shares his application id and client secret, and email is pulled without his sign-in credentials?
My code is attached below:
import msal
import webbrowser
import requests
import pandas as pd

APPLICATION_ID = "app id"
CLIENT_SECRET = "client secret"
authority_url = 'https://login.microsoftonline.com/common/'
base_url = 'https://graph.microsoft.com/v1.0/'
endpoint_url = base_url + 'me'
SCOPES = ['Mail.Read', 'Mail.ReadBasic']

client_instance = msal.ConfidentialClientApplication(
    client_id=APPLICATION_ID,
    client_credential=CLIENT_SECRET,
    authority=authority_url
)
authorization_request_url = client_instance.get_authorization_request_url(SCOPES)
# print(authorization_request_url)

# Browse the authorization request URL to retrieve the authorization code.
webbrowser.open(authorization_request_url, new=True)

# Manually paste the authorization code.
authorization_code = 'authorization code from authorization URL'
token_response = client_instance.acquire_token_by_authorization_code(
    code=authorization_code,
    scopes=SCOPES
)
access_token_id = token_response['access_token']
# The rest of the code hits the endpoint and retrieves the messages.
Any help with code suggestions would be much appreciated. Thanks in advance.
If you would like to authenticate only with a clientId and clientSecret, without any user context, you should leverage a client credentials flow.
You can check this official MS sample that uses the same MSAL library to handle the client credentials flow. It is quite straightforward, as you can see below:
import sys  # For simplicity, we'll read the config file from the 1st CLI param sys.argv[1]
import json
import logging
import requests
import msal

# Optional logging
# logging.basicConfig(level=logging.DEBUG)

config = json.load(open(sys.argv[1]))

# Create a preferably long-lived app instance which maintains a token cache.
app = msal.ConfidentialClientApplication(
    config["client_id"], authority=config["authority"],
    client_credential=config["secret"],
    # token_cache=...  # Default cache is in memory only.
    # You can learn how to use SerializableTokenCache from
    # https://msal-python.rtfd.io/en/latest/#msal.SerializableTokenCache
)

# The pattern to acquire a token looks like this.
result = None

# Firstly, look up a token from the cache.
# Since we are looking for a token for the current app, NOT for an end user,
# notice we give the account parameter as None.
result = app.acquire_token_silent(config["scope"], account=None)

if not result:
    logging.info("No suitable token exists in cache. Let's get a new one from AAD.")
    result = app.acquire_token_for_client(scopes=config["scope"])

if "access_token" in result:
    # Calling Graph using the access token
    graph_data = requests.get(  # Use token to call downstream service
        config["endpoint"],
        headers={'Authorization': 'Bearer ' + result['access_token']},
    ).json()
    print("Graph API call result: ")
    print(json.dumps(graph_data, indent=2))
else:
    print(result.get("error"))
    print(result.get("error_description"))
    print(result.get("correlation_id"))  # You may need this when reporting a bug
The sample retrieves a list of users from MS Graph, but it should be just a matter of adapting it to retrieve the list of emails of a specific user, by changing the "endpoint" parameter in the parameters.json file to:
"endpoint": "https://graph.microsoft.com/v1.0/users/{id | userPrincipalName}/messages"
You can check here for more information regarding the MS Graph request to list emails.
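For illustration, a minimal hedged sketch of the adapted call, where the UPN is a placeholder and the app is assumed to hold the Mail.Read application permission with admin consent:

import requests

# Hedged sketch: app-only call to the messages endpoint using the token
# acquired above; "user@contoso.com" is a placeholder UPN.
endpoint = "https://graph.microsoft.com/v1.0/users/user@contoso.com/messages"
messages = requests.get(
    endpoint,
    headers={"Authorization": "Bearer " + result["access_token"]},
).json()
print(messages)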
Register your app.
Get your tenant id from the Azure portal and disable MFA. This uses the resource owner password credentials (ROPC) grant, so the account must not require 2FA:
import requests

application_id = "xxxxxxxxxx"
client_secret = "xxxxxxxxxxxxx"
authority_url = 'xxxxxxxxxxxxxxxxxxxx'
base_url = "https://graph.microsoft.com/v1.0/"
endpoint = base_url + "me"
scopes = ["User.Read"]
tenant_id = "xxxxxxxxxxxx"

token_url = 'https://login.microsoftonline.com/' + tenant_id + '/oauth2/token'
token_data = {
    'grant_type': 'password',
    'client_id': application_id,
    'client_secret': client_secret,
    'resource': 'https://graph.microsoft.com',
    'scope': 'https://graph.microsoft.com',
    'username': 'xxxxxxxxxxxxxxxx',  # Account with no 2FA
    'password': 'xxxxxxxxxxxxxxxx',
}
token_r = requests.post(token_url, data=token_data)
token = token_r.json().get('access_token')
print(token)
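A short follow-up sketch, assuming you then want to call the /me endpoint defined above with that token:

# Hedged sketch: use the ROPC token from above to call Graph.
me = requests.get(
    endpoint,
    headers={'Authorization': 'Bearer ' + token},
).json()
print(me)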

Authorization for Airflow 2.0 and Azure AD using custom role - Override oauth_user_info method in AirflowSecurityManager

I am deploying Airflow 2.0 on Azure and using Azure AD for authentication and authorization.
I created an app registration and custom roles. Authentication with AD works; now I need to read the role claims in the token to do the role mapping. Below is my web config. For this I am overriding the oauth_user_info method in AirflowSecurityManager, to read the roles claim in the Azure AD token and assign it to role_keys.
The problem is that when I do this and set SECURITY_MANAGER_CLASS = AzureCustomSecurity, the web UI pod comes up without any error, but when I test it, the custom class method oauth_user_info is apparently never called. If I remove the custom class, authentication works and I can see the token in the logs. So the custom class is not invoked in the authentication and authorization process. Can anyone help here?
I can see a similar kind of thing working for AWS Cognito in a post here:
AWS Cognito OAuth configuration for Flask Appbuilder
This is my web config:
import os
import json
import logging
from airflow.configuration import conf
from flask_appbuilder.security.manager import AUTH_OAUTH
from airflow.www.security import AirflowSecurityManager

log = logging.getLogger(__name__)

SQLALCHEMY_DATABASE_URI = conf.get("core", "SQL_ALCHEMY_CONN")
basedir = os.path.abspath(os.path.dirname(__file__))
CSRF_ENABLED = True
AUTH_TYPE = AUTH_OAUTH
AUTH_USER_REGISTRATION_ROLE = "Public"
AUTH_USER_REGISTRATION = True

class AzureCustomSecurity(AirflowSecurityManager):
    def oauth_user_info(self, provider, resp=None):
        log.debug("inside custom method")
        if provider == "azure":
            log.debug("Azure response received: {0}".format(resp))
            id_token = resp["id_token"]
            log.debug(str(id_token))
            me = self._azure_jwt_token_parse(id_token)
            log.debug("Parsed JWT token: {0}".format(me))
            return {
                "name": me["name"],
                "email": me["upn"],
                "first_name": me["given_name"],
                "last_name": me["family_name"],
                "id": me["oid"],
                "username": me["oid"],
                "role_keys": me["roles"],
            }
        else:
            return {}

OAUTH_PROVIDERS = [{
    'name': 'azure',
    'token_key': 'access_token',
    'icon': 'fa-windows',
    'remote_app': {
        'base_url': 'https://graph.microsoft.com/v1.0/',
        'request_token_params': {'scope': 'openid'},
        'access_token_url': 'https://login.microsoftonline.com/<tenantid>/oauth2/token',
        'authorize_url': 'https://login.microsoftonline.com/<tenantid>/oauth2/authorize',
        'request_token_url': None,
        'client_id': 'XXXXXXXX',
        'client_secret': '******'
    }
}]

# a mapping from the values of `userinfo["role_keys"]` to a list of FAB roles
AUTH_ROLES_MAPPING = {
    "Viewer": ["Viewer"],
    "Admin": ["Admin"],
}
AUTH_ROLES_SYNC_AT_LOGIN = True
SECURITY_MANAGER_CLASS = AzureCustomSecurity
I tried to reproduce your code, but instead of overriding oauth_user_info I overrode the get_oauth_user_info method (docs link) and it worked. Can you try this?
class AzureCustomSecurity(AirflowSecurityManager):
    def get_oauth_user_info(self, provider, resp):
        """your code here..."""

How to refresh the boto3 credentials when a python script is running indefinitely

I am trying to write a python script that uses watchdog to look for file creation and uploads the file to S3 using boto3. However, my boto3 credentials expire every 12 hours, so I need to renew them. I store them in ~/.aws/credentials. Right now I am trying to catch the S3UploadFailedError, renew the credentials, and write them to ~/.aws/credentials. But although the credentials are getting renewed and I am calling boto3.client('s3') again, it still throws the exception.
What am I doing wrong? Or how can I resolve it?
Below is the code snippet
try:
    s3 = boto3.client('s3')
    s3.upload_file(event.src_path, 'bucket-name', event.src_path)
except boto3.exceptions.S3UploadFailedError as e:
    print(e)
    get_aws_credentials()
    s3 = boto3.client('s3')
I have found a good example of refreshing the credentials in this link:
https://pritul95.github.io/blogs/boto3/2020/08/01/refreshable-boto3-session/
but there is a little bug inside. Be careful about that.
Here is the corrected code:
from uuid import uuid4
from datetime import datetime
from time import time

from boto3 import Session
from botocore.credentials import RefreshableCredentials
from botocore.session import get_session


class RefreshableBotoSession:
    """
    Boto helper class which lets us create a refreshable session, so that we can cache the client or resource.

    Usage
    -----
    session = RefreshableBotoSession().refreshable_session()
    client = session.client("s3")  # we can now cache this client object without worrying about expiring credentials
    """

    def __init__(
        self,
        region_name: str = None,
        profile_name: str = None,
        sts_arn: str = None,
        session_name: str = None,
        session_ttl: int = 3000
    ):
        """
        Initialize `RefreshableBotoSession`

        Parameters
        ----------
        region_name : str (optional)
            Default region when creating a new connection.
        profile_name : str (optional)
            The name of a profile to use.
        sts_arn : str (optional)
            The role arn to assume via STS before creating the session.
        session_name : str (optional)
            An identifier for the assumed role session (required when `sts_arn` is given).
        session_ttl : int (optional)
            TTL in seconds for each session; beyond this, the token is renewed.
            50 minutes by default, which is before the default role expiration of 1 hour.
        """
        self.region_name = region_name
        self.profile_name = profile_name
        self.sts_arn = sts_arn
        self.session_name = session_name or uuid4().hex
        self.session_ttl = session_ttl

    def __get_session_credentials(self):
        """
        Get session credentials
        """
        session = Session(region_name=self.region_name, profile_name=self.profile_name)

        # if sts_arn is given, get credentials by assuming the given role
        if self.sts_arn:
            sts_client = session.client(service_name="sts", region_name=self.region_name)
            response = sts_client.assume_role(
                RoleArn=self.sts_arn,
                RoleSessionName=self.session_name,
                DurationSeconds=self.session_ttl,
            ).get("Credentials")

            credentials = {
                "access_key": response.get("AccessKeyId"),
                "secret_key": response.get("SecretAccessKey"),
                "token": response.get("SessionToken"),
                "expiry_time": response.get("Expiration").isoformat(),
            }
        else:
            session_credentials = session.get_credentials().__dict__
            credentials = {
                "access_key": session_credentials.get("access_key"),
                "secret_key": session_credentials.get("secret_key"),
                "token": session_credentials.get("token"),
                "expiry_time": datetime.fromtimestamp(time() + self.session_ttl).isoformat(),
            }

        return credentials

    def refreshable_session(self) -> Session:
        """
        Get a refreshable boto3 session.
        """
        # get refreshable credentials
        refreshable_credentials = RefreshableCredentials.create_from_metadata(
            metadata=self.__get_session_credentials(),
            refresh_using=self.__get_session_credentials,
            method="sts-assume-role",
        )

        # attach refreshable credentials to the current session
        session = get_session()
        session._credentials = refreshable_credentials
        session.set_config_variable("region", self.region_name)
        autorefresh_session = Session(botocore_session=session)

        return autorefresh_session
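A minimal usage sketch, with a hypothetical role ARN; the cached client keeps working because botocore refreshes the credentials behind the scenes:

# Hedged usage sketch: cache one session/client for the process lifetime.
# The role ARN, bucket and file names below are hypothetical placeholders.
session = RefreshableBotoSession(
    sts_arn="arn:aws:iam::123456789012:role/my-upload-role",
    region_name="us-east-1",
).refreshable_session()

s3 = session.client("s3")
s3.upload_file("local.txt", "bucket-name", "remote.txt")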
According to the documentation, the client looks in several locations for credentials, and there are other, more programmatic options that you might want to consider instead of the ~/.aws/credentials file.
Quoting the docs:
The order in which Boto3 searches for credentials is:
Passing credentials as parameters in the boto3.client() method
Passing credentials as parameters when creating a Session object
Environment variables
Shared credential file (~/.aws/credentials)
AWS config file (~/.aws/config)
Assume Role provider
In your case, since you are already catching the exception and renewing the credentials, I would simply pass the new ones to a new instance of the client like so:
client = boto3.client(
    's3',
    aws_access_key_id=NEW_ACCESS_KEY,
    aws_secret_access_key=NEW_SECRET_KEY,
    aws_session_token=NEW_SESSION_TOKEN
)
If instead you are using these same credentials elsewhere in the code to create other clients, I'd consider setting them as environment variables:
import os
os.environ['AWS_ACCESS_KEY_ID'] = NEW_ACCESS_KEY
os.environ['AWS_SECRET_ACCESS_KEY'] = NEW_SECRET_KEY
os.environ['AWS_SESSION_TOKEN'] = NEW_SESSION_TOKEN
Again, quoting the docs:
The session key for your AWS account [...] is only needed when you are using temporary credentials.
Here is my implementation, which only generates new credentials if the existing credentials expire, using a singleton design pattern:
import boto3
from datetime import datetime
from dateutil.tz import tzutc
import os
import binascii


class AssumeRoleProd:
    __credentials = None

    def __init__(self):
        # Not meant to be instantiated.
        assert True == False

    @staticmethod
    def __setCredentials():
        print("\n\n ======= GENERATING NEW SESSION TOKEN ======= \n\n")
        # create an STS client object that represents a live connection to the
        # STS service
        sts_client = boto3.client('sts')

        # Call the assume_role method of the STSConnection object and pass the role
        # ARN and a role session name.
        assumed_role_object = sts_client.assume_role(
            RoleArn=your_role_here,
            RoleSessionName=f"AssumeRoleSession{binascii.b2a_hex(os.urandom(15)).decode('UTF-8')}"
        )

        # From the response that contains the assumed role, get the temporary
        # credentials that can be used to make subsequent API calls
        AssumeRoleProd.__credentials = assumed_role_object['Credentials']

    @staticmethod
    def getTempCredentials():
        credsExpired = False

        # Generate credentials the first time
        if AssumeRoleProd.__credentials is None:
            AssumeRoleProd.__setCredentials()
            credsExpired = True
        # Regenerate if only 5 minutes are left before expiry. You may set up
        # for the entire 60 minutes by catching the botocore ClientException instead.
        elif (AssumeRoleProd.__credentials['Expiration'] - datetime.now(tzutc())).seconds // 60 <= 5:
            AssumeRoleProd.__setCredentials()
            credsExpired = True

        return credsExpired, AssumeRoleProd.__credentials
Then I am using the singleton pattern for the client as well, so a new client is generated only when a new session is generated. You can add a region as well if required.
class lambdaClient:
    __prodClient = None

    def __init__(self):
        # Not meant to be instantiated.
        assert True == False

    @staticmethod
    def __initProdClient():
        credsExpired, credentials = AssumeRoleProd.getTempCredentials()
        if lambdaClient.__prodClient is None or credsExpired:
            lambdaClient.__prodClient = boto3.client(
                'lambda',
                aws_access_key_id=credentials['AccessKeyId'],
                aws_secret_access_key=credentials['SecretAccessKey'],
                aws_session_token=credentials['SessionToken'])
        return lambdaClient.__prodClient

    @staticmethod
    def getProdClient():
        return lambdaClient.__initProdClient()
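A short usage sketch under those definitions:

# Hedged usage sketch: every call goes through the singleton accessor, so a
# fresh client is only built when the cached credentials are near expiry.
client = lambdaClient.getProdClient()
response = client.list_functions()
print(response['Functions'])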

How to send a GraphQL query to AppSync from python?

How do we post a GraphQL request through AWS AppSync using boto?
Ultimately I'm trying to mimic a mobile app accessing our stackless/cloudformation stack on AWS, but with Python, not JavaScript or Amplify.
The primary pain point is authentication; I've tried a dozen different ways already. The current one generates a "401" response with "UnauthorizedException" and "Permission denied", which is actually pretty good considering some of the other messages I've had. I'm now using the aws_requests_auth library to do the signing part. I assume it authenticates me using the stored ~/.aws/credentials from my local environment, or does it?
I'm a little confused as to where and how Cognito identities and pools come into it, e.g. say I wanted to mimic the sign-up sequence?
Anyway, the code looks pretty straightforward; I just don't grok the authentication.
import requests
from aws_requests_auth.boto_utils import BotoAWSRequestsAuth

APPSYNC_API_KEY = 'inAppsyncSettings'
APPSYNC_API_ENDPOINT_URL = 'https://aaaaaaaaaaaavzbke.appsync-api.ap-southeast-2.amazonaws.com/graphql'

headers = {
    'Content-Type': "application/graphql",
    'x-api-key': APPSYNC_API_KEY,
    'cache-control': "no-cache",
}

query = """{
    GetUserSettingsByEmail(email: "john#washere"){
        items {name, identity_id, invite_code}
    }
}"""

def test_stuff():
    # Use the library to generate auth headers.
    auth = BotoAWSRequestsAuth(
        aws_host='aaaaaaaaaaaavzbke.appsync-api.ap-southeast-2.amazonaws.com',
        aws_region='ap-southeast-2',
        aws_service='appsync')

    # Create an http graphql request.
    response = requests.post(
        APPSYNC_API_ENDPOINT_URL,
        json={'query': query},
        auth=auth,
        headers=headers)
    print(response)

    # this didn't work:
    # response = requests.post(APPSYNC_API_ENDPOINT_URL, data=json.dumps({'query': query}), auth=auth, headers=headers)
Yields
{
    "errors": [{
        "errorType": "UnauthorizedException",
        "message": "Permission denied"
    }]
}
It's quite simple--once you know. There are some things I didn't appreciate:
I've assumed IAM authentication (OpenID appended way below)
There are a number of ways for appsync to handle authentication. We're using IAM so that's what I need to deal with, yours might be different.
Boto doesn't come into it.
We want to issue a request like any regular punter, they don't use boto, and neither do we. Trawling the AWS boto docs was a waste of time.
Use the AWS4Auth library
We are going to send a regular HTTP request to AWS, so whilst we can use python requests, it needs to be authenticated by attaching headers. And, of course, AWS auth headers are special and different from all others.
You can try to work out how to do it yourself, or you can go looking for someone else who has already done it: aws_requests_auth, the one I started with, probably works just fine, but I ended up with AWS4Auth. There are many others of dubious value; none endorsed or provided by Amazon (that I could find).
Specify appsync as the "service"
What service are we calling? I didn't find any examples of anyone doing this anywhere; all the examples are trivial S3 or EC2 or even EB, which left uncertainty. Should we be talking to the api-gateway service? What's more, you feed this detail into the AWS4Auth routine as the authentication data. Obviously, in hindsight, the request hits AppSync, so it will be authenticated by AppSync: specify "appsync" as the service when putting together the auth headers.
It comes together as:
import requests
from requests_aws4auth import AWS4Auth

# Use AWS4Auth to sign a requests session
session = requests.Session()
session.auth = AWS4Auth(
    # An AWS 'ACCESS KEY' associated with an IAM user.
    'AKxxxxxxxxxxxxxxx2A',
    # The 'secret' that goes with the above access key.
    'kwWxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxgEm',
    # The region you want to access.
    'ap-southeast-2',
    # The service you want to access.
    'appsync'
)

# As found in AWS AppSync under Settings for your endpoint.
APPSYNC_API_ENDPOINT_URL = ('https://nqxxxxxxxxxxxxxxxxxxxke'
                            '.appsync-api.ap-southeast-2.amazonaws.com/graphql')

# Use a JSON format string for the query. It does not need reformatting.
query = """
    query foo {
        GetUserSettings (
            identity_id: "ap-southeast-2:8xxxxxxb-7xx4-4xx4-8xx0-exxxxxxx2"
        ){
            user_name, email, whatever
        }}"""

# Now we can simply post the request...
response = session.request(
    url=APPSYNC_API_ENDPOINT_URL,
    method='POST',
    json={'query': query}
)
print(response.text)
print(response.text)
Which yields
# Your answer comes as a JSON formatted string in the text attribute, under data.
{"data":{"GetUserSettings":{"user_name":"0xxxxxxx3-9102-42f0-9874-1xxxxx7dxxx5"}}}
Getting credentials
To get rid of the hardcoded key/secret, you can consume the local AWS ~/.aws/config and ~/.aws/credentials; it is done this way...

import boto3

# Use AWS4Auth to sign a requests session
session = requests.Session()
credentials = boto3.session.Session().get_credentials()
session.auth = AWS4Auth(
    credentials.access_key,
    credentials.secret_key,
    boto3.session.Session().region_name,
    'appsync',
    session_token=credentials.token
)
...<as above>
This does seem to respect the environment variable AWS_PROFILE for assuming different roles.
Note that STS.get_session_token is not the way to do it, as it may try to assume a role from a role, depending on where it keyword-matched the AWS_PROFILE value. Labels in the credentials file will work, because the keys are right there, but names found in the config file do not work, as those assume a role already.
OpenID
In this scenario, all the complexity is transferred to the conversation with the openid connect provider. The hard stuff is all the auth hoops you jump through to get an access token, and thence using the refresh token to keep it alive. That is where all the real work lies.
Once you finally have an access token, assuming you have configured the "OpenID Connect" Authorization Mode in appsync, then you can, very simply, drop the access token into the header:
response = requests.post(
    url="https://nc3xxxxxxxxxx123456zwjka.appsync-api.ap-southeast-2.amazonaws.com/graphql",
    headers={"Authorization": ACCESS_TOKEN},
    json={'query': "query foo{GetStuff{cat, dog, tree}}"}
)
You can set up an API key on the AppSync end and use the code below. This works for my case.
import requests

# establish a session with requests session
session = requests.Session()

# As found in AWS AppSync under Settings for your endpoint.
APPSYNC_API_ENDPOINT_URL = 'https://vxxxxxxxxxxxxxxxxxxy.appsync-api.ap-southeast-2.amazonaws.com/graphql'

# set up the query string (optional)
query = """query listItemsQuery {listItemsQuery {items {correlation_id, id, etc}}}"""

# Now we can simply post the request...
response = session.request(
    url=APPSYNC_API_ENDPOINT_URL,
    method='POST',
    headers={'x-api-key': '<APIKEYFOUNDINAPPSYNCSETTINGS>'},
    json={'query': query}
)
print(response.json()['data'])
Building off Joseph Warda's answer you can use the class below to send AppSync commands.
# fileName: AppSyncLibrary
import requests

class AppSync():
    def __init__(self, data):
        endpoint = data["endpoint"]
        self.APPSYNC_API_ENDPOINT_URL = endpoint
        self.api_key = data["api_key"]
        self.session = requests.Session()

    def graphql_operation(self, query, input_params):
        response = self.session.request(
            url=self.APPSYNC_API_ENDPOINT_URL,
            method='POST',
            headers={'x-api-key': self.api_key},
            json={'query': query, 'variables': {"input": input_params}}
        )
        return response.json()
For example in another file within the same directory:
from AppSyncLibrary import AppSync

APPSYNC_API_ENDPOINT_URL = {YOUR_APPSYNC_API_ENDPOINT}
APPSYNC_API_KEY = {YOUR_API_KEY}

init_params = {"endpoint": APPSYNC_API_ENDPOINT_URL, "api_key": APPSYNC_API_KEY}
app_sync = AppSync(init_params)

mutation = """mutation CreatePost($input: CreatePostInput!) {
    createPost(input: $input) {
        id
        content
    }
}
"""

input_params = {
    "content": "My first post"
}

response = app_sync.graphql_operation(mutation, input_params)
print(response)
Note: This requires you to activate API access for your AppSync API. Check this AWS post for more details.
graphql-python/gql supports AWS AppSync since version 3.0.0rc0.
It supports queries, mutations and even subscriptions on the realtime endpoint.
The documentation is available here
Here is an example of a mutation using the API Key authentication:
import asyncio
import os
import sys
from urllib.parse import urlparse

from gql import Client, gql
from gql.transport.aiohttp import AIOHTTPTransport
from gql.transport.appsync_auth import AppSyncApiKeyAuthentication

# Uncomment the following lines to enable debug output
# import logging
# logging.basicConfig(level=logging.DEBUG)

async def main():
    # Should look like:
    # https://XXXXXXXXXXXXXXXXXXXXXXXXXX.appsync-api.REGION.amazonaws.com/graphql
    url = os.environ.get("AWS_GRAPHQL_API_ENDPOINT")
    api_key = os.environ.get("AWS_GRAPHQL_API_KEY")

    if url is None or api_key is None:
        print("Missing environment variables")
        sys.exit()

    # Extract host from url
    host = str(urlparse(url).netloc)

    auth = AppSyncApiKeyAuthentication(host=host, api_key=api_key)
    transport = AIOHTTPTransport(url=url, auth=auth)

    async with Client(
        transport=transport, fetch_schema_from_transport=False,
    ) as session:
        query = gql(
            """
            mutation createMessage($message: String!) {
                createMessage(input: {message: $message}) {
                    id
                    message
                    createdAt
                }
            }"""
        )
        variable_values = {"message": "Hello world!"}

        result = await session.execute(query, variable_values=variable_values)
        print(result)

asyncio.run(main())
I am unable to add a comment due to low rep, but I just want to add that I tried the accepted answer and it didn't work. I was getting an error saying my session_token is invalid, probably because I was using AWS Lambda.
I got it to work pretty much exactly the same way, but by adding the session token parameter to the AWS4Auth object. Here's the full piece:
import os
import requests
from requests_aws4auth import AWS4Auth

def AppsyncHandler(event, context):
    # These are env vars that are always present in an AWS Lambda function.
    # If not using AWS Lambda, you'll need to add them manually to your env.
    access_id = os.environ.get("AWS_ACCESS_KEY_ID")
    secret_key = os.environ.get("AWS_SECRET_ACCESS_KEY")
    session_token = os.environ.get("AWS_SESSION_TOKEN")
    region = os.environ.get("AWS_REGION")

    # Your AppSync endpoint
    api_endpoint = os.environ.get("AppsyncConnectionString")
    resource = "appsync"

    session = requests.Session()
    session.auth = AWS4Auth(access_id,
                            secret_key,
                            region,
                            resource,
                            session_token=session_token)
The rest is the same.
Hope this helps everyone:
import requests
import json
import os
from dotenv import load_dotenv

load_dotenv(".env")

class AppSync(object):
    def __init__(self, data):
        endpoint = data["endpoint"]
        self.APPSYNC_API_ENDPOINT_URL = endpoint
        self.api_key = data["api_key"]
        self.session = requests.Session()

    def graphql_operation(self, query, input_params):
        response = self.session.request(
            url=self.APPSYNC_API_ENDPOINT_URL,
            method='POST',
            headers={'x-api-key': self.api_key},
            json={'query': query, 'variables': {"input": input_params}}
        )
        return response.json()

def main():
    APPSYNC_API_ENDPOINT_URL = os.getenv("APPSYNC_API_ENDPOINT_URL")
    APPSYNC_API_KEY = os.getenv("APPSYNC_API_KEY")
    init_params = {"endpoint": APPSYNC_API_ENDPOINT_URL, "api_key": APPSYNC_API_KEY}
    app_sync = AppSync(init_params)

    # Note: this operation is a query, so it is named accordingly here.
    query = """
    query MyQuery {
        getAccountId(id: "5ca4bbc7a2dd94ee58162393") {
            _id
            account_id
            limit
            products
        }
    }
    """
    input_params = {}
    response = app_sync.graphql_operation(query, input_params)
    print(json.dumps(response, indent=3))

main()

Connect to service bus using python SAS token

I need to connect to Azure Service Bus using a SAS token (generate and connect).
I don't see anything for the Python implementation.
This link provides the implementation for Event Hubs:
https://learn.microsoft.com/en-us/rest/api/eventhub/generate-sas-token#python
Not sure where I can find the Python implementation for Service Bus.
I have found a way to do it with the ServiceBusService class.
After running pip install azure.servicebus, I imported it as:
from azure.servicebus.control_client import ServiceBusService
The ServiceBusService constructor takes an argument called "authentication", which isn't specified by default.
If you go into the ServiceBusService __init__ file, you can see how authentication is handled in more detail:
if authentication:
    self.authentication = authentication
else:
    if not account_key:
        account_key = os.environ.get(AZURE_SERVICEBUS_ACCESS_KEY)
    if not issuer:
        issuer = os.environ.get(AZURE_SERVICEBUS_ISSUER)

    if shared_access_key_name and shared_access_key_value:
        self.authentication = ServiceBusSASAuthentication(
            shared_access_key_name,
            shared_access_key_value)
    elif account_key and issuer:
        self.authentication = ServiceBusWrapTokenAuthentication(
            account_key,
            issuer)
If you don't pass a custom authentication object, it will then try to use the ServiceBusSASAuthentication class, which is the default if you populate the shared_access_key_name and shared_access_key_value.
So if you jump into the ServiceBusSASAuthentication class, you'll notice something useful.
class ServiceBusSASAuthentication:
    def __init__(self, key_name, key_value):
        self.key_name = key_name
        self.key_value = key_value
        self.account_key = None
        self.issuer = None

    def sign_request(self, request, httpclient):
        request.headers.append(
            ('Authorization', self._get_authorization(request, httpclient)))

    def _get_authorization(self, request, httpclient):
        uri = httpclient.get_uri(request)
        uri = url_quote(uri, '').lower()
        expiry = str(self._get_expiry())
        to_sign = uri + '\n' + expiry
        signature = url_quote(_sign_string(self.key_value, to_sign, False), '')

        auth_format = 'SharedAccessSignature sig={0}&se={1}&skn={2}&sr={3}'  # <----awww, yeah
        auth = auth_format.format(signature, expiry, self.key_name, uri)

        return auth  # <-- after inserting values into the string, the SAS token is just returned.

    def _get_expiry(self):  # pylint: disable=no-self-use
        '''Returns the UTC datetime, in seconds since Epoch, when this signed
        request expires (5 minutes from now).'''
        return int(round(time.time() + 300))
The sign_request function is the only function directly referenced by the ServiceBusService class when it does authentication, but you'll notice that all it's doing is adding an authentication header to a request that... IS IN THE FORMAT OF A SAS TOKEN.
So at this point I had all the information I needed to make my own authentication class. I made one that looks exactly like this:
class ServiceBusSASTokenAuthentication:
    def __init__(self, sas_token):
        self.sas_token = sas_token

    # This method is the one used by ServiceBusService for authentication; we need to
    # leave the signature as is, even though we don't use httpclient like the original.
    def sign_request(self, request, httpclient):
        request.headers.append(
            ('Authorization', self._get_authorization())
        )

    def _get_authorization(self):
        return self.sas_token
I could probably get rid of the _get_authorization function altogether, but I haven't polished everything up yet.
So now if you pass this class to the ServiceBusService constructor with a valid SAS token, it should work:
subscription_client = ServiceBusService(
    authentication=ServiceBusSASTokenAuthentication(sas_token=sas_token),
    service_namespace=service_namespace
)
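As for generating the token itself, the Event Hubs snippet linked in the question applies to Service Bus as well, since both use the same SAS scheme; a hedged sketch adapted from that doc:

import time
import base64
import hashlib
import hmac
import urllib.parse

def generate_sas_token(uri, sas_name, sas_value, ttl_seconds=3600):
    # Adapted from the Event Hubs doc linked in the question; `uri` is e.g.
    # https://<namespace>.servicebus.windows.net/<entity> (placeholder).
    encoded_uri = urllib.parse.quote_plus(uri)
    expiry = str(int(time.time() + ttl_seconds))
    to_sign = (encoded_uri + '\n' + expiry).encode('utf-8')
    signed = hmac.new(sas_value.encode('utf-8'), to_sign, hashlib.sha256)
    signature = urllib.parse.quote(base64.b64encode(signed.digest()))
    return 'SharedAccessSignature sr={}&sig={}&se={}&skn={}'.format(
        encoded_uri, signature, expiry, sas_name)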
Once you create the Service Bus in the Azure portal, the ServiceBusService object lets you work with queues.
Follow this document for more information on creating a queue, sending messages to a queue, and receiving messages from a queue, using Python to programmatically access Service Bus.
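For illustration, a short hedged sketch of those queue operations, using the legacy control_client API and the subscription_client constructed above ('taskqueue' is a placeholder queue name):

from azure.servicebus.control_client import Message

# Hedged sketch: basic queue round-trip with the legacy control_client API.
subscription_client.create_queue('taskqueue')
subscription_client.send_queue_message('taskqueue', Message(b'Hello World!'))
msg = subscription_client.receive_queue_message('taskqueue', peek_lock=False)
print(msg.body)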
