How do I get id_token to properly load in Cloud Run? - python-3.x

I have a Django app that I have been working on. When I run it locally it runs perfectly. When I run it in a container using Cloud Run I get the following error:
'Credentials' object has no attribute 'id_token'
Here is the offending code (payload is a dictionary object):
import google.auth
import requests
from google.auth.transport.requests import AuthorizedSession, Request

def ProcessPayload(payload):
    # Get authorized session credentials
    credentials, _ = google.auth.default()
    session = AuthorizedSession(credentials)
    credentials.refresh(Request(session))
    # Process post request
    headers = {'Authorization': f'Bearer {credentials.id_token}'}
    response = requests.post(URL, json=payload, headers=headers)
In my local environment, the refresh properly loads the credentials with the id_token needed for the header, but for some reason when the code is deployed to Cloud Run this does not work. I have the Cloud Run instance set to use a service account, so it should be able to get credentials from it. How do I make this work? I have googled until my fingers hurt and have found no viable solutions.

When executing code under a Compute Service (Compute Engine, Cloud Run, Cloud Functions), call the metadata service to obtain an OIDC Identity Token.
import requests

METADATA_HEADERS = {'Metadata-Flavor': 'Google'}
METADATA_URL = 'http://metadata.google.internal/computeMetadata/v1/' \
               'instance/service-accounts/default/identity?' \
               'audience={}'

def fetch_identity_token(audience):
    # Construct a URL with the audience and format.
    url = METADATA_URL.format(audience)
    # Request a token from the metadata server.
    r = requests.get(url, headers=METADATA_HEADERS)
    r.raise_for_status()
    return r.text

def ProcessPayload(payload):
    id_token = fetch_identity_token('replace_with_service_url')
    # Process post request
    headers = {'Authorization': f'Bearer {id_token}'}
    response = requests.post(URL, json=payload, headers=headers)
The equivalent curl command to fetch an Identity Token looks like this. You can test from a Compute Engine instance:
curl -H "metadata-flavor: Google" \
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/identity?audience=URL
where URL is the URL of the service you are calling.
Authentication service-to-service
I have seen this metadata URL shortcut (for Cloud Run), but I have not verified it:
http://metadata/instance/service-accounts/default/identity?audience=URL

So, after much playing around I found a solution that works in both places. Many thanks to Paul Bonser for coming up with this simple method!
import google.auth
from google.auth.transport.requests import AuthorizedSession, Request
from google.oauth2.id_token import fetch_id_token
import requests

def GetIdToken(audience):
    credentials, _ = google.auth.default()
    session = AuthorizedSession(credentials)
    request = Request(session)
    credentials.refresh(request)
    if hasattr(credentials, "id_token"):
        return credentials.id_token
    return fetch_id_token(request, audience)

def ProcessPayload(url, payload):
    # Get the ID Token
    id_token = GetIdToken(url)
    # Process post request
    headers = {'Authorization': f'Bearer {id_token}'}
    response = requests.post(url, json=payload, headers=headers)
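A quick usage sketch (the Cloud Run service URL and payload below are hypothetical placeholders):
# Hypothetical Cloud Run service URL, used both as the audience and the target.
service_url = 'https://my-service-abc123-uc.a.run.app/process'
ProcessPayload(service_url, {'job': 'nightly-sync'})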

Related

Add (AWS Signature) Authorization to python requests

I am trying to make a GET request to an endpoint which uses AWS authorization. I made the request using Postman and it works, but when I try the following method in Python, it gives an error.
CODE
import requests
from datetime import datetime

url = 'XXX'
payload = {}
amc_api_servicename = 'sts'
t = datetime.utcnow()
headers = {
    'X-Amz-Date': t.strftime('%Y%m%dT%H%M%SZ'),
    'Authorization': 'AWS4-HMAC-SHA256 Credential={}/{}/{}/{}/aws4_request,SignedHeaders=host;x-amz-date,Signature=3ab1067335503c5b1792b811eeb84998f3902e5fde925ec8678e0ff99373d08b'.format(amc_api_accesskey, current_date, amc_api_region, amc_api_servicename)
}
print(url, headers)
response = requests.request("GET", url, headers=headers, data=payload)
ERROR
The request signature we calculated does not match the signature you provided. Check your AWS Secret Access Key and signing method.
Please point me in the right direction.
import boto3

client = boto3.client('sts')
# 'your-iam-role-arn' is a placeholder for the role you want to assume.
response = client.assume_role(RoleArn='your-iam-role-arn', RoleSessionName='PostmanNNN')

Make requests to Google API with Python

I'm trying to make requests to the Google API to create source repositories using a service account and its JSON key file.
Since there are no client libraries for this product, I am making the REST calls with Python, following this documentation:
https://cloud.google.com/source-repositories/docs/reference/rest
I already used similar code to invoke my Cloud Functions successfully, but this time the requests are blocked with a 401 error. I set up GOOGLE_APPLICATION_CREDENTIALS with the JSON key of my service account and gave the service account the Source Repository Administrator role, but it still returns 401.
Here's my code
import urllib.request
import json
import urllib
import google.auth.transport.requests
import google.oauth2.id_token

body = {"name": "projects/$my_project_name/repos/$name_repo"}
jsondata = json.dumps(body).encode("utf8")

req = urllib.request.Request('https://sourcerepo.googleapis.com/v1/projects/$my_project_name/repos')
req.add_header('Content-Type', 'application/json; charset=utf-8')

auth_req = google.auth.transport.requests.Request()
id_token = google.oauth2.id_token.fetch_id_token(auth_req, 'https://www.googleapis.com/auth/cloud-platform')
req.add_header("Authorization", f"Bearer {id_token}")

response = urllib.request.urlopen(req, jsondata)
print(response.read().decode())
I also tried appending an API key to the end of the URL, like this:
req = urllib.request.Request('https://sourcerepo.googleapis.com/v1/projects/$my_project_name/repos?key=$my-api-key')
Thank you
I also tried appending an API key to the end of the URL
API keys are not supported by this API.
Your code is using an OIDC identity token instead of an OAuth access token.
import google.auth.transport.requests
from google.oauth2 import service_account

credentials = service_account.Credentials.from_service_account_file(
    '/path/to/key.json',
    scopes=['https://www.googleapis.com/auth/cloud-platform'])

request = google.auth.transport.requests.Request()
credentials.refresh(request)

# Use the following code to add the access token:
req.add_header("Authorization", f"Bearer {credentials.token}")

How to send a GraphQL query to AppSync from python?

How do we post a GraphQL request through AWS AppSync using boto?
Ultimately I'm trying to mimic a mobile app accessing our stackless/cloudformation stack on AWS, but with python. Not javascript or amplify.
The primary pain point is authentication; I've tried a dozen different ways already. This is the current one, which generates a 401 response with "UnauthorizedException" and "Permission denied", which is actually pretty good considering some of the other messages I've had. I'm now using the aws_requests_auth library to do the signing part. I assume it authenticates me using the stored ~/.aws/credentials from my local environment, or does it?
I'm a little confused as to where and how cognito identities and pools will come into it. eg: say I wanted to mimic the sign-up sequence?
Anyways the code looks pretty straightforward; I just don't grok the authentication.
import requests
from aws_requests_auth.boto_utils import BotoAWSRequestsAuth

APPSYNC_API_KEY = 'inAppsyncSettings'
APPSYNC_API_ENDPOINT_URL = 'https://aaaaaaaaaaaavzbke.appsync-api.ap-southeast-2.amazonaws.com/graphql'

headers = {
    'Content-Type': "application/graphql",
    'x-api-key': APPSYNC_API_KEY,
    'cache-control': "no-cache",
}

query = """{
    GetUserSettingsByEmail(email: "john#washere"){
        items {name, identity_id, invite_code}
    }
}"""

def test_stuff():
    # Use the library to generate auth headers.
    auth = BotoAWSRequestsAuth(
        aws_host='aaaaaaaaaaaavzbke.appsync-api.ap-southeast-2.amazonaws.com',
        aws_region='ap-southeast-2',
        aws_service='appsync')
    # Create an http graphql request.
    response = requests.post(
        APPSYNC_API_ENDPOINT_URL,
        json={'query': query},
        auth=auth,
        headers=headers)
    print(response)
    # this didn't work:
    # response = requests.post(APPSYNC_API_ENDPOINT_URL, data=json.dumps({'query': query}), auth=auth, headers=headers)
Yields
{
  "errors" : [ {
    "errorType" : "UnauthorizedException",
    "message" : "Permission denied"
  } ]
}
It's quite simple--once you know. There are some things I didn't appreciate:
I've assumed IAM authentication (OpenID appended way below)
There are a number of ways for appsync to handle authentication. We're using IAM so that's what I need to deal with, yours might be different.
Boto doesn't come into it.
We want to issue a request like any regular punter; they don't use boto, and neither do we. Trawling the AWS boto docs was a waste of time.
Use the AWS4Auth library
We are going to send a regular http request to aws, so whilst we can use python requests they need to be authenticated--by attaching headers.
And, of course, AWS auth headers are special and different from all others.
You can try to work out how to do it yourself, or you can go looking for someone else who has already done it: aws_requests_auth, the one I started with, probably works just fine, but I have ended up with AWS4Auth. There are many others of dubious value; none endorsed or provided by Amazon (that I could find).
Specify appsync as the "service"
What service are we calling? I didn't find any examples of anyone doing this anywhere; all the examples are trivial S3 or EC2 or even EB, which left uncertainty. Should we be talking to the api-gateway service? What's more, you feed this detail into the AWS4Auth routine as authentication data. Obviously, in hindsight, the request is hitting AppSync, so it will be authenticated by AppSync, so specify "appsync" as the service when putting together the auth headers.
It comes together as:
import requests
from requests_aws4auth import AWS4Auth

# Use AWS4Auth to sign a requests session
session = requests.Session()
session.auth = AWS4Auth(
    # An AWS 'ACCESS KEY' associated with an IAM user.
    'AKxxxxxxxxxxxxxxx2A',
    # The 'secret' that goes with the above access key.
    'kwWxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxgEm',
    # The region you want to access.
    'ap-southeast-2',
    # The service you want to access.
    'appsync'
)

# As found in AWS Appsync under Settings for your endpoint.
APPSYNC_API_ENDPOINT_URL = ('https://nqxxxxxxxxxxxxxxxxxxxke'
                            '.appsync-api.ap-southeast-2.amazonaws.com/graphql')

# Use JSON format string for the query. It does not need reformatting.
query = """
query foo {
    GetUserSettings (
        identity_id: "ap-southeast-2:8xxxxxxb-7xx4-4xx4-8xx0-exxxxxxx2"
    ){
        user_name, email, whatever
    }
}"""

# Now we can simply post the request...
response = session.request(
    url=APPSYNC_API_ENDPOINT_URL,
    method='POST',
    json={'query': query}
)
print(response.text)
Which yields
# Your answer comes as a JSON formatted string in the text attribute, under data.
{"data":{"GetUserSettings":{"user_name":"0xxxxxxx3-9102-42f0-9874-1xxxxx7dxxx5"}}}
Getting credentials
To get rid of the hardcoded key/secret you can consume the local AWS ~/.aws/config and ~/.aws/credentials, and it is done this way...
import boto3
import requests
from requests_aws4auth import AWS4Auth

# Use AWS4Auth to sign a requests session
session = requests.Session()
credentials = boto3.session.Session().get_credentials()
session.auth = AWS4Auth(
    credentials.access_key,
    credentials.secret_key,
    boto3.session.Session().region_name,
    'appsync',
    session_token=credentials.token
)
...<as above>
This does seem to respect the environment variable AWS_PROFILE for assuming different roles.
Note that STS.get_session_token is not the way to do it, as it may try to assume a role from a role, depending on where the AWS_PROFILE value matched. Profiles defined in the credentials file will work because the keys are right there, but profile names found only in the config file do not, as those assume a role already.
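As a sketch, if you want to pick a profile explicitly instead of relying on AWS_PROFILE, boto3 lets you name it when building the session (the profile name below is a placeholder):
import boto3
import requests
from requests_aws4auth import AWS4Auth

# 'my-profile' is a hypothetical entry in ~/.aws/credentials.
boto_session = boto3.session.Session(profile_name='my-profile')
credentials = boto_session.get_credentials()

session = requests.Session()
session.auth = AWS4Auth(
    credentials.access_key,
    credentials.secret_key,
    boto_session.region_name,
    'appsync',
    session_token=credentials.token
)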
OpenID
In this scenario, all the complexity is transferred to the conversation with the openid connect provider. The hard stuff is all the auth hoops you jump through to get an access token, and thence using the refresh token to keep it alive. That is where all the real work lies.
Once you finally have an access token, assuming you have configured the "OpenID Connect" Authorization Mode in appsync, then you can, very simply, drop the access token into the header:
response = requests.post(
    url="https://nc3xxxxxxxxxx123456zwjka.appsync-api.ap-southeast-2.amazonaws.com/graphql",
    headers={"Authorization": ACCESS_TOKEN},
    json={'query': "query foo{GetStuff{cat, dog, tree}}"}
)
You can set up an API key on the AppSync end and use the code below. This works for my case.
import requests

# establish a session with requests session
session = requests.Session()

# As found in AWS Appsync under Settings for your endpoint.
APPSYNC_API_ENDPOINT_URL = 'https://vxxxxxxxxxxxxxxxxxxy.appsync-api.ap-southeast-2.amazonaws.com/graphql'

# setup the query string (optional)
query = """query listItemsQuery {listItemsQuery {items {correlation_id, id, etc}}}"""

# Now we can simply post the request...
response = session.request(
    url=APPSYNC_API_ENDPOINT_URL,
    method='POST',
    headers={'x-api-key': '<APIKEYFOUNDINAPPSYNCSETTINGS>'},
    json={'query': query}
)
print(response.json()['data'])
Building off Joseph Warda's answer you can use the class below to send AppSync commands.
# fileName: AppSyncLibrary
import requests

class AppSync():
    def __init__(self, data):
        endpoint = data["endpoint"]
        self.APPSYNC_API_ENDPOINT_URL = endpoint
        self.api_key = data["api_key"]
        self.session = requests.Session()

    def graphql_operation(self, query, input_params):
        response = self.session.request(
            url=self.APPSYNC_API_ENDPOINT_URL,
            method='POST',
            headers={'x-api-key': self.api_key},
            json={'query': query, 'variables': {"input": input_params}}
        )
        return response.json()
For example in another file within the same directory:
from AppSyncLibrary import AppSync
APPSYNC_API_ENDPOINT_URL = {YOUR_APPSYNC_API_ENDPOINT}
APPSYNC_API_KEY = {YOUR_API_KEY}
init_params = {"endpoint":APPSYNC_API_ENDPOINT_URL,"api_key":APPSYNC_API_KEY}
app_sync = AppSync(init_params)
mutation = """mutation CreatePost($input: CreatePostInput!) {
createPost(input: $input) {
id
content
}
}
"""
input_params = {
"content":"My first post"
}
response = app_sync.graphql_operation(mutation,input_params)
print(response)
Note: This requires you to activate API access for your AppSync API. Check this AWS post for more details.
graphql-python/gql supports AWS AppSync since version 3.0.0rc0.
It supports queries, mutations and even subscriptions on the realtime endpoint.
The documentation is available here
Here is an example of a mutation using the API Key authentication:
import asyncio
import os
import sys
from urllib.parse import urlparse

from gql import Client, gql
from gql.transport.aiohttp import AIOHTTPTransport
from gql.transport.appsync_auth import AppSyncApiKeyAuthentication

# Uncomment the following lines to enable debug output
# import logging
# logging.basicConfig(level=logging.DEBUG)

async def main():
    # Should look like:
    # https://XXXXXXXXXXXXXXXXXXXXXXXXXX.appsync-api.REGION.amazonaws.com/graphql
    url = os.environ.get("AWS_GRAPHQL_API_ENDPOINT")
    api_key = os.environ.get("AWS_GRAPHQL_API_KEY")

    if url is None or api_key is None:
        print("Missing environment variables")
        sys.exit()

    # Extract host from url
    host = str(urlparse(url).netloc)

    auth = AppSyncApiKeyAuthentication(host=host, api_key=api_key)
    transport = AIOHTTPTransport(url=url, auth=auth)

    async with Client(
        transport=transport, fetch_schema_from_transport=False,
    ) as session:
        query = gql(
            """
            mutation createMessage($message: String!) {
                createMessage(input: {message: $message}) {
                    id
                    message
                    createdAt
                }
            }"""
        )
        variable_values = {"message": "Hello world!"}

        result = await session.execute(query, variable_values=variable_values)
        print(result)

asyncio.run(main())
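Subscriptions use the realtime (websocket) endpoint via AppSyncWebsocketsTransport. A minimal sketch with the same API key auth, assuming an onCreateMessage subscription exists in your schema and that gql derives the realtime endpoint from the regular GraphQL URL, as its documentation describes:
from urllib.parse import urlparse

from gql import Client, gql
from gql.transport.appsync_auth import AppSyncApiKeyAuthentication
from gql.transport.appsync_websockets import AppSyncWebsocketsTransport

async def listen(url, api_key):
    host = str(urlparse(url).netloc)
    auth = AppSyncApiKeyAuthentication(host=host, api_key=api_key)
    # url is the regular GraphQL endpoint; the realtime endpoint is derived from it.
    transport = AppSyncWebsocketsTransport(url=url, auth=auth)

    async with Client(transport=transport) as session:
        subscription = gql(
            """
            subscription onCreateMessage {
                onCreateMessage {
                    id
                    message
                }
            }"""
        )
        # Yields one result per received event until cancelled.
        async for result in session.subscribe(subscription):
            print(result)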
I am unable to add a comment due to low rep, but I just want to add that I tried the accepted answer and it didn't work. I was getting an error saying my session_token is invalid. Probably because I was using AWS Lambda.
I got it to work in much the same way, but by adding the session token parameter to the AWS4Auth object. Here's the full piece:
import requests
import os
from requests_aws4auth import AWS4Auth

def AppsyncHandler(event, context):
    # These are env vars that are always present in an AWS Lambda function.
    # If not using AWS Lambda, you'll need to add them manually to your env.
    access_id = os.environ.get("AWS_ACCESS_KEY_ID")
    secret_key = os.environ.get("AWS_SECRET_ACCESS_KEY")
    session_token = os.environ.get("AWS_SESSION_TOKEN")
    region = os.environ.get("AWS_REGION")

    # Your AppSync Endpoint
    api_endpoint = os.environ.get("AppsyncConnectionString")
    resource = "appsync"

    session = requests.Session()
    session.auth = AWS4Auth(access_id,
                            secret_key,
                            region,
                            resource,
                            session_token=session_token)
The rest is the same.
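For reference, "the rest" is the same POST used in the accepted answer above; a sketch of the remainder of the handler body (the query string is a placeholder):
    # Continuing inside AppsyncHandler, after session.auth has been set:
    query = """query foo { GetStuff { cat, dog, tree } }"""

    response = session.request(
        url=api_endpoint,
        method='POST',
        json={'query': query}
    )
    return response.json()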
Hope this helps everyone.
import requests
import json
import os
from dotenv import load_dotenv

load_dotenv(".env")

class AppSync(object):
    def __init__(self, data):
        endpoint = data["endpoint"]
        self.APPSYNC_API_ENDPOINT_URL = endpoint
        self.api_key = data["api_key"]
        self.session = requests.Session()

    def graphql_operation(self, query, input_params):
        response = self.session.request(
            url=self.APPSYNC_API_ENDPOINT_URL,
            method='POST',
            headers={'x-api-key': self.api_key},
            json={'query': query, 'variables': {"input": input_params}}
        )
        return response.json()

def main():
    APPSYNC_API_ENDPOINT_URL = os.getenv("APPSYNC_API_ENDPOINT_URL")
    APPSYNC_API_KEY = os.getenv("APPSYNC_API_KEY")
    init_params = {"endpoint": APPSYNC_API_ENDPOINT_URL, "api_key": APPSYNC_API_KEY}
    app_sync = AppSync(init_params)

    mutation = """
    query MyQuery {
        getAccountId(id: "5ca4bbc7a2dd94ee58162393") {
            _id
            account_id
            limit
            products
        }
    }
    """
    input_params = {}

    response = app_sync.graphql_operation(mutation, input_params)
    print(json.dumps(response, indent=3))

main()

Error calling CF API login one time passcode

I am working with the CF API RESTful services, trying to get an access token from Cloud Foundry's UAA API using the https://login..../oauth/token web method.
I have verified that the headers and body content are correct, but calling the API always returns a 400 error code with the message "missing grant type".
I have implemented this call in Objective-C, Swift & now Python. All tests return the same result. Here is my code example in Python:
import json
import requests
import urllib
params = {"grant_type": "password",
"passcode": "xxx"
}
url = "https://login.system.aws-usw02-pr.ice.predix.io/oauth/token"
headers = {"Authorization": "Basic Y2Y6", "Content-Type": "application/json", "Accept": "application/x-www-form-urlencoded"}
encodeParams = urllib.parse.urlencode(params)
response = requests.post(url, headers=headers, data=encodeParams)
rjson = response.json()
print(rjson)
Each time I run this, I get the response
error invalid request, Missing grant type
Any help would be greatly appreciated.
Your code mostly worked for me, although I used a different UAA server.
I had to make only one change. You had the Accept and Content-Type headers flipped around. Accept should be application/json because that's the format you want back, and Content-Type should be application/x-www-form-urlencoded because that's the format you are sending.
See the API Docs for reference.
import json
import requests
import urllib
import getpass
UAA_SERVER = "https://login.run.pivotal.io"
print("go to {}/passcode".format(UAA_SERVER))
params = {
    "grant_type": "password",
    "passcode": getpass.getpass(),
}
url = "https://login.run.pivotal.io/oauth/token"
headers = {
    "Authorization": "Basic Y2Y6",
    "Content-Type": "application/x-www-form-urlencoded",
    "Accept": "application/json"
}
encodeParams = urllib.parse.urlencode(params)
response = requests.post(url, headers=headers, data=encodeParams)
rjson = response.json()
print(json.dumps(rjson, indent=4, sort_keys=True))
I made a couple of other minor changes, but they shouldn't affect the functionality:
Use getpass.getpass() to load the passcode.
Set the target server as a variable.
Pretty print the JSON response.
The only other thing to note is that the OAuth2 client you use must be allowed to use the password grant type. It looks like you're using the same client that the cf CLI uses, so if your UAA server is part of a standard Cloud Foundry install that is likely to be true. If it still doesn't work for you, you may need to talk with an administrator and make sure the client is set up to allow this.

Add headers and payload in requests.put

I am trying to use requests in a Python3 script to update a deploy key in a gitlab project with write access. Unfortunately, I am receiving a 404 error when trying to connect via the requests module. The code is below:
import requests

project_url = str(url) + '/deploy_keys/' + str(DEPLOY_KEY_ID)
headers = {"PRIVATE-TOKEN": "REDACTED"}
payload = {"can_push": "true"}
r = requests.put(project_url, headers=headers, json=payload)
print(r)
Is there something that I am doing wrong in the syntax of my private token/headers?
I have gone through gitlab api and requests documentation. I have also confirmed that my Private Token is working outside of the script.
I am expecting it to update the deploy key with write access, but am receiving a 404 upon exit, making me think the issue is with the headers/auth.
This is resolved; it was actually not an auth problem. You must use the project ID instead of the URL, for example:
project_url = f'https://git.REDACTED.com/api/v4/projects/{project_id}/deploy_keys/{DEPLOY_KEY_ID}'
headers = {"PRIVATE-TOKEN": "REDACTED"}
payload = {'can_push': 'true'}
try:
    r = requests.put(project_url, headers=headers, data=payload)
    r.raise_for_status()
except requests.exceptions.RequestException as err:
    print(err)
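If you only know the project path, the numeric ID can be looked up first; a sketch assuming a hypothetical group/project path (GitLab also accepts the URL-encoded path in place of the numeric ID):
import urllib.parse
import requests

headers = {"PRIVATE-TOKEN": "REDACTED"}

# "my-group/my-project" is a placeholder; URL-encode it for use as the :id in the API path.
encoded_path = urllib.parse.quote_plus("my-group/my-project")
lookup = requests.get(
    f"https://git.REDACTED.com/api/v4/projects/{encoded_path}",
    headers=headers)
project_id = lookup.json()["id"]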
