"Invalid parameter: ARN account () must match authenticated user" - python-3.x

My Python version is 3.9 and I am writing an AWS Lambda function using boto3.
In addition to assigning full admin, S3, and DataSync access, I also created a trust relationship, but I still receive the following error.
I wonder if anyone has experienced the same issue and solved it.
"errorMessage": "An error occurred (InvalidRequestException) when
calling the CreateTask operation: Invalid parameter: ARN account ()
must match authenticated user.",
import json
import logging
import sys

import boto3
from botocore.exceptions import ClientError

logger = logging.getLogger(__name__)

client = boto3.client('datasync', region_name='us-east-1')

create_location_s3 = client.create_location_s3(
    Subdirectory='/',
    S3BucketArn='arn:aws:s3:::data-sync-bucket',
    S3StorageClass='STANDARD',
    S3Config={
        'BucketAccessRoleArn': 'arn:aws:iam::XXXX:role/datasync-data-sync-bucket-ARN'
    },
    AgentArns=[
        '',
    ],
    Tags=[
        {
            'Key': 'name',
            'Value': 'datasync-lambda'
        },
    ]
)

I think your error is more related to DataSync than to IAM or Lambda.
Please go through the links below, even though they may not directly solve it:
https://githubmemory.com/repo/hashicorp/terraform/issues/29593
https://issueexplorer.com/issue/hashicorp/terraform/29593
If you have an AWS support subscription, I would recommend contacting them, as they can help you better with this issue. Otherwise, you can go through the DataSync documentation provided by AWS: https://docs.aws.amazon.com/datasync/latest/userguide/sync-dg.pdf
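Not covered in those links, but since the error says the ARN account must match the authenticated user, one quick diagnostic (a sketch, not part of the original answer) is to log which account the Lambda's credentials actually resolve to and compare it with the account ID embedded in the role and agent ARNs passed to DataSync:

import boto3

# Diagnostic sketch: print the account/ARN the function authenticates as,
# to compare with the account IDs in the ARNs passed to DataSync.
sts = boto3.client('sts')
identity = sts.get_caller_identity()
print("Authenticated account:", identity['Account'])
print("Authenticated ARN:", identity['Arn'])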

Related

Boto3: Unable to list contents of AWS S3 through python

I am trying to connect to AWS S3 and list the buckets from my local machine through Python. I am using the following code:
import boto3
from boto3 import Session
from boto.s3.connection import S3Connection
from boto.sts import STSConnection
import pandas

session = boto3.Session(profile_name='mfa_0729')
credentials = session.get_credentials()
dev_s3_client = session.client('s3')
dev_s3_resource = session.resource('s3')

bucketname = 'my-bucket-name'
startAfter = 'my-claim-name'

obj1 = dev_s3_client.list_objects_v2(Bucket=bucketname, StartAfter=startAfter)
There is a session token that I have to use, which I saved in the profile along with the other credentials. When executing the last line, I get the error:
botocore.exceptions.ClientError: An error occurred (AccessDenied) when calling the ListObjectsV2 operation: Access Denied
Can anyone point out what I am doing wrong? I am new to AWS and boto3.

Create an event with conference using python and Google Calendar API creates the event but not the conference

I am trying to create an event using the Google Calendar API in Python 3. I also want to generate a Google Meet conference link for the event. I am using the documentation provided here:
https://developers.google.com/calendar/quickstart/python
https://developers.google.com/calendar/v3/reference/events#conferenceData
https://developers.google.com/calendar/create-events
The event is created without a problem. However, it is missing the conference link. My code so far is as follows:
from pathlib import Path
from pickle import load
from pickle import dump
from google.auth.transport.requests import Request
from google_auth_oauthlib.flow import InstalledAppFlow
from googleapiclient.discovery import build
from uuid import uuid4
from typing import Dict, List
from oauth2client import file, client, tools


class EventPlanner:
    def __init__(self, guests: Dict[str, str], schedule: Dict[str, str]):
        guests = [{"email": email} for email in guests.values()]
        service = self._authorize()
        self.event_states = self._plan_event(guests, schedule, service)

    @staticmethod
    def _authorize():
        scopes = ["https://www.googleapis.com/auth/calendar"]
        credentials = None
        token_file = Path("./calendar_creds/token.pickle")
        if token_file.exists():
            with open(token_file, "rb") as token:
                credentials = load(token)
        if not credentials or not credentials.valid:
            if credentials and credentials.expired and credentials.refresh_token:
                credentials.refresh(Request())
            else:
                flow = InstalledAppFlow.from_client_secrets_file('calendar_creds/credentials.json', scopes)
                credentials = flow.run_local_server(port=0)
            with open(token_file, "wb") as token:
                dump(credentials, token)
        calendar_service = build("calendar", "v3", credentials=credentials)
        return calendar_service

    @staticmethod
    def _plan_event(attendees: List[Dict[str, str]], event_time, service: build):
        event = {"summary": "test meeting",
                 "start": {"dateTime": event_time["start"]},
                 "end": {"dateTime": event_time["end"]},
                 "attendees": attendees,
                 "conferenceData": {"createRequest": {"requestId": f"{uuid4().hex}",
                                                      "conferenceSolutionKey": {"type": "hangoutsMeet"}}},
                 "reminders": {"useDefault": True}
                 }
        event = service.events().insert(calendarId="primary", sendNotifications=True, body=event,
                                        conferenceDataVersion=1).execute()
        return event


if __name__ == "__main__":
    plan = EventPlanner({"test_guest": "test.guest@gmail.com"}, {"start": "2020-07-31T16:00:00",
                                                                 "end": "2020-07-31T16:30:00"})
    print(plan.event_states)
I suspect the problem is with where I pass conferenceDataVersion, but the docs are not exactly clear about where it has to go, other than that it must be passed. I also tried putting it in the body of the event and in createRequest. The event is always created, but not the conference. Unfortunately, I could not find anything about this online. Maybe I'm just bad at searching, but I have been testing different things and looking for a solution for several days! If anyone knows what I am missing, I will truly appreciate the help.
Thanks to @Tanaike, I found what the problem was. The token generated the first time the API is authenticated is very specific, and the problem I was having turned out to be just that. As soon as I removed the token and had it regenerated, the problem was solved. That said, I have no idea why the problem appeared in the first place. I will update this answer if I find the reason behind it. But for now, if you are having the same problem, just remove the token and regenerate it.
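If you hit the same issue, "remove the token and regenerate it" just means deleting the cached pickle so the OAuth consent flow runs again on the next authorization; a minimal sketch using the path from the question's code:

from pathlib import Path

# Delete the cached token so _authorize() re-runs the OAuth consent flow
# and writes a fresh token.pickle the next time EventPlanner is created.
token_file = Path("./calendar_creds/token.pickle")
if token_file.exists():
    token_file.unlink()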
With this code we can create the conference (Google Meet link); it works for me:
"conferenceData": {
    "createRequest": {
        "requestId": "SecureRandom.uuid"
    }
}
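For reference, this fragment still has to be sent as part of the event body with conferenceDataVersion=1 on the insert call, as in the question's code; a minimal sketch (assuming service is an authorized Calendar v3 client and event is a body containing the fragment above):

# Sketch: insert the event with conferenceDataVersion=1 so the
# createRequest above is processed and a Meet link is generated.
created = service.events().insert(
    calendarId="primary",
    body=event,
    conferenceDataVersion=1,
).execute()
print(created.get("hangoutLink"))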

How to send a GraphQL query to AppSync from python?

How do we post a GraphQL request through AWS AppSync using boto?
Ultimately I'm trying to mimic a mobile app accessing our stackless/CloudFormation stack on AWS, but with Python, not JavaScript or Amplify.
The primary pain point is authentication; I've tried a dozen different ways already. This is the current one, which generates a "401" response with "UnauthorizedException" and "Permission denied", which is actually pretty good considering some of the other messages I've had. I'm now using the 'aws_requests_auth' library to do the signing part. I assume it authenticates me using the stored ~/.aws/credentials from my local environment, or does it?
I'm a little confused as to where and how Cognito identities and pools come into it, e.g. say I wanted to mimic the sign-up sequence?
Anyway, the code looks pretty straightforward; I just don't grok the authentication.
import requests
from aws_requests_auth.boto_utils import BotoAWSRequestsAuth

APPSYNC_API_KEY = 'inAppsyncSettings'
APPSYNC_API_ENDPOINT_URL = 'https://aaaaaaaaaaaavzbke.appsync-api.ap-southeast-2.amazonaws.com/graphql'

headers = {
    'Content-Type': "application/graphql",
    'x-api-key': APPSYNC_API_KEY,
    'cache-control': "no-cache",
}

query = """{
    GetUserSettingsByEmail(email: "john@washere"){
        items {name, identity_id, invite_code}
    }
}"""


def test_stuff():
    # Use the library to generate auth headers.
    auth = BotoAWSRequestsAuth(
        aws_host='aaaaaaaaaaaavzbke.appsync-api.ap-southeast-2.amazonaws.com',
        aws_region='ap-southeast-2',
        aws_service='appsync')
    # Create an http graphql request.
    response = requests.post(
        APPSYNC_API_ENDPOINT_URL,
        json={'query': query},
        auth=auth,
        headers=headers)
    print(response)
    # this didn't work:
    # response = requests.post(APPSYNC_API_ENDPOINT_URL, data=json.dumps({'query': query}), auth=auth, headers=headers)
Yields
{
    "errors" : [ {
        "errorType" : "UnauthorizedException",
        "message" : "Permission denied"
    } ]
}
It's quite simple, once you know. There are some things I didn't appreciate:
I've assumed IAM authentication (OpenID is covered at the end).
There are a number of ways for AppSync to handle authentication. We're using IAM, so that's what I need to deal with; yours might be different.
Boto doesn't come into it.
We want to issue a request like any regular punter; regular punters don't use boto, and neither do we. Trawling the AWS boto docs was a waste of time.
Use the AWS4Auth library
We are going to send a regular HTTP request to AWS, so whilst we can use Python requests, the requests need to be authenticated by attaching headers.
And, of course, AWS auth headers are special and different from all others.
You can try to work out how to do it yourself, or you can go looking for someone else who has already done it: aws_requests_auth, the one I started with, probably works just fine, but I ended up with AWS4Auth. There are many others of dubious value; none endorsed or provided by Amazon (that I could find).
Specify appsync as the "service"
What service are we calling? I didn't find any examples of anyone doing this anywhere. All the examples are trivial S3 or EC2 or even EB, which left some uncertainty: should we be talking to the api-gateway service? What's more, you feed this detail into the AWS4Auth routine as part of the authentication data. Obviously, in hindsight, the request is hitting AppSync, so it will be authenticated by AppSync, so specify "appsync" as the service when putting together the auth headers.
It comes together as:
import requests
from requests_aws4auth import AWS4Auth

# Use AWS4Auth to sign a requests session
session = requests.Session()
session.auth = AWS4Auth(
    # An AWS 'ACCESS KEY' associated with an IAM user.
    'AKxxxxxxxxxxxxxxx2A',
    # The 'secret' that goes with the above access key.
    'kwWxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxgEm',
    # The region you want to access.
    'ap-southeast-2',
    # The service you want to access.
    'appsync'
)

# As found in AWS Appsync under Settings for your endpoint.
APPSYNC_API_ENDPOINT_URL = ('https://nqxxxxxxxxxxxxxxxxxxxke'
                            '.appsync-api.ap-southeast-2.amazonaws.com/graphql')
# Use JSON format string for the query. It does not need reformatting.
query = """
query foo {
GetUserSettings (
identity_id: "ap-southeast-2:8xxxxxxb-7xx4-4xx4-8xx0-exxxxxxx2"
){
user_name, email, whatever
}}"""
# Now we can simply post the request...
response = session.request(
    url=APPSYNC_API_ENDPOINT_URL,
    method='POST',
    json={'query': query}
)
print(response.text)
Which yields
# Your answer comes as a JSON formatted string in the text attribute, under data.
{"data":{"GetUserSettings":{"user_name":"0xxxxxxx3-9102-42f0-9874-1xxxxx7dxxx5"}}}
Getting credentials
To get rid of the hardcoded key/secret you can consume the local AWS ~/.aws/config and ~/.aws/credentials, and it is done this way...
import boto3

# Use AWS4Auth to sign a requests session
session = requests.Session()
credentials = boto3.session.Session().get_credentials()
session.auth = AWS4Auth(
    credentials.access_key,
    credentials.secret_key,
    boto3.session.Session().region_name,
    'appsync',
    session_token=credentials.token
)
...<as above>
This does seem to respect the environment variable AWS_PROFILE for assuming different roles.
Note that STS.get_session_token is not the way to do it, as it may try to assume a role from a role, depending on where the AWS_PROFILE value was matched. Profiles in the credentials file work because the keys are right there, but names found only in the config file do not, as those already assume a role.
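Relatedly, if you prefer to pin the profile in code rather than rely on AWS_PROFILE, a variant of the snippet above (the profile name here is a placeholder):

import boto3
import requests
from requests_aws4auth import AWS4Auth

# Same idea as above, but selecting a named profile explicitly instead of
# relying on the AWS_PROFILE environment variable ("my-appsync-profile" is a placeholder).
boto_session = boto3.session.Session(profile_name='my-appsync-profile')
credentials = boto_session.get_credentials()

session = requests.Session()
session.auth = AWS4Auth(
    credentials.access_key,
    credentials.secret_key,
    boto_session.region_name,
    'appsync',
    session_token=credentials.token
)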
OpenID
In this scenario, all the complexity is transferred to the conversation with the OpenID Connect provider. The hard part is all the auth hoops you jump through to get an access token, and then using the refresh token to keep it alive. That is where all the real work lies.
Once you finally have an access token, and assuming you have configured the "OpenID Connect" authorization mode in AppSync, you can very simply drop the access token into the header:
response = requests.post(
    url="https://nc3xxxxxxxxxx123456zwjka.appsync-api.ap-southeast-2.amazonaws.com/graphql",
    headers={"Authorization": ACCESS_TOKEN},
    json={'query': "query foo{GetStuff{cat, dog, tree}}"}
)
You can set up an API key on the AppSync end and use the code below. This works for my case.
import requests

# establish a session with requests session
session = requests.Session()

# As found in AWS Appsync under Settings for your endpoint.
APPSYNC_API_ENDPOINT_URL = 'https://vxxxxxxxxxxxxxxxxxxy.appsync-api.ap-southeast-2.amazonaws.com/graphql'

# setup the query string (optional)
query = """query listItemsQuery {listItemsQuery {items {correlation_id, id, etc}}}"""

# Now we can simply post the request...
response = session.request(
    url=APPSYNC_API_ENDPOINT_URL,
    method='POST',
    headers={'x-api-key': '<APIKEYFOUNDINAPPSYNCSETTINGS>'},
    json={'query': query}
)
print(response.json()['data'])
Building off Joseph Warda's answer, you can use the class below to send AppSync commands.
# fileName: AppSyncLibrary
import requests


class AppSync():
    def __init__(self, data):
        endpoint = data["endpoint"]
        self.APPSYNC_API_ENDPOINT_URL = endpoint
        self.api_key = data["api_key"]
        self.session = requests.Session()

    def graphql_operation(self, query, input_params):
        response = self.session.request(
            url=self.APPSYNC_API_ENDPOINT_URL,
            method='POST',
            headers={'x-api-key': self.api_key},
            json={'query': query, 'variables': {"input": input_params}}
        )
        return response.json()
For example, in another file within the same directory:
from AppSyncLibrary import AppSync

APPSYNC_API_ENDPOINT_URL = {YOUR_APPSYNC_API_ENDPOINT}
APPSYNC_API_KEY = {YOUR_API_KEY}

init_params = {"endpoint": APPSYNC_API_ENDPOINT_URL, "api_key": APPSYNC_API_KEY}
app_sync = AppSync(init_params)

mutation = """mutation CreatePost($input: CreatePostInput!) {
    createPost(input: $input) {
        id
        content
    }
}
"""

input_params = {
    "content": "My first post"
}

response = app_sync.graphql_operation(mutation, input_params)
print(response)
Note: This requires you to activate API access for your AppSync API. Check this AWS post for more details.
graphql-python/gql supports AWS AppSync since version 3.0.0rc0.
It supports queries, mutations, and even subscriptions on the realtime endpoint.
The documentation is available here
Here is an example of a mutation using the API Key authentication:
import asyncio
import os
import sys
from urllib.parse import urlparse

from gql import Client, gql
from gql.transport.aiohttp import AIOHTTPTransport
from gql.transport.appsync_auth import AppSyncApiKeyAuthentication

# Uncomment the following lines to enable debug output
# import logging
# logging.basicConfig(level=logging.DEBUG)


async def main():
    # Should look like:
    # https://XXXXXXXXXXXXXXXXXXXXXXXXXX.appsync-api.REGION.amazonaws.com/graphql
    url = os.environ.get("AWS_GRAPHQL_API_ENDPOINT")
    api_key = os.environ.get("AWS_GRAPHQL_API_KEY")

    if url is None or api_key is None:
        print("Missing environment variables")
        sys.exit()

    # Extract host from url
    host = str(urlparse(url).netloc)

    auth = AppSyncApiKeyAuthentication(host=host, api_key=api_key)

    transport = AIOHTTPTransport(url=url, auth=auth)

    async with Client(
        transport=transport, fetch_schema_from_transport=False,
    ) as session:
        query = gql(
            """
mutation createMessage($message: String!) {
  createMessage(input: {message: $message}) {
    id
    message
    createdAt
  }
}"""
        )
        variable_values = {"message": "Hello world!"}

        result = await session.execute(query, variable_values=variable_values)
        print(result)


asyncio.run(main())
I am unable to add a comment due to low rep, but I just want to add that I tried the accepted answer and it didn't work. I was getting an error saying my session_token is invalid, probably because I was using AWS Lambda.
I got it to work pretty much exactly the same way, but by adding the session token parameter to the AWS4Auth object. Here's the full piece:
import requests
import os
from requests_aws4auth import AWS4Auth


def AppsyncHandler(event, context):
    # These are env vars that are always present in an AWS Lambda function.
    # If not using AWS Lambda, you'll need to add them manually to your env.
    access_id = os.environ.get("AWS_ACCESS_KEY_ID")
    secret_key = os.environ.get("AWS_SECRET_ACCESS_KEY")
    session_token = os.environ.get("AWS_SESSION_TOKEN")
    region = os.environ.get("AWS_REGION")

    # Your AppSync Endpoint
    api_endpoint = os.environ.get("AppsyncConnectionString")

    resource = "appsync"

    session = requests.Session()
    session.auth = AWS4Auth(access_id,
                            secret_key,
                            region,
                            resource,
                            session_token=session_token)
The rest is the same.
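For completeness, "the rest" means posting the query with that signed session, exactly as in the accepted answer; a minimal sketch of the remainder of the handler (the query text is a placeholder):

    # Continuing inside AppsyncHandler: post the query with the signed session,
    # as in the accepted answer (placeholder query).
    response = session.request(
        url=api_endpoint,
        method='POST',
        json={'query': 'query foo { GetStuff { cat, dog, tree } }'}
    )
    return response.json()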
Hope this helps everyone.
import requests
import json
import os
from dotenv import load_dotenv

load_dotenv(".env")


class AppSync(object):
    def __init__(self, data):
        endpoint = data["endpoint"]
        self.APPSYNC_API_ENDPOINT_URL = endpoint
        self.api_key = data["api_key"]
        self.session = requests.Session()

    def graphql_operation(self, query, input_params):
        response = self.session.request(
            url=self.APPSYNC_API_ENDPOINT_URL,
            method='POST',
            headers={'x-api-key': self.api_key},
            json={'query': query, 'variables': {"input": input_params}}
        )
        return response.json()


def main():
    APPSYNC_API_ENDPOINT_URL = os.getenv("APPSYNC_API_ENDPOINT_URL")
    APPSYNC_API_KEY = os.getenv("APPSYNC_API_KEY")
    init_params = {"endpoint": APPSYNC_API_ENDPOINT_URL, "api_key": APPSYNC_API_KEY}
    app_sync = AppSync(init_params)
    mutation = """
    query MyQuery {
        getAccountId(id: "5ca4bbc7a2dd94ee58162393") {
            _id
            account_id
            limit
            products
        }
    }
    """
    input_params = {}
    response = app_sync.graphql_operation(mutation, input_params)
    print(json.dumps(response, indent=3))


main()

How to interact with the Gmail API - Delegate User - Python 3.7

I am trying to write a Python application which simply adds a user as a delegate to another user's mailbox.
I am following the API documentation: Google API Documentation - Users.settings.delegates: create
However, I am struggling to work out how to set the parameters:
User - the account which is to be added as a delegate
Mailbox - the account which has the mailbox I wish that account to become a delegate of.
I have currently tried making an API call as the delegate user, but it does not seem to behave how I would expect. I am hoping Google will create a responsive API for the browser to support this; however, I am struggling with the code:
from googleapiclient import discovery
from oauth2client.service_account import ServiceAccountCredentials


def main(user_to_be_added, delegated_mailbox):
    service_account_credentials = ServiceAccountCredentials.from_json_keyfile_name('credentials/service_account.json')
    service_account_credentials = service_account_credentials.create_scoped('https://mail.google.com/ https://www.googleapis.com/auth/gmail.insert https://www.googleapis.com/auth/gmail.modify')
    service_account_credentials = service_account_credentials.create_delegated(user_to_be_added)
    service = discovery.build('gmail', 'v1', credentials=service_account_credentials)
    response = service.users().settings().delegates().create().execute(userId=delegated_mailbox)


if __name__ == '__main__':
    main('some_account_to_be_added@gmail.com', 'delegated_mailbox@gmail.com')
Am I interacting with this API completely wrong? If so, how has anyone else achieved this?
Thank you for your time.
Jordan
Working Solution:
from googleapiclient import discovery
from google.oauth2 import service_account


def _create_client(subject):
    credentials = service_account.Credentials
    credentials = credentials.from_service_account_file('credentials/service_account.json',
                                                         scopes=['https://www.googleapis.com/auth/gmail.settings.sharing',
                                                                 'https://www.googleapis.com/auth/gmail.settings.basic'],
                                                         subject=subject)
    service = discovery.build('gmail', 'v1', credentials=credentials)
    return service


def add_delegate_to_email(user_to_be_added, delegated_mailbox):
    service = _create_client(user_to_be_added)
    body = {
        "delegateEmail": delegated_mailbox,
        "verificationStatus": "accepted"
    }
    try:
        response = service.users().settings().delegates().create(userId='me', body=body).execute()
        print(response)
    except Exception as e:
        print('Exception: {}'.format(e))
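A minimal way to call this, mirroring the question's original main() call (the addresses are placeholders):

if __name__ == '__main__':
    # Placeholder addresses, mirroring the question's example call.
    add_delegate_to_email('some_account_to_be_added@gmail.com', 'delegated_mailbox@gmail.com')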
Main problem: from oauth2client.service_account import ServiceAccountCredentials is deprecated, as Google has replaced oauth2client with google-auth.

Invalid credentials when trying to connect to vTiger

I'm trying to log in via the webservice of vtiger CRM 5 with Python.
When putting my key and username in params, I just get INVALID_AUTH_TOKEN, but when putting them in the body, I get INVALID_USER_CREDENTIALS. Which seems better, but it's still not quite working!
{'success': False, 'error': {'code': 'INVALID_USER_CREDENTIALS', 'message': 'Invalid username or password'}}
# -*- coding: utf-8 -*-
import json
import requests
from hashlib import md5
from requests.auth import HTTPBasicAuth

api_url_base = 'http://crmaddress/webservice.php'
username = 'myusername'
accessKey = 'fghdhgfhfdhgfd'
headers = {"ContentType": "application/x-www-form-urlencoded"}

response = requests.get(api_url_base, params={"operation": "getChallenge", "username": username})
token = json.loads(response.content.decode('utf-8'))['result']['token']

key = md5(accessKey.encode('utf-8') + token.encode('utf-8')).hexdigest()
print(key)

response = requests.post(api_url_base, data={"operation": "login", "accessKey": key, "username": username},
                         auth=HTTPBasicAuth('myusername', 'mypassword'), headers=headers)
print(json.loads(response.content.decode('utf-8')))
I cannot verify without running the code, but the problem seems to be somewhere around this line:
key = md5(accessKey.encode('utf-8')+token.encode('utf-8')).hexdigest()
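One thing worth double-checking on that line (an assumption based on my reading of the vTiger webservice docs, so please verify against your install): the digest is built from the challenge token followed by the access key, i.e. md5(token + accessKey), not the other way around:

# Assumption: vTiger's getchallenge/login flow expects md5(challengeToken + accessKey),
# i.e. the challenge token comes first in the concatenation.
key = md5(token.encode('utf-8') + accessKey.encode('utf-8')).hexdigest()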
Also, instead of directly using the webservice, I would recommend creating a wrapper class. Please check out a Python 3 wrapper I wrote on GitHub. Let me know if this helps.
