Boto3 not assuming IAM role from credentials where aws-cli does without problem - python-3.x

I am setting up some file transfer scripts and am using boto3 to do this.
I need to send some files from my local machine to a third-party AWS account (cross-account). I have a role set up on the other account with permissions to write to the bucket, and have assigned this role to a user on my account.
I can do this without any problem from the CLI, but boto3 keeps kicking out an AccessDenied error for the bucket.
I have read through the boto3 docs on this area (such as they are) here, and have set up the credential and config files as they are supposed to be (I assume they are correct since the CLI approach works), but I am unable to get this working.
Credential File:-
[myuser]
aws_access_key_id = XXXXXXXXXXXXXXXXXXXXXX
aws_secret_access_key = XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
Config File:-
[profile crossaccount]
region = eu-west-2
source_profile = myuser
role_arn = arn:aws:iam::0123456789:role/crossaccountrole
and here is the code I am trying to get working with this:-
import boto3

# set up variables
bucket_name = 'otheraccountbucket'
file_name = 'C:\\Users\\test\\testfile.csv'
object_name = 'testfile.csv'

# create a boto3 session with the profile name so the assume-role call is made with the correct credentials
session = boto3.Session(profile_name='crossaccount')

# create an s3 client from that profile-based session
s3_client = session.client('s3')

# try to upload the file
response = s3_client.upload_file(
    file_name, bucket_name, object_name,
    ExtraArgs={'ACL': 'bucket-owner-full-control'}
)
EDIT:
In response to John's multipart-permission comment, I have tried uploading via the put_object method to bypass this, but I am still getting AccessDenied, now on the PutObject permission, which I have confirmed is in place:-
import boto3

# set up variables
bucket_name = 'otheraccountbucket'
file_name = 'C:\\Users\\test\\testfile.csv'
object_name = 'testfile.csv'

# create a boto3 session with the profile name so the assume-role call is made with the correct credentials
session = boto3.Session(profile_name='crossaccount')

# create an s3 client from that profile-based session
s3_client = session.client('s3')

# try to upload the file
with open(file_name, 'rb') as fd:
    response = s3_client.put_object(
        ACL='bucket-owner-full-control',
        Body=fd,
        Bucket=bucket_name,
        ContentType='text/csv',
        Key=object_name
    )
crossaccountrole has PutObject permissions; the error is:-
An error occurred (AccessDenied) when calling the PutObject operation: Access Denied
END EDIT
Here is the working aws-cli command:-
aws s3 cp "C:\Users\test\testfile.csv" s3://otheraccountbucket --profile crossaccount
I am expecting this to upload correctly, as the equivalent CLI command does, but instead I get an S3UploadFailedError exception: An error occurred (AccessDenied) when calling the CreateMultipartUpload operation: Access Denied
Any help would be much appreciated.

I had this same problem; my issue ended up being that the AWS CLI was configured with different credentials than the Python app in which I was trying to use boto3 to upload files into an S3 bucket.
Here's what worked for me (this only applies if you have the AWS CLI installed):
Open your command line or terminal
Type aws configure
When prompted, enter the access key ID and secret key of the IAM user you are using for your Python boto3 app
Run your Python app and test boto3; you should no longer get the access denied message
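As a quick sanity check, you can ask STS which identity boto3 has actually resolved for a given session; if the profile-based assume role is working, the ARN should show the assumed role rather than the IAM user. A minimal sketch (the profile and role names are the ones from the question above):
import boto3

# Build a session from the profile that is supposed to assume the cross-account role
session = boto3.Session(profile_name='crossaccount')

# GetCallerIdentity returns the account and ARN of the credentials actually in use
identity = session.client('sts').get_caller_identity()
print(identity['Account'], identity['Arn'])
# For a working assume-role profile, the Arn should look like
# arn:aws:sts::<account-id>:assumed-role/crossaccountrole/<session-name>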

Related

Upload a file from a form to an S3 bucket using boto3, where the handler is created in Lambda

I want to upload small image and audio files from a form to S3, using Postman for testing. I successfully uploaded files to an AWS S3 bucket from my application running on my local machine. The following is the part of the code I used for file uploading.
import os
import uuid

import boto3

s3_client = boto3.client('s3',
                         aws_access_key_id=AWS_ACCESS_KEY_ID,
                         aws_secret_access_key=AWS_SECRET_ACCESS_KEY)

async def save_file_static_folder(file, endpoint, user_id):
    _, ext = os.path.splitext(file.filename)
    raw_file_name = f'{uuid.uuid4().hex}{ext}'
    # Save image file in folder
    if ext.lower() in image_file_extension:
        relative_file_folder = user_id + '/' + endpoint
        contents = await file.read()
        try:
            response = s3_client.put_object(Bucket=S3_BUCKET_NAME,
                                            Key=relative_file_folder + '/' + raw_file_name,
                                            Body=contents)
        except Exception:
            return FileEnum.ERROR_ON_INSERT
I called this function from another endpoint; the form data (e.g. name, date of birth and other details) is successfully saved in the MongoDB database and the files are uploaded to the S3 bucket.
This app uses FastAPI, and files are uploaded to the S3 bucket when the app runs on my local machine.
The same app is deployed to AWS Lambda with S3 as storage. To handle the whole app, the following is added in the endpoint file:
handler = Mangum(app)
After deploying the app to AWS (the Lambda function was created from the AWS root user account), files did not get uploaded to the S3 bucket.
If I do not provide files in the form, the AWS API endpoint works successfully: the form data gets stored in the MongoDB database (MongoDB Atlas) and the app works fine hosted on Lambda.
The app deployed with the Lambda function works successfully except for file uploads in the form. On my local machine, file uploads to S3 succeed.
EDIT
While tracing in CloudWatch, I got the following error:
exception An error occurred (InvalidAccessKeyId) when calling the PutObject operation: The AWS Access Key Id you provided does not exist in our records.
I checked the AWS access key ID and secret key many times; they are correct, and they are the root user credentials.
It looks like you have configured your Lambda function with an execution IAM role, but you are overriding the AWS credentials supplied to the boto3 SDK here:
s3_client = boto3.client('s3',aws_access_key_id =AWS_ACCESS_KEY_ID,aws_secret_access_key = AWS_SECRET_ACCESS_KEY,)
You don't need to provide credentials explicitly because the boto3 SDK (and all language SDKs) will automatically retrieve credentials dynamically for you. So, ensure that your Lambda function is configured with the correct IAM role, and then change your code as follows:
s3_client = boto3.client('s3')
As an aside, you indicated that you may be using AWS root credentials. It's generally a best security practice in AWS to not use root credentials. Instead, create IAM roles and IAM users.
We strongly recommend that you do not use the root user for your everyday tasks, even the administrative ones. Instead, adhere to the best practice of using the root user only to create your first IAM user. Then securely lock away the root user credentials and use them to perform only a few account and service management tasks.
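For context, here is a minimal sketch of a Lambda handler that relies purely on the execution role's credentials; the bucket and key names are placeholders, not values from the question, and the role attached to the function must allow s3:PutObject on that bucket:
import boto3

# No keys passed: inside Lambda, boto3 picks up temporary credentials
# from the function's execution role automatically.
s3_client = boto3.client('s3')

def lambda_handler(event, context):
    # Placeholder bucket/key for illustration only
    s3_client.put_object(Bucket='example-bucket',
                         Key='uploads/example.txt',
                         Body=b'hello from lambda')
    return {'statusCode': 200}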

Need help for AWS lambda

I am working on an issue where I need Lambda to write logs to an S3 bucket, but the tricky part is that Lambda will read the logs and write them to another S3 bucket that is in another AWS account. Can we achieve this?
I wrote some code but it isn't working.
from urllib.request import urlopen
import boto3
import os
import time

BUCKET_NAME = '***'
CSV_URL = f'***'

def lambda_handler(event, context):
    response = urlopen(CSV_URL)
    s3 = boto3.client('s3')
    s3.upload_fileobj(response, BUCKET_NAME, time.strftime('%Y/%m/%d'))
    response.close()
It sounds like you are asking how to allow the Lambda function to create an object in an Amazon S3 bucket that belongs to a different AWS Account.
Bucket Policy on target bucket
The simplest method is to ask the owner of the target bucket (that is, somebody with Admin permissions in that other AWS Account) to add a Bucket Policy that permits PutObject access to the IAM Role being used by the AWS Lambda function. You will need to supply them with the ARN of the IAM Role being used by the Lambda function.
Also, make sure that the IAM Role has been given permission to write to the target bucket. Please note that two sets of permissions are required: the IAM Role needs to be allowed to write to the bucket in the other account, AND the bucket needs to permit access by the IAM Role. This double set of permissions is required because both accounts need to permit the access.
It is possible that you might need to grant some additional permissions, such as PutObjectACL.
Assuming an IAM Role from the target account
An alternative method (instead of using the Bucket Policy) is:
Create an IAM Role in the target account and give it permission to access the bucket
Grant trust permissions so that the IAM Role used by the Lambda function is allowed to 'Assume' the IAM Role in the target account
Within the Lambda function, use the AssumeRole() API call to obtain credentials from the target account
Use those credentials when connecting to S3, which will allow you to access the bucket in the other account
Frankly, creating the Bucket Policy is a lot easier.
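That said, here is a rough sketch of what the assume-role alternative could look like inside the Lambda function; the role ARN, bucket, and key names are placeholders, not values from the question:
import boto3

def lambda_handler(event, context):
    # Assume the role that lives in the target account (placeholder ARN)
    sts = boto3.client('sts')
    assumed = sts.assume_role(
        RoleArn='arn:aws:iam::111122223333:role/cross-account-write-role',
        RoleSessionName='lambda-cross-account-upload'
    )
    creds = assumed['Credentials']

    # Build an S3 client from the temporary credentials returned by STS
    s3 = boto3.client(
        's3',
        aws_access_key_id=creds['AccessKeyId'],
        aws_secret_access_key=creds['SecretAccessKey'],
        aws_session_token=creds['SessionToken']
    )

    # Write to the bucket in the other account (placeholder names)
    s3.put_object(Bucket='target-account-bucket',
                  Key='logs/example.log',
                  Body=b'log contents')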

How to store and access microsoft office365 account token inside AWS Lambda in python3.6

I have zipped and uploaded the Python library O365 (for accessing the MS Outlook calendar) as an AWS Lambda layer. I'm able to import it, but the problem is the authorization. When I tested it locally, the bearer token was generated and stored in a local txt file using the FileSystemTokenBackend.
But when I load this into AWS Lambda using layers, it again asks for the copy-paste-the-URL process, and it is not able to fetch the token file from the layer.
I have also tried FireSystemTokenBackend, but I failed to configure that successfully. I used the Token storage docs locally while testing the functionality.
My question is how to store and authenticate my account using the token file generated locally, because in AWS Lambda the input() call throws an error at runtime. How can I keep that token file inside AWS Lambda and use it without doing the authentication every time?
I have faced the same issue. The Lambda filesystem is ephemeral, so you would need to do the authentication process every time you run the function, and the O365 lib will ask for the URL.
So try saving your token (o365_token.txt) in S3 instead of keeping it on the Lambda filesystem, and then use this token for authentication.
I hope this code will help you:
import boto3
from O365 import Account, FileSystemTokenBackend

bucket_name = 'bucket_name'  # replace with your bucket name
filename_token = 'o365_token.txt'

# replace with your AWS credentials
s3 = boto3.resource('s3', aws_access_key_id='xxxx', aws_secret_access_key='xxxx')

# Read the token from S3 and save it to the /tmp directory in Lambda
s3.Bucket(bucket_name).download_file(filename_token, f'/tmp/{filename_token}')

# Read the token from the /tmp directory
token_backend = FileSystemTokenBackend(token_path='/tmp',
                                       token_filename=filename_token)

# Your Azure credentials
credentials = ('xxxx', 'xxxx')
account = Account(credentials, token_backend=token_backend)

# Then do the normal authentication process and include the refresh token command
if not account.is_authenticated:
    account.authenticate()
account.connection.refresh_token()
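One caveat worth adding (my assumption, not part of the original answer): if the token gets refreshed during the invocation, the updated o365_token.txt only exists in /tmp and is lost when the execution environment is recycled, so you may want to upload it back to S3 afterwards. A rough sketch, reusing the same placeholder names as above:
import boto3

bucket_name = 'bucket_name'        # same placeholder bucket as above
filename_token = 'o365_token.txt'

s3 = boto3.resource('s3')          # or with explicit credentials as above

# Push the (possibly updated) token file back to S3 so the next invocation
# starts from the newest refresh token.
s3.Bucket(bucket_name).upload_file(f'/tmp/{filename_token}', filename_token)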

Python S3 file upload showing permission error

I am trying to upload a file to S3 in Python. So far my code is like this:
import boto3
from botocore.exceptions import NoCredentialsError

ACCESS_KEY = 'XXXXXXXXXXXX'
SECRET_KEY = 'XXXXXXXXXXXX'

def upload_to_aws(local_file, bucket, s3_file):
    s3 = boto3.client('s3', aws_access_key_id=ACCESS_KEY,
                      aws_secret_access_key=SECRET_KEY)
    try:
        s3.upload_file(local_file, bucket, s3_file)
        print("Upload Successful")
        return True
    except FileNotFoundError:
        print("The file was not found")
        return False
    except NoCredentialsError:
        print("Credentials not available")
        return False

uploaded = upload_to_aws('image-1.png', 'bucketname', 'image-1.png')
But when I try to run the code, it shows an error like:
boto3.exceptions.S3UploadFailedError: Failed to upload image-1.png to bucketname/image-1.png: An error occurred (AccessDenied) when calling the PutObject operation: Access Denied
I have checked the bucket permissions and they are fine. The settings are like this:
Block all public access: Off
Block public access to buckets and objects granted through new access control lists (ACLs): Off
Block public access to buckets and objects granted through any access control lists (ACLs): Off
Block public access to buckets and objects granted through new public bucket policies: On
Block public and cross-account access to buckets and objects through any public bucket policies: On
A couple of things:
Make sure that you actually created the security credential successfully via the dashboard.
Did you create the bucket? If you did not create the bucket and the user or group your security credential corresponds to doesn't have permission to upload to that bucket, then you wouldn't be able to put objects into it, right?
Minor detail: please use environment variables instead of copy-pasting your credentials into the script:
import os
ACCESS_KEY = os.environ['access_key_id']
SECRET_KEY = os.environ['secret_access_key']
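Going one step further (my suggestion, not part of the original answer): if you export the standard variable names AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY, boto3's default credential chain picks them up on its own, so you don't need to pass keys to the client at all. A minimal sketch:
import boto3

# With AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY (and optionally
# AWS_DEFAULT_REGION) set in the environment, no explicit keys are needed.
s3 = boto3.client('s3')
s3.upload_file('image-1.png', 'bucketname', 'image-1.png')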

GCP Cloud Storage file push by python using service account json file

I have written a simple Python program as per the Google documentation. It throws an error saying the given account does not have access. I tried different combinations but they didn't work.
I have cross-checked the given access by supplying the same credentials to a Java program and to gsutil. In both of those places I am able to access the bucket and upload the file. The issue is only with the Python program. Kindly shed some light on this issue.
from google.oauth2 import service_account
from google.cloud import storage

credentials = service_account.Credentials.from_service_account_file(
    'C:/Users/AWS/python/sit.json',
    scopes=['https://www.googleapis.com/auth/cloud-platform'])
storage_client = storage.Client(credentials=credentials, project='proj-sit')
bucket = storage_client.get_bucket('b-sit')
blob = bucket.blob('myfile')
blob.upload_from_string('New contents!. This is test.')
and I have received the error below:
Traceback (most recent call last):
  File "C:\Users\AWS\python\mypgm.py", line 21, in <module>
    bucket = storage_client.get_bucket('pearson-gcss-sit') # pearson-gcss-sit pearson-bbi-dev global-integration-nonprod
  File "D:\Development_Avecto\Python36\lib\site-packages\google\cloud\storage\client.py", line 227, in get_bucket
    bucket.reload(client=self)
  File "D:\Development_Avecto\Python36\lib\site-packages\google\cloud\storage\_helpers.py", line 106, in reload
    method="GET", path=self.path, query_params=query_params, _target_object=self
  File "D:\Development_Avecto\Python36\lib\site-packages\google\cloud\_http.py", line 319, in api_request
    raise exceptions.from_http_response(response)
google.api_core.exceptions.Forbidden: 403 GET https://www.googleapis.com/storage/v1/b/b-sit?projection=noAcl: someid-sit@someinfo-sit.iam.gserviceaccount.com does not have storage.buckets.get access to b-sit.
[Finished in 10.6s]
Note: I can see the role 'storage.objectAdmin' in console.cloud.google.com.
For more information, I can upload the files using the Java program below.
GoogleCredentials credentials = GoogleCredentials.fromStream(new FileInputStream(connectionKeyPath))
.createScoped(Lists.newArrayList("https://www.googleapis.com/auth/cloud-platform"));
Storage storage = StorageOptions.newBuilder().setCredentials(credentials).build().getService();
BlobId blobId = BlobId.of("some-sit", "cloudDirectory/file.zip");
BlobInfo blobInfo = BlobInfo.newBuilder(blobId).setContentType("application/zip").build();
Blob blob = storage.create(blobInfo, fileContent);
I found the root cause of the issue.
Root cause: my service account has 'roles/storage.objectAdmin' on the bucket, and that role does not include the 'storage.buckets.get' permission, hence the above error on the line with the get_bucket call. I found this in the Google documentation.
All the sample code in the documentation uses get_bucket to upload files. My question is: how can we upload files to the bucket without this permission (storage.buckets.get)? We uploaded to the same bucket from Java without it.
Can you shed some light on this, please?
The service account you are using does not have the proper permissions.
You can solve this issue by granting at least the roles/storage.objectAdmin role at bucket or project level.
The roles/storage.objectAdmin role:
Grants full control over objects, including listing, creating, viewing, and deleting objects.
To grant it at bucket level run:
gsutil iam ch serviceAccount:someid-sit@someinfo-sit.iam.gserviceaccount.com:roles/storage.objectAdmin gs://[BUCKET_NAME]
To grant it at project level run:
gcloud projects add-iam-policy-binding yourProject --member serviceAccount:someid-sit@someinfo-sit.iam.gserviceaccount.com --role roles/storage.objectAdmin
EDIT:
You need to pass the credentials to the storage_client:
storage_client = storage.Client('proj-sit', credentials=credentials)
I removed the get_bucket line from my code and added the line below. That did the trick for me, so I can upload the file with only 'roles/storage.objectAdmin':
bucket = storage_client.bucket('b-sit')
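Putting it together, a minimal sketch of the upload path that avoids storage.buckets.get entirely, using the placeholder names from the question: client.bucket() only builds a bucket reference locally without calling the API, and upload_from_string needs just storage.objects.create.
from google.oauth2 import service_account
from google.cloud import storage

credentials = service_account.Credentials.from_service_account_file(
    'C:/Users/AWS/python/sit.json',
    scopes=['https://www.googleapis.com/auth/cloud-platform'])
storage_client = storage.Client(project='proj-sit', credentials=credentials)

# bucket() builds a local reference only; unlike get_bucket() it does not
# issue a GET on the bucket, so storage.buckets.get is not required.
bucket = storage_client.bucket('b-sit')
blob = bucket.blob('myfile')
blob.upload_from_string('New contents!. This is test.')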
