I am trying to connect to AWS S3 and list objects in a bucket from my local machine through Python. I am using the following code:
import boto3

# Use the named profile that holds the MFA credentials (including the session token)
session = boto3.Session(profile_name='mfa_0729')
credentials = session.get_credentials()

dev_s3_client = session.client('s3')
dev_s3_resource = session.resource('s3')

bucketname = 'my-bucket-name'
startAfter = 'my-claim-name'

# List objects in the bucket, starting after the given key
obj1 = dev_s3_client.list_objects_v2(Bucket=bucketname, StartAfter=startAfter)
There is a session token that I have to use, which I saved in the profile along with the other credentials. When executing the last line, I get the error:
botocore.exceptions.ClientError: An error occurred (AccessDenied) when calling the ListObjectsV2 operation: Access Denied
Can anyone point out what I am doing wrong? I am new to AWS and boto3.
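For reference, a profile that carries temporary MFA credentials needs all three values in ~/.aws/credentials; the key values below are placeholders, not taken from the question:

[mfa_0729]
aws_access_key_id = <temporary-access-key>
aws_secret_access_key = <temporary-secret-key>
aws_session_token = <session-token>

Also note that temporary credentials expire, so an AccessDenied error can mean the session token is stale or that the authenticated identity simply lacks s3:ListBucket permission on the bucket.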
I have written the AWS Lambda function below to export a DynamoDB table to an S3 bucket. But when I execute the code, I get the error:
'dynamodb.ServiceResource' object has no attribute 'export_table_to_point_in_time'
import boto3
from datetime import datetime

def lambda_handler(event, context):
    client = boto3.resource('dynamodb', endpoint_url="http://localhost:8000")
    response = client.export_table_to_point_in_time(
        TableArn='table arn string',
        ExportTime=datetime(2015, 1, 1),
        S3Bucket='my-bucket',
        S3BucketOwner='string',
        ExportFormat='DYNAMODB_JSON'
    )
    print("Response :", response)
Boto3 version: 1.24.82
ExportTableToPointInTime is not available on DynamoDB Local, so if you are trying to run it locally (assumed from the localhost endpoint), you cannot.
Moreover, the resource interface does not expose that API; you must use the low-level client:
import boto3
dynamodb = boto3.client('dynamodb')
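Putting the two fixes together, a minimal sketch of a corrected handler (the table ARN and bucket below are placeholders, and point-in-time recovery must be enabled on the table for the export to succeed):

import boto3
from datetime import datetime

def lambda_handler(event, context):
    # Use the low-level client and point it at real DynamoDB:
    # the export API is not supported by DynamoDB Local.
    client = boto3.client('dynamodb')
    response = client.export_table_to_point_in_time(
        TableArn='arn:aws:dynamodb:us-east-1:123456789012:table/my-table',  # placeholder
        ExportTime=datetime(2015, 1, 1),  # must fall within the table's PITR window
        S3Bucket='my-bucket',
        ExportFormat='DYNAMODB_JSON'
    )
    print("Response :", response)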
I am trying to read a parquet file that is in S3 using pandas.
Below is the code:
import boto3
import pandas as pd

key = 'key'
secret = 'secret'
s3_client = boto3.client(
    's3',
    aws_access_key_id=key,
    aws_secret_access_key=secret,
    region_name='region_name'
)
print(s3_client)

AWS_S3_BUCKET = 'bucket_name'
filePath = 'data/wine_dataset'
response = s3_client.get_object(Bucket=AWS_S3_BUCKET, Key=filePath)
status = response.get("ResponseMetadata", {}).get("HTTPStatusCode")
if status == 200:
    print(f"Successful S3 get_object response. Status - {status}")
    books_df = pd.read_parquet(response.get("Body"))
    print(books_df)
else:
    print(f"Unsuccessful S3 get_object response. Status - {status}")
I am getting the below error
NoSuchKey: An error occurred (NoSuchKey) when calling the GetObject operation: The specified key does not exist.
But when I read the same S3 path using PySpark, it worked:
path= 's3a://bucket_name/data/wine_dataset'
df = spark.read.parquet(path)
I am not sure why it is not working using pandas. Can anyone help me on this?
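One likely explanation (an assumption based on the paths shown, not confirmed by the question): s3a://bucket_name/data/wine_dataset is a directory of part files written by Spark, so PySpark can read the whole prefix, while get_object requires the exact key of a single object, and data/wine_dataset by itself matches nothing. A quick sketch to see which keys actually exist under the prefix:

import boto3

s3_client = boto3.client('s3')

# A Spark-written dataset usually contains part files such as
# data/wine_dataset/part-00000-<id>.parquet rather than a single object.
response = s3_client.list_objects_v2(Bucket='bucket_name', Prefix='data/wine_dataset')
for obj in response.get('Contents', []):
    print(obj['Key'])

Passing one of the listed part-file keys to get_object, or reading the prefix as a whole with pd.read_parquet('s3://bucket_name/data/wine_dataset/') (which requires s3fs to be installed), should avoid the NoSuchKey error.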
My Python version is 3.9 and I am writing an AWS Lambda function using boto3.
In addition to attaching full admin, S3, and DataSync access policies, I also created a trust relationship, but I still receive the following error.
I wonder if anyone has experienced the same issue and solved it.
"errorMessage": "An error occurred (InvalidRequestException) when
calling the CreateTask operation: Invalid parameter: ARN account ()
must match authenticated user.",
import json
import logging
import sys
import boto3
from botocore.exceptions import ClientError

logger = logging.getLogger(__name__)

client = boto3.client('datasync', region_name='us-east-1')

create_location_s3 = client.create_location_s3(
    Subdirectory='/',
    S3BucketArn='arn:aws:s3:::data-sync-bucket',
    S3StorageClass='STANDARD',
    S3Config={
        'BucketAccessRoleArn': 'arn:aws:iam::XXXX:role/datasync-data-sync-bucket-ARN'
    },
    AgentArns=[
        '',  # note: this empty ARN has no account ID, matching the "ARN account ()" in the error
    ],
    Tags=[
        {
            'Key': 'name',
            'Value': 'datasync-lambda'
        },
    ]
)
I think your error is more related to DataSync than to IAM or Lambda.
Please go through the links below, even though they may not fully resolve it:
https://githubmemory.com/repo/hashicorp/terraform/issues/29593
https://issueexplorer.com/issue/hashicorp/terraform/29593
If you have an AWS support subscription, I would recommend opening a case, since they can help you better with this issue. Otherwise, you can go through the documentation provided by AWS on DataSync at https://docs.aws.amazon.com/datasync/latest/userguide/sync-dg.pdf
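As a first check (an assumption based on the empty account ID in the error message, not something confirmed above): verify which account the Lambda's credentials authenticate as, and make sure every ARN in the request, including AgentArns, carries that same account. A minimal sketch:

import boto3

# Print the account ID the current credentials resolve to; every ARN
# passed to DataSync (bucket access role, agents) must use this account.
sts = boto3.client('sts')
identity = sts.get_caller_identity()
print(identity['Account'], identity['Arn'])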
I am trying to create a bucket using the AWS Python SDK (boto3).
Here is my code:
import boto3

S3_CLIENT = boto3.client('s3')  # assumption: the client was created without an explicit region

response = S3_CLIENT.create_bucket(
    Bucket='symbols3arg',
    CreateBucketConfiguration={'LocationConstraint': 'eu-west-1'}
)
print(response)
I am getting the below error:
botocore.exceptions.ClientError: An error occurred (IllegalLocationConstraintException) when calling the CreateBucket operation: The unspecified location constraint is incompatible for the region specific endpoint this request was sent to.
This happens when you configured one region during aws configure but specify a different region when creating the bucket, without setting that region on the S3 client object.
Suppose my AWS config looks like:
$ aws configure
AWS Access Key ID [None]: AKIAIOSFODEXAMPLE
AWS Secret Access Key [None]: wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
Default region name [None]: us-west-2
Default output format [None]: json
and my Python script for creating a bucket is:
import logging
import boto3
from botocore.exceptions import ClientError

def create_bucket(bucket_name, region=None):
    # Create bucket
    try:
        if region is None:
            s3_client = boto3.client('s3')
            s3_client.create_bucket(Bucket=bucket_name)
        else:
            s3_client = boto3.client('s3')  # note: no region passed here, which is the mistake
            location = {'LocationConstraint': region}
            s3_client.create_bucket(Bucket=bucket_name,
                                    CreateBucketConfiguration=location)
    except ClientError as e:
        logging.error(e)
        return False
    return True

create_bucket("test-bucket-in-region", "us-west-1")
This will throw the below error
ERROR:root:An error occurred (IllegalLocationConstraintException) when calling the CreateBucket operation: The us-west-1 location constraint is incompatible for the region specific endpoint this request was sent to.
To solve this issue, all you need to do is specify the region when initiating the S3 client object. A working example that creates a bucket in a different region, regardless of aws configure:
import logging
import boto3
from botocore.exceptions import ClientError

def create_bucket(bucket_name, region=None):
    """Create an S3 bucket in a specified region

    If a region is not specified, the bucket is created in the S3 default
    region (us-east-1).

    :param bucket_name: Bucket to create
    :param region: String region to create bucket in, e.g., 'us-west-2'
    :return: True if bucket created, else False
    """
    # Create bucket
    try:
        if region is None:
            s3_client = boto3.client('s3')
            s3_client.create_bucket(Bucket=bucket_name)
        else:
            s3_client = boto3.client('s3', region_name=region)
            location = {'LocationConstraint': region}
            s3_client.create_bucket(Bucket=bucket_name,
                                    CreateBucketConfiguration=location)
    except ClientError as e:
        logging.error(e)
        return False
    return True

create_bucket("my-working-bucket", "us-west-1")
(This follows the "Create an Amazon S3 bucket" example from the boto3 documentation.)
Send the command to S3 in the same region:
import boto3

s3_client = boto3.client('s3', region_name='eu-west-1')
response = s3_client.create_bucket(
    Bucket='symbols3arg',
    CreateBucketConfiguration={'LocationConstraint': 'eu-west-1'}
)
You can try the following code.
import boto3

client = boto3.client('s3', region_name="aws_region_code")
# For any region other than us-east-1 you must also pass
# CreateBucketConfiguration={'LocationConstraint': ...}
response = client.create_bucket(
    Bucket='string'
)
Hope it helps.
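If you want the LocationConstraint to always match the client, one option (a sketch assuming your default region is set via aws configure or environment variables; the bucket name below is a placeholder) is to read the region from the session instead of hard-coding it:

import boto3

session = boto3.session.Session()
region = session.region_name  # picked up from config/environment

s3_client = session.client('s3')
if region == 'us-east-1':
    # us-east-1 is the default and must not be passed as a LocationConstraint
    s3_client.create_bucket(Bucket='my-unique-bucket-name')
else:
    s3_client.create_bucket(
        Bucket='my-unique-bucket-name',
        CreateBucketConfiguration={'LocationConstraint': region}
    )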
I am trying to download a file from an Amazon S3 bucket to my local machine using the code below, but I get an error saying "Unable to locate credentials".
Given below is the code I have written:
from boto3.session import Session
import boto3

ACCESS_KEY = 'ABC'
SECRET_KEY = 'XYZ'

session = Session(aws_access_key_id=ACCESS_KEY,
                  aws_secret_access_key=SECRET_KEY)
s3 = session.resource('s3')
your_bucket = s3.Bucket('bucket_name')

for s3_file in your_bucket.objects.all():
    print(s3_file.key)  # prints the contents of bucket

s3 = boto3.client('s3')
s3.download_file('your_bucket', 'k.png', '/Users/username/Desktop/k.png')
Could anyone help me on this?
You are not using the session you created to download the file; you're using the s3 client you created. If you want to use the client, you need to specify credentials.
your_bucket.download_file('k.png', '/Users/username/Desktop/k.png')
or
s3 = boto3.client('s3', aws_access_key_id=... , aws_secret_access_key=...)
s3.download_file('your_bucket','k.png','/Users/username/Desktop/k.png')
From an example in the official documentation, the correct format is:
import boto3
s3 = boto3.client('s3', aws_access_key_id=... , aws_secret_access_key=...)
s3.download_file('BUCKET_NAME', 'OBJECT_NAME', 'FILE_NAME')
You can also use a file-like object opened in binary mode.
s3 = boto3.client('s3', aws_access_key_id=..., aws_secret_access_key=...)
with open('FILE_NAME', 'wb') as f:
    s3.download_fileobj('BUCKET_NAME', 'OBJECT_NAME', f)
    f.seek(0)
The code in question uses s3 = boto3.client('s3'), which does not provide any credentials.
The format for authenticating a client is shown here:
import boto3

client = boto3.client(
    's3',
    aws_access_key_id=ACCESS_KEY,
    aws_secret_access_key=SECRET_KEY,
    aws_session_token=SESSION_TOKEN,
)

# Or via the Session
session = boto3.Session(
    aws_access_key_id=ACCESS_KEY,
    aws_secret_access_key=SECRET_KEY,
    aws_session_token=SESSION_TOKEN,
)
And lastly, you can also re-use the authenticated session you created to get the bucket, and then download the file from the bucket.
from boto3.session import Session

ACCESS_KEY = 'ABC'
SECRET_KEY = 'XYZ'

session = Session(aws_access_key_id=ACCESS_KEY,
                  aws_secret_access_key=SECRET_KEY)

# session is authenticated and can access the resource in question
session.resource('s3') \
    .Bucket('bucket_name') \
    .download_file('k.png', '/Users/username/Desktop/k.png')
For others trying to download files from AWS S3 and looking for a more user-friendly solution with other industrial-strength features, check out https://github.com/d6t/d6tpipe. It abstracts the S3 functions into a simpler interface. It also supports directory sync, uploading files, permissions, and many other things you need to sync files from S3 (and FTP).
import d6tpipe

api = d6tpipe.api.APILocal()  # keep permissions locally for security
settings = {
    'name': 'my-files',
    'protocol': 's3',
    'location': 'bucket-name',
    'readCredentials': {
        'aws_access_key_id': 'AAA',
        'aws_secret_access_key': 'BBB'
    }
}
d6tpipe.api.create_pipe_with_remote(api, settings)

pipe = d6tpipe.Pipe(api, 'my-files')
pipe.scan_remote()    # show all files
pipe.pull_preview()   # preview
pipe.pull(['k.png'])  # download single file
pipe.pull()           # download all files
pipe.files()          # show files
file = open(pipe.dirpath / 'k.png')  # access file
You can set up your AWS profile with awscli to avoid putting your credentials in the file. First add your profile:
aws configure --profile account1
Then in your code add:
aws_session = boto3.Session(profile_name="account1")
s3_client = aws_session.client('s3')
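For reference, that command stores the profile in your AWS config files; the resulting ~/.aws/credentials section looks roughly like this (the key values are placeholders):

[account1]
aws_access_key_id = <your-access-key>
aws_secret_access_key = <your-secret-key>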
Filename:
Can be any name; the file will be downloaded under that name.
It can be placed in any existing local directory.
Key:
Is the S3 path of the object, including the file name at the end.
It does not start with a slash.
Session():
Automatically picks up the credentials from ~/.aws/config or ~/.aws/credentials.
If not, you need to pass them explicitly.
from boto3.session import Session
import boto3

# Let's use Amazon S3
s3 = boto3.resource("s3")

# Print out bucket names to check you have accessibility
# for bucket in s3.buckets.all():
#     print(bucket.name)

session = Session()
# OR, with explicit credentials (the values below are placeholders):
# session = Session(aws_access_key_id="<ACCESS_KEY>",
#                   aws_secret_access_key="<SECRET_KEY>",
#                   region_name="eu-west-1")

session.resource('s3').Bucket('bucket-logs').download_file(
    Key="logs/20221122_0_5ee03da676ac566336e2279decfc77b3.gz",
    Filename="/tmp/Local_file_name.gz")