I have written the AWS Lambda function below to export a DynamoDB table to an S3 bucket. But when I execute it, I get this error:
'dynamodb.ServiceResource' object has no attribute 'export_table_to_point_in_time'
import boto3
import datetime

def lambda_handler(event, context):
    client = boto3.resource('dynamodb', endpoint_url="http://localhost:8000")
    response = client.export_table_to_point_in_time(
        TableArn='table arn string',
        ExportTime=datetime(2015, 1, 1),
        S3Bucket='my-bucket',
        S3BucketOwner='string',
        ExportFormat='DYNAMODB_JSON'
    )
    print("Response :", response)
Boto3 version: 1.24.82
ExportTableToPointInTime is not available in DynamoDB Local, so if you are trying to run it locally (assumed from the localhost endpoint), you cannot.
Moreover, the Resource API does not expose that operation. You must use the low-level Client:
import boto3
dynamodb = boto3.client('dynamodb')
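For reference, here is a minimal sketch of the same export using the client interface against the real AWS endpoint (the table ARN, bucket name, and account ID below are placeholders, and the table must have point-in-time recovery enabled):

import boto3
from datetime import datetime

# Low-level client; no local endpoint, since the export API only exists on the AWS service
dynamodb = boto3.client('dynamodb')

response = dynamodb.export_table_to_point_in_time(
    TableArn='arn:aws:dynamodb:us-east-1:123456789012:table/my-table',  # placeholder ARN
    ExportTime=datetime(2023, 1, 1),       # must fall within the table's PITR window
    S3Bucket='my-export-bucket',           # placeholder bucket
    S3BucketOwner='123456789012',          # placeholder account ID
    ExportFormat='DYNAMODB_JSON'
)
print(response['ExportDescription']['ExportStatus'])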
I want to write a Lambda function in Python to enable S3 bucket default encryption if a newly created bucket does not have encryption enabled.
It needs the following steps:
Trigger the Lambda function when a new S3 bucket is created
If default encryption is not enabled, enable it automatically
Trigger an SNS topic and send an email to the administrator and bucket creator/owner
The following Lambda function I have created encrypts any existing unencrypted bucket when run periodically. I want to extend it to trigger on new bucket creation:
import json
import boto3

def lambda_handler(event, context):
    s3 = boto3.client("s3")
    response = s3.list_buckets()
    buckets = [bucket['Name'] for bucket in response['Buckets']]
    status = 401
    unencrypted_buckets = []
    for bucket in buckets:
        try:
            s3.get_bucket_encryption(Bucket=bucket)
            print(f"Bucket {bucket} has already Encryption enabled")
        except s3.exceptions.ClientError:
            unencrypted_buckets.append(bucket)
    encryption_enabled_buckets = []
    for unencrypted_bucket in unencrypted_buckets:
        try:
            print(f"Bucket {unencrypted_bucket} has no Encryption enabled")
            s3.put_bucket_encryption(
                Bucket=unencrypted_bucket,
                ServerSideEncryptionConfiguration={
                    'Rules': [
                        {
                            'ApplyServerSideEncryptionByDefault': {
                                'SSEAlgorithm': 'AES256'
                            }
                        }
                    ]
                }
            )
            encryption_enabled_buckets.append(unencrypted_bucket)
            status = 200
        except s3.exceptions.ClientError:
            status = 500
            break
    return {
        'statusCode': status,
        'details': 'Default encryption enabled',
        'encryption enabling success': encryption_enabled_buckets,
        'encryption enabling failed': list(set(unencrypted_buckets) - set(encryption_enabled_buckets)) + list(
            set(encryption_enabled_buckets) - set(unencrypted_buckets))
    }
You may not have to code this at all. Consider using AWS Config Rules for this, and other, compliance requirements.
See the AWS Config managed rules:
s3-bucket-server-side-encryption-enabled
s3-default-encryption-kms
AWS Config can send notifications via SNS; for an example, see "How can I be notified when an AWS resource is non-compliant using AWS Config?"
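If you do want to keep the Lambda approach from the question, here is a minimal sketch of a handler triggered on bucket creation (this assumes a CloudTrail-based EventBridge rule for the CreateBucket API call, and the SNS topic ARN is a placeholder; the event fields follow the "AWS API Call via CloudTrail" event shape):

import boto3

s3 = boto3.client('s3')
sns = boto3.client('sns')
TOPIC_ARN = 'arn:aws:sns:us-east-1:123456789012:bucket-encryption-alerts'  # placeholder

def lambda_handler(event, context):
    # EventBridge delivers the CloudTrail API parameters under 'detail'
    bucket = event['detail']['requestParameters']['bucketName']
    try:
        s3.get_bucket_encryption(Bucket=bucket)
        return {'statusCode': 200, 'details': f'{bucket} already has default encryption'}
    except s3.exceptions.ClientError:
        s3.put_bucket_encryption(
            Bucket=bucket,
            ServerSideEncryptionConfiguration={
                'Rules': [{'ApplyServerSideEncryptionByDefault': {'SSEAlgorithm': 'AES256'}}]
            }
        )
        sns.publish(
            TopicArn=TOPIC_ARN,
            Subject='Default encryption enabled',
            Message=f'Default encryption was enabled on newly created bucket {bucket}'
        )
        return {'statusCode': 200, 'details': f'Default encryption enabled on {bucket}'}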
I am trying to connect to AWS S3 and list the buckets from my local machine through Python. I am using the following code:
import boto3
from boto3 import Session
from boto.s3.connection import S3Connection
from boto.sts import STSConnection
import pandas
session = boto3.Session(profile_name='mfa_0729')
credentials = session.get_credentials()
dev_s3_client = session.client('s3')
dev_s3_resource = session.resource('s3')

bucketname = 'my-bucket-name'
startAfter = 'my-claim-name'

obj1 = dev_s3_client.list_objects_v2(Bucket=bucketname, StartAfter=startAfter)
There is a session token that I have to use, which I saved in the profile along with the other credentials. When executing the last line, I get the error:
botocore.exceptions.ClientError: An error occurred (AccessDenied) when calling the ListObjectsV2 operation: Access Denied
Can anyone point out what I am doing wrong? I am new to AWS and boto3.
I am trying to change the storage class of an object in S3 from Standard to Standard-IA.
This is similar to this thread, but I would like to do it using boto3 and a Lambda trigger.
thanks
You can use the copy_object method:
You can use the CopyObject action to change the storage class of an object that is already stored in Amazon S3 using the StorageClass parameter.
For example:
import boto3
s3 = boto3.client('s3')
bucket_name = '<your bucket-name>'
object_key = '<your-object-key>'
r = s3.copy_object(
    CopySource=f"{bucket_name}/{object_key}",
    Bucket=bucket_name,
    Key=object_key,
    StorageClass='STANDARD_IA')
print(r)
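Since the question mentions a Lambda trigger, here is a minimal sketch of wiring the copy into a handler subscribed to the bucket's ObjectCreated notifications (this assumes the standard S3 event notification format; the storage-class check guards against re-processing the copy's own event):

import urllib.parse
import boto3

s3 = boto3.client('s3')

def lambda_handler(event, context):
    # S3 notifications can batch several records into one invocation
    for record in event['Records']:
        bucket = record['s3']['bucket']['name']
        key = urllib.parse.unquote_plus(record['s3']['object']['key'])
        # head_object omits StorageClass for STANDARD objects, so default it here
        head = s3.head_object(Bucket=bucket, Key=key)
        if head.get('StorageClass', 'STANDARD') == 'STANDARD_IA':
            continue  # already transitioned; avoids an endless copy loop
        s3.copy_object(
            CopySource={'Bucket': bucket, 'Key': key},
            Bucket=bucket,
            Key=key,
            StorageClass='STANDARD_IA'
        )
    return {'statusCode': 200}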
I am trying to create a bucket using boto3 for Python.
Here is my code:
import boto3
response = S3_CLIENT.create_bucket(
    Bucket='symbols3arg',
    CreateBucketConfiguration={'LocationConstraint': 'eu-west-1'}
)
print(response)
I am getting the below error:
botocore.exceptions.ClientError: An error occurred (IllegalLocationConstraintException) when calling the CreateBucket operation: The unspecified location constraint is incompatible for the region specific endpoint this request was sent to.
This happens when you configured a different region during aws configure and then specify a different region (or none at all) when initializing the S3 client object.
Suppose my AWS config looks like this:
$ aws configure
AWS Access Key ID [None]: AKIAIOSFODEXAMPLE
AWS Secret Access Key [None]: wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
Default region name [None]: us-west-2
Default output format [None]: json
and my Python script for creating a bucket is:
import logging
import boto3
from botocore.exceptions import ClientError

def create_bucket(bucket_name, region=None):
    # Create bucket
    try:
        if region is None:
            s3_client = boto3.client('s3')
            s3_client.create_bucket(Bucket=bucket_name)
        else:
            s3_client = boto3.client('s3')
            location = {'LocationConstraint': region}
            s3_client.create_bucket(Bucket=bucket_name,
                                    CreateBucketConfiguration=location)
    except ClientError as e:
        logging.error(e)
        return False
    return True

create_bucket("test-bucket-in-region", "us-west-1")
This will throw the below error:
ERROR:root:An error occurred (IllegalLocationConstraintException) when calling the CreateBucket operation: The us-west-1 location constraint is incompatible for the region specific endpoint this request was sent to.
To solve this issue, all you need to do is specify the region when initializing the S3 client object. Here is a working example that creates a bucket in a different region, regardless of aws configure:
import logging
import boto3
from botocore.exceptions import ClientError

def create_bucket(bucket_name, region=None):
    """Create an S3 bucket in a specified region

    If a region is not specified, the bucket is created in the S3 default
    region (us-east-1).

    :param bucket_name: Bucket to create
    :param region: String region to create bucket in, e.g., 'us-west-2'
    :return: True if bucket created, else False
    """
    # Create bucket
    try:
        if region is None:
            s3_client = boto3.client('s3')
            s3_client.create_bucket(Bucket=bucket_name)
        else:
            s3_client = boto3.client('s3', region_name=region)
            location = {'LocationConstraint': region}
            s3_client.create_bucket(Bucket=bucket_name,
                                    CreateBucketConfiguration=location)
    except ClientError as e:
        logging.error(e)
        return False
    return True

create_bucket("my-working-bucket", "us-west-1")
Reference: create-an-amazon-s3-bucket
Send the command to S3 in the same region:
import boto3
s3_client = boto3.client('s3', region_name='eu-west-1')
response = s3_client.create_bucket(
    Bucket='symbols3arg',
    CreateBucketConfiguration={'LocationConstraint': 'eu-west-1'}
)
You can try the following code.
import boto3
client = boto3.client('s3', region_name="aws_region_code")
response = client.create_bucket(
    Bucket='string'
)
Hope it helps.
In an AWS Lambda function, I am using boto3 to put a string into an S3 file:
import boto3
s3 = boto3.client('s3')
data = s3.get_object(Bucket=XXX, Key=YYY)
data.put('Body', 'hello')
I am told this:
[ERROR] AttributeError: 'dict' object has no attribute 'put'
The same happens with data.put('hello'), which is the method recommended by the top answers at "How to write a file or data to an S3 object using boto3", and with data.put_object: 'dict' object has no attribute 'put_object'.
What am I doing wrong?
On the other hand, reading works great (with data.get('Body').read().decode('utf-8')).
put_object is a method of the s3 client object, not of the data dict returned by get_object.
Here is a full working example with Python 3.7:
import json
import boto3
import logging

s3 = boto3.client('s3')

logger = logging.getLogger()
logger.setLevel(logging.INFO)

def lambda_handler(event, context):
    bucket = 'mybucket'
    key = 'id.txt'
    id = None

    # Write id to S3
    s3.put_object(Body='Hello!', Bucket=bucket, Key=key)

    # Read id from S3
    data = s3.get_object(Bucket=bucket, Key=key)
    id = data.get('Body').read().decode('utf-8')
    logger.info("Id:" + id)

    return {
        'statusCode': 200,
        'body': json.dumps('Id:' + id)
    }
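If you prefer the resource-style API that the linked question's answers use, note that .put() is called on an Object resource, not on the dict returned by the client's get_object. A minimal sketch with a placeholder bucket and key:

import boto3

s3 = boto3.resource('s3')  # resource API, not the low-level client

# Object(...).put() writes the body; this is the .put() the linked answers refer to
s3.Object('mybucket', 'id.txt').put(Body='hello')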