AWS Lambda function to enable Default Encryption on bucket creation - python-3.x

I want to write a Lambda function in Python that enables S3 bucket default encryption if a newly created bucket does not have encryption enabled.
It needs to perform the following steps:
Trigger the Lambda function when a new S3 bucket is created
If default encryption is not enabled, enable it automatically
Trigger an SNS topic and send an email to the administrator and the bucket creator/owner
The following Lambda function I have created encrypts any existing unencrypted buckets when run periodically. I want to extend it to trigger on new bucket creation.
import json
import boto3

def lambda_handler(event, context):
    s3 = boto3.client("s3")
    response = s3.list_buckets()
    buckets = [bucket['Name'] for bucket in response['Buckets']]
    status = 401
    unencrypted_buckets = []
    for bucket in buckets:
        try:
            s3.get_bucket_encryption(Bucket=bucket)
            print(f"Bucket {bucket} has already Encryption enabled")
        except s3.exceptions.ClientError:
            unencrypted_buckets.append(bucket)
    encryption_enabled_buckets = []
    for unencrypted_bucket in unencrypted_buckets:
        try:
            print(f"Bucket {unencrypted_bucket} has no Encryption enabled")
            s3.put_bucket_encryption(
                Bucket=unencrypted_bucket,
                ServerSideEncryptionConfiguration={
                    'Rules': [
                        {
                            'ApplyServerSideEncryptionByDefault': {
                                'SSEAlgorithm': 'AES256'
                            }
                        }
                    ]
                }
            )
            encryption_enabled_buckets.append(unencrypted_bucket)
            status = 200
        except s3.exceptions.ClientError:
            status = 500
            break
    return {
        'statusCode': status,
        'details': 'Default encryption enabled',
        'encryption enabling success': encryption_enabled_buckets,
        'encryption enabling failed': list(set(unencrypted_buckets) - set(encryption_enabled_buckets)) + list(
            set(encryption_enabled_buckets) - set(unencrypted_buckets))
    }
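The event-driven extension described in the steps above would look roughly like the sketch below. This is only a sketch: it assumes an EventBridge rule that matches CloudTrail CreateBucket events (CloudTrail management events must be recorded for that) and a hypothetical SNS_TOPIC_ARN environment variable pointing at the notification topic.

import json
import os

import boto3

s3 = boto3.client("s3")
sns = boto3.client("sns")

def lambda_handler(event, context):
    # EventBridge delivers the CloudTrail record for CreateBucket in event['detail']
    detail = event["detail"]
    bucket = detail["requestParameters"]["bucketName"]
    creator = detail.get("userIdentity", {}).get("arn", "unknown")

    try:
        s3.get_bucket_encryption(Bucket=bucket)
        return {"statusCode": 200, "details": f"Bucket {bucket} already has encryption enabled"}
    except s3.exceptions.ClientError:
        pass  # no default encryption configured yet

    s3.put_bucket_encryption(
        Bucket=bucket,
        ServerSideEncryptionConfiguration={
            "Rules": [{"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}]
        },
    )

    # Notify the administrator / bucket creator (SNS_TOPIC_ARN is an assumed environment variable)
    sns.publish(
        TopicArn=os.environ["SNS_TOPIC_ARN"],
        Subject=f"Default encryption enabled on {bucket}",
        Message=json.dumps({"bucket": bucket, "createdBy": creator}),
    )
    return {"statusCode": 200, "details": f"Default encryption enabled on {bucket}"}

The matching EventBridge rule would filter on source aws.s3 and eventName CreateBucket.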

You may not have to code this at all. Consider using AWS Config Rules for this, and other, compliance requirements.
See the AWS Config managed rules:
s3-bucket-server-side-encryption-enabled
s3-default-encryption-kms
AWS Config can send notifications via SNS; for an example, see the AWS knowledge-center article "How can I be notified when an AWS resource is non-compliant using AWS Config?"
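If you prefer to set the managed rule up programmatically rather than through the console, a rough sketch with boto3 might look like this (the rule name here is arbitrary, the SourceIdentifier is the managed rule's identifier, and an AWS Config recorder must already be running in the account):

import boto3

config = boto3.client('config')

# Deploy the managed rule that flags S3 buckets without default encryption
config.put_config_rule(
    ConfigRule={
        'ConfigRuleName': 's3-bucket-server-side-encryption-enabled',
        'Scope': {'ComplianceResourceTypes': ['AWS::S3::Bucket']},
        'Source': {
            'Owner': 'AWS',
            'SourceIdentifier': 'S3_BUCKET_SERVER_SIDE_ENCRYPTION_ENABLED',
        },
    }
)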

Related

How to export dynamodb table using boto3?

I have written the AWS Lambda function below to export a DynamoDB table to an S3 bucket. But when I execute it, I get the error
'dynamodb.ServiceResource' object has no attribute 'export_table_to_point_in_time'
import boto3
import datetime

def lambda_handler(event, context):
    client = boto3.resource('dynamodb', endpoint_url="http://localhost:8000")
    response = client.export_table_to_point_in_time(
        TableArn='table arn string',
        ExportTime=datetime(2015, 1, 1),
        S3Bucket='my-bucket',
        S3BucketOwner='string',
        ExportFormat='DYNAMODB_JSON'
    )
    print("Response :", response)
Boto 3 version : 1.24.82
ExportTableToPointInTime is not available on DynamoDB Local, so if you are trying to run it locally (assumed from the localhost endpoint), you cannot.
Moreover, the Resource interface does not have that API; you must use the low-level Client.
import boto3
dynamodb = boto3.client('dynamodb')
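A rough sketch of the client-based call against the real AWS endpoint (the table ARN, account ID, and bucket are placeholders, and point-in-time recovery must be enabled on the table):

import boto3
from datetime import datetime, timezone

# Low-level client; the export API is not exposed on the Resource interface
dynamodb = boto3.client('dynamodb')

response = dynamodb.export_table_to_point_in_time(
    TableArn='arn:aws:dynamodb:us-east-1:123456789012:table/my-table',  # placeholder ARN
    ExportTime=datetime(2023, 1, 1, tzinfo=timezone.utc),
    S3Bucket='my-bucket',
    ExportFormat='DYNAMODB_JSON'
)
print("Response:", response)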

AWS S3 EventNotification to SNS Topic on a Bucket not Created with (CDK, Python)

I am trying to create a notification: whenever an object is created in a folder under an S3 bucket, a notification should be sent to an SNS topic. However, with the code below I get no error and no notification is created either. I suppose some kind of binding of the event notification to the bucket is missing? Can anyone help?
from aws_cdk import (
    aws_s3 as s3,
    aws_sns as sns,
    aws_s3_notifications as _s3_notifications,
    core as cdk
)
from aws_cdk import core

class CdkCodeStack(cdk.Stack):
    def __init__(self, scope: cdk.Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Create S3 Bucket
        account_id = core.Aws.ACCOUNT_ID
        my_bucket = s3.Bucket(self, id='my-bucket-id', bucket_name='my-bucket')

        # Create SNS Topics named 'landing' and 'export'
        sns_topic_landing = sns.Topic(self, id='sns_topic_landing_id', topic_name='sns_topic_landing')
        sns_topic_export = sns.Topic(self, id='sns_topic_export_id', topic_name='sns_topic_export')

        # Send notifications to sns_topic_landing when an object is added to S3 Bucket folder = my-bucket/landing
        sns_destination_topic = _s3_notifications.SnsDestination(sns_topic_landing)
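The binding does indeed appear to be missing: the SnsDestination is created but never attached to the bucket. A minimal sketch of the missing piece, placed inside __init__ and assuming the folder corresponds to a landing/ key prefix:

        # Attach the destination: objects created under landing/ publish to sns_topic_landing
        my_bucket.add_event_notification(
            s3.EventType.OBJECT_CREATED,
            sns_destination_topic,
            s3.NotificationKeyFilter(prefix="landing/")
        )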

How to change storage class of object in s3 bucket using boto3?

I am trying to change the storage class of an object in S3 from Standard to Standard-IA.
This is similar to this thread, but I would like to do it using boto3 and a Lambda trigger.
Thanks
You can use the copy_object method:
You can use the CopyObject action to change the storage class of an object that is already stored in Amazon S3 using the StorageClass parameter.
For example:
import boto3

s3 = boto3.client('s3')
bucket_name = '<your bucket-name>'
object_key = '<your-object-key>'

r = s3.copy_object(
    CopySource=f"{bucket_name}/{object_key}",
    Bucket=bucket_name,
    Key=object_key,
    StorageClass='STANDARD_IA')
print(r)
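Since the question mentions a Lambda trigger, a rough sketch of wrapping the same call in a handler driven by an S3 event notification might look like this (the bucket and key are taken from the event record):

import urllib.parse

import boto3

s3 = boto3.client('s3')

def lambda_handler(event, context):
    # Each record describes one object that fired the S3 event notification
    for record in event['Records']:
        bucket_name = record['s3']['bucket']['name']
        object_key = urllib.parse.unquote_plus(record['s3']['object']['key'])
        # Copy the object onto itself with a new storage class
        s3.copy_object(
            CopySource=f"{bucket_name}/{object_key}",
            Bucket=bucket_name,
            Key=object_key,
            StorageClass='STANDARD_IA'
        )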

Unable to Create S3 Bucket(in specific Region) using AWS Python Boto3

I am trying to create a bucket using the AWS Python SDK (boto3).
Here is my code:
import boto3
response = S3_CLIENT.create_bucket(
    Bucket='symbols3arg',
    CreateBucketConfiguration={'LocationConstraint': 'eu-west-1'}
)
print(response)
I am getting the below error:
botocore.exceptions.ClientError: An error occurred (IllegalLocationConstraintException) when calling the CreateBucket operation: The unspecified location constraint is incompatible for the region specific endpoint this request was sent to.
This happens when the region you configured during aws configure differs from the one you use in the request, and you do not specify a region when initiating the S3 client object.
Suppose my AWS config looks like
$ aws configure
AWS Access Key ID [None]: AKIAIOSFODEXAMPLE
AWS Secret Access Key [None]: wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
Default region name [None]: us-west-2
Default output format [None]: json
and my Python script for creating a bucket is
import logging
import boto3
from botocore.exceptions import ClientError

def create_bucket(bucket_name, region=None):
    # Create bucket
    try:
        if region is None:
            s3_client = boto3.client('s3')
            s3_client.create_bucket(Bucket=bucket_name)
        else:
            s3_client = boto3.client('s3')
            location = {'LocationConstraint': region}
            s3_client.create_bucket(Bucket=bucket_name,
                                    CreateBucketConfiguration=location)
    except ClientError as e:
        logging.error(e)
        return False
    return True

create_bucket("test-bucket-in-region", "us-west-1")
This will throw the below error
ERROR:root:An error occurred (IllegalLocationConstraintException) when calling the CreateBucket operation: The us-west-1 location constraint is incompatible for the region specific endpoint this request was sent to.
To solve this issue, all you need to do is specify the region when initiating the S3 client object. Here is a working example that creates a bucket in a different region regardless of the aws configure default:
import logging
import boto3
from botocore.exceptions import ClientError

def create_bucket(bucket_name, region=None):
    """Create an S3 bucket in a specified region

    If a region is not specified, the bucket is created in the S3 default
    region (us-east-1).

    :param bucket_name: Bucket to create
    :param region: String region to create bucket in, e.g., 'us-west-2'
    :return: True if bucket created, else False
    """
    # Create bucket
    try:
        if region is None:
            s3_client = boto3.client('s3')
            s3_client.create_bucket(Bucket=bucket_name)
        else:
            s3_client = boto3.client('s3', region_name=region)
            location = {'LocationConstraint': region}
            s3_client.create_bucket(Bucket=bucket_name,
                                    CreateBucketConfiguration=location)
    except ClientError as e:
        logging.error(e)
        return False
    return True

create_bucket("my-working-bucket", "us-west-1")
Send the command to S3 in the same region:
import boto3

s3_client = boto3.client('s3', region_name='eu-west-1')
response = s3_client.create_bucket(
    Bucket='symbols3arg',
    CreateBucketConfiguration={'LocationConstraint': 'eu-west-1'}
)
You can try the following code.
import boto3

client = boto3.client('s3', region_name="aws_region_code")
response = client.create_bucket(
    Bucket='string'
)
Hope it helps.

Download file from AWS S3 using Python

I am trying to download a file from an Amazon S3 bucket to my local machine using the code below, but I get an error saying "Unable to locate credentials".
Given below is the code I have written:
from boto3.session import Session
import boto3

ACCESS_KEY = 'ABC'
SECRET_KEY = 'XYZ'

session = Session(aws_access_key_id=ACCESS_KEY,
                  aws_secret_access_key=SECRET_KEY)
s3 = session.resource('s3')
your_bucket = s3.Bucket('bucket_name')

for s3_file in your_bucket.objects.all():
    print(s3_file.key)  # prints the contents of bucket

s3 = boto3.client ('s3')
s3.download_file('your_bucket', 'k.png', '/Users/username/Desktop/k.png')
Could anyone help me on this?
You are not using the session you created to download the file; you're using the s3 client you created. If you want to use the client, you need to specify credentials.
your_bucket.download_file('k.png', '/Users/username/Desktop/k.png')
or
s3 = boto3.client('s3', aws_access_key_id=... , aws_secret_access_key=...)
s3.download_file('your_bucket','k.png','/Users/username/Desktop/k.png')
From an example in the official documentation, the correct format is:
import boto3
s3 = boto3.client('s3', aws_access_key_id=... , aws_secret_access_key=...)
s3.download_file('BUCKET_NAME', 'OBJECT_NAME', 'FILE_NAME')
You can also use a file-like object opened in binary mode.
s3 = boto3.client('s3', aws_access_key_id=..., aws_secret_access_key=...)
with open('FILE_NAME', 'wb') as f:
    s3.download_fileobj('BUCKET_NAME', 'OBJECT_NAME', f)
    f.seek(0)
The code in question uses s3 = boto3.client ('s3'), which does not provide any credentials.
The format for authenticating a client is shown here:
import boto3

client = boto3.client(
    's3',
    aws_access_key_id=ACCESS_KEY,
    aws_secret_access_key=SECRET_KEY,
    aws_session_token=SESSION_TOKEN,
)

# Or via the Session
session = boto3.Session(
    aws_access_key_id=ACCESS_KEY,
    aws_secret_access_key=SECRET_KEY,
    aws_session_token=SESSION_TOKEN,
)
And lastly, you can also re-use the authenticated session you created to get the bucket, and then download the file from the bucket.
from boto3.session import Session
import boto3

ACCESS_KEY = 'ABC'
SECRET_KEY = 'XYZ'

session = Session(aws_access_key_id=ACCESS_KEY,
                  aws_secret_access_key=SECRET_KEY)

# session is authenticated and can access the resource in question
session.resource('s3') \
    .Bucket('bucket_name') \
    .download_file('k.png', '/Users/username/Desktop/k.png')
For others trying to download files from AWS S3 looking for a more user-friendly solution with other industrial-strength features, check out https://github.com/d6t/d6tpipe. It abstracts the S3 functions into a simpler interface. It also supports directory sync, uploading files, permissions and many other things you need to sync files from S3 (and ftp).
import d6tpipe

api = d6tpipe.api.APILocal()  # keep permissions locally for security
settings = \
    {
        'name': 'my-files',
        'protocol': 's3',
        'location': 'bucket-name',
        'readCredentials': {
            'aws_access_key_id': 'AAA',
            'aws_secret_access_key': 'BBB'
        }
    }

d6tpipe.api.create_pipe_with_remote(api, settings)
pipe = d6tpipe.Pipe(api, 'my-files')

pipe.scan_remote()  # show all files
pipe.pull_preview()  # preview
pipe.pull(['k.png'])  # download single file
pipe.pull()  # download all files
pipe.files()  # show files
file = open(pipe.dirpath/'k.png')  # access file
You can set up your AWS profile with the awscli to avoid putting your credentials in the file. First add your profile:
aws configure --profile account1
Then in your code add:
aws_session = boto3.Session(profile_name="account1")
s3_client = aws_session.client('s3')
Filename:
Can be any name; the file will be downloaded under that name.
It can be placed in any existing local directory.
Key:
Is the S3 file path along with the file name at the end.
It does not start with a slash.
Session():
Automatically picks up the credentials from ~/.aws/config or ~/.aws/credentials.
If not, you need to pass them explicitly.
from boto3.session import Session
import boto3

# Let's use Amazon S3
s3 = boto3.resource("s3")

# Print out bucket names to check you have accessibility
# for bucket in s3.buckets.all():
#     print(bucket.name)

session = Session()

OR

session = Session(aws_access_key_id="AKIAYJN2LNOU",
                  aws_secret_access_key="wMyT0SxEOsoeiHYVO3v9Gc",
                  region_name="eu-west-1")

session.resource('s3').Bucket('bucket-logs').download_file(
    Key="logs/20221122_0_5ee03da676ac566336e2279decfc77b3.gz",
    Filename="/tmp/Local_file_name.gz")
