Long polling AWS S3 to check if item exists? - python-3.x

The context here is simple: there's a Lambda (lambda1) that creates a file asynchronously and then uploads it to S3.
Then another Lambda (lambda2) receives the soon-to-exist file name and needs to keep checking S3 until the file exists.
I don't think S3 triggers will work here because lambda2 is invoked by a client request.
1) Do I get charged for this kind of request between Lambda and S3? I will be polling until the object exists.
2) What other way could I achieve this that doesn't incur charges?
3) What method do I use to check whether a file exists in S3? (Just try to get it and check the status code?)
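For reference on question 3, a minimal sketch of the existence check (assuming boto3; the bucket and key names are placeholders): issue a HEAD request for the object and treat a 404 as "not there yet". boto3 also ships an object_exists waiter that does the polling for you.

import boto3
import botocore

s3_client = boto3.client('s3')

def object_exists(bucket, key):
    # HEAD the object; a 404 ClientError means it does not exist yet.
    try:
        s3_client.head_object(Bucket=bucket, Key=key)
        return True
    except botocore.exceptions.ClientError as e:
        if e.response['Error']['Code'] == '404':
            return False
        raise

# Alternatively, let boto3 poll until the object appears:
# s3_client.get_waiter('object_exists').wait(Bucket='my-bucket', Key='my-key')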

This looks like a case where you should be using an S3 ObjectCreated trigger on the Lambda. That way, whenever an object is created, it will trigger your Lambda function automatically with the file's metadata.
See the AWS documentation for information on configuring an S3 event trigger.
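For illustration, a minimal sketch of such a handler (the function and variable names are arbitrary): an ObjectCreated trigger delivers the bucket name and object key inside the event payload.

def lambda_handler(event, context):
    # Each record describes one newly created object.
    for record in event['Records']:
        bucket = record['s3']['bucket']['name']
        key = record['s3']['object']['key']
        print(f'New object: s3://{bucket}/{key}')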

Let me make sure I understand correctly.
Client calls lambda1. lambda1 creates a file asynchronously and uploads it to S3;
the call to lambda1 returns as soon as lambda1 has started its async processing.
Client calls lambda2 to pull from S3 the file that lambda1 is going to push there.
Why not just wait for lambda1 to create the file and return it to the client? Otherwise this is going to be an expensive file exchange.

Related

AWS S3 Lambda function doesn't trigger when uploading a large file

I have two buckets in S3. I have a Lambda function "create-thumbnail" that is triggered when an object is created in the original bucket; if the object is an image, the function resizes it and uploads the result to the resized bucket.
Everything is working fine, but the function doesn't trigger when I upload files larger than 4 MB to the original bucket.
The function's configuration is as follows:
Timeout: 2 minutes
Memory: 10240 MB
Trigger event type: ObjectCreated (which covers create, put, post, copy, and multipart upload complete)
Instead of using the Lambda function, I used an image-processing package on the server to resize the files and then uploaded them to the S3 bucket, as sketched below.
I know this is not a solution to the question, but it's the only solution I found.
Thanks to everyone who took the time to investigate this.
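For illustration, the server-side workaround described above might look roughly like this (a hedged sketch assuming Pillow and boto3; the paths, bucket, and thumbnail size are placeholders):

from PIL import Image
import boto3

s3_client = boto3.client('s3')

def resize_and_upload(local_path, bucket, key, size=(128, 128)):
    # Shrink the image in place, then upload the thumbnail to the resized bucket.
    with Image.open(local_path) as img:
        img.thumbnail(size)
        img.save(local_path)
    s3_client.upload_file(local_path, bucket, key)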

Cloud Function storage trigger on folder of a particular Bucket

I have a scenario where a Cloud Function should execute when something changes in a particular folder of a bucket. When I deploy the function using the CLI and pass BUCKET/FOLDERNAME as the trigger, it gives me an "invalid arguments" error. Is there any way to set a trigger at the folder level?
You can only specify a bucket name. You cannot specify a folder within the bucket.
A key point to note is that the namespace for buckets is flat. Folders are emulated; they don't actually exist. All objects in a bucket have the bucket as their parent, not a directory.
What you can actually do is implement an if condition inside your function so that it only does work when the event refers to an object whose name starts with your folder prefix, as in the sketch below. Keep in mind that with this approach your function will still be triggered for every object uploaded to the bucket.
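For illustration, a minimal sketch of that prefix check (assuming a Python background Cloud Function triggered on object finalize; the folder name "reports/" is a placeholder):

def handle_upload(event, context):
    # event['name'] is the full object path within the bucket.
    name = event['name']
    if not name.startswith('reports/'):
        return  # ignore objects outside the emulated folder
    print(f"Processing {name} from bucket {event['bucket']}")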

S3 trigger event not working when file is uploaded using Node.js

I am working in Node.js. I want a trigger to execute when a user uploads files to S3, so I wrote a Node.js script that uploads a file to the S3 bucket. But the S3 event does not fire; however, whenever I upload a file to S3 manually, the trigger fires.
Please help.
Some things in your question are unclear, i.e. which method you are using in Node.js to upload the file and how the trigger event is configured in AWS Lambda, so here are a few suggestions:
If you are using s3.upload(), I would recommend trying s3.putObject() instead to upload the file to S3.
Check that the trigger configuration is correctly created in AWS Lambda; make sure the event type PUT is selected.
Check the IAM policy for the Lambda function. It should have the permission below:
s3:PutBucketNotification
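Worth knowing: s3.upload() in the JavaScript SDK may perform a multipart upload for larger payloads, which emits s3:ObjectCreated:CompleteMultipartUpload rather than Put, so a trigger filtered to PUT alone will miss those uploads; selecting s3:ObjectCreated:* covers both. As a hedged illustration (shown in Python with boto3 to match this thread's other examples; bucket and key are placeholders), a plain PutObject that fires the Put event:

import boto3

s3_client = boto3.client('s3')

# A single put_object call emits s3:ObjectCreated:Put; multipart uploads
# emit s3:ObjectCreated:CompleteMultipartUpload instead.
with open('report.pdf', 'rb') as body:
    s3_client.put_object(Bucket='my-bucket', Key='uploads/report.pdf', Body=body)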

How to upload a downloaded file to an S3 bucket using a Lambda function

I saw different questions/answers but could not find one that worked for me. Since I am really new to AWS, I need your help. I am trying to download a gzip file, load it into a JSON file, and then upload that to an S3 bucket using a Lambda function. I wrote the code to download the file and convert it to JSON, but I'm having a problem uploading it to the S3 bucket. Assume the file is ready as x.json. What should I do then?
I know it is a really basic question, but help is still needed :)
This code will upload to Amazon S3:
import boto3

# Create the client once; change the region as appropriate.
s3_client = boto3.client('s3', region_name='us-west-2')

# upload_file(local_path, bucket_name, object_key)
s3_client.upload_file('/tmp/foo.json', 'my-bucket', 'folder/foo.json')
Some tips:
In Lambda functions you can only write to /tmp/
There is a 512 MB limit on that storage
At the end of your function, delete the files (zip, json, etc.) because the container can be reused and you don't want to run out of disk space
If your Lambda has the proper permissions to write a file to S3, then simply use the boto3 package, which is the AWS SDK for Python.
https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/s3.html
Be aware that if the Lambda lives inside a VPC, it cannot access the public internet, and therefore cannot reach the public S3 API endpoints that boto3 calls. You may thus require a NAT gateway to route the Lambda's traffic out to the public internet.

How do I create events for S3 buckets using Node.js or from a Lambda function?

I wish to create an S3 event from a Node.js Lambda function to call another Lambda function.
For example: I have a Lambda function test1 that creates an S3 bucket and attaches an event to the bucket such that, when the event occurs, it calls some different Lambda function.
Problem: I can create the Lambda function that creates an S3 bucket, but I am not able to find a create-event method in the AWS S3 documentation to implement this.
It is called BucketNotificationConfiguration in S3.
Refer to:
http://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#putBucketNotificationConfiguration-property
http://docs.aws.amazon.com/AmazonS3/latest/API/RESTBucketPUTnotification.html
http://docs.aws.amazon.com/lambda/latest/dg/with-s3.html
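As a hedged illustration (written in Python with boto3 to match this thread's other examples; the JavaScript SDK's putBucketNotificationConfiguration takes the same shape, and the bucket name and function ARN below are placeholders), attaching a Lambda-invoking notification to a bucket looks roughly like this. Note that S3 must also be granted lambda:InvokeFunction permission on the target function before the notification will fire.

import boto3

s3_client = boto3.client('s3')

# Placeholder bucket and Lambda ARN; replace with your own.
s3_client.put_bucket_notification_configuration(
    Bucket='my-bucket',
    NotificationConfiguration={
        'LambdaFunctionConfigurations': [
            {
                'LambdaFunctionArn': 'arn:aws:lambda:us-east-1:123456789012:function:test2',
                'Events': ['s3:ObjectCreated:*'],
            }
        ]
    },
)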
