Error while invoking the AWS Lambda function - node.js

I am trying to integrate AWS S3 with Lambda, based on this AWS tutorial. When an image is added to S3, it triggers a Lambda function that fetches the image from S3, resizes it, and uploads the resized image back to S3.
After copying the function into the AWS Lambda Management Console, I get the message below. I am not sure how to handle it. I am using Node.js 8.10 as the runtime. The complete code can be found here. The file name is index.js, the Lambda handler is index.handler, and exports.handler is defined in the Lambda function.
Upon saving the Lambda function and triggering it by putting an image in S3, I get the message below in the CloudWatch Logs.
I am not familiar with Node.js and am stuck here. Any solution would be appreciated.
Update: Here is the folder structure (the tree).

The problem is that you have not deployed the Lambda function correctly. This code depends on the GraphicsMagick and Async libraries, and you have not uploaded either of them to Lambda, so your require() calls are failing. You should re-read the tutorial, but basically you need to:
npm init
npm install gm async --save
zip -r function.zip .
aws lambda create-function ... (per the tutorial)
Your deployed Lambda function should look like this (note the inclusion of a package.json file as well as node_modules subfolders for the dependent NPM packages):
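As a rough sketch (the exact tree depends on each package's own dependencies), the zipped contents would resemble:

function.zip
├── index.js
├── package.json
└── node_modules/
    ├── gm/
    ├── async/
    └── ...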

Related

How to share my own custom functions on AWS Lambda Node.js

I currently have a project in AWS with several Lambda functions, most of them in Node.js. I want to know if there is a way to create a Lambda layer with my own code functions that I use in different Lambdas, without publishing them to npm. I have already searched old questions on Stack Overflow (question-1, question-2), but these were not answered.
Thanks for the help!
Create a folder on your local machine called nodejs.
Put your "shared" logic in that folder, e.g. /nodejs/shared.js.
Zip this nodejs folder and upload it as a layer.
In your Lambda code, require shared.js as const shared = require('/opt/nodejs/shared.js') (see the sketch below).
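For example, a minimal sketch, assuming a hypothetical shared.js that exports a single helper:

// nodejs/shared.js (packaged into the layer zip)
exports.greet = function (name) {
  return 'Hello, ' + name + '!';
};

// index.js (the Lambda function that uses the layer)
// Layers are extracted under /opt, so the folder is available at /opt/nodejs
const shared = require('/opt/nodejs/shared.js');

exports.handler = async (event) => {
  return shared.greet(event.name || 'world');
};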
Links:
Lambda layers: https://docs.aws.amazon.com/lambda/latest/dg/configuration-layers.html
Detailed guide: https://explainexample.com/computers/aws/aws-lambda-layers-node-app
Using layers with SAM: https://docs.aws.amazon.com/serverlessrepo/latest/devguide/sharing-lambda-layers.html

How can I add a file to my AWS SAM Lambda function runtime?

While working with AWS, I need to load a WSDL file in order to set up a SOAP service. The problem I now encounter is that I don't know how I can add a file to the Docker container running my Lambda function, so that I can read the file inside my Lambda like in the code snippet below.
const { readdirSync } = require('fs');

// pathToWsdl is a relative folder such as './wsdl'
const files = readdirSync(__dirname + pathToWsdl);
files.forEach(file => {
  log.info(file);
});
Any suggestions on how I can do this are greatly appreciated!
Here are a few options:
If the files are static and small, you can bundle them in the Lambda package.
If the files are static or change infrequently then you can store them in S3 and pull from there on Lambda cold start.
If the files need to be accessed and modified by multiple Lambda functions concurrently or if you have a large volume of data with low-latency access requirements, then use EFS.
EFS is overkill for a small, static file. I would just package the file with the Lambda function.
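For the bundling approach, a minimal sketch, assuming the WSDL file is zipped alongside the handler in a hypothetical wsdl/ folder:

const { readFileSync } = require('fs');
const path = require('path');

// The file ships inside the deployment package, so it sits under __dirname.
// 'service.wsdl' is a placeholder file name.
const wsdl = readFileSync(path.join(__dirname, 'wsdl', 'service.wsdl'), 'utf8');

exports.handler = async () => {
  // Set up the SOAP client from the WSDL contents here.
  return { wsdlLength: wsdl.length };
};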

S3 trigger event not working when file is uploaded using Node.js

I am working in Node.js. I want the trigger to execute when a user uploads files to S3, so I created a script in Node.js which uploads a file to an S3 bucket. But the S3 event is not fired; however, whenever I upload a file to S3 manually, the trigger fires.
Please help
Some things in your question are unclear, i.e. which method you are using in Node.js to upload the file, and how the trigger is configured in AWS Lambda. With that in mind:
If you are using s3.upload(), try s3.putObject() to upload the file to S3. s3.upload() can perform a multipart upload, which fires a CompleteMultipartUpload event rather than a Put event, so a trigger configured only for the PUT event type may not fire (see the sketch below).
Check that the trigger configuration is correctly created in AWS Lambda, and make sure the event type PUT is selected.
Check the IAM policy for the Lambda function. It should have the permission below:
s3:PutBucketNotification
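A minimal sketch of such an upload using the AWS SDK for JavaScript v2 (bucket name, key, and file path are placeholders):

const AWS = require('aws-sdk');
const { readFileSync } = require('fs');

const s3 = new AWS.S3();

// putObject issues a single PUT request, which matches an
// s3:ObjectCreated:Put trigger on the bucket.
s3.putObject({
  Bucket: 'my-bucket',
  Key: 'uploads/photo.jpg',
  Body: readFileSync('./photo.jpg'),
}, (err, data) => {
  if (err) console.error(err);
  else console.log('Uploaded, ETag:', data.ETag);
});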

AWS Lambda function to connect to a PostgreSQL database

Does anyone know how I can connect to a PostgreSQL database through an AWS Lambda function? I searched online but couldn't find anything about it. If you could tell me how to go about it, that would be great.
If you can find something wrong with my code (Node.js), that would be great; otherwise, can you tell me how to go about it?
exports.handler = (event, context, callback) => {
    "use strict";
    const pg = require('pg');
    // Credentials are separated from the host by '@', not '#'
    const connectionStr = "postgres://username:password@host:port/db_name";

    // Return as soon as the callback fires, even if the pg client
    // still holds an open connection in the event loop
    context.callbackWaitsForEmptyEventLoop = false;

    const client = new pg.Client(connectionStr);
    client.connect(function(err) {
        if (err) {
            return callback(err); // return so success isn't also reported
        }
        callback(null, 'Connection established');
    });
};
The code throws an error:
cannot find module 'pg'
I wrote it directly on AWS Lambda and didn't upload anything if that makes a difference.
I wrote it directly on AWS Lambda and didn't upload anything if that makes a difference.
Yes, this makes the difference! Lambda doesn't provide 3rd-party libraries out of the box. As soon as you have a dependency on a 3rd-party library, you need to zip your Lambda code and upload it manually or via the API.
For more information: Lambda Execution Environment and Available Libraries
You need to refer to Creating a Deployment Package (Node.js):
Simple scenario – If your custom code requires only the AWS SDK library, then you can use the inline editor in the AWS Lambda console. Using the console, you can edit and upload your code to AWS Lambda. The console will zip up your code with the relevant configuration information into a deployment package that the Lambda service can run.
and
Advanced scenario – If you are writing code that uses other resources, such as a graphics library for image processing, or you want to use the AWS CLI instead of the console, you need to first create the Lambda function deployment package, and then use the console or the CLI to upload the package.
Your case, like mine, falls under the advanced scenario, so we need to create a deployment package and then upload it. Here is what I did:
mkdir deployment
cd deployment
vi index.js
Write your Lambda code in this file. Make sure the handler name is index.handler when you create the function.
npm install pg
You should see a node_modules directory created in the deployment directory, which has multiple modules in it.
Package the deployment directory into a zip file and upload it to Lambda.
You should be good then.
NOTE: npm install will install node modules in a node_modules directory under the same directory, unless it sees a node_modules directory in a parent directory. To be safe, first do npm init followed by npm install to ensure the modules are installed in the same directory for deployment.
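If you run npm init first, the deployment directory will also contain a package.json along these lines (name and version are illustrative):

{
  "name": "deployment",
  "version": "1.0.0",
  "dependencies": {
    "pg": "^8.0.0"
  }
}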

Lambda function failing, no logs generated

I'm playing with this PDF To Image converter and I've cloned the repo, run npm install, changed this section:
var s3EventHandler = new S3EventHandler({
    region: 'my-region',
    outputBucketName: 'my-bucket-name',
    s3: s3,
    resolution: 72
});
Renamed it exports.js, zipped up the JS file, the node_modules folder, package.json, and event.json (I've also tried with both of these JSON files removed) and uploaded it into my Lambda function. The S3 trigger has been created and so far is working fine.
I've had multiple test failures because it couldn't find either the async module or the tmp module; I've moved them to the top level and that seems to fix it (however, it doesn't complain about the other modules that it requires which aren't in the top level).
In the test it complains that s3 is not defined, which I'm somewhat lost with, as there aren't a lot of details with it. I thought it could be because I'm only running a test, so the S3 trigger itself is missing.
When I upload a PDF into the bucket, Lambda reports that it runs but fails. Going into CloudWatch Logs shows there is no log stream for it. I've checked the IAM role and it has permissions for CreateLogStream and PutLogEvents (it was the templated IAM policy).
How can I get my logs working to find the problem? Or what can I do to fix the s3 not defined issue, which is my only clue at the moment? It could be related to the top-level module requirement, however that doesn't seem consistent, as only some modules need to be at the top level.
Looks like the "CreateLogGroup" permission is missing from what you have mentioned. The following permissions are required for Lambda to write logs to CloudWatch:
"logs:CreateLogGroup", "logs:CreateLogStream", "logs:PutLogEvents"
