How to share my own custom functions on AWS Lambda nodejs - node.js

I currently have a project in AWS with several Lambda functions, most of them in Node.js. I want to know if there is a way to create a Lambda layer with my own code functions that I use across different Lambdas, without publishing them to npm. I already searched old questions on Stack Overflow (question-1, question-2), but those were not answered.
Thanks for your help!

Create a folder on your local machine called nodejs.
Put your "shared" logic in that folder, e.g. /nodejs/shared.js.
Zip this nodejs folder and upload it as a layer.
In your Lambda code, require shared.js with const shared = require('/opt/nodejs/shared.js')
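For instance, a minimal sketch (the getGreeting helper is just an illustration, not something from the question):

// nodejs/shared.js inside the layer zip -- available at /opt/nodejs/shared.js at runtime
module.exports.getGreeting = (name) => `Hello, ${name}!`; // hypothetical helper

// handler of any function that has the layer attached
const shared = require('/opt/nodejs/shared.js');

exports.handler = async (event) => {
    return shared.getGreeting(event.name || 'world');
};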
Links:
Lambda layers: https://docs.aws.amazon.com/lambda/latest/dg/configuration-layers.html
Detailed guide: https://explainexample.com/computers/aws/aws-lambda-layers-node-app
Using layers with SAM: https://docs.aws.amazon.com/serverlessrepo/latest/devguide/sharing-lambda-layers.html

Related

How can I add a file to my AWS SAM Lambda function runtime?

While working with AWS, I need to load a WSDL file in order to set up a SOAP service. The problem I now encounter, however, is that I don't know how I can add a file to the Docker container running my Lambda function so that I can just read the file inside my Lambda, like in the code snippet below.
const { readdirSync } = require('fs'); // pathToWsdl is defined elsewhere in the service
// List every file bundled under the WSDL directory in the deployment package
const files = readdirSync(__dirname + pathToWsdl);
files.forEach(file => {
    log.info(file); // log is the service's existing logger
});
Any suggestions on how I can do this are greatly appreciated!
Here are a few options:
If the files are static and small, you can bundle them in the Lambda package.
If the files are static or change infrequently, then you can store them in S3 and pull from there on Lambda cold start (see the sketch after this list).
If the files need to be accessed and modified by multiple Lambda functions concurrently or if you have a large volume of data with low-latency access requirements, then use EFS.
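For the S3 option, a minimal sketch of the cold-start pattern, assuming the bundled AWS SDK v2 and placeholder bucket/key names:

const AWS = require('aws-sdk');
const s3 = new AWS.S3();

// Started once per container (cold start) and reused on warm invocations
const wsdlPromise = s3
    .getObject({ Bucket: 'my-config-bucket', Key: 'service.wsdl' }) // placeholder names
    .promise()
    .then(res => res.Body.toString('utf-8'));

exports.handler = async () => {
    const wsdl = await wsdlPromise;
    // ...set up the SOAP client with the WSDL contents here...
    return { wsdlLength: wsdl.length };
};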
EFS is overkill for a small, static file. I would just package the file with the Lambda function.

SAM Lambda Layer module not found for shared nodejs code

I'm defining multiple Lambda functions in a single template.yaml. These functions share some common code that isn't published as a module. I assumed I could turn this common stuff into a versioned layer, with a directory structure to the following effect:
Project/
  LambdaFunc1/
    package.json
    node_modules/
    func1.js
  LambdaFunc2/
    package.json
    node_modules/
    func2.js
  common-stuff/
    package.json
    my-common.js
  template.yaml
  node_modules/
After testing, I copy common-stuff into the Project's node_modules directory, and my other Lambda functions resolve require('common-stuff') because Node moves up the directory structure for modules it can't find.
To have SAM do the build/package/deploy, I noticed SAM doesn't touch common-stuff; it just creates an .aws-sam/build structure with the two other Lambda functions. So I had to create a structure for SAM's CodeUri to zip up:
Package/common-stuff/packaged/nodejs/node_modules/common-stuff/ with my package.json and my-common.js.
My package.json uses name: "common-stuff", main: "my-common.js"
There are no other files; nothing else under nodejs, as I'm only packaging the modules. This appears to me to be the point of Layers. I have verified, by downloading the Layer zip file, that SAM packages a zip containing nodejs/node_modules/common-stuff/...
In the Lambda function template def, I add the permission to allow 'lambda:GetLayerVersion'. When I view the Lambda function in the console, I see this permission along with the others.
Interestingly, aws lambda get-layer-version-policy --layer-name MyLayer --version-number 8 --output text
returns an error that there are no policies attached. My guess is that's because I've added the permission directly to the function, as I see it on the Lambda function with the correct Allow/GetLayerVersion.
This would seem to satisfy what I've read; however, Node doesn't find the module. The CloudWatch logs just say it can't find the module, nothing about permissions or syntax. Also, these functions worked until I added the Layer approach.
'sam local start-api' doesn't work either, same error. When I look in the Windows 10 default layer cache directory C:\Users\me\AppData\Roaming\AWS SAM\ there is an empty layers-pkg directory.
Is there some other magic I'm missing? Is there a better approach for sharing common code across Node Lambda functions?
I can't tell if AWS can't get the Layer, or the zip structure is wrong, or the require('common-stuff') is different (hope not).
Scott
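For what it's worth, the Node.js runtimes extract an attached layer under /opt and, to my understanding, include /opt/nodejs/node_modules on NODE_PATH, which is what should make the bare require('common-stuff') resolve with the zip structure described above. A purely illustrative sketch that makes the lookup explicit (handy for debugging sam local):

let common;
try {
    // Resolved via /opt/nodejs/node_modules when the layer is attached
    common = require('common-stuff');
} catch (e) {
    // Fall back to the explicit layer path to see whether the files are there at all
    common = require('/opt/nodejs/node_modules/common-stuff');
}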

Error while invoking the AWS Lambda function

I am trying to integrate AWS S3 with Lambda, based on this AWS tutorial. When an image is added to S3, it will trigger a Lambda function which will get the image from S3, resize it, and upload it back to S3.
After copying the function into the AWS Lambda Management Console, I get the message below. I am not sure how to handle it. I am using Node.js 8.10 as the runtime. The complete code can be found here. The file name is index.js, the Lambda handler is index.handler, and exports.handler is defined in the Lambda function.
Upon saving the Lambda function and triggering it by putting an image in S3, I get the message below in the CloudWatch Logs.
I am not familiar with Node.js and am stuck here. Any solution would be appreciated.
Update: Here is the folder structure or the tree.
The problem is that you have not deployed the Lambda function correctly. This code has dependencies on the GraphicsMagick and Async libraries, and you have not uploaded either of them to Lambda, so your require() calls are failing. You should re-read the tutorial, but basically you need to:
npm init
npm install gm async --save
zip -r function.zip .
aws lambda create-function ... (per the tutorial)
Your deployed Lambda function should look like this (note the inclusion of a package.json file as well as node_modules subfolders for the dependent NPM packages):
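A rough sketch of such a package layout (exact module folders will vary):

function.zip
  index.js
  package.json
  node_modules/
    gm/
    async/
    ...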

Can I use Lambda to compress all images under a bucket?

Can I use Lambda to compress images under a bucket?
I can get the images under a particular bucket via listObjects. How do I compress what's returned and write it to another bucket?
Yes, you can absolutely use Lambda. Try this library: aws-lambda-image-compressor
AWS lambda function to compress and resize images
This is a Lambda function which resizes/reduces images automatically. When an image is put into an AWS S3 bucket, this function will resize/reduce it and save it into a new bucket. I have used it in the past and I loved it.
Usage
Edit the lambda-config.js file and assign the name, description, memory size, and timeout of your Lambda function.
Edit the .env file with your AWS access data.
npm install
gulp deploy
You can also try this other library, which is more popular: aws-lambda-image
If you really want to create something of your own and want a good start, I would recommend these two articles, which explain it very well:
Image conversion using Amazon Lambda and S3 in Node.js
Automating Image Compression Using S3 & Lambda
If you are fine with using Amazon API Gateway, then you can follow this AWS Compute Blog post:
Resize Images on the Fly with Amazon S3, AWS Lambda, and Amazon API Gateway
Hope this was useful.
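If you do roll your own, here is a minimal sketch of the list/compress/copy flow. The bucket names are placeholders, and sharp is just one common image library; whichever you choose has to be bundled in the deployment package or a layer:

const AWS = require('aws-sdk');
const sharp = require('sharp'); // must be bundled with the function or provided via a layer
const s3 = new AWS.S3();

exports.handler = async () => {
    // List the images in the source bucket (placeholder names; listObjectsV2 returns up to 1000 keys per call)
    const { Contents = [] } = await s3.listObjectsV2({ Bucket: 'source-bucket' }).promise();

    for (const { Key } of Contents) {
        const { Body } = await s3.getObject({ Bucket: 'source-bucket', Key }).promise();

        // Re-encode as JPEG at a lower quality to reduce size
        const compressed = await sharp(Body).jpeg({ quality: 70 }).toBuffer();

        await s3.putObject({ Bucket: 'destination-bucket', Key, Body: compressed }).promise();
    }
};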

AWS Lambda function to connect to a PostgreSQL database

Does anyone know how I can connect to a PostgreSQL database from an AWS Lambda function? I searched online but couldn't find anything about it. If you could tell me how to go about it, that would be great.
If you can find something wrong with my code (Node.js), that would be great; otherwise, can you tell me how to go about it?
"use strict";
const pg = require('pg');

exports.handler = (event, context, callback) => {
    // Return the response as soon as the callback fires, without waiting for the event loop to drain
    context.callbackWaitsForEmptyEventLoop = false;

    // Note: the separator between the credentials and the host is '@'
    const connectionStr = "postgres://username:password@host:port/db_name";
    const client = new pg.Client(connectionStr);

    client.connect(function(err) {
        if (err) {
            return callback(err);
        }
        client.end();
        callback(null, 'Connection established');
    });
};
The code throws an error:
cannot find module 'pg'
I wrote it directly on AWS Lambda and didn't upload anything if that makes a difference.
"I wrote it directly on AWS Lambda and didn't upload anything if that makes a difference."
Yes, this makes the difference! Lambda doesn't provide third-party libraries out of the box. As soon as you have a dependency on a third-party library, you need to zip and upload your Lambda code manually or with the use of the API.
For more information: Lambda Execution Environment and Available Libraries
You need to refer to Creating a Deployment Package (Node.js):
Simple scenario – If your custom code requires only the AWS SDK library, then you can use the inline editor in the AWS Lambda console. Using the console, you can edit and upload your code to AWS Lambda. The console will zip up your code with the relevant configuration information into a deployment package that the Lambda service can run.
and
Advanced scenario – If you are writing code that uses other resources, such as a graphics library for image processing, or you want to use the AWS CLI instead of the console, you need to first create the Lambda function deployment package, and then use the console or the CLI to upload the package.
Your case, like mine, falls under the advanced scenario, so we need to create a deployment package and then upload it. Here is what I did:
mkdir deployment
cd deployment
vi index.js
Write your Lambda code in this file. Make sure your handler name is index.handler when you create the function.
npm install pg
You should see a node_modules directory created in the deployment directory, with multiple modules in it.
Package the deployment directory into a zip file and upload it to Lambda.
You should be good then.
NOTE: npm install installs modules into a node_modules directory in the current directory unless it finds a node_modules directory in a parent directory. To be safe, first run npm init and then npm install, to ensure the modules are installed in the same directory you are deploying from.
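Once pg is bundled this way, a minimal sketch of a handler that reuses a connection pool across warm invocations (node-postgres reads the standard PGHOST, PGPORT, PGUSER, PGPASSWORD, and PGDATABASE environment variables by default, so set those on the function configuration):

const { Pool } = require('pg');

// Created outside the handler so warm invocations reuse the same connections
const pool = new Pool();

exports.handler = async () => {
    const { rows } = await pool.query('SELECT NOW() AS now');
    return rows[0];
};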
