How to create a nodejs AWS lambda function locally using vscode and not serverless or SAM CLI?

I've gone through hundreds of blogs/videos/resources, but none of them mention how to create a simple Lambda function for a Node.js REST API locally using VS Code, the AWS Toolkit extension and the AWS CLI. Is there any way I can create a simple Node.js endpoint on my local machine and run it using the above, and not Serverless or SAM? (There are some internal restrictions, hence I can't use them.)

What you need is to set up an API Gateway event trigger for your Lambda that fires whenever an HTTP request comes in. Here are the steps:
1. Look into the Serverless Framework, where you define a serverless.yaml file whose configuration describes how your Lambda gets invoked (in this case, an HTTP event).
2. In your IDE of choice, use the serverless-offline npm package.
3. Your IDE config will look something like this (this example uses IntelliJ IDEA):
(screenshot: IDE run configuration to start the Lambda locally)
4. Once you start the service locally, you should be able to hit the REST endpoint using any REST client, such as Postman.
5. Instead of (4) above, you could also invoke your Lambda function locally directly with the AWS CLI:
aws lambda invoke /dev/null \
  --endpoint-url http://localhost:3002 \
  --function-name <Your lambda function name> \
  --payload '{<Your payload>}'
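For reference, the handler behind that HTTP event can stay very small. Below is a minimal sketch; the export name (hello) and the response payload are placeholders and must match whatever handler your serverless.yaml / IDE run configuration points at.

// handler.js - minimal sketch of a Lambda handler behind an HTTP event
// (export name and response payload are placeholders).
module.exports.hello = async (event) => {
  return {
    statusCode: 200,
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ message: 'Hello from local Lambda', path: event.path }),
  };
};

Once serverless-offline (or the IDE run configuration) is up, a GET against the local endpoint (serverless-offline serves HTTP on port 3000 by default) should return that JSON.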

Related

How to create a Google API key using NodeJS or Rest?

I was able to do this using the gcloud CLI:
gcloud --project=some-project alpha services api-keys create
But I could not find any way to do this using googleapis, nor was I able to find any leads in their Node repository, google-api-nodejs-client.
For context, I will be running these functions in AWS Lambda.
I think (!?) that this API is not yet exposed through APIs Explorer:
E.g. The following 404s (NOT_FOUND)
API=apikeys
VER=v2alpha1
curl https://www.googleapis.com/discovery/v1/apis/${API}/${VER}/rest
Unfortunately, until it is, (there's no discovery document and) the API Client library is unable to auto-generate the SDK for it.
It's unclear to me whether this is policy or an oversight.
I recommend you pester the Cloud SDK team on Google's Issue Tracker (for Cloud SDK)
Note:
If you append --log-http to (any) gcloud command, it will display the underlying REST calls for the command. Absent a Google-provided SDK for these methods, you could introspect the API and code the REST calls directly:
gcloud alpha services api-keys create ... \
--project=${PROJECT} \
--log-http
Yields:
==== request start ====
uri: https://apikeys.googleapis.com/v2alpha1/projects/${PROJECT}/keys?alt=json
method: POST
== headers start ==
b'accept': b'application/json'
b'authorization': b'Bearer ya29...'
== headers end ==
== body start ==
== body end ==
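If you do decide to code the REST call yourself, a rough Node.js sketch of replaying that request might look like the following. It is not an official SDK: the v2alpha1 URL and the empty request body are lifted from the --log-http output above and may change while the API is in alpha, and google-auth-library is only one of several ways to obtain the Bearer token.

// Sketch: call the apikeys v2alpha1 REST endpoint directly (no official SDK).
// The URL and empty body mirror the gcloud --log-http output above; both are
// assumptions that may change while the API is in alpha.
const { GoogleAuth } = require('google-auth-library');

async function createApiKey(projectId) {
  const auth = new GoogleAuth({
    scopes: ['https://www.googleapis.com/auth/cloud-platform'],
  });
  const client = await auth.getClient();
  const res = await client.request({
    url: `https://apikeys.googleapis.com/v2alpha1/projects/${projectId}/keys`,
    method: 'POST',
    data: {}, // assumption: an empty body creates a key with default settings
  });
  return res.data; // the raw API response
}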

How to trigger a particular version of lambda from s3 events

I am using Lambda as an ETL tool to process raw files coming into the S3 bucket.
As time passes, the functionality of the Lambda function will grow.
Each month I will change the Lambda function, so I want to publish versions 1, 2, 3, and so on.
How do I make the S3 bucket trigger a particular version of the Lambda for the files?
How do I test production vs. test functionality in this case?
From AWS Lambda function aliases - Documentation:
When you use a resource-based policy to give a service, resource, or account access to your function, the scope of that permission depends on whether you applied it to an alias, to a version, or to the function. If you use an alias name (such as helloworld:PROD), the permission is valid only for invoking the helloworld function using the alias ARN. You get a permission error if you use a version ARN or the function ARN. This includes the version ARN that the alias points to.
For example, the following AWS CLI command grants Amazon S3 permissions to invoke the PROD alias of the helloworld Lambda function. Note that the --qualifier parameter specifies the alias name.
$ aws lambda add-permission --function-name helloworld \
--qualifier PROD --statement-id 1 --principal s3.amazonaws.com --action lambda:InvokeFunction \
--source-arn arn:aws:s3:::examplebucket --source-account 123456789012
In this case, Amazon S3 is now able to invoke the PROD alias. Lambda can then execute the helloworld Lambda function version that the PROD alias references. For this to work correctly, you must use the PROD alias ARN in the S3 bucket's notification configuration.
How do I make the S3 bucket trigger a particular version of the Lambda for the files?
Best practice is not to point at Lambda versions but to use a Lambda alias, which you point at whichever version you configure. You can just append the alias name to the function ARN:
arn:aws:lambda:<region>:<account-id>:function:<functionName>:<aliasName>
How do I test production vs. test functionality in this case?
You can configure the same event for different Lambda aliases (like a production one and a testing one).
(screenshot: example of multiple event notifications)
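If you want to script that wiring instead of clicking through the console, here is a sketch using the Node.js SDK. The bucket, region, account id and function/alias names below reuse the placeholders from the documentation example above, and note that putBucketNotificationConfiguration replaces the bucket's entire notification configuration.

// Sketch: point the bucket's ObjectCreated notifications at the PROD alias.
// All names and ids below are placeholders taken from the example above.
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

s3.putBucketNotificationConfiguration({
  Bucket: 'examplebucket',
  NotificationConfiguration: {
    LambdaFunctionConfigurations: [
      {
        // Use the alias ARN, not the bare function ARN or a version ARN.
        LambdaFunctionArn: 'arn:aws:lambda:us-east-1:123456789012:function:helloworld:PROD',
        Events: ['s3:ObjectCreated:*'],
      },
    ],
  },
}).promise()
  .then(() => console.log('S3 notifications now target the PROD alias'))
  .catch(console.error);

Remember that the add-permission call from the documentation snippet (with --qualifier PROD) still has to be run first, otherwise S3 is not allowed to invoke the alias.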

How to deploy AWS Lambda Function Correctly

Using Serverless, I am trying to deploy a Lambda function through AppSync. I see that my function gets deployed in the AWS Console; however, it shows that the function is over 50 MB, so the console can't display the inline editor. How do I properly deploy my Lambda function?
Here's what the console looks like: (screenshot)
Like @LLL said, the Lambda function has already been deployed. One way you can confirm that is by going to your CloudFormation stack and checking the status of the Lambda function resource. If it deployed successfully, the status should be CREATE_COMPLETE or UPDATE_COMPLETE.
If you would like to view the deployed function, you can click the Actions dropdown on the page shown in the image and click Export function. This will download all the files deployed with your Lambda function.
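Going back to the CloudFormation check: if you prefer to verify it from a script rather than the console, a small sketch with the Node.js SDK can query the stack resource directly. The stack name and logical resource id below are placeholders; the Serverless Framework usually names the stack <service>-<stage> and the function resource <FunctionName>LambdaFunction.

// Sketch: check the Lambda's CloudFormation status programmatically.
// StackName and LogicalResourceId are placeholders for your own stack.
const AWS = require('aws-sdk');
const cfn = new AWS.CloudFormation();

cfn.describeStackResource({
  StackName: 'my-service-dev',
  LogicalResourceId: 'MyFunctionLambdaFunction',
}).promise()
  .then(({ StackResourceDetail }) =>
    console.log(StackResourceDetail.ResourceStatus)) // e.g. CREATE_COMPLETE
  .catch(console.error);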

Why am I unable to set Amazon S3 as a trigger for my Serverless Lambda Function?

I am attempting to set up a Node.js Lambda function that is triggered when an image is uploaded to an Amazon S3 bucket. I have seen multiple tutorials and have the yml file set up as shown. Below is the YML config file:
functions:
  image-read:
    handler: handler.imageRead
    events:
      - s3:
          bucket: <bucket-name-here>
          event: s3:ObjectCreated:*
Is there something I am missing for the configuration? Is there something I need to do in an IAM role to set this up properly?
The YAML that you have here looks good but there may be some other problems.
Just to get you started:
Are you deploying the function using the right credentials? (I've often seen people deploy to a different account than they think; verify in the web console that the function is there.)
Can you invoke the function some other way (from the serverless command line, using an HTTP trigger, etc.)?
Do you see anything in the logs of that function? (Add console.log statements to see if anything is being run.)
Do you see the trigger installed in the web console?
Can you add the trigger manually in the web console?
Try adding a simple function that only prints some logs when it runs, and add a trigger for that function manually. If it works, then try to do the same with the serverless command line, but again start with a simple function with just one log statement, and if that works, go from there.
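Something along these lines is enough for that test; the export name (imageRead) is just a placeholder and has to match the handler in your config.

// Sketch: a bare-bones handler that only logs the incoming S3 event,
// so CloudWatch Logs will show whether the trigger actually fired.
module.exports.imageRead = async (event) => {
  console.log('received S3 event:', JSON.stringify(event, null, 2));
};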
See also this post for more hints - S3 trigger is not registered after deployment:
https://forum.serverless.com/t/s3-trigger-is-not-registered-after-deployment/1858

AWS Lambda function to connect to a Postgresql database

Does anyone know how I can connect to a PostgreSQL database from an AWS Lambda function? I searched online but couldn't find anything about it. If you could tell me how to go about it, that would be great.
If you can find something wrong with my code (Node.js), that would be great; otherwise, can you tell me how to go about it?
exports.handler = (event, context, callback) => {
  "use strict";
  const pg = require('pg');
  // Note the '@' separating the credentials from the host.
  const connectionStr = "postgres://username:password@host:port/db_name";
  var client = new pg.Client(connectionStr);
  client.connect(function (err) {
    if (err) {
      // Return here so the callback isn't called a second time below.
      return callback(err);
    }
    callback(null, 'Connection established');
  });
  // Don't keep the Lambda alive waiting for the open pg connection.
  context.callbackWaitsForEmptyEventLoop = false;
};
The code throws an error:
cannot find module 'pg'
I wrote it directly on AWS Lambda and didn't upload anything if that makes a difference.
I wrote it directly on AWS Lambda and didn't upload anything if that makes a difference.
Yes, this makes the difference! Lambda doesn't provide third-party libraries out of the box. As soon as you have a dependency on a third-party library, you need to zip and upload your Lambda code manually or through the API.
For more information: Lambda Execution Environment and Available Libraries
You need to refer to Creating a Deployment Package (Node.js):
Simple scenario – If your custom code requires only the AWS SDK library, then you can use the inline editor in the AWS Lambda console. Using the console, you can edit and upload your code to AWS Lambda. The console will zip up your code with the relevant configuration information into a deployment package that the Lambda service can run.
and
Advanced scenario – If you are writing code that uses other resources, such as a graphics library for image processing, or you want to use the AWS CLI instead of the console, you need to first create the Lambda function deployment package, and then use the console or the CLI to upload the package.
Your case, like mine, falls under the advanced scenario, so we need to create a deployment package and then upload it. Here's what I did:
mkdir deployment
cd deployment
vi index.js
Write your Lambda code in this file. Make sure your handler name is index.handler when you create the function.
npm install pg
You should see a node_modules directory created in the deployment directory, with multiple modules in it.
Package the deployment directory into a zip file and upload it to Lambda.
You should be good to go then.
NOTE: npm install installs modules in a node_modules directory under the current directory, unless it sees a node_modules directory in a parent directory. To be safe, first do npm init followed by npm install to ensure the modules are installed in the same directory for deployment.
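If you would rather script the upload than click through the console, a sketch with the Node.js SDK can push the zip to an existing function. The function name and zip path below are placeholders; the zip must contain index.js and node_modules at its root.

// Sketch: upload the zipped deployment directory to an existing function.
// FunctionName and the zip path are placeholders.
const fs = require('fs');
const AWS = require('aws-sdk');
const lambda = new AWS.Lambda();

lambda.updateFunctionCode({
  FunctionName: 'my-pg-function',
  ZipFile: fs.readFileSync('deployment.zip'),
}).promise()
  .then((cfg) => console.log('Uploaded:', cfg.FunctionArn))
  .catch(console.error);

The same thing with the CLI is aws lambda update-function-code --function-name <name> --zip-file fileb://deployment.zip.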
