AWS Lambda - Cognito sign up/log in on a node.js lambda function

I am trying to run AWS Cognito Identity on AWS Lambda, to handle user sign-up in a function rather than putting all that logic in the view.
Firstly, is this at all possible?
Here's what I've done:
1. Wrote a Lambda function, using some of the example code AWS published in their docs.
2. Installed the 'amazon-cognito-identity-js' node package.
3. Zipped it all up and published it to Lambda.
Here are the first few lines of my function:
const AWSCognito = require('amazon-cognito-identity-js');
const userPoolId = '<region>-blah';
const clientId = 'blah';
AWSCognito.config.region = '<region>';
exports.handler = function(event, context, callback) {
I am getting the following error though:
{
"errorMessage": "Cannot find module '/var/task/index'",
"errorType": "Error",
"stackTrace": [
"Function.Module._load (module.js:417:25)",
"Module.require (module.js:497:17)",
"require (internal/module.js:20:19)"
]
}
I've looked around online, and everything I've found says that it can be because I'm zipping it up wrong. I've checked, and the only things in the .zip file are the node_modules folder and my userSignUp.js file.
Can anyone spot something I'm missing here, or is it simply not possible?

Is the .js file with your code called "index.js" or something else?
If it is not called index.js, you will get that error even if you zipped it up correctly.
Check your function config for the "Handler" parameter. By default it should be "index.handler". Say your file is called xyz.js, then you should change the handler to be "xyz.handler".
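For example, since your file is userSignUp.js, the file and the Handler setting have to line up like this (a minimal sketch; the function body is just a placeholder):
// userSignUp.js - deploy with the function's Handler set to "userSignUp.handler"
exports.handler = function(event, context, callback) {
    // sign-up logic would go here
    callback(null, 'ok');
};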

Related

AWS Lambda returns Unable to import module 'main': No module named 'main' when modules are there

So I'm trying to set up a function in AWS Lambda to run some python code I imported from a zip.
I've edited the handler to point to the file and then the function I want to run.
I've tried having the file in the directory created when I imported the zip, after which I moved it to the main function directory. Neither worked.
Not too sure what is wrong here. The full error returned when I run Test is:
Response
{
"errorMessage": "Unable to import module 'main': No module named 'main'",
"errorType": "Runtime.ImportModuleError",
"stackTrace": []
}
Edit: really new to Lambda so please excuse any silly mistakes
The problem is that, while you appear to have a module named main, it has not been deployed to the Lambda service yet. When you click Test, Lambda runs the deployed code. Perhaps your module was renamed to main some time after your initial deployment?
Local changes to code need to be saved and then deployed. The deploy step is important because until you deploy the code, the Lambda service will continue to run the previous code.
This has actually been a common problem historically in the Lambda console, but enhancements have been made to make it more obvious that a deployment is needed. For example the console now indicates "Changes not deployed" after you make a change, until you hit the Deploy button.
I found this question while facing the problem myself. The issue was that the zip put "main.py" in a subfolder instead of at the root of the deployment package, which is where Lambda looks for the module.
Hope this is helpful for any others!

GCP Cloud Functions not looking for function.js

According to GCP doc
Cloud Functions will look for files with specific names for deployable functions. For Node.js, these filenames are index.js or function.js.
Source: https://cloud.google.com/sdk/gcloud/reference/functions/deploy#--source
In my function.js file, I have:
exports.myFunction = async (req, res) => {}
And I am deploying with this command:
gcloud functions deploy myFunction --entry-point=myFunction \
--region=us-central1 --project=my-gcp-project
This causes this error
Function 'myFunction' is not defined in the provided module.
Did you specify the correct target function to execute?
Could not load the function, shutting down.
Error: function terminated. Recommended action: inspect logs for termination reason.
Curiously enough, the deployment works if I rename function.js to index.js.
Does anyone know what I might be missing here?
Following the recommended structure, you need to import the functions from your other modules and re-export them in the index.js file so that the runtime can find them and bind them to the appropriate deployed functions. Without this, your functions are treated as just additional code used by other methods, as Firebase has no way to tell the difference.
I suggest checking out the following documentation:
https://firebase.google.com/docs/functions/organize-functions#write_functions_in_multiple_files
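A minimal way to follow that structure (a sketch, assuming your handler stays in function.js as above) is to re-export it from an index.js at the root of your source:
// index.js - re-export the handler so the Functions runtime can find it
exports.myFunction = require('./function').myFunction;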

Problems using airtable.js module in AWS lambda

I'm trying to access Airtable from an AWS lambda function.
First off, to test, I installed airtable.js (npm install -s airtable) in a local project, wrote a short test script, and executed with node test.js. All works fine.
Using exactly the same core test code, inside an appropriate node.js function wrapper, I've tried running the same test in an AWS lambda function, and I get an error in my CloudWatch logs:
Error: Cannot find module '/var/task/node_modules/abort-controller/dist/abort-controller'. Please verify that the package.json has a valid \"main\" entry
I've tried both zipping the npm packages up with the function code in a deployment package, and creating a Lambda layer from the airtable package. Both produce the same error. Note that the package itself is picked up: with the layer approach, if I remove the layer then it can't find airtable at all. So this seems to be something specific to how the airtable package tries to access abort-controller.
For what it's worth, here's the [redacted] test code that I'm using in my lambda function: (the returns etc are because it's operating behind an API gateway call - but that's not part of the issue because the same error occurs regardless of whether testing inside the lambda console or calling through the API)
var AWS = require("aws-sdk");
AWS.config.update({ region: "eu-west-1" });
const Airtable = require('airtable');
const base = new Airtable({ apiKey: "xxxxx" }).base('yyyyy');

exports.handler = async (event) => {
    try {
        console.info('trying create');
        let records = await base('Base1').create([{
            "fields": {
                "Status": "Scored",
                "C1": "testing",
                "C2": "airtable",
                "C3": "api",
                "C4": "from",
                "C5": "node",
            }
        }]);
        console.info('completed create');
        let records_list = records.map(r => r.getId()).join(',');
        console.info(records_list);
        return ({ statusCode: 200, body: JSON.stringify(records_list) });
    } catch (e) {
        console.error(e);
        return ({ statusCode: 401, body: JSON.stringify(e) });
    }
};
I've built many lambdas previously, both with layers and with embedded packages, and so I've made most of the common mistakes - and I don't think I'm repeating them here. Is there anything special about airtable.js which means there's a different approach needed here?
It turns out the problem was in the zip of the deployment package: whether used as a layer or baked into the Lambda, the zip file seems to have been missing something. I was building it as part of my Terraform configuration and deployment, and what's perplexing is that it's exactly the same structure and setup I've used successfully for over 20 functions and 5 layers in a different project, yet here it fails.
So the solution, for the moment at least, is to manually zip the layer package, upload it to S3, and then have Terraform pick it up from there.

Error while invoking the AWS Lambda function

I am trying to integrate AWS S3 with Lambda, based on this AWS tutorial. When an image is added to S3, it will trigger a Lambda function which will get the image from S3, resize it, and upload it back to S3.
After copying the function into the AWS Lambda Management Console, I get an error message that I am not sure how to handle. I am using Node.js 8.10 as the runtime. The complete code can be found here. The file name is index.js, the Lambda handler is index.handler and exports.handler is defined in the Lambda function.
Upon saving the Lambda function and triggering it by putting an image in S3, I get the error message in the CloudWatch Logs.
I am not familiar with Node.js and am stuck here. Any solution would be appreciated.
Update: Here is the folder structure or the tree.
The problem is that you have not deployed the Lambda function correctly. This code has dependencies on the GraphicsMagick and Async libraries, and you have not uploaded either of them to Lambda so your require() calls are failing. You should re-read the Tutorial, but basically you need to:
npm init
npm install gm async --save
zip -r function.zip .
aws lambda create-function ... (per the tutorial)
Your deployed Lambda function should include a package.json file as well as node_modules subfolders for the dependent NPM packages.
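For reference, the top of the tutorial's index.js has require() calls roughly like the following (a sketch, not the exact tutorial code); these are the calls that fail when node_modules is missing from the zip:
// index.js (start of the resize function)
var async = require('async');                            // must be present in node_modules
var gm = require('gm').subClass({ imageMagick: true });  // must be present in node_modules
var AWS = require('aws-sdk');                            // provided by the Lambda Node.js runtime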

AWS Lambda function to connect to a Postgresql database

Does anyone know how I can connect to a PostgreSQL database from an AWS Lambda function? I searched online but I couldn't find anything about it. If you could tell me how to go about it, that would be great.
If you can find something wrong with my code (node.js), that would be great; otherwise, can you tell me how to go about it?
exports.handler = (event, context, callback) => {
    "use strict"
    const pg = require('pg');
    const connectionStr =
        "postgres://username:password#host:port/db_name";
    var client = new pg.Client(connectionStr);
    client.connect(function(err) {
        if (err) {
            callback(err)
        }
        callback(null, 'Connection established');
    });
    context.callbackWaitsForEmptyEventLoop = false;
};
The code throws an error:
cannot find module 'pg'
I wrote it directly on AWS Lambda and didn't upload anything if that makes a difference.
I wrote it directly on AWS Lambda and didn't upload anything if that makes a difference.
Yes, this makes the difference! Lambda doesn't provide 3rd-party libraries out of the box. As soon as you have a dependency on a 3rd-party library, you need to zip and upload your Lambda code manually or with the use of the API.
For more information: Lambda Execution Environment and Available Libraries
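For illustration, once you have built a function.zip you could also push it with the AWS SDK for JavaScript rather than through the console (a sketch only; the function name and region are placeholders):
// sketch: upload a prepared zip with the AWS SDK (v2)
const fs = require('fs');
const AWS = require('aws-sdk');
const lambda = new AWS.Lambda({ region: 'eu-west-1' });

lambda.updateFunctionCode({
    FunctionName: 'myPgFunction',             // hypothetical function name
    ZipFile: fs.readFileSync('function.zip')  // zip containing index.js and node_modules
}, (err, data) => {
    if (err) console.error(err);
    else console.log('Uploaded version', data.Version);
});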
You need to refer to Creating a Deployment Package (Node.js):
Simple scenario – If your custom code requires only the AWS SDK library, then you can use the inline editor in the AWS Lambda console. Using the console, you can edit and upload your code to AWS Lambda. The console will zip up your code with the relevant configuration information into a deployment package that the Lambda service can run.
and
Advanced scenario – If you are writing code that uses other resources, such as a graphics library for image processing, or you want to use the AWS CLI instead of the console, you need to first create the Lambda function deployment package, and then use the console or the CLI to upload the package.
Your case, like mine, falls under the advanced scenario, so we need to create a deployment package and then upload it. Here's what I did:
mkdir deployment
cd deployment
vi index.js
Write your Lambda code in this file. Make sure the handler name is index.handler when you create the function.
npm install pg
You should see a node_modules directory created in the deployment directory, containing the pg module and its dependencies.
Package the deployment directory into a zip file and upload to Lambda.
You should be good then
NOTE: npm install installs node modules into a node_modules directory in the current directory, unless it sees a node_modules directory in a parent directory. To be safe, first do npm init followed by npm install to ensure the modules are installed in the same directory as your code for deployment.
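Putting the steps together, a minimal index.js along the lines of the code in the question could look like this (a sketch only: the connection string uses the conventional user:password@host form with placeholder values, and the early return avoids invoking the callback twice):
// index.js - handler name matches the "index.handler" setting above
exports.handler = (event, context, callback) => {
    context.callbackWaitsForEmptyEventLoop = false;
    const pg = require('pg');
    // placeholder connection string: postgres://user:password@host:port/database
    const connectionStr = "postgres://username:password@host:5432/db_name";
    const client = new pg.Client(connectionStr);
    client.connect(function(err) {
        if (err) {
            return callback(err); // stop here so the success callback below is not also called
        }
        client.end();
        callback(null, 'Connection established');
    });
};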
