Problems using airtable.js module in AWS lambda - node.js

I'm trying to access Airtable from an AWS lambda function.
First off, to test, I installed airtable.js (npm install -s airtable) in a local project, wrote a short test script, and executed with node test.js. All works fine.
Using exactly the same core test code, inside an appropriate node.js function wrapper, I've tried running the same test in an AWS lambda function, and I get an error in my CloudWatch logs:
Error: Cannot find module '/var/task/node_modules/abort-controller/dist/abort-controller'. Please verify that the package.json has a valid "main" entry
I've tried both zipping the npm packages up with the function code in a deployment package, and also creating a Lambda layer from the airtable package. Both produce the same error. Note that the package is picked up - if I use the layer approach but remove the layer itself, then it can't find airtable at all. So this seems to be something specific about how the airtable package is trying to access abort-controller.
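A quick way to confirm where (or whether) the runtime resolves these modules is to log require.resolve from inside the handler; a minimal diagnostic sketch, not part of the original test code:

// Diagnostic sketch: logs where Node resolves each package inside Lambda
// (require.resolve throws if the module can't be found).
exports.handler = async () => {
    for (const name of ['airtable', 'abort-controller']) {
        try {
            console.info(`${name} resolved to ${require.resolve(name)}`);
        } catch (err) {
            console.error(`${name} could not be resolved: ${err.message}`);
        }
    }
    return { statusCode: 200, body: 'done' };
};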
For what it's worth, here's the [redacted] test code that I'm using in my lambda function: (the returns etc are because it's operating behind an API gateway call - but that's not part of the issue because the same error occurs regardless of whether testing inside the lambda console or calling through the API)
var AWS = require("aws-sdk");
AWS.config.update({ region: "eu-west-1" });

const Airtable = require('airtable');
const base = new Airtable({ apiKey: "xxxxx" }).base('yyyyy');

exports.handler = async (event) => {
    try {
        console.info('trying create');
        let records = await base('Base1').create([{
            "fields": {
                "Status": "Scored",
                "C1": "testing",
                "C2": "airtable",
                "C3": "api",
                "C4": "from",
                "C5": "node",
            }
        }]);
        console.info('completed create');
        let records_list = records.map(r => r.getId()).join(',');
        console.info(records_list);
        return ({ statusCode: 200, body: JSON.stringify(records_list) });
    } catch (e) {
        console.error(e);
        return ({ statusCode: 401, body: JSON.stringify(e) });
    }
};
I've built many lambdas previously, both with layers and with embedded packages, and so I've made most of the common mistakes - and I don't think I'm repeating them here. Is there anything special about airtable.js which means there's a different approach needed here?

Turns out the problem was in the zip of the deployment package - whether in a layer or baked into the Lambda, the zip file seems to have been missing something. I was creating the zip as part of my Terraform configuration/deployment, and what's perplexing is that it seems to be exactly the same structure and setup I've used successfully for over 20 functions and 5 layers in a different project, but here it's failing.
So the solution, for the moment at least, is to manually zip the layer package, upload it to S3, and then have Terraform pick it up from there.

Related

How to get past errors using putParameter with aws-sdk for nodejs in Lambda?

I'm trying to set a parameter using putParameter in the AWS SDK for JavaScript in Node.js. In particular, I'd like to take advantage of the "Advanced" Tier, with an Expiration policy and Tags if possible. When I execute my code, I keep getting errors like:
There were 2 validation errors:
* UnexpectedParameter: Unexpected key 'Policies' found in params
* UnexpectedParameter: Unexpected key 'Tier' found in params
I suspected the issue was around the aws-sdk version I was using, so I've tried running the code locally using SAM local, and from Lambda functions using the nodejs8.10 and nodejs10.x environments. The errors do not go away.
const AWS = require('aws-sdk');
AWS.config.update({ region: 'us-east-1' });
const ssm = new AWS.SSM({ apiVersion: '2014-11-06' });

exports.lambdaHandler = async () => {
    const tokenExpiration = new Date();
    tokenExpiration.setSeconds(tokenExpiration.getSeconds() + 60);

    await ssm.putParameter({
        Name: 'SECRET_TOKEN',
        Type: 'SecureString',
        Value: '12345',
        Policies: JSON.stringify([
            {
                "Type": "Expiration",
                "Version": "1.0",
                "Attributes": {
                    "Timestamp": tokenExpiration.toISOString()
                }
            }
        ]),
        Overwrite: true,
        Tier: 'Advanced'
    }).promise();
};
I would expect this code to work and set a parameter with the expiration. However, it appears that the SDK doesn't recognize the "Policies" and "Tier" parameters, which are available according to the documentation. I don't know if it's an issue of waiting for the newest AWS SDK for JavaScript, but the runtimes page suggests that nodejs10.x is running AWS SDK for JavaScript 2.437.0.
It might be helpful to know that I can get the code running correctly without the parameters in question (i.e., just the "Name", "Type", and "Value" parameters).
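One quick check (not part of the original question) is to log which aws-sdk version the runtime actually loads, since the bundled SDK can differ from the one you develop against; a minimal sketch:

// Minimal sketch: print the aws-sdk version that this runtime provides.
const AWS = require('aws-sdk');

exports.lambdaHandler = async () => {
    console.info(`aws-sdk version in this runtime: ${AWS.VERSION}`);
};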
Unfortunately, both Tier and Policies weren't added until v2.442.0 (see diff).
This means that to use these features you'll have to deploy with the version of the aws-sdk you're developing against.
It should be noted that either developing/testing against the built-in version, or deploying with the aws-sdk you do use, is often cited as good practice. If you're deploying your own version, you can use explicit client imports (e.g. const SSM = require('aws-sdk/clients/ssm')) to keep the deployment size down. This is even more effective if you develop against the preview AWS SDK version 3.
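A minimal sketch of what that explicit client import could look like, assuming the deployment package bundles an aws-sdk version of at least 2.442.0:

// Sketch only: standalone SSM client to keep the bundle small.
// Assumes aws-sdk >= 2.442.0 is packaged with the function, so the
// Tier and Policies parameters are recognised.
const SSM = require('aws-sdk/clients/ssm');
const ssm = new SSM({ region: 'us-east-1' });

exports.lambdaHandler = async () => {
    const tokenExpiration = new Date();
    tokenExpiration.setSeconds(tokenExpiration.getSeconds() + 60);

    await ssm.putParameter({
        Name: 'SECRET_TOKEN',
        Type: 'SecureString',
        Value: '12345',
        Overwrite: true,
        Tier: 'Advanced',
        Policies: JSON.stringify([{
            Type: 'Expiration',
            Version: '1.0',
            Attributes: { Timestamp: tokenExpiration.toISOString() }
        }])
    }).promise();
};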

SageMaker NodeJS's SDK is not locking the API Version

I am running some code in AWS Lambda that dynamically creates SageMaker models.
I am locking Sagemaker's API version like so:
const sagemaker = new AWS.SageMaker({apiVersion: '2017-07-24'});
And here's the code to create the model:
await sagemaker.createModel({
    ExecutionRoleArn: 'xxxxxx',
    ModelName: sageMakerConfigId,
    Containers: [{
        Image: ecrUrl
    }]
}).promise()
This code runs just fine locally with aws-sdk on 2.418.0.
However, when this code is deployed to Lambda, it doesn't work due to some validation errors upon creating the model:
MissingRequiredParameter: Missing required key 'PrimaryContainer' in params
UnexpectedParameter: Unexpected key 'Containers' found in params
Is anyone aware of existing bugs in the aws-sdk for NodeJS using the SDK provided by AWS in the Lambda context? I believe the SDK available inside AWS Lambda is more up-to-date than 2.418.0 but apparently there are compatibility issues.
As you've noticed, the 'embedded' Lambda version of the aws-sdk lags behind. It's actually on 2.290.0 (you can see the full details of the environment here: https://docs.aws.amazon.com/lambda/latest/dg/current-supported-versions.html).
You can see here: https://github.com/aws/aws-sdk-js/blame/master/clients/sagemaker.d.ts that it was not until 2.366.0 that the params for this method included Containers and no longer required PrimaryContainer.
As you've noted, the workaround is to deploy your Lambda with the aws-sdk version that you're using. This is sometimes noted as a best practice, as it pins the aws-sdk to the functionality you've built and tested against.
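A minimal sketch of that workaround, assuming aws-sdk (>= 2.366.0) is added as a regular dependency and packaged with the function so the bundled copy is used instead of the runtime's:

// Sketch only: explicit client import from the aws-sdk version packaged
// with the function (assumed >= 2.366.0, which accepts Containers).
// The ARN, model name and image URI below are placeholders.
const SageMaker = require('aws-sdk/clients/sagemaker');
const sagemaker = new SageMaker({ apiVersion: '2017-07-24' });

exports.handler = async () => {
    await sagemaker.createModel({
        ExecutionRoleArn: 'arn:aws:iam::123456789012:role/sagemaker-role',
        ModelName: 'my-model',
        Containers: [{ Image: '123456789012.dkr.ecr.us-east-1.amazonaws.com/my-image:latest' }]
    }).promise();
};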

AWS Lambda - Cognito sign up/log in on a node.js lambda function

I am trying to run AWS Cognito Identity on AWS Lambda, handling user sign-up in a function rather than putting all that logic in the view.
Firstly, is this at all possible?
Here's what I've done:
1. Wrote a Lambda function, using some of the example code AWS published in their docs.
2. Installed the 'amazon-cognito-identity-js' node packages.
3. Zipped it all up and published it to Lambda.
Here is the first few lines of my function:
const AWSCognito = require('amazon-cognito-identity-js');
const userPoolId = '<region>-blah';
const clientId = 'blah';
AWSCognito.config.region = '<region>';
exports.handler = function(event, context, callback) {
I am getting the following error though:
{
    "errorMessage": "Cannot find module '/var/task/index'",
    "errorType": "Error",
    "stackTrace": [
        "Function.Module._load (module.js:417:25)",
        "Module.require (module.js:497:17)",
        "require (internal/module.js:20:19)"
    ]
}
I've looked around online and everything I've found says that it can be because I'm zipping it up wrong. I've checked, and the only things in the .zip file are the node_modules folder and my userSignUp.js file.
Can anyone spot something I'm missing here, or is it simply not possible?
Is the .js file with your code called "index.js" or something else?
If it is not called index.js, you will get that error even if you zipped everything up correctly.
Check your function config for the "Handler" parameter. By default it should be "index.handler". Say your file is called xyz.js, then you should change the handler to be "xyz.handler".
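A minimal sketch of how the file name and the handler setting line up, assuming the file from the question stays named userSignUp.js:

// userSignUp.js -- with this file name, the Lambda "Handler" setting must be
// "userSignUp.handler" (or rename the file to index.js and keep "index.handler").
exports.handler = function (event, context, callback) {
    // sign-up logic would go here
    callback(null, { statusCode: 200, body: 'ok' });
};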

AWS Lambda function to connect to a Postgresql database

Does anyone know how I can connect to a PostgreSQL database through an AWS Lambda function? I searched online but I couldn't find anything about it. If you could tell me how to go about it, that would be great.
If you can find something wrong with my code (Node.js), that would be great; otherwise, can you tell me how to go about it?
exports.handler = (event, context, callback) => {
    "use strict";
    const pg = require('pg');
    // Connection string format: postgres://user:password@host:port/db_name
    const connectionStr = "postgres://username:password@host:port/db_name";
    var client = new pg.Client(connectionStr);
    client.connect(function (err) {
        if (err) {
            callback(err);
            return;
        }
        callback(null, 'Connection established');
    });
    context.callbackWaitsForEmptyEventLoop = false;
};
The code throws an error:
cannot find module 'pg'
I wrote it directly on AWS Lambda and didn't upload anything if that makes a difference.
Yes, this makes the difference! Lambda doesn't provide 3rd-party libraries out of the box. As soon as you have a dependency on a 3rd-party library, you need to zip and upload your Lambda code manually or with the use of the API.
For more information: Lambda Execution Environment and Available Libraries
You need to refer to Creating a Deployment Package (Node.js):
Simple scenario – If your custom code requires only the AWS SDK library, then you can use the inline editor in the AWS Lambda console. Using the console, you can edit and upload your code to AWS Lambda. The console will zip up your code with the relevant configuration information into a deployment package that the Lambda service can run.
and
Advanced scenario – If you are writing code that uses other resources, such as a graphics library for image processing, or you want to use the AWS CLI instead of the console, you need to first create the Lambda function deployment package, and then use the console or the CLI to upload the package.
Your case, like mine, falls under the advanced scenario. So we need to create a deployment package and then upload it. Here is what I did:
mkdir deployment
cd deployment
vi index.js
Write your Lambda code in this file. Make sure your handler name is index.handler when you create the function.
npm install pg
You should see a node_modules directory created in the deployment directory, containing multiple modules.
Package the deployment directory into a zip file and upload to Lambda.
You should be good then
NOTE: npm install will install node modules in the same directory, under a node_modules directory, unless it sees a node_modules directory in a parent directory. To be safe, first do npm init followed by npm install to ensure the modules are installed in the same directory for deployment.
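For reference, a minimal index.js along the lines described above (a sketch only; the connection string values are placeholders):

// index.js -- deployed together with node_modules/pg in the zip,
// with the Lambda handler set to "index.handler".
const { Client } = require('pg');

exports.handler = (event, context, callback) => {
    context.callbackWaitsForEmptyEventLoop = false;

    // Placeholder connection string -- replace with real values.
    const client = new Client('postgres://username:password@host:5432/db_name');

    client.connect((err) => {
        if (err) {
            callback(err);
            return;
        }
        client.end(() => callback(null, 'Connection established'));
    });
};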

Serverless Framework with Azure functions

I am writing services with the Serverless Framework & Azure Functions. The examples out there are very simple, but when I try to take things a step further, I run into problems. I'm currently learning from AWS Lambda and then trying to implement the same on Azure Functions.
The goal of doing so is:
1) Implement functions as ES6 classes and then build the project with webpack.
2) Find a right project structure, which makes more sense.
3) Follow SoC pattern.
I have created a GitHub project https://github.com/GeekOnGadgets/serverless-azure-settings and when I try to build this project with serverless package, it creates a .serverless folder and inside it a .zip file (the compiled version), which I understand gets deployed to Azure when you run serverless deploy. But when I check on Azure, the function is just the development code and not the compiled one (please refer to the code below).
Can someone please help with this? Any suggestions are appreciated.
import Settings from './src/Settings/Settings'

module.exports.settings = (event, context, callback) => {
    let settings = new Settings();
    const response = {
        statusCode: 200,
        headers: {
            "Content-Type": "application/json"
        },
        body: JSON.stringify(settings.dev()),
    };
    callback(null, response);
}
Indeed, JavaScript Azure Functions run on Node.js, so CommonJS modules are the natural format. Node also natively supports much of ES6, though the Functions version of Node might not be the latest.
However, there is currently a speed issue with loading all the dependencies in node_modules. This is due to file access, so a common workaround is to bundle everything into a single script that package.json -> main points to.
I can't comment on how that fits in with Serverless, but perhaps this will help clarify.
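A minimal sketch of what that bundling workaround could look like with webpack (the entry and output paths here are assumptions, not part of the original answer):

// webpack.config.js -- bundles the function and its dependencies into a single
// file; point package.json's "main" (and the function entry) at the output.
const path = require('path');

module.exports = {
    target: 'node',            // build for Node.js, keeping built-ins external
    mode: 'production',
    entry: './handler.js',     // assumed entry file
    output: {
        path: path.resolve(__dirname, 'dist'),
        filename: 'bundle.js', // then set "main": "dist/bundle.js" in package.json
        libraryTarget: 'commonjs2'
    }
};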
As far as I know, Node.js still does not support import/export ES6 syntax for modules. See also here.
Try a new deploy changing from
import Settings from './src/Settings/Settings'
to
const Settings = require('./src/Settings/Settings')
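For that require to return the class directly, the Settings module also needs a CommonJS export; a hypothetical sketch of ./src/Settings/Settings.js:

// ./src/Settings/Settings.js -- hypothetical shape of the module,
// exported with module.exports so require() returns the class itself.
class Settings {
    dev() {
        // Placeholder settings object, for illustration only.
        return { environment: 'dev' };
    }
}

module.exports = Settings;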
