How can I invoke a Lambda function correctly? [closed] - node.js

Closed. This question needs debugging details. It is not currently accepting answers.
Closed 1 year ago.
I'm trying to create my first Lambda function on AWS.
I'm following a YouTube tutorial, and I wrote the following code to fetch some users from a DynamoDB table.
const aws = require('aws-sdk')

aws.config.update({
    region: "eu-west-3"
})

const dynamodb = new aws.DynamoDB.DocumentClient()
const dynamoUsersTable = "Users"
const usersPath = "/users"

exports.handler = async (event) => {
    console.log('Request event: ', event);
    let response;
    switch (true) {
        case event.httpMethod === 'GET' && event.path === usersPath:
            response = await getAllUsers();
            break;
        default:
            break;
    }
    // UPDATE: ADDED THE RETURN RESPONSE LINE! -> New error: Missing Authentication Token
    return response;
}

const buildResponse = (statusCode, body) => {
    return {
        statusCode: statusCode,
        headers: {
            "content-type": "application/json"
        },
        body: JSON.stringify(body)
    }
}

const scanDynamoRecords = async (scanParams, itemArray) => {
    try {
        const dynamoData = await dynamodb.scan(scanParams).promise();
        itemArray = itemArray.concat(dynamoData.Items);
        if (dynamoData.LastEvaluatedKey) {
            scanParams.ExclusiveStartKey = dynamoData.LastEvaluatedKey;
            return await scanDynamoRecords(scanParams, itemArray);
        }
        return itemArray;
    } catch (error) {
        console.error('Do your custom error handling here. I am just gonna log it: ', error);
    }
}

const getAllUsers = async () => {
    const params = {
        TableName: dynamoUsersTable
    }
    const allUsers = await scanDynamoRecords(params, []);
    const body = {
        users: allUsers
    }
    return buildResponse(200, body);
}
But I get an Internal Server Error when I make my GET request from Postman.
PS: The function's role has the DynamoDBFullAccess and CloudWatch policies attached, and I can't find any error message in the CloudWatch logs.
UPDATE: The error was a missing "return" statement after the switch. However, after that fix, a "Missing Authentication Token" message is returned in response to the GET request from Postman. I suspect IAM has a say in this, since I'm using multiple AWS services.

When you create a Lambda function, you define an IAM role that is used to run it. If your Lambda function is going to invoke other AWS services, then that role needs permissions to invoke those services. If you do not set up the IAM role permissions correctly, your Lambda function will not succeed.
See this AWS tutorial. Although it is implemented in Java, it covers important information such as setting up an IAM role to invoke AWS services from the Lambda function, and so on.
Creating scheduled events to invoke Lambda functions

You need to return the response variable in your handler method.
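For illustration, here is a minimal sketch of the handler with the return in place and a fallback response for unmatched routes. The buildResponse, getAllUsers and usersPath helpers are the ones from the question; the 404 branch is an addition, not part of the original code:

exports.handler = async (event) => {
    let response;
    switch (true) {
        case event.httpMethod === 'GET' && event.path === usersPath:
            response = await getAllUsers();
            break;
        default:
            // Fallback so API Gateway always receives a well-formed proxy response
            response = buildResponse(404, { message: 'Route not found' });
            break;
    }
    // Without this return, API Gateway receives no integration response at all
    return response;
}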

Related

AWS API GATEWAY <=> AWS Lambda CORS problem

This is my first time with AWS. I set up a DynamoDB table and created Lambda functions to do CRUD operations; I made the functions static at first. Then I created API Gateway methods to access the Lambda functions dynamically.
The method I'm trying to call is a GET method. It should be callable from JS on a different domain, so I need to enable CORS. Unfortunately, even after adding the header in my Lambda function, I get a CORS error:
'use strict';
const AWS = require('aws-sdk');
AWS.config.update({ region: 'us-east-2' });

exports.handler = async (event, context) => {
    const ddb = new AWS.DynamoDB({ apiVersion: '2012-10-08' });
    const documentClient = new AWS.DynamoDB.DocumentClient({ region: 'us-east-2' });

    let responseBody = "";
    let statusCode = 0;

    var params = {
        TableName: "Users",
    };

    // scan is a function, that is like the SELECT .. WHERE .. clause in SQL
    try {
        const data = await documentClient.scan(params).promise();
        responseBody = JSON.stringify(data.Items);
        statusCode = 200;
    } catch (err) {
        responseBody = `Unable to Scan Users, error: ${err}`;
        statusCode = 403;
    }

    const response = {
        statusCode: statusCode,
        headers: {
            "access-control-allow-origin": "*"
        },
        body: {
            responseBody
        }
    };
    return response;
};
This is the code I have in a VueJS application that should make a request to the GET method in API Gateway:
try {
    const res = await axios.get("https://here-goes-my-url");
    console.log(res.data);
} catch (err) {
    console.log(err);
}
This is the error I'm getting:
Access to XMLHttpRequest at 'https://here-goes-the-url' from origin 'http://localhost:8080' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource.
The Lambda GET method should return all items in the table "Users".
Please help, thanks!
OK, you have your header set.
Have you enabled CORS in API Gateway?
Also, when you see CORS errors with AWS Lambda, they are not always actually about CORS: sometimes the response is blocked by API Gateway and the client still reports a CORS error. I think this is deliberate so the server is harder to probe, since detailed errors are useful to an attacker.
So I would strongly suggest testing your application first in the API Gateway console.
This will give you a more actionable error. I'm guessing this might be related to permissions. Hope this helps.
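One more thing worth checking, as an aside: with Lambda proxy integration, API Gateway also expects body to be a string rather than a nested object. A minimal sketch of a response shape that passes through cleanly (reusing the data and statusCode variables from the scan above):

const response = {
    statusCode: statusCode,
    headers: {
        // CORS header for browser clients
        "Access-Control-Allow-Origin": "*",
        "Content-Type": "application/json"
    },
    // Proxy integration requires a string body
    body: JSON.stringify(data.Items)
};
return response;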

I get 'Error [ConfigError]: Missing region in config' each time I attempt AWS lambda function call from NodeJS

I'm new to AWS Lambda functions and was given some sample code (below; in the actual code I fill in the placeholders correctly, of course). But each time I call the Lambda function, I get
Error [ConfigError]: Missing region in config
at Request.VALIDATE_REGION (/Users/abc/Documents/projects/bot/node_modules/aws-sdk/lib/event_listeners.js:92:45)
This happens even though the region is set in aws.config.update. I'm not sure how to fix it. I've tried removing the call to aws.config.update and removing the region setting, but nothing helps.
I'd also be keen to know if there's a way to load the credentials from the shared file in ~/.aws/credentials, instead of having to enter them directly here.
Thanks for any help!
const aws = require('aws-sdk');

// do this only in dev; in prod you will not need to put your keys here...
aws.config.update({ accessKeyId: '<YOUR ACCESS KEY>', secretAccessKey: '<YOUR SECRET>', region: 'ap-northeast-1' });

const lambda = new aws.Lambda();

function invokeLambda(options) {
    return new Promise((resolve, reject) => {
        lambda.invoke(options, (err, data) => {
            if (err) {
                return reject(err);
            }
            return resolve(data);
        });
    });
}

const sampleCall = async () => {
    try {
        const options = {
            FunctionName: '<NAME OF THE FUNCTION YOU WANT TO INVOKE>',
            Payload: '<The JSON.stringify of the object you want to provide as parameter>',
        }
        const result = await invokeLambda(options);
        console.log(result);
    } catch (err) {
        console.error(err);
    }
};
You can configure the region before using any services, like this:

var AWS = require('aws-sdk');
AWS.config.update({ region: 'us-east-1' });

const lambda = new AWS.Lambda();
AWS services such as Lambda can assume an IAM role, so the application does not need to store credentials. Here is what you should do:
The IAM role is attached to the Lambda function.
Permissions are then attached to the IAM role.
Basically, you can attach the same policy that your IAM user is using to the IAM role. After doing that, you can remove the credentials from the code.
Reference:
https://aws.amazon.com/blogs/security/how-to-create-an-aws-iam-policy-to-grant-aws-lambda-access-to-an-amazon-dynamodb-table/
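On the side question about ~/.aws/credentials: for local development, the v2 SDK can also read the shared credentials file instead of hard-coded keys. A minimal sketch (the 'default' profile name is an assumption; use whatever profile you actually have):

const AWS = require('aws-sdk');

// Load credentials from ~/.aws/credentials instead of hard-coding keys.
const credentials = new AWS.SharedIniFileCredentials({ profile: 'default' });
AWS.config.credentials = credentials;
AWS.config.update({ region: 'ap-northeast-1' });

const lambda = new AWS.Lambda();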
Hey, there is no need to configure the whole AWS settings if you want to keep them unconfigured and don't want every request to use that region.
The Lambda constructor accepts a 'region' param; just set it to your desired region.
Example:
const AWS = require('aws-sdk');

const lambda = new AWS.Lambda({
    region: 'us-east-1'
});

lambda.invoke({ FunctionName: 'your-function-name' },
    function (err, data) {
        if (err) {
            console.log(err, err.stack);
        } else {
            console.log(data);
        }
    }
);

How to write a unit test for a function which is accessing AWS resources?

I have a function which accesses multiple AWS resources, and now I need to test it, but I don't know how to mock these resources.
I have tried following the aws-sdk-mock GitHub docs, but didn't get very far.
function someData(event, configuration, callback) {
    // sts set-up
    var sts = new AWS.STS(configuration.STS_CONFIG);
    sts.assumeRole({
        DurationSeconds: 3600,
        RoleArn: process.env.CROSS_ACCOUNT_ROLE,
        RoleSessionName: configuration.ROLE_NAME
    }, function(err, data) {
        if (err) {
            // an error occurred
            console.log(err, err.stack);
        } else {
            // successful response
            // resolving static credential
            var creds = new AWS.Credentials({
                accessKeyId: data.Credentials.AccessKeyId,
                secretAccessKey: data.Credentials.SecretAccessKey,
                sessionToken: data.Credentials.SessionToken
            });
            // Query function
            var dynamodb = new AWS.DynamoDB({ apiVersion: configuration.API_VERSION, credentials: creds, region: configuration.REGION });
            var docClient = new AWS.DynamoDB.DocumentClient({ apiVersion: configuration.API_VERSION, region: configuration.REGION, endpoint: configuration.DDB_ENDPOINT, service: dynamodb });
            // extract params
            var ID = event.queryStringParameters.Id;
            console.log('metrics of id ' + ID);
            var params = {
                TableName: configuration.TABLE_NAME,
                ProjectionExpression: configuration.PROJECTION_ATTR,
                KeyConditionExpression: '#ID = :ID',
                ExpressionAttributeNames: {
                    '#ID': configuration.ID
                },
                ExpressionAttributeValues: {
                    ':ID': ID
                }
            };
            queryDynamoDB(params, docClient).then((response) => {
                console.log('Params: ' + JSON.stringify(params));
                // if the query is successful
                if (typeof(response[0]) !== 'undefined') {
                    response[0]['Steps'] = process.env.STEPS;
                    response[0]['PageName'] = process.env.STEPS_NAME;
                }
                console.log('The response you get', response);
                var success = {
                    statusCode: HTTP_RESPONSE_CONSTANTS.SUCCESS.statusCode,
                    body: JSON.stringify(response),
                    headers: {
                        'Content-Type': 'application/json'
                    },
                    isBase64Encoded: false
                };
                return callback(null, success);
            }, (err) => {
                // return internal server error
                return callback(null, HTTP_RESPONSE_CONSTANTS.BAD_REQUEST);
            });
        }
    });
}
This is the Lambda function I need to test; it also uses some environment variables.
I tried writing a unit test for the function above using aws-sdk-mock, but I still can't figure out how to actually do it. Any help will be appreciated. Below is my test code:
describe('test getMetrics', function() {
    var expectedOnInvalid = HTTP_RESPONSE_CONSTANTS.BAD_REQUEST;
    it('should assume role ', function(done) {
        var event = {
            queryStringParameters: {
                Id: '123456'
            }
        };
        AWS.mock('STS', 'assumeRole', 'roleAssumed');
        AWS.restore('STS');
        AWS.mock('Credentials', 'credentials');
        AWS.restore('Credentials');
        AWS.mock('DynamoDB.DocumentClient', 'get', 'message');
        AWS.mock('DynamoDB', 'describeTable', 'message');
        AWS.restore('DynamoDB');
        AWS.restore('DynamoDB.DocumentClient');
        someData(event, configuration, (err, response) => {
            expect(response).to.deep.equal(expectedOnInvalid);
            done();
        });
    });
});
I am getting the following error:
{ MultipleValidationErrors: There were 2 validation errors:
* MissingRequiredParameter: Missing required key 'RoleArn' in params
* MissingRequiredParameter: Missing required key 'RoleSessionName' in params
Try setting the aws-sdk module explicitly.
Project structures that don't include the aws-sdk at the top-level node_modules folder will not be properly mocked. An example of this would be installing the aws-sdk in a nested project directory. You can get around this by explicitly setting the path to a nested aws-sdk module using setSDK(), or by passing the SDK instance with setSDKInstance():

const AWSMock = require('aws-sdk-mock');
const AWS = require('aws-sdk');

AWSMock.setSDKInstance(AWS);

For more details, read the aws-sdk-mock documentation; it explains this even better.
I strongly disagree with #ttulka's answer, so I have decided to add my own as well.
Given you received an event in your Lambda function, it's very likely you'll process the event and then invoke some other service. It could be a call to S3, DynamoDB, SQS, SNS, Kinesis...you name it. What is there to be asserted at this point?
Correct arguments!
Consider the following event:
{
"data": "some-data",
"user": "some-user",
"additionalInfo": "additionalInfo"
}
Now imagine you want to invoke documentClient.put and you want to make sure that the arguments you're passing are correct. Let's also say that you DON'T want the additionalInfo attribute to be persisted, so, somewhere in your code, you'd have this to get rid of this attribute
delete event.additionalInfo
right?
You can now create a unit test to assert that the correct arguments were passed into documentClient.put, meaning the final object should look like this:
{
"data": "some-data",
"user": "some-user"
}
Your test must assert that documentClient.put was invoked with a JSON which deep equals the JSON above.
If you or any other developer now, for some reason, removes the delete event.additionalInfo line, tests will start failing.
And this is very powerful! If you make sure that your code works the way you expect, you basically don't have to worry about creating integration tests at all.
Now, if an SQS consumer Lambda expects the body of the message to contain some field, the producer Lambda should always take care to persist the right arguments in the queue. I think by now you get the idea, right?
I always tell my colleagues that if we can create proper unit tests, we should be good to go in 95% of cases, leaving integration tests out. Of course it's better to have both, but the amount of time spent on integration tests (setting up environments, credentials, sometimes even different accounts) is often not worth it. That's just MY opinion; both you and #ttulka are more than welcome to disagree.
Now, back to your question:
You can use Sinon to mock and assert arguments in your Lambda functions. If you need to mock a 3rd-party service (like DynamoDB, SQS, etc), you can create a mock object and replace it in your file under test using Rewire. This usually is the road I ride and it has been great so far.
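As a concrete illustration of that approach, here is a minimal sketch using Sinon and Rewire with Mocha/Chai. The module name ./saveUser, its exported handler, and the persisted Item shape are hypothetical placeholders, not something from the question:

const sinon = require('sinon');
const rewire = require('rewire');
const { expect } = require('chai');

// Hypothetical module under test: it is assumed to delete event.additionalInfo
// and then call documentClient.put({ TableName, Item }).promise().
const saveUser = rewire('./saveUser');

describe('saveUser', () => {
    it('strips additionalInfo before persisting', async () => {
        const putStub = sinon.stub().returns({ promise: () => Promise.resolve({}) });
        // Swap the module-level documentClient for a stub.
        const revert = saveUser.__set__('documentClient', { put: putStub });

        await saveUser.handler({
            data: 'some-data',
            user: 'some-user',
            additionalInfo: 'additionalInfo'
        });

        sinon.assert.calledOnce(putStub);
        expect(putStub.firstCall.args[0].Item).to.deep.equal({
            data: 'some-data',
            user: 'some-user'
        });

        revert();
    });
});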
I see unit testing as a way to check whether your domain (business) rules are met.
As long as your Lambda contains nothing but integration of AWS services, it doesn't make much sense to write a unit test for it.
Mocking all the resources means your test would only exercise the communication among those mocks; such a test has no value.
External resources mean input/output, and that is what integration testing focuses on.
Write integration tests and run them as part of your integration pipeline against real, deployed resources.
This is how we can mock STS in Node.js (the example uses TypeScript and Jest).
import { STS } from 'aws-sdk';

export default class GetCredential {
    constructor(public sts: STS) { }

    public async getCredentials(role: string) {
        console.info('Retrieving credential...', { role });
        const apiRole = await this.sts
            .assumeRole({
                RoleArn: role,
                RoleSessionName: 'test-api',
            })
            .promise();
        if (!apiRole?.Credentials) {
            throw new Error(`Credentials for ${role} could not be retrieved`);
        }
        return apiRole.Credentials;
    }
}
Mock for the above function
import { STS } from 'aws-sdk';
import GetCredential from './GetCredential';

const sts = new STS();
let testService: GetCredential;

beforeEach(() => {
    testService = new GetCredential(sts);
});

describe('Given getCredentials has been called', () => {
    it('The method returns a credential', async () => {
        const credential = {
            AccessKeyId: 'AccessKeyId',
            SecretAccessKey: 'SecretAccessKey',
            SessionToken: 'SessionToken'
        };
        const mockGetCredentials = jest.fn().mockReturnValue({
            promise: () => Promise.resolve({ Credentials: credential }),
        });
        testService.sts.assumeRole = mockGetCredentials;
        const result = await testService.getCredentials('fakeRole');
        expect(result).toEqual(credential);
    });
});

AWS SES create template with Lambda function always returns null

So, this is my first time learning AWS stuff (it is a beast). I'm trying to create e-mail templates, and I have this Lambda function:
// Load the AWS SDK for Node.js
var AWS = require('aws-sdk');
// Set the region
AWS.config.update({ region: "us-east-1" });

exports.handler = async (event, context, callback) => {
    // Create createTemplate params
    var params = {
        Template: {
            TemplateName: "notification" /* required */,
            HtmlPart: "HTML_CONTENT",
            SubjectPart: "SUBJECT_LINE",
            TextPart: "sending emails with aws lambda"
        }
    };

    // Create the promise and SES service object
    const templatePromise = new AWS.SES({ apiVersion: "2010-12-01" })
        .createTemplate(params)
        .promise();

    // Handle promise's fulfilled/rejected states
    templatePromise
        .then((data) => {
            console.log(data);
            callback(null, JSON.stringify(data));
            // also tried callback(null, data);
        }, (err) => {
            console.error(err, err.stack);
            callback(JSON.stringify(err));
        });
};
As far as I understand, this function should return a template, an object, anything? When I use the Lambda test functionality, I always get null in the request response.
Does anyone know what I am doing wrong here?
Edit: It is not creating the e-mail template either; I checked the SES panel under email templates and it is empty.
Edit 2: If I try to return a string, e.g. callback(null, "some success message");, it does return the string, so my guess is something is wrong with the SES call. But this function is exactly what we have in the AWS docs, so I assume it should just work.
Try not to resolve the Promise yourself; change your code to just return it:
return await templatePromise;
This should give you more detail about what is really going wrong in your code. It might be a hidden access issue, so you might need to adjust the role your Lambda function is using. On the other hand, createTemplate should not return much on successful execution; it just creates the template.
Also try to follow this try/catch pattern when using async handlers (as described in more detail here: https://aws.amazon.com/de/blogs/compute/node-js-8-10-runtime-now-available-in-aws-lambda/):
exports.handler = async (event) => {
    let data;
    try {
        data = await lambda.getAccountSettings().promise();
    } catch (err) {
        console.log(err);
        return err;
    }
    return data;
};
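Applied to the original createTemplate call, that pattern might look like the following sketch (same params and region as in the question; the error is returned instead of being swallowed):

var AWS = require('aws-sdk');
AWS.config.update({ region: "us-east-1" });

exports.handler = async (event) => {
    var params = {
        Template: {
            TemplateName: "notification",
            HtmlPart: "HTML_CONTENT",
            SubjectPart: "SUBJECT_LINE",
            TextPart: "sending emails with aws lambda"
        }
    };
    const ses = new AWS.SES({ apiVersion: "2010-12-01" });
    try {
        // createTemplate resolves with an essentially empty object on success;
        // the template itself shows up in the SES console.
        const data = await ses.createTemplate(params).promise();
        return { statusCode: 200, body: JSON.stringify(data) };
    } catch (err) {
        // Returning the error makes access problems visible in the test console.
        console.error(err, err.stack);
        return { statusCode: 500, body: JSON.stringify(err) };
    }
};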

AWS Lambda publishing to IOT Topic fires indefinitely

The Issue:
I have a Node.js (8.10) AWS Lambda function that takes a JSON object and publishes it to an IoT topic. The function successfully publishes to the topic; however, once fired, it is continuously invoked until I throttle the concurrency to zero to halt any further calls.
I'm trying to figure out what I've implemented incorrectly that causes more than one invocation of the function.
The Function:
Here is my function:
var AWS = require('aws-sdk');

exports.handler = function (event, context) {
    var iotdata = new AWS.IotData({ endpoint: 'xxxxxxxxxx.iot.us-east-1.amazonaws.com' });
    var params = {
        topic: '/PiDevTest/SyncDevice',
        payload: JSON.stringify(event),
        qos: 0
    };
    iotdata.publish(params, function(err, data) {
        if (err) {
            console.log(err, err.stack);
        } else {
            console.log("Message sent.");
            context.succeed();
        }
    });
};
My test JSON is:
{
    "success": 1,
    "TccvID": "TestID01"
}
The test console shows a response of "null", but the IoT topic receives the data from the test JSON, published to the topic about once per second.
What I've Tried
- I've attempted to define the handler in its own, non-anonymous function called handler, and then set exports.handler = handler;. This didn't produce any errors, but didn't successfully post to the IoT topic either.
- I thought maybe the issue was with the Node.js callback. I've tried implementing it and leaving it out (current iteration above), but neither way seemed to make a difference. I had read somewhere that the function will retry if it errors, but I believe that only happens three times, so it wouldn't explain the indefinite calling of the function.
- I've also tried calling the function from another Lambda to make sure the issue wasn't the AWS test tool. This produced the same behavior, though.
Summary:
What am I doing incorrectly that causes this function to publish the JSON data indefinitely to the IoT topic?
Thanks in advance for your time and expertise.
Use aws-iot-device-sdk to create an MQTT client and use its message handler and publish method to publish your messages to the IoT topic. Sample MQTT client code is below:
import * as DeviceSdk from 'aws-iot-device-sdk';
import * as AWS from 'aws-sdk';

let instance: any = null;

export default class IoTClient {
    client: any;

    /**
     * Constructor
     *
     * @params {boolean} createNewClient - Whether or not to use existing client instance
     */
    constructor(createNewClient = false, options = {}) {
    }

    async init(createNewClient, options) {
        if (createNewClient && instance) {
            instance.disconnect();
            instance = null;
        }
        if (instance) {
            return instance;
        }
        instance = this;

        this.initClient(options);
        this.attachDebugHandlers();
    }

    /**
     * Instantiate AWS IoT device object
     * Note that the credentials must be initialized with empty strings;
     * When we successfully authenticate to the Cognito Identity Pool,
     * the credentials will be dynamically updated.
     *
     * @params {Object} options - Options to pass to DeviceSdk
     */
    initClient(options) {
        const clientId = getUniqueId();

        this.client = DeviceSdk.device({
            region: options.region || getConfig('iotRegion'),

            // AWS IoT Host endpoint
            host: options.host || getConfig('iotHost'),

            // clientId created earlier
            clientId: options.clientId || clientId,

            // Connect via secure WebSocket
            protocol: options.protocol || getConfig('iotProtocol'),

            // Set the maximum reconnect time to 500ms; this is a browser application
            // so we don't want to leave the user waiting too long for reconnection after
            // re-connecting to the network/re-opening their laptop/etc...
            baseReconnectTimeMs: options.baseReconnectTimeMs || 500,
            maximumReconnectTimeMs: options.maximumReconnectTimeMs || 1000,

            // Enable console debugging information
            debug: (typeof options.debug === 'undefined') ? true : options.debug,

            // AWS access key ID, secret key and session token must be
            // initialized with empty strings
            accessKeyId: options.accessKeyId,
            secretKey: options.secretKey,
            sessionToken: options.sessionToken,

            // Let redux handle subscriptions
            autoResubscribe: (typeof options.debug === 'undefined') ? false : options.autoResubscribe,
        });
    }

    disconnect() {
        this.client.end();
    }

    attachDebugHandlers() {
        this.client.on('reconnect', () => {
            logger.info('reconnect');
        });

        this.client.on('offline', () => {
            logger.info('offline');
        });

        this.client.on('error', (err) => {
            logger.info('iot client error', err);
        });

        this.client.on('message', (topic, message) => {
            logger.info('new message', topic, JSON.parse(message.toString()));
        });
    }

    updateWebSocketCredentials(accessKeyId, secretAccessKey, sessionToken) {
        this.client.updateWebSocketCredentials(accessKeyId, secretAccessKey, sessionToken);
    }

    attachMessageHandler(onNewMessageHandler) {
        this.client.on('message', onNewMessageHandler);
    }

    attachConnectHandler(onConnectHandler) {
        this.client.on('connect', (connack) => {
            logger.info('connected', connack);
            onConnectHandler(connack);
        });
    }

    attachCloseHandler(onCloseHandler) {
        this.client.on('close', (err) => {
            logger.info('close', err);
            onCloseHandler(err);
        });
    }

    publish(topic, message) {
        this.client.publish(topic, message);
    }

    subscribe(topic) {
        this.client.subscribe(topic);
    }

    unsubscribe(topic) {
        this.client.unsubscribe(topic);
        logger.info('unsubscribed from topic', topic);
    }
}
*** getConfig() reads environment variables from a YAML file; alternatively, you can specify the values directly here.
While he only posted it as a comment, MarkB pointed me in the correct direction.
The problem was that another Lambda was listening to the same topic and invoking the Lambda I was working on. This resulted in circular logic, as the exit condition was never met. Fixing that code solved the issue.
