Serverless lambda - Function not executing completely - node.js

I'm using Serverless with Node.js. I have a lambda function which has around 6 APIs in it.
All APIs are working except one. This non-working API works fine locally with serverless-offline, but after deployment to the server it behaves as if delayed.
Here is a skeleton of what I did in the API function:
let rec_list = await db.sequelize.query(query)
    .spread(rec_list => { return rec_list; })
    .catch((e) => {
        console.log("error");
        throw e;
    });
let rec_list2 = rec_list.map((rec_list_sub) => {
    //some assignment here
    //let new_var = {}; // just assignment - no db operation
    return new_var;
});
let resultArr = await Promise.all(rec_list2)
    .then((result) => {
        return result;
    }).catch((e) => {
        throw e;
    });
let tem_list = await db.mymodel.bulkCreate(resultArr)
    .then(function (li) {
        selectedIds = li.map(({ id }) => {
            return {
                reqId: id,
                description: 'sent',
                status: 0
            };
        });
        return li;
    }).catch(function (err) {
        throw err;
    });
//send fcm push
//triggering push notification to user
fcm_send_msg("success", "body-message-here", ["fdsfdsfdsf-device-id"]);
The push notification is not triggered. When I then hit any other API from this same lambda function, the previously called APIs' push notifications are triggered.
If I place the push notification function call, which is async (tried with and without await), before the bulkCreate call, then it works, but the bulk create is delayed.
The lambda function's timeout is 6 sec, but execution took only 108 ms. Allocated memory is 1024 MB, but only 120 MB was used.

I'm going to make a lot of assumptions, so if I am totally off base let me know and I will delete this answer. I assume your lambda is running in a VPC, since it is accessing a database. And from your question I understand you to be saying that when the call to "bulkCreate" happens before the call to "fcm_send_msg", you experience the issue, but when the call to "fcm_send_msg" happens before the call to "bulkCreate", you get the notification, though the bulk create is still delayed.
I suspect you may have a permission issue. Lambda uses an ephemeral port range for calls, so you will want to be sure your security groups and NACLs allow the full ephemeral port range from which Lambda may originate the calls. See https://docs.aws.amazon.com/vpc/latest/userguide/vpc-network-acls.html#nacl-ephemeral-ports:
AWS Lambda functions use ports 1024-65535.
You can check the CloudWatch logs for the lambda (from monitoring) to see what it is doing, and if you see things in the logs that seem like something failed and it is retrying it could very well indicate that the calls are being blocked due to security groups or NACLs that don't allow the communication.
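If NACLs turn out to be the culprit, a rule allowing traffic on the full ephemeral range can be added with the AWS CLI. This is only a sketch: the ACL ID, rule number, and CIDR below are placeholders you would replace with your own values.

```shell
# Allow inbound return traffic on Lambda's ephemeral port range (1024-65535).
# acl-0123456789abcdef0, rule number 120, and 10.0.0.0/16 are placeholders.
aws ec2 create-network-acl-entry \
  --network-acl-id acl-0123456789abcdef0 \
  --ingress \
  --rule-number 120 \
  --protocol tcp \
  --port-range From=1024,To=65535 \
  --cidr-block 10.0.0.0/16 \
  --rule-action allow
```

Security groups are stateful, so they only need an outbound rule; NACLs are stateless and need the return path allowed explicitly, which is where the ephemeral range matters.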

Related

Why Does my AWS lambda function randomly fail when using private elasticache network calls as well as external API calls?

I am trying to write a caching function that returns cached ElastiCache data or makes an API call to retrieve it. However, the lambda function seems to be very unreliable and often times out.
It seems that having Redis calls as well as public API calls causes the issue. I can confirm that I have set up AWS correctly, with a subnet with an internet gateway and a private subnet with a NAT gateway. The function works, but only 10% of the time. The remaining times, execution stops right before making the API call.
I have also noticed that the API calls fail after creating the Redis client. If I make the external API call prior to the Redis check, the function is a lot more reliable and doesn't time out.
Not sure what to do. Is it best practice to separate these 2 tasks, or am I doing something wrong?
let data = null;
module.exports.handler = async (event) => {
    //context.callbackWaitsForEmptyEventLoop = false;
    let client;
    try {
        client = new Redis(
            6379,
            "redis://---.---.ng.0001.use1.cache.amazonaws.com"
        );
        client.get(event.token, async (err, result) => {
            if (err) {
                console.error(err);
            } else {
                data = result;
                await client.quit();
            }
        });
        if (data && new Date().getTime() / 1000 - eval(data).timestamp < 30) {
            res.send(`({
                "address": "${token}",
                "price": "${eval(data).price}",
                "timestamp": "${eval(data).timestamp}"
            })`);
        } else {
            getPrice(event); //fetch api data
        }
There are a lot of misunderstandings in your code. I'll try to guide you to fix them and understand how to do it correctly.
You are mixing asynchronous and synchronous code in your function.
You should use JSON.parse instead of eval to parse the data, because eval allows arbitrary code to be executed in your function.
You're using res.send to return the response to the client instead of the callback. Remember, res.send exists only in Express; you're in a lambda, and to return the result to the client you need to use the callback function.
To help you with this task, I completely rewrote your code, solving these misunderstandings.
const Redis = require('ioredis');

module.exports.handler = async (event, context, callback) => {
    // prefer to use lambda env vars instead of putting these directly in the code
    const client = new Redis(
        "REDIS_PORT_ENV",
        "REDIS_HOST_ENV"
    );
    const data = await client.get(event.token);
    await client.quit();
    const parsedData = JSON.parse(data);
    if (parsedData && new Date().getTime() / 1000 - parsedData.timestamp < 30) {
        callback(null, {
            address: event.token,
            price: parsedData.price,
            timestamp: parsedData.timestamp
        });
    } else {
        const dataFromApi = await getPrice(event);
        callback(null, dataFromApi);
    }
};
There is another style for lambdas that returns an object instead of passing an object to the callback, but I think you get the idea and understood your mistakes.
Follow the docs on correct usage of lambda:
https://docs.aws.amazon.com/sdk-for-javascript/v2/developer-guide/using-lambda-functions.html
To understand more about async and sync in JavaScript:
https://www.freecodecamp.org/news/synchronous-vs-asynchronous-in-javascript/
JSON.parse vs. eval: JSON.parse vs. eval()

AWS Node SDK async SNS.publish.promise() does not send SNS message immediately (from Lambda)

I understand that when we use the .promise() method on an AWS request (e.g. SNS.publish), the call should be made immediately, with the response resolving/rejecting the returned promise.
On my tests with SNS.publish the SNS message does not get sent until the promise is awaited.
I am using node within a Lambda function.
Example code:
'use strict';
const SNS = require('aws-sdk/clients/sns');
const client = new SNS();

module.exports.handler = async (event) => {
    try {
        let requestParams = {
            Message: JSON.stringify(event),
            TopicArn: `arn:aws:sns:XXXX`,
        };
        let publishPromise = client.publish(requestParams).promise();

        // pause for an amount of time
        console.log('before pause');
        var start = Date.now();
        var now = start;
        while (now - start < 10000) {
            now = Date.now();
        }
        console.log('after pause');

        await publishPromise;
    } catch (e) {
        throw (e);
    }
};
In my case the SNS triggers an SQS queue that triggers a target Lambda. I am using the target Lambda's CloudWatch logs to judge whether the SNS message is sent as soon as SNS.publish is invoked, or later when the await happens (comparing the target Lambda's log timestamps to the above Lambda's).
The pause section in the above code pauses the script before awaiting the promise, allowing us to see more clearly when the SNS message gets sent. In the above example a 10-second pause is used; any pause period gives me the same results: the target Lambda receives the messages after the await is called, not when SNS.publish is called.
Am I missing something? The SNS request to publish a message should go out immediately when SNS.publish is invoked, not be held back until the await. What I am observing kind of defeats the purpose of async. I must be missing something simple here.
For anyone else here: the behaviour is as expected. publishPromise is a Promise, not a function, so you consume it with a .then.
The code should be
publishPromise.then(
    (result) => console.log("SNS published with", result),
    (err) => console.log("Error publishing SNS", err)
);
But the real issue is that a pending promise does not hold up your Lambda on its own. An async handler that never awaits it sails right past and will probably finish and freeze the execution environment before the SNS is actually published.
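A separate factor worth noting, sketched below with a plain timer rather than the AWS SDK: the busy-wait loop in the question blocks Node's single-threaded event loop, so no I/O callback (including the network activity behind publish) can complete until the synchronous loop yields at the await.

```javascript
// The synchronous busy-wait blocks Node's event loop, so the timer
// callback (a stand-in for the network I/O behind SNS.publish) cannot
// run until the loop yields at the `await`.
async function demo() {
    const order = [];

    // Stand-in for client.publish(params).promise(): async work whose
    // completion needs the event loop.
    const pending = new Promise((resolve) => setTimeout(() => {
        order.push('io-completed');
        resolve();
    }, 0));

    // Synchronous pause, like the one in the question.
    const start = Date.now();
    while (Date.now() - start < 50) { /* spin */ }
    order.push('busy-wait-done');

    await pending; // only now can the timer callback run
    order.push('awaited');
    return order;
}

demo().then((order) => console.log(order.join(' -> ')));
```

Running this prints `busy-wait-done -> io-completed -> awaited`: even though the "I/O" was started first, it can only complete after the synchronous loop ends, which matches the timing observed in the question.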

AWS Cognito lambda triggers twice

I'm using an AWS Lambda function (using Node.js).
Any sign-up request from the app goes to Cognito.
I have set the Pre sign-up trigger to validate the user and check whether the user's custom attribute is available in our database or not. If yes, it returns an error; otherwise it inserts a new record in the DB and returns the event to Cognito.
TimeoutInfo - 5 min.
It happens for some requests, not all the time.
The RequestIds are different (it triggers 3 times sometimes, and twice most of the time).
Lambda trigger code as below.
users/index.js
const handler = async (event, context) => {
    log.info('createUserLambda:start');
    // immediately return once the callback is called to avoid
    // lambda time out because of any open db connections
    context.callbackWaitsForEmptyEventLoop = false;
    return await preUserCreate(event);
};

exports.handler = handler;
users/users.js
export const preUserCreate = async (event) => {
    log.info('preUserCreate:Start');
    let userAttributes = event.request.userAttributes;
    const currentDate = moment().utc().format('YYYY-MM-DD HH:mm:ss');
    try {
        let userParams = {
            'docStatus': 'VRF'
        };
        let docParams = [{
            'docNumber': userAttributes['custom:document_number'] ? userAttributes['custom:document_number'] : '',
            'createdDate': currentDate
        }];
        if (docParams.length && docParams[0].docNumber) {
            let documentExist = await getDocs(docParams[0].docNumber);
            if (documentExist.length) {
                log.info('preUserCreate:Error');
                throw new Error('Document number already exists.');
            }
        }
        let documentRs = await insertDocument(docParams);
        userParams = {
            'did': documentRs[0].id,
            'id': event.userName,
            'createdDate': currentDate,
            'updatedDate': currentDate,
            ...userParams
        };
        let userRs = await insertUser([userParams]);
        if (docParams.length && docParams[0].docNumber) {
            let resultData = await getUserAccountFromAPI(docParams[0].docNumber);
            if (resultData) {
                let formattedData = await formattedAccountsData(resultData, userRs[0].id, documentRs[0].id);
                await insertUserAccounts(formattedData);
            }
        }
        log.info('preUserCreate:Success');
        event.response = {
            'autoConfirmUser': false,
            'autoVerifyPhone': false,
            'autoVerifyEmail': false
        };
        return event;
    } catch (error) {
        log.info('preUserCreate:Error', error);
        throw (error);
    }
};
This likely happens because of the Cognito-imposed execution timeout of 5 seconds for integration Lambdas, which cannot be changed. Also note that the maximum number of times Cognito will (re-)attempt to call the function is 3.
In the Customizing User Pool Workflows with Lambda Triggers section it states that:
Important
Amazon Cognito invokes Lambda functions synchronously. When called, your Lambda function must respond within 5 seconds. If it does not, Amazon Cognito retries the call. After 3 unsuccessful attempts, the function times out. This 5-second timeout value cannot be changed.
Therefore, to reduce the execution time it would be worth considering introducing caching where possible, including caching database connections etc.
Do however note that you have little to no control over how often Lambdas are re-used versus re-launched, and you will need to keep this in mind in terms of warm-up times.
Any chance you are running your lambda in a VPC? I've seen similar behavior with a Cognito trigger that ran in a VPC when it was cold-started. Once the lambda was warm, the problem went away.
My hunch was that internally Cognito has a very short timeout period for executing the trigger, and if the trigger doesn't reply in time, it automatically retries.
We ended up having to add logic to our trigger to test for this scenario so that we weren't duplicating writes to our database.
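The duplicate-write guard described above might look something like this sketch, where `store` is a hypothetical in-memory stand-in for the real database and `event.userName` serves as the idempotency key:

```javascript
// Make the trigger idempotent: if Cognito retries and a record for this
// userName already exists, return it instead of inserting a duplicate.
// `store` is a hypothetical in-memory stand-in for the real database.
const store = new Map();

async function upsertUser(userName, attributes) {
    const existing = store.get(userName);
    if (existing) {
        return existing; // retry detected: skip the duplicate write
    }
    const record = { userName, ...attributes };
    store.set(userName, record);
    return record;
}
```

With a real database the same effect is usually achieved with a unique constraint on the key plus handling of the resulting conflict error.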

Where to Put subscription.pull when want to get the publish data Google Cloud pub/sub NODE JS

So I've tried to develop a pub/sub system based on Node.js.
I am using an Express POST request to publish the data I want. It sends the data well, but my question is where I should put my code for the subscription.
Right now I put the code at the root of my file like this:
pubSub.subscribe()
    .then(({ results, subscription }) => {
        results[0].data.forEach((item) => {
            let key = ['UserId', fakeId(1, 100), 'FeedId', fakeId(100, 200), 'plugin', fakeId(1, 100)];
            upsert(key, item, () => {
                console.log('Sync Success');
                console.log(item);
            }, error => console.error(error));
        });
        subscription.ack(results.map((result) => result.ackId));
    })
    .catch(error => console.error(error));
I have a helper to subscribe like this:
function subscribe() {
    const subscription = pubSub.subscription('plugin_subscription');
    return new Promise((resolve, reject) => {
        return subscription.pull((error, results) => {
            console.log('this got triggered');
            if (error) reject(error);
            resolve({ results, subscription });
        });
    });
}
Well, it kind of only works once. If I publish a message I don't get any response log from the subscriber, but if I restart the Node.js server my log shows that I successfully received the data and could proceed to the next step.
Am I doing something wrong here?
Thanks a lot, man.
A couple ideas:
You're using promises to handle received messages. A promise on its own can only be triggered once - to trigger it multiple times, you'd need some sort of loop or recursive call.
Try using event handlers (see this example) instead of promises - those should trigger every time an event occurs, without any additional looping or recursion. Note that for this example, you'll need to remove the code that removes the messageHandler listener.
Hopefully this helps!

Node Postgres Module not responding

I have an Amazon Beanstalk node app that uses Amazon RDS Postgres. To interface node with Postgres I use node-postgres. The code looks like this:
var pg = require('pg'),
    done, client;

function DataObject(config, success, error) {
    var PG_CONNECT = "postgres://" + config.username + ":" + config.password + "#" +
        config.server + ":" + config.port + "/" + config.database;
    self = this;
    pg.connect(PG_CONNECT, function (_error, client, done) {
        if (_error) { error(); }
        else {
            self.client = client;
            self.done = done;
            success();
        }
    });
}

DataObject.prototype.add_data = function (data, success, error) {
    self = this;
    this.client.query('INSERT INTO sample (data) VALUES ($1,$2)',
        [data], function (_error, result) {
            self.done();
            success();
        });
};
To use it I create my data object and then call add_data every time new data comes along. Within add_data I call 'this/self.done()' to release the connection back to the pool. Now, when I repeatedly make those requests, the client.query callback never comes back. Under what circumstances could this lead to a blocking/non-responding database interface?
The way you are using the pool is incorrect.
You are asking for a connection from the pool in the function DataObject. This function acts as a constructor and is executed once per data object. Thus only one connection is ever requested from the pool.
When we call add_data the first time, the query is executed and the connection is returned to the pool. Thus the subsequent calls are not successful, since the connection has already been returned.
You can verify this by logging _error:
DataObject.prototype.add_data = function (data, success, error) {
    self = this;
    this.client.query('INSERT INTO sample (data) VALUES ($1,$2)',
        [data], function (_error, result) {
            if (_error) console.log(_error); // log the error to console
            self.done();
            success();
        });
};
There are a couple of ways you can do it differently:
Ask for a connection for every query made. You'll need to move the code that asks the pool for a connection into add_data.
Release the client only after performing all queries. This is the trickier way: since calls are made asynchronously, you need to be careful that the client is not shared, i.e. that no new request is made until the client.query callback function is done.
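The failure mode described above can be reproduced without a database. The toy pool below is a hypothetical stand-in for pg's pool (FakePool, checkout, and release are invented names): a client checked out once in the constructor and released after the first query is unusable for later calls, while checking out per query keeps working.

```javascript
// Toy stand-in for a connection pool (names are hypothetical). A client
// released back to the pool must not be used again; checking out a fresh
// client per query, as suggested above, avoids the problem.
class FakePool {
    constructor() { this.clients = [{ inUse: false }]; }
    checkout() {
        const c = this.clients.find((c) => !c.inUse);
        c.inUse = true;
        return c;
    }
    release(c) { c.inUse = false; }
}

function runQuery(pool, client) {
    if (!client.inUse) throw new Error('client already released to pool');
    return 'ok';
}

const pool = new FakePool();

// Pattern from the question: one checkout in the constructor.
const shared = pool.checkout();
runQuery(pool, shared);   // first add_data call succeeds
pool.release(shared);     // done() returns it to the pool
let failed = false;
try {
    runQuery(pool, shared); // second add_data call
} catch (e) {
    failed = true;          // fails: the client was already released
}

// Suggested pattern: checkout per query.
const perQuery = pool.checkout();
const result = runQuery(pool, perQuery);
pool.release(perQuery);

console.log({ failed, result });
```

With the real node-postgres pool, the per-query pattern corresponds to calling pg.connect (and the matching done()) inside add_data itself rather than once in the constructor.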
