I'm trying to run a specific function from an existing app via AWS Lambda, using the JS SDK to invoke Lambda from my node.js app. Since I'm overwriting the existing function, I'll have to keep its basic structure, which is this:
overwrittenFunction = function(params) {
//get some data
return dataArray;
}
...so I need to end up with an array that I can return, if I'm looking to keep the underlying structure of the library I use the same. Now, as far as I know, Lambda invocations are asynchronous, and it's therefore not possible to do something like this:
overwrittenFunction = function(params) {
lambda.invoke(params, callback);
function callback(err,data) {
var dataArray = data;
}
return dataArray;
}
(I've also tried similar things with promises and async/await).
As far as I know, I have two options now: somehow figure out how to do a synchronous Lambda invocation, or modify my library/existing app (which I would rather not do if possible).
Is there any way to do such a thing and somehow return the value I'm expecting?
(I'm using node v8.9.4)
Lambda and async/await are a bit tricky, but the following is working for me (in production):
const lambdaParams = {
FunctionName: 'my-lambda',
// RequestResponse is important here. Without it we won't get the result Payload
InvocationType: 'RequestResponse',
LogType: 'Tail', // other option is 'None'
Payload: {
something: 'anything'
}
};
// Lambda expects the Payload to be stringified JSON
lambdaParams.Payload = JSON.stringify(lambdaParams.Payload);
const lambdaResult = await lambda.invoke(lambdaParams).promise();
logger.debug('Lambda completed, result: ', lambdaResult.Payload);
const resultObject = JSON.parse(lambdaResult.Payload)
Wrap it all up in a try/catch and go to town.
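For reference, here's a minimal sketch of the whole call wrapped in a try/catch (the function name and payload fields are placeholders):

const AWS = require('aws-sdk');
const lambda = new AWS.Lambda();

async function invokeMyLambda() {
  const lambdaParams = {
    FunctionName: 'my-lambda',         // placeholder function name
    InvocationType: 'RequestResponse', // wait for the result Payload
    LogType: 'None',
    Payload: JSON.stringify({ something: 'anything' })
  };

  try {
    const lambdaResult = await lambda.invoke(lambdaParams).promise();
    // the result Payload comes back as a JSON string
    return JSON.parse(lambdaResult.Payload);
  } catch (err) {
    console.error('Lambda invocation failed:', err);
    throw err;
  }
}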
You can use async/await, but as the AWS SDK uses the Node callback pattern you'll need to wrap the function with the built-in promisify.
const { promisify } = require('util'); // note: the built-in module is 'util', not 'utils'
const aws = require('aws-sdk');
const lambda = new aws.Lambda();
// bind so the promisified invoke keeps `lambda` as its `this`
const invoke = promisify(lambda.invoke.bind(lambda));
async function invocation(params) {
try {
return await invoke(params);
} catch (err) {
throw new Error('Somethings up');
}
}
const data = await invocation(params); // must itself be awaited from inside another async function
I am trying to write a caching function that returns cached ElastiCache data or makes an API call to retrieve that data. However, the Lambda function seems to be very unreliable and often times out.
It seems that having Redis calls as well as public API calls is what causes the issue. I can confirm that I have set up AWS correctly, with a subnet with an internet gateway and a private subnet with a NAT gateway. The function works, but only 10% of the time. The remaining times, execution stops right before making the API call.
I have also noticed that the API calls fail after creating the Redis client. If I make the external API call prior to the Redis check, the function is a lot more reliable and doesn't time out.
Not sure what to do. Is it best practice to separate these two tasks, or am I doing something wrong?
let data = null;
module.exports.handler = async (event) => {
//context.callbackWaitsForEmptyEventLoop = false;
let client;
try {
client = new Redis(
6379,
"redis://---.---.ng.0001.use1.cache.amazonaws.com"
);
client.get(event.token, async (err, result) => {
if (err) {
console.error(err);
} else {
data = result;
await client.quit();
}
});
if (data && new Date().getTime() / 1000 - eval(data).timestamp < 30) {
res.send(`({
"address": "${token}",
"price": "${eval(data).price}",
"timestamp": "${eval(data).timestamp}"
})`);
} else {
getPrice(event); //fetch api data
}
There are several misunderstandings in your code. I'll try to guide you through fixing them and help you understand how to do this correctly.
You are mixing asynchronous and synchronous code in your function.
You should use JSON.parse instead of eval to parse the data, because eval allows arbitrary code to be executed in your function.
You're using res.send to return the response to the client instead of the callback. Remember that res.send only exists in Express; in a Lambda, you return the result to the client with the callback function (or by returning a value from an async handler).
To help you with this task, I've completely rewritten your code to resolve these misunderstandings.
const Redis = require('ioredis');
module.exports.handler = async (event, context, callback) => {
// prefer to use lambda env instead of put directly in the code
const client = new Redis(
"REDIS_PORT_ENV",
"REDIS_HOST_ENV"
);
const data = await client.get(event.token);
client.quit();
const parsedData = JSON.parse(data);
if (parsedData && new Date().getTime() / 1000 - parsedData.timestamp < 30) {
callback(null, {
address: event.token,
price: parsedData.price,
timestamp: parsedData.timestamp
});
} else {
const dataFromApi = await getPrice(event);
callback(null, dataFromApi);
}
};
There is another style with Lambdas that returns an object instead of passing an object to the callback, but I think you get the idea and understand the mistakes.
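As a rough sketch of that style (assuming getPrice returns a promise and the Redis connection details come from environment variables):

const Redis = require('ioredis');

module.exports.handler = async (event) => {
  const client = new Redis(process.env.REDIS_PORT, process.env.REDIS_HOST);
  const data = await client.get(event.token);
  await client.quit();

  const parsedData = JSON.parse(data);
  if (parsedData && new Date().getTime() / 1000 - parsedData.timestamp < 30) {
    // whatever the async handler returns becomes the invocation result
    return {
      address: event.token,
      price: parsedData.price,
      timestamp: parsedData.timestamp
    };
  }
  return getPrice(event); // assumed to return a promise with the API data
};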
Follow the docs on the correct usage of Lambda:
https://docs.aws.amazon.com/sdk-for-javascript/v2/developer-guide/using-lambda-functions.html
To understand more about async and sync in JavaScript:
https://www.freecodecamp.org/news/synchronous-vs-asynchronous-in-javascript/
For JSON.parse vs. eval(), see: JSON.parse vs. eval()
It's the first time I'm using async/await, and I've got problems using it in the context of a database request inside a Dialogflow intent. How can I fix my code?
What happens?
When I try to use my backend, this is what I get: "Webhook call failed. Error: Request timeout."
What do I suspect?
My helper function getTextResponse() waits for a return value from Airtable, but never gets one.
What do I want to do?
The "GetDatabaseField-Intent" gets triggered.
Inside, it sends a request to my Airtable database via getTextResponse().
Because I use "await", the function will wait for the result before continuing.
getTextResponse() will return the returnData, so the var result will be filled with returnData.
getTextResponse() has finished, so the response will be created with its return value.
'use strict';
const {
dialogflow
} = require('actions-on-google');
const functions = require('firebase-functions');
const app = dialogflow({debug: true});
const Airtable = require('airtable');
const base = new Airtable({apiKey: 'MyKey'}).base('MyBaseID');
///////////////////////////////
/// Helper function - reading Airtable fields.
const getTextResponse = (mySheet, myRecord) => {
return new Promise((resolve, reject) => {
// Function for airtable
base(mySheet).find(myRecord, (err, returnData) => {
if (err) {
console.error(err);
return;
}
return returnData;
});
  });
};
// Handle the Dialogflow intent.
app.intent('GetDatabaseField-Intent', async (conv) => {
const sheetTrans = "NameOfSheet";
const recordFirst = "ID_OF_RECORD";
var result = await getTextResponse(sheetTrans, recordFirst, (callback) => {
// parse the record => here in the callback
myResponse = callback.fields.en;
});
conv.ask(myResponse);
});
// Set the DialogflowApp object to handle the HTTPS POST request.
exports.dialogflowFirebaseFulfillment = functions.https.onRequest(app);
As #Kolban pointed out, you are not accepting or rejecting the Promise you create in getTextResponse().
It also looks like the var result = await getTextResponse(...) call is incorrect. You have defined getTextResponse() to accept two parameters, but you are passing it three (the first two, plus an anonymous arrow function). But this extra function is never used/referenced.
I would generally avoid mixing explicit promises with async/await and definitely avoid mixing async/await with passing callbacks.
I don't know the details of the API you are using, but if the API already supports promises, then you should be able to do something like this:
const getTextResponse = async (mySheet, myRecord) => {
try {
return await base(mySheet).find(myRecord)
}
catch(err) {
console.error(err);
return;
}
};
...
app.intent('GetDatabaseField-Intent', async (conv) => {
const sheetTrans = "NameOfSheet";
const recordFirst = "ID_OF_RECORD";
var result = await getTextResponse(sheetTrans, recordFirst)
const myResponse = result.fields.en;
conv.ask(myResponse);
});
...
Almost all promise-based libraries or APIs can be used with async/await, as they simply use Promises under the hood. Everything after the await effectively becomes a callback that is called when the awaited method resolves successfully. An unsuccessful resolution throws the rejection as an error, which you handle by use of a try/catch block.
Looking at the code, it appears that you may have a misunderstanding of JavaScript Promises. When you create a Promise, you are passed two functions, called resolve and reject. Within the body of your promise code (i.e. the code that will complete sometime in the future), you must invoke either resolve(returnData) or reject(returnData). If you don't invoke either, your Promise will never be fulfilled. Looking at your logic, you appear to be performing simple returns without invoking resolve or reject.
Let me ask you to Google JavaScript Promises again and study them with respect to the previous comments, and see if that clears up the puzzle.
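To make that concrete, here's a sketch of getTextResponse() that explicitly resolves or rejects, assuming the callback-style find() shown in the question:

const getTextResponse = (mySheet, myRecord) => {
  return new Promise((resolve, reject) => {
    base(mySheet).find(myRecord, (err, returnData) => {
      if (err) {
        reject(err);        // settle the promise with the error
        return;
      }
      resolve(returnData);  // settle the promise with the record
    });
  });
};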
I am trying to use util.promisify to convert the AWS DocumentClient get function into a promise-based utility, but it does not seem to behave as expected:
// This does not work as expected
const docClient = new AWS.DynamoDB.DocumentClient();
let docClientGet = require('util').promisify(docClient.get);
However, when I do the usual promise conversion like this,
let docClientGet = function (params) {
return new Promise((resolve, reject) => {
docClient.get(params, function (err, data) {
if (err) {
return reject(err);
}
return resolve(data);
})
})
};
And use it in an async function like this:
await docClientGet(params);
It works! I wonder where I'm going wrong in my understanding of util.promisify.
If the method you are promisifying needs to be associated with the object it is on (which it looks like it does in your case), then this code:
let docClientGet = utils.promisify(docClient.get);
will not retain the association with the docClient object. What happens is that the promisified docClient.get() gets called without the this value set to the docClient object and it can't do its job properly.
You can work around that with this:
utils.promisify(docClient.get.bind(docClient));
The promisify doc does not make this clear because it uses an example from the fs library whose methods do not need to be associated with the fs object in order to work properly.
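Putting it together, a sketch of the promisified call used from an async function (the table and key names are placeholders):

const AWS = require('aws-sdk');
const { promisify } = require('util');

const docClient = new AWS.DynamoDB.DocumentClient();
// bind so the promisified function keeps docClient as its `this`
const docClientGet = promisify(docClient.get.bind(docClient));

async function loadItem() {
  const data = await docClientGet({
    TableName: 'my-table',   // placeholder
    Key: { id: 'some-id' }   // placeholder
  });
  return data.Item;
}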
P.S. It's a bit unusual to put the util library into a variable named utils. That is likely to confuse some people reading your code.
It seems it is not possible to pass around code (containing both data and functions) from one AWS Lambda function when it is invoked within another AWS Lambda function.
For example, take this customConfigLambda:
var callbackPayload = {};
callbackPayload.fooData = 'fooFromData';
callbackPayload.fooFunction = function() {return 'fooFromFunction'; };
exports.handler = (event, context, callback) => {
callback(null, callbackPayload);
};
When I invoke the previous AWS Lambda function from another AWS Lambda function like this:
var AWS = require('aws-sdk');
AWS.config.update({accessKeyId: '123', secretAccessKey: 'abc', region: 'us-east-1' });
var lambda = new AWS.Lambda({region: 'us-east-1'});
exports.handler = (event, context, callback) => {
var params = {FunctionName: 'customConfigLambda'};
lambda.invoke(params, function(err, callbackPayload) {
if (err) {
// do nothing
}
else {
console.log('callbackPayload:', JSON.stringify(callbackPayload, null, 2));
}
});
};
Then I can see only callbackPayload.fooData but not callbackPayload.fooFunction.
How can I have some callbackPayload.fooFunction(s) shared between multiple other AWS lambda functions?
As of AWS re:Invent 2018, Amazon has introduced Lambda Layers.
Lambda Layers, a way to centrally manage code and data that is shared across multiple functions.
The idea is that now you can put common components in a ZIP file and upload it as a Lambda Layer. Your function code doesn’t need to be changed and can reference the libraries in the layer as it would normally do instead of packaging them separately.
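As a rough sketch of how that looks for a Node.js function (the module name is made up; the nodejs/node_modules layout is what the Lambda runtime adds to the require path):

// layer zip contents (extracted to /opt at runtime):
//   nodejs/node_modules/shared-helpers/index.js
//   which contains, for example:
//     exports.fooFunction = function () { return 'fooFromFunction'; };

// any function with the layer attached can then require it as usual:
const { fooFunction } = require('shared-helpers');

exports.handler = async () => {
  return { message: fooFunction() };
};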
See the docs on Using the Callback Parameter at:
http://docs.aws.amazon.com/lambda/latest/dg/nodejs-prog-model-handler.html#nodejs-prog-model-handler-callback
It says this about the result (the callbackPayload in your code):
result – is an optional parameter that you can use to provide the result of a successful function execution. The result provided must be JSON.stringify compatible. If an error is provided, this parameter is ignored.
To be JSON.stringify compatible you cannot have any functions in there. See http://json.org/ for what counts as valid JSON (only strings, numbers, objects, arrays, true, false and null).
If you want to share code between your AWS Lambda functions in a broad sense, you have to require the same Node module in both of them, so that you can make a common set of functions available to all of your AWS Lambda handlers. But you cannot pass arbitrary code between them, because it will not survive JSON.stringify.
As a test you can try running this in the browser:
var callbackPayload = {};
callbackPayload.fooData = 'fooFromData';
callbackPayload.fooFunction = function() {return 'fooFromFunction'; };
alert(JSON.stringify(callbackPayload));
or this in Node:
var callbackPayload = {};
callbackPayload.fooData = 'fooFromData';
callbackPayload.fooFunction = function() {return 'fooFromFunction'; };
console.log(JSON.stringify(callbackPayload));
and see the result:
{"fooData":"fooFromData"}
The function is stripped out during the serialization process.
Of course you could do something like this:
callbackPayload.fooFunction = function() { return 'fooFromFunction'; }.toString();
and get a JSON result:
{"fooData":"fooFromData","fooFunction":"function () {return 'fooFromFunction'; }"}
which you could theoretically eval on the other end but I wouldn't really recommend it.
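If you do want shared behaviour, a rough sketch of the shared-module approach mentioned above (file and handler names are made up):

// common/helpers.js, packaged into each function's deployment zip:
//   exports.fooFunction = function () { return 'fooFromFunction'; };

// every handler that bundles the module can then do:
const helpers = require('./common/helpers');

exports.handler = (event, context, callback) => {
  callback(null, { fooData: 'fooFromData', foo: helpers.fooFunction() });
};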
I'm coming from a Java background, so I'm a bit of a newbie on the JavaScript conventions needed for Lambda.
I've got a lambda function which is meant to do several AWS tasks in a particular order, depending on the result of the previous task.
Given that each task reports its results asynchronously, I'm wondering what the right way is to make sure they all happen in the right sequence, with the results of one operation available to the invocation of the next.
It seems like I have to invoke each function in the callback of the prior function, but it seems like that will produce some kind of deep nesting, and I'm wondering if that is the proper way to do this.
For example, one of these functions requires a DynamoDB getItem, followed by a call to SNS to get an endpoint, followed by an SNS call to send a message, followed by a DynamoDB write.
What's the right way to do that in Lambda JavaScript, accounting for all that asynchronicity?
I like the answer from @jonathanbaraldi, but I think it would be better to manage control flow with Promises. The Q library has some convenience functions like nbind which help convert Node-style callback APIs like the aws-sdk into promises.
So in this example I'll send an email, and then as soon as the email response comes back I'll send a second email. This is essentially what was asked: calling multiple services in sequence. I'm using the then method of promises to manage that in a vertically readable way, and catch to handle errors. I think it reads much better than simply nesting callback functions.
var Q = require('q');
var AWS = require('aws-sdk');
AWS.config.credentials = { "accessKeyId": "AAAA","secretAccessKey": "BBBB"};
AWS.config.region = 'us-east-1';
// Use a promised version of sendEmail
var ses = new AWS.SES({apiVersion: '2010-12-01'});
var sendEmail = Q.nbind(ses.sendEmail, ses);
exports.handler = function(event, context) {
console.log(event.nome);
console.log(event.email);
console.log(event.mensagem);
var nome = event.nome;
var email = event.email;
var mensagem = event.mensagem;
var to = ['email#company.com.br'];
var from = 'site#company.com.br';
// Send email
mensagem = ""+nome+"||"+email+"||"+mensagem+"";
console.log(mensagem);
var params = {
Source: from,
Destination: { ToAddresses: to },
Message: {
Subject: {
Data: 'Form contact our Site'
},
Body: {
Text: {
Data: mensagem,
}
}
}
};
// Here is the white-meat of the program right here.
sendEmail(params)
.then(sendAnotherEmail)
.then(success)
.catch(logErrors);
function sendAnotherEmail(data) {
console.log("FIRST EMAIL SENT="+data);
// send a second one.
return sendEmail(params);
}
function logErrors(err) {
console.log("ERROR="+err, err.stack);
context.done();
}
function success(data) {
console.log("SECOND EMAIL SENT="+data);
context.done();
}
}
Short answer:
Use async/await, and call the AWS service (SNS for example) with .promise() on the end to tell aws-sdk to use the promisified version of that service function instead of the callback-based version.
Since you want to execute them in a specific order, you can use async/await, assuming that the parent function you are calling them from is itself async.
For example:
let snsResult = await sns.publish({
Message: snsPayload,
MessageStructure: 'json',
TargetArn: endPointArn
}, async function (err, data) {
if (err) {
console.log("SNS Push Failed:");
console.log(err.stack);
return;
}
console.log('SNS push succeeded: ' + data);
return data;
}).promise();
The important part is the .promise() on the end there. Full docs on using aws-sdk in an async / promise based manner can be found here: https://docs.aws.amazon.com/sdk-for-javascript/v2/developer-guide/using-promises.html
In order to run another aws-sdk task you would similarly add await and the .promise() extension to that function (assuming that is available).
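Applied to the sequence from the original question (DynamoDB get, SNS endpoint lookup, SNS publish, DynamoDB write), a rough sketch with placeholder table, key and ARN values might look like this:

const AWS = require('aws-sdk');
const docClient = new AWS.DynamoDB.DocumentClient();
const sns = new AWS.SNS();

exports.handler = async (event) => {
  // 1. read the item
  const getResult = await docClient.get({
    TableName: 'my-table',                      // placeholder
    Key: { id: event.id }
  }).promise();
  const item = getResult.Item;

  // 2. look up / create the platform endpoint for the device
  const endpoint = await sns.createPlatformEndpoint({
    PlatformApplicationArn: 'arn:aws:sns:...',  // placeholder
    Token: item.deviceToken
  }).promise();

  // 3. send the message to that endpoint
  await sns.publish({
    TargetArn: endpoint.EndpointArn,
    MessageStructure: 'json',
    Message: JSON.stringify({ default: 'hello from Lambda' })
  }).promise();

  // 4. record that the message was sent
  await docClient.put({
    TableName: 'my-table',
    Item: Object.assign({}, item, { notifiedAt: Date.now() })
  }).promise();

  return { ok: true };
};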
For anyone who runs into this thread and is actually looking to simply push promises to an array and wait for that WHOLE array to finish (without regard to which promise executes first), I ended up with something like this:
let snsPromises = [] // declare array to hold promises
let snsResult = await sns.publish({
Message: snsPayload,
MessageStructure: 'json',
TargetArn: endPointArn
}, async function (err, data) {
if (err) {
console.log("Search Push Failed:");
console.log(err.stack);
return;
}
console.log('Search push succeeded: ' + data);
return data;
}).promise();
snsPromises.push(snsResult)
await Promise.all(snsPromises)
Hope that helps someone that randomly stumbles on this via google like I did!
I don't know Lambda but you should look into the node async library as a way to sequence asynchronous functions.
async has made my life a lot easier and my code much more orderly without the deep nesting issue you mentioned in your question.
Typical async code might look like:
async.waterfall([
  function doTheFirstThing(callback) {
    db.somecollection.find({}).toArray(callback);
  },
  function useresult(dbFindResult, callback) {
    // do some other stuff (could be sync or async)
    // etc etc etc
    callback(null);
  }
],
function (err) {
  // this last function runs any time a callback reports an error, or,
  // if there is no error, when the last function in the array above invokes callback.
  if (err) { sendForTheCodeDoctor(); }
});
Have a look at the async documentation. There are many useful functions for serial, parallel, waterfall, and many more. async is actively maintained and seems very reliable.
good luck!
A very specific solution that comes to mind is cascading Lambda calls. For example, you could write:
A Lambda function gets something from DynamoDB, then invokes…
…a Lambda function that calls SNS to get an endpoint, then invokes…
…a Lambda function that sends a message through SNS, then invokes…
…a Lambda function that writes to DynamoDB
All of those functions take the output from the previous function as input. This is of course very fine-grained, and you might decide to group certain calls. Doing it this way avoids callback hell in your JS code at least.
(As a side note, I'm not sure how well DynamoDB integrates with Lambda. AWS might emit change events for records that can then be processed through Lambda.)
Just saw this old thread. Note that newer versions of JS improve on this. Take a look at the ES2017 async/await syntax, which streamlines an async nested-callback mess into clean, synchronous-looking code.
There are also some transpilers and polyfills that can provide this functionality today, based on the ES2017 syntax.
As a last FYI, AWS Lambda now supports .NET Core, which provides this clean async syntax out of the box.
I would like to offer the following solution, which simply creates a nested function structure.
// start with the last action
var next = function() { context.succeed(); };
// for every new function, pass it the old one
next = (function(param1, param2, next) {
return function() { serviceCall(param1, param2, next); };
})("x", "y", next);
What this does is copy all of the variables for the function call you want to make, then nest them inside the previous call. You'll want to schedule your events backwards. This is really just the same as making a pyramid of callbacks, but it works when you don't know the structure or quantity of the function calls ahead of time. You have to wrap the function in a closure so that the correct values are copied over.
In this way I am able to sequence AWS service calls such that they go 1-2-3 and end with closing the context. Presumably you could also structure it as a stack instead of this pseudo-recursion.
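For example, to chain three calls so they run first, then second, then third (serviceCall stands in for whatever callback-style AWS call you are wrapping):

// start with the last action
var next = function() { context.succeed(); };

// schedule in reverse order: third, then second, then first
next = (function(param1, param2, next) {
  return function() { serviceCall(param1, param2, next); };
})("third", "z", next);

next = (function(param1, param2, next) {
  return function() { serviceCall(param1, param2, next); };
})("second", "y", next);

next = (function(param1, param2, next) {
  return function() { serviceCall(param1, param2, next); };
})("first", "x", next);

// kick off the chain
next();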
I found this article, which seems to have the answer in native JavaScript:
Five patterns to help you tame asynchronous JavaScript.
By default, JavaScript is asynchronous.
So you don't have to use those libraries; you can, but there are simpler ways to solve this. In this code, I send the email with the data that comes from the event; if you want more steps, you just need to add more functions inside functions.
What is important is where your context.done(); goes, since it ends your Lambda function. You need to put it at the end of the last callback.
var AWS = require('aws-sdk');
AWS.config.credentials = { "accessKeyId": "AAAA","secretAccessKey": "BBBB"};
AWS.config.region = 'us-east-1';
var ses = new AWS.SES({apiVersion: '2010-12-01'});
exports.handler = function(event, context) {
console.log(event.nome);
console.log(event.email);
console.log(event.mensagem);
var nome = event.nome;
var email = event.email;
var mensagem = event.mensagem;
var to = ['email#company.com.br'];
var from = 'site#company.com.br';
// Send email
mensagem = ""+nome+"||"+email+"||"+mensagem+"";
console.log(mensagem);
ses.sendEmail( {
Source: from,
Destination: { ToAddresses: to },
Message: {
Subject: {
Data: 'Form contact our Site'
},
Body: {
Text: {
Data: mensagem,
}
}
}
},
function(err, data) {
if (err) {
console.log("ERROR="+err, err.stack);
context.done();
} else {
console.log("EMAIL SENT="+data);
context.done();
}
});
}