I need to retrieve an IoT certificate's status using a Lambda function with Node.js.
Following the documentation, I should use describeCertificate() to accomplish this.
This is the code I have so far (used for testing):
const AWS = require('aws-sdk')
const iot = new AWS.Iot()

let cert = {}

async function descCert (params) {
  console.log("start descCert")
  console.log("params")
  console.log(params)
  await iot.describeCertificate(params, function(err, data) {
    console.log('describeCertificate - Fn')
    if (err) {
      console.log('describeCertificate - Error')
      console.log(err, err.stack)
    } else {
      console.log('describeCertificate - data')
      cert = data
      console.log(data)
    }
    console.log("end describeCertificate - Fn")
  })
  console.log("end descCert")
}

module.exports.testFn = async (event, context, callback) => {
  var zzz = {
    certificateId: 'xxxx8c0891f8xxxxxx'
  }
  await descCert(zzz)
  console.log("after descCert")
  console.log(cert)
  ...
}
My guess is that the callback passed to await iot.describeCertificate( ... is never reached, since I can't see its logs in CloudWatch.
I expected to receive this sequence:
(1) start descCert
(2) params
(3) {certificateId: 'xxxx8c0891f8xxxxxx'}
(4) describeCertificate - Fn
(5) describeCertificate - Error or describeCertificate - data
(6) actual data response
(7) end describeCertificate - Fn
(8) end descCert
(9) after descCert
(10) actual data response
But this is what I am getting:
(1) start descCert
(2) params
(3) {certificateId: 'xxxx8c0891f8xxxxxx'}
(8) end descCert
(9) after descCert
(10) actual data response // {}
I can't see steps 4-7 in the CloudWatch INFO logs, so my conclusion is that the callback function is not being called.
What am I missing?
For accessing the IoT Core service (or any other service), you must give the Lambda function the corresponding access rights to that service. To do that, go to IAM -> Roles and add the appropriate policy to the role attached to your Lambda function.
For example, you may add an inline policy like this to the corresponding Lambda role:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VisualEditor0",
      "Effect": "Allow",
      "Action": "iot:DescribeCertificate",
      "Resource": "arn:aws:iot:*:*:cert/*"
    }
  ]
}
It is better to narrow Resource down to the ARN of your specific certificate. Note also that the SDK call has to be converted to a promise with .promise() before it can be awaited; awaiting the request object returned by the callback form (as in your code) resolves before the callback ever fires, which is why steps 4-7 never appear:
var AWS = require('aws-sdk');

async function get_certificate_status(iot, params) {
  try {
    const data = await iot.describeCertificate(params).promise();
    return data.certificateDescription.status;
  } catch (e) {
    throw new Error(e.message);
  }
}

exports.handler = async function(event, context, callback) {
  var iot = new AWS.Iot({ 'region': <region>, apiVersion: '2015-05-28' });
  var params = { certificateId: <certificateId> };
  var cert_status = await get_certificate_status(iot, params);
  console.log("STATUS: " + cert_status);
}
I am trying to invoke a "child" Lambda from a "parent" Lambda.
The example code is very simple, as shown below (I am using the Serverless Framework).
child_lambda
const mainHandler = async (event, context) => {
  console.log('event: ', JSON.stringify(event));
  return context.functionName;
};
export const handler = mainHandler;
parent_lambda
import AWS from 'aws-sdk';
const lambda = new AWS.Lambda();
const invokeLambda = async () => {
  let sampleData = { number1: 1, number2: 2 };
  let params = {
    FunctionName: 'child_lambda',
    Payload: JSON.stringify(sampleData),
    Qualifier: '1'
  };
  try {
    await lambda.invoke(params).promise();
    return true;
  } catch (e) {
    console.log('invokeLambda :: Error: ' + e);
  }
};

const mainHandler = async (event, context) => {
  console.log('event: ', JSON.stringify(event));
  await invokeLambda();
  return context.functionName;
};
export const handler = mainHandler;
serverless.yml
parent_lambda:
  handler: handlers/lambda/parent_lambda.handler
  name: dev_parent_lambda
  iamRoleStatements:
    - Effect: "Allow"
      Action:
        - lambda:InvokeFunction
        - lambda:InvokeAsync
      Resource: "*"
  events:
    - http:
        path: test/invokeLambda
        method: GET

child_lambda:
  handler: handlers/lambda/child_lambda.handler
  name: dev_child_lambda
I run the parent from Postman and the result is
ResourceNotFoundException: Function not found:
arn:aws:lambda:xxxx:xxxxx:function:dev_child_lambda
I tried triggering child_lambda from an S3 event and it worked fine, but it never works when invoked through the AWS SDK.
Any suggestion is appreciated.
From the comments: the code given in the question is fine except for the Qualifier parameter.
Qualifier is used to
Specify a version or alias to invoke a published version of the
function.
In this case, the Lambda is not versioned, so we just need to remove Qualifier:
const invokeLambda = async () => {
  let sampleData = { number1: 1, number2: 2 };
  let params = {
    FunctionName: 'child_lambda',
    Payload: JSON.stringify(sampleData)
  };
  try {
    await lambda.invoke(params).promise();
    return true;
  } catch (e) {
    console.log('invokeLambda :: Error: ' + e);
  }
};
Lambda Asynchronous invocation
Amazon Simple Storage Service (Amazon S3) invokes functions asynchronously to process events. When you invoke a function asynchronously, you don't wait for a response from the function code. You hand off the event to Lambda and Lambda handles the rest.
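For reference, the same fire-and-forget behaviour can also be triggered directly from code by setting InvocationType to 'Event'; a minimal sketch (the function name and payload are placeholders):

const AWS = require('aws-sdk');
const lambda = new AWS.Lambda();

// Asynchronous ("Event") invocation: Lambda queues the event and returns
// immediately with StatusCode 202, so no return value from child_lambda
// is available to the caller.
const invokeAsync = async () => {
  const params = {
    FunctionName: 'child_lambda',                       // placeholder name
    InvocationType: 'Event',                            // asynchronous invocation
    Payload: JSON.stringify({ number1: 1, number2: 2 }) // placeholder payload
  };
  const data = await lambda.invoke(params).promise();
  console.log('StatusCode:', data.StatusCode);          // expect 202 for 'Event'
};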
In that case, I would simply chain the Lambdas using AWS Lambda destinations (see the sketch after the links below). Supported destinations:
Amazon SQS – sqs:SendMessage
Amazon SNS – sns:Publish
Lambda – lambda:InvokeFunction
EventBridge – events:PutEvents
Configuring destinations for asynchronous invocation
Introducing AWS Lambda Destinations
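As an illustration, a destination can also be attached from code rather than from the console; a minimal sketch with the Node SDK (function names and ARNs are placeholders, and the parent function's execution role still needs the matching permission from the list above):

const AWS = require('aws-sdk');
const lambda = new AWS.Lambda();

// Route the outcome of asynchronous invocations of parent_lambda:
// successful results go on to child_lambda, failures to an SQS queue.
const configureDestinations = async () => {
  await lambda.putFunctionEventInvokeConfig({
    FunctionName: 'parent_lambda',                      // placeholder
    DestinationConfig: {
      OnSuccess: { Destination: 'arn:aws:lambda:REGION:ACCOUNT_ID:function:child_lambda' },
      OnFailure: { Destination: 'arn:aws:sqs:REGION:ACCOUNT_ID:fail-queue' }
    }
  }).promise();
};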
I'm trying to set an IAM policy using projects.locations.functions.setIamPolicy on a function that I created and deployed, but I keep receiving the following error:
GaxiosError: Permission 'cloudfunctions.functions.setIamPolicy' denied on resource '
Here is my code in Node.js:
const { auth } = require('google-auth-library');
const { google } = require('googleapis');
const fs = require('fs');
const path = require('path');
function getCredentials() {
  const filePath = path.join(__dirname, 'mykeyfile.json');
  console.log(filePath);
  if (fs.existsSync(filePath)) {
    let rawdata = fs.readFileSync(filePath);
    let jsonData = JSON.parse(rawdata);
    return jsonData;
  }
  return null;
}

async function setPolicy() {
  try {
    const credentials = getCredentials();
    if (!credentials)
      return;
    const client = auth.fromJSON(credentials);
    client.scopes = ['https://www.googleapis.com/auth/cloud-platform'];
    const cloudfunctions = await google.cloudfunctions({
      version: 'v1',
      auth: client
    });
    const request = {
      // REQUIRED: The resource for which the policy is being specified.
      // See the operation documentation for the appropriate value for this field.
      resource_: "projects/{myprorjectid}/locations/{zone}/functions/function-1",
      resource: {
        "policy": {
          "etag": "BwWP3fXnMuQ=",
          "version": 1,
          "bindings": [
            {
              "members": [
                "allUsers"
              ],
              "role": "roles/cloudfunctions.invoker"
            }
          ]
        }
      },
      auth: client,
    };
    const response = (await cloudfunctions.projects.locations.functions.setIamPolicy(request)).data;
    console.log(JSON.stringify(response, null, 2));
  } catch (err) {
    console.log(err);
  }
}
setPolicy();
Any help is much appreciated. I'm not interested in command-line options; it must be done in Node.
It seems that the service account you're using in your code doesn't have the right permissions. The error is similar to the one described in this doc, and the solution is to add the Project Owner or Cloud Functions Admin role to your service account, as both contain the cloudfunctions.functions.setIamPolicy permission.
I'm trying to mock SES with Sinon, but I'm facing the error below. I tried using aws-sdk-mock, but it's not working.
Error: TypeError: Cannot stub non-existent own property sendEmail
Code snippet of test class:
import * as AWS from 'aws-sdk';
const sandbox = sinon.createSandbox();
sandbox.stub(AWS.SES, 'sendEmail').returns({promise: () => true});
Actual class:
import * as AWS from 'aws-sdk';
import * as _ from 'lodash';
export async function sendAlertMailOnFailure(status: any) {
  // load AWS SES
  var ses = new AWS.SES();
  const params = {
    Destination: {
      ToAddresses: <to_address>
    },
    Message: {...},
    Source: <sender_address>
  }
  ses.sendEmail(params, (err, data) => {
    if (err) {
      log.error("Error sending mail::");
      log.error(err, err.stack);
    }
  })
}
Is there any way to mock SES with Sinon or with aws-sdk-mock?
My answer here is not a direct solution for SES, but it is a working solution I'm using for mocking DynamoDB.DocumentClient and SQS. Perhaps you can adapt my working example for SES and other aws-sdk clients in your unit tests.
I just spent hours trying to get AWS SQS mocking working, without resorting to the aws-sdk-mock requirement of importing aws-sdk clients inside a function.
The mocking for AWS.DynamoDB.DocumentClient was pretty easy, but the AWS.SQS mocking had me stumped until I came across the suggestion to use rewire.
My Lambda moves bad messages to an SQS FailQueue (rather than letting the Lambda fail and return the message to the regular queue for retries, and then to the DeadLetterQueue after maxRetries). The unit tests needed to mock the following SQS methods:
SQS.getQueueUrl
SQS.sendMessage
SQS.deleteMessage
I'll try to keep this example code as concise as I can while still including all the relevant parts:
Snippet of my AWS Lambda (index.js):
const AWS = require('aws-sdk');
AWS.config.update({region:'eu-west-1'});
const docClient = new AWS.DynamoDB.DocumentClient();
const sqs = new AWS.SQS({ apiVersion: '2012-11-05' });
// ...snip
Abridged Lambda event records (event.json)
{
  "valid": {
    "Records": [{
      "messageId": "c292410d-3b27-49ae-8e1f-0eb155f0710b",
      "receiptHandle": "AQEBz5JUoLYsn4dstTAxP7/IF9+T1S994n3FLkMvMmAh1Ut/Elpc0tbNZSaCPYDvP+mBBecVWmAM88SgW7iI8T65Blz3cXshP3keWzCgLCnmkwGvDHBYFVccm93yuMe0i5W02jX0s1LJuNVYI1aVtyz19IbzlVksp+z2RxAX6zMhcTy3VzusIZ6aDORW6yYppIYtKuB2G4Ftf8SE4XPzXo5RCdYirja1aMuh9DluEtSIW+lgDQcHbhIZeJx0eC09KQGJSF2uKk2BqTGvQrknw0EvjNEl6Jv56lWKyFT78K3TLBy2XdGFKQTsSALBNtlwFd8ZzcJoMaUFpbJVkzuLDST1y4nKQi7MK58JMsZ4ujZJnYvKFvgtc6YfWgsEuV0QSL9U5FradtXg4EnaBOnGVTFrbE18DoEuvUUiO7ZQPO9auS4=",
      "body": "{ \"key1\": \"value 1\", \"key2\": \"value 2\", \"key3\": \"value 3\", \"key4\": \"value 4\", \"key5\": \"value 5\" }",
      "attributes": {
        "ApproximateReceiveCount": "1",
        "SentTimestamp": "1536763724607",
        "SenderId": "AROAJAAXYIAN46PWMV46S:steve.goossens@bbc.co.uk",
        "ApproximateFirstReceiveTimestamp": "1536763724618"
      },
      "messageAttributes": {},
      "md5OfBody": "e5b16f3a468e6547785a3454cfb33293",
      "eventSource": "aws:sqs",
      "eventSourceARN": "arn:aws:sqs:eu-west-1:123456789012:sqs-queue-name",
      "awsRegion": "eu-west-1"
    }]
  }
}
Abridged unit test file (test/index.test.js):
const AWS = require('aws-sdk');
const expect = require('chai').expect;
const LamdbaTester = require('lambda-tester');
const rewire = require('rewire');
const sinon = require('sinon');
const event = require('./event');
const lambda = rewire('../index');
let sinonSandbox;
function mockGoodSqsMove() {
  const promiseStubSqs = sinonSandbox.stub().resolves({});
  const sqsMock = {
    getQueueUrl: () => ({ promise: sinonSandbox.stub().resolves({ QueueUrl: 'queue-url' }) }),
    sendMessage: () => ({ promise: promiseStubSqs }),
    deleteMessage: () => ({ promise: promiseStubSqs })
  }
  lambda.__set__('sqs', sqsMock);
}

describe('handler', function () {
  beforeEach(() => {
    sinonSandbox = sinon.createSandbox();
  });

  afterEach(() => {
    sinonSandbox.restore();
  });

  describe('when SQS message is in dedupe cache', function () {
    beforeEach(() => {
      // mock SQS
      mockGoodSqsMove();

      // mock DynamoDBClient
      const promiseStub = sinonSandbox.stub().resolves({'Item': 'something'});
      sinonSandbox.stub(AWS.DynamoDB.DocumentClient.prototype, 'get').returns({ promise: promiseStub });
    });

    it('should return an error for a duplicate message', function () {
      return LamdbaTester(lambda.handler)
        .event(event.valid)
        .expectReject((err, additional) => {
          expect(err).to.have.property('message', 'Duplicate message: {"Item":"something"}');
        });
    });
  });
});
You need to use prototype in AWS to stub it:
import AWS from 'aws-sdk';

const sandbox = sinon.createSandbox();
sandbox.stub(AWS.prototype, 'SES').returns({
  sendEmail: () => {
    return true;
  }
});
The error seems to indicate that AWS is being imported as undefined.
It might be that your ES6 compiler isn't automatically turning this line:
import AWS from 'aws-sdk';
...into an import of everything in aws-sdk into AWS.
Change it to this:
import * as AWS from 'aws-sdk';
...and that may fix the issue.
(Disclaimer: I can't reproduce the error in my environment which is compiling with Babel v7 and automatically handles either approach)
Using require and without using prototype, this is working for me for mocking DynamoDB:
const aws = require('aws-sdk');
const sinon = require('sinon');

const sandbox = sinon.createSandbox();
this.awsStub = sandbox.stub(aws, 'DynamoDB').returns({
  query: function() {
    return {
      promise: function() {
        return {
          Items: []
        };
      }
    };
  }
});
Packages:
"aws-sdk": "^2.453.0"
"sinon": "^7.3.2"
I was able to use aws-sdk-mock by doing the following:
test class
const AWSMock = require('aws-sdk-mock');
const AWS = require('aws-sdk');

AWSMock.setSDKInstance(AWS);
...
AWSMock.mock('SES', 'sendRawEmail', mockSendEmail);
// call the method that needs to send an email below
sendEmail(to, from, subject, body, callback);

function mockSendEmail(params, callback) {
  console.log('mock email');
  return callback({
    MessageId: '1234567',
  });
}
Actual class
const aws = require('aws-sdk');
const nodemailer = require('nodemailer');

function sendEmail(to, from, subject, body, callback) {
  let addresses = to;
  if (!Array.isArray(addresses)) {
    addresses = [addresses];
  }
  let replyTo = [];
  if (from) {
    replyTo.push(from);
  }
  let data = {
    to: addresses,
    replyTo,
    subject,
    text: body,
  };
  nodemailer.createTransport({ SES: new aws.SES({ apiVersion: '2010-12-01' }) }).sendMail(data, callback);
}
const AWS = require('aws-sdk');
...
const sandbox = sinon.createSandbox();
sandbox.stub(AWS, 'SES').returns({
  sendRawEmail: () => {
    console.log("My sendRawEmail");
    return {
      promise: function () {
        return {
          MessageId: '987654321'
        };
      }
    };
  }
});

let ses = new AWS.SES({ region: 'us-east-1' });
let result = ses.sendRawEmail(params).promise();
I am trying to invoke a Lambda through a CloudFront viewer request. Here is my Lambda code:
'use strict';

const AWS = require("aws-sdk");
const docClient = new AWS.DynamoDB.DocumentClient();

exports.handler = (event, context, callback) => {
  /* Get request */
  const request = event.Records[0].cf.request;
  const requestbody = Buffer.from(request.body.data, 'base64').toString();
  const data = JSON.parse(requestbody);
  const Id = data.Name;
  console.log(Id);

  /* Generate body for response */
  const body =
    '<html>\n'
    + '<head><title>Hello From Lambda@Edge</title></head>\n'
    + '<body>\n'
    + '<h1>You clicked more than 10 Times </h1>\n'
    + '</body>\n'
    + '</html>';

  var params = {
    TableName: "Test",
    ProjectionExpression: "#V,#N",
    KeyConditionExpression: "#N = :v1",
    ExpressionAttributeNames: {
      "#N": "Name",
      "#V": "Value"
    },
    ExpressionAttributeValues: {
      ":v1": Id
    }
  };

  var querydb = docClient.query(params).promise();

  querydb.then(function(data) {
    console.log(data.Items[0].Value);
    if (data.Items[0].Value >= 11) {
      const response = {
        status: '200',
        body: body,
      };
      callback(null, response);
    } else {
      callback(null, request);
    }
  }).catch(function(err) {
    console.log(err);
  });
};
When I trigger the same Lambda through the console, it gives the correct response, but when deployed through CloudFront it gives a 503 error. I had tried the same code without the DynamoDB client and it worked perfectly fine. Here is the working one:
'use strict';

const AWS = require("aws-sdk");
const docClient = new AWS.DynamoDB.DocumentClient();

exports.handler = (event, context, callback) => {
  /* Get request */
  const request = event.Records[0].cf.request;
  const requestbody = Buffer.from(request.body.data, 'base64').toString();
  const data = JSON.parse(requestbody);

  /* Generate body for response */
  const body =
    '<html>\n'
    + '<head><title>Hello From Lambda@Edge</title></head>\n'
    + '<body>\n'
    + '<h1>You clicked more than 10 Times </h1>\n'
    + '</body>\n'
    + '</html>';

  if (data.Value >= 10) {
    const response = {
      status: '200',
      body: body,
    };
    callback(null, response);
  } else {
    callback(null, request);
  }
};
I have given full DynamoDB permissions to the Lambda@Edge function.
Any help is appreciated.
Thanks
Where have you specified the region for DynamoDB?
It is possible that Lambda@Edge is executing in a region where your DDB table is missing.
Have a look at the AWS doc on the region order of precedence. You can also look at this Lambda@Edge workshop code and documentation for more details on calling DDB.
On a side note: a viewer-facing Lambda function making a call to a cross-region DynamoDB table will hurt your latency. I'm not sure about your use case, but see if it is possible to move this call to an origin-facing event or to make the DDB call asynchronously.
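If the table only exists in one region, a simple option is to pin the client to that region explicitly, so every Lambda@Edge replica queries the same table regardless of where it runs; a minimal sketch (the region is a placeholder for wherever the Test table lives):

const AWS = require('aws-sdk');

// Pin the DocumentClient to the region that actually holds the "Test" table,
// instead of relying on whichever region the Lambda@Edge replica executes in.
const docClient = new AWS.DynamoDB.DocumentClient({ region: 'us-east-1' }); // placeholder region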
I have the following function which I use to invoke a Lambda function from within my code.
However, when I try to use it within a Lambda function, I get the following error:
AWS lambda undefined 0.27s 3 retries] invoke({ FunctionName: 'my-function-name',
InvocationType: 'RequestResponse',
LogType: 'Tail',
Payload: <Buffer > })
How can I invoke a Lambda function from within a Lambda function?
My function:
'use strict';

var AWS = require("aws-sdk");
var lambda = new AWS.Lambda({
  apiVersion: '2015-03-31',
  endpoint: 'https://lambda.' + process.env.DYNAMODB_REGION + '.amazonaws.com',
  logger: console
});

var lambdaHandler = {};

// #var payload - type:string
// #var functionName - type:string
lambdaHandler.invokeFunction = function (payload, functionName, callback) {
  var params = {
    FunctionName: functionName, /* required */
    InvocationType: "RequestResponse",
    LogType: "Tail",
    Payload: new Buffer(payload, 'utf8')
  };

  var lambdaRequestObj = lambda.invoke(params);

  lambdaRequestObj.on('success', function(response) {
    console.log(response.data);
  });

  lambdaRequestObj.on('error', function(response) {
    console.log(response.error.message);
  });

  lambdaRequestObj.on('complete', function(response) {
    console.log('Complete');
  });

  lambdaRequestObj.send();

  callback();
};

module.exports = lambdaHandler;
Invoking a Lambda Function from within another Lambda function is quite simple using the aws-sdk which is available in every Lambda.
I suggest starting with something simple first.
This is the "Hello World" of intra-lambda invocation:
Lambda_A invokes Lambda_B
with a Payload containing a single parameter name:'Alex'.
Lambda_B responds with Payload: "Hello Alex".
First create Lambda_B which expects a name property
on the event parameter
and responds to request with "Hello "+event.name:
Lambda_B
exports.handler = function(event, context) {
  console.log('Lambda B Received event:', JSON.stringify(event, null, 2));
  context.succeed('Hello ' + event.name);
};
Ensure that you give Lambda_B and Lambda_A the same role, e.g. create a role called lambdaexecute which has AWSLambdaRole, AWSLambdaExecute and AWSLambdaBasicExecutionRole attached (all three are required).
Lambda_A
var AWS = require('aws-sdk');
AWS.config.region = 'eu-west-1';
var lambda = new AWS.Lambda();

exports.handler = function(event, context) {
  var params = {
    FunctionName: 'Lambda_B', // the lambda function we are going to invoke
    InvocationType: 'RequestResponse',
    LogType: 'Tail',
    Payload: '{ "name" : "Alex" }'
  };

  lambda.invoke(params, function(err, data) {
    if (err) {
      context.fail(err);
    } else {
      context.succeed('Lambda_B said ' + data.Payload);
    }
  })
};
Once you have saved both of these Lambda functions, test-run Lambda_A.
Once you have the basic intra-Lambda invocation working, you can easily extend it to invoke more elaborate Lambda functions.
The main thing you have to remember is to set the appropriate IAM role for all functions.
As of Dec 3, 2016, you can simply use AWS Step Functions to run Lambda_B as the sequential step after Lambda_A.
With AWS Step Functions, you define your application as a state
machine, a series of steps that together capture the behavior of the
app. States in the state machine may be tasks, sequential steps,
parallel steps, branching paths (choice), and/or timers (wait). Tasks
are units of work, and this work may be performed by AWS Lambda
functions, Amazon EC2 instances of any type, containers, or on
premises servers—anything that can communicate with the Step Functions
API may be assigned a task.
So the following state machine should meet your need. Here is the code corresponding to the state machine:
{
  "Comment": "A simple example of the Amazon States Language using an AWS Lambda Function",
  "StartAt": "Lambda_A",
  "States": {
    "Lambda_A": {
      "Type": "Task",
      "Resource": "arn:aws:lambda:REGION:ACCOUNT_ID:function:FUNCTION_NAME",
      "Next": "Lambda_B"
    },
    "Lambda_B": {
      "Type": "Task",
      "Resource": "arn:aws:lambda:REGION:ACCOUNT_ID:function:FUNCTION_NAME",
      "End": true
    }
  }
}
Moreover, you can add much more sophisticated logic to a state machine, such as parallel steps and failure catching. It even logs the details of every single execution, which makes debugging a much better experience, especially for Lambda functions.
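For completeness, once the state machine is deployed you can also start the chain from code rather than invoking Lambda_A directly; a minimal sketch with the Node SDK (the state machine ARN is a placeholder):

const AWS = require('aws-sdk');
const stepfunctions = new AWS.StepFunctions();

// Start an execution of the state machine above; the input becomes the
// event passed to Lambda_A, which then flows on to Lambda_B.
const startChain = async () => {
  const result = await stepfunctions.startExecution({
    stateMachineArn: 'arn:aws:states:REGION:ACCOUNT_ID:stateMachine:STATE_MACHINE_NAME', // placeholder
    input: JSON.stringify({ name: 'Alex' })
  }).promise();
  console.log('Started execution:', result.executionArn);
};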
Everything mentioned by #nelsonic is correct, except for the roles.
I tried choosing the roles that he mentioned above:
AWSLambdaExecute
AWSLambdaBasicExecutionRole
But that did not allow me to invoke my other Lambda function, so I changed the roles to the ones below:
AWSLambdaRole
AWSLambdaBasicExecutionRole
The reason is that AWSLambdaExecute only provides Put/Get access to S3 and full access to CloudWatch Logs, whereas AWSLambdaRole provides the default policy for the AWS Lambda service role.
If you look at its permission policy, you will see that it grants lambda:InvokeFunction:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "lambda:InvokeFunction"
      ],
      "Resource": [
        "*"
      ]
    }
  ]
}
Note: it is OK to proceed without the AWSLambdaBasicExecutionRole policy, as it only enables logging to CloudWatch and nothing more. But AWSLambdaRole is absolutely necessary.
It's easier to invoke a lambda using the AWS.Lambda promises interface in aws-sdk than using callbacks.
This example function lets you make a synchronous invocation of a Lambda from another Lambda (it uses 'RequestResponse' as the InvocationType, so you can get the value returned by the invoked Lambda).
If you use 'Event' (asynchronous invocation), you can't get the value returned by the called Lambda; you can only detect whether it was invoked successfully. It is intended for cases where you don't need a return value from the invoked Lambda.
//
// Full example of a lambda that calls another lambda
//
// (create a lambda in AWS with this code)
//
'use strict';

//
// Put here the name of the function you want to call
//
const g_LambdaFunctionName = 'PUT_HERE_THE_INVOKED_LAMBDA_NAME'; // <======= PUT THE DESIRED VALUE

const AWS = require('aws-sdk');
const lambda = new AWS.Lambda;

//
// Expected use:
//
//   // (payload can be an object or a JSON string, for example)
//   let result = await invokeLambda(lambdaFunctionName, payload);
//
const invokeLambda = async (lambdaFunctionName, payload) => {
  console.log('>>> Entering invokeLambda');

  // If the payload isn't a JSON string, we convert it to JSON
  let payloadStr;
  if (typeof payload === 'string') {
    console.log('invokeLambda: payload parameter is already a string: ', payload);
    payloadStr = payload;
  } else {
    payloadStr = JSON.stringify(payload, null, 2);
    console.log('invokeLambda: converting payload parameter to a string: ', payloadStr);
  }

  let params = {
    FunctionName   : lambdaFunctionName, /* string type, required */
    // ClientContext  : '', /* 'STRING_VALUE' */
    InvocationType : 'RequestResponse', /* string type: 'Event' (async) | 'RequestResponse' (sync) | 'DryRun' (validate parameters and permissions) */
    // InvocationType : 'Event',
    LogType        : 'None', /* string type: 'None' | 'Tail' */
    // LogType        : 'Tail',
    Payload        : payloadStr, /* Buffer.from('...') || 'JSON_STRING' */ /* Strings will be Base-64 encoded on your behalf */
    // Qualifier      : '', /* 'STRING_VALUE' */
  };

  //
  // TODO/FIXME: add try/catch to protect this code from failures (non-existent lambda, execution errors in lambda)
  //
  const lambdaResult = await lambda.invoke(params).promise();

  console.log('Results from invoking lambda ' + lambdaFunctionName + ': ', JSON.stringify(lambdaResult, null, 2));

  // If you use LogType = 'Tail', you'll obtain the logs in lambdaResult.LogResult.
  // If you use 'None', that field will not exist in the response.
  if (lambdaResult.LogResult) {
    console.log('Logs of lambda execution: ', Buffer.from(lambdaResult.LogResult, 'base64').toString());
  }

  console.log('invokeLambda::lambdaResult: ', lambdaResult);
  console.log('<<< Returning from invokeLambda, with lambdaResult: ', JSON.stringify(lambdaResult, null, 2));

  // The actual value returned by the lambda is lambdaResult.Payload
  // There are other fields (some of them are optional)
  return lambdaResult;
};

//
// We'll assign this as the calling lambda handler.
//
const callingFunc = async (event) => {
  //
  // In this example we obtain the lambda name from a global variable
  //
  const lambdaFunctionName = g_LambdaFunctionName;

  // const payload = '{"param1" : "value1"}';
  const payload = event;

  //
  // invokeLambda has to be called from an async function
  // (to be able to use await)
  //
  const result = await invokeLambda(lambdaFunctionName, payload);
  console.log('result: ', result);
};

// Assign the handler function
exports.handler = callingFunc;
Notice that you should use await before invokeLambda:
...
//
// Called from another async function
//
const result = await invokeLambda(lambdaFunctionName, payload);
...
Some relevant links with additional information:
AWS Reference about invoke call: https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/Lambda.html#invoke-property
AWS documentation about invoking a lambda: https://docs.aws.amazon.com/lambda/latest/dg/API_Invoke.html
Promises interface in AWS:
https://aws.amazon.com/es/blogs/compute/node-js-8-10-runtime-now-available-in-aws-lambda/
Examples (explanations in Spanish): https://www.it-swarm.dev/es/node.js/invocar-aws-lambda-desde-otra-lambda-de-forma-asincronica/826852446/
Handling errors, avoiding coupling between lambdas: https://www.rehanvdm.com/serverless/13-aws-lambda-design-considerations-you-need-to-know-about-part-2/index.html
Invoke Lambda AWS SDK Typescript
I wrote my own class to do this, parse the response and check for errors. I've posted it here to save anyone else the effort :)
This requires aws-sdk and tslog.
import { AWSError, Lambda } from 'aws-sdk'
import { Logger } from 'tslog';

export class LambdaClient {
  awsLambda: Lambda;
  logger: Logger;

  constructor(region: string) {
    this.awsLambda = new Lambda({ region })
    this.logger = new Logger({ name: "LambdaClient" })
  }

  trigger({ functionName, payload }): Promise<any> {
    return new Promise(
      (resolve, reject) => {
        const params = {
          FunctionName: functionName,
          InvocationType: 'RequestResponse',
          LogType: 'Tail',
          Payload: JSON.stringify(payload)
        };
        this.awsLambda.invoke(params, (err: AWSError, data: Lambda.InvocationResponse) => {
          if (err) {
            this.logger.error({ message: "error while triggering lambda", errorMessage: err.message })
            return reject(err)
          }
          if (data.StatusCode !== 200 && data.StatusCode !== 201) {
            this.logger.error({ message: "expected status code 200 or 201", statusCode: data.StatusCode, logs: base64ToString(data.LogResult) })
            return reject(data)
          }
          const responsePayload = data.Payload
          return resolve(JSON.parse(responsePayload.toString()))
        })
      }
    )
  }
}

function base64ToString(logs: string) {
  try {
    return Buffer.from(logs, 'base64').toString('ascii');
  } catch {
    return "Could not convert."
  }
}
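A usage sketch, assuming the class above is exported from a local LambdaClient module; the import path, region, function name and payload are placeholders:

// Hypothetical import path; adjust it to wherever the class lives in your project.
import { LambdaClient } from './LambdaClient';

const client = new LambdaClient('eu-west-1');           // placeholder region

const run = async () => {
  try {
    // Resolves with the parsed JSON payload returned by the invoked lambda
    const result = await client.trigger({
      functionName: 'child_lambda',                     // placeholder function name
      payload: { name: 'Alex' }
    });
    console.log('Invoked lambda returned:', result);
  } catch (err) {
    console.error('Invocation failed:', err);
  }
};

run();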