I'm trying to start the execution of an AWS Step Function from inside an AWS Lambda, but I receive null as the result, with no error message. The Step Function here is an Express state machine, so I use the method startSyncExecution(params = {}, callback), as described in the docs.
Here is the code of the Lambda:
const AWS = require('aws-sdk');

exports.handler = async (event, context, callback) => {
    var params = {
        stateMachineArn: "arn:aws:states:us-east-1:[AccountID]:stateMachine:BookLectureStateMachine",
        input: JSON.stringify(event),
        name: "test-from-lambda"
    };
    var stepfunctions = new AWS.StepFunctions();
    console.log("Everything okay"); // This one is logged
    stepfunctions.startSyncExecution(params, function(err, data) {
        console.log("This log isn't shown"); // This one isn't logged
        if (err) {
            callback(null, {
                statusCode: 400,
                body: err,
                headers: {
                    'Access-Control-Allow-Origin': '*'
                }
            });
        } else {
            callback(null, {
                statusCode: 200,
                body: 'Lecture booked',
                headers: {
                    'Access-Control-Allow-Origin': '*'
                }
            });
        }
    });
};
The response is null, nothing else.
I've checked the permissions, and the Lambda has Full Access to the Step Functions.
Any idea on how to solve it?
UPDATE
I think the StepFunction is not executed, since the logs are empty.
I increased the Lambda timeout to 1 min to rule out timeout scenarios; the billed duration is about half a second.
I believe it may have something to do with the mix of callbacks and async. Since you aren't using await in your handler anywhere, I would try removing async from the handler.
Either that, or you could try changing the code to:
var data = await stepfunctions.startSyncExecution(params).promise();
I'm trying to get values being passed into a form to themselves be passed to my lambda function backend.
The first roadblock I encountered on this project was that the event object passed as a parameter to my handler was empty. That is, when I tried to log the event, it came back empty.
As a rough example, this code:
module.exports.handler = (event, context, callback) => { console.log(event); } would return "{}".
After some research, I learned that the reason the event instance would come back as empty, is because I had failed to check the lambda proxy integration option at the "Integration Requests" page on the API's Resource/Method page.
So, I enabled Lambda proxy integration, and it sort of worked: the event instance is no longer empty; it doesn't return "{}" anymore.
However, although the event instance is now full of information, the properties/values I'm trying to recover from the event instance are now null, specifically body.
So, although event is no longer empty, JSON.parse(event.body) returns null.
I don't understand where or why the values being passed into the form are being lost when transmitted to my lambda function backend.
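For reference, with proxy integration the payload arrives as a JSON string on event.body, so a defensive parse helps distinguish "body missing" from "body malformed". A small sketch (the event shape follows API Gateway's proxy format; the field names are just examples):

```javascript
// Defensive parse of a proxy-integration event body; returns null when the
// body is absent or not valid JSON.
function parseBody(event) {
    if (!event || event.body == null) return null;
    try {
        return JSON.parse(event.body);
    } catch (e) {
        return null;
    }
}

// A proxy-style event carries the raw request body as a string:
const sample = { body: JSON.stringify({ email: "a@b.c", comment: "hi" }) };
console.log(parseBody(sample).email);   // "a@b.c"
console.log(parseBody({ body: null })); // null
```

Logging which of the two null cases you hit narrows down whether the client ever sent a body at all.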
You can see the CloudWatch logs yourself here:
And here are the handler and the front-end code, respectively.
const AWS = require('aws-sdk');
const validator = require("validator");

module.exports.handler = function(event, context, callback) {
    var response = {
        statusCode: 200,
        headers: {
            "Access-Control-Allow-Origin": "*",
            'Access-Control-Allow-Credentials': "true",
            "Content-Type": "application/json"
        },
        body: JSON.stringify({"body": event})
    };
    console.log(response);
    console.log(context);
    try {
        console.log("inside the try loop", null, 2);
        console.log("before parsing the json from the post method", null, 2);
        console.log(event);
        var data = JSON.parse(event.body);
        console.log(data);
        console.log("after parsing the json from the post method", null, 2);
        var email = data.email;
        var comment = data.comment;
        if (validator.isEmail(email) == "false" || validator.isEmpty(comment) == "false") {
            callback(new Error("The email or comment were defectous."));
            return;
        }
        email = validator.escape(email);
        comment = validator.escape(comment);
        callback(null, response);
        return;
    } catch (error) {
        console.log("inside the error loop", null, 2);
        console.log(error);
        console.log(error.description);
        callback(null, response);
        return;
    }
};
$(function(){
    $('.contactForm').submit(function(event){
        event.preventDefault();
        console.log("Submit event is fired");
        var data = {
            email: getEmail(),
            comment: getComment()
        };
        $.ajax({
            url: "SOMEURL",
            type: 'post',
            contentType: "application/json",
            body: JSON.stringify(data),
            success: function(){
                console.log("Lambda Function successully invoked.");
                console.log(data);
                $('.btn').attr('disabled', true);
                $('#emailFormInput').val('');
                $('#commentFormInput').val('');
            },
            error: function(jqXHR, textStatus, errorThrown){
                if (jqXHR.status == 500) {
                    console.log('Internal error: ' + jqXHR.responseText + " " + errorThrown);
                } else {
                    console.log(errorThrown);
                }
            }
        });
    });
});
I have an AWS Lambda function which triggers https request to Google API. I want the function to be awaitable, so that it does not end immediately, but only after getting response from Google API.
Yes, I know I pay for the execution, but this will not be called often, so it is fine.
The problem is that the http request does not seem to fire correctly. The callback is never executed.
I have made sure that the async/await works as expected by using setTimeout in a Promise. So the issue is somewhere in the https.request.
Also note that I am using Pulumi to deploy to AWS, so there might be some hidden problem in there. I just can't figure out where.
The relevant code:
AWS Lambda which calls the Google API
import config from '../../config';
import { IUserInfo } from '../../interfaces';

const https = require('https');

function sendHttpsRequest(options: any): Promise<any> {
    console.log(`sending request to ${options.host}`);
    console.log(`Options are ${JSON.stringify(options)}`);
    return new Promise(function (resolve, reject) {
        console.log(` request to ${options.host} has been sent A`);
        let body = new Array<Buffer>();
        const request = https.request(options, function (res: any) {
            console.log('statusCode:', res.statusCode);
            console.log('headers:', res.headers);
            if (res.statusCode != 200) {
                reject(res.statusCode);
            }
            res.on('data', (data: any) => {
                console.log(`body length is ${body.length}`);
                console.log('data arrived', data);
                body.push(data);
                console.log('pushed to array');
                console.log(data.toString());
            });
        });
        request.on('end', () => {
            console.error('Request ended');
            // at this point, `body` has the entire request body stored in it as a string
            let result = Buffer.concat(body).toString();
            resolve(result);
        });
        request.on('error', async (err: Error) => {
            console.error('Errooooorrrr', err.stack);
            console.error('Errooooorrrr request failed');
            reject(err);
        });
        request.end();
        console.log(` request to ${options.host} has been sent B`);
    });
}
/**
 * AWS Lambda to create new Google account in TopMonks domain
 */
export default async function googleLambdaImplementation(userInfo: IUserInfo) {
    const payload = JSON.stringify({
        "primaryEmail": userInfo.topmonksEmail,
        "name": {
            "givenName": userInfo.firstName,
            "familyName": userInfo.lastName
        },
        "password": config.defaultPassword,
        "changePasswordAtNextLogin": true
    });

    const resultResponse: Response = {
        statusCode: 200,
        body: 'Default response. This should not come back to users'
    };

    console.log('Calling google api via post request');
    try {
        const options = {
            host: 'www.googleapis.com',
            path: '/admin/directory/v1/users',
            method: 'POST',
            headers: {
                'Content-Type': 'application/json',
                'Content-Length': payload.length.toString()
            },
            form: payload
        };
        const responseFromGoogle = await sendHttpsRequest(options);
        console.log('responseFromGoogle', JSON.stringify(responseFromGoogle));
    }
    catch (err) {
        console.log('Calling google api failed with error', err);
        resultResponse.statusCode = 503;
        resultResponse.body = `Error creating new Google Account for ${userInfo.topmonksEmail}.`;
        return resultResponse;
    }
    console.log('request to google sent');
    return resultResponse;
}
The problem is that the http request does not seem to fire correctly. The callback is never executed.
I believe this part of the issue is related to some combination of (a) potentially not actually sending the https request and (b) not using the correct callback signature for https.request. See the documentation at https://nodejs.org/api/https.html#https_https_request_options_callback for details on both of these.
Use node-fetch package
The following example works for me using node-fetch:
import * as aws from "@pulumi/aws";
import fetch from "node-fetch";

const api = new aws.apigateway.x.API("api", {
    routes: [{
        method: "GET", path: "/", eventHandler: async (ev) => {
            const resp = await fetch("https://www.google.com");
            const body = await resp.text();
            return {
                statusCode: resp.status,
                body: body,
            };
        },
    }],
});

export const url = api.url;
Pulumi complains with something like "Cannot serialize native function". The problematic part is that node-fetch relies on Symbol.iterator.
As noted in the comments, some of the conditions that can lead to this are documented at https://pulumi.io/reference/serializing-functions.html. However, I don't see any clear reason why this code would hit any of those limitations. There may be details of how this is used outside the context of the snippet shared above which lead to this.
I have a node app that serves a react app as well as makes requests to Google Cloud Storage. The App works perfectly locally, but after I've deployed it to Heroku I get the following error whenever I make requests to any of my endpoints:
2017-05-26T21:53:34.426234+00:00 app[web.1]: app.post /upload_url Endpoint
2017-05-26T21:53:34.484393+00:00 app[web.1]: (node:34) UnhandledPromiseRejectionWarning: Unhandled promise rejection (rejection id: 1): RangeError: Invalid status code: 0
The first line is a console log to check if the endpoint is reached, the second line is the error. Here's the code for the /upload_url endpoint:
app.post('/upload_url', (req, res) => {
    console.log("app.post /upload_url Endpoint");
    var originalFileName = req.body["originalFileName"];
    var buf = crypto.randomBytes(16);
    var uniqueFilename = buf.toString('hex') + '_' + originalFileName;
    var file = bucket.file(uniqueFilename);
    file.createResumableUpload(function (err, uri) {
        if (!err) {
            var json = JSON.stringify({
                uri: uri,
                uniqueFilename: uniqueFilename,
                originalFileName: originalFileName
            });
            res.writeHead(200);
            res.end(json);
        } else {
            res.writeHead(err.code);
            res.end();
        }
    });
});
This endpoint is called by the react front end with the following fetch call:
function upload(file) {
    var data = new FormData();
    data.append('file', file);
    return fetch(`upload_url`, {
        method: 'POST',
        headers: new Headers({
            "Content-Type": "application/json",
        }),
        body: JSON.stringify({
            originalFileName: file.name
        })
    });
}
Again, this works fine in development but not after deploying to Heroku. I've tried Heroku's suggestion of adding concurrency to the app (detailed here) without any luck. Any thoughts or suggestions on how to solve this problem would be very much appreciated.
EDIT:
bucket is a google cloud bucket and is defined like this:
const gcs = require('@google-cloud/storage')({
    projectId: 'my-project',
    keyFilename: process.env.GCS_KEYFILE
});
var bucket = gcs.bucket('my-bucket');
ANSWER:
While this didn't solve my issue entirely, by handling response error codes more appropriately I was able to determine that my actual problem is related to google cloud authentication. Here's my updated upload_url endpoint:
app.post('/upload_url', (req, res) => {
    console.log("app.post /upload_url Endpoint");
    var originalFileName = req.body["originalFileName"];
    var buf = crypto.randomBytes(16);
    var uniqueFilename = buf.toString('hex') + '_' + originalFileName;
    var file = bucket.file(uniqueFilename);
    file.createResumableUpload(function (err, uri) {
        if (!err) {
            var json = JSON.stringify({
                uri: uri,
                uniqueFilename: uniqueFilename,
                originalFileName: originalFileName
            });
            res.writeHead(200);
            res.end(json);
        } else {
            if (err.code >= 100 && err.code < 600) {
                console.error(err);
                res.writeHead(err.code);
                res.end();
            } else {
                console.error(err);
                res.writeHead(500);
                res.end();
            }
        }
    });
});
Refer to this answer https://stackoverflow.com/a/38258590/4348875 and make sure err.code is a valid HTTP status code.
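The guard can be factored into a tiny helper; this sketch maps anything outside the valid 100-599 range (like the RangeError-provoking 0, or library error codes that aren't HTTP statuses) to a plain 500:

```javascript
// Map an error code to a safe HTTP status: pass real statuses through,
// fall back to 500 for anything else (0, undefined, library error codes, ...).
function toHttpStatus(code) {
    return Number.isInteger(code) && code >= 100 && code < 600 ? code : 500;
}

console.log(toHttpStatus(404)); // 404
console.log(toHttpStatus(0));   // 500
console.log(toHttpStatus());    // 500
```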
I'm getting started with AWS Lambda and I'm trying to request an external service from my handler function. According to this answer, HTTP requests should work just fine, and I haven't found any documentation that says otherwise. (In fact, people have posted code that uses the Twilio API to send SMS.)
My handler code is:
var http = require('http');

exports.handler = function(event, context) {
    console.log('start request to ' + event.url);
    http.get(event.url, function(res) {
        console.log("Got response: " + res.statusCode);
    }).on('error', function(e) {
        console.log("Got error: " + e.message);
    });
    console.log('end request to ' + event.url);
    context.done(null);
}
and I see the following 4 lines in my CloudWatch logs:
2015-02-11 07:38:06 UTC START RequestId: eb19c89d-b1c0-11e4-bceb-d310b88d37e2
2015-02-11 07:38:06 UTC eb19c89d-b1c0-11e4-bceb-d310b88d37e2 start request to http://www.google.com
2015-02-11 07:38:06 UTC eb19c89d-b1c0-11e4-bceb-d310b88d37e2 end request to http://www.google.com
2015-02-11 07:38:06 UTC END RequestId: eb19c89d-b1c0-11e4-bceb-d310b88d37e2
I'd expect another line in there:
2015-02-11 07:38:06 UTC eb19c89d-b1c0-11e4-bceb-d310b88d37e2 Got response: 302
but that's missing. If I'm using the essential part without the handler wrapper in node on my local machine, the code works as expected.
The inputfile.txt I'm using for the invoke-async call is this:
{
    "url": "http://www.google.com"
}
It seems like the part of the handler code that does the request is skipped entirely. I started out with the request lib and fell back to using plain http to create a minimal example. I've also tried to request a URL of a service I control to check the logs and there's no requests coming in.
I'm totally stumped. Is there any reason Node and/or AWS Lambda would not execute the HTTP request?
Of course, I was misunderstanding the problem. As AWS themselves put it:
For those encountering nodejs for the first time in Lambda, a common
error is forgetting that callbacks execute asynchronously and calling
context.done() in the original handler when you really meant to wait
for another callback (such as an S3.PUT operation) to complete, forcing
the function to terminate with its work incomplete.
I was calling context.done way before any callbacks for the request fired, causing the termination of my function ahead of time.
The working code is this:
var http = require('http');
exports.handler = function(event, context) {
console.log('start request to ' + event.url)
http.get(event.url, function(res) {
console.log("Got response: " + res.statusCode);
context.succeed();
}).on('error', function(e) {
console.log("Got error: " + e.message);
context.done(null, 'FAILURE');
});
console.log('end request to ' + event.url);
}
Update: starting in 2017, AWS has deprecated the old Node.js 0.10 runtime and only the newer 4.3 runtime is available (old functions should be updated). This runtime introduced some changes to the handler function; the new handler now has 3 parameters.
function(event, context, callback)
Although you will still find succeed, done and fail on the context parameter, AWS suggests using the callback function instead; otherwise null is returned by default.
callback(new Error('failure')) // to return error
callback(null, 'success msg') // to return ok
Complete documentation can be found at http://docs.aws.amazon.com/lambda/latest/dg/nodejs-prog-model-handler.html
A simple working example of an HTTP request using Node:
const https = require('https');

exports.handler = async (event) => {
    return httprequest().then((data) => {
        const response = {
            statusCode: 200,
            body: JSON.stringify(data),
        };
        return response;
    });
};

function httprequest() {
    return new Promise((resolve, reject) => {
        const options = {
            host: 'jsonplaceholder.typicode.com',
            path: '/todos',
            port: 443,
            method: 'GET'
        };
        const req = https.request(options, (res) => {
            if (res.statusCode < 200 || res.statusCode >= 300) {
                return reject(new Error('statusCode=' + res.statusCode));
            }
            var body = [];
            res.on('data', function(chunk) {
                body.push(chunk);
            });
            res.on('end', function() {
                try {
                    body = JSON.parse(Buffer.concat(body).toString());
                } catch (e) {
                    return reject(e); // don't fall through to resolve on a parse error
                }
                resolve(body);
            });
        });
        req.on('error', (e) => {
            reject(e.message);
        });
        // send the request
        req.end();
    });
}
Yeah, awendt's answer is perfect. I'll just show my working code: I had the context.succeed('Blah'); line right after the reqPost.end(); line. Moving it to where I show below solved everything.
console.log('GW1');
var https = require('https');

exports.handler = function(event, context) {
    var body = '';
    var jsonObject = JSON.stringify(event);
    // the post options
    var optionspost = {
        host: 'the_host',
        path: '/the_path',
        method: 'POST',
        headers: {
            'Content-Type': 'application/json',
        }
    };
    var reqPost = https.request(optionspost, function(res) {
        console.log("statusCode: ", res.statusCode);
        res.on('data', function (chunk) {
            body += chunk;
        });
        context.succeed('Blah');
    });
    reqPost.write(jsonObject);
    reqPost.end();
};
I faced this issue on Node 10.x. Below is my working code.
const https = require('https');

exports.handler = (event, context, callback) => {
    let body = '';
    let jsonObject = JSON.stringify(event);
    // the post options
    var optionspost = {
        host: 'example.com',
        path: '/api/mypath',
        method: 'POST',
        headers: {
            'Content-Type': 'application/json',
            'Authorization': 'blah blah',
        }
    };
    let reqPost = https.request(optionspost, function(res) {
        console.log("statusCode: ", res.statusCode);
        res.on('data', function (chunk) {
            body += chunk;
        });
        res.on('end', function () {
            console.log("Result", body.toString());
            context.succeed("Success");
        });
        res.on('error', function () {
            console.log("Result Error", body.toString());
            context.done(null, 'FAILURE');
        });
    });
    reqPost.write(jsonObject);
    reqPost.end();
};
Modern Async/Await Example
You need to prevent the lambda from finishing before the https request has completed. It makes code with multiple requests easier to read as well.
const https = require('https');

// Helper that turns https.request into a promise
function httpsRequest(options) {
    return new Promise((resolve, reject) => {
        const req = https.request(options, (res) => {
            if (res.statusCode < 200 || res.statusCode >= 300) {
                return reject(new Error('statusCode=' + res.statusCode));
            }
            var body = [];
            res.on('data', function(chunk) {
                body.push(chunk);
            });
            res.on('end', function() {
                try {
                    body = JSON.parse(Buffer.concat(body).toString());
                } catch (e) {
                    return reject(e); // avoid resolving after a parse failure
                }
                resolve(body);
            });
        });
        req.on('error', (e) => {
            reject(e.message);
        });
        req.end();
    });
}
// Lambda starts executing here
exports.handler = async event => {
    // --- GET example request
    var options = {
        method: 'GET',
        hostname: 'postman-echo.com',
        path: encodeURI('/get?foo1=bar1'),
        headers: {},
    };
    try {
        const getBody = await httpsRequest(options);
        // The console.log below will not run until the GET request above finishes
        console.log('GET completed successfully! Response body:', getBody);
    } catch (err) {
        console.error('GET request failed, error:', err);
    }

    // --- POST example request
    var options = {
        method: 'POST',
        hostname: 'postman-echo.com',
        path: encodeURI('/hi/there?hand=wave'),
        headers: {},
    };
    try {
        const postBody = await httpsRequest(options);
        // The console.log below will not run until the POST request above finishes
        console.log('POST response body:', postBody);
    } catch (err) {
        console.error('POST request failed, error:', err);
    }
};
I had the very same problem and then I realized that programming in Node.js is actually different from Python or Java, as it's based on JavaScript. I'll try to use simple concepts, as there may be a few new folks who are interested or may come to this question.
Let's look at the following code :
var http = require('http'); // (1)

exports.handler = function(event, context) {
    console.log('start request to ' + event.url);
    http.get(event.url, // (2)
        function(res) { // (3)
            console.log("Got response: " + res.statusCode);
            context.succeed();
        }).on('error', function(e) {
            console.log("Got error: " + e.message);
            context.done(null, 'FAILURE');
        });
    console.log('end request to ' + event.url); // (4)
}
Whenever you make a call to a method in the http package (1), the call is registered as an event, and that event runs on its own. The 'get' function (2) is the starting point of that separate event.
Now, the function at (3) will execute as part of that separate event, while your code continues along its own execution path, jumps straight to (4), and finishes, because there is nothing more to do.
But the event fired at (2) is still executing somewhere, and it will take its own sweet time to finish. Pretty bizarre, right? Well, no, it is not. This is how Node.js works, and it's important that you wrap your head around this concept. This is where JavaScript Promises come in to help.
You can read more about JavaScript Promises here. In a nutshell, you need a JavaScript Promise to keep the execution of your code in line, so the handler doesn't run past work that is still pending.
Most of the common Node.js packages have a promisified version of their API available, and there are other approaches, like BluebirdJS, that address the same problem.
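The promisification described above can be sketched in miniature; delayedGreeting below is a made-up stand-in for any callback-style API (http.get, an S3 put, ...):

```javascript
// A callback-style API (stand-in for http.get, S3.putObject, etc.):
// it finishes later, on its own schedule, and reports via the callback.
function delayedGreeting(name, cb) {
    setTimeout(() => cb(null, 'hello ' + name), 10);
}

// The same API wrapped in a Promise, so a handler can simply await it
// instead of finishing before the callback ever fires.
function delayedGreetingP(name) {
    return new Promise((resolve, reject) => {
        delayedGreeting(name, (err, msg) => (err ? reject(err) : resolve(msg)));
    });
}

delayedGreetingP('lambda').then((msg) => console.log(msg)); // "hello lambda"
```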
The code that you had written above can be loosely re-written as follows.
'use strict';
console.log('Loading function');
var rp = require('request-promise');

exports.handler = (event, context, callback) => {
    var options = {
        uri: 'https://httpbin.org/ip',
        method: 'POST',
        body: {},
        json: true
    };

    rp(options)
        .then(function (parsedBody) {
            console.log(parsedBody);
            context.done(null); // finish only after the request has completed
        })
        .catch(function (err) {
            // POST failed...
            console.log(err);
            context.done(null);
        });
};
Please note that the above code will not work directly if you import it in AWS Lambda. For Lambda, you will need to package the modules with the code base too.
I've found lots of posts across the web on the various ways to do the request, but none that actually show how to process the response synchronously on AWS Lambda.
Here's a Node 6.10.3 Lambda function that uses an https request, collects and returns the full body of the response, and passes control to an unlisted function processBody with the results. I believe http and https are interchangeable in this code.
I'm using the async utility module, which is easier for newbies to understand. You'll need to push that to your AWS stack to use it (I recommend the serverless framework).
Note that the data comes back in chunks, which are gathered in a global variable, and finally the callback is called when the data has ended.
'use strict';

const async = require('async');
const https = require('https');

module.exports.handler = function (event, context, callback) {
    let body = "";
    let countChunks = 0;

    async.waterfall([
        requestDataFromFeed,
        // processBody,
    ], (err, result) => {
        if (err) {
            console.log(err);
            callback(err);
        }
        else {
            const message = "Success";
            console.log(result.body);
            callback(null, message);
        }
    });

    function requestDataFromFeed(callback) {
        const url = 'https://put-your-feed-here.com';
        console.log(`Sending GET request to ${url}`);
        https.get(url, (response) => {
            console.log('statusCode:', response.statusCode);
            response.on('data', (chunk) => {
                countChunks++;
                body += chunk;
            });
            response.on('end', () => {
                const result = {
                    countChunks: countChunks,
                    body: body
                };
                callback(null, result);
            });
        }).on('error', (err) => {
            console.log(err);
            callback(err);
        });
    }
};
Use promises with resolve/reject. It worked for me!
Add the above code in API Gateway under the GET - Integration Request > Mapping section.
Yes, there are in fact many reasons why you can't access AWS Lambda like an HTTP endpoint.
The architecture of AWS Lambda
It's a microservice. It runs inside EC2 with the Amazon Linux AMI (version 3.14.26-24.46.amzn1.x86_64) and runs Node.js. The memory can be between 128 MB and 1 GB. When the data source triggers the event, the details are passed to a Lambda function as parameters.
What happens?
AWS Lambda runs inside a container, and the code is uploaded directly to this container along with its packages or modules. For example, we can NEVER SSH into the Linux machine running your Lambda function. The only things we can monitor are the logs (with CloudWatch Logs) and the exceptions that come from the runtime.
AWS takes care of launching and terminating the containers for us, and just runs the code. So, even if you use require('http'), it's not going to work, because the place where this code runs wasn't made for this.