Another Node.js newb who doesn't get it

EDITED: I adjusted my narrative and attempted to add output to the code as the examples show, but it doesn't work. What am I doing wrong?
Hi experts and enthusiasts superior to myself,
The question is: how do I properly get the output of an asynchronous function in Node.js as a return value? The examples all talk about this mysterious callback function, but in the context of my code I don't see how it applies or how it gets implemented.
Yes, the question has been asked many times; if I'm asking it again, it's because the explanations provided didn't get this newb to an understanding. Yes, I spent nearly 24 hours trying to follow the examples, documentation, and other posts, but I didn't find one that explained it clearly enough for me to apply it to my code.
The concept of asynchronous code makes sense: the surrounding code runs, but the https call (in this case) hasn't finished yet. The code doesn't wait for the https call, so you have to somehow grab the result after it has completed. While I haven't yet seen the practicality of it, I'm sure I will as I continue to learn why Node.js is special in this way. Assuming my understanding is mostly right, my question is still the same. The concept is one thing; application and syntax are another.
This seems to be a common question and something nearly everyone new has trouble with.
Thus far, none of the examples or explanations seem to clarify where or how this applies to what I am working with. I understand there are additional modules that handle this differently, but I believe I won't understand the 'why/how' as it applies unless I figure this out properly.
As I am brand new to Node.js, feel free to expand on any aspect of my code, as I am eager to learn.
If anyone finds this later: this code gets data from the official Clash Royale API, for which you need to register your IP and get a token from https://developer.clashroyale.com.
app.js
require('dotenv').config();
var func = require('./functions.js');
console.log(func.uChests(process.env.MyPlayer)); //this should output the value
functions.js
require('dotenv').config();
//console.log('Loaded Functions')

module.exports.uChests = func_uChests;

//Clearly wrong application
//function func_uChests (playerID) {
function func_uChests (playerID, output) {
  //console.log('uChests')
  var http = require("https");

  var options = {
    "method": "GET",
    "hostname": "api.clashroyale.com",
    "port": null,
    "path": "/v1/players/%23" + playerID + "/upcomingchests",
    "headers": {
      "content-length": "0",
      "authorization": "Bearer " + process.env.Token,
      "accept": "application/json"
    }
  };

  var req = http.request(options, function (res) {
    var chunks = [];

    res.on("data", function (chunk) {
      chunks.push(chunk);
    });

    res.on("end", function () {
      var body = Buffer.concat(chunks);
      console.log(body.toString());
      /* example output
{"items":[{"index":0,"name":"Magical Chest"},{"index":1,"name":"Silver Chest"},{"index":2,"name":"Silver Chest"},{"index":3,"name":"Golden Chest"},{"index":4,"name":"Silver Chest"},{"index":5,"name":"Silver Chest"},{"index":6,"name":"Silver Chest"},{"index":7,"name":"Golden Chest"},{"index":8,"name":"Silver Chest"},{"index":22,"name":"Legendary Chest"},{"index":40,"name":"Giant Chest"},{"index":76,"name":"Super Magical Chest"},{"index":77,"name":"Epic Chest"}]}
{"items":[{"index":0,"name":"Magical Chest"},{"index":1,"name":"Silver Chest"},{"index":2,"name":"Silver Chest"},{"index":3,"name":"Golden Chest"},{"index":4,"name":"Silver Chest"},{"index":5,"name":"Silver Chest"},{"index":6,"name":"Silver Chest"},{"index":7,"name":"Golden Chest"},{"index":8,"name":"Silver Chest"},{"index":22,"name":"Legendary Chest"},{"index":40,"name":"Giant Chest"},{"index":76,"name":"Super Magical Chest"},{"index":77,"name":"Epic Chest"}]}
      */
    });
  });

  req.end();
}

//Clearly wrong application
function uChests(input, output) {
  func_uChests(input, output);
  console.log(output);
};

I think you need to better understand the async nature of Node. The only way to return a value to the calling statement is through a callback function parameter, or with async/await and the Promise API. Take a look below.
// return the value through a callback function parameter
myAsyncFunction(function (value) {
  console.log(value);
});

// or using the Promise API
let value = await myAsyncFunction();
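Applied to the question's code, a minimal sketch of that first pattern (assuming the same .env variables Token and MyPlayer and the same request options from the question) passes a callback into func_uChests and invokes it once the response has ended:
// functions.js (sketch): hand the result back through a callback parameter
require('dotenv').config();
var https = require('https');

function func_uChests(playerID, callback) {
  var options = {
    method: 'GET',
    hostname: 'api.clashroyale.com',
    path: '/v1/players/%23' + playerID + '/upcomingchests',
    headers: {
      'authorization': 'Bearer ' + process.env.Token,
      'accept': 'application/json'
    }
  };

  var req = https.request(options, function (res) {
    var chunks = [];
    res.on('data', function (chunk) {
      chunks.push(chunk);
    });
    res.on('end', function () {
      // the body only exists here, so this is the place to hand it back
      callback(null, Buffer.concat(chunks).toString());
    });
  });
  req.on('error', function (err) {
    callback(err);
  });
  req.end();
}

module.exports.uChests = func_uChests;

// app.js (sketch): consume the value inside the callback, not as a return value
var func = require('./functions.js');
func.uChests(process.env.MyPlayer, function (err, body) {
  if (err) return console.error(err);
  console.log(body); // prints the JSON once the https call has completed
});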

Related

Getting a value from a Node.js callback

I'm trying to get my head around callbacks in Node.JS and it's pretty foreign compared to what I'm used to.
I have the following example from another post on here:
var someOutput;

var myCallback = function(data) {
  console.log('got data: ' + data);
  someOutput = data;
};

var usingItNow = function(callback) {
  callback('get it?');
};

//now do something with someOutput outside of the functions
I have added the someOutput variable - I want to get the text 'get it?' into it (obviously I don't) so that I can concatenate it onto a string, but it's not working in any combination I've tried.
I must be missing something fundamental here!
Thank you for the help.
You should call the function:
usingItNow(myCallback);
In any case, it is not good practice to use a callback to set a variable and consume it later. It works for you now only because the callback runs synchronously; if the callback ever becomes asynchronous, it will fail.
Consider using a Promise and consuming someOutput in the then function.
A good Promise package is Bluebird
If you want to do it like a pro, consider using Promise.coroutine
http://bluebirdjs.com/docs/api/promise.coroutine.html
Or async + await if you are using an updated version of Node.js
Look at http://node.green/#ES2017-features-async-functions to check what is available on your Node version.
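As a small sketch of that suggestion, using only native Promises (Node 8+ should suffice) and assuming the same usingItNow function from the question:
var usingItNow = function (callback) {
  callback('get it?');
};

// wrap the callback-style function in a Promise
function usingItNowAsync() {
  return new Promise(function (resolve, reject) {
    usingItNow(function (data) {
      resolve(data);
    });
  });
}

// consume the value in then(), where it is guaranteed to exist
usingItNowAsync().then(function (someOutput) {
  console.log('got data: ' + someOutput);
});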
As commented by emil:
"call usingItNow(myCallback);"
doh!

How do you structure sequential AWS service calls within lambda given all the calls are asynchronous?

I'm coming from a Java background, so I'm a bit of a newbie on the JavaScript conventions needed for Lambda.
I've got a Lambda function which is meant to do several AWS tasks in a particular order, depending on the result of the previous task.
Given that each task reports its results asynchronously, I'm wondering what the right way is to make sure they all happen in the right sequence and that the results of one operation are available to the invocation of the next function.
It seems like I have to invoke each function in the callback of the prior function, but it seems like that will lead to some kind of deep nesting, and I'm wondering if that is the proper way to do this.
For example, one of these functions requires a DynamoDB getItem, followed by a call to SNS to get an endpoint, followed by an SNS call to send a message, followed by a DynamoDB write.
What's the right way to do that in Lambda JavaScript, accounting for all that asynchronicity?
I like the answer from @jonathanbaraldi, but I think it would be better if you manage control flow with Promises. The Q library has some convenience functions like nbind which help convert Node-style callback APIs like the aws-sdk into promises.
So in this example I'll send an email, and then as soon as the email response comes back I'll send a second email. This is essentially what was asked: calling multiple services in sequence. I'm using the then method of promises to manage that in a vertically readable way, and catch to handle errors. I think it reads much better than simply nesting callback functions.
var Q = require('q');
var AWS = require('aws-sdk');

AWS.config.credentials = { "accessKeyId": "AAAA", "secretAccessKey": "BBBB" };
AWS.config.region = 'us-east-1';

// Use a promised version of sendEmail
var ses = new AWS.SES({apiVersion: '2010-12-01'});
var sendEmail = Q.nbind(ses.sendEmail, ses);

exports.handler = function(event, context) {
  console.log(event.nome);
  console.log(event.email);
  console.log(event.mensagem);

  var nome = event.nome;
  var email = event.email;
  var mensagem = event.mensagem;

  var to = ['email@company.com.br'];
  var from = 'site@company.com.br';

  // Send email
  mensagem = "" + nome + "||" + email + "||" + mensagem + "";
  console.log(mensagem);

  var params = {
    Source: from,
    Destination: { ToAddresses: to },
    Message: {
      Subject: {
        Data: 'Form contact our Site'
      },
      Body: {
        Text: {
          Data: mensagem
        }
      }
    }
  };

  // Here is the white-meat of the program right here.
  sendEmail(params)
    .then(sendAnotherEmail)
    .then(success)
    .catch(logErrors);

  function sendAnotherEmail(data) {
    console.log("FIRST EMAIL SENT=" + data);
    // send a second one.
    return sendEmail(params);
  }

  function logErrors(err) {
    console.log("ERROR=" + err, err.stack);
    context.done();
  }

  function success(data) {
    console.log("SECOND EMAIL SENT=" + data);
    context.done();
  }
};
Short answer:
Use async/await, and call the AWS service (SNS, for example) with the .promise() extension to tell aws-sdk to use the promisified version of that service function instead of the callback-based version.
Since you want to execute them in a specific order, you can use async/await, assuming that the parent function you are calling them from is itself async.
For example:
let snsResult = await sns.publish({
  Message: snsPayload,
  MessageStructure: 'json',
  TargetArn: endPointArn
}, async function (err, data) {
  if (err) {
    console.log("SNS Push Failed:");
    console.log(err.stack);
    return;
  }
  console.log('SNS push succeeded: ' + data);
  return data;
}).promise();
The important part is the .promise() on the end there. Full docs on using aws-sdk in an async / promise based manner can be found here: https://docs.aws.amazon.com/sdk-for-javascript/v2/developer-guide/using-promises.html
In order to run another aws-sdk task you would similarly add await and the .promise() extension to that function (assuming that is available).
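As a sketch of how the whole sequence from the question (DynamoDB read, SNS endpoint lookup, SNS publish, DynamoDB write) could look with that approach: the table names, the platform application ARN environment variable, the item attributes, and the payload below are hypothetical placeholders rather than anything from the original post, and error handling is reduced to a single try/catch.
const AWS = require('aws-sdk');
const dynamo = new AWS.DynamoDB.DocumentClient();
const sns = new AWS.SNS();

exports.handler = async function (event) {
  try {
    // 1. read the record that drives the rest of the sequence
    const item = await dynamo.get({
      TableName: 'Devices',                                  // hypothetical table
      Key: { deviceId: event.deviceId }
    }).promise();

    // 2. register/look up the SNS platform endpoint for that device
    const endpoint = await sns.createPlatformEndpoint({
      PlatformApplicationArn: process.env.PLATFORM_APP_ARN,  // hypothetical env var
      Token: item.Item.pushToken                             // hypothetical attribute
    }).promise();

    // 3. send the message to that endpoint
    const published = await sns.publish({
      TargetArn: endpoint.EndpointArn,
      Message: JSON.stringify({ default: 'hello' }),
      MessageStructure: 'json'
    }).promise();

    // 4. record the result back in DynamoDB
    await dynamo.put({
      TableName: 'Notifications',                            // hypothetical table
      Item: { deviceId: event.deviceId, messageId: published.MessageId }
    }).promise();

    return { ok: true };
  } catch (err) {
    console.log('Sequence failed:', err);
    throw err;
  }
};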
For anyone who runs into this thread and is actually looking to simply push promises to an array and wait for that WHOLE array to finish (without regard to which promise executes first) I ended up with something like this:
let snsPromises = []; // declare array to hold promises

let snsResult = await sns.publish({
  Message: snsPayload,
  MessageStructure: 'json',
  TargetArn: endPointArn
}, async function (err, data) {
  if (err) {
    console.log("Search Push Failed:");
    console.log(err.stack);
    return;
  }
  console.log('Search push succeeded: ' + data);
  return data;
}).promise();

snsPromises.push(snsResult);

await Promise.all(snsPromises);
Hope that helps someone that randomly stumbles on this via google like I did!
I don't know Lambda but you should look into the node async library as a way to sequence asynchronous functions.
async has made my life a lot easier and my code much more orderly without the deep nesting issue you mentioned in your question.
Typical async code might look like:
async.waterfall([
  function doTheFirstThing(callback) {
    db.somecollection.find({}).toArray(callback);
  },
  function useresult(dbFindResult, callback) {
    // do some other stuff (could be sync or async)
    // etc etc etc
    callback(null);
  }
],
function (err) {
  // this last function runs anytime any callback has an error, or if no error,
  // then when the last function in the array above invokes callback.
  if (err) { sendForTheCodeDoctor(); }
});
Have a look at the async documentation. There are many useful functions for serial, parallel, waterfall, and more. Async is actively maintained and seems very reliable.
good luck!
A very specific solution that comes to mind is cascading Lambda calls. For example, you could write:
A Lambda function gets something from DynamoDB, then invokes…
…a Lambda function that calls SNS to get an endpoint, then invokes…
…a Lambda function that sends a message through SNS, then invokes…
…a Lambda function that writes to DynamoDB
All of those functions take the output from the previous function as input. This is of course very fine-grained, and you might decide to group certain calls. Doing it this way avoids callback hell in your JS code at least.
(As a side note, I'm not sure how well DynamoDB integrates with Lambda. AWS might emit change events for records that can then be processed through Lambda.)
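A minimal sketch of one link in such a chain, assuming a hypothetical table, key, and next-function name, uses the aws-sdk Lambda client to invoke the next function asynchronously and pass the previous result along:
const AWS = require('aws-sdk');
const lambda = new AWS.Lambda();

exports.handler = function (event, context) {
  const dynamo = new AWS.DynamoDB.DocumentClient();

  // step 1: read the item this invocation is responsible for
  dynamo.get({ TableName: 'Devices', Key: { id: event.id } }, function (err, data) {  // hypothetical table/key
    if (err) return context.fail(err);

    // hand the result to the next Lambda in the chain ("fire and forget")
    lambda.invoke({
      FunctionName: 'getSnsEndpoint',        // hypothetical next function
      InvocationType: 'Event',               // asynchronous invocation
      Payload: JSON.stringify(data.Item)
    }, function (err) {
      if (err) return context.fail(err);
      context.succeed('handed off to next step');
    });
  });
};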
Just saw this old thread. Note that newer versions of JS improve on this. Take a look at the ES2017 async/await syntax, which streamlines an async nested-callback mess into clean, synchronous-looking code.
There are already some polyfills that can provide this functionality based on ES2016 syntax.
As a last FYI, AWS Lambda now supports .NET Core, which provides this clean async syntax out of the box.
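To illustrate that transformation, here is a small self-contained sketch in which stepOne and stepTwo are hypothetical promise-returning stand-ins for any async calls:
// two hypothetical async steps, promisified for the example
const stepOne = (x) => new Promise((resolve) => setTimeout(() => resolve(x + 1), 10));
const stepTwo = (x) => new Promise((resolve) => setTimeout(() => resolve(x * 2), 10));

// with async/await the nested-callback pyramid flattens into sequential code
async function run(input) {
  const a = await stepOne(input);   // waits for stepOne before starting stepTwo
  const b = await stepTwo(a);
  console.log('done:', b);
}

run(1); // prints "done: 4" once both steps have resolved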
I would like to offer the following solution, which simply creates a nested function structure.
// start with the last action
var next = function() { context.succeed(); };

// for every new function, pass it the old one
next = (function(param1, param2, next) {
  return function() { serviceCall(param1, param2, next); };
})("x", "y", next);
What this does is copy all of the variables for the function call you want to make and then nest them inside the previous call. You'll want to schedule your events backwards. This is really just the same as making a pyramid of callbacks, but it works when you don't know the structure or quantity of the function calls ahead of time. You have to wrap the function in a closure so that the correct values are copied over.
In this way I am able to sequence AWS service calls such that they go 1-2-3 and end with closing the context. Presumably you could also structure it as a stack instead of this pseudo-recursion.
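A sketch of the same idea when the calls come from an array whose length is not known in advance (serviceCall and the step names below are hypothetical), building the chain backwards in a loop:
// hypothetical async call with a node-style completion callback
function serviceCall(name, payload, done) {
  setTimeout(function () {
    console.log('finished', name, payload);
    done();
  }, 10);
}

var steps = [                 // could come from configuration; length unknown in advance
  ['getItem', { id: 1 }],
  ['getEndpoint', { id: 1 }],
  ['publish', { msg: 'hi' }]
];

// start with the last action, then wrap each earlier step around it
var next = function () { console.log('all done'); };  // context.succeed() in Lambda
for (var i = steps.length - 1; i >= 0; i--) {
  next = (function (name, payload, next) {
    return function () { serviceCall(name, payload, next); };
  })(steps[i][0], steps[i][1], next);
}

next(); // runs the steps in order: getItem -> getEndpoint -> publish -> all done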
I found this article, which seems to have the answer in native JavaScript:
Five patterns to help you tame asynchronous JavaScript.
By default, I/O in Node.js is asynchronous.
So you don't have to use those libraries; you can, but there are simpler ways to solve this. In this code, I send the email with the data that comes from the event, but if you want, you just need to add more functions inside functions.
What is important is where your context.done(); goes, since it ends your Lambda function. You need to put it at the end of the last callback.
var AWS = require('aws-sdk');

AWS.config.credentials = { "accessKeyId": "AAAA", "secretAccessKey": "BBBB" };
AWS.config.region = 'us-east-1';

var ses = new AWS.SES({apiVersion: '2010-12-01'});

exports.handler = function(event, context) {
  console.log(event.nome);
  console.log(event.email);
  console.log(event.mensagem);

  var nome = event.nome;
  var email = event.email;
  var mensagem = event.mensagem;

  var to = ['email@company.com.br'];
  var from = 'site@company.com.br';

  // Send email
  mensagem = "" + nome + "||" + email + "||" + mensagem + "";
  console.log(mensagem);

  ses.sendEmail({
    Source: from,
    Destination: { ToAddresses: to },
    Message: {
      Subject: {
        Data: 'Form contact our Site'
      },
      Body: {
        Text: {
          Data: mensagem
        }
      }
    }
  },
  function(err, data) {
    if (err) {
      console.log("ERROR=" + err, err.stack);
      context.done();
    } else {
      console.log("EMAIL SENT=" + data);
      context.done();
    }
  });
};

async.js waterfall in node.js: how to use bind and this?

I'm learning node.js coming from a PHP background with a limited JavaScript level. I think I got over now the change of mindset implied by the asynchronous approach. And I love it.
But, as many others before me, I quickly understood the concrete meaning of the "pyramid of doom".
So I built this little 'dummy' route and view to understand how to properly use Async.js. I just spent the last 5 hours writing the following code (rewritten, of course, tens of times). It works, but I wonder how I could go further and make this code simpler (less verbose, easier to read and maintain).
I found many resources on the web and especially here, but always by bits of info here and there.
I'm guessing at this point that I should use "bind" and "this" with async.apply to shorten the last two functions called by the waterfall.
The issue is to get the object "db" defined so I can use the "collection" method on it (for the second function).
I really searched for an example on Google, but it's surprising that you don't get straightforward examples when looking for "async waterfall bind" (as well as the many keyword variations I tried). There are answers, of course, but none seems relevant to this particular issue... or, quite possibly, I haven't understood them.
Can someone help me on this? I'll be quite grateful.
app.get('/dummy', function(req, res) {
  var MongoClient = require('mongodb').MongoClient;

  async.waterfall(
    [
      async.apply(MongoClient.connect, 'mongodb://localhost:27017/mybdd'),
      function(db, callback) {
        db.collection('myCollection', callback);
      },
      function(collection, callback) {
        collection.find().sort({"key": -1}).limit(10).toArray(callback);
      }
    ],
    function(err, results) {
      if (err) console.log('Error :', err);
      else { res.render('dummy.jade', { title: 'dummy', results: results }); }
    }
  );
});
If you're using the mongodb JS Driver, then this should work:
async.waterfall(
  [
    function (cb) {
      new MongoClient(...)
        .connect('mongodb://localhost:27017/mybdd', cb);
    },
    function (db, callback) {
      db.collection('myCollection', callback);
    },
    ...
Alternatively, if you want to use async.apply, just pass an instance of MongoClient
async.apply(new MongoClient(...).connect, 'mongodb://localhost:27017/mybdd')
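On the bind/this part of the question: if connect needs MongoClient as its this when async invokes it later, one sketch (same connection string and collection as in the question) is to bind it before handing it to async.apply:
var async = require('async');
var MongoClient = require('mongodb').MongoClient;

// bind connect to MongoClient so "this" is correct when async calls it later
async.waterfall([
  async.apply(MongoClient.connect.bind(MongoClient), 'mongodb://localhost:27017/mybdd'),
  function (db, callback) {
    db.collection('myCollection', callback);
  }
], function (err, collection) {
  if (err) return console.log('Error :', err);
  // use the collection here
});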
I've recently created a simple abstraction named WaitFor to call async functions in sync mode (based on Fibers): https://github.com/luciotato/waitfor
I'm not familiar with the mongodb client, so I'll be mostly guessing at what you're trying to do.
Using WaitFor, your code would be:
var MongoClient = require('mongodb').MongoClient;
var wait = require('waitfor');

app.get('/dummy', function(req, res) {
  // handle request in a Fiber, keep node spinning
  wait.launchFiber(handleDummy, req, res);
});

function handleDummy(req, res) {
  try {
    var db = wait.for(MongoClient.connect, 'mongodb://localhost:27017/mybdd');
    var collection = wait.forMethod(db, 'collection', 'myCollection');
    var results = wait.forMethod(collection.find().sort({"key": -1}).limit(10), 'toArray');
    res.render('dummy.jade', { title: 'dummy', results: results });
  }
  catch (err) {
    res.render('error.jade', { title: 'error', message: err.message });
  }
};

POST Request using req.write() and req.end()

I'm trying to do an HTTP POST using the request module from a node server to another server.
My code looks something like,
var req = request.post({url: "http://foo.com/bar", headers: myHeaders});
...
...
req.write("Hello");
...
...
req.end("World");
I expect the body of the request to be "Hello World" on the receiving end, but what I end up with is just "".
What am I missing here?
Note: The ellipsis in the code indicates that the write and the end might be executed in different process ticks.
It looks to me as if you are mixing up the request module's Request with Node's http.ClientRequest / http.ServerRequest.
If you want to make a POST to a server with request, what you want to do is something like:
request({ method:"post", url: "server.com", body:"Hello World"}, callback);
As 3on pointed out, the correct syntax for a POST request is:
request({ method:"post", url: "server.com", body:"Hello World"}, callback);
You also have a convenience method:
request.post({ url: "server.com", body:"Hello World"}, callback);
But from your question it seems like you want to stream:
var request = require('request');
var fs = require('fs');
var stream = fs.createWriteStream('file');
stream.write('Hello');
stream.write('World');
fs.createReadStream('file').pipe(request.post('http://server.com'));
Update:
You may break the chunks you write to the stream in any way you like, as long as you have the RAM (4MB is peanuts, but keep in mind that V8, the JavaScript engine behind Node, has an allocation limit of about 1.4GB, I think).
I don't think you can know how much the other end of the pipe got, but stream.bytesWritten (where var stream = fs.createWriteStream('file'), as in the piece of code above) should give you a pretty decent approximation of how much you "wrote" to the pipe.
You can also listen to the data and end events of both stream and request.post('http://server.com').
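As a small sketch of that streaming approach (the URL is a placeholder), writing the body across ticks, checking bytesWritten, and listening for the response events:
var fs = require('fs');
var request = require('request');

// build the body on a write stream, in as many ticks/chunks as you like
var stream = fs.createWriteStream('body.tmp');
stream.write('Hello ');

process.nextTick(function () {
  stream.end('World');                      // finish the body on a later tick
});

stream.on('finish', function () {
  console.log('bytes written:', stream.bytesWritten);

  // pipe the finished file into the POST request
  var req = fs.createReadStream('body.tmp').pipe(request.post('http://server.com'));  // placeholder URL

  req.on('response', function (res) {
    console.log('server answered with status', res.statusCode);
  });
  req.on('end', function () {
    console.log('response fully received');
  });
});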
I managed to make the code written in the question here valid and work as expected by modifying the request module a bit.
I noticed a block of code in request's main.js in the Request.prototype.init function (at line 356),
process.nextTick(function () {
  if (self._aborted) return
  if (self.body) {
    if (Array.isArray(self.body)) {
      self.body.forEach(function (part) {
        self.write(part)
      })
    } else {
      self.write(self.body)
    }
    self.end()
  } else if (self.requestBodyStream) {
    console.warn("options.requestBodyStream is deprecated, please pass the request object to stream.pipe.")
    self.requestBodyStream.pipe(self)
  } else if (!self.src) {
    if (self.method !== 'GET' && typeof self.method !== 'undefined') {
      self.headers['content-length'] = 0;
    }
    self.end();
  }
  self.ntick = true
})
I'm now overriding this function call by adding a new option (endOnTick) while creating the request. My changes: Comparing mikeal/master with GotEmB/master.

Node.js: detect when all events of a request have finished

Sorry if this question is simple but I have been using node.js for only a few days.
Basically, I receive a JSON with some entries. I loop over these entries and launch an HTTP request for each of them. Something like this:
for (var i in entries) {
  // Lots of stuff
  http.get(options, function(res) {
    // Parse the response and detect whether it was successful
  });
}
How can I detect when all requests are done? I need this in order to call response.end().
Also, I will need to report whether each entry succeeded or not. Should I use a global variable to save the result of each entry?
You can, for example, use caolan's "async" library:
async.map(entries, function(entry, cb) {
  http.get(options, function(res) {
    // call cb here; first argument is the error (or null), second one is the result
    cb(null, res.statusCode);
  });
}, function(err, res) {
  // this gets called when all requests are complete
  // res is an array with the results
});
There are many different libraries for that. I prefer the q and qq futures libraries to async, as async leads to forests of callbacks in complex scenarios. Yet another library is Step.
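If you would rather not pull in a library, a minimal sketch of the same idea (entries and options stand in for the placeholders from the question) counts completed requests and records each entry's success in an array:
var http = require('http');

function fetchAll(entries, options, onDone) {
  var results = [];
  var pending = entries.length;
  if (pending === 0) return onDone(results);   // nothing to wait for

  entries.forEach(function (entry, i) {
    http.get(options, function (res) {
      results[i] = (res.statusCode === 200);   // record success/failure per entry
      if (--pending === 0) onDone(results);    // last response triggers the callback
    }).on('error', function () {
      results[i] = false;
      if (--pending === 0) onDone(results);
    });
  });
}

// usage: call response.end() only after every request has reported back
// fetchAll(entries, options, function (results) {
//   console.log(results);  // e.g. [true, false, true]
//   response.end();
// });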
