effective way of sending the body as a callback - node.js

In my app.js
var employees = require('../models/employees');
employees.read(req.params.id, function(body) {
  console.log(body.firstName);
});
In my models/employees
var request = require('request');
var employees = {
  read: function(id, callback) {
    request
      .get('http://api.mysite.com/employees/' + id, function(error, response, body) {
        body = JSON.parse(body);
        return callback(body);
      });
  },
};
module.exports = employees;
This works (it returns the employee name correctly), but I'm not sure if this is the correct (async) way of getting data from an API and displaying it.
Thank you!

Node.js is asynchronous by default, so you don't have to 'make' it work in an async manner.
For future use though, once you have more requests, there may be times where you have to wait for a certain request to finish before you can fire the next one off, i.e. run tasks sequentially. In that case you'll have to use something like http://caolan.github.io/async/ and queue function calls in a waterfall/series model.
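For illustration, here is a rough sketch of that waterfall idea using the async library. It reuses the employees endpoint from the question; the second endpoint and the departmentId field are made up purely for the example:
var async = require('async');
var request = require('request');

async.waterfall([
  function (done) {
    // the first request has to finish before the next one starts
    request.get('http://api.mysite.com/employees/1', function (error, response, body) {
      done(error, body && JSON.parse(body));
    });
  },
  function (employee, done) {
    // hypothetical second call that depends on the first result
    request.get('http://api.mysite.com/departments/' + employee.departmentId, function (error, response, body) {
      done(error, employee, body && JSON.parse(body));
    });
  }
], function (error, employee, department) {
  if (error) return console.error(error);
  console.log(employee.firstName, department);
});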

Related

Nodejs global variable scope issue

I'm quite new to Nodejs. In the following code I am getting json data from an API.
let data_json = ''; // global variable
app.get('/', (req, res) => {
  request('http://my-api.com/data-export.json', (error, response, body) => {
    data_json = JSON.parse(body);
    console.log(data_json); // data prints successfully
  });
  console.log(data_json, 'Data Test - outside request code'); // no data is printed
})
data_json is my global variable, and I assign it the data returned by the request call. Within that callback the JSON data prints just fine, but when I try printing the same data outside the request callback, nothing prints out.
What mistake am I making?
Instead of waiting for request to resolve (i.e. get data from your API), Node.js will execute the code that comes after it, and that code prints nothing because there is still nothing there at the moment of execution. Only after Node gets data from your API (which takes a few milliseconds) will it execute the code inside the request callback. This is because Node.js is asynchronous and non-blocking: it will not block or halt the rest of the code until your API returns data; it just keeps going and finishes the callback later, when it gets the response.
It's good practice to do all of the data manipulation you want inside the callback function; unfortunately, you can't rely on the structure you have.
Here's your code again, with comments marking the order of operations:
let data_json = ''; // global variable
app.get('/', (req, res) => {
  // NodeJS STARTS executing this code
  request('http://my-api.com/data-export.json', (error, response, body) => {
    // NodeJS executes this code last, after the data is loaded from the server
    data_json = JSON.parse(body);
    console.log(data_json);
    // You should do all of your data_json manipulation here
    // e.g. saving stuff to the database, processing data, just usual logic ya know
  });
  // NodeJS executes this code 2nd, before the API responds with data,
  // because it doesn't want to block the entire code until it gets a response
  console.log(data_json, 'Data Test - outside request code');
})
So let's say you want to make another request with the data from the first request - you will have to do something like this:
request('https://your-api.com/export-data.json', (err, res, body) => {
  request('https://your-api.com/2nd-endpoint.json', (err, res, body) => {
    //Process data and repeat
  })
})
As you can see, that pattern can become very messy very quickly. This is called callback hell, so to avoid a lot of nested requests there is syntactic sugar that makes this code far cleaner and more maintainable: the async/await pattern. Here's how it works:
let data_json = ''
app.get('/', async (req, res) => {
  try {
    // note: this assumes a promise-based HTTP client
    // (the plain request module does not return promises)
    let response = await request('https://your-api.com/endpoint')
    data_json = response.body
  } catch(error) {
    // Handle error how you see fit
  }
  console.log(data_json) // It will work
})
This code does the same thing as the one you have, but the difference is that you can make as many await request(...) calls as you want, one after another, with no nesting.
The only difference is that you have to declare your function as asynchronous, async (req, res) => {...}, and that all of the let var = await request(...) calls need to sit inside a try-catch block. This is so you can catch your errors. You can have all of your requests inside the same try block if you think that's appropriate.
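For instance, here is a rough sketch of two awaited calls in sequence, reusing the endpoints from the nested example above (and again assuming an HTTP client that returns promises, which the plain request module does not):
app.get('/', async (req, res) => {
  try {
    // each await pauses this handler (not the whole process) until the call resolves
    const exportData = await request('https://your-api.com/export-data.json')
    const moreData = await request('https://your-api.com/2nd-endpoint.json')
    res.json({ exportData, moreData })
  } catch (error) {
    res.status(500).send('One of the upstream requests failed')
  }
})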
Hopefully this helped a bit :)
The console.log occurs before your request completes. Check out the ways to get asynchronous data: callbacks, promises, or async-await. Node.js APIs are async (most of them), so the outer console.log will be executed before the request API call completes.
let data_json = ''; // global variable
app.get('/', (req, res) => {
  let pr = new Promise(function(resolve, reject) {
    request('http://my-api.com/data-export.json', (error, response, body) => {
      if (error) {
        reject(error)
      } else {
        data_json = JSON.parse(body);
        console.log(data_json); // data prints successfully
        resolve(data_json)
      }
    });
  })
  pr.then(function(data) {
    // data also will have data_json
    // handle response here
    console.log(data_json); // data prints successfully
  }).catch(function(err) {
    // handle error here
  })
})
If you don't want to create a promise wrapper, you can use request-promise-native (uses native Promises) created by the Request module team.
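For example, a minimal sketch with request-promise-native (assuming it is installed alongside request):
const rp = require('request-promise-native');

app.get('/', async (req, res) => {
  try {
    // with json: true the resolved value is the already-parsed body
    const data_json = await rp({ uri: 'http://my-api.com/data-export.json', json: true });
    console.log(data_json); // data is available here
    res.json(data_json);
  } catch (err) {
    // handle error here
    res.sendStatus(500);
  }
});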
Learn callbacks, promises and of course async-await.

node.js server and AWS asynchronous call issue

I have a simple Node Express app with a service that makes a call to a Node server. The Node server makes a call to an AWS web service. The AWS call simply lists any S3 buckets it finds, and it is asynchronous. The problem is that I don't seem to be able to get the server code to "wait" for the AWS call to return with the JSON data, and the function returns undefined.
I've read many, many articles on the web about this, including promises, wait-fors etc., but I think I'm not understanding the way these work fully!
This is my first exposure to Node and I would be grateful if somebody could point me in the right direction.
Here are some snippets of my code... apologies if it's a bit rough, but I've chopped and changed things many times over!
Node Express:
var Httpreq = new XMLHttpRequest(); // a new request
Httpreq.open("GET","http://localhost:3000/listbuckets",false);
Httpreq.send(null);
console.log(Httpreq.responseText);
return Httpreq.responseText;
Node Server
app.get('/listbuckets', function (req, res) {
  var bucketData = MyFunction(res, req);
  console.log("bucketData: " + bucketData);
});

function MyFunction(res, req) {
  var mydata;
  var params = {};
  res.send('Here are some more buckets!');
  var request = s3.listBuckets();
  // register a callback event handler
  request.on('success', function(response) {
    // log the successful data response
    console.log(response.data);
    mydata = response.data;
  });
  // send the request
  request.
    on('success', function(response) {
      console.log("Success!");
    }).
    on('error', function(response) {
      console.log("Error!");
    }).
    on('complete', function() {
      console.log("Always!");
    }).
    send();
  return mydata;
}
Use the latest Fetch API (https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API) to make HTTP calls. It has built-in support for Promises.
fetch('http://localhost:3000/listbuckets').then(response => {
  // do something with the response here
}).catch(error => {
  // Error :(
})
I eventually got this working with:
const request = require('request');
// parseString here presumably comes from the xml2js module:
// const parseString = require('xml2js').parseString;
request(url, function (error, response, body) {
  if (!error && response.statusCode == 200) {
    parseString(body, function (err, result) {
      console.log(JSON.stringify(result));
    });
    // from within the callback, write data to response, essentially returning it.
    res.send(body);
  }
  else {
    // console.log(JSON.stringify(response));
  }
})
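For the server side, a rough sketch (not from the answers above, assuming the same s3 client and Express app as in the question) would be to respond inside the AWS callback instead of returning a value from MyFunction:
app.get('/listbuckets', function (req, res) {
  // the callback form of listBuckets; data is only available once AWS responds
  s3.listBuckets(function (err, data) {
    if (err) {
      return res.status(500).send(err.message);
    }
    res.json(data.Buckets);
  });
});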

Error: Can't set headers after they are sent Braintree

I am currently working on an admin panel for a website I am creating. I am able to accept payments via Braintree, but I need to implement the ability to retrieve a customer's transactions. Once a header is sent, it sends just one of them and not the whole thing. Is it possible to combine the JSON into an array so it will all send in the one response?
CODE:
router.get('/:cid/test', function(req, res) {
  var stream = gateway.transaction.search(function (search) {
    search.customerId().is(req.params.cid);
  }, function (err, response) {
    response.each(function (err, transaction) {
      return res.render('admin/test', {transaction: transaction});
    });
  });
});
This is solely following the Braintree documentation and I know exactly why the error occurs. Any help is really appreciated and I am terrible at explaining so if you need to know more information please give me a holler!
UPDATE: So, I figured I would explore another method and I noticed the 'response' gives back an array of ids. So I will just use EJS to loop through all those and then have a separate page for each transaction.
Disclaimer: I work for Braintree :)
As Robert noted, you can only call res.render (or any of the response methods that end the request) once per request (hence the error from express).
Unfortunately, you cannot treat response as an array, so you will need to use
one of the two documented ways of interacting with search responses. I personally prefer the streams approach because it is clearer:
app.get('/stream', function (req, res) {
var transactions = []
var transactionStream = gateway.transaction.search(function (search) {
search.customerId().is(req.params.cid);
})
transactionStream.on('data', function (transaction) {
transactions.push(transaction)
})
transactionStream.on('error', function () { /* handle errors */ })
transactionStream.on('end', function () {
res.json({transactions: transactions});
})
})
Alternately, you can use the ids property of response to know when the transactions array you build up in each is complete, and end the request at that point:
app.get('/callback', function (req, res) {
  var transactionStream = gateway.transaction.search(function (search) {
    search.customerId().is(req.params.cid);
  }, function (err, response) {
    var transactions = []
    response.each(function (err, transaction) {
      transactions.push(transaction)
      if (transactions.length === response.ids.length) {
        res.json({transactions: transactions});
      }
    })
  })
})
You can only render one response per route. So you can only call this once and not in a loop:
res.render('admin/test', {transaction: transaction});
You can use the each method to iterate through the response and build up a result:
var transactions = [];
response.each(function (err, transaction) { transactions.push(transaction) });
return res.render('admin/test', {transaction: transactions});
That would work if the each method is synchronous. If it's not (and Nick would know), use the solution above.

How to Store the Response of a GET Request In a Local Variable In Node JS

I know the way to make a GET request to a URL using the request module. Eventually, the code just prints the GET response within the command shell from where it has been spawned.
How do I store this GET response in a local variable so that I can use it elsewhere in the program?
This is the code I use:
var request = require("request");
request("http://www.stackoverflow.com", function(error, response, body) {
console.log(body);
});
The easiest way (but it has pitfalls--see below) is to move body into the scope of the module.
var request = require("request");
var body;
request("http://www.stackoverflow.com", function(error, response, data) {
body = data;
});
However, this may encourage errors. For example, you might be inclined to put console.log(body) right after the call to request().
var request = require("request");
var body;
request("http://www.stackoverflow.com", function(error, response, data) {
body = data;
});
console.log(body); // THIS WILL NOT WORK!
This will not work because request() is asynchronous, so it returns control before body is set in the callback.
You might be better served by creating body as an event emitter and subscribing to events.
var request = require("request");
var EventEmitter = require("events").EventEmitter;
var body = new EventEmitter();
request("http://www.stackoverflow.com", function(error, response, data) {
body.data = data;
body.emit('update');
});
body.on('update', function () {
console.log(body.data); // HOORAY! THIS WORKS!
});
Another option is to switch to using promises.
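For example, a small sketch of wrapping request in a promise so the body can be used wherever the promise resolves:
var request = require("request");

function getBody(url) {
  return new Promise(function (resolve, reject) {
    request(url, function (error, response, body) {
      if (error) return reject(error);
      resolve(body);
    });
  });
}

getBody("http://www.stackoverflow.com").then(function (body) {
  console.log(body); // the body is available here, once the request completes
});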

How does the Node.js event loop work?

After playing with Node.js and reading about async i/o & evented programming a lot I'm left with some question marks.
Consider the following (pseudo) code:
var http = require('http');

function onRequest(request, response)
{
  // some non-blocking db query
  query("SELECT name FROM users WHERE key=" + request.params['key'], function (err, results, fields) {
    if (err) {
      throw err;
    }
    username = results[0];
  });
  // some non-blocking db query
  query("SELECT name FROM events WHERE key=" + request.params['key'], function (err, results, fields) {
    if (err) {
      throw err;
    }
    event_name = results[0];
  });
  var body = renderView(username, event_name, template);
  response.writeHead(200, {'Content-Type': 'text/plain'});
  response.write(body);
  response.end();
};

http.createServer(onRequest).listen(8888);
// request A: http://127.0.0.1:8888/?key=A
// request B: http://127.0.0.1:8888/?key=B
(I think) I understand the basics of the event loop: with libev, Node.js creates an event loop that polls (epoll/kqueue/...) a bunch of file descriptors to see if any events are triggered (new connection, writable, data available, etc). If there is a new request, the event loop calls the anonymous function passed to createServer. What I don't understand is what happens after:
1) To run the queries concurrently the db driver has to have some kind of threading/connection pool, right?
2) In the scope of one request: What happens after sending two queries? renderView can't be called because the queries have not returned yet. How do we wait for the queries to return? Should it keep count of the callbacks pending to be fired before continuing? The basic thought I had was:
onRequest -> run async code -> wait for callbacks -> construct response. The waiting in this case would be blocking so you would effectively need to spawn a thread for each onRequest. How is the "waiting for callbacks to run before constructing response" done?
3) How does the db driver inform the event-loop that it's done and the callback it has for it needs to be called with the query results?
4) How does the event loop run the callback inside the anonymous function we created with the onRequest event? Is this where the closure concept comes in where the context is "saved" in the callback function?
5) Now that we have the db results, how do we continue executing the renderView/res.write/res.end parts?
Running parallel async code pattern:
To 'wait for the results from both async functions', you can have each async callback check whether both results are ready, and if they are, call your DoSomethingWithTwoDependantResults.
In your example you probably need to execute queries sequentially:
query(sql1, function(sqlres1) {
  query(sql2, function(sqlres2) {
    writeResultUsingDataFrom(sqlres1, sqlres2);
  });
});
Your original code, modified to execute the two queries in parallel:
function writeReply(res, template, username, event_name)
{
  var body = renderView(username, event_name, template);
  res.writeHead(200, {'Content-Type': 'text/plain'});
  res.write(body);
  res.end();
}

function onRequest(request, response)
{
  var username, event_name;
  // some non-blocking db query
  query("SELECT name FROM users WHERE key=" + request.params['key'], function (err, results, fields) {
    if (err) {
      throw err;
    }
    username = results[0];
    if (username && event_name)
      writeReply(response, template, username, event_name);
  });
  // some non-blocking db query
  query("SELECT name FROM events WHERE key=" + request.params['key'], function (err, results, fields) {
    if (err) {
      throw err;
    }
    event_name = results[0];
    if (username && event_name)
      writeReply(response, template, username, event_name);
  });
};
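A more modern alternative (not in the original answer) is to wrap each query in a promise and let Promise.all do the "both results are ready" bookkeeping. This sketch assumes query keeps its (sql, callback) signature from above:
function queryPromise(sql) {
  return new Promise(function (resolve, reject) {
    query(sql, function (err, results, fields) {
      if (err) return reject(err);
      resolve(results[0]);
    });
  });
}

function onRequest(request, response) {
  Promise.all([
    queryPromise("SELECT name FROM users WHERE key=" + request.params['key']),
    queryPromise("SELECT name FROM events WHERE key=" + request.params['key'])
  ]).then(function (results) {
    // results arrive in the same order as the promises above
    writeReply(response, template, results[0], results[1]);
  }).catch(function (err) {
    response.writeHead(500);
    response.end();
  });
}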
Have you seen this? I'm still getting the hang of it all and I can't answer your question in detail, but basically you're right about the thread-pool... Ryan explains quite a lot in the video.
EDIT: And this one from about a year later, when he goes into more detail.
