nodejs - multiple async http requests - node.js

I have just started my journey with nodejs and would like to create a simple nodejs app that needs to:
- first fetch some initial data via HTTP,
- use the received JSON to make another set of requests (some can run in parallel, while others must execute first because their responses are needed to build valid URLs for the rest).
Given that nodejs is asynchronous and callback-based, I am wondering what the best way is to achieve this while keeping the code clean and not making a mess of it.
Thanks for any hints / guidelines, Mark

Maybe check out the Async library. It has a lot of built-in functionality that seems to accomplish what you're looking for. A couple of useful ones right off the bat might be async.waterfall and async.map.
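For instance, a minimal async.waterfall sketch (assuming `npm install async`; the URLs and the `id` field here are made up for illustration):
const async = require('async');
const https = require('https');

// small helper: GET a URL and parse the body as JSON
function getJSON(url, cb) {
  https.get(url, function(res) {
    let body = '';
    res.on('data', chunk => body += chunk);
    res.on('end', () => cb(null, JSON.parse(body)));
  }).on('error', cb);
}

async.waterfall([
  cb => getJSON('https://api.example.com/initial', cb), // first request
  // the first result is passed in and used to build the next URL
  (initial, cb) => getJSON('https://api.example.com/items/' + initial.id, cb)
], function(err, result) {
  if (err) return console.error(err);
  console.log(result); // final result of the chain
});
For the parallel part, async.map over an array of URLs with the same getJSON helper would fire the requests concurrently and collect the results in order.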

Agreed that this is subjective; in general, the way to go is promises. There are native promises:
Native Promise Docs - MDN
For your particular question, imo, the npm module request-promise offers some great solutions. It is essentially a 'promisified' version of the request module:
It will allow you to GET/POST/PUT/DELETE and follow up each request with a .then() where you can continue to make more calls, like so:
This code first GETs something from a server, then POSTs something else to that server.
function addUserToAccountName(url, accountName, username, password) {
  var options = assignUrl(url); // assignUrl is not in this code
  return request
    .get(options) // first get
    .auth(username, password)
    .then(function(res) {
      var id = parseId(res.data, accountName); // parse the id out of the response
      return id;
    })
    .then(function(id) {
      // build the POST options without mutating defaultSettings
      var postOptions = Object.assign({}, defaultSettings, {url: url + id + '/users'});
      return request.post(postOptions) // then make a post
        .auth(username, password);
    })
    .then(function(response) {
      //console.log(response);
    })
    .catch(function(err) {
      console.log(err.response.body.message);
    });
}
You can keep chaining .then() calls: whatever you return from the previous .then() is passed into the next one.
Request-Promise

Related

typescript fetch response streaming

I am trying to stream a response, but I want to be able to read the response (and work with the data) while it is still being sent. I basically want to send multiple messages in one response.
It works internally in Node.js, but when I tried to do the same thing in TypeScript it doesn't work anymore.
My attempt was to make the request via fetch in TypeScript, with the response coming from a Node.js server that writes parts of the response onto the response stream.
fetch('...', {
  ...
}).then((response) => {
  const reader = response.body.getReader();
  return reader.read().then(({done, value}) => {
    if (done) {
      return response;
    }
    console.log(String.fromCharCode.apply(null, value)); // just for testing purposes
  });
}).then(...)...
On the Node.js side it basically looks like this:
// doing stuff with the request
response.write(first_message)
// do some more stuff
response.write(second_message)
// do even more stuff
response.end(last_message)
In Node.js, like I said, I can just read each message once it's sent via res.on('data', ...), but in TypeScript reader.read() only resolves once, and that is when the whole response has been sent.
Is there a way to make it work the way I want, or do I have to look for another approach?
I hope it is kinda understandable what I want to do; I noticed while writing this how much I struggled to explain it :D
I found the problem, and as usual it was sitting in front of the PC.
I forgot to write a header first, before writing the response.
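For anyone who hits the same thing, a minimal sketch of the fix (assuming a plain Node http handler; the variable names are from the snippet above): send the status line and headers first, then stream the body.
// send status and headers up front, then stream the body in pieces
response.writeHead(200, { 'Content-Type': 'text/plain' });
response.write(first_message);
// do some more stuff
response.write(second_message);
response.end(last_message);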

make nodejs http get call in synchronous way

I was asked a question in an interview. Below is the question.
const JsonFromHTTPCall = function() {
  // make get request to some api url and return json object.
};
// code below is not editable
let result = JsonFromHTTPCall();
console.log("result ", result);
I am not finding a way to make the console.log statement wait until I get the result from the http call.
Please give me a way to solve it.
Thanks in advance.
Nodejs does not offer synchronous networking in any way. All built-in networking is asynchronous. Therefore, you cannot directly return a value from a function retrieved via networking. Instead, you need to communicate back the result either via a callback function, an event you trigger or a returned promise.
For a summary of this issue see this highly active question/answer:
How do I return the response from an asynchronous call?
There is a gross hack that involves using a synchronous child process and having it do the networking for you, but it's unlikely that is what they were asking for in your interview.
So, the main answer to the question is that "nodejs does not offer synchronous networking" and, further, "you cannot change an asynchronous result into a synchronous result". Therefore, the proper way to write this code is to use nodejs asynchronous coding techniques.
The cleanest way I know of to make http get calls is to use a library such as request-promise or my newer favorite got(), combining the promise interface with async/await to get a nice clean code path:
const got = require('got');

async function getSomeJSON(url) {
  let data = await got(url).json();
  console.log(data);
  return data;
}

getSomeJSON(myURL).then(data => {
  console.log("got my data");
}).catch(err => {
  console.log(err);
});

How to structure multiple HTTP requests in a server-side rendered React app?

I am currently using the server-side rendering logic below (reactjs + nodejs + redux) to fetch the data the first time the page loads and set it as the initial state of the store.
fetchInitialData.js
export function fetchInitialData(q, callback) {
  const url = 'http://....';
  axios.get(url)
    .then((response) => {
      callback(response.data);
    }).catch((error) => {
      console.log(error);
    });
}
I fetch the data asynchronously and load the output into the store via a callback the first time the page loads.
handleRender(req, res) {
  fetchInitialData(q, apiResult => {
    const data = apiResult;
    const results = {data, fetched: true, fetching: false, queryValue: q};
    const store = configureStore(results, reduxRouterMiddleware);
    ....
    const html = renderToString(component);
    res.status(200);
    res.send(html);
  });
}
I have a requirement to make 4 to 5 API calls on initial page load, so I wanted to check whether there is an easy way to make multiple calls on page load.
Do I need to chain the API calls and manually merge the responses from the different calls and send the result back to load the initial state?
Update 1:
I am thinking of using the axios.all approach; can someone let me know if that is an ideal approach?
You want to make sure that requests happen in parallel, and not in sequence.
I have solved this previously by creating a Promise for each API call and waiting for all of them with axios.all(). The code below is basically pseudocode; I could dig into a better implementation at a later time.
handleRender(req, res) {
  fetchInitialData()
    .then(initialResponse => {
      // axios.all() takes an array of promises
      return axios.all([
        buildFirstCallResponse(),
        buildSecondCallResponse(),
        buildThirdCallResponse()
      ]);
    })
    .then(allResponses => res.send(renderToString(component)))
    .catch(err => console.error(err));
}

buildFirstCallResponse() {
  return axios.get('https://jsonplaceholder.typicode.com/posts/1')
    .then(response => response.data) // axios puts the parsed body on .data
    .catch(err => console.error('Something went wrong in the first call', err));
}
Notice how all responses are bundled up into an array.
The Redux documentation goes into server-side rendering a bit, but you might already have seen that: redux.js.org/docs/recipes/ServerRendering
Also check out the Promise docs to see exactly what .all() does: developer.mozilla.org/en/docs/Web/JavaScript/Reference/Global_Objects/Promise/all
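For reference, axios.all is a thin wrapper around native Promise.all, so the parallel part can also be written directly against it (the URLs here are made up for illustration):
const axios = require('axios');

Promise.all([
  axios.get('/api/a'),
  axios.get('/api/b')
]).then(([a, b]) => {
  // results arrive in the same order as the promises were passed in
  console.log(a.data, b.data);
}).catch(err => console.error(err));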
Let me know if anything is unclear.
You could try express-batch; GraphQL is another option.
You could also use redux-saga to trigger multiple API calls with plain Redux actions and handle all of those calls in one place. Introduction to Sagas
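For illustration, a minimal redux-saga sketch (assumes the redux-saga package; the action types and endpoints here are hypothetical):
import { all, call, put, takeEvery } from 'redux-saga/effects';
import axios from 'axios';

function* loadInitialData() {
  try {
    // yield all([...]) runs the calls in parallel, like Promise.all
    const [posts, users] = yield all([
      call(axios.get, '/api/posts'), // hypothetical endpoints
      call(axios.get, '/api/users')
    ]);
    yield put({ type: 'INITIAL_DATA_LOADED', posts: posts.data, users: users.data });
  } catch (err) {
    yield put({ type: 'INITIAL_DATA_FAILED', error: err.message });
  }
}

export function* rootSaga() {
  yield takeEvery('LOAD_INITIAL_DATA', loadInitialData);
}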

node async call return data in response

I am new to nodejs, so I have a basic question, and this is my scenario:
I have a javascript client which makes an http request to a node server to read a value from the database.
Once the node server receives the request it makes a simple db call and returns the data to the client in the response, and this is where the problem is.
router.get('/state', function(req, res) {
  var result = dbServer.makeDBCall(); // before this call returns the result, the next line executes
  res.send(result);
});
The database call from the node server is asynchronous, so by the time the result is returned the node server has already sent a blank response to the client. What is the standard/acceptable way to achieve this? I know I could block the node thread until the call finishes, but then the whole purpose of node is gone, right?
It depends on what kind of database node module you are using.
Other than the standard callback approach, there is also the promise way. The pg-promise library is one of that kind.
See sample code:
this.databaseConnection.makeDBCall('your query...')
  .then(function(dbResponse) {
    // Parse the response to the format you want, then...
    res.send(result);
  })
  .catch(function(error) {
    // Handle error
    res.send(error.message);
  });
@spdev: I saw one of your comments about being worried about how Node actually knows which caller to send the response to, especially when there are multiple requests.
This is a very good question, and to be honest with you, I don't know all the internals either.
In short the answer is yes, Node handles this: it creates a corresponding ServerResponse object for every HTTP request that comes through, and that object is tied to the socket the request arrived on, which is how the reply finds its way back to the right caller.
I tried Googling a bit for a deeper answer but didn't get too far. I hope the ServerResponse documentation can provide more insight for you. Share with me if you get an answer, thanks!
https://nodejs.org/api/all.html#http_class_http_serverresponse
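A small sketch of why this works: each connection gets its own req/res pair, and the async callback closes over that specific res, so the reply always goes back to the request it belongs to.
const http = require('http');

http.createServer(function(req, res) {
  // simulate an async db call; `res` is captured by the closure
  setTimeout(function() {
    res.end('reply for ' + req.url); // still this request's response object
  }, 100);
}).listen(3000);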
Try the code below.
router.get('/state', function(req, res) {
  dbServer.makeDBCall(function(err, result) {
    if (!err) {
      res.send(result);
    }
  });
});
Hope this helps.
The dbServer.makeDBCall() must take a callback that runs when the statement completes executing.
Something like -
dbServer.makeDBCall({query: 'args'}, function(err, result) {
  if (err) {
    // handle error
    return;
  }
  res.send(result);
});
You return the response from the db inside that callback function.
Learn more about callbacks here:
nodeJs callbacks simple example
https://docs.nodejitsu.com/articles/getting-started/control-flow/what-are-callbacks/

Stateful interaction for testing Express Apps

I wrote a simple JSON api with express and I'm trying to use mocha to do some black-box testing. Thoroughly testing the API requires authenticating as different users, so each test for a specific feature is made of at least two requests: a login operation and one or more authenticated requests that test the actual feature.
I haven't found any library similar to django.test.client to simulate stateful interaction between an HTTP client and a server. Supertest seems to be popular, but it is very low-level compared to the django test client. This is how I would write a simple authenticated test with it (pardon my coffeescript):
it 'should return a 200 OK', (done) ->
  supertest(server.app)
    .post('/login')
    .send("username=xxx&password=pass")
    .end (err, res) ->
      res.should.have.status(200)
      supertest(server.app)
        .get('/api/users')
        .set('cookie', res.headers['set-cookie'][0])
        .expect(200, done)
Is this really the cleanest way to express this interaction? Is there any library that would help me with asynchronicity (it's not like I'm going to need anything but plain serialization of the tests in 99% of cases; callbacks are just confusing) and statefulness? Something that would go like this:
it 'should rock', (done) -> myCoolLibrary [
  ->
    @post '/login', {username: "xxx", password: "pass"}, (err, res) =>
      res.should.have.status 200
      @done()
,
  ->
    @get '/api/users', (err, res) =>
      res.should.have.status 200
      @done()
]
If nothing similar exists, I should write it myself :-)
The reliance on the context is because I am using too much ZappaJS these days, and thanks to CoffeeScript's fat arrow, it's not a bad idiom at all.
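For illustration, a hypothetical, minimal sketch of such a runner in plain JS (the helper names are invented here, not from the asker's gist; assumes the request module for its shared cookie jar, and absolute URLs or a base-URL prefix in the helpers):
var request = require('request');

function myCoolLibrary(steps) {
  var jar = request.jar(); // one shared cookie jar = one stateful session
  var i = 0;
  var ctx = {
    post: function(url, form, cb) {
      request.post({url: url, form: form, jar: jar}, cb);
    },
    get: function(url, cb) {
      request.get({url: url, jar: jar}, cb);
    },
    done: function() {
      if (i < steps.length) steps[i++].call(ctx); // run the next step serially
    }
  };
  ctx.done(); // kick off the first step
}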
It sounds like you could benefit from zombiejs. It simulates a browser and keeps cookies and session data between requests.
It also gives you more powerful features such as allowing you to fill out forms and submit them, for example.
A typical test would look something like this:
var Browser = require('zombie')
  , browser = new Browser({site: 'http://yoursite.com'})
  , assert = require('assert');

describe('page', function() {
  before(function(done) {
    browser.visit('/loginpage', done);
  });
  it('should return a 200 page', function(done) {
    browser.fill('username', 'xxx');
    browser.fill('password', 'pass');
    // assuming your form points to /login
    browser.pressButton('button[type="submit"]').then(function() {
      assert(browser.success); // status code is 2xx
    }).then(done, done); // call the done handler after the promise is fulfilled
  });
  it('should rock', function(done) {
    browser.visit('/api/users').then(function() {
      assert(browser.success);
    }).then(done, done);
  });
});
As a more general solution for making async code cleaner to read, check out async: https://github.com/caolan/async
async.series would do just what you need, but I would particularly recommend async.auto, which allows you to link together various steps with their dependencies in a clear way.
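For instance, a small async.auto sketch of the login-then-request flow above (assuming async v2+; the task bodies are stubs for illustration):
const async = require('async');

async.auto({
  login: function(cb) {
    // e.g. POST /login here and pass the session cookie along
    cb(null, 'session-cookie');
  },
  users: ['login', function(results, cb) {
    // runs only after `login`; results.login is the value from above
    cb(null, ['alice', 'bob']);
  }]
}, function(err, results) {
  if (err) return console.error(err);
  console.log(results.users); // each task's result is keyed by its name
});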
I ended up writing myself a small library that is pretty close to my "ideal" example in the question. It doesn't deserve its own package for now, so I just put it in a gist:
https://gist.github.com/BruceBerry/5485917
I could not get superagent and supertest to perform stateful interaction, so I just ditched them in favor of request. The main difference seems to be that you can't chain expects and you have to do all the assertions in the callback, but those look odd anyway if you are already using another assertion library such as should.js.
