When I make an HTTP request, I need to concatenate the response:
request.on('response', function (response) {
    var body = '';
    response.on('data', function (chunk) {
        body += chunk;
    });
    ...
Why was that implemented this way? Why not output the whole result?
What you're getting back is a stream, which is a very handy construct in node.js. Required reading: https://github.com/substack/stream-handbook
If you want to wait until you've received the whole response, you can do this very easily:
var concat = require('concat-stream');
request.on('response', function (response) {
    response.pipe(concat(function (body) {
        console.log(body);
    }));
});
Node uses only a single process, with no threads. This means that if you spend a lot of time doing one thing, you can't process anything else, such as other client requests.
For that reason, when you are coding in Node, you need to write your code in an asynchronous way.
In this scenario, the request could be slow, and the program would otherwise sit there waiting for it and doing nothing.
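As a contrived sketch of what that means (the /block route and port 3000 below are made up purely for illustration), a long synchronous loop in one handler stalls every other client until it finishes:
var http = require('http');

http.createServer(function (req, res) {
    if (req.url === '/block') {
        // a long synchronous loop: the single thread is busy,
        // so no other request can be served until it finishes
        var sum = 0;
        for (var i = 0; i < 1e9; i++) sum += i;
        res.end('done: ' + sum);
    } else {
        res.end('hello'); // normally fast, but stuck behind /block while it runs
    }
}).listen(3000);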
I found this:
Why is node.js asynchronous?
And this is interesting as well:
http://en.wikipedia.org/wiki/Read%E2%80%93eval%E2%80%93print_loop
So I'm fairly new to node js, and am having trouble wrapping my head around asynchronous programming. I'm trying to get a JSON from a website and pass it to a variable for use later, to test I have been using this code:
var https = require("https");
var a;
function getter(url) {
    var request = https.get(url, function (response) {
        var body = "";
        response.on("data", function (chunk) {
            body += chunk;
        });
        response.on("end", function () {
            if (response.statusCode === 200) {
                try {
                    a = JSON.parse(body);
                } catch (err) {
                    console.log(err);
                }
            }
        });
    });
}
getter('https://api.nasa.gov/planetary/apod?api_key=DEMO_KEY');
console.log(a);
When I run this I get a as undefined, which seems to make sense from what I've read. But I'm unclear as to what to do from here. How would I go about passing this JSON into a variable?
http.get is asynchronous and executes the event handlers when the events occur. When you call getter(), the function returns immediately; it does not wait for the events, and the next statement, console.log(a), is executed.
Furthermore, JavaScript is single-threaded, and the current execution stack is never interrupted for any event/callback whatsoever. So the event handlers can only run once the current execution has come to an end, i.e. has no more statements to run. Thus, your console.log() will always be executed before any event handler of the request, so a is still undefined.
If you want to continue after the request has finished, you have to do it from the event handler.
See this excellent presentation for some more details https://youtu.be/8aGhZQkoFbQ
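As a minimal sketch of that idea, the getter from the question can be rewritten to take a callback (the callback parameter and the error handling are additions for illustration), and the console.log moves inside it:
var https = require("https");

function getter(url, callback) {
    https.get(url, function (response) {
        var body = "";
        response.on("data", function (chunk) {
            body += chunk;
        });
        response.on("end", function () {
            if (response.statusCode !== 200) {
                return callback(new Error("HTTP " + response.statusCode));
            }
            try {
                callback(null, JSON.parse(body)); // hand the parsed JSON to the caller
            } catch (err) {
                callback(err);
            }
        });
    });
}

getter('https://api.nasa.gov/planetary/apod?api_key=DEMO_KEY', function (err, a) {
    if (err) return console.log(err);
    console.log(a); // a is defined here, because the response has fully arrived
});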
An HTTP request is returning an incomplete string.
https.get(url, function (res) {
    res.on('data', function (data) {
        translationData = data.toString();
        resolve(translationData);
    });
});
I can't get more than 500 characters.
I suppose my code is vague, but what could cause this problem?
I've tried a lot of approaches but all of them failed.
I found something similar in How to display long messages in logcat, but nothing comparable for Node.js.
The response object you get from http.get is a Stream.
The 'data' event handler is called whenever a chunk of data is received. You need to handle all of the 'data' events and collect their payload until you get an 'end' event in order to get the entire response.
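As a sketch, here is that pattern applied to the code in the question, keeping its url, translationData and resolve names and only resolving once 'end' fires:
https.get(url, function (res) {
    var translationData = '';
    res.on('data', function (chunk) {
        translationData += chunk; // collect every chunk
    });
    res.on('end', function () {
        resolve(translationData); // the full body has arrived
    });
});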
A simple way of doing this is using the concat-stream module.
var concat = require('concat-stream');
https.get(url, function (res) {
    res.pipe(concat(function (data) {
        // data is the entire response
    }));
});
To learn more about streams, read substack's stream handbook
I'm trying to implement a simple HTTP endpoint for an application written in node.js. I've created the HTTP server, but now I'm stuck on reading the request content body:
http.createServer(function (r, s) {
    console.log(r.method, r.url, r.headers);
    console.log(r.read());
    s.write("OK");
    s.end();
}).listen(42646);
Request's method, URL and headers are printed correctly, but r.read() is always null. I can say it's not a problem with how the request is made, because the content-length header is greater than zero on the server side.
The documentation says r is an http.IncomingMessage object that implements the Readable Stream interface, so why isn't it working?
The 'readable' event approach (in the other answer below) is problematic: it incorrectly appends the string "null" to the end of the body, because the final r.read() returns null.
Processing the stream in chunks with the 'data' event instead:
http.createServer((r, s) => {
    console.log(r.method, r.url, r.headers);
    let body = '';
    r.on('data', (chunk) => {
        body += chunk;
    });
    r.on('end', () => {
        console.log(body);
        s.write('OK');
        s.end();
    });
}).listen(42646);
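If it helps to see it end to end, here is a small client sketch that exercises the handler above (it assumes the server is running locally on the same port, 42646):
var http = require('http');

var req = http.request({
    hostname: 'localhost',
    port: 42646,
    method: 'POST',
    path: '/'
}, function (res) {
    res.on('data', function (chunk) {
        console.log('server replied: ' + chunk); // "OK"
    });
});

req.write('hello ');
req.end('body'); // the handler above logs "hello body"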
Ok, I think I've found the solution. The r stream (like everything else in node.js, stupid me...) should be read in an async event-driven way:
http.createServer(function (r, s) {
    console.log(r.method, r.url, r.headers);
    var body = "";
    r.on('readable', function () {
        body += r.read();
    });
    r.on('end', function () {
        console.log(body);
        s.write("OK");
        s.end();
    });
}).listen(42646);
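As noted further up, the final r.read() in this version returns null, and += coerces it into the literal string "null" at the end of body. A sketch of the usual guard, if you want to stay with 'readable':
r.on('readable', function () {
    var chunk;
    while ((chunk = r.read()) !== null) {
        body += chunk; // append only real chunks, never the trailing null
    }
});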
I'm trying to do an HTTP POST using the request module from a Node server to another server.
My code looks something like this:
var req = request.post({url: "http://foo.com/bar", headers: myHeaders});
...
...
req.write("Hello");
...
...
req.end("World");
I expect the body of the request to be "Hello World" on the receiving end, but what I end up with is just "".
What am I missing here?
Note: The ellipsis in the code indicates that the write and the end might be executed in different process ticks.
It looks to me as if you are mixing up the request module's Request with http.ClientRequest/http.ServerRequest.
If you want to make a POST to a server with request, what you want to do is something like:
request({ method:"post", url: "server.com", body:"Hello World"}, callback);
As 3on pointed out, the correct syntax for a POST request is:
request({ method:"post", url: "server.com", body:"Hello World"}, callback);
You also have a convenience method:
request.post({ url: "server.com", body:"Hello World"}, callback);
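Either form takes the request module's usual (error, response, body) callback; a quick sketch of what that looks like:
request.post({ url: "server.com", body: "Hello World" }, function (error, response, body) {
    if (error) return console.error(error);
    console.log(response.statusCode, body); // body is the server's reply as a string
});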
But from your question it seems like you want to stream:
var request = require('request');
var fs = require('fs');
var stream = fs.createWriteStream('file');
stream.write('Hello');
stream.write('World');
stream.end(); // finish writing before reading the file back
fs.createReadStream('file').pipe(request.post('http://server.com'));
Update:
You may break the chunks you write to the stream in any way you like, as long as you have the RAM (4 MB is peanuts, but keep in mind that V8, the JavaScript engine behind Node, has a heap allocation limit of about 1.4 GB, I think).
You may see how much you "wrote" to the pipe with stream.bytesWritten, where var stream = fs.createWriteStream('file') as in the piece of code above. I don't think you can know how much the other end of the pipe has received, but bytesWritten should give you a pretty decent approximation.
You can listen to the data and end events of both stream and request.post('http://server.com')
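A rough sketch of that, reusing the names from the streaming example above (pipe() returns its destination, so post here is the request object, and its 'data'/'end' events carry the server's response):
var post = fs.createReadStream('file').pipe(request.post('http://server.com'));

post.on('data', function (chunk) {
    // a chunk of the server's response as it streams back
});
post.on('end', function () {
    console.log('sent ' + stream.bytesWritten + ' bytes, response complete');
});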
I managed to make the code written in the question here valid and work as expected by modifying the request module a bit.
I noticed a block of code in request's main.js in the Request.prototype.init function (at line 356),
process.nextTick(function () {
  if (self._aborted) return
  if (self.body) {
    if (Array.isArray(self.body)) {
      self.body.forEach(function (part) {
        self.write(part)
      })
    } else {
      self.write(self.body)
    }
    self.end()
  } else if (self.requestBodyStream) {
    console.warn("options.requestBodyStream is deprecated, please pass the request object to stream.pipe.")
    self.requestBodyStream.pipe(self)
  } else if (!self.src) {
    if (self.method !== 'GET' && typeof self.method !== 'undefined') {
      self.headers['content-length'] = 0;
    }
    self.end();
  }
  self.ntick = true
})
I'm now overriding this function call by adding a new option (endOnTick) while creating the request. My changes: Comparing mikeal/master with GotEmB/master.
Sorry if this question is simple, but I have been using Node.js for only a few days.
Basically, I receive a JSON with some entries. I loop over these entries and launch an HTTP request for each of them. Something like this:
for (var i in entries) {
    // Lots of stuff
    http.get(options, function (res) {
        // Parse the response and detect whether it was successful
    });
}
How can I detect when all the requests are done? I need this in order to call response.end().
Also, I will need to report whether each entry succeeded or not. Should I use a global variable to save the result of each entry?
You can, for example, use Caolan's "async" library:
async.map(entries, function (entry, cb) {
    http.get(options, function (res) {
        // call cb here; the first argument is the error (or null), the second one is the result
    });
}, function (err, res) {
    // this gets called when all requests are complete
    // res is an array with the results
});
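A sketch of that wired back into the original question, assuming entries is the parsed array, options is built per entry as in the loop above, and response is the response object of the surrounding handler:
var async = require('async');
var http = require('http');

async.map(entries, function (entry, cb) {
    // build the per-entry options here, as in the "Lots of stuff" section
    http.get(options, function (res) {
        cb(null, res.statusCode === 200); // success flag for this entry
    }).on('error', function () {
        cb(null, false); // count a failed request as unsuccessful instead of aborting the whole map
    });
}, function (err, results) {
    // results[i] says whether entries[i] succeeded
    response.end(JSON.stringify(results));
});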
There are many different libraries for that. I prefer the q and qq futures libraries to async, as async leads to forests of callbacks in complex scenarios. Yet another library is Step.
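For completeness, a rough sketch of the same fan-out with q (again assuming entries is an array and that options and response come from the question):
var Q = require('q');
var http = require('http');

Q.all(entries.map(function (entry) {
    var deferred = Q.defer();
    http.get(options, function (res) {
        deferred.resolve(res.statusCode === 200);
    }).on('error', deferred.reject);
    return deferred.promise;
})).then(function (results) {
    // every request has finished; results holds one flag per entry
    response.end(JSON.stringify(results));
}, function (err) {
    response.end('at least one request failed: ' + err.message);
});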