I am trying to learn Node.js, and in particular to understand streams and piping. Is it possible to pipe the response of an HTTP request to console.log? I know how to do this by binding a handler to the data event, but I am more interested in streaming it to the console.
http.get(url, function(response) {
  response.pipe(console.log);
  response.on('end', function() {
    console.log('finished');
  });
});
console.log is just a function that writes to the process.stdout stream, appending a newline. Note that the following is simplified example code, not the real implementation:
console.log = function(d) {
  process.stdout.write(d + '\n');
};
Piping to process.stdout does exactly the same thing.
http.get(url, function(response) {
  response.pipe(process.stdout);
  response.on('end', function() {
    console.log('finished');
  });
});
Note that you cannot simply call process.stdout.write(response); write() accepts strings and Buffers, not stream objects. You can, however, write the individual chunks yourself from a 'data' handler.
Related
I read the Node.js documentation, and it says that the only difference between these two functions is that http.get calls req.end automatically. But I ran into a weird problem. I wrote some code like this:
http.get(url, function(res) {
  var data = "";
  res.on('data', function(chunk) {
    data += chunk;
  });
  res.on('end', function() {
    console.log(data);
  });
}).on("error", function() {
});
Here, the data comes out fine. But when I use http.request, something goes wrong:
var pReq = http.request(options, function(pRes) {
  var data = " ";
  pRes.on('data', function(chunk) {
    data += chunk;
  });
  pRes.on('end', function() {
    console.log(data);
  });
}).on('error', function(e) {
});
Here, I always get garbled output. I'm new to Node; is there a mistake in the second snippet?
I am working on a parser in Node.js, so I request a website and parse the HTML.
I am working with require("htmlparser") and require('follow-redirects').http for the requests.
requestSite(options);
console.log("Done\n");
parser.done();

function requestSite(options) {
  http.get(options, function(res) {
    console.log("Got response: " + res.statusCode);
    res.setEncoding('utf8');
    res.on('data', function(chunk) {
      parser.parseChunk(chunk.toString('utf8'));
    });
  }).on('error', function(e) {
    console.log("Got error: " + e.message);
  });
}
My problem now is that done() is called before requestSite has actually finished receiving its chunks, resulting in the following error:
Writing to the handler after done() called is not allowed without
calling a reset()
How can I wait for the chunks to finish?
You are not taking into account the asynchronous nature of Node.js. It will call requestSite and then move on to execute the next statement, calling parser.done before requestSite has finished. Do this instead:
requestSite(options, parser);
console.log("Done\n");

function requestSite(options, parser) {
  http.get(options, function(res) {
    console.log("Got response: " + res.statusCode);
    res.setEncoding('utf8');
    res.on('data', function(chunk) {
      parser.parseChunk(chunk.toString('utf8'));
    })
    .on("end", function() {
      parser.done();
    });
  }).on('error', function(e) {
    console.log("Got error: " + e.message);
  });
}
Well, this is the basis of Node.js and its event-driven architecture. Node does not execute code line by line the way PHP, Python, etc. typically do. Look at this simple example:
console.log(1);
setTimeout(function() {
  console.log(2);
}, 0);
console.log(3);
You might think it should print 1, 2, 3, but it will actually print 1, 3, 2.
In your example, you should move the parser.done() call to the 'end' of the HTTP request. Currently you only have a handler for the chunks of data; simply add an 'end' handler and call parser.done() inside it.
I'm aware that there are several questions related to mine, but I didn't find any of them useful:
this one doesn't apply to my case: I'm actually getting the answer; it's the contents that I can't get.
in this one, on the other hand, the problem is incorrect handling of an asynchronous call, which is not my case
and that one, well, I really didn't fully understand the question
And so on...
So I think this is a legitimate question. I'm performing some encryption on my server (Express routing in Node) through a POST request:
app.post('/encrypt', encrypt);
Encrypt is doing:
function encrypt(req, res) {
  if (req.body.key && req.body.message) {
    var encryptedMessage = Encrypter.encrypt(req.body.key, req.body.message);
    return res.status(200).json({ message: encryptedMessage });
  }
  res.status(409).json({ message: 'the message could not be encrypted, no key found' });
}
So, I tested this via console.log, and it's working. When the server receives the request, the encrypted message is being generated.
At the same time, I'm testing my thing with mocha and I'm doing it like so:
describe('# Here is where the fun starts ', function () {
  /**
   * Start and stop the server
   */
  before(function () {
    server.listen(port);
  });

  after(function () {
    server.close();
  });

  it('Requesting an encrypted message', function(done) {
    var postData = querystring.stringify({
      key: key,
      message: message
    });

    var options = {
      hostname: hostname,
      port: port,
      path: '/encrypt',
      method: 'POST',
      headers: {
        'Content-Type': 'application/x-www-form-urlencoded',
        'Content-Length': postData.length
      }
    };

    var req = http.request(options, function(res) {
      res.statusCode.should.equal(200);
      var encryptedMessage = res.message;
      encryptedMessage.should.not.equal(message);
      done();
    });

    req.on('error', function(e) {
      // I'm aware should.fail doesn't work like this
      should.fail('problem with request: ' + e.message);
    });

    req.write(postData);
    req.end();
  });
});
So, whenever I execute the tests, it fails with Uncaught TypeError: Cannot read property 'should' of undefined, because res.message does not exist. None of the res.on handlers ('data', 'end', etc.) fire either, which is where I supposed the data should be available. First I had this:
var req = http.request(options, function(res) {
  res.statusCode.should.equal(200);
  var encryptedMessage;
  res.on('data', function(chunk) {
    console.log('BODY: ' + chunk);
    encryptedMessage = chunk.message;
  });
  encryptedMessage.should.not.equal(message);
  done();
});
But the res.on handler was never reached (the console.log never printed anything). I'm therefore a bit stuck. I'm surely doing something basic wrong, but I don't have a clue, and the many questions I found don't seem to apply to my case.
Weirdly enough, if I launch a test server and then curl it:
curl --data "key=secret&message=veryimportantstuffiabsolutellyneedtoprotect" localhost:2409/encrypt
curl just waits forever.
Actually, I was doing it properly at the beginning, and the problem was indeed the same as in the second question I mentioned: I was "clearing" my context with done() before the POST data arrived. The solution is:
var req = http.request(options, function(res) {
  res.statusCode.should.equal(200);
  res.on('data', function(data) {
    var encryptedMessage = JSON.parse(data).message;
    encryptedMessage.should.not.equal(message);
    done();
  });
});
This way, done() is only called once the data has been processed. Otherwise, mocha will not wait for the answer.
I'm trying to read a PDF from a URL and display it to a user's browser (via the passed in 'response' object). I've tried to use the code below and it works sometimes, but generally fails:
function writePdfToBrowser(url, response) {
  http.get(url, function(res) {
    logger.verbose('about to start download...');
    var chunks = [];
    res.on('data', function(chunk) {
      chunks.push(chunk);
    });
    res.on("end", function() {
      logger.verbose('downloaded');
      var buffer = new Buffer.concat(chunks);
      // write downloaded pdf to the original response
      response.write(buffer);
      //response.send(buffer);
      response.end();
    });
  }).on("error", function() {
    logger.error("error!");
  });
}
In the new page where I attempted to load the pdf it would just say "Failed to load pdf".
I'm new to Node, so not sure where the problem lies, any ideas? Anyone have any working code to do the same thing?
Thank you for any help!
Mark
Use piping:
function pipe(url, res) {
  var request = http.get(url, function(response) {
    res.writeHead(response.statusCode, response.headers);
    response.pipe(res);
  });
  request.on('error', function(error) {
    res.statusCode = 500;
    res.end(error.message);
  });
}
... and next time, please provide more information about what fails and how: some logs, an inspection of the response in the browser beforehand, and so on.
While working with the Facebook Graph API, I used https.get to request Facebook user data.
var optionsP = {
  host: 'graph.facebook.com',
  path: '/me?access_token=XXXX'
};

https.get(optionsP, function(resp) {
  resp.on('data', function(d) {
    console.log('ondata');
    console.log(d.length);
    process.stdout.write(d);
  });
}).on('error', function(e) {
  console.error(e);
});
But the response data arrives in two parts! First it prints up to 1034 characters, then the same callback fires again and prints the remaining 1347 characters. What is the reason for these partial responses?
That's normal. resp is a stream. It's a ClientResponse object, that implements the readable stream interface. Here are the docs: http://nodejs.org/api/http.html#http_http_clientresponse
You can either pipe the output somewhere that accepts streams, or store it in a buffer until you receive the 'end' event.
Here is an example that stores the data in a String in memory, until it has all arrived:
https.get(optionsP, function(resp) {
  resp.setEncoding('utf8'); // now each chunk arrives as a string
  var store = "";
  resp.on('data', function(d) {
    store += d;
  });
  resp.on('end', function() {
    console.log("this is all: " + store);
  });
}).on('error', function(e) {
  console.error(e);
});