Node.js - res.end and res.write can accept a callback function

When creating a basic HTTP server with Node.js, I noticed that the res.write and res.end methods of the 'http.ServerResponse' object can both accept a callback function like so:
require('http').createServer(function (req, res) {
    res.writeHead(200, {'Content-Type': 'text/plain'});
    res.write('Hello ', function() { console.log('here 1'); });
    res.end(' World', function() { console.log('here 2'); });
}).listen(1337, '127.0.0.1');
'Hello World' is output in the browser, and 'here 1' and 'here 2' are output in the terminal.
However, these callback arguments don't appear to be documented anywhere, for instance at http://nodejs.org/api/http.html#http_response_end_data_encoding, unless I'm missing something.
Can I really use these callback functions? I have an interesting use case for them. Or are they some internal use thing and should be avoided?

This appears to be a 'feature'. The second argument is actually meant to be the encoding used for the body, but the way the net module works, its second argument can also be an optional callback.
The call stack looks roughly like this:
res.write(data, encoding)
res._send(data, encoding)
res._writeRaw(data, encoding)
res.socket.write(data, encoding, cb)
At that last point the number of arguments changes from 2 to 3: from data and encoding to data, encoding, and an optional callback. So what is happening is that your function, passed in the encoding position, is being forwarded to socket.write, where the encoding argument is optional.
This could be considered a bug, since you cannot pass all three arguments through the response's write method. I'd advise using it with extreme care.
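To illustrate the mechanism, here is a minimal sketch (not Node's actual source) of how an API with an optional encoding argument typically detects a function in the encoding slot and treats it as the callback:
function write(data, encoding, callback) {
    // If the second argument is a function, treat it as the callback
    if (typeof encoding === 'function') {
        callback = encoding;
        encoding = undefined;
    }
    // ... write data to the underlying socket here ...
    if (callback) {
        process.nextTick(callback); // invoke the callback asynchronously
    }
}
write('Hello ', function () { console.log('here 1'); });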

Related

I keep trying to use require to load the Node fs module, and it keeps giving me an 'invalid callback' error. How do I rectify this?

Here is the source code:
console.log('Starting app.');
const fs = require('fs');
fs.appendFile('greetings.txt', 'Hello world!');
fs.appendFileSync('greetings.txt', 'Hello world!');
When I run the app in the terminal, it keeps giving me this error message.
fs.appendFile() is the asynchronous version of that interface, and it requires that the last argument be a callback that notifies you of completion or of an error.
See the doc.
fs.appendFile(path, data[, options], callback)
The callback is NOT optional.
The proper usage of that function would be this:
fs.appendFile('greetings.txt', 'Hello world!', err => {
    if (err) {
        console.log(err);
    } else {
        console.log("data appended successfully");
    }
});
Also, please note that this is asynchronous and non-blocking, so the callback will be called some indeterminate time later (when the append finishes), while the lines of code after this call execute immediately (before the callback runs).
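For example, a minimal sketch of that ordering:
fs.appendFile('greetings.txt', 'Hello world!', err => {
    // runs later, once the append has finished
    console.log("2: append finished");
});
// runs first, before the callback above
console.log("1: this line executes immediately");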
Another relevant interface is the promise version of the asynchronous interface:
fs.promises.appendFile(path, data[, options])
You do not pass a callback to this version. Instead, it returns a promise which you use to get notified of completion/error.
fs.promises.appendFile('greetings.txt', 'Hello world!').then(() => {
    console.log("data appended successfully");
}).catch(err => {
    console.log(err);
});
For asynchronous interfaces, the promise version is newer and generally considered the more modern approach.
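As a minimal sketch, the same call can also be written with async/await (assuming it runs inside an async function):
const fs = require('fs');

async function run() {
    try {
        await fs.promises.appendFile('greetings.txt', 'Hello world!');
        console.log("data appended successfully");
    } catch (err) {
        console.log(err);
    }
}
run();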

Wait for an async operation before executing render in Express using Node.js

I've started using Node.js recently, and I'm trying to create an API that will get some information from the web, compile it, and show it to the user.
My question is the following:
router.get('/', function (req, res, next) {
    https.get(pageUrl, function (res) {
        res.on('data', function (responseBuffer) {
            //Important info;
            info = responseBuffer;
        });
    });
    res.render('page', { important: info });
});
How can I wait until I have the "info" variable and only then call res.render? Because right now, if I try to wait, the program usually ends without waiting for the result.
Thanks.
Assuming your https.get call gives you a stream with an 'end' event [1], you can do the following:
router.get('/', function (req, res, next) {
    https.get(pageUrl, function (res) {
        var info;
        res.on('data', function (responseBuffer) {
            //Important info;
            info = responseBuffer;
        });
        res.on('end', function () {
            res.render('page', { important: info });
        });
    });
});
Note that the above code will not work because you shadowed the base res parameter with the res parameter from the https.get callback.
Also, note that the 'data' event may be emitted several times (again, assuming a standard stream implementation[1]), so you should accumulate the results inside your info variable.
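Putting both fixes together, a minimal sketch (assuming the standard https module and that pageUrl is defined) might look like this:
router.get('/', function (req, res, next) {
    https.get(pageUrl, function (upstream) { // renamed so it no longer shadows res
        var chunks = [];
        upstream.on('data', function (chunk) {
            chunks.push(chunk); // accumulate every chunk
        });
        upstream.on('end', function () {
            var info = Buffer.concat(chunks).toString();
            res.render('page', { important: info });
        });
    });
});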
[1] Could you please post more information about your code, such as where the https library comes from (is it the standard HTTPS lib?).
Personal thought: I highly suggest using the request module, available on NPM via npm install request, for HTTP(S) requests to external services. It has a neat interface, is simple to use, and handles a lot of situations for you (redirects are one example, JSON and Content-Type another).

Does Express.js support sending unbuffered progressively flushed responses?

Perl's Catalyst framework permits you to send a progressively flushed response over an open connection. You could, for instance, use write_fh() on Catalyst::Response. I've begun using Node.js, and I can't find how to do the equivalent.
If I want to send a big CSV file, on the order of 200 megs, is there a way to do that without buffering the whole CSV file in memory? Granted, the client will time out if you don't send data within a certain amount of time, so a promise would be nice -- but is there any way to do this?
When I try to do a res.send(text) in a callback, I get
Express
500 Error: This socket has been ended by the other party
And, it doesn't seem that Express.js supports an explicit socket.close() or anything of the ilk.
Here is an example:
exports.foo = function (res) {
    var query = client.query("SELECT * FROM naics.codes");
    query.on('row', function (row) {
        //console.log(row);
        res.write("GOT A ROW");
    });
    query.on('end', function () {
        res.end();
        client.end();
    });
};
I would expect that to send "GOT A ROW" for each row, until the call to client.end() signifies completion.
Express is built on the native HTTP module, which means res is an instance of http.ServerResponse, which inherits from the writable stream interface. That means you can do this:
app.get('/', function (req, res) {
    var stream = fs.createReadStream('./file.csv'); // assumes fs = require('fs')
    stream.pipe(res);
    // or, instead of pipe(), use event handlers:
    stream.on('data', function (data) {
        res.write(data);
    });
    stream.on('end', function () {
        res.end();
    });
});
The reason you can't use the res.send() method in Express with streams is that it ends the response automatically for you.
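As a minimal sketch of progressive flushing (the route path and the interval below are just illustrative placeholders), each res.write() call pushes a chunk to the client without buffering the whole body:
app.get('/export.csv', function (req, res) {
    res.writeHead(200, { 'Content-Type': 'text/csv' });
    var i = 0;
    var timer = setInterval(function () {
        res.write('row,' + i + '\n'); // each chunk is written to the socket as it is produced
        if (++i === 5) {
            clearInterval(timer);
            res.end(); // signal that the response is complete
        }
    }, 100);
});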

Chain parameters between async functions with Q in node.js

How can I chain the parameters that I need for both of my async functions?
The first function, fs.readFile, returns the content of the file as the second parameter of its callback.
The second function, marked, requires this content as its first parameter. The second parameter is optional and can be an options object. The third parameter is a callback that gives me the converted content as its second parameter.
Currently I've tried this code:
var readFile = q.nfbind(fs.readFile);
var md = q.nfbind(marked);

readFile(fileName, 'UTF8')
    .then(md)
    .then(function (html) {
        res.setHeader('Content-Type', 'text/html');
        res.setHeader('Content-Length', html.length);
        res.status(200);
        res.end(html);
    })
    .catch(function (error) {
        res.setHeader('Content-Type', 'text/plain');
        res.send(500, 'server error: ' + error);
        res.end();
    })
    .done();
But it doesn't work, because the marked function needs the second parameter when it is called with a callback function as the third parameter. How can I set the second parameter so that the marked function is called correctly?
If you simply replace the .then(md) line with .then(marked), then the result of calling fs.readFile (the value with which the promise was fulfilled) will be passed to marked.
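In other words, a minimal sketch of the suggested change (keeping the rest of the chain and the fs, q, marked, fileName, and res names from the question) would be:
var readFile = q.nfbind(fs.readFile);

readFile(fileName, 'UTF8')
    .then(marked) // call marked synchronously; it returns the converted HTML
    .then(function (html) {
        res.setHeader('Content-Type', 'text/html');
        res.end(html);
    })
    .catch(function (error) {
        res.setHeader('Content-Type', 'text/plain');
        res.send(500, 'server error: ' + error);
    })
    .done();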

response.write() not working in exec

I am following a Node.js tutorial and am making a simple server that will display the contents of a directory.
// assumes: var exec = require("child_process").exec;
function start(response) {
    console.log("Request handler 'start' was called.");
    // if I put response.write("blah") here it works
    console.log(response);
    exec("ls -lah", function (error, stdout, stderr) {
        console.log(response);
        response.writeHead(200, {"Content-Type": "text/plain"});
        response.write(stdout);
        response.write("asdfasdf");
        console.log("asdf");
        console.log(stdout);
        response.end();
    });
}
This prints nothing on the browser, but shows up on the console.
I tried putting response.write() outside of the exec callback function, and it shows up perfectly on the browser.
Firefox reports that the response headers are not being set at all, not even the Content-Type header. If I move that code outside the exec callback function, the headers do get set.
