nodejs web server and simultaneous file append - node.js

I've read about the Node.js event loop, but here is what I don't understand well:
I created a simple HTTP server that logs the whole request POST data to a file.
I used Apache ab to flood it, posting a 700 KB file with each request.
I expected the requests to interleave, each one writing a few chunks per tick of the event loop, but instead the full POST data of each request appears in the file contiguously, and I cannot understand why.
I'm using something like this:
var stream = require('fs').createWriteStream('path/to/log.file', {flags: 'a'});

var log = function (data) {
    return stream.write(data);
};

require('http').createServer(function (req, res) {
    // or req.pipe(stream)
    req.on('data', function (chunk) {
        log(chunk.toString() + "\r\n");
    });
    req.on('end', function () {
        res.end("ok");
    });
}).listen(8000);
Sorry for my bad English :)

I edited the code to output the chunk size and to write an easily identifiable word, 'SQUIRREL', that can be searched for in the log file. I also sent the image file using curl instead of Apache ab, mainly because I don't have ab set up.
If you look at the output of the HTTP server in the terminal where you are running it, you will see the size of each chunk as it is processed, which in my case was 65536 (as you alluded to in your comments we should see). In a text search, I see the word SQUIRREL once for each chunk that was processed.
There is nothing wrong with your code; hopefully making these changes will let you see what you are expecting to see.
var stream = require('fs').createWriteStream('./logfile.txt', {flags: 'a'});

var log = function (data) {
    return stream.write(data);
};

require('http').createServer(function (req, res) {
    // or req.pipe(stream)
    req.on('data', function (chunk) {
        log(chunk.toString() + "SQUIRREL ");
        console.log(chunk.length);
    });
    req.on('end', function () {
        res.end("ok");
    });
}).listen(8000);
curl --data-binary "@IMAG0152.jpg" http://localhost:8000
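For reference, the Apache ab flood from the question could be reproduced with something along these lines (the request counts and the file name are only illustrative):

ab -n 100 -c 10 -p IMAG0152.jpg -T application/octet-stream http://localhost:8000/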

Thanks to everyone who tried to help. I found the main issue: stream.write() was returning false. I added the following code to use the drain event together with req.pause() and req.resume(), and the problem was solved:
var ok = stream.write(chunk);
if (!ok) {
    // the chunk is already buffered by the write above, so don't write it again;
    // just pause the request until the file stream has drained
    req.pause();
    stream.once('drain', function () {
        req.resume();
    });
}
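For what it's worth, the // or req.pipe(stream) line commented out in the original server does exactly this pause/resume bookkeeping for you: pipe() pauses the source whenever the destination's write() returns false and resumes it on 'drain'. A minimal sketch of the server written that way (the log path is illustrative):

var fs = require('fs');
var http = require('http');

var stream = fs.createWriteStream('path/to/log.file', {flags: 'a'});

http.createServer(function (req, res) {
    // { end: false } keeps the shared log stream open across requests
    req.pipe(stream, { end: false });
    req.on('end', function () {
        res.end("ok");
    });
}).listen(8000);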

Related

Why does nodejs return chunked data in a request?

When I make an HTTP request, I need to concatenate the response:
request.on('response', function (response) {
    var body = '';
    response.on('data', function (chunk) {
        body += chunk;
    });
    ...
Why was that implemented this way? Why not output the whole result?
What you're getting back is a stream, which is a very handy construct in node.js. Required reading: https://github.com/substack/stream-handbook
If you want to wait until you've received the whole response, you can do this very easily:
var concat = require('concat-stream');

request.on('response', function (response) {
    response.pipe(concat(function (body) {
        console.log(body);
    }));
});
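If you would rather not pull in a dependency, the same buffering can be done by hand with the built-in 'data' and 'end' events; a small sketch:

request.on('response', function (response) {
    var chunks = [];
    response.on('data', function (chunk) {
        chunks.push(chunk);
    });
    response.on('end', function () {
        // concatenate the raw Buffers once, then convert to a string
        var body = Buffer.concat(chunks);
        console.log(body.toString());
    });
});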
Node runs your JavaScript in a single process, on a single thread. This means that if you spend a lot of time doing one thing synchronously, you can't process anything else in the meantime, for example other client requests.
For that reason, when you are coding in Node you need to think asynchronously.
In this scenario the request could be slow, and the program would otherwise just sit there waiting for it, doing nothing.
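A tiny illustration of the difference (the 5-second delay is just for the example): a synchronous busy-wait blocks every other request, while an asynchronous timer hands control back to the event loop.

// Blocking: nothing else runs in this process for ~5 seconds
var end = Date.now() + 5000;
while (Date.now() < end) { /* busy wait */ }

// Non-blocking: other requests keep being served while we wait
setTimeout(function () {
    console.log('done after ~5 seconds');
}, 5000);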
I found this:
Why is node.js asynchronous?
And this is interesting as well:
http://en.wikipedia.org/wiki/Read%E2%80%93eval%E2%80%93print_loop

NodeJS: convert pngjs stream to base64

I have a PNG object which I created using node-png, and according to the docs it's a "readable and writable Stream".
I would like to convert the PNG object to base64 and send it to the client via socket.io, where I'll place the string in an image src.
I've tried many things, but it seems it's not trivial to convert a stream into a string.
Note that the data is created inside Node and does not come from the file system.
How can I achieve this?
Here is what I did for future readers (this helped too):
png.pack();

var chunks = [];

png.on('data', function (chunk) {
    chunks.push(chunk);
    console.log('chunk:', chunk.length);
});

png.on('end', function () {
    var result = Buffer.concat(chunks);
    console.log('final result:', result.length);
    io.sockets.emit('image', result.toString('base64'));
});
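On the receiving end, the emitted base64 string can be dropped straight into an image src as a data URI. A minimal client-side sketch (the 'image' event name matches the emit above; the element id is made up):

socket.on('image', function (base64) {
    // data URI built from the base64 payload sent by the server
    document.getElementById('preview').src = 'data:image/png;base64,' + base64;
});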
You do not want to convert the stream itself into a string, but each readable chunk of it:
stream.on('readable', function () {
    var string = stream.read().toString('base64');
    // send through websocket
});
You can also do this for the complete data running through the stream (note that concatenating per-chunk base64 output like this only yields a single valid base64 string if every chunk's length is a multiple of 3; otherwise collect Buffers and encode once at the end, as in the answer above):
var data = '';

stream.on('readable', function () {
    data += stream.read().toString('base64');
});

stream.on('end', function () {
    console.log(data);
});
It depends on whether the client needs the complete PNG picture to be available, or whether single chunks are okay.
However, if you are interested in how this can look in a practical example (with the image being uploaded via HTML5 drag and drop), you can check out messenger.

HTTP request stream not firing readable when reading fixed sizes

I am trying to work with the new Streams API in Node.js, but I'm having trouble when specifying a fixed read buffer size.
var http = require('http');

var req = http.get('http://143.226.75.100/waug_mp3_128k', function (res) {
    res.on('readable', function () {
        var receiveBuffer = res.read(1024);
        console.log(receiveBuffer.length);
    });
});
This code will receive a few buffers and then exit. However, if I add this line after the console.log() line:
res.read(0);
... all is well again. My program continues to stream as predicted.
Why is this happening? How can I fix it?
It's explained here.
As far as I understand it, by reading only 1024 bytes with each readable event, Node is left to assume that you're not interested in the rest of the data that's in the stream buffers, and discards it. Issuing the read(0) (in the same event loop iteration) 'resets' this behaviour. I'm not sure why the process exits after reading a couple of 1024-byte buffers though; I can recreate it, but I don't understand it yet :)
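If you really do need the fixed-size reads, the workaround from the question can be kept in one handler, roughly like this (a sketch; the null check is there because read(size) returns null when fewer than size bytes are buffered):

res.on('readable', function () {
    var receiveBuffer = res.read(1024);
    if (receiveBuffer !== null) {
        console.log(receiveBuffer.length);
    }
    res.read(0); // nudge the stream so it keeps emitting 'readable'
});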
If you don't have a specific reason to use the 1024-byte reads, just read the entire buffer for each event:
var receiveBuffer = res.read();
Or, instead of using non-flowing mode, use flowing mode with the data/end events:
var http = require('http');

var req = http.get('http://143.226.75.100/waug_mp3_128k', function (res) {
    var chunks = [];

    res.on('data', function (chunk) {
        chunks.push(chunk);
        console.log('chunk:', chunk.length);
    });

    res.on('end', function () {
        var result = Buffer.concat(chunks);
        console.log('final result:', result.length);
    });
});

Abort ReadStream on client connection close

I am trying to send a (huge) file with a limited amount of data passing every second (using TooTallNate/node-throttle):
var fs = require('fs');
var Throttle = require('throttle');

var throttle = new Throttle(64);

throttle.on('data', function (data) {
    console.log('send', data.length);
    res.write(data);
});

throttle.on('end', function () {
    console.log('error', arguments);
    res.end();
});

var stream = fs.createReadStream(filePath).pipe(throttle);
If I cancel the download in the client's browser, the stream just continues until it has been completely transferred.
I also tested the scenario above with npm node-throttled-stream, with the same behaviour.
How can I cancel the stream if the browser closes its request?
Edit:
I am able to catch the connection's close event by using
req.connection.on('close', function () {});
But the stream has neither a destroy nor an end or stop property which I could use to stop it from reading any further.
It does provide a pause property (Doc), but I would rather stop Node from reading the whole file than merely stop receiving its contents (as described in the doc).
I ended up using the following dirty workaround:
var aborted = false;

stream.on('data', function (chunk) {
    if (aborted) return res.end();
    // stream contents
});

req.connection.on('close', function () {
    aborted = true;
    res.end();
});
As mentioned above, this isn't really a nice solution, but it works.
Any other solution would be highly appreciated!
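A less hacky variant, sketched under the assumption that the fs read stream is kept in its own variable: fs.ReadStream has a destroy() method (long present, though undocumented in older releases) that stops reading and closes the file descriptor, and unpipe() detaches the throttle so no more data flows.

var fs = require('fs');
var Throttle = require('throttle');

var readStream = fs.createReadStream(filePath);
var throttle = new Throttle(64);

readStream.pipe(throttle);

throttle.on('data', function (data) {
    res.write(data);
});
throttle.on('end', function () {
    res.end();
});

req.connection.on('close', function () {
    readStream.unpipe(throttle); // stop feeding the throttle
    readStream.destroy();        // stop reading the file and close the descriptor
    res.end();
});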

Having trouble understanding node.js listeners

I'm working on two node.js tutorials at the moment, and while I understand what is going on within each tutorial, I clearly don't understand what's going on all that well.
The following code listens for "data" events and adds new chunks of data to a variable named postData. Another listener sends this data, along with other things, to my route.js file.
request.addListener("data", function (postDataChunk) {
    postData += postDataChunk;
    console.log("Received POST data chunk '" + postDataChunk + "'.");
});

request.addListener("end", function () {
    route(handle, pathname, response, postData);
});
The following code creates a variable, tail_child, that spawns the shell command 'tail' on my system log and then attempts to add its output to my postData variable:
var spawn = require('child_process').spawn;
var tail_child = spawn('tail', ['-f', '/var/log/system.log']);

tail_child.stdout.on('data', function (data) {
    postData += data;
    console.log("TAIL READING: " + data);
});

tail_child.stdout.on('end', function () {
    route(handle, pathname, response, postData);
});
Now my console is updated in real time with system.log data, but my browser times out with a "No data received" error.
I've tried tweaking the code above to figure out what is going wrong, and as near as I can tell Node is telling me that data is null, so it is adding nothing to postData. This doesn't make sense to me, since console.log("TAIL READING: " + data) shows the output of spawn('tail', ['-f', '/var/log/system.log']) in my terminal window. Clearly data is not null.
Edit:
Here's a pastebin link to my server.js code
tail -f never ends, so the 'end' callback is never triggered and you never respond to the user.
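One way around that, as a rough sketch (assuming the request/response objects are in scope as in the pastebin server): write each chunk to the response as it arrives instead of waiting for an 'end' that never comes, and kill the tail process when the client disconnects.

var spawn = require('child_process').spawn;
var http = require('http');

http.createServer(function (request, response) {
    var tail_child = spawn('tail', ['-f', '/var/log/system.log']);

    response.writeHead(200, { 'Content-Type': 'text/plain' });

    tail_child.stdout.on('data', function (data) {
        response.write(data); // push each new log line to the browser
    });

    // stop tailing when the client goes away
    request.connection.on('close', function () {
        tail_child.kill();
        response.end();
    });
}).listen(8000);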
