Unknown method process.openStdin() - node.js

I'm trying to pipe grep results into a Node.js script. I've found that I should receive the data from process.stdin.
I've also found several ways to work with stdin, but they differ and I can't find complete information about them. I know four ways (the first three start with var data = ""):
1) Most popular in search results
process.stdin.resume();
process.stdin.setEncoding( 'utf8' );
process.stdin.on('data', function(chunk) { data += chunk; });
process.stdin.on('end', function() { console.log('data: ' + data); });
2) Looks like the first one, but with the unknown function process.openStdin()
var stdin = process.openStdin();
stdin.on('data', function(chunk) { data += chunk; });
stdin.on('end', function() { console.log('data: ' + data); });
3) In the documentation I've read that calling stdin.resume() switches stdin to the 'old' streams mode. So if we haven't called resume(), we can use the 'readable' event
process.stdin.setEncoding('utf8');
process.stdin.on('readable', function() { var chunk; while ((chunk = process.stdin.read()) !== null) { data += chunk; } });
process.stdin.on('end', function() { console.log('data: ' + data); });
4) Use the readline module. It is very useful, since grep results span multiple lines and I don't need to split the received data myself. But for a long time I couldn't understand why all the input was piped straight back to stdout. Then I found that we can pass an empty object instead of process.stdout when creating the interface, and the data won't be piped to the output.
var readline = require('readline'),
//rl = readline.createInterface(process.stdin, process.stdout);
rl = readline.createInterface(process.stdin, {});
rl.on('line', function(data) { console.log('line: ' + data); });
5) My own variant. Use another module, 'split': it reads from a stream and divides the data into chunks by a specified separator (/\r?\n/ by default). I used it to work with a socket, and since stdin is also a readable stream, we can use it here too.
var split = require('split');
process.stdin.setEncoding('utf8');
process.stdin.pipe(split()).on('data', function(data) { console.log('line: ' + data); });
My question is: what is process.openStdin()?
I've searched every page on Google, but couldn't find any documentation on this function!
Also, while searching, I've noticed that the official Node.js documentation is lacking: it doesn't say since which version methods are available, it has no detailed description for many objects/methods, and no user comments. And this method (openStdin) exists and works, but is described nowhere! WTF???

While writing the question, I found the answer :)
It is in the source code of Node.js:
process.openStdin = function() {
  process.stdin.resume();
  return process.stdin;
};
But I wonder: why is it not described in the documentation? If it is a function for internal use only, why is it used by so many people who write about working with stdin?
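For reference, here is a minimal sketch of the documented equivalent: resuming process.stdin yourself instead of calling the undocumented helper (the count-lines.js file name is just an example):

// count-lines.js: the documented equivalent of the openStdin() pattern
process.stdin.resume();                 // this is all openStdin() does internally
process.stdin.setEncoding('utf8');

var data = '';
process.stdin.on('data', function(chunk) { data += chunk; });
process.stdin.on('end', function() {
  console.log('received ' + data.length + ' characters');
});

You would run it as, e.g., grep foo file.txt | node count-lines.js.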

Related

fluent-ffmpeg get codec data without specifying output

I am using the fluent-ffmpeg node module to get codec data from a file.
It works if I give it an output, but I was wondering whether there is any option to run fluent-ffmpeg without giving it an output.
This is what I am doing:
readStream.end(new Buffer(file.buffer));
var process = new ffmpeg(readStream);
process.on('start', function() {
  console.log('Spawned ffmpeg');
}).on('codecData', function(data) {
  // get recording duration
  const duration = data.duration;
  console.log(duration);
}).save('temp.flac');
As you can see, I am saving the file to temp.flac so I can get the duration in seconds of that file.
If you don't want to save the result of the ffmpeg process to a file, one thing that comes to mind is to redirect the command output to /dev/null.
In fact, as the owner of the fluent-ffmpeg repository said in a comment, there is no need to specify a real file name for the destination when using the null format.
So, for example, something like this will work:
let process = new ffmpeg(readStream);
process
  .addOption('-f', 'null') // set format to null
  .on('start', function() {
    console.log('Spawned ffmpeg');
  })
  .on('codecData', function(data) {
    // get recording duration
    let duration = data.duration;
    console.log(duration);
  })
  .output('nowhere') // or '/dev/null' or something else
  .run();
It remains a bit hacky, but we must set an output to avoid the "No output specified" error.
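For what it's worth, a sketch of the same idea using fluent-ffmpeg's format() helper (this assumes the readStream from the question; /dev/null only exists on POSIX systems):

const ffmpeg = require('fluent-ffmpeg');

ffmpeg(readStream)
  .format('null')                  // ask ffmpeg to discard the encoded output
  .on('codecData', function(data) {
    console.log('duration:', data.duration);
  })
  .save('/dev/null');              // a destination is still required, even with the null muxer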
When no stream argument is present, the pipe() method returns a PassThrough stream, which you can pipe to somewhere else (or just listen to events on).
var command = ffmpeg('/path/to/file.avi')
  .videoCodec('libx264')
  .audioCodec('libmp3lame')
  .size('320x240')
  .on('error', function(err) {
    console.log('An error occurred: ' + err.message);
  })
  .on('end', function() {
    console.log('Processing finished!');
  });
var ffstream = command.pipe();
ffstream.on('data', function(chunk) {
  console.log('ffmpeg just wrote ' + chunk.length + ' bytes');
});
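For example, to collect the piped output into memory (a sketch; buffering the whole result only makes sense for small outputs):

var chunks = [];
ffstream.on('data', function(chunk) { chunks.push(chunk); });
ffstream.on('end', function() {
  var result = Buffer.concat(chunks);   // one Buffer holding the full ffmpeg output
  console.log('total output: ' + result.length + ' bytes');
});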

Node.js stream data from a terminal command to the client

There have already been a few questions here about node.js executing commands and outputting the data, but I still can't get this working. Using node.js, I want to execute a Python script that runs for a long time and produces intermediate outputs, and I want to stream these outputs to the client as soon as they are produced. I have tried the following, but I only get the whole output once the command has finished. How can I make it pass the data to the socket in real time? Thanks.
function run_cmd(cmd, args) {
  var spawn = require('child_process').spawn,
      child = spawn(cmd, args);
  return child;
}
io.sockets.on('connection', function (socket) {
  var foo = new run_cmd('python', ['test.py']);
  foo.stdout.setEncoding('utf-8');
  foo.stdout.on('data', function(data) {
    console.log('sending data');
    io.sockets.emit('terminal', {output: data});
  });
});
All your node.js code is okay. Your code sends data only once because it receives data only once.
The point is that puts or print commands are not enough to trigger foo.stdout.on('data').
Try adding $stdout.flush at the point where you want to send a chunk in your Ruby code.
You need to explicitly tell it to flush the output.
Here is my test code.
js
var spawn = require('child_process').spawn;
var cmd = spawn('ruby', ['testRuby.rb']);
var counter = 0;
cmd.stdout.on('data', function(data) {
  counter++;
  console.log('stdout: ' + data);
});
cmd.stderr.on('data', function(data) {
  console.log('stderr: ' + data);
});
cmd.on('exit', function(code) {
  console.log('exit code: ' + code);
  console.log(counter);
});
testRuby.rb
def execute_each_sec(sleep_sec)
  yield
  sleep sleep_sec
end

5.times do
  execute_each_sec(1) do
    puts "CurrentTime:#{Time.now}"
    $stdout.flush
  end
end
As you can see, I'm calling $stdout.flush to output the data explicitly in testRuby.rb.
If I remove that, node.js won't get anything until the execution of testRuby.rb has finished.
Edited:
lol, my bad. I used Ruby instead of Python.
In the case of Python, use sys.stdout.flush() like svkk says.
Edit:
In Python you can also use the -u flag to force it to flush after each print.
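On the Node side, that might look like the following sketch (it assumes a test.py that prints something periodically):

var spawn = require('child_process').spawn;

// '-u' makes Python flush stdout after every print, so chunks arrive as they are produced
var py = spawn('python', ['-u', 'test.py']);
py.stdout.setEncoding('utf8');
py.stdout.on('data', function(chunk) {
  console.log('got chunk: ' + chunk);
});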

Parse output of spawned node.js child process line by line

I have a PhantomJS/CasperJS script which I'm running from within a node.js script using child_process.spawn(). Since CasperJS doesn't support require()ing modules, I'm trying to print commands from CasperJS to stdout and then read them in from my node.js script using spawn.stdout.on('data', function(data) {}); in order to do things like add objects to redis/mongoose (convoluted, yes, but it seems more straightforward than setting up a web service for this...). The CasperJS script executes a series of commands and creates, say, 20 screenshots which need to be added to my database.
However, I can't figure out how to break the data variable (a Buffer?) into lines... I've tried converting it to a string and then doing a replace, and I've tried doing spawn.stdout.setEncoding('utf8'); but nothing seems to work...
Here is what I have right now
var spawn = require('child_process').spawn;
var bin = "casperjs";
// googlelinks.js is the example given at http://casperjs.org/#quickstart
var args = ['scripts/googlelinks.js'];
var cspr = spawn(bin, args);
//cspr.stdout.setEncoding('utf8');
cspr.stdout.on('data', function (data) {
  var buff = new Buffer(data);
  console.log("foo: " + buff.toString('utf8'));
});
cspr.stderr.on('data', function (data) {
  data += '';
  console.log(data.replace("\n", "\nstderr: "));
});
cspr.on('exit', function (code) {
  console.log('child process exited with code ' + code);
  process.exit(code);
});
https://gist.github.com/2131204
Try this:
cspr.stdout.setEncoding('utf8');
cspr.stdout.on('data', function(data) {
  var str = data.toString(), lines = str.split(/(\r?\n)/g);
  for (var i = 0; i < lines.length; i++) {
    // Process the line, noting it might be incomplete.
  }
});
Note that the "data" event might not necessarily break evenly between lines of output, so a single line might span multiple data events.
I've actually written a Node library for exactly this purpose. It's called stream-splitter and you can find it on GitHub: samcday/stream-splitter.
The library provides a special Stream you can pipe your casper stdout into, along with a delimiter (in your case, \n), and it will emit neat token events, one for each line it has split out from the input Stream. The internal implementation for this is very simple, and delegates most of the magic to substack/node-buffers which means there's no unnecessary Buffer allocations/copies.
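A minimal usage sketch based on that description (the exact API may differ between versions; check the project's README):

var StreamSplitter = require('stream-splitter');

var splitter = cspr.stdout.pipe(StreamSplitter('\n'));
splitter.encoding = 'utf8';        // emit tokens as utf8 strings instead of Buffers
splitter.on('token', function(line) {
  console.log('line: ' + line);
});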
I found a nicer way to do this with just pure node, which seems to work well:
const childProcess = require('child_process');
const readline = require('readline');

const cspr = childProcess.spawn(bin, args);
const rl = readline.createInterface({ input: cspr.stdout });
rl.on('line', (line) => { /* handle line here */ });
Adding to maerics' answer, which does not deal properly with cases where only part of a line is fed in a data dump (theirs will give you the first part and the second part of the line individually, as two separate lines.)
var _breakOffFirstLine = /\r?\n/
function filterStdoutDataDumpsToTextLines(callback){ //returns a function that takes chunks of stdout data, aggregates them, and passes complete lines one by one through to callback, as soon as it gets them.
  var acc = ''
  return function(data){
    var splitted = data.toString().split(_breakOffFirstLine)
    splitted[0] = acc + splitted[0] //if there was a partial, unended line in the previous dump, it is completed by the first section of this one.
    acc = splitted.pop() //if there is a partial, unended line in this dump, store it to be completed by the next (we assume there will be a terminating newline at some point. This is, generally, a safe assumption.)
    for(var i=0; i<splitted.length; ++i){
      callback(splitted[i])
    }
  }
}
usage:
cspr.stdout.on('data', filterStdoutDataDumpsToTextLines(function(line){
  //each time this inner function is called, you will be getting a single, complete line of the stdout ^^
}))
You can give this a try. It will skip the newline separators and any empty lines.
cspr.stdout.on('data', (data) => {
  const items = data.toString().split(/(\r?\n)/g);
  items.forEach((item) => {
    if (item !== '\r\n' && item !== '\n' && item !== '') {
      console.log(item);
    }
  });
});
Old stuff but still useful...
I have made a custom stream Transform subclass for this purpose.
See https://stackoverflow.com/a/59400367/4861714
#nyctef's answer uses a built-in Node.js module.
Here is a link to the documentation: https://nodejs.org/api/readline.html
The node:readline module provides an interface for reading data from a Readable stream (such as process.stdin) one line at a time.
My personal use case is parsing the JSON output of the "docker watch" command run in a spawned child_process.
const dockerWatchProcess = spawn(...)
...
const rl = readline.createInterface({
  input: dockerWatchProcess.stdout,
  output: null,
});

rl.on('line', (log) => {
  console.log('dockerWatchProcess event::', log);
  // code to process a change to a docker event
  ...
});
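One small addition: if the child emits Windows-style CRLF line endings, readline's crlfDelay option controls whether \r\n is recognized as a single line break:

const rl = readline.createInterface({
  input: dockerWatchProcess.stdout,
  crlfDelay: Infinity, // always treat \r\n as one newline
});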

Parsing the STDERR output of node.js child_process line by line

I'm writing a simple online conversion tool using FFMPEG and Node.js. I'm trying to figure out how to parse each line of the conversion output received from FFMPEG and only display pertinent results client side in the browser. In my case I want the encoding time counter that FFMPEG spits out on the command line.
My function thus far is:
function metric(ffmpeg, res) {
  var temp = ''; // accumulates stderr across 'data' events (this is why each pass repeats earlier lines)
  ffmpeg.stdout.on('data', function(data) {
    res.writeHead(200, {'content-type': 'text/html'});
    res.write('received upload:\n\n');
    console.log(data);
  });
  ffmpeg.stderr.on('data', function (data) {
    temp += data.toString();
    var lines = temp.split('\n');
    // for debugging purposes
    for (var i = 0; i < lines.length; i++) {
      console.log('this is line: ' + i + '----' + lines[i]);
    }
    res.write(lines.join('\n'));
  });
  ffmpeg.on('exit', function (code) {
    console.log('child process exited with code ' + code);
    res.end();
  });
}
What this ends up returning is multiple arrays, each of which includes the data from the previous array as well as the next data chunk. For example, the function returns array 1:{0=>A, 1=>B}, array 2:{0=>A, 1=>B, 2=>C}, array 3:{0=>A, 1=>B, 2=>C, 3=>D}, and so on.
I'm quite new to Node so I'm probably missing something simple. Any guidance would be much appreciated!
This should do the job:
var buff = new Buffer(data);
console.log(buff.toString('utf8'));
(In modern Node.js, new Buffer() is deprecated; use Buffer.from(data) instead.)
For more information on buffers, here is a link to the doc: http://nodejs.org/docs/v0.4.2/api/buffers.html
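As for the accumulation problem itself, the usual pattern is to keep only the unterminated tail between 'data' events instead of re-splitting everything received so far. A sketch (note that ffmpeg's progress counter is separated by \r rather than \n, hence the regex):

var remainder = '';
ffmpeg.stderr.on('data', function(data) {
  var lines = (remainder + data.toString()).split(/\r?\n|\r/);
  remainder = lines.pop();   // keep the partial last line for the next chunk
  lines.forEach(function(line) {
    console.log('ffmpeg: ' + line);
  });
});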

Having trouble understanding node.js listeners

I'm working on two node.js tutorials at the moment and while I understand what is going on within each tutorial, I clearly don't understand what's going on that well.
The following code listens for "data" events and then adds new chunks of data to a variable named postData. Another listener sends this data along with other stuff to my route.js file.
request.addListener("data", function (postDataChunk) {
postData += postDataChunk;
console.log("Received POST data chunk '" + postDataChunk + "'.");
});
request.addListener("end", function () {
route(handle, pathname, response, postData);
});
The following code creates a variable, tail_child, that spawns the shell command 'tail' on my system log and then attempts to add this data to my postData variable:
var spawn = require('child_process').spawn;
var tail_child = spawn('tail', ['-f', '/var/log/system.log']);

tail_child.stdout.on('data', function (data) {
  postData += data;
  console.log("TAIL READING: " + data);
});

tail_child.stdout.on('end', function () {
  route(handle, pathname, response, postData);
});
Now my console is updated in real time with system.log data, but my browser times out with a "No data received" error.
I've tried tweaking the code above to figure out what is going wrong, and as near as I can tell node is telling me that var data is null, so it is adding nothing to var postData. This doesn't make sense to me, since console.log("TAIL READING: " + data) gives me the results of spawn('tail', ['-f', '/var/log/system.log']) in my terminal window. Clearly var data is not null.
Edit:
Here's a pastebin link to my server.js code
tail -f never exits, so the 'end' event is never emitted and you never respond to the user.
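A sketch of what that means in practice: write each chunk to the response as it arrives, and don't wait for 'end' (response is assumed to be the http response object from the question):

var spawn = require('child_process').spawn;
var tail_child = spawn('tail', ['-f', '/var/log/system.log']);

response.writeHead(200, {'Content-Type': 'text/plain'});
tail_child.stdout.on('data', function (data) {
  response.write(data);   // stream each chunk to the browser immediately
});
// 'end' never fires while tail -f keeps running, so don't rely on it to call response.end()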
