Catching console.log in node.js?

Is there a way to catch console output produced by console.log(...) within node.js, to prevent it from cluttering the terminal while unit testing a module?
Thanks

A better way could be to hook the output stream you need to catch data from directly, because with Linus' method, if some module writes directly to stdout with process.stdout.write('foo'), for example, it won't be caught.
var logs = [],
    hook_stream = function(_stream, fn) {
        // Reference the default write method
        var old_write = _stream.write;
        // _stream now writes with our shiny function
        _stream.write = fn;
        return function() {
            // reset to the default write method
            _stream.write = old_write;
        };
    },
    // hook up standard output
    unhook_stdout = hook_stream(process.stdout, function(string, encoding, fd) {
        logs.push(string);
    });

// goes to our custom write method
console.log('foo');
console.log('bar');

unhook_stdout();

console.log('Not hooked anymore.');

// Now do what you want with logs stored by the hook
logs.forEach(function(_log) {
    console.log('logged: ' + _log);
});
EDIT
console.log() ends its output with a newline, which you may want to strip, so you'd better write:
_stream.write = function(string, encoding, fd) {
    var new_str = string.replace(/\n$/, '');
    fn(new_str, encoding, fd);
};
EDIT
Improved, generic way to do this on any method of any object, with async support. See the gist.
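For reference, a minimal sketch of that generalized idea (my own illustration, not the gist itself): hook any method of any object, and get back a function that restores the original.

function hookMethod(obj, methodName, hook) {
    var original = obj[methodName];
    obj[methodName] = function() {
        return hook.apply(this, arguments);
    };
    // Return an "unhook" function that restores the original method
    return function() {
        obj[methodName] = original;
    };
}

// Example: silence process.stdout.write and collect the output instead
var captured = [];
var unhook = hookMethod(process.stdout, 'write', function(string) {
    captured.push(string);
});
console.log('hidden');
unhook();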

module.js:
module.exports = function() {
    console.log("foo");
};
program:
console.log = function() {};
mod = require("./module");
mod();
// Look ma no output!
Edit: Obviously you can collect the log messages for later if you wish:
var log = [];
console.log = function() {
    log.push([].slice.call(arguments));
};
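In a unit test you would typically capture during setup and restore the original in teardown; a minimal, framework-agnostic sketch (runTestedCode is a hypothetical stand-in for the code under test):

var originalLog = console.log;
var log = [];

// setup: swallow and collect
console.log = function() {
    log.push([].slice.call(arguments));
};

runTestedCode(); // hypothetical: whatever you are testing

// teardown: restore and inspect
console.log = originalLog;
console.log('captured %d message(s)', log.length);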

capture-console solves this problem nicely.
var capcon = require('capture-console');
var stderr = capcon.captureStderr(function scope() {
    // whatever is done in here has stderr captured,
    // the return value is a string containing stderr
});
var stdout = capcon.captureStdout(function scope() {
    // whatever is done in here has stdout captured,
    // the return value is a string containing stdout
});
And later, from the module's documentation:
Intercepting
You should be aware that all capture functions will still pass the values through to the main stdio write() functions, so logging will still go to your standard IO devices.
If this is not desirable, you can use the intercept functions. These functions are literally s/capture/intercept when compared to those shown above, and the only difference is that calls aren't forwarded through to the base implementation.
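Going by that description, the intercepting variant would look like this (assuming the same signature as the capture functions above):

var stdout = capcon.interceptStdout(function scope() {
    console.log('this ends up in the returned string only, not on the terminal');
});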

Simply adding the following snippet to your code will let you catch the logs and still print them to the console:
var log = [];
console.log = function(d) {
    log.push(d);
    process.stdout.write(d + '\n');
};
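Note that this snippet only captures the first argument passed to console.log. A variant that handles multiple arguments and format strings, using Node's util.format (the same formatting console.log applies internally):

var util = require('util');
var log = [];
console.log = function() {
    // Format all arguments the way console.log would
    var line = util.format.apply(util, arguments);
    log.push(line);
    process.stdout.write(line + '\n');
};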

Related

Replay a log file with NodeJS as if it were happening in real-time

I have a log file with about 14,000 aircraft position datapoints captured from a system called Flarm. It looks like this:
{"addr":"A","time":1531919658.578100,"dist":902.98,"alt":385,"vs":-8}
{"addr":"A","time":1531919658.987861,"dist":914.47,"alt":384,"vs":-7}
{"addr":"A","time":1531919660.217471,"dist":925.26,"alt":383,"vs":-7}
{"addr":"A","time":1531919660.623466,"dist":925.26,"alt":383,"vs":-7}
What I need to do is find a way to 'play' this file back in real time (as if it were occurring right now, even though it's pre-recorded), and emit an event whenever a log entry 'occurs'. The file is not being added to; it's pre-recorded, and the playback would happen at a later stage.
The reason for doing this is that I don't have access to the receiving equipment when I'm developing.
The only way I can think to do it is to set a timeout for every log entry, but that doesn't seem like the right way to do it. Also, this process would have to scale to longer recordings (this one was only an hour long).
Are there other ways of doing this?
If you want to "play them back" with the actual time difference, a setTimeout is pretty much what you have to do.
const processEntry = (entry, index) => {
    index++;
    const nextEntry = getEntry(index);
    if (nextEntry == null) return;
    // The log times are in seconds; setTimeout expects milliseconds
    const timeDiff = (nextEntry.time - entry.time) * 1000;
    emitEntryEvent(entry);
    setTimeout(processEntry, timeDiff, nextEntry, index);
};
processEntry(getEntry(0), 0);
This emits the current entry and then sets a timeout based on the difference until the next entry.
getEntry could either fetch lines from a prefilled array or fetch lines individually based on the index. In the latter case, only two lines of data would be in memory at the same time.
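For example, getEntry could be backed by a prefilled array; a minimal sketch, assuming the whole recording fits in memory (the file name is a placeholder):

const fs = require('fs');

const entries = fs.readFileSync('./data/flarm.log', 'utf8')
    .split('\n')
    .filter(line => line.trim() !== '')
    .map(line => JSON.parse(line));

const getEntry = (index) => entries[index];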
Got it working in the end! setTimeout turned out to be the answer, and combined with the input of Lucas S. this is what I ended up with:
const EventEmitter = require('events');
const fs = require('fs');

const readable = fs.createReadStream("./data/2018-07-18_1509log.json", {
    encoding: 'utf8',
    fd: null
});

function read_next_line() {
    var chunk;
    var line = '';
    // While this is a thing we can do, assign chunk
    while ((chunk = readable.read(1)) !== null) {
        // If chunk is a newline character, return the line
        if (chunk === '\n') {
            return JSON.parse(line);
        } else {
            line += chunk;
        }
    }
    return false;
}

var lines = [];
var nextline;

const processEntry = () => {
    // If lines is empty, read a line
    if (lines.length === 0) lines.push(read_next_line());
    // Quit here if we've reached the last line
    if ((nextline = read_next_line()) == false) return true;
    // Else push the just-read line into our array
    lines.push(nextline);
    // Get the time difference in milliseconds
    var delay = Number(lines[1].time - lines[0].time) * 1000;
    // Remove the first line
    lines.shift();
    module.exports.emit('data', lines[0]);
    // Repeat after the calculated delay
    setTimeout(processEntry, delay);
}

var ready_to_start = false;
// When the stream becomes readable, allow starting
readable.on('readable', function() {
    ready_to_start = true;
});

module.exports = new EventEmitter;
module.exports.start = function() {
    if (ready_to_start) processEntry();
    if (!ready_to_start) return false;
}
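Usage of this module would then look something like the following (assuming the code above is saved as player.js; the delay before start() is just a crude way to wait for the 'readable' event):

const player = require('./player');

player.on('data', (entry) => {
    console.log('position update:', entry);
});

// Give the stream a moment to become readable before starting playback
setTimeout(() => player.start(), 100);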
Assuming you want to visualize the flight logs, you can use fs.watch as below to watch the log file for changes:
fs.watch('somefile', function (event, filename) {
    console.log('event is: ' + event);
    if (filename) {
        console.log('filename provided: ' + filename);
    } else {
        console.log('filename not provided');
    }
});
The code excerpt is from here. For more information on fs.watch(), check out here.
Then, for seamless updates on the frontend, you can set up a WebSocket to your server, where you watch the log file and send each newly added row over that socket to the frontend.
After you get the data on the frontend you can visualize it there. While I haven't done any flight-visualization project before, I've used D3.js to visualize other things (sound, numerical data, metric analysis, etc.) a couple of times, and it did the job every time.
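A minimal sketch of that server side, using the ws package (the file name and port are placeholders):

const fs = require('fs');
const WebSocket = require('ws');

const wss = new WebSocket.Server({ port: 8080 });

wss.on('connection', (socket) => {
    // Notify this client whenever the log file changes
    const watcher = fs.watch('somefile', (event) => {
        socket.send(JSON.stringify({ event: event }));
    });
    socket.on('close', () => watcher.close());
});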

Redirecting stdout to file nodejs

I've created:
var access = fs.createWriteStream('/var/log/node/api.access.log', { flags: 'w' });
Then piped:
process.stdout.pipe(access);
Then tried:
console.log("test");
And nothing appeared in /var/log/node/api.access.log. However, this way works:
process.stdout.pipe(access).write('test');
Could someone explain what I am doing wrong?
I solved this problem the following way:
var access = fs.createWriteStream('/var/log/node/api.access.log');
process.stdout.write = process.stderr.write = access.write.bind(access);
Of course you can also separate stdout and stderr if you want.
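That separation would look like this (the error-log path is illustrative):

var out = fs.createWriteStream('/var/log/node/api.access.log');
var err = fs.createWriteStream('/var/log/node/api.error.log');
process.stdout.write = out.write.bind(out);
process.stderr.write = err.write.bind(err);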
I would also strongly recommend handling uncaught exceptions:
process.on('uncaughtException', function(err) {
    console.error((err && err.stack) ? err.stack : err);
});
This will cover the following situations:
process.stdout.write
process.stderr.write
console.log
console.dir
console.error
someStream.pipe(process.stdout);
throw new Error('Crash');
throw 'never do this';
throw undefined;
Check out console.Console, the parent class of the normal console.
var myLogFileStream = fs.createWriteStream(pathToMyLogFile);
var myConsole = new console.Console(myLogFileStream, myLogFileStream);
You can then use myConsole.log, myConsole.error, myConsole.dir, etc. and write directly to your file.
You can also monkey patch process.stdout.write as follows:
var fn = process.stdout.write;

function write() {
    fn.apply(process.stdout, arguments);
    myLogFileStream.write.apply(myLogFileStream, arguments);
}

process.stdout.write = write;
There are also other options for overwriting console._stdout, depending on the motivation for logging stdout to a file.
process.stdout is a Writable. pipe() is a method of Readable (cf. the Stream API documentation: https://nodejs.org/api/stream.html).
You can see the documentation of process.stdout here: https://nodejs.org/api/process.html#process_process_stdout
It's surprising that you can call process.stdout.pipe(...) without any error, but I suppose this call just does nothing, except returning a new Writable stream bound to stdout (or maybe it returns process.stdout itself; there's no specification for that in the documentation).
If you want to redirect stdout to a file, you have many solutions :
Just use your command line to do that. Windows style: node myfile.js > api.access.log.
Replace the console object with your own object, rewriting the console methods (a sketch follows this list).
I'm not sure, but it may be possible to replace process.stdout with your own stream (and then you can do whatever you want with it).
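A sketch of the second option, replacing the global console with a console.Console instance bound to file streams (the paths are illustrative, and reassigning global.console is assumed to be permitted, which it is in practice):

var fs = require('fs');
var out = fs.createWriteStream('./stdout.log', { flags: 'a' });
var err = fs.createWriteStream('./stderr.log', { flags: 'a' });

// console.Console is the class the global console is built from
global.console = new console.Console(out, err);
console.log('written to stdout.log instead of the terminal');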
@user3173842, regarding the reply:
I solved this problem the following way:
var access = fs.createWriteStream('/var/log/node/api.access.log');
process.stdout.write = process.stderr.write = access.write.bind(access);
you do understand that process.stdout continues to be used after process.on('exit') fires, and the fs.WriteStream therefore closes together with process.stdout, according to
https://github.com/nodejs/node/issues/7606
So now the question remains: if the developer wanted fs.WriteStream.write() to return to its normal functionality, and the write stream to close when fs.WriteStream.end() is called, how would they go about it? Here is what I did:
// a_l = asyncify_listener
// p_std_stream_m is a process stream manager object
p_std_stream_m.std_info.p_stdout_write = process.stdout.write;
process.stdout.write = w_stream.write.bind(w_stream);
process.once('beforeExit', a_l(p_std_stream_m.handler, process.stdout, w_stream));

where in the 'beforeExit' event listener I did:

process.stdout.write = p_std_stream_m.std_info.p_stdout_write;
w_stream.end();
It works, and you use the once method because process.stdout seems to do a lot of work at this time.
Is this good practice? Would you do this, or what would you do in this situation? Anyone should feel free to reply.
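A condensed sketch of that restore pattern, with the abbreviations expanded (the names and log path are illustrative):

var fs = require('fs');
var wStream = fs.createWriteStream('./api.access.log');
var originalWrite = process.stdout.write;

// Redirect stdout into the file stream
process.stdout.write = wStream.write.bind(wStream);

// Shortly before exit, restore stdout and close the file cleanly
process.once('beforeExit', function() {
    process.stdout.write = originalWrite;
    wStream.end();
});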
Originally based on @Anatol-user3173842's answer, but in my case I needed to hook stdout & stderr and also write into a file.
So for those who need to keep the normal stdout behaviour in addition to writing into the file, use the following.
For non-errors:
// stdout logging hook
const stdoutWrite0 = process.stdout.write;
process.stdout.write = (args) => { // On stdout write
    CustomLogger.writeToLogFile('log', args); // Write to local log file
    args = Array.isArray(args) ? args : [args]; // Pass only as array to prevent internal TypeError for arguments
    return stdoutWrite0.apply(process.stdout, args);
};
For errors:
// stderr logging hook
const stderrWrite0 = process.stderr.write;
process.stderr.write = (args) => { // On stderr write
    CustomLogger.writeToLogFile('error', args); // Write to local error file
    args = Array.isArray(args) ? args : [args]; // Pass only as array to prevent internal TypeError for arguments
    return stderrWrite0.apply(process.stderr, args);
};

// uncaught exceptions
process.on('uncaughtException', (err) => {
    CustomLogger.writeToLogFile('error', ((err && err.stack) ? err.stack : err));
});
Here is the CustomLogger code, where I also separate the log files by date:
export class CustomLogger {
    static LOGS_DIR = 'location-of-my-log-files';

    private static logDailyName(prefix: string): string {
        const date = new Date().toLocaleDateString().replace(/\//g, '_');
        return `${CustomLogger.LOGS_DIR}/${prefix}_${date}.log`;
    }

    // Called from the hooks above, so it cannot be private
    static writeToLogFile(prefix, originalMsg) {
        const timestamp = Date.now();
        const fileName = this.logDailyName(prefix);
        const logMsg = prepareForLogFile(originalMsg); // formatting helper, definition omitted
        fs.appendFileSync(fileName, `${timestamp}\t${logMsg}\n\n`);
        return originalMsg;
    }
}
Here's a quick example of a logger class that redirects stdout, stderr and exceptions to a file, while still writing everything to the console:
class Logger {
    #log_stream
    #stdout_write
    #stderr_write

    constructor(path) {
        this.#log_stream = fs.createWriteStream(path, { flags: 'a' })
        this.#stdout_write = process.stdout.write.bind(process.stdout)
        this.#stderr_write = process.stderr.write.bind(process.stderr)
        process.stdout.write = this.stdout_write.bind(this)
        process.stderr.write = this.stderr_write.bind(this)
        process.on('uncaughtException', function(err) {
            console.error((err && err.stack) ? err.stack : err)
        })
    }

    stdout_write(buffer) {
        this.#log_stream.write(buffer)
        this.#stdout_write(buffer)
    }

    stderr_write(buffer) {
        this.#log_stream.write(buffer)
        this.#stderr_write(buffer)
    }
}

const logger = new Logger('example.log')

Passing a return from one function to another function that already has set parameters?

Edit: I know JS is asynchronous and I have looked over the How to Return thread. The issue I'm having is that, going from "foo" examples to something specific, I'm not quite sure how to re-format this.
Also here is some context: https://github.com/sharkwheels/beanballs/blob/master/bean-to-osc-two.js
I have a question about returns in node. It might be a dumb question, but here goes. I have a function that connects to a socket, and gets OSC messages from processing:
var sock = dgram.createSocket("udp4", function(msg, rinfo) {
    try {
        // get at all that info being sent out from Processing.
        //console.log(osc.fromBuffer(msg));
        var getMsg = osc.fromBuffer(msg);
        var isMsg = getMsg.args[0].value;
        var isName = getMsg.args[1].value;
        var isAdd = getMsg.address;
        var isType = getMsg.oscType;
        // make an array out of it
        var isAll = [];
        isAll.push(isName);
        isAll.push(isMsg);
        isAll.push(isAdd);
        isAll.push(isType);
        // return the array
        console.log(isAll);
        return isAll;
    } catch (error) {
        console.log(error);
    }
});
Below I have the start of another function, to write some of that array to a BLE device. It needs name and characteristics from a different function. How do I get the below function to use isAll AND two existing parameters?
var writeToChars = function (name, characteristics) { // this is passing values from the BLE setup function
    // i need to get isAll to here.
    // eventually this will write some values from isAll into a scratch bank.
}
Thanks.
The async call in this case can be written something like this. State can be maintained in variables in the closure if required; in this particular case you can do without any state (isAll) as well.
var isAll;
var soc = dgram.createSocket('udp4', oncreatesocket);

function oncreatesocket(msg, rinfo) {
    isAll = parseMessage(msg);
    writeData(isAll);
}

function parseMessage(msg) {
    ...
    // code to parse msg and return isAll
}

function writeData() {}
If writeData is a small enough function, it can live inside oncreatesocket without impacting the readability of the code.
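To connect this back to the original question (getting isAll into a callback that already has fixed parameters), a closure does the job; a minimal sketch, with illustrative values:

// writeToChars keeps its two fixed parameters; isAll arrives via the closure
var makeWriter = function(name, characteristics) {
    return function(isAll) {
        // name, characteristics and isAll are all usable here
        console.log(name, characteristics, isAll);
    };
};

var writeToChars = makeWriter('myBean', ['a495ff20c5b14b44b5121370f02d74de']);
// later, inside the socket callback, once isAll has been built:
// writeToChars(isAll);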
Alright. So I figured out what to do, at least in this scenario. I'm sure there is a better way to do this, but for now, this works.
I'm mapping an existing global array of peripherals into the write function, while passing the OSC message to it as a parameter. This solved my issue of "how do I get two pieces of information to the same place". It figures out which peripheral is which and writes a different value to each scratch bank of each peripheral accordingly. Leaving here for future reference.
var writeToBean = function(passThrough) {
    console.log("in Write to bean: ", passThrough);
    _.map(beanArray, function(n) {
        if (n.advertisement.localName === passThrough.name) {
            //var name = n.advertisement.localName;
            n.discoverSomeServicesAndCharacteristics(['a495ff20c5b14b44b5121370f02d74de'], [scratchThr], function(error, services, characteristics) {
                var service = services[0];
                var characteristic = characteristics[0];
                var toSend = passThrough.msg;
                console.log("service", service);
                console.log("characteristic", characteristic);
                if (toSend != null) {
                    characteristic.write(new Buffer([toSend]), false, function(error) {
                        if (error) { console.log(error); }
                        console.log("wrote " + toSend + " to scratch bank 3");
                    });
                }
                // not sure how to make the program resume, it stops here. No error, just stops processing.
            });
        }
    });
}

Retrieve stdout to variable

I'm trying to run a child process with the following code:
run = function (cmd, callback) {
    var spawn = require('child_process').spawn;
    var command = spawn(cmd);
    var result = '';
    command.stdout.on('data', function (data) {
        result += data.toString();
    });
    command.on('exit', function () {
        callback(result);
    });
}

execQuery = function (cmd) {
    var result = {
        errnum: 0,
        error: 'No errors.',
        body: ''
    };
    run(cmd, function (message) {
        result.body = message;
        console.log(message);
    });
    return result;
}
After executing execQuery('ls'), result.body is always empty, but console.log prints the value.
I ran a quick test, and the command's exit event fires before all of stdout's data is drained. I at least got the output captured and printed when I changed your exit handler to look for command.stdout's end event.
command.stdout.on('end', function () {
    callback(result);
});
That should help a bit. Note there are existing libraries you might want to use for this, and a truly correct implementation would be significantly more involved than what you have, but my change should address your current roadblock.
Random tip: it is the Node convention to always reserve the first argument of callback functions for an error, and your snippet is inconsistent with that convention; you should probably adjust to match it.
Oh sorry, let me address your question about result.body. The run function is ASYNCHRONOUS! That means that your return result; line executes BEFORE the run callback body where result.body = message; runs. You can't use return values like that anywhere in Node when I/O is involved; you have to use a callback.
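Concretely, execQuery itself needs to take a callback, following the error-first convention mentioned above; a minimal sketch:

execQuery = function (cmd, callback) {
    run(cmd, function (message) {
        callback(null, { // error-first convention: no error here
            errnum: 0,
            error: 'No errors.',
            body: message
        });
    });
};

execQuery('ls', function (err, result) {
    if (err) return console.error(err);
    console.log(result.body); // populated, because we waited for the callback
});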

Parse output of spawned node.js child process line by line

I have a PhantomJS/CasperJS script which I'm running from within a node.js script using child_process.spawn(). Since CasperJS doesn't support require()ing modules, I'm trying to print commands from CasperJS to stdout and then read them in from my node.js script using spawn.stdout.on('data', function(data) {}); in order to do things like add objects to redis/mongoose (convoluted, yes, but it seems more straightforward than setting up a web service for this...). The CasperJS script executes a series of commands and creates, say, 20 screenshots which need to be added to my database.
However, I can't figure out how to break the data variable (a Buffer?) into lines. I've tried converting it to a string and then doing a replace, and I've tried spawn.stdout.setEncoding('utf8'), but nothing seems to work.
Here is what I have right now
var spawn = require('child_process').spawn;
var bin = "casperjs";
//googlelinks.js is the example given at http://casperjs.org/#quickstart
var args = ['scripts/googlelinks.js'];
var cspr = spawn(bin, args);

//cspr.stdout.setEncoding('utf8');
cspr.stdout.on('data', function (data) {
    var buff = new Buffer(data);
    console.log("foo: " + buff.toString('utf8'));
});

cspr.stderr.on('data', function (data) {
    data += '';
    console.log(data.replace("\n", "\nstderr: "));
});

cspr.on('exit', function (code) {
    console.log('child process exited with code ' + code);
    process.exit(code);
});
https://gist.github.com/2131204
Try this:
cspr.stdout.setEncoding('utf8');
cspr.stdout.on('data', function(data) {
    var str = data.toString(),
        lines = str.split(/(\r?\n)/g);
    for (var i = 0; i < lines.length; i++) {
        // Process the line, noting it might be incomplete.
    }
});
Note that the "data" event might not necessarily break evenly between lines of output, so a single line might span multiple data events.
I've actually written a Node library for exactly this purpose, it's called stream-splitter and you can find it on Github: samcday/stream-splitter.
The library provides a special Stream you can pipe your casper stdout into, along with a delimiter (in your case, \n), and it will emit neat token events, one for each line it has split out from the input Stream. The internal implementation for this is very simple, and delegates most of the magic to substack/node-buffers which means there's no unnecessary Buffer allocations/copies.
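Going by that README, usage would look roughly like this (the exact API is an assumption taken from the library's documentation):

var StreamSplitter = require('stream-splitter');

var splitter = cspr.stdout.pipe(StreamSplitter('\n'));
splitter.encoding = 'utf8';
splitter.on('token', function(line) {
    // one complete line per token event
    console.log('line: ' + line);
});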
I found a nicer way to do this with just pure node, which seems to work well:
const childProcess = require('child_process');
const readline = require('readline');
const cspr = childProcess.spawn(bin, args);
const rl = readline.createInterface({ input: cspr.stdout });
rl.on('line', line => { /* handle line here */ });
Adding to maerics' answer, which does not deal properly with cases where only part of a line is fed in a data dump (theirs will give you the first part and the second part of the line individually, as two separate lines):

var _breakOffFirstLine = /\r?\n/
function filterStdoutDataDumpsToTextLines(callback){ //returns a function that takes chunks of stdin data, aggregates it, and passes lines one by one through to callback, all as soon as it gets them.
    var acc = ''
    return function(data){
        var splitted = data.toString().split(_breakOffFirstLine)
        splitted[0] = acc + splitted[0] //if there was a partial, unended line in the previous dump, it is completed by the first section.
        acc = splitted.pop() //if there is a partial, unended line in this dump, store it to be completed by the next (we assume there will be a terminating newline at some point. This is, generally, a safe assumption.)
        for(var i=0; i<splitted.length; ++i){
            callback(splitted[i])
        }
    }
}
usage:

cspr.stdout.on('data', filterStdoutDataDumpsToTextLines(function(line){
    //each time this inner function is called, you will be getting a single, complete line of the stdout ^^
}))
You can give this a try. It will ignore any empty lines or empty new line breaks.
cspr.stdout.on('data', (data) => {
    data = data.toString().split(/(\r?\n)/g);
    data.forEach((item, index) => {
        if (data[index] !== '\n' && data[index] !== '') {
            console.log(data[index]);
        }
    });
});
Old stuff but still useful...
I have made a custom stream Transform subclass for this purpose.
See https://stackoverflow.com/a/59400367/4861714
@nyctef's answer uses a built-in Node.js module.
Here is a link to the documentation: https://nodejs.org/api/readline.html
The node:readline module provides an interface for reading data from a Readable stream (such as process.stdin) one line at a time.
My personal use case is parsing JSON output from the "docker watch" command created in a spawned child_process.
const dockerWatchProcess = spawn(...)
...
const rl = readline.createInterface({
    input: dockerWatchProcess.stdout,
    output: null,
});

rl.on('line', (log: string) => {
    console.log('dockerWatchProcess event::', log);
    // code to process a change to a docker event
    ...
});
