nodejs - pipe appjs console to a file

I'm trying to pipe the appjs console to a file with this code:
var fs = require('fs');
var logStream = fs.createWriteStream(__dirname+ '/log.txt', { flags: 'a' });
process.stdout.pipe(logStream);
process.stderr.pipe(logStream);
console.log("test");
It creates an empty file, but nothing more... With node.exe, the "test" goes to the console, not into the log file. The platform is win32, but I don't think that matters.
What's the problem with the code?
Conclusion:
Stdout, stderr and a file write stream are all sink-type endpoints, so I cannot bind them together. I need to replace stdout and stderr with duplex mock streams so that I can bind these mock streams both to the original sinks and to the log sink. I am not sure whether console.log and console.error will be affected by replacing the streams with the mechanism supernova suggested, so I'd rather use a dedicated logger that wraps the console instead of this workaround.
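For reference, a rough sketch of what those duplex mock streams could look like, using a PassThrough as the mock. Whether console.log follows the replaced getter depends on the Node version, as noted above, so treat this as an illustration rather than a drop-in fix:
var fs = require('fs');
var PassThrough = require('stream').PassThrough;

var logStream = fs.createWriteStream(__dirname + '/log.txt', { flags: 'a' });
var realStdout = process.stdout;

// The mock is both readable and writable, so it can sit between
// the writers and the two sink-type endpoints.
var mockStdout = new PassThrough();
mockStdout.pipe(realStdout); // original sink
mockStdout.pipe(logStream);  // log sink

// process.stdout is exposed through a getter, so redefine it
// rather than assigning to it.
Object.defineProperty(process, 'stdout', {
  get: function () { return mockStdout; }
});

process.stdout.write('test\n'); // reaches both the terminal and log.txt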

You have to define getters for process.stdin, process.stdout and process.stderr:
var fs = require("fs")
, errlog = fs.createWriteStream("./err.log", { flags: 'a' })
process.__defineGetter__("stderr", function(){
return errlog
})
process.stderr.write("test")
this should work

Related

nodejs - log to console stdio and file using only core modules

My application is simple and I want to avoid using a logging library like Winston. I need to log the output both to the console and to a file. I found a few tutorials on how to do this using a child process, such as this one, but I can't find anything that leverages the main process's stdio, like process.stdout and process.stdin.
The key to solving this was recognizing that process.stdout is a writable stream, whereas a child process's stdout (via the child_process module) is a readable stream (thanks to this article). Therefore I needed to create both a readable and a writable file stream, and pipe the readable stream out to process.stdout. You could probably simplify this even further with a duplex stream, but for noobs like myself this is a straightforward and easy-to-read approach.
const { Console } = require("console")
  , process = require("process")
  , path = require("path")
  , fs = require('fs');

// Define the file paths to log to
const outputFilePath = path.join(__dirname, './stdout.log');
const errorFilePath = path.join(__dirname, './stderr.log');

// Create the empty files synchronously to guarantee they exist prior to stream creation.
// Change the flag to 'w' to overwrite rather than append.
fs.closeSync(fs.openSync(outputFilePath, 'a+'));
fs.closeSync(fs.openSync(errorFilePath, 'a+'));

// Create a writable file stream for both stdout and stderr
const fileWriterOut = fs.createWriteStream(outputFilePath);
const fileWriterErr = fs.createWriteStream(errorFilePath);

// Create a new Console object using the file writers
const Logger = new Console({ stdout: fileWriterOut, stderr: fileWriterErr });

// Create readable file streams for process stdio to consume
const fileReaderOut = fs.createReadStream(outputFilePath);
const fileReaderErr = fs.createReadStream(errorFilePath);

// Pipe the file readers into process stdio
fileReaderOut.pipe(process.stdout);
fileReaderErr.pipe(process.stderr);

// Test the new logger
Logger.log("Logger initialized");

// Export
module.exports = Logger;
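As hinted above, a duplex stream does make this shorter. A minimal sketch, assuming a PassThrough is acceptable: a single stream feeds both the terminal and the log file, so no read-back from disk is needed.
const { Console } = require("console");
const { PassThrough } = require("stream");
const path = require("path");
const fs = require("fs");

// One stream, two destinations: the real stdout and the log file.
const tee = new PassThrough();
tee.pipe(process.stdout);
tee.pipe(fs.createWriteStream(path.join(__dirname, "stdout.log"), { flags: "a" }));

const Logger = new Console({ stdout: tee, stderr: tee });
Logger.log("Logger initialized");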

Redirect Readable object stdout process to file in node

I use an NPM library to parse markdown to HTML like this:
var Markdown = require('markdown-to-html').Markdown;
var md = new Markdown();
...
md.render('./test', opts, function(err) {
  md.pipe(process.stdout);
});
This outputs the result to my terminal as intended.
However, I need the result inside my node program at runtime. I thought about writing the output stream to a file and then reading it back in later, but I can't figure out how to write the output to a file instead.
I tried playing around with var file = fs.createWriteStream('./test.html'); but Node.js streams give me more headaches than results.
I've also looked into the library's repo and Markdown inherits from Readable via util like this:
var util = require('util');
var Readable = require('stream').Readable;
util.inherits(Markdown, Readable);
Any resources or advice would be highly appreciated. (I would also take another library for parsing the markdown, but this gave me the best results so far)
Actually, creating a writable file stream and piping the markdown to it should work just fine. Try it with:
const fs = require('fs');
const writeStream = fs.createWriteStream('./output.html');

// In case of errors you should handle them
writeStream.on('error', function (err) {
  console.log(err);
});

md.render('./test', opts, function (err) {
  md.pipe(writeStream);
});
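If you need the rendered HTML in memory rather than on disk, you can also consume the Readable directly. A sketch, assuming md emits the usual 'data' and 'end' events (it inherits from Readable, per the question):
let html = '';
md.render('./test', opts, function (err) {
  if (err) return console.error(err);
  md.on('data', function (chunk) {
    html += chunk.toString(); // accumulate the rendered output
  });
  md.on('end', function () {
    console.log(html); // the full document is now in memory
  });
});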

How to capture JSON from stdin when parent doesn’t close pipe?

I developed a package called transpile-md-to-json that transpiles multiple markdown files to a single JSON file.
Running transpile-md-to-json --src examples/content --watch transpiles markdown files to JSON and outputs the result to stdout as markdown files are created, edited and deleted.
I tried using get-stdin to capture the JSON and process it some more using another node script.
transpile-md-to-json --src src/privacy-guides --blogify --watch | node test.js
The problem is that stdin.on('end') is never fired, because the pipe isn't closed by transpile-md-to-json when watch mode is enabled (--watch).
See https://github.com/sindresorhus/get-stdin/blob/master/index.js#L23-L25
How can I work around this?
As pointed out by Mike in the comments, there appears to be no built-in way of achieving this: the pipe remains open until the parent exits, so stdin.on('end') is never fired.
The closest we can get is to use some kind of EOF indicator and use that to end a "cycle". An indicator we can hook into isn't always present, but in the context of JSON we're good, as each JSON payload ends with }.
const readline = require("readline")
const fs = require("fs")

process.stdin.setEncoding("utf-8")

const rl = readline.createInterface({
  input: process.stdin,
})

var json = ""
rl.on("line", function (line) {
  json += `${line}\n`
  // A closing brace at the start of a line marks the end of one payload
  if (line === "}") {
    console.log("done", json)
    fs.writeFileSync("test.json", json)
    json = ""
  }
})
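If matching a literal } turns out to be brittle (for example with differently indented output), here is a sketch of a parse-based variant of the same idea: try to JSON.parse the buffer after each line and treat a successful parse as the end of one payload.
const readline = require("readline")

const rl = readline.createInterface({ input: process.stdin })

let buffer = ""
rl.on("line", (line) => {
  buffer += `${line}\n`
  try {
    // A successful parse means one complete payload has arrived.
    const payload = JSON.parse(buffer)
    console.log("done", payload)
    buffer = ""
  } catch (e) {
    // Not a complete JSON document yet; keep accumulating.
  }
})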

Get console output of current script

I want to get the console contents of the currently running Node.js script.
I've tried listening for this event, but it doesn't work:
setInterval(function () { console.log("Hello World!") }, 1000);

process.stdout.on('message', (message) => {
  console.log('stdout: ' + message.toString());
});
It doesn't listen to the event.
This is not a pure Node.js solution, but it works very well if you run Linux.
Create a start.sh file.
Put the following into it:
start.sh:
#!/bin/bash
touch ./console.txt
node ./MyScript.js |& tee console.txt &
wait
Now open your Node.js script (MyScript.js) and add this Express.js route:
MyScript.js:
const fs = require('fs');

// `app` is the existing Express application
app.get('/console', function (req, res) {
  var console2 = fs.readFileSync("./console.txt", 'utf8');
  res.send(console2);
});
Always start your Node.js application by calling start.sh.
Now requesting http://example.com/console should output the console!
A part of this answer was used.
NOTE: To format the line breaks of the console output so they display correctly in browsers, you can use a module like nl2br.
Some advice: problems aren't always solved the direct way; most of them are solved indirectly. Keep exploring the possible ways to achieve what you want, and don't search only for the exact thing you're looking for.
There's no 'message' event on process.stdout.
I want to make a GET in my Express.js app called /getconsole .. it should return the console of the current running Node.js script (which is running the Express.js app too)
What you should use is a custom logger. I recommend winston with a file transport; then you can read from that file when you receive a request to your endpoint.
const express = require('express');
const fs = require('fs');
const winston = require('winston');
const path = require('path');

const logFile = path.join(__dirname, 'out.log');
const app = express();

const logger = winston.createLogger({
  level: 'info',
  format: winston.format.json(),
  transports: [
    new winston.transports.Console({
      format: winston.format.simple()
    }),
    new winston.transports.File({
      filename: logFile
    })
  ]
});

// Don't use console.log anymore.
logger.info('Hi');

app.get('/console', (req, res) => {
  // Secure this endpoint somehow
  fs.createReadStream(logFile).pipe(res);
});

app.get('/log', (req, res) => {
  logger.info('Log: ' + req.query.message);
  res.sendStatus(204); // acknowledge the request so it doesn't hang
});

app.listen(3000);
You can also use a websocket connection, and create a custom winston transport to emit the logs.
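A hedged sketch of that custom-transport idea, assuming the winston-transport base class and a ws.Server instance (wss here) supplied by the caller:
const Transport = require('winston-transport');

class WebSocketTransport extends Transport {
  constructor(opts) {
    super(opts);
    this.wss = opts.wss; // a ws.Server instance, supplied by the caller
  }

  log(info, callback) {
    setImmediate(() => this.emit('logged', info));
    // Forward each log entry to every connected client.
    const line = `${info.level}: ${info.message}`;
    this.wss.clients.forEach((client) => {
      if (client.readyState === 1 /* OPEN */) client.send(line);
    });
    callback();
  }
}

// Usage: logger.add(new WebSocketTransport({ wss }));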
stdout, when going to a tty (terminal), is an instance of a writable stream. The same is true of stderr, to which node writes error messages. Those streams don't have 'message' events. The on() method lets you subscribe to any named event, even one that will never fire.
Your requirement is not clear from your question. If you want to intercept and inspect console.log operations, you can pipe stdout to some other stream. Similarly, you can pipe stderr to some other stream to intercept and inspect errors.
Or, in a burst of ugliness and poor maintainability, you can redefine the console.log and console.error functions to do whatever you need.
It sounds like you want to buffer up the material written to the console, and then return it to an HTTP client in response to a GET operation. To do that you would either:
stop using console.log for that output, and switch over to a high-quality logging npm package like winston, or
redefine console.log (and possibly console.error) to save its output in some kind of simple express-app-scope data structure, perhaps an array of strings, and then implement your GET to read that array, format it, and return it (a sketch of this appears below).
My first suggestion is more scalable.
By the way, please consider the security implications of making your console log available to malicious strangers.
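A minimal sketch of the second suggestion; names like logBuffer are illustrative, not from any particular library:
const express = require('express');
const util = require('util');

const app = express();
const logBuffer = []; // express-app-scope array of strings
const originalLog = console.log;

// Redefine console.log: keep a copy in memory, still print to the terminal.
console.log = function (...args) {
  const text = util.format(...args);
  logBuffer.push(text);
  originalLog.call(console, text);
};

app.get('/console', (req, res) => {
  // Remember the security caveat above before exposing this.
  res.type('text/plain').send(logBuffer.join('\n'));
});

app.listen(3000);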

nodejs : how to log to screen AND to file?

I use console.log in my node.js script; that way I can log to the screen,
e.g.:
node myscript.js
If I use
node myscript.js > log.txt
then I log to the file log.txt.
How can I log to the screen AND to a file?
Use tee.
node myscript.js | tee log.txt
If you want this behavior to be persistent within your app, you could create a through stream and pipe it to both a write stream and stdout.
var util = require('util');
var fs = require('fs');
var through = require('through2');

// Use the 'a' flag to append to the file instead of overwriting it.
var ws = fs.createWriteStream('/path/to/log', { flags: 'a' });

// Create a through stream.
var t = new through();

// Pipe its data to both stdout and our file write stream.
t.pipe(process.stdout);
t.pipe(ws);

// Monkey patch the console.log function to write to our through
// stream instead of stdout like default.
console.log = function () {
  t.write(util.format.apply(this, arguments) + '\n');
};
Now this will write to both stdout (terminal display) and to your log file.
You can also omit the through stream and just write to both streams in the monkey patched function.
console.log = function () {
  var text = util.format.apply(this, arguments) + '\n';
  ws.write(text);
  process.stdout.write(text);
};
The through stream just gives you a single stream you can use in other ways around your app, and you always know it's piped to both output streams. But if all you want is to monkey patch console.log, then the latter example is sufficient :)
If you only want this behavior for a single run of your app from the terminal, see #andars' answer and the tee command :)
PS - This is all that console.log actually does in node, in case you were wondering:
Console.prototype.log = function () {
  this._stdout.write(util.format.apply(this, arguments) + '\n');
};
