Short explanation:
I'm attempting to write a simple game in Node.js that needs to wait for user input every turn. How do I avoid callback hell (i.e. messy code) inside a turn loop where each iteration needs to block and wait for input from stdin?
Long explanation:
All the explanations I have read on Stack Overflow when someone asks about blocking for stdin input seem to boil down to "that's not what Node.js is about!"
I understand that Node.js is designed to be non-blocking and I also understand why. However, it leaves me stuck between a rock and a hard place on how to solve this. I see three options:
1. Find a way to block for stdin and retain my while loop.
2. Ditch the while loop and instead recursively call a method (like nextTurn) whenever the previous turn ends.
3. Ditch the while loop and instead use setTimeout(nextTurn, 0) or something similar to call a method whenever a turn ends.
With option (1) I am going against Node.js principles of non-blocking IO.
With option (2) I will eventually reach a stack overflow as each call adds another turn to the call stack.
With option (3) my code ends up being a mess to follow.
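For illustration, here is roughly what I mean by options (2) and (3); promptForInput and handleTurn are hypothetical stand-ins for an async stdin reader and the game logic:

function nextTurn() {
  promptForInput(function (input) { // hypothetical async stdin reader
    handleTurn(input);              // hypothetical game logic
    // option (3): schedule the next turn on the event loop; option (2)
    // would call nextTurn() here directly instead
    setTimeout(nextTurn, 0);
  });
}
nextTurn();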
Node.js has built-in functions suffixed with Sync (e.g. in the fs library), and I'm wondering why there is no Sync method for getting user input. And if I were to write something similar to fs.readSync, how would I go about doing it and still follow best practices?
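For what it's worth, a minimal sketch of such a synchronous read, built on fs.readSync against file descriptor 0 (stdin), might look like this; treat it as an assumption about one possible approach, not a vetted best practice:

const fs = require('fs');

// Read one line from stdin synchronously, one byte at a time.
function readLineSync() {
  const buf = Buffer.alloc(1);
  let line = '';
  while (true) {
    let bytesRead = 0;
    try {
      bytesRead = fs.readSync(0, buf, 0, 1, null); // fd 0 = stdin
    } catch (e) {
      if (e.code === 'EAGAIN') continue; // stdin not ready yet; busy-wait and retry
      if (e.code === 'EOF') break;       // some platforms report EOF as an error
      throw e;
    }
    if (bytesRead === 0 || buf[0] === 0x0a) break; // real EOF or '\n'
    line += buf.toString('utf8', 0, bytesRead);
  }
  return line;
}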
Just found this:
https://www.npmjs.com/package/readline-sync
Example code (after running npm install readline-sync):
var readlineSync = require('readline-sync');

while (true) {
  var yn = readlineSync.question("Do you like having tools that let you code how you want, rather than how their authors wanted?");
  if (yn === 'y') {
    console.log("Hooray!");
  } else {
    console.log("Back to callback world, I guess...");
    process.exit();
  }
}
Only problem so far is the wailing of the "That's not how node is meant to be used!" chorus, but I have earplugs :)
I agree with the comment about moving towards an event-based system and would ditch the loops. I've thrown together a quick example of text-based command processing which can be used for simple text games.
var es = require('event-stream');

process.stdin
  .pipe(es.split())
  .on('data', parseCommand);

var actionHandlers = {};

function parseCommand(command) {
  // The first word is the action; any remaining words are its arguments.
  var words = command.split(' '),
      action = words.shift() || '';
  if (actionHandlers[action]) {
    actionHandlers[action](words);
  } else {
    invalidAction(action);
  }
}

function invalidAction(action) {
  console.log('Unknown Action:', action);
}

actionHandlers['move'] = function (words) {
  console.log('You move', words);
};

actionHandlers['attack'] = function (words) {
  console.log('You attack', words);
};
You can now break up your actions into discrete functions which you register on a central actionHandlers object. This makes adding new commands almost trivial. If you can add some details on why the above approach wouldn't work well for you, let me know and I'll revise the answer.
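For example, a hypothetical zero-argument command slots in the same way:

actionHandlers['quit'] = function () {
  console.log('Goodbye!');
  process.exit(0);
};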
ArtHare's solution, at least for my use case, blocked background execution, including tasks started by a promise. While this code isn't elegant, it did block execution of the current function until the read from stdin completed.
While this code must run from inside an async function, keep in mind that running an async function from a top-level context (directly from a script, not contained within any other function) will block that function until it completes.
Below is a full .js script demonstrating usage, tested with node v8.12.0:
const readline = require('readline');

const sleep = (waitTimeInMs) => new Promise(resolve => setTimeout(resolve, waitTimeInMs));

async function blockReadLine() {
  var rl = readline.createInterface({
    input: process.stdin,
    output: process.stdout,
    terminal: false
  });

  let result = undefined;
  rl.on('line', function (line) {
    result = line;
  });

  // Poll until the 'line' handler has stored a result.
  while (!result) await sleep(100);
  rl.close();
  return result;
}

async function run() {
  // A fire-and-forget background loop, to demonstrate that it keeps
  // running while we "block" on stdin below.
  new Promise(async () => {
    while (true) {
      console.log("Won't be silenced! Won't be censored!");
      await sleep(1000);
    }
  });

  let result = await blockReadLine();
  console.log("The result was: " + result);
  process.exit(0);
}

run();
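As a side note, if the 100 ms polling feels wasteful, an alternative sketch (assuming Node 11.13+ for events.once) awaits the first 'line' event directly:

const { once } = require('events');
const readline = require('readline');

async function readLineOnce() {
  const rl = readline.createInterface({ input: process.stdin, terminal: false });
  // once() resolves with the event's arguments array when 'line' first fires
  const [line] = await once(rl, 'line');
  rl.close();
  return line;
}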
Related
I am trying to write a gjs app that needs to send emails.
The way I have found to do this is using spawn_async_with_pipes() to call mail.
The app seems to spawn mail, and I don't get an error, but I don't get any useful output nor do I get the test emails...
I have been at this for a while now and have found little to no useful, up-to-date documentation. I am working with gtk3 and gjs (and glib). I have also tried spawning a shell script that in turn calls mail. This resulted in "could not resolve host" errors and a dead letter queue, so I know that I am spawning my command. I'm not concerned about the "could not resolve host" error itself so much as the fact that I don't get it when spawning mail directly.
I am spawning mail like this:
const [res, pid, in_fd, out_fd, err_fd] =
  await GLib.spawn_async_with_pipes(null,
    ['mail',
     '-V',
     `-s "${msgObj.subBlock}"`,
     `-r ${to}`,
     `-S smtp=${HOST}`,
     '-S smtp-use-starttls',
     '-S smtp-auth=login',
     `-S smtp-auth-user=${USER}`,
     `-S smtp-auth-password=${PASS}`,
     FROM
    ], null, GLib.SpawnFlags.SEARCH_PATH, null);
const in_reader = new Gio.DataOutputStream({
  base_stream: new Gio.UnixOutputStream({ fd: in_fd })
});

var feedRes = in_reader.put_string(msgObj.msgBlock, null);

const out_reader = new Gio.DataInputStream({
  base_stream: new Gio.UnixInputStream({ fd: out_fd })
});

const err_reader = new Gio.DataInputStream({
  base_stream: new Gio.UnixInputStream({ fd: err_fd })
});

var out = out_reader.read_until("", null);
var err = err_reader.read_until("", null);

print(` > out : "${out}"`);
print(` > res : "${res}"`);
print(` > feedRes : "${feedRes}"`);
print(` > err : "${err}"`);
err is 0, and res is just true
I don't know what the output should be, but I'm not getting a recognizable error, and no email is being delivered...
How can I get my app to send emails? Is spawning mail not the way to go?
Thanks in advance for any pointers you can give me.
There are a couple of things here that I think are confusing you and that I can clear up.
await GLib.spawn_async_with_pipes(
GLib has its own concept of async functions, which when applicable need to be wrapped in a Promise to work effectively with the await keyword. In this case, GLib.spawn_async_with_pipes() is not asynchronous in the way you're thinking, but that's okay, because we're going to use the higher-level class Gio.Subprocess.
async function mail(msgObj, to, host, user, pass, cancellable = null) {
  try {
    let proc = new Gio.Subprocess({
      argv: ['mail',
        '-V',
        // Option switches and values are separate args
        '-s', `"${msgObj.subBlock}"`,
        '-r', `${to}`,
        '-S', `smtp=${host}`,
        '-S', 'smtp-use-starttls',
        '-S', 'smtp-auth=login',
        '-S', `smtp-auth-user=${user}`,
        '-S', `smtp-auth-password=${pass}`,
        FROM
      ],
      flags: Gio.SubprocessFlags.STDIN_PIPE |
             Gio.SubprocessFlags.STDOUT_PIPE |
             Gio.SubprocessFlags.STDERR_MERGE
    });

    // Classes that implement GInitable must be initialized before use, but
    // you could use Gio.Subprocess.new(argv, flags) which will call this for you
    proc.init(cancellable);

    // We're going to wrap a GLib async function in a Promise so we can
    // use it like a native JavaScript async function.
    //
    // You could alternatively return this Promise instead of awaiting it
    // here, but that's up to you.
    let stdout = await new Promise((resolve, reject) => {
      // communicate_utf8() returns a string, communicate() returns a
      // GLib.Bytes, and there are "headless" variants available as well
      proc.communicate_utf8_async(
        // This is your stdin, which can just be a JS string
        msgObj.msgBlock,
        // We've been passing this around from the function args; you can
        // create a Gio.Cancellable and call `cancellable.cancel()` to
        // stop the command, or any other operation you've passed it to,
        // at any time, which will throw an "Operation Cancelled" error.
        cancellable,
        // This is the GAsyncReady callback, which works like any other
        // callback, but we need to ensure we catch errors so we can
        // propagate them with `reject()` to make the Promise work
        // properly
        (proc, res) => {
          try {
            let [ok, stdout, stderr] = proc.communicate_utf8_finish(res);
            // Because we used the STDERR_MERGE flag, stderr will be
            // included in stdout. Obviously you could also call
            // `resolve([stdout, stderr])` if you wanted to keep both
            // and separate them.
            //
            // This won't affect whether the proc actually returns non-
            // zero, causing the Promise to reject()
            resolve(stdout);
          } catch (e) {
            reject(e);
          }
        }
      );
    });

    return stdout;
  } catch (e) {
    // This could be any number of errors, but probably it will be a GError,
    // in which case it will have a `code` property carrying a GIOErrorEnum
    // you could use to respond programmatically, if desired.
    logError(e);
  }
}
Gio.Subprocess is a better choice overall, but especially for language bindings that can't pass "out" arguments into functions. With GLib.spawn_async_with_pipes you would usually pass in NULL to prevent opening any pipes you didn't want, and always ensure you close the ones you did open. Since we can't do that in GJS, you can end up with dangling file descriptors you can't close.
Gio.Subprocess does a lot of leg work for you and ensures file descriptors are closing, prevents zombie processes, sets up child watches for you and other things you really don't want to worry about. It also has convenience functions for getting IO streams so you don't have to wrap the fd's yourself, among other useful things.
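For example, here is a minimal sketch (GJS, with Gio imported as usual) of reading a subprocess's output through those convenience getters instead of wrapping raw fds yourself; the echo command is just a stand-in:

const proc = Gio.Subprocess.new(['echo', 'hello'], Gio.SubprocessFlags.STDOUT_PIPE);
const stdout = new Gio.DataInputStream({
  base_stream: proc.get_stdout_pipe()
});
// Synchronous read for brevity; the async variants follow the same
// wrap-in-a-Promise pattern shown above.
const [line] = stdout.read_line_utf8(null);
print(line);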
I wrote a longer primer on async programming in GJS that you might find helpful here. You should be able to breeze though it pretty quickly, and it tries to clear up some confusion about the relationship between GLib async, JavaScript async and the GLib main loop vs JS event loop.
I am new to Node, so please forgive me if my question is too simple. I fully appreciate the Async paradigm and why it is useful in single threads. But some logical operations are synchronous by nature.
I have found many posts about the async/sync issue, and have been reading for two whole days about callbacks, promises, async/await, etc...
But I still cannot figure out what should be a straightforward and simple thing to do. Am I missing something?
Basically for the code below:
const fs = require('fs');
var readline = require('readline');

function getfile(aprodfile) {
  var prodlines = [];
  var prodfile = readline.createInterface({ input: fs.createReadStream(aprodfile) });
  prodfile.on('line', (line) => { prodlines.push(line) });
  prodfile.on('close', () => console.log('Loaded ' + prodlines.length));
  // the above block is repeated several times to read several other
  // files, but is omitted here for simplicity
  console.log(prodlines.length);
  // then 200+ lines that assume prodlines is already filled
};
the output I get is:
0
Loaded 11167
whereas the output I expect is:
Loaded 11167
11167
This is because the console.log statement executes before the prodfile.on event handlers have fired.
Is there a nice clean way to tell Node to execute commands sequentially, even if blocking? Or, better still, to tell console.log (and the 200+ lines of code following it) to wait until prodlines is fully populated?
Here's the execution order of what you wrote:

prodfile.on('line', line => {     // (1) subscribe to the 'line' event
  prodlines.push(line)            // (4) runs whenever a 'line' event is triggered
});
prodfile.on('close', () => {      // (2) subscribe to the 'close' event
  console.log('Loaded ' + prodlines.length)  // (5) runs when the 'close' event is triggered
});
console.log(prodlines.length);    // (3) --> so it logs 0; nothing has happened yet
What you can do is this:

function getfile(aprodfile) {
  var prodlines = [];
  var prodfile = readline.createInterface({ input: fs.createReadStream(aprodfile) });
  prodfile.on('line', line => { prodlines.push(line) });
  prodfile.on('close', () => {
    console.log('Loaded ' + prodlines.length)
    finishedFetching(prodlines);
  });
}

function finishedFetching(prodlines) {
  console.log(prodlines.length) // 11167!
}
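A promise-based variant of the same idea (a sketch, assuming the caller can use async/await) avoids the continuation callback entirely:

function getFileLines(aprodfile) {
  return new Promise((resolve, reject) => {
    const prodlines = [];
    const stream = fs.createReadStream(aprodfile);
    stream.on('error', reject); // e.g. file not found
    readline.createInterface({ input: stream })
      .on('line', line => prodlines.push(line))
      .on('close', () => resolve(prodlines));
  });
}

// Usage, inside an async function:
// const prodlines = await getFileLines(aprodfile);
// console.log(prodlines.length); // fully populated here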
Super annoying problem here. Only the first write call on a fs.createWriteStream is working as expected.
I have this code
const errStrmPath = global.testStderrStrmPath = path.resolve(projRoot + '/suman/logs/test-stderr.log');
const strm = global.testStderrStrm = fs.createWriteStream(errStrmPath, { flags: 'w' });

global._writeTestError = function () {
  global.checkTestErrorLog = true;
  strm.write.apply(strm, arguments);
};

global._writeTestError('abc');
global._writeTestError('efg');
global._writeTestError('hij');
As it stands, it only writes to the log file once: 'abc' shows up in the file, but 'efg' and 'hij' do not. This is a long-lived program that only exits after some asynchronous actions are taken, so why would only the first write to the file happen?
Anyone have any idea what might be wrong?
At the moment, I can accomplish what I want using sync methods, but this is much less ideal.
If I change the implementation of global._writeTestError from this:
global._writeTestError = function () {
  global.checkTestErrorLog = true;
  strm.write.apply(strm, arguments); // async
};
to this:
global._writeTestError = function (data) {
  global.checkTestErrorLog = true;
  fs.appendFileSync(errStrmPath, data); // now sync
};
With the sync method, everything works as expected. It's very frustrating not to be able to use fs.createWriteStream and force it to flush its contents before calling process.exit(). Is there a way to force the stream to flush all data before exiting?
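One approach that should work (a sketch, untested against the code above): instead of calling process.exit() directly, end the stream and let its 'finish' callback trigger the exit, so buffered writes are flushed first:

function exitAfterFlush(code) {
  strm.end(function () {
    // the 'finish' event has fired; buffered data has been flushed
    process.exit(code);
  });
}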
var assert = require('assert');
var parseJSON = require('json-parse-async');

var contact = new Object();
contact.firstname = "Jesper";
contact.surname = "Aaberg";
contact.phone = ["555-0100", "555-0120"];

var contact2 = new Object();
contact2.firstname = "JESPER";
contact2.surname = "AABERG";
contact2.phone = ["555-0100", "555-0120"];

contact.toJSON = function (key) {
  var replacement = new Object();
  for (var val in this) {
    if (typeof (this[val]) === 'string')
      replacement[val] = this[val].toUpperCase();
    else
      replacement[val] = this[val];
  }
  return replacement;
};

var jsonText = JSON.stringify(contact);
contact = JSON.parse(jsonText);

console.log(contact);
console.log(contact2);
assert.deepEqual(contact, contact2, 'these two objects are the same');
What are the asynchronous equivalents of JSON.parse, JSON.stringify and assert.deepEqual? I am trying to create a race condition and non-deterministic behavior within the code above, but I have not been able to find non-blocking, asynchronous equivalents of the functions mentioned.
node.js does not have an actual asynchronous JSON parser built-in. If you want something that will actually do the parsing outside the main node.js Javascript thread, then you would have to find a third party module that parses the JSON outside of the Javascript thread (e.g. in a native code thread or in some other process). There are some modules in NPM that advertise themselves as asynchronous such as async-json-parser or async-json-parse or json-parse-async. You would have to verify that whichever implementation you were interested in was truly an asynchronous implementation (your Javascript continues to run while the parsing happens in the background).
But, reading the detail in your question about the problem you're trying to solve, it doesn't sound like you actually need a parser that truly happens in the background. To give you your ability to test what you're trying to test, it seems to me like you just need an indeterminate finish that allows other code to run before the parsing finishes. That can be done by wrapping the synchronous JSON.parse() in a setTimeout() with a promise that has a random delay. That will give some random amount of time for other code to run (to try to test for your race conditions). That could be done like this:
JSON.parseAsyncRandom = function (str) {
  return new Promise(function (resolve, reject) {
    // use a random 0-10 second delay
    setTimeout(function () {
      try {
        resolve(JSON.parse(str));
      } catch (e) {
        reject(e);
      }
    }, Math.floor(Math.random() * 10000));
  });
};

JSON.parseAsyncRandom(str).then(function (obj) {
  // process obj here
}, function (err) {
  // handle err here
});
Note: This is not true asynchronous execution. It's an asynchronous result (in that it arrives some random time later and other code will run before the result arrives), but true asynchronous execution happens in the background in parallel with other JS running and this isn't quite that. But, given your comment that you just want variable and asynchronous results for testing purposes, this should do that.
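If you do eventually need parsing that truly runs off the main thread, one sketch uses the built-in worker_threads module (Node 12+); note this is an illustration, and that posting the parsed object back to the main thread performs a structured clone, which has its own cost:

const { Worker } = require('worker_threads');

function parseInWorker(str) {
  return new Promise((resolve, reject) => {
    // Spawn a one-off worker whose entire body is the parse call.
    const worker = new Worker(
      `const { parentPort, workerData } = require('worker_threads');
       parentPort.postMessage(JSON.parse(workerData));`,
      { eval: true, workerData: str }
    );
    worker.once('message', resolve);
    worker.once('error', reject); // covers JSON.parse throwing in the worker
  });
}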
I've recently faced this problem myself, so I decided to create a library to handle JSON parsing in a really asynchronous way.
The idea behind it is to divide the parsing process into chunks, and then run each separately in the event loop so that other events (user interactions, etc) can still be evaluated within a few milliseconds, keeping the UI interactive.
If you are interested, the library is called RAJI and you can find it here: https://github.com/federico-terzi/raji
After installing RAJI, you can then convert your synchronous JSON.parse calls into async raji.parse calls, such as:
const object = await parse(payload);
These calls won't block the UI.
You can use 'bluebird', as in this example, to convert function calls into promises. The code below uses JavaScript ES6.
const Promise = require('bluebird')

function stringifyPromise(jsonText) {
  return Promise.try(() => JSON.stringify(jsonText))
}

function parsePromise(str) {
  return Promise.try(() => JSON.parse(str))
}

stringifyPromise(contact)
  .then(jsonText => parsePromise(jsonText))
  .then(contact => {
    assert.deepEqual(contact, contact2, 'these two objects are the same')
  })
I have a PhantomJS/CasperJS script which I'm running from within a node.js script using child_process.spawn(). Since CasperJS doesn't support require()ing modules, I'm trying to print commands from CasperJS to stdout and then read them in from my node.js script using spawn.stdout.on('data', function(data) {}); in order to do things like add objects to redis/mongoose (convoluted, yes, but it seems more straightforward than setting up a web service for this...). The CasperJS script executes a series of commands and creates, say, 20 screenshots which need to be added to my database.
However, I can't figure out how to break the data variable (a Buffer?) into lines... I've tried converting it to a string and then doing a replace; I've tried spawn.stdout.setEncoding('utf8'); but nothing seems to work...
Here is what I have right now
var spawn = require('child_process').spawn;
var bin = "casperjs";
// googlelinks.js is the example given at http://casperjs.org/#quickstart
var args = ['scripts/googlelinks.js'];

var cspr = spawn(bin, args);

//cspr.stdout.setEncoding('utf8');
cspr.stdout.on('data', function (data) {
  var buff = new Buffer(data); // Buffer.from(data) in modern Node
  console.log("foo: " + buff.toString('utf8'));
});

cspr.stderr.on('data', function (data) {
  data += '';
  console.log(data.replace("\n", "\nstderr: "));
});

cspr.on('exit', function (code) {
  console.log('child process exited with code ' + code);
  process.exit(code);
});
https://gist.github.com/2131204
Try this:
cspr.stdout.setEncoding('utf8');
cspr.stdout.on('data', function (data) {
  var str = data.toString(), lines = str.split(/(\r?\n)/g);
  for (var i = 0; i < lines.length; i++) {
    // Process the line, noting it might be incomplete.
  }
});
Note that the "data" event might not necessarily break evenly between lines of output, so a single line might span multiple data events.
I've actually written a Node library for exactly this purpose; it's called stream-splitter and you can find it on Github: samcday/stream-splitter.
The library provides a special Stream you can pipe your casper stdout into, along with a delimiter (in your case, \n), and it will emit neat token events, one for each line it has split out from the input Stream. The internal implementation for this is very simple, and delegates most of the magic to substack/node-buffers which means there's no unnecessary Buffer allocations/copies.
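Usage looks roughly like this (a sketch based on the README; double-check the repo for the exact API):

const StreamSplitter = require('stream-splitter');

const splitter = cspr.stdout.pipe(StreamSplitter('\n'));
splitter.encoding = 'utf8';
splitter.on('token', function (line) {
  // one complete line per 'token' event
  console.log('line:', line);
});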
I found a nicer way to do this with just pure node, which seems to work well:
const childProcess = require('child_process');
const readline = require('readline');

const cspr = childProcess.spawn(bin, args);
const rl = readline.createInterface({ input: cspr.stdout });
rl.on('line', line => { /* handle line here */ });
Adding to maerics' answer, which does not deal properly with cases where only part of a line is fed in a data dump (theirs will give you the first part and the second part of the line individually, as two separate lines).
var _breakOffFirstLine = /\r?\n/;

// Returns a function that takes chunks of stdout data, aggregates them, and
// passes complete lines one by one through to callback, as soon as it gets them.
function filterStdoutDataDumpsToTextLines(callback) {
  var acc = '';
  return function (data) {
    var splitted = data.toString().split(_breakOffFirstLine);
    // Everything before the last separator is a run of complete lines; the
    // final element is a partial, unended line.
    var inTactLines = splitted.slice(0, splitted.length - 1);
    if (inTactLines.length > 0) {
      // A partial, unended line left over from the previous dump is
      // completed by the first section of this one.
      inTactLines[0] = acc + inTactLines[0];
      // Store this dump's trailing partial line to be completed by the next
      // (we assume there will be a terminating newline at some point; this
      // is, generally, a safe assumption).
      acc = splitted[splitted.length - 1];
      for (var i = 0; i < inTactLines.length; ++i) {
        callback(inTactLines[i]);
      }
    } else {
      // No newline in this dump at all; keep accumulating.
      acc += splitted[0];
    }
  };
}
Usage:

cspr.stdout.on('data', filterStdoutDataDumpsToTextLines(function (line) {
  // each time this inner function is called, you get a single, complete line of the stdout
}));
You can give this a try. It ignores any empty lines or bare newline tokens.
cspr.stdout.on('data', (data) => {
  var parts = data.toString().split(/(\r?\n)/g);
  parts.forEach((item) => {
    if (item !== '\n' && item !== '\r\n' && item !== '') {
      console.log(item);
    }
  });
});
Old stuff but still useful...
I have made a custom stream Transform subclass for this purpose.
See https://stackoverflow.com/a/59400367/4861714
#nyctef's answer uses an official nodejs package.
Here is a link to the documentation: https://nodejs.org/api/readline.html
The node:readline module provides an interface for reading data from a Readable stream (such as process.stdin) one line at a time.
My personal use-case is parsing JSON output from the "docker watch" command run in a spawned child_process.
const dockerWatchProcess = spawn(...)
...
const rl = readline.createInterface({
  input: dockerWatchProcess.stdout,
  output: null,
});

rl.on('line', (log) => {
  console.log('dockerWatchProcess event::', log);
  // code to process a change to a docker event
  ...
});