I'm creating a pomodoro timer with Node.
At the moment I start the project like so: node start.js coding
I let that run as I do my work. When I need a break I terminate the process, and the time I've spent coding gets added into a JSON file like so:
{
  "code": [
    {
      "seconds": 1,
      "time": "00 : 00 : 01",
      "date": "2020-06-28T03:08:42.340Z"
    }
  ],
  "read": [],
  "write": []
}
Now I'm just trying to think of things I'll need in the future; I'll most definitely forget the keys in the above object. Is it code, coding, write, writing? So I thought I'd have a prompt:
var objKeys = [...Object.keys(obj), 'info']
const inputVariable = objKeys[readline.keyInSelect(objKeys, 'What are you going to be working on?')]
As it is, once I make the selection, the process terminates, which I don't want.
(I could make the selection when I actually want to terminate, but that would most likely be confusing.)
Is there a way to make the selection and still keep the process running?
EDIT
const time = require('./module/timeEntry');
var readline = require('readline-sync');
var obj = require('./data.json') // has the above json code
var start = process.hrtime(); // start the timer
var objKeys = Object.keys(obj)
const inputVariable = objKeys[readline.keyInSelect(objKeys, "Which task are you going to work on?")]
function exitHandler(options, exitCode) {
  if (options.cleanup) console.log('clean');
  if (exitCode || exitCode === 0) {
    if (inputVariable !== "info") {
      time.timeEntry(obj, start, inputVariable) // reads and writes to file
    }
  }
  if (options.exit) process.exit();
}
process.on('exit', exitHandler.bind(null,{cleanup:true}));
// I want to be able to do this: `ctrl+c`
process.on('SIGINT', exitHandler.bind(null, {exit:true}));
The problem is that the moment inputVariable is entered by the user, the process ends.
time.timeEntry(obj, start, inputVariable) simply reads and writes some timekeeping info to the JSON file:
var fs = require('fs');
var getTime = require('./getTime')
const timeEntry = (obj, start, segment) => {
  let totalSeconds = process.hrtime(start)[0]; // whole seconds elapsed since start
  obj[segment].push({
    seconds: totalSeconds,
    time: getTime.getTime(totalSeconds),
    date: new Date()
  });
  let data = JSON.stringify(obj);
  fs.writeFileSync('data.json', data, 'utf-8');
};
exports.timeEntry = timeEntry;
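(The getTime helper isn't shown in the question. A minimal hypothetical sketch that would produce the "00 : 00 : 01" format seen in the JSON might look like this; it's my reconstruction, not the asker's module.)

// module/getTime.js (hypothetical reconstruction)
const getTime = (totalSeconds) => {
  const pad = (n) => String(n).padStart(2, '0');
  const hours = Math.floor(totalSeconds / 3600);
  const minutes = Math.floor((totalSeconds % 3600) / 60);
  const seconds = totalSeconds % 60;
  return pad(hours) + ' : ' + pad(minutes) + ' : ' + pad(seconds);
};
exports.getTime = getTime;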
I don't need to use readline-sync. If I instead use const inputVariable = process.argv[2] and run node start.js coding, the process isn't terminated, which is what I want.
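For comparison, here's a minimal sketch of that argv-based variant (same exitHandler idea as above, assuming data.json already contains the matching key; per the question, this variant keeps running until Ctrl-C):

const time = require('./module/timeEntry');
var obj = require('./data.json');
var start = process.hrtime();

// taken straight from the command line: node start.js coding
const inputVariable = process.argv[2];

process.on('SIGINT', function () {
  if (inputVariable !== 'info') {
    time.timeEntry(obj, start, inputVariable); // persist the elapsed time
  }
  process.exit(0);
});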
Here's what your code is doing right now:
read in the inputVariable from the user
define a function exitHandler
tell Node to invoke exitHandler when the program exits
tell Node to invoke exitHandler upon receiving SIGINT
Note that none of these steps involves calling exitHandler, or doing anything else for that matter (e.g. there's no code here that waits for anything to happen).
Perhaps the confusion is coming from the use of process.on: this function tells Node that when it receives SIGINT (e.g. you press Ctrl-C), then it should call exitHandler. In particular, this does not tell your program "pause execution until SIGINT is received". As a result, after calling process.on, your code "continues" but there's no more code to run so the process ends (before it can ever receive SIGINT).
It seems like you want your program to do nothing until the signal is received. A busy loop like while (true) {} won't work here: it blocks the event loop, so the SIGINT callback would never get a chance to run (and it pegs a CPU core). Instead, keep the event loop alive with something non-blocking at the end, e.g.
setInterval(function () {}, 1 << 30);
or process.stdin.resume();. Your code will then idle until it receives SIGINT, at which point Node calls exitHandler.
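Putting it together, a minimal sketch of what the whole script could look like (same wiring as in the question; the keep-alive interval is the only real addition):

const time = require('./module/timeEntry');
var readline = require('readline-sync');
var obj = require('./data.json');
var start = process.hrtime();

var objKeys = Object.keys(obj);
const inputVariable = objKeys[readline.keyInSelect(objKeys, 'What are you going to be working on?')];

process.on('SIGINT', function () {
  if (inputVariable !== 'info') {
    time.timeEntry(obj, start, inputVariable); // persist the elapsed time
  }
  process.exit(0);
});

// keep the event loop alive until Ctrl-C
setInterval(function () {}, 1 << 30);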
Related
I have an Electron application that opens an external program (in my case Office) and has to wait for the program to be closed.
The code I wrote works great, but sometimes the child_process.on('close') event fires 10 or 20 seconds after the program has closed. The code is:
const cp = require("child_process");
// placeholder paths quoted so the snippet parses
let child = cp.spawn('path/to/Office.exe' + ' "' + 'path/to/myFile.pptx' + '"', { shell: true });
child.on('close', function (code) {
  // do something
});
Most of the time it reacts after 1 or 2 seconds, which is fine, but sometimes it takes up to 20 seconds until I receive the close event. The program closes fast (according to the task manager), but Node seems to wait for something.
I also tried child.on('exit'), calling the program with cp.exec(), and using the stdio: 'ignore' option for spawn, as I thought maybe Node was waiting for some stream from the child. But that made no difference.
Does anybody know a safe way to speed that process up?
I have tried your code and the close event triggers with a 0.5-2s delay, which is bearable, I would say.
However, the 20s delay did not occur on my end. If the problem persists for you, you can try the approach below, which consists of checking whether the spawned PID still exists.
const pidExists = (pid) => {
  let pidOk = true;
  try {
    // signal 0 performs no kill; it only checks that the process exists
    process.kill(pid, 0);
  } catch (e) {
    pidOk = false;
  }
  return pidOk;
};
const cp = require("child_process");
// The detached option is used because we won't need the child process handle anymore; we follow the PID instead.
let child = cp.spawn('path/to/Office.exe' + ' "' + 'path/to/myFile.pptx' + '"', { shell: true, detached: true });
let officePID = child.pid; // this is the spawn pid
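// Optionally (my addition, not part of the original answer), child.unref()
// detaches the child from the parent's event loop; it isn't strictly required
// here because the setInterval below keeps the parent alive anyway.
child.unref();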
setInterval(() => {
  if (pidExists(officePID)) {
    console.log('file is still open', new Date().getTime());
  } else {
    console.log('file was closed', new Date().getTime());
    process.exit(0);
  }
}, 500);
This is a better approach since you said that the task manager shows you that the program was closed.
I'm trying to send messages from my child process to my main process, but some chunks are not being sent, possibly because the file is too big.
main process:
const { exec } = require('child_process') // (import not shown in the original snippet)

let response = ''
let error = ''
await new Promise(resolve => {
  const p = exec(command)
  p.stdout.on('data', data => {
    // this gets triggered many times because the html string is big and gets split up
    response += data
  })
  p.stderr.on('data', data => {
    error += data
  })
  p.on('exit', resolve)
})
console.log(response)
child process:
// only fetch 1 page, then quit
const bigHtmlString = await fetchHtmlString(url)
process.stdout.write(bigHtmlString)
I know the child process works because when I run it directly, I can see the end of the file in the console. But when I run the main process, I cannot see the end of the file. It's quite big, so I'm not sure exactly which chunks are missing.
edit: there's also a new unknown problem. When I add a wait at the end of my child process, it doesn't wait; it closes. So I'm guessing it crashes somehow? I'm not seeing any error, even with p.on('error', console.log).
example:
const bigHtmlString = await fetchHtmlString(url)
process.stdout.write(bigHtmlString)
// this never gets executed, the process closes. The wait works if I launch the child process directly
await new Promise(resolve => setTimeout(resolve, 1000000))
process.stdout.write(...) returns true if the string was handed off to the underlying stream, and false if it was queued in memory because the internal buffer is full. If it returns false, you can listen for the 'drain' event to know when the buffer has been flushed.
Something like this:
const bigHtmlString = await fetchHtmlString(url);
const wrote = process.stdout.write(bigHtmlString);
if (!wrote) {
  // this effectively means "wait for this event to fire",
  // but it doesn't block everything
  process.stdout.on('drain', ...doSomethingHere);
}
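If there are several large chunks to write, one pattern (a sketch, not the only way to do it) is to wrap the write in a promise and await the 'drain' event before continuing:

const writeChunk = (chunk) =>
  new Promise((resolve) => {
    // write() returns false when the internal buffer is full;
    // wait for 'drain' before resolving in that case
    if (!process.stdout.write(chunk)) {
      process.stdout.once('drain', resolve);
    } else {
      resolve();
    }
  });

// usage: await writeChunk(bigHtmlString);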
My suggestion from the comments resolved the issue, so I'm posting it as an answer.
I would suggest using spawn instead of exec. The latter buffers the output and only hands it over when the process ends or the buffer is full; by default exec terminates the child with an error once its maxBuffer limit (1 MB) is exceeded. spawn, by contrast, streams the output, which is better for huge output like in your case.
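For illustration, here is the main process from the question rewritten with spawn; the shell: true option lets it accept the same full command string that exec took (assuming command is a single string, as above):

const { spawn } = require('child_process')

let response = ''
let error = ''
await new Promise(resolve => {
  // spawn streams stdout/stderr instead of buffering the whole output
  const p = spawn(command, { shell: true })
  p.stdout.on('data', data => {
    response += data
  })
  p.stderr.on('data', data => {
    error += data
  })
  p.on('exit', resolve)
})
console.log(response)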
The CLI can no longer receive keyboard input after the command executes, and that includes Ctrl-C and Ctrl-Z, so I have to kill the program manually. It has given me a lot of trouble; please take a look:
var { exec, spawn } = require("child_process");
let cmd = (cmdline, consolelog = true) => {
  return new Promise((resolve, reject) => {
    let cmdarray = cmdline.split(" ");
    let result = "";
    let error = "";
    let child = spawn(cmdarray.shift(), cmdarray);
    process.stdin.pipe(child.stdin);
    child.stdout.setEncoding("utf8");
    child.stderr.setEncoding("utf8");
    child.stderr.on("data", data => {
      if (consolelog) process.stdout.write(data.toString());
      error = data.toString();
    });
    child.stdout.on("data", data => {
      if (consolelog) process.stdout.write(data.toString());
      result = data.toString();
    });
    child.on("close", code => {
      if (consolelog) process.stdout.write(`Exit code: ${code}\n`);
      code == 0 ? resolve(result) : reject(error);
    });
  });
};
OS: macOS & Ubuntu 19.04
Test case:
cmd("echo hi");
Edit:
Under normal circumstances: put the code inside myprogram.js and use node myprogram.js to run the script. It works perfectly, and you can also try different commands. HOWEVER, if you run the following from the REPL:
$ node
> let cmd = require(PATH_TO_CMD_FUNCTION)
> cmd("echo hi");
the Node REPL will freeze and stop listening to your keyboard input.
Edit 2:
Found out you need to channel the child's stdio through with {stdio: "inherit"}.
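In other words, something like this (a sketch of that fix; the child shares the parent's terminal instead of being piped):

let child = spawn(cmdline, { shell: true, stdio: "inherit" });
// no process.stdin.pipe(child.stdin) needed: the child reads the terminal
// directly and releases it when it exits

Note that with stdio: "inherit" the child's stdout/stderr are null on the child object, so the 'data' handlers (and the result/error accumulation) would have to go.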
UPDATED ANSWER:
I trimmed down your spawner a little in order to be succinct and to eliminate other possibilities. There is one common test case I could find that reproduces the stated issue regarding signals, keyboard shortcuts, and trapped input.
If you spawn the 'sh' command, you will not be able to escape from the spawned process by means of conventional signal keyboard shortcuts. This is because node.js "traps" the input and forwards it directly to the spawned process.
Most processes allow killing via signalling through keyboard shortcuts such as Ctrl-C. 'sh', however, does not, and so is a perfect example.
The only ways to exit are to use the 'exit' command, close the window (which may leave the spawned process running in the background), reboot your machine, etc., or to send the process a signal internally or by some other means, but not via stdin or its equivalent.
Your CTRL-C input, in other words, is "normally working" not because it is killing your node app, but because it is being forwarded to the spawned process and killing it.
The spawned process will continue to trap your input if it is immune.
require("child_process").spawn("sh", {
shell: true,
encoding: 'utf8',
stdio: [0,1,2]
});
This may not be the best example for your specific program, but it illustrates the principle, which is the closest I can come since I cannot replicate with the given test case (I have tried it on my phone, my laptop, and my cloud server, three different versions of node, two different versions of Ubuntu).
In any case, it sounds like your stdin is not being "let go" by the spawned process. You may need to "reassign" it to the original process.stdin .
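For example, releasing stdin when the child closes might look like this (a sketch; unpipe() and pause() are standard stream calls, though whether this is enough depends on the spawned program):

child.on("close", () => {
  // stop forwarding keystrokes to the now-dead child
  process.stdin.unpipe(child.stdin);
  // release stdin so the event loop can exit and the terminal is usable again
  process.stdin.pause();
});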
As stated here:
https://node.readthedocs.io/en/latest/api/child_process/
Also, note that node establishes signal handlers for 'SIGINT' and
'SIGTERM', so it will not terminate due to receipt of those signals,
it will exit.
PREVIOUSLY:
It looks like your cmd function is only getting one argument (the command itself) due to the split and shift. Spawn expects a string with the whole command, so likely it is only getting "echo" without "hi", and it isn't exiting because it hangs on "echo". You may need to append a newline ("\n") as well.
It may also help to nest the command in an sh command that then executes it, so it runs in a shell.
Like this:
var { exec, spawn } = require("child_process");

let cmd = (cmdline, consolelog = true) => {
  return new Promise((resolve, reject) => {
    let result = "";
    let error = "";
    // This. Note the shell option.
    let child = spawn(cmdline, { shell: true });
    process.stdin.pipe(child.stdin);
    child.stdout.setEncoding("utf8");
    child.stderr.setEncoding("utf8");
    child.stderr.on("data", data => {
      if (consolelog) process.stdout.write(data.toString());
      error = data.toString();
    });
    child.stdout.on("data", data => {
      if (consolelog) process.stdout.write(data.toString());
      result = data.toString();
    });
    child.on("close", code => {
      if (consolelog) process.stdout.write(`Exit code: ${code}\n`);
      code == 0 ? resolve(result) : reject(error);
    });
  });
};
cmd("echo hello");
Output:
hello
Exit code: 0
I am experimenting with the event loop. First I begin with this straightforward code to read and print the contents of a file:
var fs = require('fs');
var PATH = "./.gitignore";
fs.readFile(PATH, "utf-8", function (err, text) {
  console.log("----read: " + text);
});
Then I place it into an infinite loop. In this case, the readFile callback is never executed. If I am not mistaken, it's because Node's single thread is busy iterating without ever letting the I/O callbacks run:
while (true) {
  var fs = require('fs');
  var PATH = "./.gitignore";
  fs.readFile(PATH, "utf-8", function (err, text) {
    console.log("----read: " + text);
  });
}
So, I would like to do something so that I/O calls are assigned process time intertwined with the loop. I tried with process.nextTick() but it doesn't work:
while (true) {
  process.nextTick(function () {
    fs.readFile(PATH, "utf-8", function (err, text) {
      console.log("----read: " + text);
    });
  });
}
Why isn't it working and how could I make it?
Because your while loop is still running. It's just infinitely adding things to do on the next tick. If you let it go, your Node process will crash as it runs out of memory.
When you work with async code, your normal loops and control structures tend to trip you up. The reason is that they execute synchronously in one step of the event loop. Until something happens that yields control to the event loop again, nothing 'nextTick' will happen.
Think of it like this: you are in pass B of the event loop when your code runs. When you call
process.nextTick(function foo() { do.stuff(); })
you are adding foo to the list of 'things to do before you start pass C of the event loop'. Every time you call nextTick, you add one more thing to the list, but none of them will run until the synchronous code is done.
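A tiny illustration of that ordering (my example, not from the original post):

process.nextTick(function () { console.log('tick'); });
for (var i = 0; i < 3; i++) { console.log('sync ' + i); }
// prints sync 0, sync 1, sync 2, and only then tick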
What you need to do instead is create 'do the next thing' links in your callbacks. Think linked-lists.
// var files = your list of files;

function do_read(count) {
  var next = count + 1;
  fs.readFile(files[count], "utf-8", function (err, text) {
    console.log("----read: " + text);
    if (next < files.length) {
      // this doesn't run until the previous readFile completes.
      process.nextTick(function () { do_read(next); });
    }
  });
}

// kick off the first one (do_read takes an index, not a filename):
do_read(0);
(obviously this is a contrived example, but you get the idea)
This causes each 'next file' to be added to the 'nextTick' to-do queue only after the previous one has been fully processed.
TL;DR: most of the time, you don't want to start the next thing until the previous thing has completed.
Hope that helps!
In a Node program I'm reading from a file stream with fs.createReadStream. But when I pause the stream, the program exits. I thought the program would keep running, since the file is still open, just not being read.
Currently, to keep it from exiting, I'm setting an interval that does nothing:
setInterval(function() {}, 10000000);
When I'm ready to let the program exit, I clear it. But is there a better way?
Example code where Node will exit:
var fs = require('fs');
var rs = fs.createReadStream('file.js');
rs.pause();
Node will exit when there is no more queued work. Calling pause() on a readable stream simply stops 'data' events from being emitted. At that point there are no events being emitted and no outstanding work requests, so Node will exit. The setInterval works since it counts as queued work.
Generally this is not a problem since you will probably be doing something after you pause that stream. Once you resume the stream, there will be a bunch of queued I/O and your code will execute before Node exits.
Let me give you an example. Here is a script that exits without printing anything:
var fs = require('fs');
var rs = fs.createReadStream('file.js');
rs.pause();
rs.on('data', function (data) {
  console.log(data); // never gets executed
});
The stream is paused, there is no outstanding work, and my callback never runs.
However, this script does actually print output:
var fs = require('fs');
var rs = fs.createReadStream('file.js');
rs.pause();
rs.on('data', function (data) {
  console.log(data); // prints stuff
});
rs.resume(); // queues I/O
In conclusion, as long as you are eventually calling resume later, you should be fine.
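For instance, this small variation (my sketch) stays alive because of the pending timer, and prints once resume() re-queues the reads:

var fs = require('fs');
var rs = fs.createReadStream('file.js');
rs.pause();
rs.on('data', function (data) {
  console.log(data); // runs once the timeout fires
});
// the pending timer keeps Node alive; resume() then queues the I/O
setTimeout(function () {
  rs.resume();
}, 1000);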
Short way, based on the answer above:
require('fs').createReadStream('file.js').pause();