get response from CasperJS in Node.js

I have a script which is calling another CasperJS script.
var exec = require('child_process').exec;
exec("casperjs webCheck/thisWebCheck.js",puts);
I need a way to get a response from this script back to the calling script. It just has to be a Boolean value, just yes the webcheck worked, or no the website is down. Then depending on that response I will execute one piece of code or another.
I have searched all through web posts, blog posts and stack exchange for an answer to this and have come up empty.
When I try to use the exit code, it never comes back with what I told it to exit with.
When I use stdout and try to validate against it, such as:
thisWebCheck.js
...
if (failed) {
  console.log("failed");
} else {
  console.log("success");
}
Main.js
var puts = function(error, stdout, stderr) {
  if (stdout == "failed") {
    doSomething();
  } else if (stdout == "success") {
    doSomethingElse();
  } else {
    console.log(stdout);
  }
};
The main.js will console.log() the stdout but won't validate against the strings.
TL;DR:
I just need some way to communicate between the two scripts. Also, if there is a better way to call the casper file, let me know.
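A likely cause of the string comparison failing: console.log in the CasperJS script appends a trailing newline, so stdout arrives as "failed\n" and the equality check never matches. A minimal sketch that trims the output before comparing (doSomething and doSomethingElse are the placeholders from the question):
var exec = require('child_process').exec;

exec("casperjs webCheck/thisWebCheck.js", function (error, stdout, stderr) {
  var result = stdout.trim(); // console.log appends "\n", so trim before comparing
  if (result === "failed") {
    doSomething();
  } else if (result === "success") {
    doSomethingElse();
  } else {
    console.log(stdout);
  }
});
The exit-code route should also work in principle: call casper.exit(1) on failure in the CasperJS script and read error.code in the exec callback, though PhantomJS-based tools have historically been unreliable about propagating exit codes on some platforms.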

Related

How to send input to child process created with spawn?

I'm running Windows 10, and I have a program, let's call it program, that can be run from the command line. When run, it responds to commands the user enters. The user enters a command, presses the return key, and the program prints a response. I did not make this program and do not have the source, so I cannot modify it.
I want to run this program from within Node.js, and have my Node.js program act as the user, sending it commands and getting the responses. I spawn my program like this:
var spawn = require('child_process').spawn;
var child = spawn('program');

child.stdout.on('data', function(data) {
  console.log(`stdout: ${data}`);
});
Then I attempt to send it a command, for example, help.
child.stdin.write("help\n");
And nothing happens. If I manually run the program, type help, and press the return key, I get output. I want Node.js to run the program, send it input, and receive the output exactly as a human user would. I assumed that stdin.write() would send the program a command as if the user typed it in the console. However, as the program does not respond, I assume this is not the case. How can I send the program input?
I've seen many similar questions, but unfortunately the solutions their authors report as "working" did not work for me.
Sending input data to child process in node.js
I've seen this question and answer and tried everything in it with no success. I've tried ending the command with \r\n instead of \n. I've also tried adding the line child.stdin.end() after writing. Neither of these worked.
How to pass STDIN to node.js child process
This person, in their self-answer, says that they got theirs to work almost exactly as I'm doing it, but mine does not work.
Nodejs Child Process: write to stdin from an already initialised process
This person, in their self-answer, says they got it to work by writing their input to a file and then piping that file to stdin. This sounds overly complicated to send a simple string.
This worked for me, when running from Win10 CMD or Git Bash:
console.log('Running child process...');
const spawn = require('child_process').spawn;
const child = spawn('node');
// Also worked, from Git Bash:
//const child = spawn('cat');
child.stdout.on('data', (data) => {
  console.log(`stdout: "${data}"`);
});
child.stdin.write("console.log('Hello!');\n");
child.stdin.end(); // EOF
child.on('close', (code) => {
  console.log(`Child process exited with code ${code}.`);
});
Result:
D:\Martin\dev\node>node test11.js
Running child process...
stdout: "Hello!
"
Child process exited with code 0.
I also tried running aws configure like this. At first it didn't work because I sent only a single line, but when I sent four lines for the four expected input values, it worked.
Maybe your program expects special properties for stdin, like being a real terminal, and therefore doesn't take your input?
Or did you forget to send the EOF using child.stdin.end();? (If you remove that call from my example, the child waits for input forever.)
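If the child really does expect a real terminal, one workaround worth trying (an assumption about the situation, not something verified against the asker's program) is the node-pty package, which runs the child inside a pseudo-terminal. A minimal sketch based on its documented API, with 'program' standing in for the actual executable:
const pty = require('node-pty');

// spawn the program inside a pseudo-terminal so it believes it is
// talking to a real, interactive console rather than a pipe
const child = pty.spawn('program', [], {
  name: 'xterm-color',
  cols: 80,
  rows: 30,
  cwd: process.cwd(),
  env: process.env
});

child.onData((data) => {
  console.log(`output: ${data}`);
});

// terminals send "\r" when the user presses the return key
child.write('help\r');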
Here is what worked for me. I used child_process exec to create a child process. Inside this child process Promise, I am handling the I/O part of the cmd given as a parameter. It's not perfect, but it's working.
Sample function call where you don't need any human input:
executeCLI("cat ~/index.html");
Sample function call where you interact with the AWS CLI:
executeCLI("aws configure --profile dev");
Code for the custom executeCLI function:
var { exec } = require('child_process');

async function executeCLI(cmd) {
  console.log("About to execute this: ", cmd);
  var child = exec(cmd);
  return new Promise((resolve, reject) => {
    child.stdout.on('data', (data) => {
      console.log(`${data}`);
      // forward our own stdin so the user can answer the command's prompts
      process.stdin.pipe(child.stdin);
    });
    // note: 'close' passes the exit code (and signal), not an error object
    child.on('close', function (code) {
      if (code !== 0) {
        console.log("Error executing cmd, exit code: ", code);
        reject(code);
      } else {
        resolve(code);
      }
    });
  });
}
Extract the user's code from the browser and save it into a file on your system using the fs module. Let that file be 'program.cpp', and save the user's input data in a text file, say 'input.txt'.
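A short sketch of that save step, assuming the code and input arrive from the browser in variables userCode and userInput (both names are illustrative):
var fs = require('fs');

// userCode and userInput would come from the browser request
fs.writeFileSync('program.cpp', userCode);
fs.writeFileSync('input.txt', userInput);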
Just as we can compile our C++ code in a terminal using g++, we will use child_process to access the system terminal and run the user's code.
execFile can be used to execute our program:
var { execFile } = require('child_process');

execFile("g++", ['program.cpp'], (err, stdout, stderr) => {
  if (err) {
    console.log("compilation error: ", err);
  } else {
    execFile('./a.out', ['<', 'input.txt'], { shell: true }, (err, stdout, stderr) => {
      console.log("output: ", stdout);
    });
  }
});
In this code we simply require child_process and use its execFile function.
First we compile the code in program.cpp, which creates the default a.out output file.
Then we run a.out, redirecting the input in input.txt to its stdin.
You can then view the generated output in your terminal and pass it back to the user.
For more details you can check: Child Processes

Binding custom node addon to lua with extensions

I have a large collection of asynchronous functions in nodejs code that I would like to expose to lua. The basic idea is that I would like to execute lua scripts and allow those scripts to call back into some of my nodejs code, as well as asynchronously return a value from an executed lua script.
In this example, myCustomNodejsAddon would be a custom addon that I write that knows how to bind lua and run lua scripts. One outstanding question is: how do I asynchronously return a value from a lua script?
Has anyone done something like this before? I would be very interested in any pointers, thoughts, examples.
EDIT with better example:
-- user written lua script
getUser(1, function(err, user)
  if err then
    print('Error', err)
  else
    print('Found user with id', user.id)
    return ''
  end
end)
/* Create object with mapping of async functions */
var callbacks = {
  "getUser": function(userId, cb) {
    db.Users.fetchById(userId).then(function(user) {
      cb(null, user);
    }, function(err) {
      cb(err, null);
    });
  }
};
myCustomNodejsAddon.provideCallbacks(callbacks);

/* user written lua script has been stored into `scriptSrc` variable */
myCustomNodejsAddon.execute(scriptSrc, function(returnValueOfScript) {
  console.log('done running user script: ', returnValueOfScript);
});
More than one approach to this problem comes to mind.
The first would be to create a nodejs script that, once executed, reads the command line arguments or the input stream, executes the code indicated by that channel, and streams the response back, in JSON format for example. This is the least invasive way of doing this. The script would be something like:
if (require.main === module) {
  // assume the first argument to be the source module for the function of interest
  var mod = require(process.argv[2]);
  var fnc = mod[process.argv[3]];
  var args = process.argv.slice(4);
  // by convention the last argument is a callback function
  args.push(function() {
    // convert the arguments object to a real array so it serializes as one
    console.log(JSON.stringify(Array.prototype.slice.call(arguments)));
    process.exit();
  });
  fnc.apply(null, args);
}
An example usage would be:
$ node my-script.js fs readdir /some/path
This will respond with something like [null, ['file1', 'file2']], according to the files in /some/path. Then you can create a lua module that invokes node with this script and passes the parameters according to the function you want to call.

Exiting a process after a database call in Node?

I'm experimenting with calling a database from Node, and am using the following client.execute() sample code:
socket.on('send', function(data) {
  client.execute('SELECT * FROM db.main', [], function(err, result) {
    if (err) {
      // do something
    } else {
      for (var i = 0; i < result.rows.length; i++) {
        console.log('id=' + result.rows[i].get('topic_id'));
      }
      process.exit(0);
    }
  });
});
As seen above, I'm running this code inside a socket.io listener method. However, the server stops whenever it is executed. On the other hand, when I remove 'process.exit(0)', things seem to run just fine.
So is that line necessary?
The line process.exit(0); will exit your program; I guess it was put there for debugging purposes or something similar.
You generally should never need to manually call process.exit(0). If there is nothing left to do, the process will exit naturally.
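If a clean shutdown path is wanted anyway, a hedged sketch (this assumes the DataStax cassandra-driver, whose client.shutdown() closes the connection pool, and a socket.io server held in a variable named io; both are guesses about the asker's setup):
// close resources instead of killing the process; once nothing is
// left on the event loop, Node exits on its own
function shutdownGracefully() {
  io.close();                    // stop accepting socket.io connections
  client.shutdown(function () {  // close the driver's connection pool
    console.log('Database connections closed.');
  });
}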

node, grunt custom task using stream

Hi, can anyone help me with a custom grunt task?
I'm basically using child_process to call:
var exec = require('child_process').exec;
var cp = exec(cmd, {}, function (err, stdout, stderr) {});
This of course has the stdout etc. If I
cp.stdout.pipe(process.stdout);
it writes a bunch of stuff to the console. I am making this call in a loop, so here's a skeleton:
this.files.forEach(function(f) {
  f.src.map(function(filepath) {
    var cmd = util.format("%s %s", options.xUnit, filepath);
    var cp = exec(cmd, {}, function (err, stdout, stderr) {
      // ...
      cb();
    });
    if (options.stdout || grunt.option('verbose')) {
      cp.stdout.pipe(process.stdout);
    }
  });
});
This all works fine. But since I'm doing it several times, I want at the very least to count how many times stderr has a value. I'm basically running collections of tests: if all pass, stderr is blank; if not, I don't want to stop the rest of the tests, I want them all to run, but I'd like to output at the end "you have 3 collections that have failing tests". I don't see why I can't seem to get it to work. If I declare
var output = 0;
then
if (options.stderr || grunt.option('verbose')) {
  cp.stdout.pipe(process.stdout);
  output++;
}
I then try to print to console after they all run but it's always 0.
I would really like to somehow get a copy of ALL the output and parse out the last line from each test session, which contains the number of failing tests. But that just seems way out of the realm of possibility. I've tried all manner of stuff with the streams and got less than nothing.
Anyway, if anyone can help I would greatly appreciate it
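A sketch of one way to do the counting, assuming the task is registered with grunt.registerMultiTask and that options.xUnit points at the test runner (both inferred from the snippets above, not confirmed). The key points are that the counter must be incremented inside the exec callback, and the summary printed only after every child has finished:
var util = require('util');
var exec = require('child_process').exec;

grunt.registerMultiTask('xunit', function () {
  var done = this.async();   // tell grunt the task is asynchronous
  var options = this.options();
  var pending = 0;
  var failures = 0;

  this.files.forEach(function (f) {
    f.src.forEach(function (filepath) {
      pending++;
      var cmd = util.format('%s %s', options.xUnit, filepath);
      var cp = exec(cmd, {}, function (err, stdout, stderr) {
        if (stderr && stderr.length) {
          failures++;          // this collection had failing tests
        }
        if (--pending === 0) { // all children finished: print the summary
          grunt.log.writeln('You have ' + failures + ' collections with failing tests.');
          done();
        }
      });
      if (options.stdout || grunt.option('verbose')) {
        cp.stdout.pipe(process.stdout);
      }
    });
  });
});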

node.js file system problems

I keep banging my head against the wall because of tons of different errors. This is the code I am trying to use:
fs.readFile("balance.txt", function (err, data) //At the beginning of the script (checked, it works)
{
if (err) throw err;
balance=JSON.parse(data);;
});
fs.readFile("pick.txt", function (err, data)
{
if (err) throw err;
pick=JSON.parse(data);;
});
/*....
.... balance and pick are modified
....*/
if (shutdown) {
  fs.writeFile("balance2.txt", JSON.stringify(balance));
  fs.writeFile("pick2.txt", JSON.stringify(pick));
  process.exit(0);
}
At the end of the script, the files have not been modified in the slightest. I then found out on this site that the files were being opened twice simultaneously, or something like that, so I tried this:
var balance, pick;
var stream = fs.createReadStream("balance.txt");
stream.on("readable", function() {
  balance = JSON.parse(stream.read());
});
var stream2 = fs.createReadStream("pick.txt");
stream2.on("readable", function() {
  pick = JSON.parse(stream2.read());
});
/****
****/
fs.unlink("pick.txt");
fs.unlink("balance.txt");
var stream = fs.createWriteStream("balance.txt", {flags: 'w'});
var stream2 = fs.createWriteStream("pick.txt", {flags: 'w'});
stream.write(JSON.stringify(balance));
stream2.write(JSON.stringify(pick));
process.exit(0);
But this time, both files are empty... I know I should catch errors, but I just don't see where the problem is. I don't mind storing the 2 objects in the same file, if that helps. Besides that, I had never written any JavaScript in my life before yesterday, so please give me a simple explanation if you know what failed here.
What I think you want to do is use readFileSync and not use readFile to read your files since you need them to be read before doing anything else in your program (http://nodejs.org/api/fs.html#fs_fs_readfilesync_filename_options).
This will make sure you have read both the files before you execute any of the rest of your code.
Make your code do something like this:
try {
  balance = JSON.parse(fs.readFileSync("balance.txt"));
  pick = JSON.parse(fs.readFileSync("pick.txt"));
} catch (err) {
  throw err;
}
I think you will get the functionality you are looking for by doing this.
Note, you will not be able to check for an error in the same way you can with readFile. Instead you will need to wrap each call in a try catch or use existsSync before each operation to make sure you aren't trying to read a file that doesn't exist.
How to capture no file for fs.readFileSync()?
Furthermore, you have the same problem on the writes. You are kicking off async writes and then immediately calling process.exit(0). A better way to do this would be to either write them sequentially asynchronously and then exit or to write them sequentially synchronously then exit.
Async option:
if (shutdown) {
  fs.writeFile("balance2.txt", JSON.stringify(balance), function(err) {
    fs.writeFile("pick2.txt", JSON.stringify(pick), function(err) {
      process.exit(0);
    });
  });
}
Sync option:
if (shutdown) {
  fs.writeFileSync("balance2.txt", JSON.stringify(balance));
  fs.writeFileSync("pick2.txt", JSON.stringify(pick));
  process.exit(0);
}
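Since the asker mentioned not minding a single file, a quick sketch of that variant (the file name state.json is illustrative):
var fs = require('fs');

// save both objects together in one file
fs.writeFileSync("state.json", JSON.stringify({ balance: balance, pick: pick }));

// load them back at startup
var state = JSON.parse(fs.readFileSync("state.json"));
var balance = state.balance;
var pick = state.pick;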
