casperjs not running from cron - cron

sh_run.sh file:
#!/bin/bash
PHANTOMJS_EXECUTABLE=/usr/local/bin/phantomjs /usr/local/bin/casperjs /home/test/html/run/site_check.js
Setting in crontab:
# cat /etc/crontab
45 0 * * * root sh /home/test/html/run/sh_run.sh
But when started by cron, casperjs does not actually work.
The ps status of the cron-started process is Rl.
What does Rl mean?
# ps ax|grep phantomjs
28155 ? Rl 0:18 /usr/local/bin/phantomjs /usr/local/casperjs/bin/bootstrap.js --casper-path=/usr/local/casperjs --cli /home/test/html/run/site_check.js
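For reference (my addition, not part of the original question): the STAT letters in this listing come from ps itself. R means running or runnable, S means interruptible sleep, and the lowercase l means the process is multi-threaded. They are documented in the ps manual:

# the meaning of R, S, l, etc. is documented in the ps manual
# (see the "PROCESS STATE CODES" section on Linux/procps)
man ps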
Of course, running it manually:
# casperjs site_check.js
works fine.
Update:
# sh sh_run.sh &
# ps ax|grep phantomjs
1625 pts/0 Sl 0:01 /usr/local/bin/phantomjs /usr/local/casperjs/bin/bootstrap.js --casper-path=/usr/local/casperjs --cli /home/test/html/run/site_check.js
and it runs correctly.
When the ps status is Sl, the output data changes (that is, the script is actually working).
But when the ps status is Rl, the data does not change. Run by cron, the status is always Rl.
The Rl status never changes.
What is the problem? Please help.
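A debugging approach that might help (my sketch, not from the original post) is to make the cron wrapper record the environment and all output, since cron runs with a much smaller environment than an interactive shell; the log path below is an assumption:

#!/bin/bash
# hypothetical debug wrapper for the cron run; log path is an assumption
LOG=/tmp/site_check_cron.log
{
  echo "=== $(date) ==="
  env | sort    # PATH, HOME, etc. exactly as cron sees them
  PHANTOMJS_EXECUTABLE=/usr/local/bin/phantomjs \
    /usr/local/bin/casperjs /home/test/html/run/site_check.js
  echo "exit status: $?"
} >> "$LOG" 2>&1

Comparing the logged environment and exit status with an interactive run often shows what differs under cron.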

Related

Why does the node child spawn process detach from the parent process and start running independently?

I have a node child spawn process which continuously writes to a file through a write stream on every "data" event received. The script is run under ssh and I am facing an edge case problem.
Consider multiple terminals open to the same ssh host, with the script running in one of them. Someone accidentally closes that ssh terminal during execution, but does not want the child process to stop. Checking with ps -ef | grep "command name", the child process is still running, but with a different parent process id (it shows 1), and the write stream in the child process stops writing to the file. It seems as if the child process has become a zombie process even though I detached it from the parent. You can find the script below:
var fs = require('fs');
var execSpawn = require('child_process').spawn;
var Promise = require('bluebird');

var spawnAction = function(path, cmd, cb){
    return function(resolve, reject, onCancel){
        var cmdExec = execSpawn(path, cmd, {detached: true});
        //cmdExec = execSpawn(path, cmd, {detached: true}).unref();
        var fileData = {};
        var count = 0;
        var stream = fs.createWriteStream('filepath');
        cmdExec.stdout.setEncoding('utf8');
        cmdExec.stdout.on('data', function(data){
            //Certain actions with fileData and count;
            stream.write(data);
        });
        cmdExec.stderr.on('data', function(data){
            //some actions
            stream.write("error");
        });
        cmdExec.on('close', function(){
            stream.end();
            if(cb){
                resolve(cb(fileData));
            }else{
                resolve(count);
            }
        });
    };
};
This script runs properly when it is allowed to run to completion without any interruption. When the terminal where the script was started is closed, the child process stops the write stream to the file. If I try detached together with unref(), it throws an error as if it could not find stdout on the child process:
Cannot read property 'stdout' of undefined
More information while the script is running, taken on the same host from a different terminal:
ps -ef | grep command_name
root 19904 19191 0 20:16 ? 00:00:00 cli command_name pfitzner7 /dev/sdb
root 19905 19191 0 20:16 ? 00:00:00 cli command_name pfitzner7 /dev/sdc
root 19906 19191 0 20:16 ? 00:00:00 cli command_name pfitzner7 /dev/sdd
root 19907 19191 0 20:16 ? 00:00:00 cli command_name pfitzner7 /dev/sde
root 23101 13105 0 20:16 pts/0 00:00:00 grep --color=auto command_name
After closing the terminal running the script before it finishes, I get this:
ps -ef | grep command_name
root 19904 1 0 20:16 ? 00:00:00 cli command_name pfitzner7 /dev/sdb
root 19905 1 0 20:16 ? 00:00:00 cli command_name pfitzner7 /dev/sdc
root 19906 1 0 20:16 ? 00:00:00 cli command_name pfitzner7 /dev/sdd
root 19907 1 0 20:16 ? 00:00:00 cli command_name pfitzner7 /dev/sde
root 23163 13105 0 20:16 pts/0 00:00:00 grep --color=auto command_name
I am trying to figure out why this issue happens and whether there is a different way to do this. Why did the parent process id change to 1 even though it is a detached child process? How can a child spawn process run independently from the parent process?
Please let me know your suggestions on this approach or the reason for the error.
Thanks in advance.
I think you don't need to detach your children. Children are spawned asynchronously, so you can work with several children's stdio at the same time.

pgrep in Linux only matches 15 bytes of the process name

On my Linux box, when I run ps -ef | grep Speed, I get the following:
myid 143410 49092 0 10:21 pts/12 00:00:00 ./OutSpeedyOrderConnection
myid 145492 49053 0 10:35 pts/11 00:00:00 ./SpeedyOrderConnection
That means the PIDs of these two processes are 143410 and 145492.
Then I run pgrep -l Speed and get the following:
143410 OutSpeedyOrderC
145492 SpeedyOrderConn
And when I run pgrep OutSpeedyOrderC, I get:
143410
But pgrep OutSpeedyOrderCo returns nothing.
It looks like pgrep only matches the first 15 bytes of the process name.
Is there anything I can do to get the right answer when I run
pgrep OutSpeedyOrderConnection?
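By default pgrep matches against the process name the kernel stores (the comm field), which is truncated to 15 characters. A common workaround, not mentioned in the original post, is the -f flag, which matches the pattern against the full command line instead:

# -f matches against the full command line, so the untruncated name works
pgrep -f OutSpeedyOrderConnection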

Setting up a cron job for a python script

I want to run the following command as a cron job:
python test.py -sau 0 -bg 200000 -t mcs3245 > g2g.log
I have set up a cron job like below:
5 0 * * * /local/mnt/workspace/username/scripts/python test.py -sau 0 -bg 200000 -t mcs3245 > g2g.log
I am getting the following error:
/bin/sh: /local/mnt/workspace/username/scripts/python: No such file or directory
Can anyone help with what is wrong and how to set this up?
Unless /local/mnt/workspace/username/scripts/ is the directory of your python installation, I would suggest something like this:
5 0 * * * /usr/bin/python /path/to/script/test.py -sau 0 -bg 200000 -t mcs3245 > g2g.log
If you want to execute the script as user USERNAME:
5 0 * * * USERNAME /usr/bin/python /path/to/script/test.py -sau 0 -bg 200000 -t mcs3245 > g2g.log
Found that last one here on superuser.com.
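As an extra sanity check (my addition, not part of the answer above), confirm where the interpreter actually lives and use an absolute path for the log file as well, since cron does not run from the script's directory; the paths below are assumptions:

# locate the interpreter cron should call (run this as the same user)
which python

# crontab entry with absolute paths for interpreter, script and log,
# and stderr redirected so errors are not silently dropped (paths assumed)
5 0 * * * /usr/bin/python /path/to/script/test.py -sau 0 -bg 200000 -t mcs3245 > /path/to/script/g2g.log 2>&1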

Cron generated by ISPConfig

I have a problem with a crontab generated by ISPConfig:
MAILTO=''
* * * * * web9 /usr/bin/wget -q -O /dev/null 'http://inz.isedo.pl/test/cron.php' >/dev/null 2>&1 #inz.isedo.pl
In the log, I have these errors:
Feb 16 21:11:01 s /usr/sbin/cron[21697]: (*system*ispc_web9) RELOAD (/etc/cron.d/ispc_web9)
Feb 16 21:11:01 s /USR/SBIN/CRON[23817]: (web9) CMD (/usr/bin/wget -q -O /dev/null 'http://inz.isedo.pl/test/cron.php' >/dev/null 2>&1^I#inz.isedo.pl)
Feb 16 21:11:01 s /USR/SBIN/CRON[23816]: (CRON) error (grandchild #23817 failed with exit status 1)
Does it work from the command line?
Can you redirect stdout and stderr to a file (as opposed to /dev/null) and post the output?
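For example (my sketch, with an assumed log path), dropping -q and redirecting to a file keeps wget's error messages for inspection:

# same job, but wget's messages and any shell errors are kept (log path assumed)
* * * * * web9 /usr/bin/wget -O /dev/null 'http://inz.isedo.pl/test/cron.php' > /tmp/ispc_web9_cron.log 2>&1 #inz.isedo.pl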

Why is shell_exec executing more than one process?

I don't understand why there is more than one process when I run run.php once from a browser.
In the PHP code, I have the following:
run.php
<?php
shell_exec("php theprocess.php > /dev/null 2>&1 &");
?>
theprocess.php
<?php
$z = 1;
while ($z <= 20) {
    echo $z . "\n";
    $z++;
    sleep(3);
}
?>
I execute run.php from the browser (e.g. http://localhost/run.php).
Then I typed: ps aux | grep php
username# [~]# ps aux | grep php
username 27272 0.0 1.5 89504 64468 ? R 17:33 0:00 php theprocess.php
username 27274 0.0 1.2 89504 49872 ? R 17:33 0:00 php theprocess.php
username 27276 0.0 0.6 89504 28676 ? R 17:33 0:00 php theprocess.php
username 27278 0.0 0.0 22280 3704 ? R 17:33 0:00 php theprocess.php
username 27280 0.0 0.0 1940 508 ? S+ 17:33 0:00 grep php
I don't understand why it is showing more than one theprocess.php process.
Also, why is it still running in the background? theprocess.php should terminate once it finishes its task. How can that be done?
I have fixed the problem!
When the script is launched from a webpage, it is not treated as PHP CLI.
Replace
shell_exec("/usr/bin/php theprocess.php > /dev/null 2>&1 &");
with
shell_exec("/usr/bin/php-cli theprocess.php > /dev/null 2>&1 &");
I no longer have multiple processes running in the background.
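To check which SAPI each binary actually is before swapping it in (my addition; the binary paths come from the answer above), the version banner names the SAPI in parentheses:

# prints something like "PHP x.y.z (cli) ..." or "(cgi-fcgi) ..." depending on the SAPI
/usr/bin/php -v
/usr/bin/php-cli -v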

Resources