Cron job from php (at command to be exact) - linux

I want to run a script just ONCE by scheduling a job with the "at" command.
I'm using this now:
<?php
include "config.php";
if (isset($_POST['add']))
{
    $sql = mysql_query("INSERT INTO {$table}(msg) VALUES('{$_POST['msg']}')");
    if ($sql)
    {
        $cmd = "wget /var/www/index.php?id=" . mysql_insert_id() . " | sudo at " . $_POST['runat'];
        exec($cmd);
        echo exec("atq");
        echo $cmd;
    }
    exit();
}
echo "<form action='{$_SERVER['PHP_SELF']}' method='POST'>";
echo "<input type='text' name='msg' />";
echo "<input type='text' name='runat' />";
echo "<input type='submit' name='add' />";
echo "</form>";
?>
However, this doesn't seem to be working. Am I doing this right? Or could you recommend something else?

You are using the at command the wrong way. You need to echo the command and pipe it to at. Try it like this:
$cmd = "echo wget /var/www/index.php?id=" . mysql_insert_id() . " | sudo at " . $_POST['runat'];
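For reference, the general pattern is to echo the full command into at's standard input. A minimal sketch of what that pipeline produces, with the id, URL, and time all illustrative (note also that wget expects an HTTP URL, not a filesystem path like /var/www/index.php):

```shell
id=42            # hypothetical value of mysql_insert_id()
runat="16:30"    # hypothetical time submitted via the form
job="wget -q -O /dev/null 'http://localhost/index.php?id=${id}'"
echo "$job"      # this line is what at will run later
# echo "$job" | at "$runat"   # uncomment on a host with atd running
```

From PHP, it would be safer to wrap the user-supplied time in escapeshellarg() before building the string passed to exec().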

Related

How to get a response of file printing job to user from printer using CUPS in linux

I am trying to print a file on a remote server through a CUPS command, and the user needs to know the status of the print job. How do I get a response? Here is my code:
#!/usr/bin/perl
my $response = system("lpr -P laserJet123 -o raw -T test_womargin abc.txt");
print $response;
$? holds the exit status of the system command, which you can inspect like this:
system("lpr -P laserJet123 -o raw -T test_womargin abc.txt");
if ($? == -1) {
print "failed to execute: $!\n";
}
elsif ($? & 127) {
printf "Job failed with signal %d, %s coredump\n",
($? & 127), ($? & 128) ? 'with' : 'without';
}
else {
printf "Job exited with value %d\n", $? >> 8;
}
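If you also need to track the job itself (not just whether submission succeeded), note that lp, unlike lpr, prints a request id on submission, which lpstat can then poll. A sketch, with the printer name and the lp output line simulated rather than taken from a real queue:

```shell
# Simulated output of "lp -d laserJet123 abc.txt"; a real call prints this form.
out="request id is laserJet123-42 (1 file(s))"
jobid=$(printf '%s\n' "$out" | awk '{print $4}')   # 4th field is the request id
echo "$jobid"
# lpstat -W not-completed -o laserJet123   # poll: job gone from queue = done
```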

Log output from bash script into log file and stdout while running from perl

I have a Perl script (install.pl):
use strict; use warnings;
use Term::ANSIColor;
my $logfilename = "install.log";
open LOG,">$logfilename" or die "failed to open log file reason:$!";
sub logMsg
{
my ($msg) = @_;
my ($error) = $_[1];
$msg .= "\n";
print LOG $msg;
if( $error ) { print color 'red' ;}
{print $msg}
if( $error ) { print color 'reset' ;}
}
sub lampInstall
{
logMsg("Installing apache2, php, and mysql db");
# Install apache2 and mysql:
my $cmdOutput = `./ubuntu-lamp-install.sh`;
print LOG "$cmdOutput\n";
}
lampInstall();
close LOG;
bash script (ubuntu-lamp-install.sh)
#!/bin/bash
sudo apt-get update
sudo apt-get install ntp
sudo /etc/init.d/ntp start
export DEBIAN_FRONTEND=noninteractive
echo "mysql-server mysql-server/root_password password test" | sudo debconf-set-selections
echo "mysql-server mysql-server/root_password_again password test" | sudo debconf-set-selections
sudo apt-get install -y apache2 php5 libapache2-mod-php5
sudo service apache2 restart
sudo a2enmod ssl
The issue is that when install.pl is invoked, it waits a long time and gives no information about what is being installed. I need to change the Perl script so it displays the output from the bash script (ideally as it is produced) as well as logging it to the log file. The script was written by someone else, and I have very little knowledge of Perl scripting.
You could do something like
open my $cmdOutput, "-|", "./ubuntu-lamp-install.sh"
    or die "failed to run installer: $!";
while (<$cmdOutput>) {
    print LOG $_;   # $_ already ends with a newline
    print $_;
}
in your lampInstall().
This is the new lampInstall():
sub lampInstall
{
    logMsg("Installing apache2, php, and mysql db");
    # Install apache2 and mysql:
    open my $cmdOutput, "-|", "./ubuntu-lamp-install.sh"
        or die "failed to run installer: $!";
    while (<$cmdOutput>) {
        print LOG $_;   # no extra "\n": $_ keeps its own
        print $_;
    }
    close $cmdOutput;
}
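If modifying the Perl is not an option, the same effect is available from the shell side: pipe the installer through tee, which writes to the terminal and a log file simultaneously. A sketch with a trivial stand-in for ubuntu-lamp-install.sh:

```shell
# Stand-in for "./ubuntu-lamp-install.sh 2>&1"; a real run streams apt output.
sh -c 'echo "step 1"; echo "step 2"' 2>&1 | tee install.log
```

Output appears on the terminal as it is produced (subject to the installer's own buffering), and install.log ends up with the same content.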

Zip file is not extracting by crontab

I am using a Perl script, run from crontab, to unzip a zip file. The script works properly if I execute it manually, but whenever I set it in cron it no longer works. I have tested my cron setup: other scripts run fine from cron; only this zip-extracting script fails.
The script is as follows:
#!/usr/bin/perl
use IO::Uncompress::Unzip qw(unzip $UnzipError);
$dir = '/root/perl';
open (han2, "ls -l $dir/*.zip |awk '{print \$9}'|");
@array1 = <han2>;
chomp(@array1);
for ($i=0;$i<=$#array1;$i++) {
$zipfile = $array1[$i];
$u = new IO::Uncompress::Unzip $zipfile
or die "Cannot open $zipfile: $UnzipError";
die "Zipfile has no members"
if ! defined $u->getHeaderInfo;
for ( $status = 1; $status > 0; $status = $u->nextStream) {
$name = $u->getHeaderInfo->{Name};
warn "Processing member $name\n" ;
if ($name =~ /\/$/) {
mkdir $name;
}
else {
unzip $zipfile => $name, Name => $name
or die "unzip failed: $UnzipError\n";
}
}
}
Crontab setting :
34 14 * * * /root/perl/./unzip.pl > /dev/null 2>&1
Please help me get this task working from cron.
When cron executes your script, the current directory probably won't be /root/perl. Try chdir($dir) after you set $dir, or use full pathnames where required:
$u = new IO::Uncompress::Unzip "$dir/$zipfile"
or die "Cannot open $zipfile: $UnzipError";
mkdir "$dir/$name";
unzip "$dir/$zipfile" => "$dir/$name" ...
Changing to the correct directory is probably easier.
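Another option is to fix it in the crontab entry itself, leaving the script untouched: cd into the directory before running the script. The cron line is shown as a comment; the runnable part below just demonstrates the cd-then-run pattern with a throwaway directory standing in for /root/perl:

```shell
# 34 14 * * * cd /root/perl && ./unzip.pl > /dev/null 2>&1
dir=$(mktemp -d)                 # illustrative stand-in for /root/perl
( cd "$dir" && pwd > witness )   # the command after && runs with $dir as its cwd
cat "$dir/witness"
```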

Is it possible to set exit code of 'expect'

The following bash script doesn't work because the expect command always returns 0, regardless of which exit code the remote script /tmp/my.sh returns.
Any idea how to make it work? Thanks.
#!/bin/bash
user=root
passwd=123456abcd
host=10.58.33.21
expect -c "
spawn ssh -o StrictHostKeyChecking=no -l $user $host bash -x /tmp/my.sh
expect {
\"assword:\" {send \"$passwd\r\"}
eof {exit $?}
}
"
case "$?" in
0) echo "Password successfully changed on $host by $user" ;;
1) echo "Failure, password unchanged" ;;
2) echo "Failure, new and old passwords are too similar" ;;
3) echo "Failure, password must be longer" ;;
*) echo "Password failed to change on $host" ;;
esac
Edited at 10:23 AM 11/27/2013
Thanks for the comments. Let me emphasize the problem once again:
The main script is supposed to run on Linux server A silently, during which it invokes another script, my.sh, on server B unattended. The question is how to get the exit code of my.sh.
That's why I cannot use the ssh-key approach in my case, which requires at least a one-time configuration.
#!/usr/bin/expect
set user root
set passwd 123456abcd
set host 10.58.33.21
set result_code 255
# exp_internal 1 to see internal processing
exp_internal 0
spawn ssh -o StrictHostKeyChecking=no -l $user $host bash -x /tmp/my.sh && echo aaa0bbb || echo aaa$?bbb
expect {
"assword:" {send "$passwd\r"; exp_continue}
-re "aaa(.*)bbb" {set result_code $expect_out(1,string)}
eof {}
timeout {set result_code -1}
}
switch $result_code {
0 { puts "Password successfully changed on $host by $user" }
1 { puts "Failure, password unchanged" }
2 { puts "Failure, new and old passwords are too similar" }
3 { puts "Failure, password must be longer" }
-1 { puts "Failure, timeout" }
default { puts "Password failed to change on $host" }
}
exit $result_code
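The sentinel trick works because ssh itself exits with the remote command's status, and the && / || pair turns that status into text expect can capture with the aaa(.*)bbb pattern. The mechanism can be checked locally with sh standing in for ssh (no remote host needed):

```shell
# "sh -c 'exit 3'" plays the role of "ssh host bash -x /tmp/my.sh" failing with 3
result=$(sh -c 'exit 3' && echo aaa0bbb || echo "aaa$?bbb")
echo "$result"    # → aaa3bbb
```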

Why isn't stdout set after executing a child process in a node.js script that's running as a daemon?

This works as expected if I run it from the command line (node index.js), but when I execute this Node.js (v0.10.4) script as a daemon from an init.d script, the stdout value in the exec callback is not set. How do I fix this?
node.js script:
var exec = require('child_process').exec;
setInterval(function()
{
exec('get_switch_state', function(err, stdout, stderr)
{
if(stdout == "on")
{
// Do something.
}
});
}, 5000);
init.d script:
#!/bin/bash
NODE=/development/nvm/v0.10.4/bin/node
SERVER_JS_FILE=/home/blahname/app/index.js
USER=root
OUT=/home/pi/nodejs.log
case "$1" in
start)
echo "starting node: $NODE $SERVER_JS_FILE"
sudo -u $USER $NODE $SERVER_JS_FILE > $OUT 2>$OUT &
;;
stop)
killall $NODE
;;
*)
echo "usage: $0 (start|stop)"
esac
exit 0
I ended up not using Node.js's exec child_process. I modified the init.d script above (/etc/init.d/node-app.sh) as follows:
#!/bin/bash
NODE=/home/pi/development/nvm/v0.10.4/bin/node
SERVER_JS_FILE=/home/pi/development/mysql_test/index.js
USER=pi
OUT=/home/pi/development/mysql_test/nodejs.log
case "$1" in
start)
echo "starting node: $NODE $SERVER_JS_FILE"
sudo -u $USER TZ='PST' $NODE $SERVER_JS_FILE > $OUT 2>$OUT &
;;
stop)
killall $NODE
;;
*)
echo "usage: $0 (start|stop)"
esac
exit 0
This script launches the Node.js app index.js at boot, and everything works as expected.
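For reference, two common reasons the original exec() approach behaves differently under init.d than in a terminal, worth ruling out before abandoning it: the daemon's PATH usually lacks the directory holding get_switch_state (use an absolute path in the exec call), and a command's stdout typically ends with a newline, so stdout == "on" never matches. The newline point, demonstrated in shell:

```shell
# Capture a command's output byte-for-byte (the trailing x protects the newline
# from the stripping that $() normally performs, which is what hides it).
raw=$(printf 'on\n'; printf x); raw=${raw%x}
[ "$raw" = "on" ] && echo "equal" || echo "not equal: output has a trailing newline"
```

In Node, comparing stdout.trim() == "on" would sidestep the newline half of the problem.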
