I have a function in a PHP web app that needs to be called periodically by a cron job. Originally I just did a simple wget to the URL to call the function and everything worked fine, but ever since we added user auth I have been having trouble getting it to work.
If I manually execute these commands I can log in, get the cookie, and then access the correct URL:
site=http://some.site/login/in/here
cookie=`wget --post-data 'username=testuser&password=testpassword' $site -q -S -O /dev/null 2>&1 | awk '/Set-Cookie/{print $2}' | awk 'NR==2{print}'`
wget -O /dev/null --header="Cookie: $cookie" http://some.site/call/this/function
But when these commands are executed as a script, either manually or through cron, it doesn't work.
I am new to shell scripting, so any help would be appreciated.
This is being run on Ubuntu Server 10.04.
OK, simple things first:
I assume the file begins with #!/bin/bash or something
You have chmodded the file +x
You're using Unix (0x0a) line endings, not Windows (0x0d 0x0a)
And you're not expecting to return any of the variables to the calling shell, I presume?
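A quick way to check those things, assuming the script is called script.sh (the name is just a placeholder):
head -1 script.sh     # first line should be the #!/bin/bash shebang
ls -l script.sh       # the x bits show whether it is executable
chmod +x script.sh    # make it executable if it isn't
file script.sh        # "CRLF line terminators" here would mean Windows line endings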
Failing this, try teeing the output of each command to a log file.
In theory, the only difference between manually executing these and using a script would be the timing.
Try inserting a sleep 5 or so before the last command. Maybe the HTTP server does some internal communication and that takes a while. Hard to say, because you didn't post the error you get.
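Putting those suggestions together, a minimal version of your script with a shebang, logging, and a pause might look like this (the log path and the 5-second sleep are placeholders; the wget pipeline is taken straight from your question):
#!/bin/bash
# Send everything this script prints to a log file so cron runs can be inspected later
exec >> /tmp/cron-login.log 2>&1

site=http://some.site/login/in/here
# Log in and pull the session cookie out of the Set-Cookie response headers
cookie=`wget --post-data 'username=testuser&password=testpassword' $site -q -S -O /dev/null 2>&1 | awk '/Set-Cookie/{print $2}' | awk 'NR==2{print}'`
echo "got cookie: $cookie"

# Give the server a moment before calling the function
sleep 5
wget -O /dev/null --header="Cookie: $cookie" http://some.site/call/this/function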
Related
On my Raspberry Pi, I have entered the following entry in my crontab using crontab -e (without sudo):
* * * * * echo "Hello World" &>> /home/pi/test.txt
Just as a test, that is.
After restarting cron (sudo /etc/init.d/cron restart) the test.txt file is created in /home/pi/, but the content remains empty.
Why?
If I run this without the asterisks in my SSH terminal on the Raspberry Pi, it works fine.
I got to this point because my goal is to run a Python script and log any errors, because sometimes it stops running and I don't know why. Hence my need for logging.
Thank you for your help!
P.S. I have the same problem on my Orange Pi running Raspbian (Debian Buster), but cron doesn't work there for some reason, so I am using rc.local to run my scripts on boot. The same problem arises: the log file is created but no content is added.
It appears that Raspbian uses dash as the default shell for cron. dash is a POSIX sh implementation and is not bash compatible, which means you cannot use &>>. You must use >> output.log 2>&1 instead.
You're saying that your goal is to run a Python script. There's a catch with that too: Python buffers its output, so if there is only a little of it, it won't appear in your log file by default. You can use python -u for unbuffered output.
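Putting both fixes together, crontab entries along these lines should work (the script path, Python binary, and log file names are placeholders):
* * * * * echo "Hello World" >> /home/pi/test.txt 2>&1
* * * * * /usr/bin/python3 -u /home/pi/myscript.py >> /home/pi/myscript.log 2>&1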
I hope that my answer makes sense.
I've tried every variation I can think of but can't seem to get my script to execute. I can hit it manually and it runs fine, but I'd prefer to have an outside server just hit it once per day. Is the trouble I'm having due to the presence of a query string?
My script URL looks like:
http://domain.com/index.php?g=main&reset=true
I've tried curl, wget, lynx, GET... all with no result whatsoever. What am I missing here?
Aha... knew it had to be something simple. I was missing quotes around the URL! (Without quotes, the shell treats the & in the query string as "run in background" and truncates the URL.) The final working line in cPanel is:
wget -O - "http://domain.com/index.php?g=main&reset=true" >/dev/null 2>&1
You can simply do it like this in your cron job:
/usr/bin/curl --user-agent cPanel-Cron "http://example.com/index.php?g=main&reset=true"
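For reference, a full crontab entry for a once-a-day hit might look something like this (the schedule and domain are only examples):
0 3 * * * /usr/bin/curl --user-agent cPanel-Cron "http://example.com/index.php?g=main&reset=true" >/dev/null 2>&1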
Hi, I'm trying to run a script that calls xclip in order to have a string ready to paste when I connect to the internet.
I have a script /etc/network/if-up.d/script that does execute when connecting (I have it write a date to a file successfully), but the xclip instruction seems not to work: there is nothing to paste. If I call this script manually by typing /etc/network/if-up.d/script in a console, it works perfectly.
If I try to launch a zenity message, it also doesn't appear when connecting. Again, if I do it by hand, it appears.
I also have an expect script that calls MATLAB (console mode); if I execute it manually it works, but if I call it from cron it freezes when calling the script.
It's driving me crazy, since it seems that only certain commands in a script can be executed when the system calls them automatically.
I've tried calling the instructions with nohup and &, but it still fails.
This is working as designed: cron jobs and if-up.d scripts run outside your desktop session, so X programs like xclip and zenity don't know which display to talk to. If you search around you will see complicated ways to resolve this issue, or you can use xmessage as I describe here: Using Zenity in a root incron job to display message to currently logged in user
Easy option 1: xmessage (in the script)
# Write the message to a temp file, show it on display :0.0, then clean up
MSSG="/tmp/mssg-file-${RANDOM}"
echo -e " MESSAGE \n ==========\n Done with task, YEY. " > ${MSSG}
xmessage -center -file ${MSSG} -display :0.0
[[ -s ${MSSG} ]] && rm -f ${MSSG}
Easy option 2: set the DISPLAY (then it should work)
export DISPLAY=:0 && /usr/bin/somedirectory/somecommand
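As a rough sketch for the xclip case, assuming the desktop session belongs to a user named pi on display :0 (both are assumptions), the if-up.d script could set up the X environment like this:
#!/bin/bash
# These values are assumptions: point the script at the logged-in user's X session
export DISPLAY=:0
export XAUTHORITY=/home/pi/.Xauthority
# Now X clients such as xclip and zenity can find the display
echo "string to paste" | xclip -selection clipboard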
The question is answered here for cron:
http://ubuntuforums.org/archive/index.php/t-105250.html
and here for if-up networking:
Bash script not working properly when run automatically
I tried a lot of methods but ended up with nothing in hand. My simple goal is to reset a variable to zero at the end of the day.
I checked the location of php with "which php" and "whereis php",
which resulted in /usr/bin/php.
Here are some of the things I tried:
/usr/local/bin/php -f /home/USERNAME/public_html/developer3/crons/filename.php
/usr/bin/php -f /home/USERNAME/public_html/developer3/crons/filename.php
php -q /home/USERNAME/public_html/developer3/crons/filaname.php
/usr/bin/wget -O http://subdomain.sitename.com/crons/filename.php
For quick results, I kept the timing at every minute to execute the code. I could successfully execute the code as
http://subdomain.sitename.com/crons/filename.php
Please guide me.
If the path to php on your system is /usr/bin/php, then the command you run with cron should be
/usr/bin/php /full/path/to/php/script.php
Is the script you're running designed to be invoked from the PHP CLI in that way? If it isn't, and it works correctly when you use a browser, then you can use curl, if it's installed on your server:
curl -A cron-job http://subdomain.sitename.com/crons/filename.php
It would likely be helpful in any event to configure the MAILTO variable (since you're using cPanel, it's just a text box on the appropriate page that you can fill in) so that you get an email with the output from your cron jobs. Having the output emailed to you will help you diagnose what's causing your script to not have the desired effect when it runs.
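As a sketch, a crontab with MAILTO configured might look like this (the address and schedule are placeholders; the script path is the one from your question):
MAILTO=you@example.com
0 0 * * * /usr/bin/php /home/USERNAME/public_html/developer3/crons/filename.php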
I am trying to run the following command:
postfix status > tmp
However, the resulting file never has any content written to it; the output is still sent to the terminal.
I have tried adding the following into the mix, and even piping to echo before redirecting the output, but nothing seems to have any effect:
postfix status 2>&1 > tmp
Other commands work no problem.
It looks like postfix writes to the terminal instead of to stdout. I don't understand piping to 'echo'; did you mean piping to 'cat'?
I think you can always use the 'script' command, which logs everything that you see on the terminal. You would run 'script', then your command, then exit. Or, non-interactively:
script -c 'postfix status' -q tmp
Thanks to another SO user, who has since deleted their answer (so now I can't thank them), I was put on the right track. I found the answer here:
http://irbs.net/internet/postfix/0211/2756.html
So for those who want to be able to capture the output of postfix, I used the following method.
Create a script which causes the output to go to where you wish. I did that like this:
#!/bin/sh
# Feed an expect script on stdin; spawn gives postfix a pseudo-terminal,
# so the output it writes to the terminal is captured and echoed to stdout
cat <<EOF | expect 2>&1
set timeout -1
spawn postfix status
expect eof
EOF
Then I ran the script (say script.sh) and could pipe/redirect from there, i.e. script.sh > file.txt.
I needed this for PHP so I could use exec and actually get a response.