Shell script scheduling on Acquia server - Linux

I want to schedule a shell script that will run every hour. For that, I was trying to set it up via cron on the Acquia server.
The shell script file is at "docroot/scripts/script_name.sh". In the Command option I entered "/var/www/html/pllsrv2313.dev/scripts/script_name.sh", but it is not working.

I think that the shell script file should be in "scripts/script_name.sh".
If you are calling a PHP script from the shell file, you need to use the full path to the PHP file, even if it is in the same directory.
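For example, a minimal wrapper script along these lines (the PHP file name and the php binary path are only illustrative, following the docroot layout from the question) might look like:
#!/bin/sh
# cron runs with a minimal environment, so use absolute paths instead of relying on $PATH or the working directory
/usr/bin/php /var/www/html/pllsrv2313.dev/docroot/scripts/my_task.php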

First, start by logging into your box.
Then, as @ScottA mentioned, call your script, this time with the verbose flag, which is -x:
sh -x /var/www/html/pllsrv2313.dev/scripts/script_name.sh
This will dump out everything your script runs, or error out. See if anything is not what you expected.
In my case, inside my script I had a curl request that was not going through as planned:
curl --silent --compressed http://dev.example.com/code.sh
So I added the --verbose flag to my curl call to get that output dumped as well:
curl --silent --compressed --verbose http://dev.example.com/
and called:
sh -x /var/www/html/example.dev/docroot/ex_scripts/my_script.sh
I got back that the host could not be resolved, because my dev domain was only available inside my own DNS. So for me the problem is that when cron runs, my curl call fails due to my dev domain not being publicly accessible.
...
+ curl --silent --compressed --verbose http://dev.example.com/
* getaddrinfo(3) failed for dev.example.com:80
* Couldn't resolve host 'dev.example.com'
* Closing connection #0
...
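If you want the same visibility when the script runs from cron rather than from a terminal, it can help to keep the trace in a log file. A crontab entry along these lines would do it (the hourly schedule comes from the question; the log path is just an example):
# Run hourly and keep the -x trace for later inspection
0 * * * * sh -x /var/www/html/pllsrv2313.dev/scripts/script_name.sh >> /tmp/script_name.log 2>&1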

Related

How to use spaces within cron?

While trying to run a cron job (I don't have SSH access; I can only add cron jobs via my hosting provider's cPanel), I need to put a space inside the cron command itself:
wget -o https://abc.de/aaaaa/bbb ccc/ddd >/dev/null 2>&1
However, the cron job fails reporting:
wget: Unable to find directory https://abc.de/aaaaa/bbb
So how can I use a space there?
In this case, URL encoding should do the trick:
https://abc.de/aaaaa/bbb%20ccc/ddd
but
wget -o "https://abc.de/aaaaa/bbb ccc/ddd"
should work as well.
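One caveat if the URL-encoded form goes directly into a crontab: cron treats an unescaped % in the command as a newline, so it needs a backslash. A sketch (the schedule and the wget flags here are only illustrative):
# Escape % so cron passes it through to the shell literally
0 * * * * wget -q -O /dev/null "https://abc.de/aaaaa/bbb\%20ccc/ddd"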

How do I make an AcySMS manual cron job in cPanel work?

When setting up AcySMS, there are a few options for the cron job. "Web cron" runs at a fastest interval of 15 minutes, which is way too slow for me.
I have opted for "manual cron", and am given the following "cron URL": https://www.followmetrading.com/index.php?option=com_acysms&ctrl=cron
Putting that into the cPanel cron job manager just leaves me with an error every time the cron attempts to run:
/usr/local/cpanel/bin/jailshell: http://www.followmetrading.com/index.php?option=com_acysms: No such file or directory
I have discovered that the following command executes the script properly:
curl --request GET 'https://www.followmetrading.com/index.php?option=com_acysms&ctrl=cron&no_html=1' >/dev/null 2>&1
Note: ensure the URL is wrapped in single quotes, otherwise everything from the & onward seems to get lost.
Note: >/dev/null 2>&1 makes sure there is no email trail.
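Put together as a cPanel cron entry, with the schedule left as an assumption (every five minutes here), it would look like:
*/5 * * * * curl --request GET 'https://www.followmetrading.com/index.php?option=com_acysms&ctrl=cron&no_html=1' >/dev/null 2>&1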

How to use gnome-terminal to log in to a remote Linux server (SSH) and execute commands on the server

The code is the following:
gnome-terminal -x sh -c "ssh root@ip 'ls'"
The 'ls' executes fine on the server, but after the execution it logs out of the server, and I want to stay logged in. Is there any way to solve this problem?
Because you are supplying a command (the 'ls' part of your code), ssh will execute it on the remote server and then log out, just as you experienced.
If you leave out the command, ssh should stay logged into the server:
gnome-terminal -x sh -c "ssh root@ip"
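If the goal is to run a command and then stay logged in, one common approach (a sketch, not specific to your setup) is to force a pseudo-terminal and start an interactive shell after the command:
# -t allocates a terminal so the shell started after 'ls' is interactive
gnome-terminal -x sh -c "ssh -t root@ip 'ls; exec bash -l'"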

FeedWordPress cron job code

The FeedWordPress (RSS fetcher) plugin for WordPress is working well, but it doesn't have an option to update feeds every 5 minutes (the default is 60 minutes), so the only way is to click the UPDATE NOW button manually.
I am new to this, and some people told me to trigger it every 5 minutes using a cron job, so I tried that in cPanel.
First I tried this:
curl http://domain.com/?update_feedwordpress=true > dev/null
but I was getting this error:
/bin/sh: dev/null: No such file or directory
Second, I tried this:
wget http://domain.com/?update_feedwordpress=1
but now I'm getting this error:
/bin/sh: /usr/bin/wget: Permission denied
(I used my real domain in place of domain.com.)
Is there any correct, working code for this?
You can use this code:
/usr/local/bin/curl --silent -L "http://example.com/?update_feedwordpress=1" >/dev/null 2>&1
where you should replace example.com with your domain name.
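To hit it every 5 minutes, as the question asks, the matching crontab entry would be along these lines (adjust the path to curl to whatever exists on your host):
# Trigger the FeedWordPress update every 5 minutes
*/5 * * * * /usr/local/bin/curl --silent -L "http://example.com/?update_feedwordpress=1" >/dev/null 2>&1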
The null device is /dev/null (with a leading slash), not dev/null.
/usr/bin/wget does not seem to be executable by the current user.
It may be easier to write a script (say, mydebug.sh) containing something like the following and run that script with cron:
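#!/bin/sh
# Show which user cron runs as, and whether wget is executable for that user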
id
ls -l /usr/bin/wget
curl ...
Then check for any obvious errors due to permissions.
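To run it from cron and capture the output somewhere readable, an entry along these lines would work (the script and log paths are placeholders):
# /home/youruser/mydebug.sh and /tmp/mydebug.log are placeholder paths
* * * * * /bin/sh /home/youruser/mydebug.sh >> /tmp/mydebug.log 2>&1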

Adding a cron job in cPanel

I just need to run the following URL using cron jobs in my cPanel.
When I try to execute the link
http://www.insidewealth.com.au/index.php?option=com_acymailing&ctrl=cron
it runs in the browser, but when I try to add the same URL as-is in cron jobs I get the following error:
bash/sh/ file not found
and when I edited the cron job to
/usr/bin/php /home/staging/public_html/index.php?option=com_acymailing&ctrl=cron
I got a 404 error.
My cPanel username is staging.
Can anybody tell me what the syntax of a cron job in cPanel is?
The cron job runs every minute and the email report shows these errors.
Use wget with the full URL.
@yannick-blondeau As suggested, you can use wget or curl to make a simple request to your website.
Usually wget will attempt to save the downloaded file, but you can avoid that with the -O flag pointed at /dev/null, and -q keeps it quiet; also quote the URL so the shell does not treat the & as a background operator. An example will look like:
wget -O /dev/null "http://www.insidewealth.com.au/index.php?option=com_acymailing&ctrl=cron"
wget -q "http://www.insidewealth.com.au/index.php?option=com_acymailing&ctrl=cron"
You can also use curl for the same effect:
curl --silent "http://www.insidewealth.com.au/index.php?option=com_acymailing&ctrl=cron"
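In the cPanel cron entry this would look something like the following (the 15-minute schedule is only an example):
*/15 * * * * wget -q -O /dev/null "http://www.insidewealth.com.au/index.php?option=com_acymailing&ctrl=cron"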
