I have a PHP script that deletes a file from a specific folder on my server:
// Delete the file only if it exists under the document root
if (file_exists($_SERVER['DOCUMENT_ROOT']."/folder/file1"))
{
    unlink($_SERVER['DOCUMENT_ROOT']."/folder/file1");
}
When I open the script's URL in my browser, it works fine.
I created a cron job to run this script every hour, but when the script runs from cron, the file is not deleted.
I also added a flag that sends me an email, and I suspect that under cron the file_exists test returns false, so the script never continues to the unlink action.
Any idea why the cron job won't delete the file?
Thanks
Anyone??
Solved it:
Instead of $_SERVER['DOCUMENT_ROOT']."/folder/file1", I had to use the absolute path:
/home/public_html/folder/file1
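For context: when cron invokes the script through the PHP CLI rather than over HTTP, $_SERVER['DOCUMENT_ROOT'] is typically empty, so the path resolves to just /folder/file1. A minimal sketch of a crontab entry (the php binary location, script name, and log path are assumptions):
# Runs hourly; the PHP CLI does not populate $_SERVER['DOCUMENT_ROOT'],
# so the script itself needs an absolute path like /home/public_html/folder/file1
0 * * * * /usr/bin/php /home/public_html/delete_file.php >> /tmp/delete_file.log 2>&1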
I'm working on a script that runs in a never-ending loop. Using cron, I start the script on reboot. However, I need to update this script from GitHub every 24 hours, so I'm running a shell script that basically does the following (a rough sketch follows the list):
Back up the crontab to a .txt file
Empty the crontab with crontab -r
Pull updates from GitHub
Load the crontab backup and start cron again.
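A rough sketch of such an update script, with all paths and file names as assumptions:
#!/bin/bash
# Hypothetical paths; adjust to your setup
crontab -l > ~/cron_backup.txt      # 1. Back up the current crontab to a .txt file
crontab -r                          # 2. Empty the crontab
cd ~/project && git pull            # 3. Pull updates from GitHub
crontab ~/cron_backup.txt           # 4. Load the backup and start cron again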
The shell script empties cron, updates the code, then starts cron again with the same file name and cron runs the program again. I'm testing this by outputting a message to a text file every time the script completes one loop. When I change the message output in GitHub, cron pulls the update and I can see the updated message. The problem is, it continues to show the old message as well. For example:
Original Message "Test": Test Test Test Test Test Test
Updated Message "Update": Test Update Test Update Test Update
It continues to output old messages even though I cleared cron, updated the code, then started it again. It appears to me that simply emptying cron does not stop the previous loop from continuing to run.
I looked into using "killall" to stop all sh scripts from running, but in an attempt to clear out the many looping scripts I had created, I killed every running process with killall5 -9. Now when I enter ps to view running processes, none are listed.
I'm very stuck. Any and all help would be appreciated!
Solved: I used sudo pkill python to end all running Python scripts.
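For completeness, a more targeted variant is to stop just the looping script before restarting it, rather than every Python process; the script name and path below are hypothetical:
# Kill only the process whose command line matches the script name
pkill -f my_loop.py
# Restart it in the background, detached from the terminal
nohup python3 /home/user/project/my_loop.py &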
I have a PHP script on the "first" Linux server that publishes websites to the "second" Linux server. The websites are published just fine, except that file and folder permissions and ownership get changed, which makes the websites crash. I found the bash script on the "second" Linux server that changes them, but I don't understand what triggers it. There is nothing in the PHP code that would trigger it, and I don't know how to find the daemon or background process that is triggered when this happens. Can someone help me figure out how to find it?
A cron job could have been set up to trigger this bash script. You can view the crontab for the current user with the command crontab -l.
If it is a cron job, edit the crontab with crontab -e and delete or edit the line that runs the bash script. Alternatively, you can use crontab -r to remove all cron jobs for the user.
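If the current user's crontab is empty, the trigger may be in another user's crontab or in one of the system-wide cron locations. A quick sketch (the script name fix_perms.sh and the user name are assumptions):
# As root, check another user's crontab
crontab -l -u someuser
# Search the system-wide cron locations for the script name
grep -r "fix_perms.sh" /etc/crontab /etc/cron.d /etc/cron.hourly /etc/cron.daily /var/spool/cron 2>/dev/null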
I'm trying to add a cron job on my server, which is hosted here: http://partisscan.bugs3.com/.
The provider is serversfree, http://www.serversfree.com.
It is really good, but I can't get a cron job to work. I want my PHP file http://partisscan.bugs3.com/scan.php to be started every minute (maybe less often later, but every minute for a start). So I added a cron job in the cron job manager under the control panel, but it's not working.
My cron job is:
1 * * * * php -f /home/u798416153/scan.php
However, it's not working.
Any ideas?
There are many possible reasons why it's not working.
Regarding the file:
Execute permission on the file.
The owner of the file.
Regarding the crontab:
To which user's crontab does the line you posted belong?
Does it have to be executed by root or by some other user? If it is not root, you have to make sure that the user is not listed in /etc/cron.deny.
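For illustration only (the php binary location and log path are assumptions): an entry that runs every minute and logs its output could look like the line below. Note that 1 * * * * fires only at minute 1 of every hour; * * * * * is what runs every minute.
* * * * * /usr/bin/php -f /home/u798416153/scan.php >> /tmp/scan.log 2>&1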
I need to create a cron job that fetches a webpage (and retrieves some data), not a file on the server. I tried wget, and it works if I set the cron job manually in Unix, but not if I create the cron job in cPanel. Something like wget -O http://someurl.something.
Cron jobs do not run with your login environment. The path to wget may not be in cron's PATH. Specify the full path (e.g. /usr/bin/wget).
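A minimal sketch of such an entry, assuming wget lives at /usr/bin/wget and with an arbitrary schedule; note that -O expects an output file (use /dev/null to discard the page, or -O - for stdout):
*/5 * * * * /usr/bin/wget -q -O /dev/null http://someurl.something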
I'm struggling to debug a cron job which isn't working correctly. The cron job calls a shell script which should unrar a rar file. This works correctly when I run the script manually, but for some reason it's not working via cron. I am using the absolute file path and have verified that the path is correct. Has anyone got any ideas why this could be happening?
Well, you already said that you have used absolute paths, so the number one problem is dealt with.
Next to check are permissions. Which user is the cron job run as? Does it have all the permissions necessary?
Then, a little trick: when a shell script fails and it isn't run in a terminal, I like to redirect its output to a file. Right at the start of the script, add:
exec &>/tmp/my.log
This will redirect STDOUT and STDERR to /tmp/my.log. Then it might also be a good idea to add the line:
set -x
This will make bash print which command it's about to execute, and at what nesting level.
Happy debugging!
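Putting those two lines together, the top of the failing script might look like this (the log path is just an example):
#!/bin/bash
exec &>/tmp/my.log    # redirect STDOUT and STDERR to a log file
set -x                # print each command before it is executed
# ... the rest of the script, e.g. the unrar command ...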
The first thing to check when a cron job fails is whether the full environment is available to the script you are trying to execute. A job executed via cron runs as a detached process, meaning it is not associated with a login environment. So whenever you debug a cron job that works when you execute it manually, make sure the cron job sees the same environment you have when you run it by hand, including PATH and any other environment variables the script depends on.
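One way to see exactly what cron provides is to dump its environment from a temporary crontab entry, or to set PATH explicitly at the top of the script; a sketch (paths are examples):
# Temporary crontab entry that captures cron's environment once a minute
* * * * * env > /tmp/cron_env.txt 2>&1
# Or, inside the script itself, set the PATH it needs explicitly
PATH=/usr/local/bin:/usr/bin:/bin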
For me, the problem was a different shell interpreter in crontab.
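Cron runs commands with /bin/sh by default; if the script relies on bash-specific features, you can set the interpreter at the top of the crontab:
SHELL=/bin/bash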