Run a one-time Laravel scheduler command in the background and re-run it if it fails, on Ubuntu/Linux

Currently I am using nohup php artisan schedule:run >> /dev/null 2>&1 &, but after 3 or 4 days the process gets killed. I want a permanent solution. I tried creating a Supervisor program, but it runs the command again and again, while I only want it to run once in the background. If I set autostart=false it does not run in the background at all. Can someone help? I would be grateful. I do not have much knowledge of Ubuntu servers.

When using the scheduler, you only need to add the following Cron entry to your server. If you do not know how to add Cron entries to your server, consider using a service such as Laravel Forge which can manage the Cron entries for you:
* * * * * cd /path-to-your-project && php artisan schedule:run >> /dev/null 2>&1
This Cron will call the Laravel command scheduler every minute. When the schedule:run command is executed, Laravel will evaluate your scheduled tasks and run the tasks that are due.
https://laravel.com/docs/7.x/scheduling
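If you have shell access, a minimal way to install and verify that entry on Ubuntu (assuming your own user's crontab) is:

crontab -e    # opens your crontab in an editor; paste the entry above and save
crontab -l    # lists installed entries so you can confirm it was saved
grep CRON /var/log/syslog    # on Ubuntu, shows recent cron activity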

Related

Run a Python Telegram bot after reboot and restart it if it crashes, on AWS EC2

I created a Telegram bot using Pyrogram and it crashes after a few hours. Sometimes I stop the EC2 instance myself to reduce cost. I created these cron jobs inside /etc/crontab, but it seems they are not working as expected.
Cron job 1 is supposed to run the Python file after the EC2 instance reboots.
Cron job 2 is supposed to restart the bot if it has crashed.
Here is my crontab content.
@reboot sudo pgrep -f bot.py || sudo nohup /usr/bin/python3 /home/ubuntu/bot.py & > /home/ubuntu/startOnReboot.log
*/2 * * * * sudo pgrep -f bot.py || sudo nohup /usr/bin/python3 /home/ubuntu/bot.py & > /home/ubuntu/restartBotAfterCrash.log
I would like to know whether my cronjob is not correct or any solution better than this approach.
You shouldn't use sudo in the cronjob, use sudo crontab -e instead to have it run as root.
Furthermore, & > is different from &> - did you mean to redirect all output to the specified file, or to run the job in the background and redirect stdout? If it's the latter, you don't need to tell cron to run it as a background job, and the redirection should come before the ampersand (which you should drop anyway).
Last, you probably want to use a systemd service for this instead.
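A minimal sketch of such a unit, using the interpreter and script path from the question (the unit name, User=, and RestartSec= values are assumptions to adapt), saved as /etc/systemd/system/bot.service:

[Unit]
Description=Telegram bot
# wait until the network is up before starting the bot
After=network-online.target
Wants=network-online.target

[Service]
# run as the ubuntu user instead of root
User=ubuntu
ExecStart=/usr/bin/python3 /home/ubuntu/bot.py
# restart automatically if the bot exits with an error
Restart=on-failure
RestartSec=5

[Install]
WantedBy=multi-user.target

Then enable it so it starts on boot and start it immediately:

sudo systemctl daemon-reload
sudo systemctl enable --now bot.service

This replaces both cron jobs: the unit starts at boot, and systemd restarts the bot whenever it crashes.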

Cronjob stuck without exiting

I have 50+ cron jobs like the one given below running on my CentOS 7 server.
curl -s https://url.com/file.php
This runs every 10 minutes. When run manually from the shell it only takes 1-2 minutes, and it also runs fine as a cron job. The problem is that it does not exit after execution. When I check my processes using the ps command, it shows many cron jobs from previous dates (even 10 days before), which inflates the total number of processes on my server.
Line in crontab:
*/10 * * * * user curl -s https://url.com/file.php > /dev/null 2>&1
Is there any reason for this? If I remember correctly, this started happening after the latest patch update.
Please help.
Modify your command to store the logs in log files instead of dumping them to /dev/null.
The following curl options can be used to control the command's behaviour:
--max-time
--connect-timeout
--retry
--retry-max-time
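For example, a sketch of the crontab line using those options (the log path is illustrative; --max-time 300 kills any request still running after five minutes):

# /etc/crontab style, keeping the user field from the question
*/10 * * * * user curl -sS --connect-timeout 10 --max-time 300 --retry 2 https://url.com/file.php >> /var/log/file-php-cron.log 2>&1

With -sS, curl stays quiet on success but still writes errors to the log.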

Crontab on shared hosting, not being started

Alright, so I'm having an issue.
I've set this up correctly, but something is out of order.
I added this to my crontab
* * * * * /home/website/public_html/ && php artisan schedule:run >> /dev/null 2>&1
Now if I start this command from my terminal it works, but in the crontab log I get this:
/bin/sh: /home/website/public_html/ : is a directory
After the schedule (in your case five asterisks, to run every minute), crontab syntax expects to find the command to run.
I guess you want to change the working directory before running php, so you need to add cd to change directory:
cd /home/website/public_html/ && php artisan schedule:run
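Putting it together, the full crontab entry, keeping the schedule and redirection from your original line, would be:

* * * * * cd /home/website/public_html/ && php artisan schedule:run >> /dev/null 2>&1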
Anyway, there are plenty of examples and explanations of crontab on the internet.

Cron job creating files each time it runs

I have a cron job that runs the script every 30 minutes. The problem is that each time the cron runs, it creates a file in the root directory. It'll create files like this:
wp-cron.php?doing_wp_cron.1
wp-cron.php?doing_wp_cron.2
wp-cron.php?doing_wp_cron.3
This is my cron:
*/30 * * * * wget http://yourdomain.com/wp-cron.php?doing_wp_cron 2>&1 > /dev/null
How can I make it delete the file automatically after the cron job finishes, or stop it from creating the file at all?
wget is primarily for downloading files; it might be better to use curl:
curl http://yourdomain.com/wp-cron.php?doing_wp_cron
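So the full crontab entry would be (the redirection order is also fixed here: > /dev/null 2>&1 discards both stdout and stderr, whereas the original 2>&1 > /dev/null still lets errors through; quoting the URL keeps the shell from treating ? as a glob):

*/30 * * * * curl -s 'http://yourdomain.com/wp-cron.php?doing_wp_cron' > /dev/null 2>&1

If you would rather keep wget, wget -q -O /dev/null 'http://yourdomain.com/wp-cron.php?doing_wp_cron' writes the response to /dev/null instead of creating a file.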

Cron job not running on GoDaddy shared host for Laravel 5.2

I am trying to run a cron job in my Laravel 5.2 application hosted on GoDaddy shared hosting.
I have cPanel access, and there I added a cron job, something like this:
* * * * * php /home/path/to/artisan schedule:run 1>> /dev/null 2>&1
But the issue is that the server is not calling the schedule method in Kernel.php. The same works fine on my local system.
Can anyone point out the mistake or suggest a way to get the server to run the cron job and execute the defined commands?
Add the path of the PHP binary to the cron command:
* * * * * path/php /home/path/to/artisan schedule:run 1>> /dev/null 2>&1
Example: /usr/bin/php
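If you are not sure where the PHP binary lives (and assuming you have shell access to the host), you can check:

which php    # prints the full path of the PHP CLI binary, e.g. /usr/local/bin/php
php -v       # confirms the version that binary runs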
The cPanel cron job command for Laravel is the same on GoDaddy and other hosts:
/usr/local/bin/php /home/ivlssadmin/public_html/inventoryproject/artisan OpeningStocks:openingstocks >> /dev/null 2>&1
Here is what you change, following the pattern below:
/usr/local/bin/php /home/[your host username]/public_html/[subdomain name]/artisan [CommandName]:[commandname] >> /dev/null 2>&1
If you don't have a subdomain, use only public_html.
Make sure you write the correct command in the schedule method, e.g. $schedule->command('send:followup').
Also check the timezone of the crontab; if possible, use the UTC timezone in your commands, as this is the default for most servers.
