I created a cron job to periodically run an executable compiled from Python:
20 23 * * * .../TestInno/inno_main >>.../TestInno/Log/Inno_cronlog_`date +\%Y\%m\%d\%H\%M\%S`.log
The executable consists of 2 tasks:
update the database
send an email notification
When I run the executable from the command line, it works perfectly.
However, the cron job only performs task 1 (according to the log file).
Since there is a pause of a few minutes while connecting to the SMTP server to send the email, I suspect cron thinks the job is done and kills the task.
Is that right?
If so, how can I prevent it?
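One hedged first step for diagnosing this (an assumption, not a confirmed fix): the >> redirection in the crontab entry above only captures stdout, so any traceback raised by the email step would be lost. Merging stderr into the same log makes a failure in task 2 visible; the elided paths below are the same placeholders as above.
# Same job, but with stderr merged into the log so errors from the SMTP step are recorded
20 23 * * * .../TestInno/inno_main >>.../TestInno/Log/Inno_cronlog_`date +\%Y\%m\%d\%H\%M\%S`.log 2>&1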
I have a crontab scheduled as something like:
MAILTO=myemail@domain.com,anotheremail@domain.com
00 09 * * sun /opt/add.sh
Every Sunday the job runs and triggers an email.
When I run the add.sh script manually, at any time, an email is still triggered. Why?
My expectation is that running the script manually should not trigger an email.
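One thing worth noting (a fact about cron, not about this specific setup): MAILTO only controls where cron mails the output of jobs it runs itself, so a manual run of the script never goes through cron at all. If an email still arrives when you run it by hand, the script (or something it calls) is most likely sending it directly. A quick hedged check:
# See whether add.sh sends mail on its own; the search pattern is just an illustration.
grep -nE 'mail|mailx|sendmail|mutt' /opt/add.sh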
I have a Node.js program that runs flawlessly when I run it from the console.
But when I start it from crontab, it runs for about 20 minutes and then disconnects. (It reads sensors every 10 seconds.)
Of course I do not get any error messages on the console, as background programs have no console!
I realize they need to be sent to a log file.
But how?
Sincerely, Poul Christoffersen
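Since the question above asks how to get output from a cron-launched program into a log file, here is a minimal sketch; the node path, script path, log path, and schedule are all assumptions, not taken from the question.
# Append both stdout and stderr of the Node.js program to a log file
# (the paths and the @reboot schedule are placeholders).
@reboot /usr/bin/node /home/pi/sensor-reader.js >> /home/pi/sensor-reader.log 2>&1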
I have scheduled a cron job that is executed every minute.
This cron job generates a PDF file using a remote web service. This operation alone takes a few seconds (something like 3 seconds), which means the cron job can generate approximately 20 PDF files per minute.
If a visitor requests 60 documents, it will take the server 3 minutes to generate all the PDF files.
Executing parallel cron jobs for this task is not possible, as each file request must be handled individually for database relationship and integrity reasons. Basically, the files can only be handled one at a time.
Therefore, is there any logic I could apply in order to:
execute multiple instances of the same cron job to speed up the process and decrease the user's waiting time,
and have each file creation handled by one cron job only, so that a specific creation process is not picked up by another cron job doing the same task?
Thank you
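Since the questions above are left open, here is one common pattern, sketched under stated assumptions (the spool directory, the .job files, and the process_one_pdf command are all made-up placeholders, not anything from the question): run several identical cron jobs in parallel, but have each one claim a request atomically with flock before working on it, so a given file is never handled by two jobs at once.
# Hedged sketch: several copies of this job may run at the same time; flock
# ensures each pending request is processed by exactly one of them.
for req in /var/spool/pdf-requests/*.job; do
    [ -e "$req" ] || continue         # glob matched nothing
    (
        flock -n 9 || exit 0          # another worker already owns this request
        process_one_pdf "$req"        # placeholder for the real generation step
        rm -f "$req"
    ) 9>"$req.lock"
done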
I have an admin system where I can send emails containing my lead info.
What I'm trying to achieve is an automatic command where, 10 minutes after I send the first email, another email is sent. The page with the action that sends the second email is ready, but how do I "activate" that action without logging into the system and doing it manually?
I'm using IIS with Classic ASP.
You'll need to use a batch file to schedule this task.
Run a batch file from ASP: http://bytes.com/topic/asp-classic/answers/442265-asp-run-command-line
The batch command for scheduled tasks: Make server automatic run asp-script every day
Here is how to add 5 minutes (or 10 minutes) to the current time: Adding to %TIME% variable in windows cmd script
I'll let you read and combine all of that.
I have a script which updates a web application. The web application is spread across 2 servers. Here is a rundown of the script:
1. The shell script updates the git repository.
2. The shell script stops the application server.
3. The shell script stops the web server.
4. The shell script instructs the application server to checkout the latest git update.
5. The shell script instructs the web server to checkout the latest git update.
6. The shell script starts the application server.
7. The shell script starts the web server.
Each of the 7 steps is done one after the other, synchronously. The total run time is approximately 9 seconds. To reduce downtime, however, many of these steps could be done asynchronously.
For example, steps 4 and 5 could be done at the same time. I want to start steps 4 and 5 asynchronously (e.g. running in the background), but I cannot find out how to wait until they are both completed before going further.
You might want to use command grouping to keep the steps that must remain sequential together:
step1
( step2 && step4 && step6 ) &
( step3 && step5 && step7 ) &
wait && echo "all done"
Launch steps 4 and 5 in the background in your script (by ending them with &), then simply call the wait bash builtin before running step 6.
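A minimal sketch of that suggestion, with checkout_app_server, checkout_web_server, and start_app_server standing in for whatever commands steps 4, 5, and 6 actually run:
checkout_app_server &   # step 4 in the background
checkout_web_server &   # step 5 in the background
wait                    # block until both background jobs have finished
start_app_server        # step 6 only runs once 4 and 5 are done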
You are looking for the wait command.
wait: wait [id]
Wait for job completion and return exit status.
Waits for the process identified by ID, which may be a process ID or a
job specification, and reports its termination status. If ID is not
given, waits for all currently active child processes, and the return
status is zero. If ID is a job specification, waits for all processes
in the job's pipeline.
Exit Status:
Returns the status of ID; fails if ID is invalid or an invalid option is
given.
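For completeness, a small hedged illustration of both forms described in that help text (the sleep commands just stand in for real steps):
sleep 2 &               # start a background job
pid=$!
wait "$pid"             # wait for that specific process
echo "exit status: $?"
sleep 1 & sleep 2 &
wait                    # with no ID, waits for all active child processes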