Azure WebJob fails to run on its schedule; it ran once when I uploaded the .bat file but never ran again (CRON)

I created a WebJob as a triggered job to run on a schedule. When I uploaded the file it was accepted by the form, and I went ahead and clicked RUN because I figured you have to click RUN right after uploading so it knows it can start running. (I am not sure if I actually had to click RUN, or if I should have just uploaded it and left it alone so it would run on its own according to the CRON expression provided.)
Well, the job ran as soon as I clicked RUN, and it succeeded, which was good news. The issue is that it was supposed to run on its schedule every 4 hours, but it never did. It only ran once, at the time I clicked RUN.
The CRON expression I created for it is 0 50 23/4 * * *, which translates to:
At 50 minutes past the hour, every 4 hours, starting at 11:00 PM.
Basically I need the job to run every 4 hours, but most importantly at 11:50 PM, which is why I set that as the schedule. So it should run at 11:50 PM, 3:50 AM, 7:50 AM, 11:50 AM, 3:50 PM, and 7:50 PM every day.
I uploaded the job at about 10 PM and it ran at that time because I clicked RUN, but I was still expecting it to do its real scheduled run at 11:50 PM, and it never did. The logs show success for that initial run.
When I looked at the WebJobs area in Azure the next day, it showed the job completed 17 hours ago, and at the time of writing it has still only run that once.
What could be my error here? Is it something wrong with the CRON expression that I provided for the job? Before this one I made a job that ran every 2 minutes, and that one worked perfectly fine, but this one with a more complex CRON expression seems to give me trouble.

I was able to fix this issue by scrapping everything and starting over. I was using the incorrect CRON expression for the trigger times I needed. I also found out that I could just upload the file and not have to click the RUN button, since it will run on its own following the given expression.
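For reference, in the six-field NCRONTAB format Azure WebJobs uses, 23/4 in the hour field only matches hour 23 (stepping by 4 from 23 leaves no other valid hour), so that expression describes a single daily run at 11:50 PM rather than one every 4 hours. A sketch of an expression covering 3:50/7:50/11:50 AM and PM, listing the hours explicitly (an assumption, not necessarily the exact expression the final fix used), would be:
0 50 3,7,11,15,19,23 * * *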

Related

Linux: only proceed if the previous job was successful (wait command, PuTTY)

I'm an end user with access to PuTTY in order to run selected scripts on our server as they would run during our overnight batch process.
Some of these I run in sequence using
run_process.ksh kj_job1 & wait; run_process.ksh kj_job2
However, kj_job1 can fail and kj_job2 would still run. I'd like a way for kj_job2 to only proceed if kj_job1 completed successfully, but I can't find a guide online to walk me through what I need to do.
My knowledge in this area is limited: I simply navigate to the directory where we have the file run_process.ksh and then add the job name I want to run. I recently found out about & wait for running strings of jobs, and that with parentheses ( ) I can run things in parallel.
Any help is appreciated.
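For what it's worth, the usual shell idiom for this is &&, which only runs the second command if the first exits with status 0. A minimal sketch, assuming run_process.ksh returns a non-zero exit status when its job fails:
run_process.ksh kj_job1 && run_process.ksh kj_job2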

Autorun script for any activity on a Linux server

I have around 500 servers for which I want to create a script that automatically logs in, does some trivial activity to create some logs (basically I want to send logs through a SIEM tool to check whether log sending is working or not), and then automatically logs off from the server.
I am planning for the script to be auto-run on each server every 15 days.
The trivial activity can be anything (I just want to create logs).
Any help on how to achieve that?
EDIT
I am now thinking that stopping and starting a service on the server will accomplish what I need. Any help with that script? I am actually new to working on Linux servers, so any help is greatly appreciated.
This can be done with cron jobs; you can simply set up a cron job to do the task:
0 0 15 1-12 * /path/to/your/script
This cron job will run at 00:00 on day-of-month 15 in every month from January through December.
You could use cron jobs.
In the crontab file of every server you need to make an entry with the schedule on which you want to run the script (in your case every 15 days) and the location of your script file.
0 0 */15 * * /path/to/script.sh
Mind you, the job would run a bit differently: it would run on the 1st of every month, then on the 16th, and finally on the 31st if the month has 31 days.
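If the trivial activity ends up being the service restart mentioned in the edit, a minimal sketch of the script that crontab entry would point at (the service name is only a placeholder, and it assumes systemd and that syslog is what the SIEM forwards):
#!/bin/sh
# restart a harmless service and write a marker to syslog so the SIEM has a fresh event to pick up
systemctl restart rsyslog
logger "SIEM log-forwarding check: restarted rsyslog on $(hostname)"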

Run a section of a Puppet manifest once a day, but with an hourly poll

I have nodes checking into a puppet server every hour.
We have some tasks which we want to run on check-in, but only once a day.
Would it be possible to make a function inside a Puppet manifest that saves the last run time and only runs if the last run was over 24 hours ago?
Update:
I did try one thing which semi-works: move the chunk of Puppet code into a separate file, and have my main Puppet manifest ensure a cron job exists for it.
The complaint I got back from another department with this is that they can no longer see install errors on Puppetboard. (An image showed 2 nodes on the old Puppet branch and 1 on the new branch.)
With cron running puppet apply myFile.pp we no longer get feedback about failures on Puppetboard, as the main script simply ensures that the cron job exists.
You have at least two options.
Assuming your unspecified task is handled by an exec resource, you could design this in such a way that Puppet only ever regards the exec as out of sync once per day. That could be achieved by having your exec write the calendar day into a file. Then you could add an unless attribute:
unless => "test $(</var/tmp/last_run) == $(date +%d)"
Obviously your exec would need to also keep track of updating that file.
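Spelled out as the shell logic that the exec's command and unless check would carry out (the file path and task command are placeholders), the idea is roughly:
# only run the task if the stored day-of-month differs from today's
if [ "$(cat /var/tmp/last_run 2>/dev/null)" != "$(date +%d)" ]; then
    /usr/local/bin/daily_task && date +%d > /var/tmp/last_run
fi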
A second option would be to use the schedule metaparameter:
schedule { 'everyday':
  period => daily,
  range  => '1:00 - 1:59',
}
exec { 'do your thing':
  schedule => 'everyday',
}
That assumes that Puppet really will run only once per hour. The risk of course is that Puppet runs more than once in that hour, e.g. a sysadmin might manually run it.

Coded UI - data driven tests

I am automating a test scenario which runs on different test inputs. The inputs are passed from a CSV file or MTM. During the test run, the first iteration went through successfully, but the second iteration fails for the same flow that the first one completed successfully.
Could anyone say what the cause of this problem is and why it is happening? I thought it might be due to objects which are set to some value during the first run and not reset to null in the second run, so when the next run happens it fails on some controls, saying "Unable to find control" on some objects. But the tool recognized them successfully in the first run. If this is the problem, kindly help us with the solution ASAP. Thanks in advance!
regards
Amsaveni
You will get this error if your test is running but the controls or the app have not yet been loaded. Say round 1 is finished and round 2 has begun: if you are not starting the app from the same starting point, or not waiting for the control, then you will get this exception.
Verify that after each test has completed your app starts in the same state
Verify that you are waiting for your controls
Hand-code and debug your test

How can I make my PHP-based cron job execute more often?

I have a file "update.php" which does some MySQL operations. A cron job executes this file every 5 minutes. Unfortunately, I cannot execute the cron job more often.
So I had the idea that I could add ...
<html>
<head>
<meta http-equiv="refresh" content="2; URL=<?php echo $_SERVER['SCRIPT_NAME']; ?>" />
</head>
<body></body>
</html>
... to the page "update.php". Will cron execute the file in a way that the page will refresh automatically? Or will that not happen because there is no client with a browser?
If the meta refresh has no effect, is there any other possibility to achieve the refreshing of the page?
Thanks in advance!
I'm afraid that won't work, because it's a browser feature to refresh the page.
Question: Why can't you set the cron job to run more frequently than every 5 minutes?
If there is no other option, then you could create your own daemon to do the job more frequently.
e.g.
Your php script could:
Run
Wait 60 seconds
Run
(Wait; Run; three more times)
exit
For example (a variation of sshow's code):
<?php
$secs = 60;
ignore_user_abort(true);
set_time_limit(0);
dostuff();
sleep($secs);
dostuff();
sleep($secs);
dostuff();
sleep($secs);
dostuff();
sleep($secs);
dostuff();
?>
This version of the script will remain resident for four minutes and execute the code 5 times, which would be equivalent to running every minute if the script is run by cron every 5 minutes.
There seems some confusion about what a cronjob is, and how it is run.
cron is a daemon which sits in the background and runs tasks through the shell at schedules specified in crontabs.
Each user has a crontab, and there is a system crontab.
Each user's crontab can specify jobs which are run as that user.
For example:
# run five minutes after midnight, every day
5 0 * * * $HOME/bin/daily.job >> $HOME/tmp/out 2>&1
# run at 2:15pm on the first of every month -- output mailed to paul
15 14 1 * * $HOME/bin/monthly
# run at 10 pm on weekdays, annoy Joe
0 22 * * 1-5 mail -s "It's 10pm" joe%Joe,%%Where are your kids?%
23 0-23/2 * * * echo "run 23 minutes after midn, 2am, 4am ..., everyday"
5 4 * * sun echo "run at 5 after 4 every sunday"
So to run every five minutes:
*/5 * * * * echo "This will be run every five minutes"
Or to run every minute:
* * * * * echo "This will be run every minute"
The output from the commands is emailed to the owner of the crontab (or to whoever is specified by MAILTO).
This means if you run something every minute it will email you every minute, unless you ensure all normal output is suppressed or redirected.
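For example, a crontab entry along these lines (the PHP binary and script path are assumptions about your setup) runs the script every minute and discards normal output and errors, so cron has nothing to mail:
* * * * * /usr/bin/php /path/to/update.php > /dev/null 2>&1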
The commands are run as the user who owns the crontab, which contrasts with the scripts run by the web-server, which are run as the 'nobody' user (or similar - whatever the web-server is configured to run as).
This can make life more complicated if the cronjob is writing to files which are supposed to be accessed by the scripts run by the web-server. Basically you have to ensure that the permissions remain correct.
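One common way to keep those permissions workable (the group name and path are placeholders) is to give the shared files to a group that both the crontab owner and the web-server user belong to:
chgrp www-data /var/www/app/data/update.lock
chmod 664 /var/www/app/data/update.lock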
Now, I'm not sure that this is the system you are referring to. If you mean something else by cron job, then the above might not apply.
If you want to do something that your current host is not letting you do, then rather than hacking around the restriction, you might want to look at switching hosting provider.
An alternative is to put the script in your normal scripts location, and have some external scheduler run wget against it at whatever frequency you like.
Another alternative is on-demand updating, along the lines of vartec's suggestion. However, that may not solve your problems.
I'm pretty sure you can achieve it by doing this:
<?php
$secs = 120;
ignore_user_abort(true);
set_time_limit(0);
while (true)
{
    // do something
    // sleep for some time
    sleep($secs);
}
?>
Edit
You will have to execute it once after every server restart unless you do it like Douglas describes.
Update
Keep Douglas Leeder's answer in mind, and then take a look at this:
http://www.php.net/manual/en/function.ignore-user-abort.php
I'd say don't try to do this with PHP; change your crontab. If you need your application to run a cron job every minute and your hosting doesn't provide this option, you have most likely outgrown your hosting. Get yourself VPS hosting for $20 a month (Slicehost, Servergrove).
Update: Edited based on new information.
Meta refresh won't work because cronjob.de will be using an automated system that doesn't actually read the contents of the page. No browser, so nothing to see the meta refresh.
You have a couple options. They vary in greater or lesser horribleness.
The best option is to change webhosts. A good webhost will have full support for cron. But if you need to touch cron, honestly, you should probably be on a VPS host anyways. A lot of hosts will object to a cron task running every minute unless the task is just updating something really quickly and exits. But VPS hosts won't usually care. Slicehost offers VPS servers for as little as $20/month. Not recommended for people who've never had root access before.
The only option you've got that will work with cronjob.de's 5 minute limitation is to build a loop that will run an iteration, sleep, run another iteration, and repeat however many times you need before the end of the 5 minutes. However, there are two major problems with this approach. First, if you have a request that lasts 4 minutes, there's a distinct possibility that your webhost might kill the request before it finishes. Second, if the webserver isn't configured just right, such a request might block other requests, preventing legitimate users from accessing your site — they would queue up, and be waiting for the cronjob.de request to finish before their requests could be completed. And since that request will take 4-5 minutes to finish, before being repeated a minute later, they might only be able to access your site once every 5 minutes. I'm guessing this is undesirable. Unfortunately, the only way to know if you'll run afoul of either of these problems is to ask your webhost. I don't recommend trying it before asking, because they may not appreciate it if it goes unexpectedly bad and starts affecting their other customers on the server.
If you're lucky, they may even be willing to set up a cron job for you.
You can use PHP to call the script...
<?php
$script = '/path/to/my/php/script.php';
ignore_user_abort(1);
set_time_limit(0);
$php = exec('which php');
while (1) {
    if (file_exists($script)) { exec($php.' '.$script); }
    else { file_get_contents($script); }
    sleep(2);
}
?>
The file you give $script needs to exist, otherwise it thinks it's a URL.
It should do the trick.
What you are trying to do implies user interaction, so what happens if no client ever visits your "page"?
For example, Servlet and EJB containers can do what you need programmatically at container startup, so I suppose that for PHP the only way to accomplish "automatic" cron-like jobs is to make some changes in your Apache source code, and obviously only if you are running your own in-house server.
A more practicable option, which doesn't require code changes, could be a cron script that calls your page directly (wget, netcat, etc.), set up when your web server starts.
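As a sketch of that last approach (the URL is a placeholder), a crontab entry that fetches the page every minute and throws away the output:
* * * * * wget -q -O /dev/null http://example.com/update.php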
