Quartz scheduler - Cron Not Running

I am trying to run a scheduler that will update dates in a database table. The scheduler starts, but it never runs the job!
My scheduler class:
package Crons.Schedulers;

import org.quartz.CronTrigger;
import org.quartz.JobDetail;
import org.quartz.Scheduler;
import org.quartz.SchedulerFactory;
import org.quartz.impl.StdSchedulerFactory;

public class WeeklySchedule {
    public WeeklySchedule() throws Exception {
        SchedulerFactory sf = new StdSchedulerFactory();
        Scheduler sched = sf.getScheduler();
        sched.start();
        // Quartz 1.x API: job name, group, job class
        JobDetail jd = new JobDetail("WeeklyTask", "Weekly", WeeklyJob.class);
        CronTrigger ct = new CronTrigger("cronTrigger", "group2", "0 57 16 * * ?");
        sched.scheduleJob(jd, ct);
    }

    public static void main(String[] args) {
        try {
            new WeeklySchedule();
        } catch (Exception e) {
            // Don't swallow scheduling errors silently
            e.printStackTrace();
        }
    }
}
The output is:
INFO SimpleThreadPool(initialize:247) - Job execution threads will use class loader of thread: main
INFO QuartzScheduler(<init>:195) - Quartz Scheduler v.1.5.2 created.
INFO RAMJobStore(initialize:138) - RAMJobStore initialized.
INFO StdSchedulerFactory(instantiate:1014) - Quartz scheduler 'DefaultQuartzScheduler' initialized from default resource file in Quartz package: 'quartz.properties'
INFO StdSchedulerFactory(instantiate:1018) - Quartz scheduler version: 1.5.2
INFO QuartzScheduler(start:400) - Scheduler DefaultQuartzScheduler_$_NON_CLUSTERED started.
I have no idea what is going wrong, as I am using Quartz for the first time.
What could be the cause?
Thanks in advance.

This line:
CronTrigger ct = new CronTrigger("cronTrigger", "group2", "0 57 16 * * ?");
schedules the trigger to fire at 16:57 (4:57 PM) every day, not immediately. If you started the program before that time, the job simply hasn't fired yet. See the Quartz CronTrigger Tutorial.
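For reference, a Quartz cron expression has six or seven space-separated fields, starting with seconds (unlike Unix cron, which starts with minutes). This small, illustrative stdlib-only sketch labels the fields of the trigger above:

```java
public class CronFields {
    public static void main(String[] args) {
        // Quartz field order: sec min hour day-of-month month day-of-week [year]
        String[] labels = {"seconds", "minutes", "hours",
                           "day-of-month", "month", "day-of-week"};
        String expr = "0 57 16 * * ?";   // the trigger from the question
        String[] fields = expr.split(" ");
        for (int i = 0; i < fields.length; i++) {
            System.out.println(labels[i] + " = " + fields[i]);
        }
        // This expression fires daily at 16:57:00. While testing, an
        // expression such as "0 * * * * ?" (top of every minute) makes it
        // much easier to confirm the job actually runs.
    }
}
```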

Related

Maximo Event Filter Java class not being picked up by Publish Channel

I have written a Java class for event filtering on one of the Publish Channels, rebuilt and deployed it, and referenced it on the Publish Channel. However, Maximo behaves as if the class weren't there.
package com.sof.iface.eventfilter;

import java.rmi.RemoteException;

import psdi.iface.mic.MaximoEventFilter;
import psdi.iface.mic.PublishInfo;
import psdi.mbo.MboRemote;
import psdi.util.MXException;
import psdi.util.logging.MXLogger;
import psdi.util.logging.MXLoggerFactory;

public class VSPPWOCOMPEventFilter extends MaximoEventFilter {

    private static final String SILMX_ATTRIBUTE_STATUS = "STATUS";

    private MXLogger log = MXLoggerFactory.getLogger("maximo.application.EVENTFILTER");

    /**
     * Constructor
     *
     * @param pubInfo Publish Channel Information
     * @throws MXException Maximo Exception
     */
    public VSPPWOCOMPEventFilter(PublishInfo pubInfo) throws MXException {
        super(pubInfo);
    } // end constructor.

    /**
     * Decide whether to filter out the event before it triggers the
     * Publish Channel or not.
     */
    public boolean filterEvent(MboRemote mbo) throws MXException, RemoteException {
        log.debug("######## com.sof.iface.eventfilter.VSPPWOCOMPEventFilter::filterEvent() - Start of Method");
        boolean filter = false;
        String status = mbo.getString(SILMX_ATTRIBUTE_STATUS);
        log.debug("######## com.sof.iface.eventfilter.VSPPWOCOMPEventFilter::filterEvent() - WO Status " + status);
        // Compare string contents with equals(), not ==
        if (mbo.isModified(SILMX_ATTRIBUTE_STATUS) && "COMP".equals(status)) {
            log.debug("######## com.sof.iface.eventfilter.VSPPWOCOMPEventFilter::filterEvent() - Skipping MBO");
            filter = true;
        } else {
            filter = super.filterEvent(mbo);
        }
        log.debug("######## com.sof.iface.eventfilter.VSPPWOCOMPEventFilter::filterEvent() - End of Method");
        return filter;
    } // end filterEvent.
} // end class.
It looks like you need to skip the outbound message when the Work Order is completed. When the event doesn't seem to fire, check these flags first:
External System is active
Publish Channel is active
Publish Channel listener is enabled
I think you could easily achieve the same result with a SKIP action processing rule. See details here:
https://www.ibm.com/support/knowledgecenter/en/SSLKT6_7.6.0/com.ibm.mt.doc/gp_intfrmwk/c_message_proc_actions.html
Also worth mentioning: IBM added automation-script support for event filtering in version 7.6, so builds and redeploys are no longer required.
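One Java pitfall worth flagging when filtering on attribute values such as STATUS: the == operator compares object references, not string contents, so a status check written with == can silently never match a value returned at runtime (as mbo.getString(...) would return). A minimal illustration:

```java
public class StringCompare {
    public static void main(String[] args) {
        // A status value constructed at runtime, as an API call would return
        String status = new String("COMP");
        System.out.println(status == "COMP");      // false: compares references
        System.out.println("COMP".equals(status)); // true: compares contents
    }
}
```

Always use equals() (or a constant-first "COMP".equals(status), which is also null-safe) for value comparison.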

Pass parameters to cron job on hybris

public class UserRetrievePerformable extends AbstractJobPerformable<UserRetrieveJobModel>
{
    private static final Logger LOG = Logger.getLogger(UserRetrievePerformable.class);

    @Autowired
    TotalCustomerFacade totalCustomerFacade;

    @Override
    public PerformResult perform(UserRetrieveJobModel userRetrieveJobModel) {
        LOG.info("**********************************");
        LOG.info("Greeting from MyJobPerformable!!!");
        LOG.info("**********************************");
        return new PerformResult(CronJobResult.SUCCESS, CronJobStatus.FINISHED);
    }
}
I want this job to take a string parameter.
How can I do this? According to the Spring docs it can't be done, but there must be another way.
When I supply a string from the Backoffice (or elsewhere), the job should output that string.
I think you might confuse Job and CronJob. A Job is the task that should be done. A CronJob is a single execution of that task. So if you want to execute a Job, you have to create an instance of a CronJob. You define the task a CronJob should execute by selecting the right Job instance. When this CronJob is started, the perform method of the corresponding AbstractJobPerformable class is called with the CronJob as parameter.
So how do you create parameters for an execution? Create a subtype of the type CronJob and add all your needed parameters as attributes. When you then create the CronJob, set all attributes accordingly. In the perform method you can then access the attributes of that CronJob instance.
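The subtype is declared in your extension's *-items.xml. A rough sketch of what that definition could look like (the type name HelloWorldCronJob matches the model class used in the snippet below; the jaloclass package and modifiers are illustrative assumptions, not taken from the question):

```xml
<!-- Sketch: a CronJob subtype carrying one extra parameter.
     jaloclass/package names are placeholders for your own extension. -->
<itemtype code="HelloWorldCronJob" extends="CronJob"
          autocreate="true" generate="true"
          jaloclass="my.extension.jalo.HelloWorldCronJob">
    <attributes>
        <attribute qualifier="greetedPerson" type="java.lang.String">
            <persistence type="property"/>
        </attribute>
    </attributes>
</itemtype>
```

After a build, the platform generates HelloWorldCronJobModel with getGreetedPerson()/setGreetedPerson() accessors.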
You create the CronJob like this:
HelloWorldCronJobModel cronJob = modelService.create(HelloWorldCronJobModel.class);
JobModel myJob = cronJobService.getJob("myJob");
cronJob.setJob(myJob);
// Add own attribute to Job
cronJob.setGreetedPerson("John Doe");
modelService.save(cronJob);
cronJobService.performCronJob(cronJob);
When your perform method is called, access the attribute:
@Override
public PerformResult perform(MyCronJobModel myCronJob) {
    LOG.info("Hello " + myCronJob.getGreetedPerson());
    return new PerformResult(CronJobResult.SUCCESS, CronJobStatus.FINISHED);
}

Custom update listener to set subtask's fix-version

I'm developing a custom listener which will update a subtask's fix version to the same value as its parent issue.
Currently we use a post-function in the workflow to set the subtask's fix version from the parent on subtask creation. This doesn't cover the case where the subtask already exists and the parent's fix version is updated: the new value is not propagated to the subtask.
I'm using ScriptRunner and creating a 'Custom listener' for my specific project with the event 'Issue Updated'. I added the following script:
import com.atlassian.jira.component.ComponentAccessor
import com.atlassian.jira.config.SubTaskManager
import com.atlassian.jira.event.issue.AbstractIssueEventListener
import com.atlassian.jira.event.issue.IssueEvent
import com.atlassian.jira.event.type.EventDispatchOption
import com.atlassian.jira.issue.Issue
import com.atlassian.jira.issue.IssueManager
import com.atlassian.jira.issue.MutableIssue
import com.atlassian.jira.project.version.Version
import org.apache.log4j.Logger

class CopyFixVersionFromParentToChild extends AbstractIssueEventListener {
    Logger log = Logger.getLogger(CopyFixVersionFromParentToChild.class)
    SubTaskManager subTaskManager = ComponentAccessor.getComponent(SubTaskManager.class)
    IssueManager issueManager = ComponentAccessor.getComponent(IssueManager.class)

    @Override
    void issueUpdated(IssueEvent event) {
        log.warn("\nIssue updated!!!\n")
        try {
            Issue updatedIssue = event.getIssue()
            if (updatedIssue.issueTypeObject.name == "Parent issue type") {
                Collection<Version> fixVersions = updatedIssue.getFixVersions()
                Collection<Issue> subTasks = updatedIssue.getSubTaskObjects()
                if (subTaskManager.subTasksEnabled && !subTasks.empty) {
                    subTasks.each {
                        if (it instanceof MutableIssue) {
                            ((MutableIssue) it).setFixVersions(fixVersions)
                            issueManager.updateIssue(event.getUser(), it, EventDispatchOption.ISSUE_UPDATED, false)
                        }
                    }
                }
            }
        } catch (ex) {
            log.debug "Event: ${event.getEventTypeId()} fired for ${event.issue} and caught by script 'CopyVersionFromParentToChild'"
            log.debug(ex.getMessage())
        }
    }
}
The problem is that it doesn't work. I'm not sure whether it's a problem that my script logic is encapsulated inside a class. Do I have to register this in some specific way? Or am I using ScriptRunner completely wrong and pasting this script into the wrong section? I checked the code against the JIRA API and it looks like it should work; my IDE doesn't show any warnings or errors.
Also, could anyone give me hints on where to find the logging output from custom scripts like this? Whatever message I put into the logger, I can't find it anywhere in the JIRA logs (although I'm aware the script might not work yet).
Any response is much appreciated. Thanks.
Martin
Well, I figured it out.
The approach I posted, implementing the listener as a Groovy class, is used differently than I expected. Script files like that used to be placed at a specific path in the JIRA installation, and ScriptRunner would register them with JIRA as listeners.
To create a 'simple' listener script that reacts to the issue-updated event, I had to strip it down to this code:
import com.atlassian.jira.component.ComponentAccessor
import com.atlassian.jira.event.type.EventDispatchOption
import com.atlassian.jira.issue.Issue
import com.atlassian.jira.issue.IssueManager
import com.atlassian.jira.issue.MutableIssue
import com.atlassian.jira.project.version.Version
IssueManager issueManager = ComponentAccessor.getComponent(IssueManager.class)
Issue updatedIssue = event.getIssue()
Collection<Version> fixVersions = new ArrayList<Version>()
fixVersions = updatedIssue.getFixVersions()
Collection<Issue> subTasks = updatedIssue.getSubTaskObjects()
subTasks.each {
    if (it instanceof MutableIssue) {
        ((MutableIssue) it).setFixVersions(fixVersions)
        issueManager.updateIssue(event.getUser(), it, EventDispatchOption.ISSUE_UPDATED, false)
    }
}
You paste this into the ScriptRunner interface and it works :-). Hope this helps anyone who's learning ScriptRunner. Cheers.
Matthew

cloudbees, groovy, jobs, folders: How to determine the job result, if the job is within a cloudbees folder?

Problem: I'm using a script to determine if a certain amount of jobs are in SUCCESS state.
It worked fine as long as I was not using the CloudBees Folders plugin. I could easily get the list of projects and each project's result. But after I moved the jobs into a CloudBees folder, the jobs, and therefore the job results, are no longer available!
Q: Does anybody know how to get the job results with Groovy from jobs which are located in a CloudBees folder?
def job = Jenkins.instance.getItemByFullName('foldername/jobname');
Folder plugin provides the getItems() method which can be used to get all immediate items (jobs/folders) under a folder.
folder.getItems()
Check this link to traverse across all the folders in Jenkins.
Here is the code snippet:
import jenkins.*
import jenkins.model.*
import hudson.*
import hudson.model.*
import hudson.scm.*
import hudson.tasks.*
import com.cloudbees.hudson.plugins.folder.*

jen = Jenkins.instance
jen.getItems().each {
    if (it instanceof Folder) {
        processFolder(it)
    } else {
        processJob(it)
    }
}

void processJob(Item job) {
}

void processFolder(Item folder) {
    folder.getItems().each {
        if (it instanceof Folder) {
            processFolder(it)
        } else {
            processJob(it)
        }
    }
}

grails sessionFactory.currentSession.flushMode not working with threads?

In grails we have the following config:
DataSource.groovy:
hibernate {
flush.mode="commit"
}
which prints "COMMIT" when we log it in a transactional context:
println "session=${sessionFactory.currentSession.flushMode}"
but when we create a new thread, the same line prints "AUTO".
The new thread does seem to pick up the other Hibernate settings (database, username, factory), but the current session doesn't honour the flush.mode setting.
Can anyone advise?
Are you using the Quartz plugin?
Quartz changes the flush mode:
https://fisheye.codehaus.org/browse/~raw,r=41198/grails-plugins/grails-quartz/tags/LATEST_RELEASE/src/java/org/codehaus/groovy/grails/plugins/quartz/listeners/SessionBinderJobListener.java
public void jobToBeExecuted(JobExecutionContext context) {
    Session session = SessionFactoryUtils.getSession(sessionFactory, true);
    session.setFlushMode(FlushMode.AUTO);
    TransactionSynchronizationManager.bindResource(sessionFactory, new SessionHolder(session));
    if (LOG.isDebugEnabled()) LOG.debug("Hibernate Session is bounded to Job thread");
}
The workaround is to change the flush mode in the Job:
def sessionFactory
.
.
.
def session=SessionFactoryUtils.getSession(sessionFactory, false)
session?.setFlushMode(FlushMode.COMMIT)
