How to override/turn off scheduled tasks in a test of the Spring context [duplicate] - spring-test

This question already has an answer here:
Disable a Spring Scheduler Task via Property
I have a small Spring webapp. Besides my plain unit tests, I'm writing a test that just verifies the required bean wiring. I'm using the default applicationContext.xml file, not a "test" version. I do have some fake test resources standing in for things that are normally defined in my Tomcat JNDI context.
The test basically works, but one annoyance is that some scheduled tasks defined in the default context start up and emit error messages that don't affect the test result.
The scheduled tasks are defined in the context like this:
<task:scheduled-tasks>
<task:scheduled ref="ratesQueryProcessor" method="run" fixed-rate="30000"/> <!-- Every 30 seconds -->
</task:scheduled-tasks>
Is there something that I can do in the spring context resulting from the default applicationContext.xml and my "test resources" XML file, and perhaps a JavaConfig class, which would "override" these scheduled tasks to turn them off?
If it matters, here's a small excerpt from my unit test class:
@RunWith(SpringRunner.class)
@ContextConfiguration(value = {"file:src/main/webapp/WEB-INF/applicationContext.xml", "/testResources.xml"})
//@ContextHierarchy({
//    @ContextConfiguration("file:src/main/webapp/WEB-INF/applicationContext.xml"),
//    @ContextConfiguration(classes = SpringWiringTest.Config.class)
//})
@TestPropertySource(properties = { "env = tomcat", "doNotifications = false" })
public class SpringWiringTest {
The commented-out section is there because I'm attempting to define my test resources in a JavaConfig class, but at this point I'm unable to use BOTH an XML file and a JavaConfig class (I have another SO posting asking about this).

This is a duplicate of Disable a Spring Scheduler Task via Property.
The solution is to use bean definition profiles (a.k.a., environment profiles).
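A minimal sketch of that approach, assuming Spring 3.2+ and the namespace declarations already present in applicationContext.xml (the "test" profile name is illustrative): wrap the scheduler definition in a nested <beans> element that is skipped whenever the "test" profile is active.
<!-- At the end of applicationContext.xml; nested <beans> elements must come last -->
<beans profile="!test">
    <task:scheduled-tasks>
        <task:scheduled ref="ratesQueryProcessor" method="run" fixed-rate="30000"/>
    </task:scheduled-tasks>
</beans>
Annotating the test class with @ActiveProfiles("test") then keeps the scheduled tasks out of the test context, while production deployments, which never activate that profile, are unaffected.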

Related

Cucumber JVM: avoid dependency injection by Picocontainer for features not tagged for execution

Assuming that I have a Cucumber feature tagged @api:
@api
Feature: BankID authentication
Scenario Outline: Successful authentication of user using BankID
Given the initial flow start URL
When I enter that "<url>" into my browser
...
and steps for execution as below:
public class ApiSteps implements En {
public ApiSteps (ABCinjected abcInjected) {
Given("^the initial flow start URLĀ $", () -> {});
When("^I enter that \"([^\"]*)\" into my browser$", abcInjected::navigateToAuthenticationPage);
...
}
Even if I define this feature not to be executed, by specifying different Cucumber tags or explicitly specifying tags = {"not @api"}, Picocontainer still creates and injects an instance of the ABCinjected class even though the steps themselves are not executed, which is undesirable. Is it possible to control this behavior? I assume that if a feature is tagged as not to be executed and the related scenarios/steps are ignored, DI should not happen either.
I got this response from a Cucumber contributor on GitHub:
When using lambda stepdefs the class has to be instantiated to register the steps, and we would need to know the steps defined by the class to determine if we should instantiate it. This is a deadlock of requirements.
Another recommendation is to keep step definitions for different concerns (api, units, etc.) in separate packages and set different glue at runtime; see the runner sketch after the link below.
https://github.com/cucumber/cucumber-jvm/issues/1672
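As a rough illustration of that recommendation (the package names and runner class below are assumptions, not from the original project), a JUnit runner can point Cucumber's glue only at the packages whose step-definition classes you actually want instantiated:
import cucumber.api.CucumberOptions;
import cucumber.api.junit.Cucumber;
import org.junit.runner.RunWith;

// Runs only the non-API features and loads only the glue package for them,
// so glue classes such as ApiSteps (and their injected dependencies) are
// never instantiated. Imports match Cucumber-JVM 4.x; newer versions moved
// these classes to io.cucumber.junit.
@RunWith(Cucumber.class)
@CucumberOptions(
        features = "classpath:features",
        tags = {"not @api"},
        glue = "com.example.units"   // hypothetical package holding the non-API steps
)
public class NonApiTestRunner {
}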

How to perform teardown for releasing resources after each Scenario In serenity BDD with Cucumber

I am using Serenity with Cucumber (BDD) and need to perform a teardown step that must be executed after the completion of each Scenario. This teardown step should not be visible in the report, as it is a technical concern rather than behavior to be exposed through Cucumber, such as releasing a few expensive resources.
I used Cucumber's @After annotation, which works as expected, but the problem is that this step now also shows up in my report, which I don't want.
Could someone please suggest a solution that lets me perform a teardown step per scenario without it being added as a step in my Serenity report?
The current solution I have, which does not satisfy my need:
My step definition class has the following method:
@After
public void tearDown() {
    systemAction.deleteCostlyResource(id);
}
but the @After annotation makes it a candidate for a reported step.
If you are using dependency injection, you could have your DI framework tear down the resources at the end of each scenario.
For instance, if you are using Spring:
If the "costly resource" is a class that you yourself have created, mark it with:
@Component
@Scope("cucumber-glue")
If the "costly resource" is not a class you created, but provided by a framework or whatever, you can register it as a bean in your spring (test)configuration and mark it with a "destroy method".
For example, to register Selenium WebDriver using annotation based configuration and making sure to quit after each Scenario, mark it with:
@Bean(destroyMethod = "quit")
In this example, quit() is WebDriver's own method for shutting down the browser. In your situation, call the "costly resource's" quit method, or the equivalent thereof.
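Putting the two suggestions together, here is a minimal sketch of a Spring test configuration (class and bean names are assumptions; it relies on the cucumber-spring module, which registers the "cucumber-glue" scope). Spring disposes of glue-scoped beans after every scenario and calls the configured destroy method, so no @After hook, and therefore no extra reported step, is needed:
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Scope;

@Configuration
public class CucumberTestConfig {

    // One driver per scenario; quit() is invoked when the cucumber-glue
    // scope is torn down at the end of the scenario.
    @Bean(destroyMethod = "quit")
    @Scope("cucumber-glue")
    public WebDriver webDriver() {
        return new ChromeDriver();
    }
}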

Mule Java Component and Thread Safe

An excerpt from this http://www.mulesoft.org/documentation/display/current/Configuring+Java+Components is:
When you specify the class directly on the component or pooled-component element, the PrototypeObjectFactory is used by default, and a new instance is created for each invocation, or a new pooled component is created in the case of the PooledJavaComponent
And, I have configured a Java class as Mule Java component like below:
<component class="com.mycompany.SalesOrderProductsHandler" doc:name="Java" />. The class SalesOrderProductsHandler has implemented org.mule.api.lifecycle.Callable and has one state variable named targetProductsIndex.
My question follows:
Will a new instance of com.mycompany.SalesOrderProductsHandler get created every time a new request comes in?
The documentation is absolutely correct. With:
<component class="com.mycompany.SalesOrderProductsHandler" />
you will get a new instance of com.mycompany.SalesOrderProductsHandler for each invocation.
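To make the consequence concrete, here is a hedged sketch of what such a component might look like (the class body is an assumption built from the names in the question). Because the default PrototypeObjectFactory creates a fresh instance per invocation, the targetProductsIndex field is never shared between requests, so that state is not a thread-safety concern:
import org.mule.api.MuleEventContext;
import org.mule.api.lifecycle.Callable;

public class SalesOrderProductsHandler implements Callable {

    // Instance state: each invocation gets its own copy of this field,
    // because a new component instance is created per request.
    private int targetProductsIndex;

    @Override
    public Object onCall(MuleEventContext eventContext) throws Exception {
        targetProductsIndex = lookUpTargetIndex(eventContext); // hypothetical helper
        return eventContext.getMessage();
    }

    private int lookUpTargetIndex(MuleEventContext eventContext) {
        // placeholder for the real lookup logic
        return 0;
    }
}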

How do I know at runtime which implementing classes are set for my Spring beans?

In hybris, is there an easy way to know which implementing class is being used for a certain Spring bean?
I mean, I can override a Bean by doing something like this:
<alias name="myCheckoutFacade" alias="checkoutFacade"/>
<bean id="myCheckoutFacade" class="com.pedra.facades.checkout.impl.MyCheckoutFacadeImpl" scope="tenant" parent="defaultCheckoutFacade">
<property name="commerceCheckoutService" ref="myCommerceCheckoutService"/>
</bean>
... so now when Spring needs to create a bean with the alias checkoutFacade, the implementing class will be MyCheckoutFacadeImpl as opposed to the overridden defaultCheckoutFacade, which was defined in some other XML configuration file.
So is there a way to know at runtime which implementing class is being used for a certain Spring bean definition? Without having to debug the code, I mean.
Beanshell or Groovy :-)
Checking the implementing class of a bean is just one of the many cool things you can do at runtime with Beanshell or Groovy.
Disclaimer: Be careful running Beanshell or Groovy code on a production machine!
Log in to the HAC and go to Console > Beanshell or Groovy
Execute the following code in either Beanshell or Groovy to get your implementing class:
de.hybris.platform.core.Registry.getApplicationContext().getBean("checkoutFacade");
Both consoles will show the result of the last expression in the Result tab.
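If the printed result does not make the class obvious (a proxy, for example), you can ask for it explicitly; this one-liner works in both consoles (note that for a proxied bean it shows the proxy class rather than the target class):
de.hybris.platform.core.Registry.getApplicationContext().getBean("checkoutFacade").getClass().getName();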
In the Groovy console for Hybris 5.x, simply execute the following:
checkoutFacade
As you can see, each bean is automatically def-ed into each Groovy script.
As for Beanshell, you could create a bean function in Beanshell:
import de.hybris.platform.core.Registry;
import de.hybris.platform.commercefacades.order.CheckoutFacade;

// Small helper to look up any bean by name from the application context
Object bean(String beanName)
{
    return Registry.getApplicationContext().getBean(beanName);
}

CheckoutFacade checkoutFacade = (CheckoutFacade) bean("checkoutFacade");
print(checkoutFacade);
I ended up using Beanshell so much that I created my own wrapper application that allows me to develop Beanshell in Eclipse, and use Eclipse as the Beanshell console. But that's a whole other post!
Resources:
Beanshell User Manual
Beanshell Commands Documentation (Built-in functions like print())

JSF 2 + Quartz scheduling library

I have a web application that's using JSF 2. In this application I am using a charting library that gets its data from an XML file; the application updates the XML file when someone accesses the site, via a JSF 2 action. Now I want to use the Quartz open source scheduling library to update the XML file without relying on a user action, but I have no idea how to call an action from Quartz using JSF 2.
Thanks in advance guys.
Generally speaking, you should implement your scheduled logic, define when it will run, and initialize your scheduled jobs when the application server starts.
Implement scheduled logic
Your scheduled class should implement the org.quartz.Job interface and override its execute() method, which contains the logic of your scheduled job. In your case, that is the method that updates the XML file. Make sure this method has no dependencies on JSF so that it can be called outside of JSF.
import org.quartz.Job;
import org.quartz.JobExecutionContext;
import org.quartz.JobExecutionException;

public class MyScheduledJob implements Job {
    @Override
    public void execute(JobExecutionContext context) throws JobExecutionException {
        updateXML(); // your existing logic that refreshes the chart data
    }
}
Initialize and start Quartz
Quartz provides a ServletContextListener called QuartzInitializerListener that allows you to initialize and start Quartz when the application server starts.
Add this listener to your web.xml:
<listener>
<listener-class>org.quartz.ee.servlet.QuartzInitializerListener</listener-class>
</listener>
By default, it will look for a file called quartz.properties on the classpath to initialize Quartz. Refer to the Quartz configuration reference for more info about the configurable options available in quartz.properties.
Define which Job will run at which time
You can define it in an XML file (its schema definition can be found here) and configure XMLSchedulingDataProcessorPlugin in quartz.properties to load that XML when Quartz is initialized.
For example, in quartz.properties:
org.quartz.plugin.jobInitializer.class = org.quartz.plugins.xml.XMLSchedulingDataProcessorPlugin
org.quartz.plugin.jobInitializer.fileNames = quartz-config.xml
org.quartz.plugin.jobInitializer.failOnFileNotFound = true
Then in quartz-config.xml:
<?xml version="1.0" encoding="UTF-8"?>
<job-scheduling-data
xmlns="http://www.quartz-scheduler.org/xml/JobSchedulingData"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.quartz-scheduler.org/xml/JobSchedulingData http://www.quartz-scheduler.org/xml/job_scheduling_data_1_8.xsd"
version="1.8">
    <schedule>
        <job>
            <name>MyScheduledJob</name>
            <group>MyScheduledGroup</group>
            <description>Job to update XML</description>
            <job-class>com.xxxx.xxxx.xxxx.MyScheduledJob</job-class>
        </job>
        <trigger>
            <cron>
                <name>midNightTrigger</name>
                <job-name>MyScheduledJob</job-name>
                <job-group>MyScheduledGroup</job-group>
                <!-- It will run every night at 3:30 am -->
                <cron-expression>0 30 3 * * ?</cron-expression>
            </cron>
        </trigger>
    </schedule>
</job-scheduling-data>
All of the above applies to Quartz's latest version, 2.1. You can check out the sample code and tutorials from Quartz for more info.
If you actually want to invoke a JSF action from a scheduled job, the job's execute() method will need to make an HTTP request to the JSF action. You will likely want to use a library such as Apache HttpClient or HttpUnit if Java's URLConnection class does not easily fit your needs.
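As a minimal sketch using only the JDK's HttpURLConnection (the URL below is a placeholder; the real endpoint, HTTP method, and parameters depend on how your JSF action is exposed):
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import org.quartz.Job;
import org.quartz.JobExecutionContext;
import org.quartz.JobExecutionException;

public class InvokeJsfActionJob implements Job {

    @Override
    public void execute(JobExecutionContext context) throws JobExecutionException {
        try {
            // Placeholder URL: point this at whatever request triggers the XML update.
            URL url = new URL("http://localhost:8080/myapp/updateChartData.xhtml");
            HttpURLConnection connection = (HttpURLConnection) url.openConnection();
            connection.setRequestMethod("GET");
            int status = connection.getResponseCode();
            try (InputStream in = connection.getInputStream()) {
                while (in.read() != -1) {
                    // drain the response body
                }
            }
            connection.disconnect();
            if (status != HttpURLConnection.HTTP_OK) {
                throw new JobExecutionException("Unexpected HTTP status: " + status);
            }
        } catch (JobExecutionException e) {
            throw e;
        } catch (Exception e) {
            throw new JobExecutionException(e);
        }
    }
}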
