Micronaut: Unable to inject a bean inside Quartz job - groovy

I am trying to inject a bean (either a GORM data service or any other bean) into a class that implements the Quartz Job interface, but it is always null. The same beans (GORM or others) inject without any issues into other classes.
Can you please help me retrieve a bean in this Job class?
My Quartz job:
@Singleton
@Slf4j
class MyQuartzJob implements Job {

    @Inject
    MyHttpBean myHttpBean // unable to inject

    @Inject
    ApplicationContext appContext // unable to inject

    @Inject
    MyGORMService myGormService // unable to inject
}
@Singleton
@Slf4j
class MyHttpBean {
    // business logic
}
Code that invokes the Quartz job:
@Singleton
@Context
@Slf4j
@CompileStatic
class MasterScheduler {

    @PostConstruct
    void init() {
        // Quartz job initialization code written here. This works fine.
    }
}
My build.gradle:
dependencyManagement {
    imports {
        mavenBom 'io.micronaut:micronaut-bom:1.3.2'
    }
}

dependencies {
    annotationProcessor("io.micronaut:micronaut-inject-java:1.3.2")
    annotationProcessor("io.micronaut:micronaut-inject-groovy:1.3.2")
    implementation("io.micronaut:micronaut-inject:1.3.2")
    // https://mvnrepository.com/artifact/org.quartz-scheduler/quartz
    compile group: 'org.quartz-scheduler', name: 'quartz', version: '2.3.0'
    // ... other dependencies
}
Java version: 1.8
Note: I am using Micronaut's scheduling capabilities, but I need distributed execution support and am therefore moving to Quartz.

You need to define a JobFactory.
package io.micronaut.quartz;

import io.micronaut.context.BeanContext;
import org.quartz.Job;
import org.quartz.JobDetail;
import org.quartz.Scheduler;
import org.quartz.SchedulerException;
import org.quartz.spi.JobFactory;
import org.quartz.spi.TriggerFiredBundle;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import javax.inject.Singleton;

@Singleton
public class MicronautJobFactory implements JobFactory {

    private final Logger log = LoggerFactory.getLogger(getClass());
    private final BeanContext beanContext;

    public MicronautJobFactory(BeanContext beanContext) {
        this.beanContext = beanContext;
    }

    @Override
    public Job newJob(TriggerFiredBundle bundle, Scheduler scheduler) throws SchedulerException {
        JobDetail jobDetail = bundle.getJobDetail();
        Class<? extends Job> jobClass = jobDetail.getJobClass();
        try {
            if (log.isDebugEnabled()) {
                log.debug("Producing instance of Job '" + jobDetail.getKey() +
                        "', class=" + jobClass.getName());
            }
            return beanContext.getBean(jobClass);
        } catch (Exception e) {
            throw new SchedulerException(
                    "Problem instantiating class '" + jobDetail.getJobClass().getName() + "'", e);
        }
    }
}
Afterward, when you define your scheduler, you will have to register this factory with the Quartz scheduler. It replaces the default factory that ships with Quartz.
All your jobs going forward will have to be defined with the @Prototype or @Singleton annotation. By default, Quartz uses newInstance(), which bypasses the objects managed by Micronaut; this factory defers object construction to Micronaut when the job is instantiated.
scheduler.setJobFactory(jobFactory);
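For example, the wiring inside the question's MasterScheduler could look like the minimal sketch below. It assumes a StdSchedulerFactory-based scheduler; the job and trigger identities and the 30-second schedule are placeholders, and only setJobFactory(...) is the essential part:

import io.micronaut.context.annotation.Context;
import io.micronaut.quartz.MicronautJobFactory;
import org.quartz.JobBuilder;
import org.quartz.JobDetail;
import org.quartz.Scheduler;
import org.quartz.SchedulerException;
import org.quartz.SimpleScheduleBuilder;
import org.quartz.Trigger;
import org.quartz.TriggerBuilder;
import org.quartz.impl.StdSchedulerFactory;
import javax.annotation.PostConstruct;
import javax.inject.Singleton;

@Singleton
@Context
public class MasterScheduler {

    private final MicronautJobFactory jobFactory;

    public MasterScheduler(MicronautJobFactory jobFactory) {
        this.jobFactory = jobFactory;
    }

    @PostConstruct
    void init() throws SchedulerException {
        Scheduler scheduler = StdSchedulerFactory.getDefaultScheduler();
        // Hand job creation over to Micronaut instead of Quartz's newInstance()
        scheduler.setJobFactory(jobFactory);

        JobDetail job = JobBuilder.newJob(MyQuartzJob.class) // the job bean from the question
                .withIdentity("myQuartzJob")
                .build();
        Trigger trigger = TriggerBuilder.newTrigger()
                .withIdentity("myQuartzTrigger")
                .withSchedule(SimpleScheduleBuilder.repeatSecondlyForever(30))
                .build();
        scheduler.scheduleJob(job, trigger);
        scheduler.start();
    }
}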

Related

How to run the SpringBootTest with only a single bean and with included resilience4j annotations

I would like to run an integration test of a single bean with a resilience4j-annotated method in a Spring Boot app. My intent is to test the resiliency of bean method calls without loading the full Spring context.
The setup is as follows:
Dependencies include the following:
io.github.resilience4j:resilience4j-spring-boot2
io.github.resilience4j:resilience4j-reactor
org.springframework.boot:spring-boot-starter-aop
The resilience4j time-limited Spring bean with the method to test:
@Service
public class FooService {

    @TimeLimiter(name = "fooTimeLimiter")
    public FooResponse foo() {
        // entertain operation that might time out
    }
}
Configuration:
resilience4j.timelimiter.instances.fooTimeLimiter.timeoutDuration=1s
And the test:
@SpringBootTest
@ContextConfiguration(classes = FooService.class)
public class FooServiceIT {

    @Autowired
    private FooService service;

    @MockBean
    private Bar bar;

    @Test
    void foo_timeout() {
        // setup mocks so the operation delays the output and ends up with a timeout
        Assertions.assertThrows(TimeoutException.class, () -> service.foo());
    }
}
However, TimeLimiterAdvice.proceed() is never invoked, no timeout exception is thrown, and the test fails.
The same question has been asked here: Testing SpringBoot with annotation-style Resilience4j, but there is no solution.
I tried both approaches, implementing a FooService interface and programming directly against the concrete class, with the same result.
How can I ensure the time limiter annotation is taken into account in my test?
Edit: I even tried a plain Spring test (no Spring Boot) with the following setup:
@ExtendWith(SpringExtension.class)
@ContextConfiguration(loader = AnnotationConfigContextLoader.class)
public class FooServiceIT {

    @Configuration
    @Import({TimeLimiterConfiguration.class, FallbackConfiguration.class, SpelResolverConfiguration.class})
    static class ContextConfiguration {

        @Bean
        public FooService fooService() {
            // prepare bean;
        }

        @Bean
        public TimeLimiterConfigurationProperties timeLimiterConfigurationProperties() {
            return new TimeLimiterConfigurationProperties();
        }
    }

    @Autowired
    private FooService service;

    // tests...
}
Same result (i.e. no timeout exception).
When dealing with @SpringBootTest and @CircuitBreaker, it was sufficient to add the @EnableAspectJAutoProxy annotation to the test. After this change, the CircuitBreakerAspect was applied and the test behaved as expected.
To make @TimeLimiter work as expected, one needs to add the @Bulkhead annotation to the method as well.
The updated method looks as follows:
@Bulkhead(name = "fooBulkhead", type = Type.THREADPOOL)
@CircuitBreaker(
        name = "fooCircuitBreaker",
        fallbackMethod = "fooFallback"
)
@TimeLimiter(
        name = "fooTimeLimiter"
)
public CompletableFuture<FooResponse> foo() {
    //...
}
and the test:
@SpringBootTest(classes = FooService.class)
@EnableAspectJAutoProxy
@Import(value = {CircuitBreakerAutoConfiguration.class, TimeLimiterAutoConfiguration.class, BulkheadAutoConfiguration.class})
public class FooServiceIT {
    //...
}

Injecting a different bean during local development with Quarkus

With Spring and Micronaut, there are very concise ways to inject a different bean depending on what environment/profile an application is running in. I'm trying to do the same with Quarkus.
I've read this post: https://quarkus.io/blog/quarkus-dependency-injection/. And the process is alluded to in this StackOverflow post: How can I override a CDI bean in Quarkus for testing?. That last post says, "create bean in test directory".
My problem is slightly different. I'd like to inject a bean when in "development". In production, I'd like the default bean injected. From the docs, I can't see a way to have the app make this distinction.
If I have a default class like this:
@DefaultBean
@ApplicationScoped
class ProdProvider : SomeProvider {}
And I want to override it like this:
@Alternative
@Priority(1)
class DevProvider : SomeProvider {}
How can I make this happen only in dev mode?
In one case, I have a credential provider class that sets up Google's Pub/Sub emulator during local development; in production, I use a class that implements the same interface but provides real credentials. The particular case that led me to ask this question, though, is a class that implements one method:
@ApplicationScoped
class VaultLoginJwtProvider : LoginJwtProvider {

    @ConfigProperty(name = "vault.tokenPath")
    private val jwtPath: String? = null

    companion object {
        val logger: Logger = LoggerFactory.getLogger("VaultTokenProvider")
    }

    override fun getLoginJwt(): Optional<String> {
        logger.info("Using Vault Login JWT")
        return try {
            Optional.of(String(Files.readAllBytes(Paths.get(jwtPath))).trim { it <= ' ' })
        } catch (e: Exception) {
            logger.error("Could not read vault token at $jwtPath")
            logger.error(e.printStackTrace().toString())
            Optional.empty()
        }
    }
}
That class is injected into another class via constructor injection:
@Singleton
class JwtServiceImpl(
    @RestClient val vaultClient: VaultClient,
    @Inject val loginJwtProvider: LoginJwtProvider
) {
    private var serviceJwt: String? = null

    companion object {
        val logger: Logger = LoggerFactory.getLogger("JwtServiceImpl")
    }

    private fun getLoginToken(): String? {
        val vaultLogin = VaultLogin(
            role = "user-service",
            jwt = loginJwtProvider.getLoginJwt().get()
        )
        val loginResponse = vaultClient.login(vaultLogin)
        return loginResponse.auth.clientToken
    }
}
I'd like to inject more of a "mock" class while in development that just returns a static string. I could use ProfileManager.getActiveProfile(), but that mixes development concerns into my logic, and I don't feel that has any place in my compiled production code.
This is possible in Micronaut by using the annotation @Requires(env = ["dev", "test"]). I did briefly look at using @Produces, but the Oracle EE docs seemed a little difficult for me to grasp. If that's the solution, I'll dig in.
In case anybody else comes across this, this is how to do it: https://quarkus.io/guides/cdi-reference#enabling-beans-for-quarkus-build-profile
For example:
import javax.enterprise.context.ApplicationScoped;
import javax.enterprise.inject.Produces;
import com.oi1p.common.EmailSender;
import com.oi1p.common.ErrorEmailSender;
import com.oi1p.common.LogOnlyEmailSender;
import io.quarkus.arc.DefaultBean;
import io.quarkus.arc.profile.IfBuildProfile;

@ApplicationScoped
public class Producers {

    @Produces
    @IfBuildProfile("dev")
    public EmailSender logOnlyEmailSender() {
        return new LogOnlyEmailSender();
    }

    @Produces
    @DefaultBean
    public EmailSender errorEmailSender() {
        // TODO: implement a real email sender. This one explodes when poked.
        return new ErrorEmailSender();
    }
}
My solution is to create the final bean on my own inside a @javax.ws.rs.ext.Provider. Not as elegant as Micronaut's @Requires, but it works.
Note that the instance of SomeProvider is not a "bean"; you have to take care of the lifecycle on your own (dependency injection, @PostConstruct, no @PreDestroy, ...).
org.acme.SomeProvider.java
package org.acme;

import javax.enterprise.context.ApplicationScoped;

public interface SomeProvider {

    void providerMethod();

    @ApplicationScoped
    class ProdProviderRequirement {
        void foo() {}
    }

    class ProdProvider implements SomeProvider {

        private final ProdProviderRequirement prodProviderRequirement;

        ProdProvider(final ProdProviderRequirement prodProviderRequirement) {
            this.prodProviderRequirement = prodProviderRequirement;
        }

        @Override
        public void providerMethod() {
            prodProviderRequirement.foo();
        }
    }

    class DevProvider implements SomeProvider {
        @Override
        public void providerMethod() {}
    }
}
org.acme.SomeProviderFactory.java
package org.acme;

import javax.enterprise.context.ApplicationScoped;
import javax.enterprise.inject.Produces;
import javax.inject.Inject;
import javax.ws.rs.ext.Provider;
import org.acme.SomeProvider.DevProvider;
import org.acme.SomeProvider.ProdProvider;
import org.acme.SomeProvider.ProdProviderRequirement;

@Provider
class SomeProviderFactory {

    SomeProvider someProvider;

    @Inject
    SomeProviderFactory(final ProdProviderRequirement prodProviderRequirement) {
        final var someCondition = true;
        someProvider = someCondition ? new DevProvider() : new ProdProvider(prodProviderRequirement);
    }

    @Produces
    @ApplicationScoped
    SomeProvider someProvider() {
        return someProvider;
    }
}

spring-integration amqp outbound adapter race condition?

We've got a rather complicated spring-integration-amqp use case in one of our production applications, and we've been seeing some "org.springframework.integration.MessageDispatchingException: Dispatcher has no subscribers" exceptions on startup. After the initial errors on startup, we don't see those exceptions anymore from the same components. This looks like a startup race condition on components that depend on AMQP outbound adapters and end up using them early in the lifecycle.
I can reproduce this by calling a gateway that sends to a channel wired to an outbound adapter in a @PostConstruct method.
config:
package gadams;

import org.springframework.amqp.core.Queue;
import org.springframework.amqp.rabbit.core.RabbitTemplate;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;
import org.springframework.integration.annotation.IntegrationComponentScan;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.dsl.amqp.Amqp;
import org.springframework.integration.dsl.channel.MessageChannels;
import org.springframework.messaging.MessageChannel;

@SpringBootApplication
@IntegrationComponentScan
public class RabbitRace {

    public static void main(String[] args) {
        SpringApplication.run(RabbitRace.class, args);
    }

    @Bean(name = "HelloOut")
    public MessageChannel channelHelloOut() {
        return MessageChannels.direct().get();
    }

    @Bean
    public Queue queueHello() {
        return new Queue("hello.q");
    }

    @Bean(name = "helloOutFlow")
    public IntegrationFlow flowHelloOutToRabbit(RabbitTemplate rabbitTemplate) {
        return IntegrationFlows.from("HelloOut")
                .handle(Amqp.outboundAdapter(rabbitTemplate).routingKey("hello.q"))
                .get();
    }
}
gateway:
package gadams;

import org.springframework.integration.annotation.Gateway;
import org.springframework.integration.annotation.MessagingGateway;

@MessagingGateway
public interface HelloGateway {

    @Gateway(requestChannel = "HelloOut")
    void sendMessage(String message);
}
component:
package gadams;

import javax.annotation.PostConstruct;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.DependsOn;
import org.springframework.stereotype.Component;

@Component
@DependsOn("helloOutFlow")
public class HelloPublisher {

    @Autowired
    private HelloGateway helloGateway;

    @PostConstruct
    public void postConstruct() {
        helloGateway.sendMessage("hello");
    }
}
In my production use case, we have a component with a @PostConstruct method in which we use a TaskScheduler to schedule a bunch of components, some of which depend on AMQP outbound adapters, and some of those end up executing immediately. I've tried putting bean names on the IntegrationFlows that involve an outbound adapter and using @DependsOn on the beans that use the gateways and/or on the gateway itself, but that doesn't get rid of the errors on startup.
This is all about the Lifecycle. Any Spring Integration endpoint starts listening for, or producing, messages only when its start() is performed.
Typically, with the standard default autoStartup = true, that is done in ApplicationContext.finishRefresh() via:

    // Propagate refresh to lifecycle processor first.
    getLifecycleProcessor().onRefresh();

Starting to produce messages into a channel from @PostConstruct (afterPropertiesSet()) is really very early, because it happens far ahead of finishRefresh().
You really should reconsider your producing logic and move that implementation into the SmartLifecycle.start() phase.
See more info in the Reference Manual.
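For example, a minimal sketch of the HelloPublisher from the question reworked this way (assuming Spring 5+, where the remaining SmartLifecycle methods have defaults, and where the default phase Integer.MAX_VALUE starts last, after the integration endpoints):

package gadams;

import org.springframework.context.SmartLifecycle;
import org.springframework.stereotype.Component;

@Component
public class HelloPublisher implements SmartLifecycle {

    private final HelloGateway helloGateway;
    private volatile boolean running;

    public HelloPublisher(HelloGateway helloGateway) {
        this.helloGateway = helloGateway;
    }

    @Override
    public void start() {
        // By the time start() runs, the integration endpoints have been
        // started, so the 'HelloOut' channel already has its subscriber.
        helloGateway.sendMessage("hello");
        running = true;
    }

    @Override
    public void stop() {
        running = false;
    }

    @Override
    public boolean isRunning() {
        return running;
    }
}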

Spring Batch thread-safe Map job repository

The Spring Batch docs say of the Map-backed job repository:
Note that the in-memory repository is volatile and so does not allow restart between JVM instances. It also cannot guarantee that two job instances with the same parameters are launched simultaneously, and is not suitable for use in a multi-threaded Job, or a locally partitioned Step. So use the database version of the repository wherever you need those features.
I would like to use a Map job repository, and I do not care about restarting, prevention of concurrent job executions, etc., but I do care about being able to use multi-threading and local partitioning.
My batch application has some partitioned steps, and at first glance it seems to run just fine with a Map-backed job repository.
Why is it said to be not possible with MapJobRepositoryFactoryBean? Looking at the implementation of the Map DAOs, they use ConcurrentHashMap. Is this not thread-safe?
I would advise you to follow the documentation rather than relying on implementation details. Even if the maps are individually thread-safe, there can be race conditions in changes that involve more than one of these maps.
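As an illustration (hypothetical code, not the actual Spring Batch DAOs): even when each map is a ConcurrentHashMap, a check-then-act sequence that spans two maps is not atomic, so two threads can interleave between the steps:

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

class TwoMapRace {

    private final Map<String, Long> executionsByJob = new ConcurrentHashMap<>();
    private final Map<Long, String> contextsByExecution = new ConcurrentHashMap<>();

    void saveExecution(String jobKey, long executionId) {
        if (!executionsByJob.containsKey(jobKey)) {       // check
            // Two threads can both pass the check above before either
            // put below executes, leaving the two maps inconsistent.
            executionsByJob.put(jobKey, executionId);     // act
            contextsByExecution.put(executionId, "context");
        }
    }
}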
You can use an in-memory database very easily. Example:
@Grapes([
    @Grab('org.springframework:spring-jdbc:4.0.5.RELEASE'),
    @Grab('com.h2database:h2:1.3.175'),
    @Grab('org.springframework.batch:spring-batch-core:3.0.6.RELEASE'),
    // must be passed with -cp, for whatever reason the GroovyClassLoader
    // is not used for com.thoughtworks.xstream.io.json.JettisonMappedXmlDriver
    //@Grab('org.codehaus.jettison:jettison:1.2'),
])
import org.h2.jdbcx.JdbcDataSource
import org.springframework.batch.core.Job
import org.springframework.batch.core.JobParameters
import org.springframework.batch.core.Step
import org.springframework.batch.core.StepContribution
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory
import org.springframework.batch.core.launch.JobLauncher
import org.springframework.batch.core.scope.context.ChunkContext
import org.springframework.batch.core.step.tasklet.Tasklet
import org.springframework.batch.repeat.RepeatStatus
import org.springframework.beans.factory.annotation.Autowired
import org.springframework.context.annotation.AnnotationConfigApplicationContext
import org.springframework.context.annotation.Bean
import org.springframework.context.annotation.Configuration
import org.springframework.core.io.ResourceLoader
import org.springframework.jdbc.datasource.init.DatabasePopulatorUtils
import org.springframework.jdbc.datasource.init.ResourceDatabasePopulator

import javax.annotation.PostConstruct
import javax.sql.DataSource

@Configuration
@EnableBatchProcessing
class AppConfig {

    @Autowired
    private JobBuilderFactory jobs

    @Autowired
    private StepBuilderFactory steps

    @Bean
    public Job job() {
        return jobs.get("myJob").start(step1()).build()
    }

    @Bean
    Step step1() {
        this.steps.get('step1')
                .tasklet(new MyTasklet())
                .build()
    }

    @Bean
    DataSource dataSource() {
        new JdbcDataSource().with {
            url = 'jdbc:h2:mem:temp_db;DB_CLOSE_DELAY=-1'
            user = 'sa'
            password = 'sa'
            it
        }
    }

    @Bean
    BatchSchemaPopulator batchSchemaPopulator() {
        new BatchSchemaPopulator()
    }
}

class BatchSchemaPopulator {

    @Autowired
    ResourceLoader resourceLoader

    @Autowired
    DataSource dataSource

    @PostConstruct
    void init() {
        def populator = new ResourceDatabasePopulator()
        populator.addScript(
                resourceLoader.getResource(
                        'classpath:/org/springframework/batch/core/schema-h2.sql'))
        DatabasePopulatorUtils.execute populator, dataSource
    }
}

class MyTasklet implements Tasklet {
    @Override
    RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) throws Exception {
        println 'TEST!'
        RepeatStatus.FINISHED
    }
}

def ctx = new AnnotationConfigApplicationContext(AppConfig)
def launcher = ctx.getBean(JobLauncher)
def jobExecution = launcher.run(ctx.getBean(Job), new JobParameters([:]))
println "Status is: ${jobExecution.status}"

How to configure json format when using jaxb annotations with jersey

I am using Jersey to expose a service that uses JAXB-annotated classes to configure the look of the JSON.
I am trying to include the type directive in each JSON element. I do this by providing a Provider, as such:
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.ext.ContextResolver;
import javax.ws.rs.ext.Provider;
import org.codehaus.jackson.JsonParser.Feature;
import org.codehaus.jackson.map.ObjectMapper;
import org.codehaus.jackson.map.ObjectMapper.DefaultTyping;

@Provider
@Produces(MediaType.APPLICATION_JSON)
public class CmsContextResolver implements ContextResolver<ObjectMapper> {

    ObjectMapper mapper;

    public CmsContextResolver() {
        mapper = new ObjectMapper();
        // @JsonTypeInfo(use = JsonTypeInfo.Id.NAME, include =
        //     JsonTypeInfo.As.WRAPPER_OBJECT, property = "@type")
        mapper.configure(Feature.INTERN_FIELD_NAMES, true);
        mapper.enableDefaultTypingAsProperty(DefaultTyping.NON_FINAL, "@type");
    }

    @Override
    public ObjectMapper getContext(Class<?> arg0) {
        return mapper;
    }
}
And this provider is definitely being picked up.
10 May 2011 3:53:18 PM com.sun.jersey.api.core.ScanningResourceConfig logClasses
INFO: Provider classes found:
class com.afrozaar.cms.service.CmsContextResolver
But it is making no difference. The format of the json is unaffected.
As far as I can tell, the problem stems either from Jersey not using Jackson to serialize, or from Jersey ignoring my Jackson configuration overrides...
I don't know why your code isn't working, but this is what I use:
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.ext.Provider;
import org.codehaus.jackson.jaxrs.JacksonJaxbJsonProvider;

@Provider
@Produces(MediaType.APPLICATION_JSON)
public class JsonProvider extends JacksonJaxbJsonProvider {

    public JsonProvider() {
        super();
        // myConfiguredObjectMapper: your ObjectMapper with the desired settings
        setMapper(myConfiguredObjectMapper);
    }
}
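For completeness, a sketch of what such a configured mapper might look like for the question's use case; the helper class and method names are hypothetical, and the settings simply mirror the resolver from the question:

import org.codehaus.jackson.JsonParser.Feature;
import org.codehaus.jackson.map.ObjectMapper;
import org.codehaus.jackson.map.ObjectMapper.DefaultTyping;

final class Mappers {

    // Hypothetical helper: builds the mapper the provider above installs.
    static ObjectMapper configuredMapper() {
        ObjectMapper mapper = new ObjectMapper();
        mapper.configure(Feature.INTERN_FIELD_NAMES, true);
        // Write an "@type" property on non-final types, as in the question
        mapper.enableDefaultTypingAsProperty(DefaultTyping.NON_FINAL, "@type");
        return mapper;
    }
}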
