Why is MockBean not working in an integration test with RabbitMQ? - mockito

I have a Spring Boot application with RabbitMQ and I am trying to test my code in an integration test. The integration test works in some cases, but not in all of them.
Application code
@Component
@AllArgsConstructor
public class TestQueueListener {

    private final TestService testService;

    @RabbitListener(queues = "q_test")
    public void listen(String value) {
        testService.doSomething(value);
    }
}
@Service
@Slf4j
public class TestService {

    public void doSomething(String value) {
        log.info(value);
    }
}
Test code
@TestClassOrder(ClassOrderer.OrderAnnotation.class)
class Tests {

    @Nested
    @Order(1)
    @SpringBootTest(webEnvironment = WebEnvironment.NONE)
    @ExtendWith(OutputCaptureExtension.class)
    class ServiceTest {

        @Autowired TestService testService;

        @Test
        void testDoSomething(CapturedOutput output) {
            testService.doSomething("ServiceTestValue");
            assertTrue(output.toString().contains("ServiceTestValue"));
        }
    }

    @Nested
    @Order(2)
    @SpringBootTest(webEnvironment = WebEnvironment.NONE)
    class ListenerTest {

        @Autowired private RabbitTemplate rabbitTemplate;
        @MockBean private TestService testService;

        @Test
        void testListen() {
            rabbitTemplate.convertAndSend("q_test", "testValue");
            verify(testService, timeout(30000)).doSomething("testValue");
        }
    }
}
Logs
2022-05-19 10:59:21.527 INFO 102 --- [ main] .b.t.c.SpringBootTestContextBootstrapper : Neither @ContextConfiguration nor @ContextHierarchy found for test class [test.Tests$ListenerTest], using SpringBootContextLoader
2022-05-19 10:59:21.528 INFO 102 --- [ main] o.s.t.c.support.AbstractContextLoader : Could not detect default resource locations for test class [test.Tests$ListenerTest]: no resource found for suffixes {-context.xml, Context.groovy}.
2022-05-19 10:59:21.529 INFO 102 --- [ main] t.c.s.AnnotationConfigContextLoaderUtils : Could not detect default configuration classes for test class [test.Tests$ListenerTest]: ListenerTest does not declare any static, non-private, non-final, nested classes annotated with @Configuration.
2022-05-19 10:59:21.534 INFO 102 --- [ main] .b.t.c.SpringBootTestContextBootstrapper : Found @SpringBootConfiguration test.TestApplication for test class test.Tests$ListenerTest
2022-05-19 10:59:21.536 INFO 102 --- [ main] .b.t.c.SpringBootTestContextBootstrapper : Loaded default TestExecutionListener class names from location [META-INF/spring.factories]: [org.springframework.boot.test.mock.mockito.MockitoTestExecutionListener, org.springframework.boot.test.mock.mockito.ResetMocksTestExecutionListener, org.springframework.boot.test.autoconfigure.restdocs.RestDocsTestExecutionListener, org.springframework.boot.test.autoconfigure.web.client.MockRestServiceServerResetTestExecutionListener, org.springframework.boot.test.autoconfigure.web.servlet.MockMvcPrintOnlyOnFailureTestExecutionListener, org.springframework.boot.test.autoconfigure.web.servlet.WebDriverTestExecutionListener, org.springframework.boot.test.autoconfigure.webservices.client.MockWebServiceServerTestExecutionListener, org.springframework.test.context.web.ServletTestExecutionListener, org.springframework.test.context.support.DirtiesContextBeforeModesTestExecutionListener, org.springframework.test.context.event.ApplicationEventsTestExecutionListener, org.springframework.test.context.support.DependencyInjectionTestExecutionListener, org.springframework.test.context.support.DirtiesContextTestExecutionListener, org.springframework.test.context.transaction.TransactionalTestExecutionListener, org.springframework.test.context.jdbc.SqlScriptsTestExecutionListener, org.springframework.test.context.event.EventPublishingTestExecutionListener]
2022-05-19 10:59:21.536 INFO 102 --- [ main] .b.t.c.SpringBootTestContextBootstrapper : Using TestExecutionListeners: [org.springframework.test.context.web.ServletTestExecutionListener#7130e7a8, org.springframework.test.context.support.DirtiesContextBeforeModesTestExecutionListener#3fe79b11, org.springframework.test.context.event.ApplicationEventsTestExecutionListener#15f209fc, org.springframework.boot.test.mock.mockito.MockitoTestExecutionListener#4f2eec7c, org.springframework.boot.test.autoconfigure.SpringBootDependencyInjectionTestExecutionListener#39d55f96, org.springframework.test.context.support.DirtiesContextTestExecutionListener#3ae974ae, org.springframework.test.context.transaction.TransactionalTestExecutionListener#3fd127e6, org.springframework.test.context.jdbc.SqlScriptsTestExecutionListener#724716c, org.springframework.test.context.event.EventPublishingTestExecutionListener#149eb7f1, org.springframework.boot.test.mock.mockito.ResetMocksTestExecutionListener#65a3edc8, org.springframework.boot.test.autoconfigure.restdocs.RestDocsTestExecutionListener#537b088d, org.springframework.boot.test.autoconfigure.web.client.MockRestServiceServerResetTestExecutionListener#77f0ac9f, org.springframework.boot.test.autoconfigure.web.servlet.MockMvcPrintOnlyOnFailureTestExecutionListener#16b0acaf, org.springframework.boot.test.autoconfigure.web.servlet.WebDriverTestExecutionListener#6c87de1c, org.springframework.boot.test.autoconfigure.webservices.client.MockWebServiceServerTestExecutionListener#67b38c61]
:: Spring Boot :: (v2.6.6)
2022-05-19 10:59:21.634 INFO 102 --- [ main] test.Tests$ListenerTest : Starting Tests.ListenerTest using Java 11.0.10 on runner-grawsgkm-project-663-concurrent-0 with PID 102 (started by root in /builds/test/test-app)
2022-05-19 10:59:21.635 INFO 102 --- [ main] test.Tests$ListenerTest : No active profile set, falling back to 1 default profile: "default"
2022-05-19 10:59:22.307 INFO 102 --- [ main] o.s.a.r.c.CachingConnectionFactory : Attempting to connect to: [tmprabbit:5672]
2022-05-19 10:59:22.317 INFO 102 --- [ main] o.s.a.r.c.CachingConnectionFactory : Created new connection: rabbitConnectionFactory@4813cd48:0/SimpleConnection@662f5d78 [delegate=amqp://guest@172.17.0.2:5672/, localPort= 49876]
2022-05-19 10:59:22.336 INFO 102 --- [ main] test.Tests$ListenerTest : Started Tests.ListenerTest in 0.796 seconds (JVM running for 7.761)
2022-05-19 10:59:22.383 INFO 102 --- [ntContainer#0-1] test.TestService : testValue
[ERROR] Tests run: 2, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 36.706 s <<< FAILURE! - in test.Tests
[ERROR] testListen Time elapsed: 30.053 s <<< FAILURE!
org.mockito.exceptions.verification.WantedButNotInvoked:
Wanted but not invoked:
testService bean.doSomething("testValue");
-> at test.Tests$ListenerTest.testListen(Tests.java:46)
Actually, there were zero interactions with this mock.
at test.Tests$ListenerTest.testListen(Tests.java:46)
Research
The logs show that the real TestService is called, not the mock. I thought it had something to do with Spring's context cache, so I enabled DEBUG logs. However, the cache contains the expected number of contexts, and the extended logs also show that the correct context is retrieved.
2022-05-19 11:29:42.707 DEBUG 103 --- [ main] c.DefaultCacheAwareContextLoaderDelegate : Storing ApplicationContext [1261461477] in cache under key [[MergedContextConfiguration#3f7e3d48 testClass = Tests.ListenerTest, locations = '{}', classes = '{class test.TestApplication}', contextInitializerClasses = '[]', activeProfiles = '{}', propertySourceLocations = '{}', propertySourceProperties = '{org.springframework.boot.test.context.SpringBootTestContextBootstrapper=true}', contextCustomizers = set[org.springframework.boot.test.context.filter.ExcludeFilterContextCustomizer#31b82e0f, org.springframework.boot.test.json.DuplicateJsonObjectContextCustomizerFactory$DuplicateJsonObjectContextCustomizer#10272bbb, org.springframework.boot.test.mock.mockito.MockitoContextCustomizer#769c78c2, org.springframework.boot.test.web.client.TestRestTemplateContextCustomizer#da28d03, org.springframework.boot.test.autoconfigure.actuate.metrics.MetricsExportContextCustomizerFactory$DisableMetricExportContextCustomizer#65afeb6d, org.springframework.boot.test.autoconfigure.properties.PropertyMappingContextCustomizer#0, org.springframework.boot.test.autoconfigure.web.servlet.WebDriverContextCustomizerFactory$Customizer#b27b210, org.springframework.boot.test.context.SpringBootTestArgs#1, org.springframework.boot.test.context.SpringBootTestWebEnvironment#7c2b6087], contextLoader = 'org.springframework.boot.test.context.SpringBootContextLoader', parent = [null]]]
2022-05-19 11:29:42.708 DEBUG 103 --- [ main] org.springframework.test.context.cache : Spring test ApplicationContext cache statistics: [DefaultContextCache#2c36de3b size = 2, maxSize = 32, parentContextCount = 0, hitCount = 11, missCount = 2]
2022-05-19 11:29:42.712 DEBUG 103 --- [ main] c.DefaultCacheAwareContextLoaderDelegate : Retrieved ApplicationContext [1261461477] from cache with key [[MergedContextConfiguration#3f7e3d48 testClass = Tests.ListenerTest, locations = '{}', classes = '{class test.TestApplication}', contextInitializerClasses = '[]', activeProfiles = '{}', propertySourceLocations = '{}', propertySourceProperties = '{org.springframework.boot.test.context.SpringBootTestContextBootstrapper=true}', contextCustomizers = set[org.springframework.boot.test.context.filter.ExcludeFilterContextCustomizer#31b82e0f, org.springframework.boot.test.json.DuplicateJsonObjectContextCustomizerFactory$DuplicateJsonObjectContextCustomizer#10272bbb, org.springframework.boot.test.mock.mockito.MockitoContextCustomizer#769c78c2, org.springframework.boot.test.web.client.TestRestTemplateContextCustomizer#da28d03, org.springframework.boot.test.autoconfigure.actuate.metrics.MetricsExportContextCustomizerFactory$DisableMetricExportContextCustomizer#65afeb6d, org.springframework.boot.test.autoconfigure.properties.PropertyMappingContextCustomizer#0, org.springframework.boot.test.autoconfigure.web.servlet.WebDriverContextCustomizerFactory$Customizer#b27b210, org.springframework.boot.test.context.SpringBootTestArgs#1, org.springframework.boot.test.context.SpringBootTestWebEnvironment#7c2b6087], contextLoader = 'org.springframework.boot.test.context.SpringBootContextLoader', parent = [null]]]
I also added the following log statements to check whether the injected TestService is really a mock. Both variables turned out to contain a mock.
log.info("Direct: " + Mockito.mockingDetails(testService).isMock());
log.info("Indirect: " + Mockito.mockingDetails(testQueueListener.testService).isMock());
If I change the order of the tests, all tests pass. However, I don't want to order all my tests as nested classes within one class; that is not readable.
If I call the TestQueueListener directly, without going through RabbitMQ, all tests also pass.
@Nested
@Order(2)
@SpringBootTest(webEnvironment = WebEnvironment.NONE)
class ListenerTest {

    @Autowired private TestQueueListener testQueueListener;
    @MockBean private TestService testService;

    @Test
    void testListen() {
        testQueueListener.listen("testValue");
        verify(testService, timeout(30000)).doSomething("testValue");
    }
}
But I also want to test messaging with RabbitMQ.
Question
Why is my integration test failing?

Related

Getting an error while deploying to WebLogic 14 with the log4j library

Source code:
@Controller
public class HomeController {

    private static final Logger log = LogManager.getLogger();

    @GetMapping("/hello")
    public @ResponseBody String getHello() {
        DemoClass cls = new DemoClass();
        cls.helloworld();
        log.info("INFO =====================");
        return "Hello2";
    }
}
The culprit in the above code is LogManager.getLogger().
The same code works perfectly fine on Apache Tomcat.
Java 11 is used, WebLogic 14 is used, and the log4j version is 2.19.0.
The stack trace is as follows:
Caused By: java.lang.UnsupportedOperationException: No class provided, and an appropriate one cannot be found.
at org.apache.logging.log4j.LogManager.callerClass(LogManager.java:573)
at org.apache.logging.log4j.LogManager.getLogger(LogManager.java:598)
at org.apache.logging.log4j.LogManager.getLogger(LogManager.java:585)
at com.example.demo.controller.HomeController.<clinit>(HomeController.java:12)
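A minimal sketch of the usual workaround (my assumption, not something stated in the question): passing the class to LogManager.getLogger explicitly avoids the caller-class lookup that fails in the stack trace above.
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

@Controller
public class HomeController {
    // Supplying the class skips the stack-walking caller lookup done by the no-arg getLogger().
    private static final Logger log = LogManager.getLogger(HomeController.class);
    // ... handler methods unchanged
}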

Why are my Library Logs Missing in a Groovy Script?

I've been using HttpBuilder-NG in a large compiled project and relying on its request and response logging when debugging, but using it in a standalone Groovy script produces only the output from my own class, not the library's logs.
Can anyone suggest how to get the logs I expect to see from the HttpBuilder-NG library?
Here's a quick test scenario I put together:
// logging-test.groovy
import ch.qos.logback.classic.Logger
import ch.qos.logback.classic.Level
import static groovyx.net.http.HttpBuilder.configure
import static groovyx.net.http.util.SslUtils.ignoreSslIssues
import groovy.util.logging.Slf4j
import groovyx.net.http.OkHttpBuilder

@GrabConfig(systemClassLoader = true) // Encounters class loading issues without this
@GrabResolver(name = 'mcArtifacts', root = 'https://artifactory.mycompany.com/artifactory/maven-all/')
@Grab(group = 'io.github.http-builder-ng', module = 'http-builder-ng-core', version = '1.0.3')
@Grab(group = 'io.github.http-builder-ng', module = 'http-builder-ng-okhttp', version = '1.0.3')
@Grab('ch.qos.logback:logback-classic:1.2.3')
@Grab('ch.qos.logback:logback-core:1.2.3')
@Slf4j
class LoggingTest {

    private static final Logger LOGGER = org.slf4j.LoggerFactory.getLogger(LoggingTest.class)

    static void main(String[] args) {
        new LoggingTest().run(args)
    }

    def run(String[] args) {
        def builder = OkHttpBuilder.configure({
            ignoreSslIssues execution
            request.uri = "https://dummy.restapiexample.com"
        })
        def currentContents = builder.get {
            request.uri.path = "/api/v1/employees"
        }
        LOGGER.info "Testing output - HttpBuilder-NG gives back a ${currentContents.getClass()}"
        LOGGER.debug "Validating debug works."
    }
}
And for logback configuration:
// logback.groovy
import ch.qos.logback.classic.encoder.PatternLayoutEncoder
import ch.qos.logback.core.ConsoleAppender

appender('CONSOLE', ConsoleAppender) {
    encoder(PatternLayoutEncoder) {
        pattern = '%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n'
    }
}

logger 'groovyx.net.http.JavaHttpBuilder', DEBUG, ['CONSOLE']
logger 'groovy.net.http.JavaHttpBuilder', DEBUG, ['CONSOLE']
logger 'groovy.net.http.JavaHttpBuilder.content', TRACE, ['CONSOLE']
logger 'groovy.net.http.JavaHttpBuilder.headers', DEBUG, ['CONSOLE']
root DEBUG, ['CONSOLE']
Console output on execution:
$ groovy logging-test.groovy
08:57:46.053 [main] INFO LoggingTest - Testing output - HttpBuilder-NG gives back a class org.apache.groovy.json.internal.LazyMap
08:57:46.056 [main] DEBUG LoggingTest - Validating debug works.

Hazelcast tracing via sleuth

I wonder if there is some integration of Sleuth with Hazelcast. In my application I have a Hazelcast queue with event listeners configured for addEntity events, and the problem is that the span seems to be broken once this listener triggers. I know that there is Sleuth integration for ExecutorService, but is there something similar for com.hazelcast.core.ItemListener? Thanks in advance.
UPD: Giving more details.
I have a sample service that uses both spring-cloud-sleuth and a Hazelcast queue:
package com.myapp;

import com.hazelcast.core.Hazelcast;
import com.hazelcast.core.HazelcastInstance;
import com.hazelcast.core.IQueue;
import com.hazelcast.core.ItemEvent;
import com.hazelcast.core.ItemListener;
import java.util.concurrent.Executors;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.cloud.sleuth.DefaultSpanNamer;
import org.springframework.cloud.sleuth.TraceRunnable;
import org.springframework.cloud.sleuth.Tracer;
import org.springframework.scheduling.annotation.Async;
import org.springframework.stereotype.Service;

@Service
@Slf4j
public class SomeService {

    private HazelcastInstance hazelcastInstance = Hazelcast.newHazelcastInstance();
    private IQueue<String> queue = hazelcastInstance.getQueue("someQueue");
    private Tracer tracing;

    @Autowired(required = false)
    public void setTracer(Tracer tracer) {
        this.tracing = tracer;
    }

    {
        queue.addItemListener(new ItemListener<String>() {
            @Override
            public void itemAdded(ItemEvent<String> item) {
                log.info("This is span");
                log.info("This is item " + item);
            }

            @Override
            public void itemRemoved(ItemEvent<String> item) {
            }
        }, true);
    }

    @Async
    public void processRequestAsync() {
        log.info("Processing async");
        log.info("This is span");
        Executors.newSingleThreadExecutor().execute(
                new TraceRunnable(tracing, new DefaultSpanNamer(), () -> log.info("Some Weird stuff")));
        queue.add("some stuff");
    }
}
and once I call processRequestAsync I receive the following output in the console:
INFO [-,792a6c3ad3e91280,792a6c3ad3e91280,false] 9996 --- [nio-8080-exec-2] com.myapp.SomeController : Incoming request!
INFO [-,792a6c3ad3e91280,792a6c3ad3e91280,false] 9996 --- [nio-8080-exec-2] com.myapp.SomeController : This is current span [Trace: 792a6c3ad3e91280, Span: 792a6c3ad3e91280, Parent: null, exportable:false]
INFO [-,792a6c3ad3e91280,7d0c06d3e24a7ba1,false] 9996 --- [cTaskExecutor-1] com.myapp.SomeService : Processing async
INFO [-,792a6c3ad3e91280,7d0c06d3e24a7ba1,false] 9996 --- [cTaskExecutor-1] com.myapp.SomeService : This is span
INFO [-,792a6c3ad3e91280,8a2f0a9028f44979,false] 9996 --- [pool-1-thread-1] com.myapp.SomeService : Some Weird stuff
INFO [-,792a6c3ad3e91280,7d0c06d3e24a7ba1,false] 9996 --- [cTaskExecutor-1] c.h.i.p.impl.PartitionStateManager : [10.236.31.22]:5701 [dev] [3.8.3] Initializing cluster partition table arrangement...
INFO [-,,,] 9996 --- [e_1_dev.event-4] com.myapp.SomeService : This is span
INFO [-,,,] 9996 --- [e_1_dev.event-4] com.myapp.SomeService : This is item ItemEvent{event=ADDED, item=some stuff, member=Member [10.236.31.22]:5701 - b830dbf0-0977-42a3-a15d-800872221c84 this}
So it looks like the span was broken once we enter the event listener code, and I wonder how I can propagate or create a new span inside the Hazelcast queue listener.
Sleuth (at the time of writing) does not support Hazelcast.
The solution is more general than just Hazelcast - you need to pass Zipkin's brave.Span between the client and server, but brave.Span is not serializable.
Zipkin provides a means by which to work around this.
Given a brave.Span on the client, you can convert it to a java.util.Map:
Span span = ...
Map<String, String> map = new HashMap<>();
tracing.propagation().injector(Map<String, String>::put).inject(span.context(), map);
On the server you can convert the java.util.Map back to a brave.Span:
Span span = tracer.toSpan(tracing.propagation().extractor(Map<String, String>::get).extract(map).context())
The use of java.util.Map can obviously be replaced as need be, but the principle is the same.
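Putting the two snippets together, here is a rough, self-contained sketch of the same principle (the class and method names are mine, and it assumes Brave's Tracing/Tracer API; adapt it to however the Map travels with the queue item):
import brave.Span;
import brave.Tracer;
import brave.Tracing;
import brave.propagation.TraceContext;
import brave.propagation.TraceContextOrSamplingFlags;
import java.util.HashMap;
import java.util.Map;

public class SpanCarrierSketch {

    // Producer side: flatten the current span's context into a serializable Map
    // that can be sent along with the queue item.
    static Map<String, String> inject(Tracing tracing, Span span) {
        Map<String, String> carrier = new HashMap<>();
        TraceContext.Injector<Map<String, String>> injector =
                tracing.propagation().injector(Map<String, String>::put);
        injector.inject(span.context(), carrier);
        return carrier;
    }

    // Consumer side (e.g. inside ItemListener.itemAdded): rebuild the context from the Map
    // and continue the trace with a new child span.
    static Span extract(Tracing tracing, Map<String, String> carrier) {
        Tracer tracer = tracing.tracer();
        TraceContextOrSamplingFlags extracted =
                tracing.propagation().extractor(Map<String, String>::get).extract(carrier);
        return tracer.nextSpan(extracted).name("hazelcast-item-added").start();
    }
}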
I can't get it to work for ItemListeners. I think we'd need to be able to wrap Hazelcast's StripedExecutor in something like a LazyTraceThreadPoolTaskExecutor (but one that accepts a plain Executor delegate instead of a ThreadPoolTaskExecutor).
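To illustrate that idea, here is a minimal sketch of such a wrapper (a hypothetical class of mine, reusing the Sleuth 1.x TraceRunnable API already shown in the question; how to actually hand it to Hazelcast's internals is the unsolved part):
import java.util.concurrent.Executor;
import org.springframework.cloud.sleuth.DefaultSpanNamer;
import org.springframework.cloud.sleuth.TraceRunnable;
import org.springframework.cloud.sleuth.Tracer;

// Hypothetical decorator: wraps every submitted task so the caller's span is
// carried over to the thread that eventually runs it.
public class TracingExecutor implements Executor {

    private final Tracer tracer;
    private final Executor delegate;

    public TracingExecutor(Tracer tracer, Executor delegate) {
        this.tracer = tracer;
        this.delegate = delegate;
    }

    @Override
    public void execute(Runnable command) {
        delegate.execute(new TraceRunnable(tracer, new DefaultSpanNamer(), command));
    }
}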
For EntryProcessors, I've hacked this together: a factory that creates EntryProcessors, passing in the current span from the thread that creates the processor. When the processor runs, it uses that span as the parent span in the executor thread.
@Component
public class SleuthedEntryProcessorFactory {

    private final Tracer tracer;

    public SleuthedEntryProcessorFactory(Tracer tracer) {
        this.tracer = tracer;
    }

    /**
     * Create an entry processor that will continue the Sleuth span of the thread
     * that invokes this method.
     * Mutate the given value as required. It will then be set on the entry.
     *
     * @param name name of the span
     * @param task task to perform on the map entry
     */
    public <K, V, R> SleuthedEntryProcessor<K, V, R> create(String name, Function<V, R> task) {
        return new SleuthedEntryProcessor<>(name, tracer.getCurrentSpan(), task);
    }
}
/**
 * Copies the MDC context (which contains Sleuth's trace ID, etc.) and the current span
 * from the thread that constructs this into the thread that runs this.
 *
 * @param <K> key type
 * @param <V> value type
 * @param <R> return type
 */
@SpringAware
public class SleuthedEntryProcessor<K, V, R> extends AbstractEntryProcessor<K, V> {

    private final Map<String, String> copyOfContextMap;
    private final String name;
    private final Span parentSpan;
    private final Function<V, R> task;
    private transient Tracer tracer;

    public SleuthedEntryProcessor(String name, Span parentSpan, Function<V, R> task) {
        this(name, parentSpan, task, true);
    }

    public SleuthedEntryProcessor(
            String name, Span parentSpan, Function<V, R> task, boolean applyOnBackup) {
        super(applyOnBackup);
        this.name = name + "Hz";
        this.parentSpan = parentSpan;
        this.task = task;
        copyOfContextMap = MDC.getCopyOfContextMap();
    }

    @Override
    public final R process(Map.Entry<K, V> entry) {
        if (nonNull(copyOfContextMap)) {
            MDC.setContextMap(copyOfContextMap);
        }
        Span span = tracer.createSpan(toLowerHyphen(name), parentSpan);
        try {
            V value = entry.getValue();
            // The task mutates the value.
            R result = task.apply(value);
            // Set the mutated value back onto the entry.
            entry.setValue(value);
            return result;
        } finally {
            MDC.clear();
            tracer.close(span);
        }
    }

    @Autowired
    public void setTracer(Tracer tracer) {
        this.tracer = tracer;
    }
}
Then pass the EntryProcessor to your IMap like this:
Function<V, R> process = ...;
SleuthedEntryProcessor<K, V, R> entryProcessor = sleuthedEntryProcessorFactory.create(label, process);
Map<K, R> results = iMap.executeOnEntries(entryProcessor);

Spark - Register model objects with Kryo - Caused by: java.lang.IllegalArgumentException: Class is not registered:

I am registering the classes that contain business logic, as well as the model classes, with Kryo in Spark. I get the exception below:
> Job aborted due to stage failure: Task 14 in stage 1.0 failed 4 times,
> most recent failure: Lost task 14.3 in stage 1.0 (TID 90, **):
> java.lang.IllegalArgumentException: Class is not registered: Object[]
> Note: To register this class use: kryo.register(Object[].class); at
> com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:442) at
> com.esotericsoftware.kryo.util.DefaultClassResolver.writeClass(DefaultClassResolver.java:79)
> at com.esotericsoftware.kryo.Kryo.writeClass(Kryo.java:472) at
> com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:565) at
> org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.scala:296)
> at
> org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:239)
> at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
>
> Driver stacktrace:
Kryo registrator:
public class KyroSerializer implements KryoRegistrator {

    @Override
    public void registerClasses(Kryo kryo) {
        kryo.register(People.class);
        kryo.register(Lookup.class);
    }
}
Model classes:
class people implements Serializable {
    private static final long serialVersionUID = 1L;
    ......
}

public class Lookup implements Serializable {
    private static final long serialVersionUID = 1L;
    private String code1;
    private String code2;
}
Finally, my Spark context:
sc.set("spark.kryo.registrator", KyroSerializer.class.getName())
From the exception, it seems that Kryo has no registration for Object[] (an array whose entries are of type Object). Please try to change your code as follows:
public class KyroSerializer implements KryoRegistrator {

    @Override
    public void registerClasses(Kryo kryo) {
        kryo.register(Object[].class); // add this line to your class
        kryo.register(People.class);
        kryo.register(Lookup.class);
    }
}
In addition, I would register your custom registrator class with Spark like this:
sc.set("spark.kryo.registrator", KyroSerializer.class.getCanonicalName());
If it already works for you as is, though, then just ignore this last remark.
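For completeness, a minimal sketch of how that wiring usually looks on the SparkConf in the Java API (the app name and the registrationRequired flag are my additions, not taken from the question):
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class KryoConfigSketch {

    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("kryo-registration-example")
                .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
                .set("spark.kryo.registrator", KyroSerializer.class.getCanonicalName())
                // Optional: fail fast whenever a class is serialized without being registered.
                .set("spark.kryo.registrationRequired", "true");

        try (JavaSparkContext sc = new JavaSparkContext(conf)) {
            // ... build RDDs and run the job here
        }
    }
}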

How to resolve groovy.lang.MissingMethodException?

How to resolve groovy.lang.MissingMethodException: No signature of method: methodMissing() is applicable for argument types: () values: []?
In my project I have two plugins, and I am getting this exception at start-up for one of them (all the functionality of this plugin works fine).
I get the exception on this line, on findAllByStatus:
def newItemList = Item.findAllByStatus(ItemStatus.NEW)
I have imported Item.groovy in the current service class. The service class is created at start-up when Quartz is starting; I'm not sure whether it is related to Quartz or not.
Item is a domain class.
class Item implements Serializable {

    ItemStatus status
    Date dateCreated
    Date lastUpdated

    def updateLastUpdated() {
        lastUpdated = new Date()
    }

    static hasMany = [itemProperties: ItemProperty]

    static mapping = {
        table 'xcomms_item'
        datasource 'xcomms'
    }

    static constraints = {
        batch nullable: true
    }

    @Override
    public int hashCode() {
        return 13 * id.hashCode();
    }

    @Override
    public boolean equals(Object obj) {
        if ((obj != null) && (obj instanceof Item) && (((Item) obj).id.equals(this.id))) {
            return true;
        }
        return false;
    }
}
The stack trace:
groovy.lang.MissingMethodException: No signature of method: xcomms.Item.methodMissing() is applicable for argument types: () values: []
at xcomms.CommunicationBatchProcessService.communicationProcesss(CommunicationBatchProcessService.groovy:53)
at xcomms.AutomatedCommunicationJob.execute(AutomatedCommunicationJob.groovy:16)
at grails.plugin.quartz2.GrailsArtefactJob.execute(GrailsArtefactJob.java:59)
at org.quartz.core.JobRunShell.run(JobRunShell.java:213)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:557)
2013-11-14 14:20:00,112 [QuartzJobCluster_Worker-2] ERROR quartz2.JobErrorLoggerListener - Exception thrown in job:xcomms.AutomatedCommunicationJob
org.quartz.JobExecutionException: xcomms.communication.exception.CommunicationProcessException: Error in processing communication batch [See nested exception: xcomms.communication.exception.CommunicationProcessException: Error in processing communication batch]
at grails.plugin.quartz2.GrailsArtefactJob.execute(GrailsArtefactJob.java:66)
at org.quartz.core.JobRunShell.run(JobRunShell.java:213)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:557)
Caused by: xcomms.communication.exception.CommunicationProcessException: Error in processing communication batch
at xcomms.AutomatedCommunicationJob.execute(AutomatedCommunicationJob.groovy:19)
at grails.plugin.quartz2.GrailsArtefactJob.execute(GrailsArtefactJob.java:59)
... 2 more
Caused by: groovy.lang.MissingMethodException: No signature of method: xcomms.Item.methodMissing() is applicable for argument types: () values: []
at xcomms.CommunicationBatchProcessService.communicationProcesss(CommunicationBatchProcessService.groovy:53)
at xcomms.AutomatedCommunicationJob.execute(AutomatedCommunicationJob.groovy:16)
... 3 more
ItemStatus is:
public enum ItemStatus {

    NEW(0, "New"), BATCHED(1, "Batched"), SENT(2, "Sent")

    final int id
    final String name

    private ItemStatus(int id, String name) { this.id = id; this.name = name; }

    static ItemStatus getById(int i) {
        for (entry in ItemStatus.values()) {
            if (entry.id == i)
                return entry
        }
    }
}
What I did to solve this problem was to postpone the start of the Quartz scheduler. As Ima said, there is a kind of race condition that makes GORM unavailable until the Grails app has completely started; if you try to use it before that, you get a MissingMethodException.
My solution involved disabling the Quartz scheduler's autoStartup (see https://github.com/9ci/grails-quartz2/blob/07ecde5baa59e20f99c05302c61137617c08fc81/src/groovy/grails/plugin/quartz2/QuartzFactoryBean.groovy#L61) and calling start() on it in BootStrap.groovy after all the initialization was done.
This is the config to prevent autoStartup:
grails {
    plugin {
        quartz2 {
            autoStartup = false
        }
    }
}
This way you don't have to give up using GORM, as Ima suggested.
Finally, I found the solution.
At start-up, Quartz loads the service class, but it cannot execute GORM commands at that point. So I changed the query to native SQL (select * from xcomms_item where status = 0), and now it works fine.
