This is regarding an issue I am facing while testing a service activator. Technologies used: Spring Integration with Redis, using RedisQueueOutboundChannelAdapter and RedisQueueMessageDrivenEndpoint.
The application code can be found at https://github.com/SRekha-LV/SISamples/tree/master/SpringRedisTestEg
Flow: HomeController sends asynchronous messages to the service activator method processQueue1Details() in the class ProcessQueue1Messages.
The intention is to test the number of times processQueue1Details() is called. The sample test case is in the test class TestHitCount.java, method testProcessQueue1().
Executing this gives the following exception: "Wanted but not invoked:
processQueue1Messages.processQueue1Details(
GenericMessage [payload=TEst, headers={timestamp=1457436349427, id=d1b622d4-43b1-dfc4-e114-2ae700cbdb6c}]
);
-> at com.spring.mvc.redis.TestHitCount.testProcessQueue1(TestHitCount.java:42)
Actually, there were zero interactions with this mock."
Need some help.
Thanks in advance for all the support.
Your application is quite complex and your question is rather messy. That's probably why someone has down-voted it already.
I really can't follow your configs and test, but I have two pieces of advice:
In Spring Integration you should use @ServiceActivator on MessageHandler @Beans, so that they can consume from a channel and handle the message, e.g. send it to a Redis list like all your RedisQueueOutboundChannelAdapters do.
If you test an async scenario, you have to wait for the result on the other side before invoking .verify(). In your case you send to Redis and exit, so there really is no interaction with your mock yet.
Let's take a closer look at your code!
@Test
public void testProcessQueue1() {
    ProcessQueue1Messages pQueue1Msgs = Mockito.mock(ProcessQueue1Messages.class);
    Object strObj = "TEst";
    Message<Object> message = MessageBuilder.withPayload(strObj).build();
    rulesRQOutboundChannelAdapter.handleMessage(message);
    Mockito.verify(pQueue1Msgs).processQueue1Details(message);
}
You create the mock in the test and never register it anywhere. How should your Spring application interact with the mock if it is just a local variable?
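As a rough sketch, assuming the project can use Spring Boot's @MockBean (otherwise expose the mock as a @Bean in a dedicated test configuration); the AppConfig class name is a placeholder, and Mockito.timeout() is used because the Redis queue endpoint delivers the message asynchronously:
@RunWith(SpringRunner.class)
@ContextConfiguration(classes = AppConfig.class) // hypothetical test context configuration
public class TestHitCount {

    // Replaces the real service activator bean in the context with a Mockito mock
    @MockBean
    private ProcessQueue1Messages processQueue1Messages;

    @Autowired
    private RedisQueueOutboundChannelAdapter rulesRQOutboundChannelAdapter;

    @Test
    public void testProcessQueue1() {
        Message<Object> message = MessageBuilder.withPayload((Object) "TEst").build();
        rulesRQOutboundChannelAdapter.handleMessage(message);

        // The RedisQueueMessageDrivenEndpoint consumes asynchronously, so wait before verifying;
        // match on any Message, because the received headers will differ from the sent ones
        Mockito.verify(processQueue1Messages, Mockito.timeout(5000))
               .processQueue1Details(Mockito.any(Message.class));
    }
}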
I am trying to test a Spring Integration flow that starts off from a message-driven-channel-adapter configured as:
<int-jms:message-driven-channel-adapter id="myAdapter" ... />
My test goes like:
@SpringJUnitConfig(locations = {"my-app-context.xml"})
@SpringIntegrationTest(noAutoStartup = {"myAdapter"})
public class MyIntegrationFlowTest {

    @Autowired
    private MockIntegrationContext mockIntegrationContext;

    @Test
    public void myTest() {
        ...
        MessageSource<String> messageSource = () -> new GenericMessage<>("foo");
        mockIntegrationContext.substituteMessageSourceFor("myAdapter", messageSource);
        ...
    }
}
I am however getting the following error:
org.springframework.beans.factory.BeanNotOfRequiredTypeException: Bean named 'myAdapter' is expected to be of type 'org.springframework.integration.endpoint.SourcePollingChannelAdapter' but was actually of type 'org.springframework.integration.jms.JmsMessageDrivenEndpoint'
How should one specify an alternate source for the channel adapter for testing using the MockIntegrationContext, or by some other method?
The message-driven channel adapter is really not a source polling channel adapter. So, substituteMessageSourceFor() indeed cannot be used for that type of component, which is essentially a MessageProducerSupport implementation, not a SourcePollingChannelAdapter wrapping a MessageSource.
The difference exists because not all protocols provide listener-like hooks to which some self-managed task can subscribe. A good example is JDBC, which is a purely passive system expecting requests. Therefore a polling channel adapter with a JdbcPollingChannelAdapter (which is a MessageSource) implementation must be used to interact with the DB in an event-driven manner.
Other systems (like JMS in your case) provide a listener (or consumer) API for which we can spawn a long-lived task (see MessageListenerContainer in spring-jms) and let the MessageProducerSupport emit messages to the channel.
Therefore you need to distinguish for yourself what type of component you are interacting with before choosing a testing strategy.
Since there is no extra layer in the case of a message-driven channel adapter, but rather a specific, self-managed MessageProducerSupport implementation, we don't provide a particular mocking API for it; none is needed beyond standard unit-testing features and the message channel this endpoint produces to in the configuration.
So, the solution for you is something like:
@SpringIntegrationTest(noAutoStartup = {"myAdapter"}) - that's fully correct in your code: we really have to stop the real channel adapter so it doesn't pollute our testing environment.
You just need to inject into your test class the MessageChannel that id="myAdapter" produces to. In your test code you simply build a Message and send it to this channel. No need to worry about a MockIntegrationContext at all.
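A minimal sketch of that idea (the channel id myChannel is an assumption; use whatever channel attribute your adapter actually declares in my-app-context.xml):
@SpringJUnitConfig(locations = {"my-app-context.xml"})
@SpringIntegrationTest(noAutoStartup = {"myAdapter"})
public class MyIntegrationFlowTest {

    // The channel the stopped message-driven adapter would normally produce to
    @Autowired
    @Qualifier("myChannel") // hypothetical channel id
    private MessageChannel myChannel;

    @Test
    public void myTest() {
        // Emulate what the JMS adapter would have done
        myChannel.send(new GenericMessage<>("foo"));
        // ...assert on the downstream result of the flow
    }
}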
I have a process where I need to upload a file to an SFTP server asynchronously. After exploring Async support in gateways, I found that I need to have an error channel defined as a @MessagingGateway parameter and then a handler to handle the exception propagated to the error channel, but handling it this way felt complicated to me, as I will have to update a POJO field and persist it into the DB depending on whether the file upload succeeds or fails.
So I thought of having a custom method annotated with @Async that calls the gateway method, and surrounding the gateway call with a try block to catch any exception that occurs in the downstream flow.
Code Sample:
@Async
void upload(Resource file, FileStatus fileStatus) {
    try {
        uploadGateway.upload(file, fileStatus.getFilePath(), fileStatus.getFileName());
    } catch (RuntimeException e) {
        fileStatus.setUploadStatus("Failed");
        // save into db
    }
}
Upload gateway without an error channel, so that the error is sent back to the caller:
@MessagingGateway
public interface UploadGateway {

    @Gateway(requestChannel = "input.channel")
    void upload(@Payload Resource file, @Header("path") String path, @Header("name") String fileName);
}
Handler:
@Bean
public IntegrationFlow uploadDocument() {
    return IntegrationFlows.from("input.channel")
            .log(LoggingHandler.Level.WARN)
            .handle(Sftp.outboundAdapter(sftpSessionFactory(), FileExistsMode.FAIL)
                    .autoCreateDirectory(true)
                    .remoteDirectoryExpression("headers['path']")
                    .fileNameExpression("headers['name']"))
            .get();
}
Question:
What will be the consequences if I'm handling error this way? Is this the right way to handle any error occurred in downstream flow?
Since a @MessagingGateway is like an RPC in messaging, it is fully OK to catch an exception on its method call like that. Since your flow is fully synchronous, it works like the typical Java exception sub-system.
Your concern about async error handling with an errorChannel really makes sense: it is similar in complexity to standard Java async method handling and its error processing.
On the other hand, it is really recommended to handle errors downstream via an errorChannel when there is going to be some complex logic in some other flow, or when you are going to return some compensation message.
However, at the end of the day the choice is yours: there are no drawbacks to handling errors yourself.
See the Error Handling chapter for more food for thought.
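For comparison, a rough sketch of the errorChannel approach mentioned above (the channel name uploadErrorChannel and the error flow are illustrative assumptions, not part of the original code):
@MessagingGateway(errorChannel = "uploadErrorChannel")
public interface UploadGateway {

    @Gateway(requestChannel = "input.channel")
    void upload(@Payload Resource file, @Header("path") String path, @Header("name") String fileName);
}

@Bean
public IntegrationFlow uploadErrorFlow() {
    // The ErrorMessage payload is a MessagingException carrying the failed request message
    return IntegrationFlows.from("uploadErrorChannel")
            .handle(message -> {
                // update the status and persist it here, instead of a try/catch in the caller
            })
            .get();
}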
I have a quarkus application with an async endpoint that creates an entity with default properties, starts a new thread within the request method to execute a long-running job, and then returns the entity as a response for the client to track.
@POST
@Transactional
public Response startJob(@NonNull JsonObject request) {
    // create my entity
    JobsRecord job = new JobsRecord();
    // set default properties
    job.setName(request.getString("name"));
    // make persistent
    jobsRepository.persist(job);
    // start the long running job on a different thread (executor is an injected Executor)
    executor.execute(() -> longRunning(job));
    return Response.accepted().entity(job).build();
}
Additionally, the long running job will make updates to the entity as it runs and so it must also be transactional. However, the database entity just doesn't get updated.
These are the issues I am facing:
I get the following warnings:
ARJUNA012094: Commit of action id 0:ffffc0a80065:f2db:5ef4e1c7:0 invoked while multiple threads active within it.
ARJUNA012107: CheckedAction::check - atomic action 0:ffffc0a80065:f2db:5ef4e1c7:0 commiting with 2 threads active!
Seems like something that should be avoided.
I tried using @Transactional(value = TxType.REQUIRES_NEW) to no avail.
I tried using the API approach instead of the @Transactional approach on longRunning, as mentioned in the guide, as follows:
@Inject UserTransaction transaction;
.
.
.
try {
    transaction.begin();
    jobsRecord.setStatus("Complete");
    jobsRecord.setCompletedOn(new Timestamp(System.currentTimeMillis()));
    transaction.commit();
} catch (Exception e) {
    e.printStackTrace();
    transaction.rollback();
}
but then I get the errors: ARJUNA016051: thread is already associated with a transaction! and ARJUNA016079: Transaction rollback status is:ActionStatus.COMMITTED
I tried both the declarative and API-based methods again, this time with context propagation enabled, but still no luck.
Finally, based on the third approach, I thought keeping @Transactional on the HTTP request handler and leaving longRunning as is, without declarative or API-based transaction approaches, would work. However, the database still does not get updated.
Clearly I am misunderstanding how JTA and context propagation works (among other things).
Is there a way (or even a design pattern) that allows me to update database entities asynchronously in a quarkus web application? Also why wouldn't any of the approaches I took have any effect?
Using quarkus 1.4.1.Final with ext: [agroal, cdi, flyway, hibernate-orm, hibernate-orm-panache, hibernate-validator, kubernetes-client, mutiny, narayana-jta, rest-client, resteasy, resteasy-jackson, resteasy-mutiny, smallrye-context-propagation, smallrye-health, smallrye-openapi, swagger-ui]
You should return an async type from your JAX-RS resource method; the transaction context will then be available when the async stage executes. There is some relevant documentation in the Quarkus guide on context propagation.
I would start by looking at one of the reactive examples, such as the getting started quickstart. Try annotating each resource endpoint with @Transactional and the async code will run with a transaction context.
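A rough sketch of that suggestion, assuming the smallrye-context-propagation extension is used to propagate context to the async stage (the ManagedExecutor injection and the exact method shape are my assumptions, not taken from the original code):
@Inject
ManagedExecutor managedExecutor; // context-propagating executor from MicroProfile Context Propagation

@POST
@Transactional
public CompletionStage<Response> startJob(@NonNull JsonObject request) {
    JobsRecord job = new JobsRecord();
    job.setName(request.getString("name"));
    jobsRepository.persist(job);

    // Run the long-running work on the managed executor instead of a raw thread,
    // and return an async type so the context is available when the stage executes
    return managedExecutor
            .runAsync(() -> longRunning(job))
            .thenApply(ignored -> Response.accepted().entity(job).build());
}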
I have set up a simple Spring Integration flow which is composed of these steps:
poll a rest api periodically then
do some processing on the payload
and land it on a Kafka topic.
Please observe the code below:
@Component
public class MyIntegrationFlow extends IntegrationFlowAdapter {

    @Override
    protected IntegrationFlowDefinition<?> buildFlow() {
        return from(() -> List.of("pathVariable1", "pathVariable2"), c -> c.poller(Pollers.fixedDelay(5, TimeUnit.SECONDS)))
                .split()
                .handle(httpRequest(), c -> c.advice(new RequestHandlerRetryAdvice()))
                .transform(Transformers.fromJson(Foo.class))
                .filter(payload -> payload.isValid())
                .log()
                .transform(Transformers.toJson())
                .channel(Source.OUTPUT); // output channel for kafka topic
    }

    private HttpMessageHandlerSpec httpRequest() {
        return Http.outboundGateway("http://somehost:8080/{pathVariable}")
                .httpMethod(GET)
                .uriVariable("pathVariable", Message::getPayload)
                .expectedResponseType(String.class);
    }
}
This works brilliantly; however, I am struggling to come up with good tests.
How am I supposed to mock the external REST API?
How am I supposed to test that the retry policy does kick in and the desired number of http requests are made?
How do I change the MessageSource of the flow (list of path vars) that is polled periodically?
How do I check if the payload has successfully made it to the Kafka topic?
Too many questions, and some of them require too broad an explanation. Anyway, I think you can start from the Spring Integration Testing Framework and its documentation.
How am I supposed to mock the external REST API?
I think you can just consider using MockMvc from the Spring Framework and its MockMvcClientHttpRequestFactory to inject into the HttpRequestExecutingMessageHandler that is built from the HttpMessageHandlerSpec.
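Roughly, that could look like the following (the FakeRemoteController stub and the way the RestTemplate is wired into a test configuration are assumptions; the key point is that Http.outboundGateway() also accepts a RestTemplate):
// Build a RestTemplate that routes requests into MockMvc instead of a real HTTP server
MockMvc mockMvc = MockMvcBuilders.standaloneSetup(new FakeRemoteController()).build(); // hypothetical stub controller
RestTemplate testRestTemplate = new RestTemplate(new MockMvcClientHttpRequestFactory(mockMvc));

// Use that RestTemplate when building the gateway in the test configuration
HttpMessageHandlerSpec httpRequest =
        Http.outboundGateway("http://somehost:8080/{pathVariable}", testRestTemplate)
            .httpMethod(HttpMethod.GET)
            .uriVariable("pathVariable", Message::getPayload)
            .expectedResponseType(String.class);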
retry policy does kick
Well, I guess the same mocked MVC endpoint can verify how many times it has been called, and can fail for the first several calls to initiate that retry.
How do I change the MessageSource
This is exactly part of the Spring Integration Testing Framework, with its MockIntegration.mockMessageSource() and MockIntegrationContext: https://docs.spring.io/spring-integration/docs/5.1.6.RELEASE/reference/html/#mockintegration
made it to the Kafka topic?
Use the mentioned MockIntegration.mockMessageHandler() to verify that the endpoint for Kafka is called, or use an Embedded Kafka from the Spring Kafka project: https://docs.spring.io/spring-kafka/docs/2.2.7.RELEASE/reference/html/#embedded-kafka-annotation
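For the MessageSource and Kafka-output questions, a sketch along the lines of the referenced documentation (the endpoint ids below are hypothetical; use the actual ids from your application context):
@SpringIntegrationTest
public class MyIntegrationFlowTest {

    @Autowired
    private MockIntegrationContext mockIntegrationContext;

    @Test
    public void testFlow() {
        // Replace the polled list of path variables with a controlled source
        mockIntegrationContext.substituteMessageSourceFor("pollingAdapterEndpointId", // hypothetical endpoint id
                MockIntegration.mockMessageSource("pathVariable1"));

        // Capture what would have gone to the Kafka output endpoint instead of sending it
        ArgumentCaptor<Message<?>> captor = MockIntegration.messageArgumentCaptor();
        mockIntegrationContext.substituteMessageHandlerFor("kafkaOutboundEndpointId", // hypothetical endpoint id
                MockIntegration.mockMessageHandler(captor).handleNext(m -> { }));

        // ...trigger the flow, then assert on captor.getAllValues()
    }
}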
I did one simple DSL flow which retrieves data from the database and does a simple conversion in the service activator.
@Bean
public IntegrationFlow mainFlow() {
    return IntegrationFlows.from("userChannel")
            .channel("queryChannel")
            .handle("sampleConvertor", "convertUser")
            .get();
}
queryChannel is a JDBC outbound gateway and sampleConvertor is the service activator.
<int-jdbc:outbound-gateway query="select * from employee where employee_id=:payload"
request-channel="queryChannel" data-source="dataSource"/>
The issue is that after retrieving the data from the database, the flow does not go to the service activator and simply returns the database response.
In the XML configuration, I used to invoke the gateway inside a chain like below:
<int:gateway id="query.gateway" request-channel="queryChannel"/>
Please suggest what I am doing wrong here. Thanks in advance.
It's a bit unusual to combine the Java DSL and XML configuration, but they still work together.
Your problem, I think, is that you are missing the fact that your queryChannel has two subscribers at runtime, not a chain of calls.
The first one is the <int-jdbc:outbound-gateway> and the second is that .handle("sampleConvertor","convertUser"). When you declare a channel in the IntegrationFlow, the next EIP method produces a subscriber for this channel. At the same time, when you use a channel as request-channel or input-channel in the XML configuration, that brings a subscriber as well.
So, you have two subscribers on the DirectChannel with the RoundRobinLoadBalancingStrategy, and therefore only one of them will handle any given message. If it is a request-reply component, like that <int-jdbc:outbound-gateway>, it will produce a message to its output-channel or to the replyChannel from the headers. In your case the story is exactly about the replyChannel, and therefore you don't reach the .handle("sampleConvertor","convertUser"): it's not the next step in a chain, but just a parallel universe chosen by the round-robin algorithm.
If you really would like to reach that .handle("sampleConvertor","convertUser") after calling the <int-jdbc:outbound-gateway>, you should consider using .gateway("queryChannel") instead of that .channel().
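Roughly, the corrected flow would look like this (a sketch based on the advice above, keeping the original bean names):
@Bean
public IntegrationFlow mainFlow() {
    return IntegrationFlows.from("userChannel")
            // send the message to queryChannel and wait for the JDBC gateway's reply,
            // then continue with the next step in this flow
            .gateway("queryChannel")
            .handle("sampleConvertor", "convertUser")
            .get();
}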