Spring Integration - MongoDB inbound channel adapter not working

I have configured a MongoDB inbound channel adapter; however, it is not working as expected. I tried the outbound channel adapter and successfully wrote some content to the DB, so the connection appears fine, but data retrieval is not happening. I don't see anything in the log either. Can anyone point out what I am missing?
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:int="http://www.springframework.org/schema/integration"
xmlns:int-jms="http://www.springframework.org/schema/integration/jms"
xmlns:int-mongodb="http://www.springframework.org/schema/integration/mongodb"
xmlns:int-http="http://www.springframework.org/schema/integration/http"
xsi:schemaLocation="http://www.springframework.org/schema/integration http://www.springframework.org/schema/integration/spring-integration.xsd
http://www.springframework.org/schema/integration/jms http://www.springframework.org/schema/integration/jms/spring-integration-jms-2.0.xsd
http://www.springframework.org/schema/integration/mongodb http://www.springframework.org/schema/integration/mongodb/spring-integration-mongodb.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
http://www.springframework.org/schema/integration/http http://www.springframework.org/schema/integration/http/spring-integration-http.xsd">
<bean id="mongoDBFactory"
class="org.springframework.data.mongodb.core.SimpleMongoDbFactory">
<constructor-arg name="mongo">
<bean class="com.mongodb.Mongo">
<constructor-arg name="host" value="localhost" />
<constructor-arg name="port" value="27017" />
</bean>
</constructor-arg>
<constructor-arg name="databaseName" value="test" />
</bean>
<int-mongodb:inbound-channel-adapter
id="mongoInboundAdapter" channel="mongoChannel" expect-single-result="true"
query="{'_id' : '10'}" entity-class="com.test.si.prototype.model.Order"
collection-name="orders" auto-startup="false"
mongodb-factory="mongoDBFactory">
<int:poller fixed-rate="1000" />
</int-mongodb:inbound-channel-adapter>
<int:channel id="mongoChannel" />
<int:logging-channel-adapter id="logger"
auto-startup="true" log-full-message="true" level="INFO" channel="mongoChannel" />
<int:service-activator input-channel="mongoChannel"
ref="messageListenerImpl" method="processMessage" />
<bean id="messageListenerImpl" class="com.test.si.prototype.service.MessageListenerImpl"></bean>
</beans>

First of all, your <int-mongodb:inbound-channel-adapter> is marked with auto-startup="false", so it isn't going to poll data from the DB until you start() the mongoInboundAdapter manually.
On the other hand, you should understand that query="{'_id' : '10'}" isn't good, because it will retrieve only a single document from the DB, and only if that document has an id == 10.
Are you really sure that you have such a document in the orders collection?
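For example, a minimal sketch of the same adapter started automatically and with a query that matches every document in the collection (keep expect-single-result="true" only if the query can match at most one document):

<!-- auto-startup="true" so polling begins on context startup; "{}" matches all documents -->
<int-mongodb:inbound-channel-adapter id="mongoInboundAdapter"
        channel="mongoChannel"
        query="{}"
        entity-class="com.test.si.prototype.model.Order"
        collection-name="orders"
        auto-startup="true"
        mongodb-factory="mongoDBFactory">
    <int:poller fixed-rate="1000"/>
</int-mongodb:inbound-channel-adapter>

Alternatively, keep auto-startup="false" and call start() on the mongoInboundAdapter bean at runtime, e.g. through a control bus.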

Related

Spring Integration Kafka Configuration - Errors in Eclipse

I am using Eclipse as the IDE. I have a very basic config XML file that does not validate and hence prevents Eclipse from running anything. What am I missing?
Here are the validation errors (as I see them in the Problems view):
Here's my config XML:
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:int="http://www.springframework.org/schema/integration"
xmlns:int-kafka="http://www.springframework.org/schema/integration/kafka"
xmlns:task="http://www.springframework.org/schema/task"
xsi:schemaLocation="http://www.springframework.org/schema/integration/kafka http://www.springframework.org/schema/integration/kafka/spring-integration-kafka.xsd
http://www.springframework.org/schema/integration http://www.springframework.org/schema/integration/spring-integration.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
http://www.springframework.org/schema/task http://www.springframework.org/schema/task/spring-task.xsd">
<int:channel id="inputToKafka" />
<int-kafka:outbound-channel-adapter
id="kafkaOutboundChannelAdapter" kafka-template="template"
auto-startup="false" channel="inputToKafka" topic="replicated-topic-1"
message-key-expression="'bar'" partition-id-expression="2">
</int-kafka:outbound-channel-adapter>
<bean id="template" class="org.springframework.kafka.core.KafkaTemplate">
<constructor-arg>
<bean class="org.springframework.kafka.core.DefaultKafkaProducerFactory">
<constructor-arg>
<map>
<entry key="bootstrap.servers" value="192.168.33.21:9092,192.168.33.22:9092,192.168.33.23:9092" />
</map>
</constructor-arg>
</bean>
</constructor-arg>
</bean>
<int-kafka:message-driven-channel-adapter
id="kafkaListener"
listener-container="listenerContainer"
auto-startup="false"
phase="100"
send-timeout="5000"
channel="nullChannel"
error-channel="errorChannel" />
<bean id="listenerContainer" class="org.springframework.kafka.listener.KafkaMessageListenerContainer">
<constructor-arg>
<bean class="org.springframework.kafka.core.DefaultKafkaConsumerFactory">
<constructor-arg>
<map>
<entry key="bootstrap.servers" value="192.168.33.21:9092,192.168.33.22:9092,192.168.33.23:9092" />
</map>
</constructor-arg>
</bean>
</constructor-arg>
<constructor-arg name="topics" value="replicated-topic-1" />
</bean>
</beans>
If these are just bogus errors and the app runs OK, it simply means you are resolving to the online version of the spring-integration-core schema. See the IMPORTANT note at the top of that schema as to why it is not the current version.
You can resolve that by using a Spring-aware Eclipse (e.g. STS or the Spring IDE plugin) and setting the Spring nature on the project, so the schema is resolved properly from the classpath instead of from the internet.
Or you can go to the XML Catalog in the Eclipse preferences and configure the schema mapping to point to the 4.3 version of the schema.
If it's truly a runtime problem (the app won't run), then it means you have an incorrect version of spring-integration-core on the classpath; you should use Maven or Gradle to pull in the correct version transitively. If you are building the project classpath manually, you need spring-integration-core version 4.3.2 or later (the current version is 4.3.4).
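For example, a minimal sketch of the Maven dependency (4.3.4.RELEASE being the current version mentioned above):

<!-- pulls in the matching runtime classes, so the classpath schema resolves correctly -->
<dependency>
    <groupId>org.springframework.integration</groupId>
    <artifactId>spring-integration-core</artifactId>
    <version>4.3.4.RELEASE</version>
</dependency>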

Enabling logging in spring integration utility

Below I have the program that sends messages to and consumes messages from a queue; right now I have commented out the sending part and only want to consume messages from the queue.
Now I want to enable logging in the program below such that a log file is generated on my C: drive, and inside that log file it should indicate what message is being consumed and at what timestamp. Please advise how to configure logging in the configuration file below.
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:int="http://www.springframework.org/schema/integration"
xmlns:jms="http://www.springframework.org/schema/integration/jms"
xmlns:file="http://www.springframework.org/schema/integration/file"
xmlns:context="http://www.springframework.org/schema/context"
xsi:schemaLocation="http://www.springframework.org/schema/beans
http://www.springframework.org/schema/beans/spring-beans.xsd
http://www.springframework.org/schema/integration http://www.springframework.org/schema/integration/spring-integration.xsd
http://www.springframework.org/schema/integration/jms
http://www.springframework.org/schema/integration/jms/spring-integration-jms.xsd
http://www.springframework.org/schema/integration/spring-integration.xsd
http://www.springframework.org/schema/context
http://www.springframework.org/schema/integration/file
http://www.springframework.org/schema/integration/file/spring-integration-file.xsd
http://www.springframework.org/schema/context/spring-context.xsd">
<int:poller id="poller" default="true">
<int:interval-trigger interval="200" />
</int:poller>
<int:channel id="input">
<int:queue capacity="10" />
</int:channel>
<bean id="tibcoEMSJndiTemplate" class="org.springframework.jndi.JndiTemplate">
<property name="environment">
<props>
<prop key="java.naming.factory.initial">com.tibco.tibjms.naming.TibjmsInitialContextFactory
</prop>
<prop key="java.naming.provider.url">tcp://lsdrtems2.fm.crdgrp.net:7333</prop>
<prop key="java.naming.security.principal">acfgtir</prop>
<prop key="java.naming.security.credentials">acfgtir</prop>
</props>
</property>
</bean>
<bean id="tibcoEMSConnFactory" class="org.springframework.jndi.JndiObjectFactoryBean">
<property name="jndiTemplate">
<ref bean="tibcoEMSJndiTemplate" />
</property>
<property name="jndiName">
<value>GenericConnectionFactory</value>
</property>
</bean>
<bean id="tibcosendJMSTemplate" class="org.springframework.jms.core.JmsTemplate">
<property name="connectionFactory">
<ref bean="tibcoEMSConnFactory" />
</property>
<property name="defaultDestinationName">
<value>acfgtirrtyation.ioa.swretift_publish_poc1</value>
</property>
<property name="pubSubDomain">
<value>false</value>
</property>
<property name="receiveTimeout">
<value>120000</value>
</property>
</bean>
<!-- <jms:outbound-channel-adapter channel="input"
destination-name="acfgtirrtyation.ioa.swretift_publish_poc1" connection-factory="tibcoEMSConnFactory" /> -->
<int:channel id="objetChannel"></int:channel>
<int:channel id="StringChannel"></int:channel>
<int:channel id="jmsInChannel" />
<jms:message-driven-channel-adapter id="jmsIn" concurrent-consumers="10"
destination-name="acfgtirrtyation.ioa.swretift_publish_poc1" connection-factory="tibcoEMSConnFactory" extract-payload="false"
channel="jmsInChannel" />
<int:payload-type-router input-channel="jmsInChannel">
<int:mapping type="javax.jms.ObjectMessage" channel="objetChannel" />
<int:mapping type="javax.jms.TextMessage" channel="StringChannel" />
</int:payload-type-router>
<file:outbound-channel-adapter id="filesoutOject" channel="objetChannel" directory="C:\\abcsaral"
filename-generator="generatorr" />
<file:outbound-channel-adapter id="filesoutString" channel="StringChannel" directory="C:\\abcsaral"
filename-generator="generatorr" />
<bean id="generatorr" class="com.abs.tibco.TimestampTextGenerator">
</bean>
</beans>
Add log4j (or logback, or any Java logging system supported by commons-logging) to your classpath and configure it to log at DEBUG level for the category org.springframework.integration.
Or you can add a wire tap to the channel and route it to a logging channel adapter:
<int:channel id="in">
<int:interceptors>
<int:wire-tap channel="logger"/>
</int:interceptors>
</int:channel>
<int:logging-channel-adapter id="logger" level="DEBUG"/>

File getting deleted before processing

I have a Spring XD module with the Rabbit transport which pulls files from S3, splits them line by line, and deletes the file after processing (ExpressionAdvice). I have around 1 million messages (lines) in my file in S3. The file gets downloaded to the XD container box; I checked the md5sum and it is the same and has the same lines. Yet I see only some 260k messages arriving at the output channel, which is the processor, so I am losing around 740k messages. Sometimes it is random: once I saw all 1 million messages on my output channel, and sometimes only 250k. I am measuring this with a counter on my stream. The file is downloaded, but I suspect it is getting deleted within 10 seconds, before all records/lines are processed; my file size is around 700 MB. Please let me know if the expression advice is deleting the file before processing finishes.
module.aws-s3-source.count=1 and module.aws-s3-source.concurrency=70
stream1: aws-s3-source | processor | sink
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:int="http://www.springframework.org/schema/integration"
xmlns:context="http://www.springframework.org/schema/context"
xmlns:int-aws="http://www.springframework.org/schema/integration/aws"
xmlns:int-file="http://www.springframework.org/schema/integration/file"
xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
http://www.springframework.org/schema/integration/file
http://www.springframework.org/schema/integration/file/spring-integration-file.xsd
http://www.springframework.org/schema/context
http://www.springframework.org/schema/context/spring-context.xsd
http://www.springframework.org/schema/integration http://www.springframework.org/schema/integration/spring-integration.xsd
http://www.springframework.org/schema/integration/aws http://www.springframework.org/schema/integration/aws/spring-integration-aws-1.0.xsd">
<context:property-placeholder location="classpath*:test-${region}.properties" />
<int:poller fixed-delay="${fixedDelay}" default="true">
<int:advice-chain>
<ref bean="pollAdvise"/>
</int:advice-chain>
</int:poller>
<bean id="pollAdvise" class="org.springframework.integration.scheduling.PollSkipAdvice">
<constructor-arg ref="healthCheckStrategy"/>
</bean>
<bean id="healthCheckStrategy" class="test.ServiceHealthCheckPollSkipStrategy">
<property name="url" value="${url}"/>
<property name="doHealthCheck" value="${doHealthCheck}"/>
<property name="restTemplate" ref="restTemplate"/>
</bean>
<bean id="restTemplate"
class="org.springframework.web.client.RestTemplate">
<constructor-arg ref="requestFactory"/>
</bean>
<bean id="requestFactory"
class="test.BatchClientHttpRequestFactory">
<constructor-arg ref="verifier"/>
</bean>
<bean id="verifier"
class="test.NullHostnameVerifier">
</bean>
<bean id="encryptedDatum" class="test.EncryptedSecuredDatum"/>
<bean id="clientConfiguration" class="com.amazonaws.ClientConfiguration">
<property name="proxyHost" value="${proxyHost}"/>
<property name="proxyPort" value="${proxyPort}"/>
<property name="preemptiveBasicProxyAuth" value="false"/>
</bean>
<bean id="s3Operations" class="test.CustomC1AmazonS3Operations">
<constructor-arg index="0" ref="clientConfiguration"/>
<property name="awsEndpoint" value="s3.amazonaws.com"/>
<property name="temporaryDirectory" value="${temporaryDirectory}"/>
<property name="awsSecurityKey" value=""/>
</bean>
<bean id="credentials" class="org.springframework.integration.aws.core.BasicAWSCredentials">
</bean>
<int-aws:s3-inbound-channel-adapter aws-endpoint="s3.amazonaws.com"
bucket="${bucket}"
s3-operations="s3Operations"
credentials-ref="credentials"
file-name-wildcard="${fileNameWildcard}"
remote-directory="${prefix}"
channel="splitChannel"
local-directory="${localDirectory}"
accept-sub-folders="false"
delete-source-files="true"
archive-bucket="${archiveBucket}"
archive-directory="${archiveDirectory}">
</int-aws:s3-inbound-channel-adapter>
<int-file:splitter input-channel="splitChannel" output-channel="output" markers="false" charset="UTF-8">
<int-file:request-handler-advice-chain>
<bean class="org.springframework.integration.handler.advice.ExpressionEvaluatingRequestHandlerAdvice">
<property name="onSuccessExpression" value="payload.delete()"/>
</bean>
</int-file:request-handler-advice-chain>
</int-file:splitter>
<int:channel-interceptor pattern="*" order="3">
<bean class="org.springframework.integration.channel.interceptor.WireTap">
<constructor-arg ref="loggingChannel" />
</bean>
</int:channel-interceptor>
<int:logging-channel-adapter id="loggingChannel" log-full-message="true" level="INFO"/>
<int:channel id="output"/>
</beans>
Update 2:
My stream is like below:
aws-s3-source|processor|http-client| processor> queue:testQueue
1) Now I split the stream like below:
aws-s3-source > queue:s3Queue
I was able to read all my 1 million messages very fast.
2) Then I added one more stream like below, and I see the issue again: the s3 source stops pulling the file and messages are lost every time.
queue:s3Queue > processor|http-client| processor> queue:testQueue
3) The observation is that when I add http-client this issue happens again, i.e. some messages from the input source go missing.
4) Then I split the file into five 125 MB files instead of one 660 MB file, i.e. 200k records per file, and I don't see the issue; I get all my messages.
I also see a lot of messages clogging the queue before http-client.
I am wondering whether this is something to do with memory or threading inside XD?
Please let me know if the expression advice is deleting the file before processing finishes.
No; the advice is an around advice around the message handler; it can't execute (evaluate the expression) until the splitter has emitted all the lines.
Is it possible the file is pulled from S3 before it's completely written?
To debug this problem, I would suggest changing the advice to send the file to another subflow and doing some analysis/logging there before deleting.
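A sketch of that suggestion, keeping the same advice but routing the evaluation result to a separate channel instead of deleting inline; the fileAuditChannel channel and fileAuditor bean are hypothetical names:

<int-file:request-handler-advice-chain>
    <bean class="org.springframework.integration.handler.advice.ExpressionEvaluatingRequestHandlerAdvice">
        <!-- evaluate to the File itself and forward it, rather than calling payload.delete() here -->
        <property name="onSuccessExpression" value="payload"/>
        <property name="successChannel" ref="fileAuditChannel"/>
    </bean>
</int-file:request-handler-advice-chain>

<int:channel id="fileAuditChannel"/>

<!-- hypothetical bean that counts/logs the lines and only then deletes the file -->
<int:service-activator input-channel="fileAuditChannel" ref="fileAuditor" method="auditAndDelete"/>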

consuming all the object type messages from a queue

I have the configuration below, which connects to a particular queue on JMS, consumes messages from that queue, and writes them to a file.
Now the issue is that the messages on the queue can be of object type or of string type, and I want to consume only the object type messages.
So, for example, below is the body of a message which is of object type; the message header value is:
ObjectMessage={ Header={ JMSMessageID={ID:LON_TEST_GAWE_4533.351656B16070206DEBAE:1936} JMSDestination={Queue[erty.retry.object]} JMSReplyTo={null} JMSDeliveryMode={PERSISTENT} JMSRedelivered={false} JMSCorrelationID={null} JMSType={null} JMSTimestamp={Fri Feb 26 11:52:53 IST 2016} JMSExpiration={0} JMSPriority={4} } Properties={ } Object={?} }
As you can see above, for an object message the initial text in the message headers begins with ObjectMessage={ Header={ JMSMessageID={ID:LON.
So please advise: how can I consume all object type messages? Is there any way I can identify them and store them in a file?
Below is my configuration right now:
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:int="http://www.springframework.org/schema/integration"
xmlns:jms="http://www.springframework.org/schema/integration/jms"
xmlns:file="http://www.springframework.org/schema/integration/file"
xmlns:context="http://www.springframework.org/schema/context"
xsi:schemaLocation="http://www.springframework.org/schema/beans
http://www.springframework.org/schema/beans/spring-beans.xsd
http://www.springframework.org/schema/integration http://www.springframework.org/schema/integration/spring-integration.xsd
http://www.springframework.org/schema/integration/jms
http://www.springframework.org/schema/integration/jms/spring-integration-jms.xsd
http://www.springframework.org/schema/integration/spring-integration.xsd
http://www.springframework.org/schema/context
http://www.springframework.org/schema/integration/file
http://www.springframework.org/schema/integration/file/spring-integration-file.xsd
http://www.springframework.org/schema/context/spring-context.xsd">
<int:poller id="poller" default="true">
<int:interval-trigger interval="200" />
</int:poller>
<int:channel id="input">
<int:queue capacity="10" />
</int:channel>
<bean id="tibcoEMSJndiTemplate" class="org.springframework.jndi.JndiTemplate">
<property name="environment">
<props>
<prop key="java.naming.factory.initial">com.tibco.tibjms.naming.TibjmsInitialContextFactory
</prop>
<prop key="java.naming.provider.url">tcp://wert2.fm.absgrp.net:3453</prop>
<prop key="java.naming.security.principal">aert</prop>
<prop key="java.naming.security.credentials">aert</prop>
</props>
</property>
</bean>
<bean id="tibcoEMSConnFactory" class="org.springframework.jndi.JndiObjectFactoryBean">
<property name="jndiTemplate">
<ref bean="tibcoEMSJndiTemplate" />
</property>
<property name="jndiName">
<value>GenericConnectionFactory</value>
</property>
</bean>
<bean id="tibcosendJMSTemplate" class="org.springframework.jms.core.JmsTemplate">
<property name="connectionFactory">
<ref bean="tibcoEMSConnFactory" />
</property>
<property name="defaultDestinationName">
<value>erty.retry.object</value>
</property>
<property name="pubSubDomain">
<value>false</value>
</property>
<property name="receiveTimeout">
<value>120000</value>
</property>
</bean>
<!-- <jms:outbound-channel-adapter channel="input"
destination-name="erty.retry.object" connection-factory="tibcoEMSConnFactory" /> -->
<jms:message-driven-channel-adapter id="jmsIn" concurrent-consumers="10"
destination-name="erty.retry.object" connection-factory="tibcoEMSConnFactory" extract-payload="false"
channel="jmsInChannel" />
<int:channel id="jmsInChannel" />
<file:outbound-channel-adapter id="filesout" channel="jmsInChannel" directory="C:\\dfgal"
filename-generator="generatorr" />
<bean id="generatorr" class="com.rbs.tibco.TimestampTextGenerator">
</bean>
<int:payload-type-router input-channel="jmsInChannel"></int:payload-type-router>
<bean id="generatorr" class="com.rbs.tibco.TimestampTextGenerator">
</bean>
</beans>
It seems to me that I have seen a similar question here on SO; I won't search for it to be sure whether it was from you and whether there was an answer.
Please be sure to use the search before asking unclear questions.
First of all, your solution looks weird from the architecture perspective.
Even if we can do something like that, JMS isn't flexible enough to be partitioned the way Kafka can be.
I mean that it isn't convenient for a consumer to read different message types from the same queue. The main problem is that consumers read ALL messages from the queue. I'm not sure that just filtering those text messages and dropping them is a good solution for your system.
Anyway, you can use extract-payload="false" on the <jms:message-driven-channel-adapter>, meaning the whole JMS Message will become the Spring Integration Message payload. After that you can use a <payload-type-router> to distinguish an ObjectMessage from a TextMessage and send them to different channels: the first one to store in the file, the other one to something else.
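For example, a sketch along those lines (the channel names are illustrative); only ObjectMessages reach the file adapter, while TextMessages go elsewhere:

<jms:message-driven-channel-adapter id="jmsIn" concurrent-consumers="10"
        destination-name="erty.retry.object"
        connection-factory="tibcoEMSConnFactory"
        extract-payload="false"
        channel="jmsInChannel"/>

<int:payload-type-router input-channel="jmsInChannel">
    <int:mapping type="javax.jms.ObjectMessage" channel="objectChannel"/>
    <int:mapping type="javax.jms.TextMessage" channel="textChannel"/>
</int:payload-type-router>

<!-- only the ObjectMessages end up in the file -->
<file:outbound-channel-adapter id="filesout" channel="objectChannel"
        directory="C:\\dfgal" filename-generator="generatorr"/>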
Hope I am clear.

health check before processing file stream in xd

I am pulling files from S3 and processing them using Spring XD. I have one http-client processor component where I make a RESTful request. Now the problem with this approach is that if my web service is down, the files accumulate in the RabbitMQ transport. Hence, before pulling an individual file from S3, I want to do a health check on my REST service. How can I tackle this? My configuration file looks something like this:
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:int="http://www.springframework.org/schema/integration"
xmlns:int-aws="http://www.springframework.org/schema/integration/aws"
xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
http://www.springframework.org/schema/integration http://www.springframework.org/schema/integration/spring-integration.xsd
http://www.springframework.org/schema/integration/aws http://www.springframework.org/schema/integration/aws/spring-integration-aws-1.0.xsd">
<int:poller fixed-delay="${fixed-delay}" default="true"/>
<bean id="credentials" class="org.springframework.integration.aws.core.BasicAWSCredentials">
<property name="accessKey" value="${accessKey}"/>
<property name="secretKey" value="${secretKey}"/>
</bean>
<bean
class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer">
<property name="location">
<value>dms-aws-s3-nonprod.properties</value>
</property>
</bean>
<bean id="clientConfiguration" class="com.amazonaws.ClientConfiguration">
<property name="proxyHost" value="${proxyHost}"/>
<property name="proxyPort" value="${proxyPort}"/>
<property name="preemptiveBasicProxyAuth" value="false"/>
</bean>
<bean id="s3Operations" class="org.springframework.integration.aws.s3.core.CustomC1AmazonS3Operations">
<constructor-arg index="0" ref="credentials"/>
<constructor-arg index="1" ref="clientConfiguration"/>
<property name="awsEndpoint" value="s3.amazonaws.com"/>
<property name="temporaryDirectory" value="${temporaryDirectory}"/>
<property name="awsSecurityKey" value="${awsSecurityKey}"/>
</bean>
<!-- aws-endpoint="https://s3.amazonaws.com" -->
<int-aws:s3-inbound-channel-adapter aws-endpoint="s3.amazonaws.com"
bucket="${bucket}"
s3-operations="s3Operations"
credentials-ref="credentials"
file-name-wildcard="${file-name-wildcard}"
remote-directory="${remote-directory}"
channel="splitChannel"
local-directory="${local-directory}"
accept-sub-folders="false"
delete-source-files="true"
archive-bucket="${archive-bucket}"
archive-directory="${archive-directory}">
</int-aws:s3-inbound-channel-adapter>
int-file:splitter input-channel="splitChannel" output-channel="output" markers="true"/>
<int:channel id="output"/>
My stream definition:
xd-shell> stream create feedTest16 --definition "aws-s3-source |processor-http-client| log" --deploy
Starting with Spring Integration 4.1, the PollSkipAdvice has been introduced.
Implement your own ServiceHealthCheckPollSkipStrategy, inject it into the <advice-chain> of the <poller> for your <int-aws:s3-inbound-channel-adapter>, and you're all set with the requirement!
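A sketch of the wiring, mirroring the advice-chain already shown in the file-deletion question above; ServiceHealthCheckPollSkipStrategy is your own class implementing PollSkipStrategy (return true from skipPoll() while the REST service is down):

<int:poller fixed-delay="${fixed-delay}" default="true">
    <int:advice-chain>
        <ref bean="pollAdvise"/>
    </int:advice-chain>
</int:poller>

<bean id="pollAdvise" class="org.springframework.integration.scheduling.PollSkipAdvice">
    <constructor-arg ref="healthCheckStrategy"/>
</bean>

<!-- custom strategy that checks the REST service before each poll -->
<bean id="healthCheckStrategy" class="test.ServiceHealthCheckPollSkipStrategy">
    <property name="url" value="${url}"/>
</bean>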
The only remaining issue is that your s3-source is tied to the target service of the http-client...
