In Spring LDAP, what happens when a pooled connection fails to successfully connect?

I am using Spring LDAP (2.0.2.RELEASE) to interact with our AD environment. I have integrated pooling within my applicationContext.xml.
In the Java LDAP docs (section 3.4), it states:
If the LDAP provider cannot establish a connection within that period, it aborts the connection attempt
My question is: does Spring handle a retry for this connection, or is an error thrown? I know Spring utilizes many of the underlying JVM LDAP features, but I have yet to find anything specific in this area.
Pertinent pieces of my applicationContext:
<bean id="dirContextValidator" class="org.springframework.ldap.pool.validation.DefaultDirContextValidator" />
<bean id="exampleConnectionDetails" class="org.springframework.ldap.core.support.LdapContextSource" scope="singleton">
<property name="url" value="ldaps://ldap.example.com:636" />
<property name="userDn" value="CN=LDAP_User,DC=example,DC=com" />
<property name="password" value="superSecretPwd" />
<property name="pooled" value="false"/>
<property name="referral" value="follow"/>
</bean>
<bean id="exampleContextSource" class="org.springframework.ldap.pool.factory.PoolingContextSource">
<property name="contextSource" ref="exampleConnectionDetails" />
<property name="dirContextValidator" ref="dirContextValidator" />
<property name="testOnBorrow" value="true" />
<property name="testWhileIdle" value="true" />
<property name="maxWait" value="10000" />
<property name="whenExhaustedAction" value="0" />
<property name="minIdle" value="5" />
<property name="maxIdle" value="10" />
<property name="timeBetweenEvictionRunsMillis" value="15000" />
<property name="minEvictableIdleTimeMillis" value="30000" />
<property name="numTestsPerEvictionRun" value="7" />
</bean>

The documentation for testOnBorrow states:
testOnBorrow: The indication of whether objects will be validated before being borrowed from the pool. If the object fails to validate, it will be dropped from the pool, and an attempt to borrow another will be made.
From this I understand that Spring will attempt to make another connection in the event of a failure.
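The connect timeout the JDK docs refer to is the JNDI environment property com.sun.jndi.ldap.connect.timeout. It is not set anywhere in the configuration above, but it can be passed to the contexts the pool creates through the LdapContextSource's baseEnvironmentProperties; a minimal sketch (the 5000 ms value is an arbitrary assumption):
<bean id="exampleConnectionDetails" class="org.springframework.ldap.core.support.LdapContextSource" scope="singleton">
    <property name="url" value="ldaps://ldap.example.com:636" />
    <property name="userDn" value="CN=LDAP_User,DC=example,DC=com" />
    <property name="password" value="superSecretPwd" />
    <property name="pooled" value="false"/>
    <property name="referral" value="follow"/>
    <!-- abort a single connection attempt after 5 seconds instead of hanging -->
    <property name="baseEnvironmentProperties">
        <map>
            <entry key="com.sun.jndi.ldap.connect.timeout" value="5000"/>
        </map>
    </property>
</bean>
This only bounds how long a single connection attempt may take; whether another attempt is made is still governed by the pool's borrow and validation settings described above.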

Related

Azure Redis connection failure when using SSL

I am using Spring Session with Redis using Azure Redis.
Things are working fine with the non-SSL port 6379. However with the SSL port 6380, I get this error:
ERROR (org.springframework.data.redis.listener.RedisMessageListenerContainer:651) || - Connection failure occurred. Restarting subscription task after 5000 ms
That’s it. No further information.
Here is my Redis configuration:
<bean id="redisPassword" class="org.springframework.data.redis.connection.RedisPassword">
<constructor-arg index="0" value="${spring.redis.password}"/>
</bean>
<bean id="redisStandaloneConfiguration"
class="org.springframework.data.redis.connection.RedisStandaloneConfiguration">
<property name="hostName" value="${spring.redis.host}"/>
<property name="port" value="${spring.redis.port}"/>
<property name="password" ref="redisPassword"/>
</bean>
<util:constant id="configureRedisAction"
static-field="org.springframework.session.data.redis.config.ConfigureRedisAction.NO_OP"/>
<bean id="lettuceClientConfiguration"
class="org.springframework.data.redis.connection.lettuce.DefaultLettuceClientConfiguration"
factory-method="defaultConfiguration">
</bean>
<context:annotation-config/>
<bean class="org.springframework.session.data.redis.config.annotation.web.http.RedisHttpSessionConfiguration"
p:configureRedisAction-ref="configureRedisAction"/>
<bean class="org.springframework.data.redis.connection.lettuce.LettuceConnectionFactory">
<constructor-arg index="0" ref="redisStandaloneConfiguration"/>
<constructor-arg index="1" ref="lettuceClientConfiguration"/>
</bean>
What is causing the connection failure?
The Lettuce client configuration built with factory-method="defaultConfiguration" does not enable SSL, so the connection to the SSL-only port 6380 fails. Keeping the NO_OP ConfigureRedisAction (Azure Cache for Redis disables the CONFIG command) and switching to a Jedis connection factory with useSsl enabled addresses it:
<util:constant id="configureRedisAction"
static-field="org.springframework.session.data.redis.config.ConfigureRedisAction.NO_OP"/>
<context:annotation-config/>
<bean class="org.springframework.session.data.redis.config.annotation.web.http.RedisHttpSessionConfiguration"
p:configureRedisAction-ref="configureRedisAction"/>
<bean id="jedisPoolConfig" class="redis.clients.jedis.JedisPoolConfig">
<property name="maxTotal" value="200" />
<property name="maxIdle" value="50" />
<property name="maxWaitMillis" value="30000" />
<property name="minIdle" value="10"/>
</bean>
<bean class="org.springframework.data.redis.connection.jedis.JedisConnectionFactory">
<property name="hostName" value="${spring.redis.host}" />
<property name="port" value="${spring.redis.port}" />
<property name="poolConfig" ref="jedisPoolConfig" />
<property name="usePool" value="true" />
<property name="useSsl" value="${spring.redis.ssl}"/>
<property name="password" value="${spring.redis.password}"/>
</bean>
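If you would rather stay on Lettuce, SSL has to be enabled on the client configuration itself; factory-method="defaultConfiguration" leaves it off. A sketch using chained factory methods (assuming a spring-data-redis 2.x version where LettuceClientConfiguration.builder().useSsl().build() is available; the bean names here are made up):
<bean id="lettuceClientConfigurationBuilder"
    class="org.springframework.data.redis.connection.lettuce.LettuceClientConfiguration"
    factory-method="builder"/>
<!-- useSsl() switches the builder into SSL mode -->
<bean id="lettuceSslConfigurationBuilder"
    factory-bean="lettuceClientConfigurationBuilder"
    factory-method="useSsl"/>
<bean id="sslLettuceClientConfiguration"
    factory-bean="lettuceSslConfigurationBuilder"
    factory-method="build"/>
The LettuceConnectionFactory would then take sslLettuceClientConfiguration as its second constructor argument instead of the default configuration.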

How to use @RefreshScope in a spring-integration project configured as XML config

I am using Spring Integration to FTP files to a remote server, and I am using XML-based config. I would like to use Spring Cloud Config so I can move all the properties files to Git and use @RefreshScope to refresh the properties. What is the best way to achieve this in a Spring Integration project that has only XML configuration?
I have the below code:
<bean id="inDefaultSftpSessionFactory"
class="org.springframework.integration.sftp.session.DefaultSftpSessionFactory">
<property name="host" value="${sftp.host}" />
<property name="port" value="${sftp.port}" />
<property name="user"
value="${sftp.username}" />
<property name="password"
value="${sftp.password}" />
<property name="allowUnknownKeys" value="true" />
</bean>
Try scope="refresh" and <aop:scoped-proxy/> for that bean definition.
Something like this:
<bean id="inDefaultSftpSessionFactory"
class="org.springframework.integration.sftp.session.DefaultSftpSessionFactory"
scope="refresh">
<property name="host" value="${sftp.host}" />
<property name="port" value="${sftp.port}" />
<property name="user" value="${sftp.username}" />
<property name="password" value="${sftp.password}" />
<property name="allowUnknownKeys" value="true" />
<aop:scoped-proxy/>
</bean>
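Two assumptions behind the snippet above: the aop namespace must be declared for <aop:scoped-proxy/> to parse, and the refresh scope itself must be registered, which Spring Cloud normally does automatically when spring-cloud-context is on the classpath. A sketch of the surrounding <beans> element for a plain (non-Boot) XML application:
<beans xmlns="http://www.springframework.org/schema/beans"
    xmlns:aop="http://www.springframework.org/schema/aop"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://www.springframework.org/schema/beans
        http://www.springframework.org/schema/beans/spring-beans.xsd
        http://www.springframework.org/schema/aop
        http://www.springframework.org/schema/aop/spring-aop.xsd">
    <!-- registers the refresh scope explicitly if Spring Cloud's
         auto-configuration is not active -->
    <bean class="org.springframework.cloud.context.scope.refresh.RefreshScope"/>
    <!-- refreshable bean definitions such as inDefaultSftpSessionFactory go here -->
</beans>
A refresh is then triggered the same way as with annotation config, for example by invoking the Spring Cloud refresh endpoint after the Git-backed properties change.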

Spring Integration auto reconnect to IBM MQ thru JBOSS Resource Adapter

We are using Spring Integration in our project, and we have a requirement that if IBM MQ goes down, we must automatically reconnect to it once it is back up. We implemented this using the recoveryInterval option of the org.springframework.jms.listener.DefaultMessageListenerContainer class, but the connection is not recovered after an MQ restart. Below is my existing configuration:
<jms:message-driven-channel-adapter id="adapterId" channel="raw-channel" container="messageListenerContainer" />
<bean id="messageListenerContainer" class="org.springframework.jms.listener.DefaultMessageListenerContainer">
<property name="connectionFactory" ref="customQueueCachingConnectionFactory" />
<property name="destination" ref="requestQueue" />
<property name="recoveryInterval" value="60000" />
</bean>
Below is the current connection factory:
<bean id="queueCachingConnectionFactory"
class="org.springframework.jms.connection.CachingConnectionFactory">
<property name="targetConnectionFactory" ref="queueConnectionFactory" />
<property name="sessionCacheSize" value="10" />
<property name="cacheProducers" value="false" />
<!-- <property name="reconnectOnException" value="true" /> -->
<!-- <property name="exceptionListener" ref="MQExceptionListener"></property> -->
</bean>
<jee:jndi-lookup id="queueConnectionFactory" jndi-name="MQConnectionFactory"
expected-type="javax.jms.ConnectionFactory" lookup-on-startup="true"></jee:jndi-lookup>
<jee:jndi-lookup id="queue" jndi-name="Queue"
expected-type="javax.jms.Queue" lookup-on-startup="true"/>
ERROR [task-scheduler-4] LoggingHandler:145 -org.springframework.jms.IllegalStateException: MQJCA1019: The connection is closed.; nested exception is com.ibm.msg.client.jms.DetailedIllegalStateException: MQJCA1019: The connection is closed.
The application attempted to use a JMS connection after it had closed the connection.
Modify the application so that it closes the JMS connection only after it has finished using the connection.
Thanks in Advance!!
The DefaultMessageListenerContainer should reference a CachingConnectionFactory with reconnectOnException enabled, registered as the container's exception listener (CachingConnectionFactory implements javax.jms.ExceptionListener):
<!-- caching connection factory facade, also implements exception listener -->
<bean id="cachingConnectionFactory" class="org.springframework.jms.connection.CachingConnectionFactory">
<property name="targetConnectionFactory" ref="connectionFactory"/>
<property name="sessionCacheSize" value="10"/>
<property name="reconnectOnException" value="true"/>
</bean>
<!-- this is the Message Driven POJO (MDP) -->
<bean id="messageDrivenPOJO" class="com.redhat.gss.spring.MessageDrivenPOJO" />
<!-- The message listener container -->
<bean id="messageListener" class="org.springframework.jms.listener.DefaultMessageListenerContainer">
<property name="sessionTransacted" value="true"/>
<property name="concurrentConsumers" value="1"/>
<property name="cacheLevelName" value="CACHE_CONSUMER"/>
<property name="receiveTimeout" value="10000"/>
<property name="sessionAcknowledgeMode" value="2"/>
<property name="messageListener" ref="messageDrivenPOJO"/>
<property name="connectionFactory" ref="cachingConnectionFactory"/>
<property name="exceptionListener" ref="cachingConnectionFactory"/>
<property name="destination" ref="jbossQueue"/>
</bean>
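Mapped onto the bean names from the question (where the container refers to customQueueCachingConnectionFactory but the factory is declared as queueCachingConnectionFactory; this sketch assumes they are meant to be the same bean), the relevant changes are enabling reconnectOnException and wiring the caching factory in as the exception listener:
<bean id="queueCachingConnectionFactory"
    class="org.springframework.jms.connection.CachingConnectionFactory">
    <property name="targetConnectionFactory" ref="queueConnectionFactory" />
    <property name="sessionCacheSize" value="10" />
    <property name="cacheProducers" value="false" />
    <!-- reset the cached connection after a JMSException so it can be re-established -->
    <property name="reconnectOnException" value="true" />
</bean>
<bean id="messageListenerContainer" class="org.springframework.jms.listener.DefaultMessageListenerContainer">
    <property name="connectionFactory" ref="queueCachingConnectionFactory" />
    <!-- CachingConnectionFactory implements javax.jms.ExceptionListener -->
    <property name="exceptionListener" ref="queueCachingConnectionFactory" />
    <property name="destination" ref="requestQueue" />
    <property name="recoveryInterval" value="60000" />
</bean>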

Datasource is not bound properly to WAR application

In my environment I have JSF 2.2 + CDI + Spring 4 + WildFly 9 + Spring Data.
In the WildFly server, I have two datasources configured:
ExampleDS (this one comes with the server)
OracleDS (this one I created)
In persistence.xml, I have:
<persistence-unit name="persistenceUnit">
<class>co.EntityClass</class>
<properties>
<property name="hibernate.dialect" value="org.hibernate.dialect.Oracle10gDialect" />
<property name="hibernate.show_sql" value="true" />
<property name="hibernate.ejb.naming_strategy" value="org.hibernate.cfg.ImprovedNamingStrategy"/>
<property name="hibernate.cache.provider_class" value="org.hibernate.cache.EhCacheProvider" />
</properties>
</persistence-unit>
My applicationContext.xml:
<beans xmlns="http://www.springframework.org/schema/beans"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xmlns:context="http://www.springframework.org/schema/context"
    xmlns:jpa="http://www.springframework.org/schema/data/jpa"
    xsi:schemaLocation="http://www.springframework.org/schema/beans
        http://www.springframework.org/schema/beans/spring-beans-4.1.xsd
        http://www.springframework.org/schema/data/jpa
        http://www.springframework.org/schema/data/jpa/spring-jpa.xsd
        http://www.springframework.org/schema/context
        http://www.springframework.org/schema/context/spring-context-4.1.xsd">
<context:component-scan base-package="co.com.dao, co.com.service.impl" />
<bean id="persistenceContext" class="org.springframework.jndi.JndiObjectFactoryBean">
<property name="jndiName" value="java:/XXXXXDS"/>
</bean>
<bean id="entityManagerFactory" class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean">
<property name="dataSource" ref="persistenceContext" />
<property name="persistenceUnitName" value="persistenceUnit" />
<property name="jpaVendorAdapter">
<bean class="org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter" />
</property>
</bean>
<jpa:repositories base-package="co.com.psl.connectnetwork.dao" entity-manager-factory-ref="entityManagerFactory" />
When I deploy my application and try to run a query against the database, I get:
Caused by: org.h2.jdbc.JdbcSQLException: Schema "XXXXXX" not found; SQL statement:
It looks like my application is using the default datasource configured in WildFly (ExampleDS), and that's why it does not find the object I query. But why?
I am not sure if CDI + Spring + JSF is a good match.
I am not sure why, but I added these properties to the datasource definition in applicationContext.xml:
<property name="lookupOnStartup" value="true"/>
<property name="proxyInterface" value="javax.sql.DataSource"/>
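For reference, the complete lookup bean with those two properties added would look like this (keeping the placeholder JNDI name from the question):
<bean id="persistenceContext" class="org.springframework.jndi.JndiObjectFactoryBean">
    <property name="jndiName" value="java:/XXXXXDS"/>
    <!-- resolve the JNDI entry eagerly at startup -->
    <property name="lookupOnStartup" value="true"/>
    <!-- expose the looked-up object behind a typed javax.sql.DataSource proxy -->
    <property name="proxyInterface" value="javax.sql.DataSource"/>
</bean>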

Spring Batch -- Multi-File-Resource -- Takes same time as single Thread?

I am using Spring Batch for data migration from XML to an Oracle database.
With single-threaded execution, the process takes 80-90 minutes to insert approximately 20K users.
I want to cut that by more than half, but even using a multi-file resource I am not able to achieve it.
I have a single XML file to process, so I started simply by adding a task executor and making the reader synchronized, but saw no gain.
So I split the XML into multiple XML files and want to try the multi-file-resource approach. Here is the configuration.
<batch:job id="importJob">
<batch:step id="step1Master">
<batch:partition handler="handler" partitioner="partitioner" />
</batch:step>
</batch:job>
<bean id="handler"
class="org.springframework.batch.core.partition.support.TaskExecutorPartitionHandler">
<property name="taskExecutor" ref="taskExecutor" />
<property name="step" ref="slaveStep" />
<property name="gridSize" value="20" />
</bean>
<batch:step id="slaveStep">
<batch:tasklet transaction-manager="transactionManager"
allow-start-if-complete="true">
<batch:chunk reader="reader" writer="writer"
processor="processor" commit-interval="1000" skip-limit="1500000">
<batch:skippable-exception-classes>
<batch:include class="java.lang.Exception" />
</batch:skippable-exception-classes>
</batch:chunk>
</batch:tasklet>
</batch:step>
<bean id="taskExecutor"
class="org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor">
<property name="corePoolSize" value="100" />
<property name="maxPoolSize" value="300" />
<property name="allowCoreThreadTimeOut" value="true" />
</bean>
<bean id="partitioner"
class="org.springframework.batch.core.partition.support.MultiResourcePartitioner"
scope="step">
<property name="keyName" value="inputFile" />
<property name="resources"
value="file:/.../*.xml" />
</bean>
<bean id="processor"
class="...Processor"
scope="step" />
<bean id="reader" class="org.springframework.batch.item.xml.StaxEventItemReader"
scope="step">
<property name="fragmentRootElementName" value="user" />
<property name="unmarshaller" ref="userDetailUnmarshaller" />
<property name="resource" value="#{stepExecutionContext[inputFile]}" />
</bean>
My single XML file contains around 1000 users, and I am trying with 20 files.
I kept commit-interval=1000 since each file has 1000 records to insert into the DB.
Does the commit-interval need to be adjusted accordingly?
I am using an Oracle DB; do I need to do any pool management there?
The current Oracle DB pool configured in JBoss:
Min Pool = 100
Max Pool = 300
I see logging like
17:01:50,553 DEBUG [Writer] (taskExecutor-11) [UserDetailWriter] | user added
17:01:50,683 DEBUG [Writer] (taskExecutor-15) [UserDetailWriter] | user added
17:01:51,093 DEBUG [Writer] (taskExecutor-11) [UserDetailWriter] | user added
17:01:59,795 DEBUG [Writer] (taskExecutor-12) [UserDetailWriter] | user added
17:02:00,385 DEBUG [Writer] (taskExecutor-12) [UserDetailWriter] | user added
17:02:00,385 DEBUG [Writer] (taskExecutor-12) [UserDetailWriter] | user added
It seems multiple threads are being created, but I am still not seeing any performance improvement. Please suggest what I am doing wrong.
Go through this documentation on parallel processing:
http://docs.spring.io/spring-batch/trunk/reference/html/scalability.html#scalabilityParallelSteps
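One thing worth checking in the configuration above: MultiResourcePartitioner ignores the gridSize value and creates one partition per matching file, so with 20 files and corePoolSize=100 every partition already runs in parallel, and the thread pool is unlikely to be what limits throughput; the database writes or per-item processing are more likely candidates. A sketch of settings to experiment with (the pool size and commit interval are assumptions to tune against the Oracle pool, not values from the original answer; skip configuration omitted for brevity):
<!-- bound partition concurrency to something the database can absorb -->
<bean id="taskExecutor"
    class="org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor">
    <property name="corePoolSize" value="10" />
    <property name="maxPoolSize" value="10" />
    <property name="queueCapacity" value="50" />
</bean>
<!-- smaller chunks give intermediate commits instead of one 1000-item transaction per file -->
<batch:step id="slaveStep">
    <batch:tasklet transaction-manager="transactionManager" allow-start-if-complete="true">
        <batch:chunk reader="reader" writer="writer" processor="processor" commit-interval="100" />
    </batch:tasklet>
</batch:step>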
