Large data processing using Spring Batch multi-threaded step and RepositoryItemReader/RepositoryItemWriter

I am trying to write a batch processing application using Spring Batch with a multi-threaded step. It is a simple application that reads data from one table and writes to another, but the data volume is large, around 2 million records.
I am using RepositoryItemReader and RepositoryItemWriter for reading and writing the data. After processing some records, it fails with "Unable to acquire JDBC Connection".
// Config.java
@Bean
public TaskExecutor taskExecutor() {
    SimpleAsyncTaskExecutor taskExecutor = new SimpleAsyncTaskExecutor();
    taskExecutor.setConcurrencyLimit(10);
    return taskExecutor;
}

@Bean(name = "personJob")
public Job personKeeperJob() {
    Step step = stepBuilderFactory.get("step-1")
            .<User, Person>chunk(1000)
            .reader(userReader)
            .processor(jpaProcessor)
            .writer(personWriter)
            .taskExecutor(taskExecutor())
            .throttleLimit(10)
            .build();
    Job job = jobBuilderFactory.get("person-job")
            .incrementer(new RunIdIncrementer())
            .listener(this)
            .start(step)
            .build();
    return job;
}
// Processor.java
@Override
public Person process(User user) throws Exception {
    Optional<User> userFromDb = userRepo.findById(user.getUserId());
    Person person = new Person();
    if (userFromDb.isPresent()) {
        person.setName(userFromDb.get().getName());
        person.setUserId(userFromDb.get().getUserId());
        person.setDept(userFromDb.get().getDept());
    }
    return person;
}
// Reader.java
@Autowired
public UserItemReader(final UserRepository repository) {
    super();
    this.repository = repository;
}

@PostConstruct
protected void init() {
    final Map<String, Sort.Direction> sorts = new HashMap<>();
    sorts.put("userId", Direction.ASC);
    this.setRepository(this.repository);
    this.setSort(sorts);
    this.setMethodName("findAll");
}
// Writer.java
@PostConstruct
protected void init() {
    this.setRepository(repository);
}

@Transactional
public void write(List<? extends Person> persons) throws Exception {
    repository.saveAll(persons);
}
application.properties
# Datasource
spring.datasource.platform=h2
spring.datasource.url=jdbc:h2:mem:batchdb
spring.main.allow-bean-definition-overriding=true
spring.datasource.hikari.maximum-pool-size=500
Error:
org.springframework.transaction.CannotCreateTransactionException: Could not open JPA EntityManager for transaction; nested exception is org.hibernate.exception.JDBCConnectionException: Unable to acquire JDBC Connection
at org.springframework.orm.jpa.JpaTransactionManager.doBegin(JpaTransactionManager.java:447)
......................
Caused by: org.hibernate.exception.JDBCConnectionException: Unable to acquire JDBC Connection
at org.hibernate.exception.internal.SQLExceptionTypeDelegate.convert(SQLExceptionTypeDelegate.java:48)
............................
Caused by: java.sql.SQLTransientConnectionException: HikariPool-1 - Connection is not available, request timed out after 30927ms.

You are running out of connections.
Try setting the Hikari connection pool to a bigger number:
spring.datasource.hikari.maximum-pool-size=20
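As a rough sizing rule (my assumption, not something stated in the answer): each worker thread of a multi-threaded step runs its own chunk transaction and holds its own connection, and the reader and the job repository need a few more, so the pool should be at least the throttle limit plus some headroom. For the step above (throttle limit 10), something like this:
# at least throttleLimit (10) plus headroom for the reader and the job repository
spring.datasource.hikari.maximum-pool-size=16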

Related

spring batch api request on wait

I have written a simple Spring Batch project where:
an API to execute the job returns the job ID on launch,
the job reads/processes/writes from/to the DB with multi-threaded parallel processing
(the job is launched asynchronously so that the job ID is available up front and the job status can be polled from another API request),
an API polls the status of the job for the job ID passed.
The polling API works smoothly as long as the job step's throttle limit is 7 or less.
However, if the throttle limit is more than 7, job execution continues, but the polling API hangs until read/process releases threads.
I have also tried a trivial API that simply returns a String instead of polling, but that hangs too.
A sample of the code is shown below:
@Configuration
@EnableBatchProcessing
public class SpringBatchConfig {
    private int core = 200;

    @Bean
    public Job job() throws Exception {
        return jobBuilderFactory.get(SC_Constants.JOB)
                .incrementer(new RunIdIncrementer())
                .listener(new Listener(transDAO))
                .start(step3_processRecords())
                .build();
    }

    @Bean
    public ThreadPoolTaskExecutor taskExecutor() {
        ThreadPoolTaskExecutor threadPoolTaskExecutor = new ThreadPoolTaskExecutor();
        threadPoolTaskExecutor.setCorePoolSize(this.core);
        threadPoolTaskExecutor.setMaxPoolSize(this.core);
        threadPoolTaskExecutor.setQueueCapacity(this.core);
        threadPoolTaskExecutor.setThreadNamePrefix("threadExecutor");
        return threadPoolTaskExecutor;
    }

    @Bean
    @StepScope
    public JdbcPagingItemReader<Transaction> itemReader(...) {
        JdbcPagingItemReader<Transaction> itemReader = new JdbcPagingItemReader<Transaction>();
        ...
        return itemReader;
    }

    @Bean
    @StepScope
    public ItemProcessor<Transaction, Transaction> processor() {
        return new Processor();
    }

    @Bean
    @StepScope
    public ItemWriter<Transaction> writer(...) {
        return new Writer();
    }

    @Bean
    public Step step3_processRecords() throws Exception {
        return stepBuilderFactory.get(SC_Constants.STEP_3_PROCESS_RECORDS)
                .<Transaction, Transaction>chunk(this.chunk)
                .reader(itemReader(null, null, null))
                .processor(processor())
                .writer(writer(null, null, null))
                .taskExecutor(taskExecutor())
                .throttleLimit(20)
                .build();
    }
}
The class that extends DefaultBatchConfigurer has the following:
@Override
public JobLauncher getJobLauncher() {
    SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
    jobLauncher.setJobRepository(jobRepository);
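    // the async executor makes JobLauncher.run() return as soon as the job is submitted,
    // so the caller gets the JobExecution (and its ID) without waiting for the job to finish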
    SimpleAsyncTaskExecutor exec = new SimpleAsyncTaskExecutor();
    exec.setConcurrencyLimit(concurrency_limit);
    jobLauncher.setTaskExecutor(exec);
    return jobLauncher;
}
Edit:
Polling API code snippet:
@POST
@Consumes(MediaType.APPLICATION_JSON)
@Path("/getJobStatus")
public Response getJobStatus(@RequestBody String body) {
    JSONObject jsonObject = new JSONObject(body);
    Long jobId = jsonObject.getLong("jobId");
    jobExecution = jobExplorer.getJobExecution(jobId);
    batchStatus = jobExecution.getStatus().getBatchStatus();
    write_count = jobExecution.getStepExecutions().iterator().next().getWriteCount();
    responseDto.setJob_id(jobId);
    responseDto.setWrite_count(write_count);
    responseDto.setStatus(batchStatus.name());
    return responseDto;
}
Second edit:
Sharing a snippet of the job repository setup (using a Postgres JDBC job repository):
@Component
public class SpringBatchConfigurer extends DefaultBatchConfigurer {
    ...
    @PostConstruct
    public void initialize() {
        try {
            BasicDataSource dataSource = new BasicDataSource();
            dataSource.setDriverClassName(driverClassName);
            dataSource.setUsername(username);
            dataSource.setPassword(password);
            dataSource.setUrl(dsUrl + "?currentSchema=public");
            dataSource.setInitialSize(3);
            dataSource.setMinIdle(1);
            dataSource.setMaxIdle(3);
            dataSource.addConnectionProperty("maxConnLifetimeMillis", "30000");
            this.transactionManager = new DataSourceTransactionManager(dataSource);
            JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
            factory.setDataSource(dataSource);
            factory.setTransactionManager(transactionManager);
            factory.afterPropertiesSet();
            this.jobRepository = factory.getObject();
            SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
            jobLauncher.setJobRepository(jobRepository);
            jobLauncher.afterPropertiesSet();
            this.jobLauncher = jobLauncher;
        } catch (Exception e) {
            throw new BatchConfigurationException(e);
        }
    }
}
Third edit: I tried passing the executor as a local variable inside the step. Polling now works, but job execution does not happen: no threads are created and no processing takes place.
@Bean
public Step step3_processRecords() throws Exception {
    ThreadPoolTaskExecutor threadPoolTaskExecutor = new ThreadPoolTaskExecutor();
    threadPoolTaskExecutor.setCorePoolSize(this.core_size);
    threadPoolTaskExecutor.setMaxPoolSize(this.max_pool_size);
    threadPoolTaskExecutor.setQueueCapacity(this.queue_capacity);
    threadPoolTaskExecutor.setThreadNamePrefix("threadExecutor");
    return stepBuilderFactory.get("step3")
            .<Transaction, Transaction>chunk(this.chunk)
            .reader(itemReader(null, null, null))
            .processor(processor())
            .writer(writer(null, null, null))
            .taskExecutor(threadPoolTaskExecutor)
            .throttleLimit(20)
            .build();
}
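A likely cause, though not confirmed in the thread: a ThreadPoolTaskExecutor created as a plain local variable is not a Spring-managed bean, so nothing ever calls its initialize() method and the underlying thread pool is never built. When constructing it by hand, it has to be initialized explicitly before the step uses it:
// assumption: required here because Spring only calls this automatically for container-managed beans
threadPoolTaskExecutor.initialize();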

spring batch getting stuck in parallel processing where works fine in serial processing

I am quite new to Spring Batch and first ran my job single-threaded. Now I need to add multithreading in the step and have the configuration below, but the parallel processing hangs after some time, with no trace on the console after it processes some records. Earlier, for single-threaded runs, I used JdbcCursorItemReader and then switched to JdbcPagingItemReader as a thread-safe reader.
The reader reads entries from a Postgres DB; the processor (which calls another REST web service and returns the response to the writer) and the writer (which creates a new file and updates status data in the DB) can execute in parallel.
@Bean
public Job job(JobBuilderFactory jobBuilderFactory,
        StepBuilderFactory stepBuilderFactory,
        ItemReader<OrderRequest> itemReader,
        ItemProcessor<OrderRequest, OrderResponse> dataProcessor,
        ItemWriter<OrderResponse> fileWriter, JobExecutionListener jobListener,
        ItemReadListener<OrderRequest> stepItemReadListener,
        SkipListener<OrderRequest, OrderResponse> stepSkipListener, TaskExecutor taskExecutor) {
    Step step1 = stepBuilderFactory.get("Process-Data")
            .<OrderRequest, OrderResponse>chunk(10)
            .listener(stepItemReadListener)
            .reader(itemReader)
            .processor(dataProcessor)
            .writer(fileWriter)
            .faultTolerant()
            .processorNonTransactional()
            .skipLimit(5)
            .skip(CustomException.class)
            .listener(stepSkipListener)
            .taskExecutor(taskExecutor)
            .throttleLimit(5)
            .build();
    return jobBuilderFactory.get("Batch-Job")
            .incrementer(new RunIdIncrementer())
            .listener(jobListener)
            .start(step1)
            .build();
}
@StepScope
@Bean
public JdbcPagingItemReader<OrderRequest> jdbcPagingItemReader(@Qualifier("postgresDataSource") DataSource dataSource,
        @Value("#{jobParameters[customerId]}") String customerId, OrderRequestRowMapper rowMapper) {
    // reading database records using JDBC in a paging fashion
    JdbcPagingItemReader<OrderRequest> reader = new JdbcPagingItemReader<>();
    reader.setDataSource(dataSource);
    reader.setFetchSize(1000);
    reader.setRowMapper(rowMapper);
    // sort keys
    Map<String, Order> sortKeys = new HashMap<>();
    sortKeys.put("OrderRequestID", Order.ASCENDING);
    // Postgres implementation of a PagingQueryProvider using database-specific features
    PostgresPagingQueryProvider queryProvider = new PostgresPagingQueryProvider();
    queryProvider.setSelectClause("*");
    queryProvider.setFromClause("FROM OrderRequest");
    queryProvider.setWhereClause("CUSTOMER = '" + customerId + "'");
    queryProvider.setSortKeys(sortKeys);
    reader.setQueryProvider(queryProvider);
    return reader;
}
@StepScope
@Bean
public SynchronizedItemStreamReader<OrderRequest> itemReader(JdbcPagingItemReader<OrderRequest> jdbcPagingItemReader) {
    return new SynchronizedItemStreamReaderBuilder<OrderRequest>().delegate(jdbcPagingItemReader).build();
}
@Bean
public TaskExecutor taskExecutor() {
    ThreadPoolTaskExecutor taskExecutor = new ThreadPoolTaskExecutor();
    taskExecutor.setCorePoolSize(5);
    taskExecutor.setMaxPoolSize(5);
    taskExecutor.setQueueCapacity(0);
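    // note: queueCapacity 0 makes ThreadPoolTaskExecutor use a SynchronousQueue,
    // so task submissions are rejected whenever all 5 threads are busy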
    return taskExecutor;
}
@StepScope
@Bean
ItemProcessor<OrderRequest, OrderResponse> dataProcessor() {
    return new BatchDataFileProcessor();
}

@StepScope
@Bean
ItemWriter<OrderResponse> fileWriter() {
    return new BatchOrderFileWriter();
}

@StepScope
@Bean
public ItemReadListener<OrderRequest> stepItemReadListener() {
    return new StepItemReadListener();
}

@Bean
public JobExecutionListener jobListener() {
    return new JobListener();
}

@StepScope
@Bean
public SkipListener<OrderRequest, OrderResponse> stepSkipListener() {
    return new StepSkipListener();
}
What is the problem with the multithreading configuration here?
The batch works fine, one record at a time, when I use JdbcCursorItemReader and no TaskExecutor bean:
@StepScope
@Bean
public JdbcCursorItemReader<OrderRequest> jdbcCursorItemReader(@Qualifier("postgresDataSource") DataSource dataSource,
        @Value("#{jobParameters[customerId]}") String customerId, OrderRequestRowMapper rowMapper) {
    return new JdbcCursorItemReaderBuilder<OrderRequest>()
            .name("jdbcCursorItemReader")
            .dataSource(dataSource)
            .queryArguments(customerId)
            .sql(CommonConstant.FETCH_QUERY)
            .rowMapper(rowMapper)
            .saveState(true)
            .build();
}
After changing the TaskExecutor as follows, it is working now:
@Bean
public TaskExecutor taskExecutor() {
    SimpleAsyncTaskExecutor taskExecutor = new SimpleAsyncTaskExecutor();
    taskExecutor.setConcurrencyLimit(concurrencyLimit);
    return taskExecutor;
}
I didn't understand what the problem was with the earlier configuration.

Scope 'job' is not active for the current thread, No context holder available for job scope Spring-Batch

In my Spring Batch job, I'm trying to share data between steps using the JobExecutionContext, which works only if I keep the steps single-threaded, as follows:
@EnableTask
@EnableBatchProcessing
@Configuration
@PropertySource(value = {"classpath:application.properties"})
public class Config {
    private static final HashMap<String, Object> OVERRIDDEN_BY_EXPRESSION = null;
    private static final String QUERY = "SELECT * FROM \"Config\"";
    @Autowired
    public JobBuilderFactory jobBuilderFactory;
    @Autowired
    public StepBuilderFactory stepBuilderFactory;
    @Autowired
    private MongoTemplate mongoTemplate;
    @Autowired
    EntityManager em;
    @Autowired
    DataSource dataSource;

    /* Config step */
    @Bean
    public JdbcCursorItemReader<BatchConfig> configReader(DataSource dataSource) {
        JdbcCursorItemReader<BatchConfig> config = new JdbcCursorItemReader<>();
        config.setDataSource(dataSource);
        config.setSql(QUERY);
        config.setRowMapper(new BatchRowMapper());
        return config;
    }

    @Bean
    public ItemWriter<BatchConfig> itemWriter() {
        return new ItemWriter<BatchConfig>() {
            private StepExecution stepExecution;

            @Override
            public void write(List<? extends BatchConfig> items) {
                ExecutionContext stepContext = this.stepExecution.getExecutionContext();
                for (BatchConfig item : items) {
                    HashMap<String, Object> table = new HashMap<>();
                    table.put("date", item.getDate_time());
                    table.put("size", item.getSize());
                    System.out.println(table);
                    stepContext.put(item.getName(), table);
                }
            }

            @BeforeStep
            public void saveStepExecution(StepExecution stepExecution) {
                this.stepExecution = stepExecution;
            }
        };
    }

    @Bean
    public Step stepConfig(JdbcCursorItemReader<BatchConfig> configReader) throws Exception {
        return stepBuilderFactory.get("stepConfig")
                .<BatchConfig, BatchConfig>chunk(10)
                .reader(configReader)
                .writer(itemWriter())
                .listener(promotionListener())
                .build();
    }

    @Bean
    public ExecutionContextPromotionListener promotionListener() {
        ExecutionContextPromotionListener listener = new ExecutionContextPromotionListener();
        listener.setKeys(new String[] {"COUNTRY", "CATEGORY", "USER"});
        return listener;
    }

    /* Country step */
    @JobScope
    @Bean
    public MongoItemReader<COUNTRY> CountryItemReader(@Value("#{jobExecutionContext['COUNTRY']}") HashMap<String, Object> table) {
        int date = (int) table.get("date");
        MongoItemReader<COUNTRY> reader = new MongoItemReader<COUNTRY>();
        reader.setTemplate(mongoTemplate);
        reader.setTargetType(COUNTRY.class);
        reader.setCollection("COUNTRY");
        reader.setFields("{\"COUNTRY_NAME\": 1,\"SHORT_NAME\": 1,\"DEPT_CODE\": 1}");
        reader.setSort(new HashMap<String, Sort.Direction>() {{
            put("_id", Sort.Direction.DESC);
        }});
        reader.setQuery("{DATE_TIME: {$gt:" + date + "}}");
        reader.setPageSize(250);
        return reader;
    }

    @Bean
    public CountryItemProcessor CountryProcessor() {
        return new CountryItemProcessor();
    }

    @Bean
    public JpaItemWriter<COUNTRY> country_writer() {
        JpaItemWriter<COUNTRY> jpa = new JpaItemWriter<COUNTRY>();
        jpa.setEntityManagerFactory(em.getEntityManagerFactory());
        return jpa;
    }

    @JobScope
    @Bean
    public Step step1(@Value("#{jobExecutionContext['COUNTRY']}") HashMap<String, Object> tab) {
        int size = (int) tab.get("size");
        //System.out.println(size);
        return stepBuilderFactory.get("step1")
                .<COUNTRY, COUNTRY>chunk(20)
                .reader(CountryItemReader(OVERRIDDEN_BY_EXPRESSION))
                .writer(country_writer())
                .build();
    }

    @Bean
    public Job TestJob(Step stepConfig) throws Exception {
        return this.jobBuilderFactory.get("TestJob")
                .incrementer(new RunIdIncrementer()) // because of a Spring config bug, this incrementer is not really useful
                .start(stepConfig)
                .next(step1(OVERRIDDEN_BY_EXPRESSION))
                .build();
    }
}
However, when adding a SimpleAsyncTaskExecutor, an error occurred:
org.springframework.beans.factory.support.ScopeNotActiveException: Error creating bean with name 'scopedTarget.CountryItemReader': Scope 'job' is not active for the current thread; consider defining a scoped proxy for this bean if you intend to refer to it from a singleton; nested exception is java.lang.IllegalStateException: No context holder available for job scope
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:383) ~[spring-beans-5.3.6.jar:5.3.6]
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:208) ~[spring-beans-5.3.6.jar:5.3.6]
at org.springframework.aop.target.SimpleBeanTargetSource.getTarget(SimpleBeanTargetSource.java:35) ~[spring-aop-5.3.6.jar:5.3.6]
at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:676) ~[spring-aop-5.3.6.jar:5.3.6]
at org.springframework.batch.item.data.MongoItemReader$$EnhancerBySpringCGLIB$$67443e4.read(<generated>) ~[spring-batch-infrastructure-4.3.2.jar:4.3.2]
at org.springframework.batch.core.step.item.SimpleChunkProvider.doRead(SimpleChunkProvider.java:99) ~[spring-batch-core-4.3.2.jar:4.3.2]
at org.springframework.batch.core.step.item.SimpleChunkProvider.read(SimpleChunkProvider.java:180) ~[spring-batch-core-4.3.2.jar:4.3.2]
at org.springframework.batch.core.step.item.SimpleChunkProvider$1.doInIteration(SimpleChunkProvider.java:126) ~[spring-batch-core-4.3.2.jar:4.3.2]
at org.springframework.batch.repeat.support.RepeatTemplate.getNextResult(RepeatTemplate.java:375) ~[spring-batch-infrastructure-4.3.2.jar:4.3.2]
at org.springframework.batch.repeat.support.RepeatTemplate.executeInternal(RepeatTemplate.java:215) ~[spring-batch-infrastructure-4.3.2.jar:4.3.2]
at org.springframework.batch.repeat.support.RepeatTemplate.iterate(RepeatTemplate.java:145) ~[spring-batch-infrastructure-4.3.2.jar:4.3.2]
at org.springframework.batch.core.step.item.SimpleChunkProvider.provide(SimpleChunkProvider.java:118) ~[spring-batch-core-4.3.2.jar:4.3.2]
at org.springframework.batch.core.step.item.ChunkOrientedTasklet.execute(ChunkOrientedTasklet.java:71) ~[spring-batch-core-4.3.2.jar:4.3.2]
at org.springframework.batch.core.step.tasklet.TaskletStep$ChunkTransactionCallback.doInTransaction(TaskletStep.java:407) ~[spring-batch-core-4.3.2.jar:4.3.2]
at org.springframework.batch.core.step.tasklet.TaskletStep$ChunkTransactionCallback.doInTransaction(TaskletStep.java:331) ~[spring-batch-core-4.3.2.jar:4.3.2]
at org.springframework.transaction.support.TransactionTemplate.execute(TransactionTemplate.java:140) ~[spring-tx-5.3.6.jar:5.3.6]
at org.springframework.batch.core.step.tasklet.TaskletStep$2.doInChunkContext(TaskletStep.java:273) ~[spring-batch-core-4.3.2.jar:4.3.2]
at org.springframework.batch.core.scope.context.StepContextRepeatCallback.doInIteration(StepContextRepeatCallback.java:82) ~[spring-batch-core-4.3.2.jar:4.3.2]
at org.springframework.batch.repeat.support.TaskExecutorRepeatTemplate$ExecutingRunnable.run(TaskExecutorRepeatTemplate.java:262) ~[spring-batch-infrastructure-4.3.2.jar:4.3.2]
at java.base/java.lang.Thread.run(Thread.java:829) ~[na:na]
Caused by: java.lang.IllegalStateException: No context holder available for job scope
at org.springframework.batch.core.scope.JobScope.getContext(JobScope.java:159) ~[spring-batch-core-4.3.2.jar:4.3.2]
at org.springframework.batch.core.scope.JobScope.get(JobScope.java:92) ~[spring-batch-core-4.3.2.jar:4.3.2]
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:371) ~[spring-beans-5.3.6.jar:5.3.6]
I tried solving this issue as in https://github.com/spring-projects/spring-batch/issues/1335, but it seems to use just one thread in addition to the main one.
Is there any way to resolve this issue without tweaked code?
I'm planning to scale the job using remote partitioning on Kubernetes; would this issue persist because of the job scope?
Any thoughts or advice are more than welcome.
I'm trying to share data between steps using JobExecutionContext, which works only if I keep the steps single-threaded
Relying on the execution context to share data between the threads of a multi-threaded step is incorrect, because keys will be overridden by concurrent threads. The reference documentation explicitly recommends turning off state management in a multi-threaded environment:
Javadoc: remember to use saveState=false if used in a multi-threaded client
Reference doc: it is not recommended to use job-scoped beans in multi-threaded or partitioned steps
That said, I don't see what key could be shared from a multi-threaded step to the next step (since threads are executed in parallel), but if you really need to do that, you should use another approach, such as defining a shared bean that is thread-safe.
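A minimal sketch of that last idea (the names are illustrative, not from the original answer): a singleton holder bean backed by a ConcurrentHashMap, populated by the writer of one step and read by the next step:
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import org.springframework.stereotype.Component;

@Component
public class SharedDataHolder {

    private final Map<String, Object> data = new ConcurrentHashMap<>();

    public void put(String key, Object value) {
        data.put(key, value);
    }

    public Object get(String key) {
        return data.get(key);
    }
}
Since the bean is a singleton and the map is concurrent, it is safe to access from the threads of a multi-threaded step, unlike the step or job execution context.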

How can I transfer messages dynamically in an SFTP inbound adapter through Spring DSL in Java 1.7

I have an SFTP inbound flow and get the session information from a DefaultSftpSessionFactory. But I need to handle multiple sessions dynamically, with the connection details coming from a database table: I have several SFTP servers that my integration flow must cover. I have file transfer working from a single source to a single destination, but I need to implement multiple sources to multiple destinations. Can anyone provide some pointers on this?
This is my session factory. Here I have a single SFTP server's information, but how do I configure multiple servers' details?
@Autowired
private DefaultSftpSessionFactory sftpSessionFactory;

@Bean
public DefaultSftpSessionFactory sftpSessionFactory() {
    DefaultSftpSessionFactory factory = new DefaultSftpSessionFactory(true);
    factory.setHost("111.11.12.143");
    factory.setPort(22);
    factory.setUser("sftp");
    factory.setPassword("*******");
    return factory;
}
And this is my SFTP inbound flow:
@Bean
public IntegrationFlow sftpInboundFlow() {
    System.out.println("enter sftpInboundFlow....."
            + sftpSessionFactory.getSession());
    return IntegrationFlows
            .from(Sftp.inboundAdapter(this.sftpSessionFactory)
                    .preserveTimestamp(true).remoteDirectory(remDir)
                    .regexFilter(".*\\.txt$")
                    .localFilenameExpression("#this.toUpperCase()")
                    .localDirectory(new File(localDir))
                    .remoteFileSeparator("/"),
                    new Consumer<SourcePollingChannelAdapterSpec>() {
                        @Override
                        public void accept(SourcePollingChannelAdapterSpec e) {
                            e.id("sftpInboundAdapter")
                                    .autoStartup(true)
                                    .poller(Pollers.fixedRate(1000)
                                            .maxMessagesPerPoll(1));
                        }
                    })
            //.channel(MessageChannels.queue("sftpInboundResultChannel"))
            .channel(sftpInboundResultChannel())
            .get();
}
As suggested by Gary, I am editing my post.
Hi Gary,
I am taking the reference from the GitHub dynamic FTP example. Through the ChannelResolver class I need to call my DSL class above and set the dynamic values in context properties without using XML.
In my ChannelResolver class I want something like this:
StandardEnvironment env = new StandardEnvironment();
Properties props = new Properties();
props.setProperty("inbound.host", host); //I am getting the value of 'host' from a DB table.
PropertiesPropertySource pps = new PropertiesPropertySource("sftpprop", props);
env.getPropertySources().addLast(pps);
context.setEnvironment(env);
And in my DSL class I need to use it like this:
@Value("${inbound.host}")
private String host;
So in this way, can I set a dynamic value for the String 'host'?
I am editing my original post.
In my outbound dynamic resolver class I am doing this:
StandardEnvironment env = new StandardEnvironment();
Properties props = new Properties();
props.setProperty("outbound.host", host);
props.setProperty("outbound.port", String.valueOf(port));
props.setProperty("outbound.user", user);
props.setProperty("outbound.password", password);
props.setProperty("outbound.remote.directory", remoteDir);
props.setProperty("outbound.local.directory", localDir);
PropertiesPropertySource pps = new PropertiesPropertySource("ftpprops", props);
env.getPropertySources().addLast(pps);
ctx.setEnvironment(env);
And this is my DSL class:
@Autowired
private DefaultSftpSessionFactory sftpSessionFactory;

@Bean
public DefaultSftpSessionFactory sftpSessionFactory(@Value("${outbound.host}") String host, @Value("${outbound.port}") int port,
        @Value("${outbound.user}") String user, @Value("${outbound.password}") String password) {
    DefaultSftpSessionFactory factory = new DefaultSftpSessionFactory(true);
    factory.setHost(host);
    factory.setPort(port);
    factory.setUser(user);
    factory.setPassword(password);
    return factory;
}

@Bean
public IntegrationFlow fileInboundFlow(@Value("${outbound.local.directory}") String localDir) {
    return IntegrationFlows
            .from(Files.inboundAdapter(new File(localDir)),
                    new Consumer<SourcePollingChannelAdapterSpec>() {
                        @Override
                        public void accept(SourcePollingChannelAdapterSpec e) {
                            e.autoStartup(true).poller(
                                    Pollers.fixedDelay(5000)
                                            .maxMessagesPerPoll(1));
                        }
                    })
            .channel(sftpSendChannel())
            .get();
}

@Bean
public IntegrationFlow sftpOutboundFlow(@Value("${outbound.remote.directory}") String remDir) {
    return IntegrationFlows
            .from(sftpSendChannel())
            .handle(Sftp.outboundAdapter(this.sftpSessionFactory)
                    .useTemporaryFileName(false)
                    .remoteDirectory(remDir))
            .get();
}

@Bean
public MessageChannel sftpSendChannel() {
    return new DirectChannel();
}

@Bean
public static PropertySourcesPlaceholderConfigurer configurer1() {
    return new PropertySourcesPlaceholderConfigurer();
}
And this is the error log from the console:
Aug 03, 2015 7:50:25 PM org.apache.catalina.core.StandardContext listenerStart
SEVERE: Exception sending context initialized event to listener instance of class org.springframework.web.context.ContextLoaderListener
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'sftpOutBoundDsl': Injection of autowired dependencies failed; nested exception is org.springframework.beans.factory.BeanCreationException: Could not autowire field: private org.springframework.integration.sftp.session.DefaultSftpSessionFactory com.tcs.iux.ieg.sftp.dynamic.SftpOutBoundDsl.sftpSessionFactory; nested exception is java.lang.IllegalArgumentException: Could not resolve placeholder 'outbound.host' in string value "${outbound.host}"
at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor.postProcessPropertyValues(AutowiredAnnotationBeanPostProcessor.java:334)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1204)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:538)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:476)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:302)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:229)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:298)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:725)
at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:757)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:480)
at org.springframework.web.context.ContextLoader.configureAndRefreshWebApplicationContext(ContextLoader.java:403)
at org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:306)
at org.springframework.web.context.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:106)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4973)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5467)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1559)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1549)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:744)
Caused by: org.springframework.beans.factory.BeanCreationException: Could not autowire field: private org.springframework.integration.sftp.session.DefaultSftpSessionFactory com.tcs.iux.ieg.sftp.dynamic.SftpOutBoundDsl.sftpSessionFactory; nested exception is java.lang.IllegalArgumentException: Could not resolve placeholder 'outbound.host' in string value "${outbound.host}"
at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor$AutowiredFieldElement.inject(AutowiredAnnotationBeanPostProcessor.java:555)
at org.springframework.beans.factory.annotation.InjectionMetadata.inject(InjectionMetadata.java:87)
at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor.postProcessPropertyValues(AutowiredAnnotationBeanPostProcessor.java:331)
... 22 more
Caused by: java.lang.IllegalArgumentException: Could not resolve placeholder 'outbound.host' in string value "${outbound.host}"
at org.springframework.util.PropertyPlaceholderHelper.parseStringValue(PropertyPlaceholderHelper.java:174)
at org.springframework.util.PropertyPlaceholderHelper.replacePlaceholders(PropertyPlaceholderHelper.java:126)
at org.springframework.core.env.AbstractPropertyResolver.doResolvePlaceholders(AbstractPropertyResolver.java:204)
at org.springframework.core.env.AbstractPropertyResolver.resolveRequiredPlaceholders(AbstractPropertyResolver.java:178)
at org.springframework.context.support.PropertySourcesPlaceholderConfigurer$2.resolveStringValue(PropertySourcesPlaceholderConfigurer.java:175)
at org.springframework.beans.factory.support.AbstractBeanFactory.resolveEmbeddedValue(AbstractBeanFactory.java:800)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:917)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:904)
at org.springframework.beans.factory.support.ConstructorResolver.resolveAutowiredArgument(ConstructorResolver.java:815)
at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:743)
at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:466)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateUsingFactoryMethod(AbstractAutowireCapableBeanFactory.java:1113)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1008)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:505)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:476)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:302)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:229)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:298)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:193)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.findAutowireCandidates(DefaultListableBeanFactory.java:1088)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1006)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:904)
at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor$AutowiredFieldElement.inject(AutowiredAnnotationBeanPostProcessor.java:527)
... 24 more
It is currently not supported.
We have an open JIRA to add support for dynamic server selection, but it's unlikely to be done in time for the upcoming 4.2 release.
You could work around it by writing your own custom delegating session factory that uses some criteria (e.g. a ThreadLocal) to determine which delegate factory to use.
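A rough sketch of such a delegating factory (my own illustration, not code from the answer; for SFTP the SessionFactory generic parameter is JSch's ChannelSftp.LsEntry):
import org.springframework.integration.file.remote.session.Session;
import org.springframework.integration.file.remote.session.SessionFactory;
import com.jcraft.jsch.ChannelSftp.LsEntry;

public class ThreadLocalDelegatingSessionFactory implements SessionFactory<LsEntry> {

    private final ThreadLocal<SessionFactory<LsEntry>> delegate = new ThreadLocal<SessionFactory<LsEntry>>();

    // select the target server for the current thread before the flow runs
    public void setDelegate(SessionFactory<LsEntry> factory) {
        this.delegate.set(factory);
    }

    @Override
    public Session<LsEntry> getSession() {
        return this.delegate.get().getSession();
    }
}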
EDIT:
As with the XML, you need a PropertySourcesPlaceholderConfigurer bean.
You should also use factory-method injection, because the @Configuration class is created too early to have the @Value injected...
@Configuration
public class FooConfig {

    @Bean
    public DefaultSftpSessionFactory factory(
            @Value("${inbound.host}") String host,
            @Value("${inbound.port}") int port) {
        DefaultSftpSessionFactory sf = new DefaultSftpSessionFactory();
        sf.setHost(host);
        sf.setPort(port);
        return sf;
    }

    @Bean
    public PropertySourcesPlaceholderConfigurer configurer() {
        return new PropertySourcesPlaceholderConfigurer();
    }
}
And the corresponding test, which supplies the properties programmatically:
public class Testing {

    @Test
    public void test() {
        AnnotationConfigApplicationContext context = new AnnotationConfigApplicationContext();
        context.register(FooConfig.class);
        StandardEnvironment env = new StandardEnvironment();
        Properties props = new Properties();
        props.setProperty("inbound.host", "bar");
        props.setProperty("inbound.port", "23");
        PropertiesPropertySource pps = new PropertiesPropertySource("sftpprop", props);
        env.getPropertySources().addLast(pps);
        context.setEnvironment(env);
        context.refresh();
        DefaultSftpSessionFactory sessionFactory = context.getBean(DefaultSftpSessionFactory.class);
        assertEquals("bar", TestUtils.getPropertyValue(sessionFactory, "host"));
        context.close();
    }
}
By the way, the delegating session factory will be in 4.2 after all.
EDIT2:
You can avoid the early instantiation of the config class and use global @Value injection, as long as you make the PSPC bean static...
@Configuration
public class FooConfig {

    @Value("${foo}")
    public String foo;

    @Bean
    public String earlyFoo() {
        return this.foo;
    }

    @Bean
    public String foo(@Value("${foo}") String foo) {
        return foo;
    }

    @Bean
    public static PropertySourcesPlaceholderConfigurer configurer() {
        return new PropertySourcesPlaceholderConfigurer();
    }
}
In this case, earlyFoo is populated as expected.

Are spring-data-redis connections not properly released when transaction support is enabled?

In our Spring 4 project we would like to have database transactions that involve both Redis and Hibernate: whenever Hibernate fails, for example due to optimistic locking, the Redis transaction should be aborted as well.
This seems to work for:
single-threaded transaction execution,
multi-threaded transaction execution, as long as the transaction includes only a single Redis call,
multi-threaded transaction execution with multiple Redis calls, if Hibernate is excluded from our configuration.
As soon as a transaction includes multiple Redis calls and Hibernate is configured to take part in the transactions, there seems to be a problem with connection binding and multithreading: threads get stuck at RedisConnectionUtils.bindConnection(), probably because the JedisPool runs out of connections.
This can be reproduced as follows.
@Service
public class TransactionalService {

    @Autowired
    @Qualifier("redisTemplate")
    private RedisTemplate<String, Object> redisTemplate;

    @Transactional
    public void processTask(int i) {
        redisTemplate.convertAndSend("testChannel", new Message());
        redisTemplate.convertAndSend("testChannel", new Message());
    }
}
We use a ThreadPoolTaskExecutor with a core pool size of 50 to simulate multithreaded transactions.
@Service
public class TaskRunnerService {

    @Autowired
    private TaskExecutor taskExecutor;

    @Autowired
    private TransactionalService transactionalService;

    public void runTasks() {
        for (int i = 0; i < 100; i++) {
            final int j = i;
            taskExecutor.execute(new Runnable() {
                @Override
                public void run() {
                    transactionalService.processTask(j);
                }
            });
        }
    }
}
Running this results in all taskExecutor threads hanging in JedisPool.getResource():
"taskExecutor-1" - Thread t#18
java.lang.Thread.State: WAITING
at sun.misc.Unsafe.park(Native Method)
- parking to wait for <1b83c92c> (a java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2039)
at org.apache.commons.pool2.impl.LinkedBlockingDeque.takeFirst(LinkedBlockingDeque.java:524)
at org.apache.commons.pool2.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:438)
at org.apache.commons.pool2.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:361)
at redis.clients.util.Pool.getResource(Pool.java:40)
at redis.clients.jedis.JedisPool.getResource(JedisPool.java:84)
at redis.clients.jedis.JedisPool.getResource(JedisPool.java:10)
at org.springframework.data.redis.connection.jedis.JedisConnectionFactory.fetchJedisConnector(JedisConnectionFactory.java:90)
at org.springframework.data.redis.connection.jedis.JedisConnectionFactory.getConnection(JedisConnectionFactory.java:143)
at org.springframework.data.redis.connection.jedis.JedisConnectionFactory.getConnection(JedisConnectionFactory.java:41)
at org.springframework.data.redis.core.RedisConnectionUtils.doGetConnection(RedisConnectionUtils.java:128)
at org.springframework.data.redis.core.RedisConnectionUtils.bindConnection(RedisConnectionUtils.java:66)
at org.springframework.data.redis.core.RedisTemplate.execute(RedisTemplate.java:175)
at org.springframework.data.redis.core.RedisTemplate.execute(RedisTemplate.java:152)
at org.springframework.data.redis.core.RedisTemplate.convertAndSend(RedisTemplate.java:675)
at test.TransactionalService.processTask(TransactionalService.java:23)
at test.TransactionalService$$FastClassBySpringCGLIB$$9b3de279.invoke(<generated>)
at org.springframework.cglib.proxy.MethodProxy.invoke(MethodProxy.java:204)
at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.invokeJoinpoint(CglibAopProxy.java:708)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157)
at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:98)
at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:262)
at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:95)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:644)
at test.TransactionalService$$EnhancerBySpringCGLIB$$a1b3ba03.processTask(<generated>)
at test.TaskRunnerService$1.run(TaskRunnerService.java:28)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Locked ownable synchronizers:
- locked <7d528cf7> (a java.util.concurrent.ThreadPoolExecutor$Worker)
Redis Config
@Configuration
public class RedisConfig {

    @Bean
    public JedisConnectionFactory jedisConnectionFactory() {
        JedisConnectionFactory jedisConnectionFactory = new JedisConnectionFactory();
        jedisConnectionFactory.setPoolConfig(new JedisPoolConfig());
        return jedisConnectionFactory;
    }

    @Bean
    public Jackson2JsonRedisSerializer<Object> jackson2JsonRedisSerializer() {
        Jackson2JsonRedisSerializer jackson2JsonRedisSerializer = new Jackson2JsonRedisSerializer(Object.class);
        jackson2JsonRedisSerializer.setObjectMapper(objectMapper());
        return jackson2JsonRedisSerializer;
    }

    @Bean
    public StringRedisSerializer stringRedisSerializer() {
        return new StringRedisSerializer();
    }

    @Bean
    public RedisTemplate<String, Object> redisTemplate() {
        RedisTemplate<String, Object> redisTemplate = new RedisTemplate();
        redisTemplate.setConnectionFactory(jedisConnectionFactory());
        redisTemplate.setKeySerializer(stringRedisSerializer());
        redisTemplate.setValueSerializer(jackson2JsonRedisSerializer());
        redisTemplate.setEnableTransactionSupport(true);
        return redisTemplate;
    }

    @Bean
    public ObjectMapper objectMapper() {
        ObjectMapper objectMapper = new ObjectMapper();
        objectMapper.setVisibility(PropertyAccessor.ALL, JsonAutoDetect.Visibility.ANY);
        objectMapper.enableDefaultTyping(ObjectMapper.DefaultTyping.NON_FINAL);
        return objectMapper;
    }
}
Hibernate Config
@EnableTransactionManagement
@Configuration
public class HibernateConfig {

    @Bean
    public LocalContainerEntityManagerFactoryBean admin() {
        LocalContainerEntityManagerFactoryBean entityManagerFactoryBean = new LocalContainerEntityManagerFactoryBean();
        entityManagerFactoryBean.setJpaVendorAdapter(new HibernateJpaVendorAdapter());
        entityManagerFactoryBean.setPersistenceUnitName("test");
        return entityManagerFactoryBean;
    }

    @Bean
    public JpaTransactionManager transactionManager(
            @Qualifier("admin") LocalContainerEntityManagerFactoryBean entityManagerFactoryBean) {
        JpaTransactionManager transactionManager = new JpaTransactionManager();
        transactionManager.setEntityManagerFactory(entityManagerFactoryBean.getObject());
        transactionManager.setDataSource(entityManagerFactoryBean.getDataSource());
        return transactionManager;
    }
}
Is this a bug in spring-data-redis or is something wrong in our configuration?
I found your question (coincidentally) right before I hit the exact same issue using opsForHash and putting many keys. A thread dump confirmed it.
What helped get me going was to increase the connection pool in my JedisPoolConfig. I set it as follows, to 128, and that got me on my way again.
@Bean
JedisPoolConfig jedisPoolConfig() {
    JedisPoolConfig jedisPoolConfig = new JedisPoolConfig();
    jedisPoolConfig.setMaxTotal(128);
    return jedisPoolConfig;
}
I assume the pool was too small in my case and all the connections were in use by my transactions, so the remaining threads were waiting indefinitely. Setting maxTotal to 128 allowed me to continue. Try setting your config to a maxTotal that makes sense for your application.
I had a very similar problem, but bumping maxTotal bothered me if the connections really weren't being released. Instead, I had some code that rapidly did a get and then a set. I put this in a SessionCallback and it behaved much better. Hope that helps.
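For illustration, a sketch of that approach (my reconstruction, not the poster's code; the key name is made up): a SessionCallback runs all of its Redis calls on a single borrowed connection instead of binding a fresh one per operation:
redisTemplate.execute(new SessionCallback<Object>() {
    @Override
    @SuppressWarnings({"unchecked", "rawtypes"})
    public Object execute(RedisOperations operations) {
        // the get and the set reuse the same connection for the whole session
        Object value = operations.opsForValue().get("someKey");
        operations.opsForValue().set("someKey", value);
        return null;
    }
});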
