I have a project built with Spring Roo 1.5 (MySQL with Hibernate). I made a class that extends Thread because I need to run some operations asynchronously, but when I try to read a property of a persistent class from inside that thread an exception occurs. This only happens when the call is made from the Thread class...
My entity class:
@RooJavaBean
@RooToString
@RooEntity
public class Consulta {

    private String nombre;

    @OneToMany(cascade = CascadeType.ALL)
    private List<DetalleConsulta> detalleConsulta;
}
My thread:
public class ThreadIngresarConsulta extends Thread {

    private Long idConsulta;

    public ThreadIngresarConsulta(Long idConsulta) {
        super("ThreadIngresarConsulta");
        this.idConsulta = idConsulta;
    }

    @Override
    public void run() {
        try {
            Consulta consulta = Consulta.findConsulta(idConsulta);
            List<DetalleConsulta> lista = consulta.getDetalleConsulta();
        } catch (Exception e) {
            System.err.println(e.getMessage());
        }
    }
}
ApplicationContext (generated by Roo):
...
<bean class="org.apache.commons.dbcp.BasicDataSource" destroy-method="close" id="dataSource">
<property name="driverClassName" value="${database.driverClassName}"/>
<property name="url" value="${database.url}"/>
<property name="username" value="${database.username}"/>
<property name="password" value="${database.password}"/>
<property name="validationQuery" value="SELECT 1"/>
</bean>
<bean class="org.springframework.orm.jpa.JpaTransactionManager" id="transactionManager">
<property name="entityManagerFactory" ref="entityManagerFactory"/>
</bean>
<tx:annotation-driven mode="aspectj" transaction-manager="transactionManager"/>
<bean class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean" id="entityManagerFactory">
<property name="persistenceUnitName" value="persistenceUnit"/>
<property name="dataSource" ref="dataSource"/>
</bean>
And this is the exception:
2011-12-05 18:49:10,015 [ThreadIngresarConsulta] ERROR org.hibernate.LazyInitializationException - failed to lazily initialize a collection of role: com.core.Consulta.detalleConsulta, no session or session was closed
org.hibernate.LazyInitializationException: failed to lazily initialize a collection of role: com.core.Consulta.detalleConsulta, no session or session was closed
Has anyone tried to access a JPA entity from within a thread?
Try putting an @Transactional annotation on the run method of your thread. If that doesn't work, move the two lines into a separate method and add @Transactional to that method.
public class ThreadIngresarConsulta extends Thread {

    @Override
    public void run() {
        doProcess();
    }

    @Transactional
    public void doProcess() {
        try {
            Consulta consulta = Consulta.findConsulta(idConsulta);
            List<DetalleConsulta> lista = consulta.getDetalleConsulta();
        } catch (Exception e) {
            System.err.println(e.getMessage());
        }
    }
}
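If the annotation does not take effect (the thread is created with new, so annotation-driven transactions only reach it if the AspectJ weaving implied by mode="aspectj" actually covers the class), a programmatic transaction is a possible fallback. A minimal sketch, assuming a TransactionTemplate built from the existing transactionManager is passed into the thread:

import org.springframework.transaction.TransactionStatus;
import org.springframework.transaction.support.TransactionCallbackWithoutResult;
import org.springframework.transaction.support.TransactionTemplate;

public class ThreadIngresarConsulta extends Thread {

    private final Long idConsulta;
    private final TransactionTemplate transactionTemplate; // e.g. new TransactionTemplate(transactionManager)

    public ThreadIngresarConsulta(Long idConsulta, TransactionTemplate transactionTemplate) {
        super("ThreadIngresarConsulta");
        this.idConsulta = idConsulta;
        this.transactionTemplate = transactionTemplate;
    }

    @Override
    public void run() {
        // Everything inside doInTransactionWithoutResult runs with an open persistence context,
        // so the lazy collection can be initialized without a LazyInitializationException.
        transactionTemplate.execute(new TransactionCallbackWithoutResult() {
            @Override
            protected void doInTransactionWithoutResult(TransactionStatus status) {
                Consulta consulta = Consulta.findConsulta(idConsulta);
                consulta.getDetalleConsulta().size(); // force initialization while the session is open
            }
        });
    }
}

Either way, the point is the same: the lazy collection has to be touched while a transaction (and therefore a Hibernate session) is open.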
Is it possible to create a date condition for a promotion in hybris?
I am trying to implement this. My first problem is how to map the parameter value:
<bean id="dateRuleParameterValueMapperDefinition" class="de.hybris.platform.ruleengineservices.rule.strategies.impl.RuleParameterValueMapperDefinition">
<property name="mapper" ref="dateRuleParameterValueMapper" />
<property name="type" value="java.util.Date" />
</bean>
With this mapping I get an exception saying that the type is not supported (Caused by: de.hybris.platform.ruleengineservices.rule.strategies.RuleParameterValueMapperException).
How can I resolve this error? And is it possible to create a date condition in the RuleConditionTranslator?
hybris version: 6.5
If you want to add a new RuleParameterValueMapperDefinition, you have to implement RuleParameterValueMapper and override the toString and fromString methods in your mapper implementation:
public class MediaTypeRuleParameterValueMapper implements RuleParameterValueMapper<MediaModel>
{
    @Resource
    private MediaService mediaService;

    @Override
    public String toString(final MediaModel mediaModel)
    {
        Preconditions.checkArgument(Objects.nonNull(mediaModel), "mediaModel must not be null!");
        return mediaModel.getCode();
    }

    @Override
    public MediaModel fromString(final String mediaCode)
    {
        try
        {
            return mediaService.getMedia(mediaCode);
        }
        catch (final UnknownIdentifierException e)
        {
            // media not found: log if needed and fall through to return null
        }
        return null;
    }
}
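Note that toString and fromString should round-trip: the string form is what gets stored as the rule parameter value, and fromString is what resolves it back to the model when the rule is processed.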
Now create a custom mapper definition (RuleParameterValueMapperDefinition)
<bean id="mediaTypeRuleParameterValueMapper" class="com.hybris.MediaTypeRuleParameterValueMapper"/>
<bean id="mediaTypeRuleParameterValueMapperDefinition" class="de.hybris.platform.ruleengineservices.rule.strategies.impl.RuleParameterValueMapperDefinition">
<property name="mapper" ref="mediaTypeRuleParameterValueMapper"/>
<property name="type" value="ItemType(Media)"/>
</bean>
Now you are ready to use it:
RuleConditionDefinitionParameter.type=ItemType(Media)
I don't believe that you need to map a date in that way; it should be treated the same as a String or an Integer. For reference, look at the way any of the String or Integer parameters are handled.
My basis for saying this is the following from ruleengineservices-spring-rule.xml:
<alias name="defaultRuleParameterSupportedTypes" alias="ruleParameterSupportedTypes" />
<util:set id="defaultRuleParameterSupportedTypes" value-type="java.lang.String">
<value>java.lang.Boolean</value>
<value>java.lang.Character</value>
<value>java.lang.String</value>
<value>java.lang.Byte</value>
<value>java.lang.Short</value>
<value>java.lang.Integer</value>
<value>java.lang.Long</value>
<value>java.lang.Float</value>
<value>java.lang.Double</value>
<value>java.math.BigInteger</value>
<value>java.math.BigDecimal</value>
<value>java.util.Date</value>
<value>java.lang.Enum</value>
<value>java.util.List</value>
<value>java.util.Map</value>
</util:set>
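Since java.util.Date is already in that supported-types set, a date parameter can presumably be declared directly, e.g. RuleConditionDefinitionParameter.type=java.util.Date, without registering any custom RuleParameterValueMapperDefinition for it.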
I am building Docker containers that run a WAR file on Jetty, and I have been varying a few settings to see if performance improves, but nothing so far. Each container has been achieving 7 TPS.
The settings are:
<bean id="cachingConnectionFactory" class="org.springframework.jms.connection.CachingConnectionFactory">
<property name="targetConnectionFactory" ref="MQConnectionFactory" />
<property name="sessionCacheSize" value="10"/>
</bean>
<bean id="requestQueue" class="com.ibm.mq.jms.MQQueue">
<constructor-arg index="0" value="${queuemanager}"/>
<constructor-arg index="1" value="${incoming.queue}"/>
</bean>
<integration:poller id="poller" default="true" fixed-delay="1000" error-channel="errorChannel"/>
How can I increase the number of threads processing messages here?
Also, my connection factory details are shown below:
#Bean(name="DefaultJmsListenerContainerFactory")
public DefaultJmsListenerContainerFactory provideJmsListenerContainerFactory(PlatformTransactionManager transactionManager) {
DefaultJmsListenerContainerFactory factory = new DefaultJmsListenerContainerFactory();
factory.setConnectionFactory(connectionFactory());
factory.setTransactionManager(transactionManager);
factory.setConcurrency(jmsConcurrency);
factory.setCacheLevel(jmsCacheLevel);
factory.setSessionAcknowledgeMode(Session.CLIENT_ACKNOWLEDGE);
factory.setSessionTransacted(true);
return factory;
}
#Bean(name = "txManager")
public PlatformTransactionManager provideTransactionManager() {
return new JmsTransactionManager(connectionFactory());
}
#Bean(name = "JmsTemplate")
public JmsTemplate provideJmsTemplate() {
JmsTemplate jmsTemplate = new JmsTemplate(connectionFactory());
jmsTemplate.setReceiveTimeout(Long.parseLong(env.getRequiredProperty(RECEIVE_TIMEOUT)));
return jmsTemplate;
}
#Bean(name="MQConnectionFactory")
public ConnectionFactory connectionFactory() {
if (factory == null) {
factory = new MQXAConnectionFactory();
try {
factory.setHostName(env.getRequiredProperty(HOST));
factory.setPort(Integer.parseInt(env.getRequiredProperty(PORT)));
factory.setQueueManager(env.getRequiredProperty(QUEUE_MANAGER));
factory.setChannel(env.getRequiredProperty(CHANNEL));
factory.setTransportType(WMQConstants.WMQ_CM_CLIENT);
} catch (JMSException e) {
throw new RuntimeException(e);
}
}
return factory;
}
The initial setting for the concurrency was '1-2' and I changed that to '10-15'. It did not affect performance.
The jmsCacheLevel was set to 3 (consumer caching), but no change there yet either.
Any help is much appreciated.
Cheers
Kris
Answering my own post here. What we found out was that the problem was actually that our database connection pooling was not set up correctly in the first place.
But in order to increase the listener count, I had to change my Spring Integration adapter settings:
<jms:message-driven-channel-adapter id="jmsIn"
destination="requestQueue"
channel="inputJsonConversionChannel"
connection-factory="cachingConnectionFactory"
error-channel="errorChannel"
concurrent-consumers="${jms_adapter_concurrent_consumers}" />
Only when concurrent-consumers is varied does the number of listeners on the queue increase.
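For reference, the same knob exists on the underlying listener container. A rough Java-config sketch (the bean method and its wiring into the adapter, e.g. via its container attribute, are illustrative, not taken from the setup above):

@Bean
public DefaultMessageListenerContainer requestListenerContainer(ConnectionFactory cachingConnectionFactory,
                                                                Queue requestQueue) {
    DefaultMessageListenerContainer container = new DefaultMessageListenerContainer();
    container.setConnectionFactory(cachingConnectionFactory);
    container.setDestination(requestQueue);
    // Equivalent of concurrent-consumers / max-concurrent-consumers on the XML adapter
    container.setConcurrentConsumers(10);
    container.setMaxConcurrentConsumers(15);
    return container;
}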
I am using SftpSimplePatternFileListFilter and SftpPersistentAcceptOnceFileListFilter along with a metadata store, but I noticed that the entries are not being flushed to the file. I never see the flush() method being called on PropertiesPersistingMetadataStore, which is what ultimately invokes the saveMetaData() method.
Here is what my config looks like:
<bean id="compositeFilter" class="org.springframework.integration.file.filters.CompositeFileListFilter">
<constructor-arg>
<list>
<bean class="org.springframework.integration.sftp.filters.SftpSimplePatternFileListFilter">
<constructor-arg value="*.txt" />
</bean>
<bean class="org.springframework.integration.sftp.filters.SftpPersistentAcceptOnceFileListFilter">
<constructor-arg name="store" ref="metadataStore"/>
<constructor-arg value="myapp"/>
</bean>
</list>
</constructor-arg>
</bean>
<bean name="metadataStore" class="org.springframework.integration.metadata.PropertiesPersistingMetadataStore">
<property name="baseDirectory" value="/tmp/"/>
</bean>
By default, PropertiesPersistingMetadataStore flushes to the file on applicationContext destroy:
@Override
public void close() throws IOException {
    flush();
}

@Override
public void flush() {
    saveMetadata();
}

@Override
public void destroy() throws Exception {
    flush();
}
Starting with 4.1.2 you can invoke flush() manually at runtime, e.g. periodically via <task:scheduled-tasks> or from some <int:outbound-channel-adapter>.
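For example, a scheduled flush could look roughly like this (a sketch, assuming annotation-driven scheduling is enabled and the store is injectable by type):

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.integration.metadata.PropertiesPersistingMetadataStore;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@Component
public class MetadataStoreFlusher {

    @Autowired
    private PropertiesPersistingMetadataStore metadataStore;

    // Persist the accepted-files state every minute instead of only on context shutdown
    @Scheduled(fixedRate = 60000)
    public void flushMetadataStore() {
        metadataStore.flush();
    }
}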
Feel free to ask for more information!
How can I be notified with a listener when a Future returned by an @Async-annotated method has finished? The @Async method lives in a Spring service layer, backed by a thread pool defined in the application context, and is called from a ManagedBean. I have tried a p:poll listener whose listener method checks future.isDone(), but it is not very effective because it makes many requests to the server.
Edit: here is the example.
@ManagedBean
@ViewScoped
public class Controller
{
    Future<SomeModel> someFuture;

    @ManagedProperty(value = "#{someService}")
    private SomeService someService;

    private String things;
    private SomeModel someModel;
    private boolean renderSomeModel;

    public void callAsyncMethod()
    {
        someFuture = someService.thingsToBeInsered(things);
        // here make the poll in JSF start
    }

    public void methodAsyncFinished()
    {
        if (someFuture != null)
        {
            if (this.someFuture.isDone())
            {
                renderSomeModel = true;
                // here make the poll stop
            }
        }
    }
}

@Service
public class SomeService
{
    @Async
    public Future<SomeModel> thingsToBeInsered(final String things)
    {
        // Calling the DAO here to build the SomeModel result from 'things'
        SomeModel result = null; // placeholder for the DAO result
        return new AsyncResult<SomeModel>(result);
    }
}
Spring context:
<task:annotation-driven executor="taskExecutor" proxy-target-class="true"/>
<bean id="taskExecutor"
class="org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor">
<property name="corePoolSize" value="${thread_pool.corePoolSize}" />
<property name="maxPoolSize" value="${thread_pool.maxPoolSize}" />
<property name="WaitForTasksToCompleteOnShutdown" value="${thread_pool.waitForTasksToCompleteOnShutdown}" />
</bean>
JSF:
<p:poll autoStart="false" widgetVar="asyncGenPoll" interval="6" listener="#{controller.methodAsyncFinished}" update="resultPanel"/>
I am getting the following exception when I run a JUnit test that exercises a Spring MVC controller which launches a Spring Batch job. The job consists of two tasklets: the first reads from a file and writes to the DB, the second updates the DB. Both tasklets use the same DB. As far as I can see, the exception tells me that the data source is closed, yet in the DB I can see that the first tasklet has executed while the second one has not.
Can you suggest why the data source is closed during the second tasklet (the update to the DB)?
The job executes both tasklets when I call the controller from the browser.
ERROR [taskExecutor-1] (AbstractJob.java:306) - Encountered fatal error executing job
org.springframework.batch.core.JobExecutionException: Flow execution ended unexpectedly
at org.springframework.batch.core.job.flow.FlowJob.doExecute(FlowJob.java:141)
at org.springframework.batch.core.job.AbstractJob.execute(AbstractJob.java:281)
at org.springframework.batch.core.launch.support.SimpleJobLauncher$1.run(SimpleJobLauncher.java:120)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Caused by: org.springframework.batch.core.job.flow.FlowExecutionException: Ended flow=writeProductsJob at state=writeProductsJob.readWrite with exception
at org.springframework.batch.core.job.flow.support.SimpleFlow.resume(SimpleFlow.java:152)
at org.springframework.batch.core.job.flow.support.SimpleFlow.start(SimpleFlow.java:124)
at org.springframework.batch.core.job.flow.FlowJob.doExecute(FlowJob.java:135)
... 5 more
Caused by: org.springframework.transaction.CannotCreateTransactionException: Could not open JDBC Connection for transaction; nested exception is java.sql.SQLException: Data source is closed
at org.springframework.jdbc.datasource.DataSourceTransactionManager.doBegin(DataSourceTransactionManager.java:240)
at org.springframework.transaction.support.AbstractPlatformTransactionManager.getTransaction(AbstractPlatformTransactionManager.java:371)
at org.springframework.transaction.interceptor.TransactionAspectSupport.createTransactionIfNecessary(TransactionAspectSupport.java:335)
at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:105)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:202)
at $Proxy27.getStepExecutionCount(Unknown Source)
at org.springframework.batch.core.job.SimpleStepHandler.shouldStart(SimpleStepHandler.java:210)
at org.springframework.batch.core.job.SimpleStepHandler.handleStep(SimpleStepHandler.java:117)
at org.springframework.batch.core.job.flow.JobFlowExecutor.executeStep(JobFlowExecutor.java:61)
at org.springframework.batch.core.job.flow.support.state.StepState.handle(StepState.java:60)
at org.springframework.batch.core.job.flow.support.SimpleFlow.resume(SimpleFlow.java:144)
... 7 more
Caused by: java.sql.SQLException: Data source is closed
at org.apache.commons.dbcp.BasicDataSource.createDataSource(BasicDataSource.java:1362)
at org.apache.commons.dbcp.BasicDataSource.getConnection(BasicDataSource.java:1044)
at org.springframework.jdbc.datasource.DataSourceTransactionManager.doBegin(DataSourceTransactionManager.java:202)
... 18 more
Exception in thread "taskExecutor-1" org.springframework.transaction.CannotCreateTransactionException: Could not open JDBC Connection for transaction; nested exception is java.sql.SQLException: Data source is closed
at org.springframework.jdbc.datasource.DataSourceTransactionManager.doBegin(DataSourceTransactionManager.java:240)
at org.springframework.transaction.support.AbstractPlatformTransactionManager.getTransaction(AbstractPlatformTransactionManager.java:371)
at org.springframework.transaction.interceptor.TransactionAspectSupport.createTransactionIfNecessary(TransactionAspectSupport.java:335)
at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:105)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:202)
at $Proxy27.update(Unknown Source)
at org.springframework.batch.core.job.AbstractJob.execute(AbstractJob.java:329)
at org.springframework.batch.core.launch.support.SimpleJobLauncher$1.run(SimpleJobLauncher.java:120)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Caused by: java.sql.SQLException: Data source is closed
at org.apache.commons.dbcp.BasicDataSource.createDataSource(BasicDataSource.java:1362)
at org.apache.commons.dbcp.BasicDataSource.getConnection(BasicDataSource.java:1044)
at org.springframework.jdbc.datasource.DataSourceTransactionManager.doBegin(DataSourceTransactionManager.java:202)
... 11 more
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.628 sec
UPDATE:
Test Class:
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = {"classpath:applicationContext-test.xml"})
public class UploadFileValidatorTest {

    @Autowired
    private ApplicationContext applicationContext;

    private MockMultipartHttpServletRequest request;
    private MockHttpServletResponse response;

    @Autowired
    private MyController controller;

    @Autowired
    private MessageSource messageSource;

    @Before
    public void setUp() {
        request = new MockMultipartHttpServletRequest();
        response = new MockHttpServletResponse();
    }

    @Test
    @DirtiesContext
    public void testDoSomething() throws Exception {
        DiskFileItem fileItem = null;
        final File TEST_FILE = applicationContext.getResource("classpath:csv_example.txt").getFile();
        try {
            fileItem = (DiskFileItem) new DiskFileItemFactory().createItem("fileData", "text/plain", true, TEST_FILE.getName());
            InputStream input = new FileInputStream(TEST_FILE);
            OutputStream os = fileItem.getOutputStream();
            int ret = input.read();
            while (ret != -1) {
                os.write(ret);
                ret = input.read();
            }
            os.flush();
            System.out.println("diskFileItem.getString() = " + fileItem.getString());
        } catch (Exception e) {
            e.printStackTrace();
        }

        // wrap the file item and build the multipart request
        MultipartFile multipartFile = new CommonsMultipartFile(fileItem);
        request.addFile(multipartFile);
        request.addParameter("email", "email@email.com");
        request.setRequestURI("/book/upload.html");

        final ModelAndView mav = new AnnotationMethodHandlerAdapter().handle(request, response, controller);
        BindingResult bindException = (BindingResult) mav.getModel().get(BindingResult.MODEL_KEY_PREFIX + "uploadFile");
        for (Object object : bindException.getAllErrors()) {
            if (object instanceof FieldError) {
                FieldError fieldError = (FieldError) object;
                assertEquals(fieldError.getField(), "fileData");
                System.out.println(messageSource.getMessage((FieldError) object, null));
            }
        }
    }
}
Batch config:
<bean id="transactionManager" class="org.springframework.jdbc.datasource.DataSourceTransactionManager">
<property name="dataSource" ref="dataSource" />
</bean>
<bean id="jobRepository" class="org.springframework.batch.core.repository.support.MapJobRepositoryFactoryBean">
<property name="transactionManager" ref="transactionManager" />
</bean>
<bean id="jobLauncher" class="org.springframework.batch.core.launch.support.SimpleJobLauncher">
<property name="jobRepository" ref="jobRepository" />
<property name="taskExecutor" ref="taskExecutor"/>
</bean>
<bean id="taskExecutor"
class="org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor">
<property name="corePoolSize" value="5" />
<property name="maxPoolSize" value="5" />
</bean>
UPDATE 2:
I have tried to add the following code at the end of the testDoSomething() test method:
ThreadGroup threadGroup = Thread.currentThread().getThreadGroup();
int activeCount = threadGroup.activeCount();
Thread[] list = new Thread[activeCount];
threadGroup.enumerate(list);
for (Thread thread : list) {
    if (thread.getName().startsWith("taskExecutor")) {
        System.out.println("WAITING FOR THREAD: " + thread.getName());
        try {
            thread.join();
        } catch (InterruptedException ignore) {}
    }
}
Now the whole job is executed, but testDoSomething() never ends; it seems to wait forever for the taskExecutor thread to die. Any idea why the taskExecutor thread never dies?
I had a similar problem in my tests.
The thing is that your batch job is launched asynchronously, so the test finishes while the job is still running (or about to run), and by then the context is already closed (and all beans with it, including the DataSource). You need to use a synchronous JobLauncher in your test.
An example defining two launchers:
<beans:bean id="jobLauncher" class="org.springframework.batch.core.launch.support.SimpleJobLauncher">
<beans:property name="jobRepository" ref="jobRepository"/>
</beans:bean>
<beans:bean id="jobLauncher_asynch"
class="org.springframework.batch.core.launch.support.SimpleJobLauncher">
<beans:property name="jobRepository" ref="jobRepository" />
<beans:property name="taskExecutor" ref="taskExecutor"/>
</beans:bean>
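With the synchronous jobLauncher (no taskExecutor), run() blocks until the job finishes, so the context is still open for both tasklets and there is no need to join on taskExecutor threads. A rough test sketch, assuming the job bean is named writeProductsJob:

@Autowired
private JobLauncher jobLauncher; // the synchronous launcher from the test context

@Autowired
private Job writeProductsJob; // assumed job bean name

@Test
public void runsJobSynchronously() throws Exception {
    JobParameters params = new JobParametersBuilder()
            .addLong("run.id", System.currentTimeMillis())
            .toJobParameters();
    // Blocks until the job completes, so assertions run against the finished execution
    JobExecution execution = jobLauncher.run(writeProductsJob, params);
    assertEquals(BatchStatus.COMPLETED, execution.getStatus());
}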