Trying to transfer messages from RabbitMQ (<int-amqp:inbound-channel-adapter>) to MQSeries (<int-jms:outbound-channel-adapter>). This works fine.
However, some producers on MQSeries use the native IBM MQ (non-JMS) classes, like this:
MQMessage messageMQ = new MQMessage();
messageMQ.format = " ";
messageMQ.persistence = 1;
messageMQ.correlationId = MQ_MESSAGE_CORRELATION_ID;
messageMQ.write(message.getMessageData());
MQPutMessageOptions putMessageOption = new MQPutMessageOptions();
putMessageOption.options = 8194;
MQQueue queue = openQueue(destinataire, 8208);
queue.put(messageMQ, putMessageOption);
I tried using a transformer between AMQP and JMS, like this:
<int:transformer id="testTransformer" ref="testTransformerBean" input-channel="fromRabbit"
method="transform" output-channel="toJms"/>
public MQMessage transform(Message<?> msg) throws Exception {
MQMessage result = new MQMessage();
result.format = " ";
result.persistence = 1;
result.correlationId = MQC.MQCI_NONE;
String test = "message to send ";
result.write(test.getBytes());
return result;
}
What is the type of the object stored in msg.getPayload()? How can I convert it into a String?
Implementing this method, I get an exception because the outbound adapter needs a JMS message, not a com.ibm.mq.MQMessage:
Cannot convert object of type [com.ibm.mq.MQMessage] to JMS message
Is this approach correct?
Or should I remove the outbound channel adapter and use a service activator with IBM-specific code instead?
Thanks for your help
Regards
Edit following Artem's answer
Here is the JMS outbound configuration:
<bean id="jmsConnectionFactory" class="com.ibm.mq.jms.MQConnectionFactory">
<property name="queueManager" value="${queueManager}" />
<property name="hostName" value="${hostName}" />
<property name="port" value="${port}" />
<property name="channel" value="${channelName}" />
<property name="transportType" value="1" />
</bean>
<bean id="jmsQueue" class="com.ibm.mq.jms.MQQueue" depends-on="jmsConnectionFactory">
<property name="baseQueueManagerName" value="${queueManager}" />
<property name="baseQueueName" value="${queueName}" />
<property name="targetClient" value="1" />
</bean>
<bean id="jmsConnectionFactory_cred"
class="org.springframework.jms.connection.UserCredentialsConnectionFactoryAdapter">
<property name="targetConnectionFactory" ref="jmsConnectionFactory" />
<property name="username" value="${user}"/>
<property name="password" value="${password}"/>
</bean>
<bean id="connectionFactoryCaching"
class="org.springframework.jms.connection.CachingConnectionFactory">
<property name="targetConnectionFactory" ref="jmsConnectionFactory_cred" />
<property name="sessionCacheSize" value="${BRIDGE_MQ_OUTBOUND_SESSION_CACHE}" />
</bean>
<bean class="org.springframework.integration.handler.advice.ExpressionEvaluatingRequestHandlerAdvice" id="requestHandler">
<property name="trapException" value="false"/>
<property name="onFailureExpressionString" value="#this"/>
<property name="failureChannel" ref="processChannel1"/>
</bean>
<int-jms:outbound-channel-adapter channel="channelRmqMQ"
id="jmsOut" destination="jmsQueue" connection-factory="connectionFactoryCaching" delivery-persistent="true"
explicit-qos-enabled="true" session-transacted="true" >
<int-jms:request-handler-advice-chain>
<ref bean="requestHandler" />
</int-jms:request-handler-advice-chain>
</int-jms:outbound-channel-adapter>
If your AMQP message comes with the text/* contentType, then its body is converted to string automatically by the out-of-the-box SimpleMessageConverter in the AmqpInboundChannelAdapter:
if (contentType != null && contentType.startsWith("text")) {
String encoding = properties.getContentEncoding();
if (encoding == null) {
encoding = this.defaultCharset;
}
try {
content = new String(message.getBody(), encoding);
} catch (UnsupportedEncodingException var8) {
throw new MessageConversionException("failed to convert text-based Message content", var8);
}
}
Otherwise you need to place a simple transformer in between to convert the byte[] to a String:
<int:object-to-string-transformer/>
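If you would rather keep a custom transformer bean, the conversion itself is just a byte[]-to-String decode. A minimal sketch, assuming UTF-8 content (the class and method names here are illustrative, not from the original configuration):
import java.nio.charset.StandardCharsets;

public class BytesToStringTransformer {

    // The AMQP inbound adapter delivers the message body as a byte[] payload.
    public String transform(byte[] payload) {
        return new String(payload, StandardCharsets.UTF_8);
    }
}
Wired the same way as the testTransformer above, this hands the <int-jms:outbound-channel-adapter> a plain String payload, which the default JMS message converter maps to a TextMessage.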
The <int-jms:outbound-channel-adapter> is exactly for JMS protocol interaction, so your MQMessage is not going to be accepted there. That's why you get that Cannot convert object of type [com.ibm.mq.MQMessage] to JMS message exception.
Yes, you can use the IBM MQ API directly in some custom service activator; however, I would suggest taking a look at the JMS-to-MQ bridge on IBM WebSphere. Then you only need to configure an appropriate connection factory and use it from the <int-jms:outbound-channel-adapter>:
<jee:jndi-lookup id="jndiMqConnectionFactory" jndi-name="${mqConnectionFactory}"/>
<bean id="jmsQueueConnectionFactory"
class="org.springframework.jms.connection.UserCredentialsConnectionFactoryAdapter">
<property name="targetConnectionFactory" ref="jndiMqConnectionFactory"/>
<property name="username" value="${mqLogin}"/>
<property name="password" value="${mqPassword}"/>
</bean>
<jee:jndi-lookup id="myMqQueue" jndi-name="queue/myMqQueue"/>
<bean id="mqQueueJmsTemplate" class="org.springframework.jms.core.JmsTemplate">
<property name="connectionFactory" ref="jmsQueueConnectionFactory"/>
<property name="defaultDestination" ref="myMqQueue"/>
</bean>
<jms:outbound-channel-adapter channel="myMqChannel" jms-template="mqQueueJmsTemplate"/>
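If you do decide to go the direct IBM MQ API route mentioned above instead of the bridge, the service activator can delegate to a plain POJO. The following is only a rough sketch: the queue manager name, queue name and connection details are hypothetical, and the numeric open/put options are simply carried over from the producer code in the question:
import java.io.IOException;

import com.ibm.mq.MQException;
import com.ibm.mq.MQMessage;
import com.ibm.mq.MQPutMessageOptions;
import com.ibm.mq.MQQueue;
import com.ibm.mq.MQQueueManager;

public class MqPutService {

    // Hypothetical names; host, channel and port would normally be supplied via
    // MQEnvironment or a properties-based MQQueueManager constructor.
    private static final String QUEUE_MANAGER = "QMGR_NAME";
    private static final String QUEUE_NAME = "TARGET_QUEUE";

    public void send(String payload) throws MQException, IOException {
        MQQueueManager queueManager = new MQQueueManager(QUEUE_MANAGER);
        try {
            MQQueue queue = queueManager.accessQueue(QUEUE_NAME, 8208); // same open options as the producer code
            MQMessage messageMQ = new MQMessage();
            messageMQ.format = " ";
            messageMQ.persistence = 1;
            messageMQ.write(payload.getBytes());
            MQPutMessageOptions pmo = new MQPutMessageOptions();
            pmo.options = 8194; // same put options as the producer code
            queue.put(messageMQ, pmo);
            queue.close();
        } finally {
            queueManager.disconnect();
        }
    }
}
A service activator referencing such a bean would replace the JMS outbound adapter entirely, which is why the bridge approach above is usually the simpler option.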
I am trying to integrate my application with JMS Queues (using ActiveMQ).
My requirement is that once a message has been read from the input queue, it should not be removed from the queue; rather, I want to send an acknowledgement that the application has received it.
I am not able to understand the exact usage of property "acknowledge".
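For reference, the acknowledge setting corresponds to the standard javax.jms.Session acknowledge modes, and with an externally defined listener container (as in the configuration below) it is set on the container itself. This is only an illustrative sketch of where the setting lives, with a hypothetical broker URL; destination and listener wiring are omitted:
import javax.jms.Session;

import org.apache.activemq.ActiveMQConnectionFactory;
import org.springframework.jms.listener.DefaultMessageListenerContainer;

public class ListenerContainerSketch {

    public DefaultMessageListenerContainer container() {
        DefaultMessageListenerContainer container = new DefaultMessageListenerContainer();
        container.setConnectionFactory(new ActiveMQConnectionFactory("tcp://localhost:61616")); // hypothetical broker URL
        // AUTO_ACKNOWLEDGE (the default), CLIENT_ACKNOWLEDGE, DUPS_OK_ACKNOWLEDGE,
        // or a transacted session are the standard JMS choices.
        container.setSessionAcknowledgeMode(Session.CLIENT_ACKNOWLEDGE);
        return container;
    }
}
Note that none of these modes keeps a consumed message on the queue: the broker removes the message once it is acknowledged (or the transaction commits).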
My code is as below:
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:aop="http://www.springframework.org/schema/aop"
xmlns:context="http://www.springframework.org/schema/context"
xmlns:int-jms="http://www.springframework.org/schema/integration/jms"
xmlns:jms="http://www.springframework.org/schema/jms"
xmlns:int="http://www.springframework.org/schema/integration"
xsi:schemaLocation="http://www.springframework.org/schema/aop http://www.springframework.org/schema/aop/spring-aop.xsd
http://www.springframework.org/schema/integration http://www.springframework.org/schema/integration/spring-integration.xsd
http://www.springframework.org/schema/jms http://www.springframework.org/schema/jms/spring-jms.xsd
http://www.springframework.org/schema/integration/jms http://www.springframework.org/schema/integration/jms/spring-integration-jms.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context.xsd">
<!-- Component scan to find all Spring components -->
<context:component-scan base-package="com.poc.springinteg._7" />
<!-- -->
<bean id="remoteJndiTemplate" class="org.springframework.jndi.JndiTemplate" lazy-init="false">
<property name="environment">
<props>
<prop key="java.naming.provider.url">tcp://localhost:61616</prop>
<prop key="java.naming.factory.url.pkgs">org.apache.activemq.jndi</prop>
<prop key="java.naming.factory.initial">org.apache.activemq.jndi.ActiveMQInitialContextFactory</prop>
<prop key="connectionFactoryNames">DefaultActiveMQConnectionFactory,QueueConnectionFactory</prop>
<prop key="queue.SendReceiveQueue">org.apache.geronimo.configs/activemq-ra/JCAAdminObject/SendReceiveQueue</prop>
<prop key="queue.SendQueue">org.apache.geronimo.configs/activemq-ra/JCAAdminObject/MDBTransferBeanOutQueue</prop>
</props>
</property>
</bean>
<bean id="remoteConnectionFactory" class="org.springframework.jndi.JndiObjectFactoryBean" lazy-init="false">
<property name="jndiTemplate" ref="remoteJndiTemplate"/>
<property name="jndiName" value="QueueConnectionFactory"/>
<property name="lookupOnStartup" value="true" />
<property name="proxyInterface" value="javax.jms.ConnectionFactory" />
</bean>
<!-- writing queue -->
<bean id="destinationqueue"
class="org.apache.activemq.command.ActiveMQQueue">
<constructor-arg index="0">
<value>OutputQueue_7</value>
</constructor-arg>
</bean>
<int:channel id="outbound"/>
<int-jms:outbound-channel-adapter id="jmsOut"
channel="outbound"
connection-factory="remoteConnectionFactory"
destination="destinationqueue" />
<!-- reading queue -->
<bean id="sourceQueue" class="org.apache.activemq.command.ActiveMQQueue">
<constructor-arg index="0">
<value>OutputQueue_7</value>
</constructor-arg>
</bean>
<bean id="messageListenerContainer"
class="org.springframework.jms.listener.DefaultMessageListenerContainer">
<property name="connectionFactory" ref="remoteConnectionFactory"/>
<property name="destination" ref="sourceQueue"/>
<property name="maxConcurrentConsumers" value="10"/>
<property name="concurrentConsumers" value="1"/>
<property name="autoStartup" value="true"/>
</bean>
<int:channel id="inbound"/>
<int-jms:message-driven-channel-adapter id="jmsIn"
channel="inbound"
extract-payload="false"
container="messageListenerContainer" />
<int:service-activator input-channel="inbound"
output-channel="outbound"
ref="messageReader"
method="onMessage" />
</beans>
-- Message Reader Class
import javax.jms.JMSException;
import org.springframework.integration.annotation.ServiceActivator;
import org.springframework.messaging.Message;
import org.springframework.stereotype.Component;
#Component("messageReader")
public class MessageReader
{
#ServiceActivator
public void onMessage(Message inboundMessage) {
System.out.println(" -------Message Read Start--------");
System.out.println(inboundMessage.getHeaders());
System.out.println(" -------Message Headers Reading completed--------");
System.out.println("payload-->" + inboundMessage.getPayload().getClass());
String payload = inboundMessage.getPayload().toString();
System.out.println("payload value-->" + payload);
org.apache.activemq.command.ActiveMQTextMessage obj = (org.apache.activemq.command.ActiveMQTextMessage)inboundMessage.getPayload();
System.out.println("Object-->" + obj);
String var = null;
try {
var = obj.getText();
System.out.println("Datastructure-->" + obj.getText());
} catch (JMSException e) {
e.printStackTrace();
}
}
}
---- Message Writer Class
#Component("sendMessage")
public class SendMessage {
#Autowired
private MessageChannel outbound;
public void send(String name)
{
Entity entity = new Entity(1,"anuj");
Message<Entity> message = MessageBuilder.withPayload(entity)
.setHeader("Message_Header1", "Message_Header1_Value")
.setHeader("Message_Header2", "Message_Header2_Value")
.build();
outbound.send(message);
}
}
-- Application main class
public class App {
public static void main( String[] args )
{
ClassPathXmlApplicationContext applicationContext = new ClassPathXmlApplicationContext("classpath:7_applicationContext.xml" );
SendMessage sendMessage = (SendMessage)applicationContext.getBean( "sendMessage", SendMessage.class);
for(int i=0;i<10;i++){
sendMessage.send("This is Message Content");
}
applicationContext.registerShutdownHook();
}
}
I have a module, s3-puller, which pulls files from AWS S3. In production I am facing an issue when I try to create a stream, but on a local single node it works fine; I also set up a 3-node cluster plus 1 admin node locally, and it works fine there too.
Below is my application context:
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:int="http://www.springframework.org/schema/integration"
xmlns:int-aws="http://www.springframework.org/schema/integration/aws"
xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
http://www.springframework.org/schema/integration http://www.springframework.org/schema/integration/spring-integration.xsd
http://www.springframework.org/schema/integration/aws http://www.springframework.org/schema/integration/aws/spring-integration-aws-1.0.xsd">
<int:poller fixed-delay="${fixed-delay}" default="true"/>
<bean id="credentials" class="org.springframework.integration.aws.core.BasicAWSCredentials">
<property name="accessKey" value="${accessKey}"/>
<property name="secretKey" value="${secretKey}"/>
</bean>
<bean
class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer">
<property name="location">
<value>dms-aws-s3-nonprod.properties</value>
</property>
</bean>
<bean id="clientConfiguration" class="com.amazonaws.ClientConfiguration">
<property name="proxyHost" value="${proxyHost}"/>
<property name="proxyPort" value="${proxyPort}"/>
<property name="preemptiveBasicProxyAuth" value="false"/>
</bean>
<bean id="s3Operations" class="org.springframework.integration.aws.s3.core.CustomC1AmazonS3Operations">
<constructor-arg index="0" ref="credentials"/>
<constructor-arg index="1" ref="clientConfiguration"/>
<property name="awsEndpoint" value="s3.amazonaws.com"/>
<property name="temporaryDirectory" value="${temporaryDirectory}"/>
<property name="awsSecurityKey" value="${awsSecurityKey}"/>
</bean>
<!-- aws-endpoint="https://s3.amazonaws.com" -->
<int-aws:s3-inbound-channel-adapter aws-endpoint="s3.amazonaws.com"
bucket="${bucket}"
s3-operations="s3Operations"
credentials-ref="credentials"
file-name-wildcard="${file-name-wildcard}"
remote-directory="${remote-directory}"
channel="splitChannel"
local-directory="${local-directory}"
accept-sub-folders="false"
delete-source-files="true"
archive-bucket="${archive-bucket}"
archive-directory="${archive-directory}">
</int-aws:s3-inbound-channel-adapter>
<int:splitter input-channel="splitChannel" output-channel="output"
expression="T(org.apache.commons.io.FileUtils).lineIterator(payload)"/>
<int:channel id="output"/>
my Application.java
package com.capitalone.api.dms.main;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.builder.SpringApplicationBuilder;
import org.springframework.context.annotation.ImportResource;
@SpringBootApplication
@ImportResource("classpath:config/applicationContext.xml")
public class Application {
public static void main(String[] args) throws Exception {
new SpringApplicationBuilder(Application.class)
.web(false)
.showBanner(false)
.properties("security.basic.enabled=false")
.run(args);
}
}
I am getting the exception below when I try to create a basic stream with these commands:
module upload --file aws.jar --name aws-s3-options --type source
stream create feedTest91 --definition "aws-s3-options | log" --deploy
The exception is:
DeploymentStatus{state=failed,error(s)=org.springframework.beans.factory.BeanDefinitionStoreException:
Invalid bean definition with name 'objectNameProperties' defined in null:
Could not resolve placeholder 'xd.module.sequence' in string value "${xd.module.sequence}";
nested exception is java.lang.IllegalArgumentException: Could not resolve placeholder 'xd.module.sequence' in string value "${xd.module.sequence}"
at org.springframework.beans.factory.config.PlaceholderConfigurerSupport.doProcessProperties(PlaceholderConfigurerSupport.java:211)
at org.springframework.beans.factory.config.PropertyPlaceholderConfigurer.processProperties(PropertyPlaceholderConfigurer.java:222)
at org.springframework.beans.factory.config.PropertyResourceConfigurer.postProcessBeanFactory(PropertyResourceConfigurer.java:86)
at org.springframework.context.support.PostProcessorRegistrationDelegate.invokeBeanFactoryPostProcessors(PostProcessorRegistrationDelegate.java:265)
From the source code I see that the property is loaded by the JMX MBean exporter file of XD and by the Java file below:
https://github.com/spring-projects/spring-xd/blob/6923ee8705bd9c2c58ad73120724b8b87c5ba37d/spring-xd-dirt/src/main/resources/META-INF/spring-xd/plugins/jmx/mbean-exporters.xml
https://github.com/spring-projects/spring-xd/blob/e9ce8e897774722c1e61038817ebd55c5cf0befc/spring-xd-dirt/src/main/java/org/springframework/xd/dirt/plugins/MBeanExportingPlugin.java
Solution:
I am planning to inject them from my s3 module. Is this the right way to do it? Please let me know what the values should be.
<context:mbean-export />
<int-jmx:mbean-export object-naming-strategy="moduleObjectNamingStrategy" />
<util:properties id="objectNameProperties">
<prop key="group">${xd.group.name}</prop>
<prop key="label">${xd.module.label}</prop>
<prop key="type">${xd.module.type}</prop>
<prop key="sequence">${xd.module.sequence}</prop>
</util:properties>
<bean id="moduleObjectNamingStrategy"
class="org.springframework.xd.dirt.module.jmx.ModuleObjectNamingStrategy">
<constructor-arg value="xd.${xd.stream.name:${xd.job.name:}}" />
<constructor-arg ref="objectNameProperties" />
</bean>
That property should be automatically set up by the ModuleInfoPlugin.
This is the second time someone has said that property is missing somehow.
I have opened a JIRA Issue.
In a Spring Batch job I am trying to read a CSV file and want to assign each row to a separate thread for processing. I tried to achieve this using a TaskExecutor, but all the threads pick up the same row at a time. I also tried implementing the concept using a Partitioner, and the same thing happens there. Please see my configuration XML below.
Step Description
<step id="Step2">
<tasklet task-executor="taskExecutor">
<chunk reader="reader" processor="processor" writer="writer" commit-interval="1" skip-limit="1">
</chunk>
</tasklet>
</step>
<bean id="reader" class="org.springframework.batch.item.file.FlatFileItemReader">
<property name="resource" value="file:cvs/user.csv" />
<property name="lineMapper">
<bean class="org.springframework.batch.item.file.mapping.DefaultLineMapper">
<!-- split it -->
<property name="lineTokenizer">
<bean
class="org.springframework.batch.item.file.transform.DelimitedLineTokenizer">
<property name="names" value="userid,customerId,ssoId,flag1,flag2" />
</bean>
</property>
<property name="fieldSetMapper">
<!-- map to an object -->
<bean
class="org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper">
<property name="prototypeBeanName" value="user" />
</bean>
</property>
</bean>
</property>
</bean>
<bean id="taskExecutor" class="org.springframework.core.task.SimpleAsyncTaskExecutor">
<property name="concurrencyLimit" value="4"/>
I have tried different types of task executors, but all of them behave the same way. How can I assign each row to a separate thread?
FlatFileItemReader is not thread-safe. In your case you can try to split the CSV file into smaller CSV files and then use a MultiResourcePartitioner to process each one of them. This can be done in two steps: one for splitting the original file (into, say, 10 smaller files) and the other for processing the split files. This way you won't have any issues, since each file will be processed by one thread.
Example:
<batch:job id="csvsplitandprocess">
<batch:step id="step1" next="step2master">
<batch:tasklet>
<batch:chunk reader="largecsvreader" writer="csvwriter" commit-interval="500">
</batch:chunk>
</batch:tasklet>
</batch:step>
<batch:step id="step2master">
<partition step="step2" partitioner="partitioner">
<handler grid-size="10" task-executor="taskExecutor"/>
</partition>
</batch:step>
</batch:job>
<batch:step id="step2">
<batch:tasklet>
<batch:chunk reader="smallcsvreader" writer="writer" commit-interval="100">
</batch:chunk>
</batch:tasklet>
</batch:step>
<bean id="taskExecutor" class="org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor">
<property name="corePoolSize" value="10" />
<property name="maxPoolSize" value="10" />
</bean>
<bean id="partitioner"
class="org.springframework.batch.core.partition.support.MultiResourcePartitioner">
<property name="resources" value="file:cvs/extracted/*.csv" />
</bean>
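The largecsvreader and csvwriter beans above are left as placeholders. As a sketch of the splitting step, a plain Tasklet (swapped in for the chunk-oriented step1) could cut the original file into the smaller files the partitioner expects; the paths and the lines-per-file count below are assumptions chosen to line up with the resource patterns already shown:
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.repeat.RepeatStatus;

public class CsvSplittingTasklet implements Tasklet {

    private static final int LINES_PER_FILE = 1000; // arbitrary split size

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) throws IOException {
        Path source = Paths.get("cvs/user.csv");      // hypothetical input, matches the reader resource above
        Path targetDir = Paths.get("cvs/extracted");  // matches the partitioner's resources pattern
        Files.createDirectories(targetDir);
        try (BufferedReader reader = Files.newBufferedReader(source, StandardCharsets.UTF_8)) {
            String line;
            int count = 0;
            int fileIndex = 0;
            BufferedWriter writer = null;
            while ((line = reader.readLine()) != null) {
                // Start a new part file every LINES_PER_FILE lines.
                if (count % LINES_PER_FILE == 0) {
                    if (writer != null) {
                        writer.close();
                    }
                    writer = Files.newBufferedWriter(
                            targetDir.resolve("part-" + (fileIndex++) + ".csv"), StandardCharsets.UTF_8);
                }
                writer.write(line);
                writer.newLine();
                count++;
            }
            if (writer != null) {
                writer.close();
            }
        }
        return RepeatStatus.FINISHED;
    }
}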
An alternative to partitioning might be a custom thread-safe reader that creates a thread for each line, but partitioning is probably your best choice.
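If you do explore the thread-safe reader route, note that a simpler and more common variant than a thread per line is to synchronize access to a single delegate reader, as sketched below (class name is illustrative; reads become consistent across threads, but restart state of the delegate is still not reliably tracked):
import org.springframework.batch.item.ItemReader;

public class SynchronizedItemReader<T> implements ItemReader<T> {

    private final ItemReader<T> delegate;

    public SynchronizedItemReader(ItemReader<T> delegate) {
        this.delegate = delegate;
    }

    // Serializes calls to the non-thread-safe delegate (e.g. FlatFileItemReader)
    // so that concurrent chunk-processing threads never receive the same line twice.
    @Override
    public synchronized T read() throws Exception {
        return delegate.read();
    }
}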
Your problem is that your reader is not in step scope.
That means all your threads share the same input stream (the resource file).
To have each thread process its own rows you need to:
Make sure each thread reads the file from the start to the end of the file (each thread should open the stream and close it for its own execution context).
The partitioner must inject the start and end positions for each execution context.
Your reader must read the file using these positions.
I wrote some code; this is the output.
Code of the com.test.partitioner.RangePartitioner class:
public Map<String, ExecutionContext> partition(int gridSize) {
Map<String, ExecutionContext> result = new HashMap<String, ExecutionContext>();
int range = 1;
int fromId = 1;
int toId = range;
for (int i = 1; i <= gridSize; i++) {
ExecutionContext value = new ExecutionContext();
log.debug("\nStarting : Thread" + i);
log.debug("fromId : " + fromId);
log.debug("toId : " + toId);
value.putInt("fromId", fromId);
value.putInt("toId", toId);
// give each thread a name, thread 1,2,3
value.putString("name", "Thread" + i);
result.put("partition" + i, value);
fromId = toId + 1;
toId += range;
}
return result;
}
Look at the output console:
Starting : Thread1
fromId : 1
toId : 1
Starting : Thread2
fromId : 2
toId : 2
Starting : Thread3
fromId : 3
toId : 3
Starting : Thread4
fromId : 4
toId : 4
Starting : Thread5
fromId : 5
toId : 5
Starting : Thread6
fromId : 6
toId : 6
Starting : Thread7
fromId : 7
toId : 7
Starting : Thread8
fromId : 8
toId : 8
Starting : Thread9
fromId : 9
toId : 9
Starting : Thread10
fromId : 10
toId : 10
Look at the configuration below:
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.springframework.org/schema/batch
http://www.springframework.org/schema/batch/spring-batch-2.2.xsd
http://www.springframework.org/schema/beans
http://www.springframework.org/schema/beans/spring-beans-3.2.xsd">
<import resource="../config/context.xml" />
<import resource="../config/database.xml" />
<bean id="mouvement" class="com.test.model.Mouvement" scope="prototype" />
<bean id="itemProcessor" class="com.test.processor.CustomItemProcessor" scope="step">
<property name="threadName" value="#{stepExecutionContext[name]}" />
</bean>
<bean id="xmlItemWriter" class="com.test.writer.ItemWriter" />
<batch:job id="mouvementImport" xmlns:batch="http://www.springframework.org/schema/batch">
<batch:listeners>
<batch:listener ref="myAppJobExecutionListener" />
</batch:listeners>
<batch:step id="masterStep">
<batch:partition step="slave" partitioner="rangePartitioner">
<batch:handler grid-size="10" task-executor="taskExecutor" />
</batch:partition>
</batch:step>
</batch:job>
<bean id="rangePartitioner" class="com.test.partitioner.RangePartitioner" />
<bean id="taskExecutor" class="org.springframework.core.task.SimpleAsyncTaskExecutor" />
<batch:step id="slave">
<batch:tasklet>
<batch:listeners>
<batch:listener ref="stepExecutionListener" />
</batch:listeners>
<batch:chunk reader="mouvementReader" writer="xmlItemWriter" processor="itemProcessor" commit-interval="1">
</batch:chunk>
</batch:tasklet>
</batch:step>
<bean id="stepExecutionListener" class="com.test.listener.step.StepExecutionListenerCtxInjecter" scope="step" />
<bean id="myAppJobExecutionListener" class="com.test.listener.job.MyAppJobExecutionListener" />
<bean id="mouvementReaderParent" class="org.springframework.batch.item.file.FlatFileItemReader" scope="step">
<property name="resource" value="classpath:XXXXX/XXXXXXXX.csv" />
<property name="lineMapper">
<bean class="org.springframework.batch.item.file.mapping.DefaultLineMapper">
<property name="lineTokenizer">
<bean class="org.springframework.batch.item.file.transform.DelimitedLineTokenizer">
<property name="delimiter" value="|" />
<property name="names"
value="id,numen,prenom,grade,anneeScolaire,academieOrigin,academieArrivee,codeUsi,specialiteEmploiType,natureSupport,dateEffet,modaliteAffectation" />
</bean>
</property>
<property name="fieldSetMapper">
<bean class="com.test.mapper.MouvementFieldSetMapper" />
</property>
</bean>
</property>
</bean>
<!-- <bean id="itemReader" scope="step" autowire-candidate="false" parent="mouvementReaderParent">-->
<!-- <property name="resource" value="#{stepExecutionContext[fileName]}" />-->
<!-- </bean>-->
<bean id="mouvementReader" class="com.test.reader.MouvementItemReader" scope="step">
<property name="delegate" ref="mouvementReaderParent" />
<property name="parameterValues">
<map>
<entry key="fromId" value="#{stepExecutionContext[fromId]}" />
<entry key="toId" value="#{stepExecutionContext[toId]}" />
</map>
</property>
</bean>
<!-- <bean id="xmlItemWriter" class="org.springframework.batch.item.xml.StaxEventItemWriter">-->
<!-- <property name="resource" value="file:xml/outputs/Mouvements.xml" />-->
<!-- <property name="marshaller" ref="reportMarshaller" />-->
<!-- <property name="rootTagName" value="Mouvement" />-->
<!-- </bean>-->
<bean id="reportMarshaller" class="org.springframework.oxm.jaxb.Jaxb2Marshaller">
<property name="classesToBeBound">
<list>
<value>com.test.model.Mouvement</value>
</list>
</property>
</bean>
TODO: Change my reader to one that reads using positions (start and end position), for example with the Scanner class in Java.
Hope this helps.
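For illustration, the position-based read inside such a reader could look roughly like this; the class is hypothetical and simply skips to fromId and stops after toId by line number:
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;

public class LineRangeReader {

    private final String path;
    private final int fromId; // 1-based first line of this partition
    private final int toId;   // 1-based last line of this partition

    public LineRangeReader(String path, int fromId, int toId) {
        this.path = path;
        this.fromId = fromId;
        this.toId = toId;
    }

    // Reads only the lines in [fromId, toId]; each partition/thread gets its own range.
    public List<String> readSlice() throws IOException {
        List<String> slice = new ArrayList<>();
        try (BufferedReader reader = Files.newBufferedReader(Paths.get(path), StandardCharsets.UTF_8)) {
            String line;
            int current = 0;
            while ((line = reader.readLine()) != null) {
                current++;
                if (current < fromId) {
                    continue;
                }
                if (current > toId) {
                    break;
                }
                slice.add(line);
            }
        }
        return slice;
    }
}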
You can split your input file into many files, then use a Partitioner and load the small files with threads, but on error you must restart the whole job after the DB has been cleaned.
<batch:job id="transformJob">
<batch:step id="deleteDir" next="cleanDB">
<batch:tasklet ref="fileDeletingTasklet" />
</batch:step>
<batch:step id="cleanDB" next="split">
<batch:tasklet ref="countThreadTasklet" />
</batch:step>
<batch:step id="split" next="partitionerMasterImporter">
<batch:tasklet>
<batch:chunk reader="largeCSVReader" writer="smallCSVWriter" commit-interval="#{jobExecutionContext['chunk.count']}" />
</batch:tasklet>
</batch:step>
<batch:step id="partitionerMasterImporter" next="partitionerMasterExporter">
<partition step="importChunked" partitioner="filePartitioner">
<handler grid-size="10" task-executor="taskExecutor" />
</partition>
</batch:step>
Full working example code (on GitHub).
Hope this helps.
I created a scheduler as in the example on the Alfresco wiki page, but it does not work.
In its final form it should delete documents older than 30 days; the cron expression and Lucene query here are only for testing (every second it removes all documents from the test folder).
I created it in a simple AMP module and installed it as in the tutorial.
My beans:
<bean id="templateActionModelFactory" class="org.alfresco.repo.action.scheduled.FreeMarkerWithLuceneExtensionsModelFactory">
<property name="serviceRegistry">
<ref bean="ServiceRegistry" />
</property>
</bean>
<!-- Action -->
<bean id="deleteNodesActionBean"
class="pl.consdata.eximee.spike.deletescheduler.DeleteNodeActionExecuter"
parent="action-executer">
<property name="nodeService">
<ref bean="nodeService" />
</property>
<property name="transactionService">
<ref bean="TransactionService" />
</property>
</bean>
<!-- Action Definition -->
<bean id="deletefilesActionDefinition"
class="org.alfresco.repo.action.scheduled.SimpleTemplateActionDefinition">
<property name="actionName">
<value>deleteNodesActionBean</value>
</property>
<!-- Required services and the FreeMarker template model -->
<property name="templateActionModelFactory">
<ref bean="templateActionModelFactory" />
</property>
<property name="dictionaryService">
<ref bean="DictionaryService" />
</property>
<property name="actionService">
<ref bean="ActionService" />
</property>
<property name="templateService">
<ref bean="TemplateService" />
</property>
</bean>
<!-- Scheduler -->
<bean id="addClassifiableAspectEveryTenMinutes"
class="org.alfresco.repo.action.scheduled.CronScheduledQueryBasedTemplateActionDefinition">
<property name="transactionMode">
<value>ISOLATED_TRANSACTIONS</value>
</property>
<property name="compensatingActionMode">
<value>IGNORE</value>
</property>
<property name="searchService">
<ref bean="SearchService" />
</property>
<property name="templateService">
<ref bean="TemplateService" />
</property>
<property name="queryLanguage">
<value>lucene</value>
</property>
<property name="stores">
<list>
<value>workspace://SpacesStore</value>
</list>
</property>
<!-- QUERY -->
<property name="queryTemplate">
<value>PATH:"/app:company_home/cm:test/*"</value>
</property>
<property name="cronExpression">
<value>0/1 * * * * ?</value>
</property>
<property name="jobName">
<value>jobA</value>
</property>
<property name="jobGroup">
<value>jobGroup</value>
</property>
<property name="triggerName">
<value>triggerA</value>
</property>
<property name="triggerGroup">
<value>triggerGroup</value>
</property>
<!-- Inject the scheduler - the trigger will be registered with this scheduler -->
<property name="scheduler">
<ref bean="schedulerFactory" />
</property>
<property name="actionService">
<ref bean="ActionService" />
</property>
<property name="templateActionModelFactory">
<ref bean="templateActionModelFactory" />
</property>
<property name="templateActionDefinition">
<ref bean="deletefilesActionDefinition" />
</property>
<property name="transactionService">
<ref bean="TransactionService" />
</property>
<property name="runAsUser">
<value>System</value>
</property>
</bean>
And my action code:
public class DeleteNodeActionExecuter extends ActionExecuterAbstractBase {
public static final String NAME = "deleteNodesActionBean";
private static final Logger LOGGER = LoggerFactory
.getLogger(DeleteNodeActionExecuter.class);
private NodeService nodeService;
private TransactionService transactionService;
public void setNodeService(final NodeService nodeService) {
this.nodeService = nodeService;
}
public void setTransactionService(TransactionService transactionService) {
this.transactionService = transactionService;
}
@Override
protected void executeImpl(Action action, final NodeRef actionedUponNodeRef) {
if (!nodeService.exists(actionedUponNodeRef)) {
LOGGER.warn("< node does not exist!", action, actionedUponNodeRef);
return;
}
transactionService.getRetryingTransactionHelper().doInTransaction(
new RetryingTransactionCallback<Void>() {
public Void execute() throws Throwable {
if (!nodeService.exists(actionedUponNodeRef)) {
// Node has gone away, skip
LOGGER.debug("Node has gone away, skip: "+ actionedUponNodeRef.getId());
return null;
}
LOGGER.debug("deleting node: "
+ actionedUponNodeRef.getId());
nodeService.deleteNode(actionedUponNodeRef);
LOGGER.debug("node deleted");
return null;
}
});
}
@Override
protected void addParameterDefinitions(List<ParameterDefinition> paramList) {
}
}