Spring integration no files picked

The following is my configuration. I was able to poll files until recently; now the filter always gets an empty list of files. The only change I have made is installing Kaspersky antivirus; hopefully that is not the problem. I can successfully access the FTP server from the command prompt as well as from the browser.
The config:
<int:channel id="ftpChannel"/>

<int-ftp:inbound-channel-adapter id="ftpInbound1"
        channel="ftpChannel"
        session-factory="ftpClientFactory"
        charset="UTF-8"
        local-directory="file:${paths.root}"
        delete-remote-files="false"
        temporary-file-suffix=".writing"
        remote-directory="${file.ftpfolder}"
        preserve-timestamp="true"
        auto-startup="true"
        filter="compositeFilterLocal">
    <int:poller max-messages-per-poll="10000" fixed-rate="1000" error-channel="errorChannel"/>
</int-ftp:inbound-channel-adapter>
<int-ftp:outbound-channel-adapter id="ftpOutbound"
        channel="ftpChannel"
        session-factory="ftpClientFactory"
        charset="UTF-8"
        remote-file-separator="/"
        auto-create-directory="true"
        remote-directory="DMS"
        use-temporary-file-name="true"
        temporary-file-suffix=".writing"/>
<!-- <bean id="acceptAllFilter" class="org.springframework.integration.file.filters.AcceptAllFileListFilter" /> -->

<bean id="compositeFilterLocal" class="org.springframework.integration.file.filters.CompositeFileListFilter">
    <constructor-arg>
        <list>
            <!-- Ensures that the file is whole before processing it -->
            <bean class="org.springframework.integration.file.filters.AcceptAllFileListFilter" />
            <bean class="com.polling.util.CustomFileFilterLocal"/>
            <!-- Ensures files are picked up only once from the directory -->
        </list>
    </constructor-arg>
</bean>
Please tell me if anything should be changed in it. Thanks!
Please let me know if any more information is needed.
EDIT:: Update
If I use Apache commons-net-3.3 to retrieve the same file from the same folder, it works fine and lets me take the file and download it. So this has nothing to do with JVM access to the FTP site.
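For reference, a minimal sketch of that commons-net check (host, credentials, directory and file name are placeholders, not the values from my setup):

import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;

import org.apache.commons.net.ftp.FTPClient;
import org.apache.commons.net.ftp.FTPFile;

public class CommonsNetCheck {

    public static void main(String[] args) throws IOException {
        FTPClient client = new FTPClient();
        client.connect("ftp.example.com");     // placeholder host
        client.login("user", "password");      // placeholder credentials
        // Passive mode avoids active-mode data connections, which local
        // firewalls/antivirus products sometimes intercept.
        client.enterLocalPassiveMode();

        // The same listing the inbound adapter's filter should be seeing
        FTPFile[] files = client.listFiles("/DMS");
        System.out.println("Listed " + files.length + " files");

        try (OutputStream out = new FileOutputStream("local.copy")) {
            client.retrieveFile("/DMS/somefile.txt", out); // placeholder file
        }
        client.logout();
        client.disconnect();
    }
}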
EDIT:: The code for the filter is simple. Currently I am only using it for pattern matching.
@Override
public List<File> filterFiles(File[] files)
{
    List<File> ret = new ArrayList<File>();
    Pattern pattern = Pattern.compile(".*?~.*?"); // (".*?#.*?#.*?");
    DocumentFile documentFile;
    Matcher matcher;
    for (File file : files)
    {
        matcher = pattern.matcher(file.getName());
        if (matcher.find()) // matching the input file name pattern
        {
            // get key and documentFile
            // create SHA key to check file existence
            String key = EncodeUtil.generateKey(file);
            documentFile = documentDaoImpl.getDocumentFile(key, Constants.INPROGRESS);
            if (documentFile != null)
            {
                ret.add(file);
            }
        }/*else
        {
            file.delete();
        }*/
    }
    return ret;
}
I have been working with this successfully for at least a couple of months, and now suddenly I am getting no files!
Currently I am in the process of switching to a timer cron expression and will do the FTP transfer with Apache commons-net inside the triggered class. It seems such a waste to hand-roll the FTP in spite of having the Spring FTP tag.

I have built a project with the configuration you have used and everything seems to work fine.
There are some pieces in your code (not posted here) that might lead to the files being discarded in the filter and that you will have to check (adding log messages will help):
if (matcher.find()) // matching the input file name pattern
{
    // get key and documentFile
    // create SHA key to check file existence
    // TODO: does this call throw any exception? return null?
    String key = EncodeUtil.generateKey(file);
    documentFile = documentDaoImpl.getDocumentFile(key, Constants.INPROGRESS);
    if (documentFile != null)
    {
        ret.add(file);
    }
    else
    {
        // TODO: log here that your DAO implementation did not return anything for this specific file
    }
}
else
{
    // TODO: log here that the file does not meet the naming convention
}

Related

WARN JdbcChannelMessageStore Message with id was not deleted

I have a queue channel backed by a JdbcChannelMessageStore. I have two instances of this application, and with high concurrency I get this warning:
2020-03-13 19:25:38,209 task-scheduler-5 WARN JdbcChannelMessageStore:652 - Message with id '06b73eab-727a-780f-d0fa-1b0e0dd1ea20' was not deleted.
Is there a way to remove these warnings?
As far as I understand, messages are being read twice; am I correct?
I am using SI 4.3.19.RELEASE. Here is my Spring flow:
<int:channel id="channel">
    <int:queue message-store="messageStoreBean"/>
</int:channel>

<int:header-value-router input-channel="channel"
        header-name="name">
    <int:poller max-messages-per-poll="2" fixed-rate="500">
        <int:transactional />
    </int:poller>
    ...
</int:header-value-router>

<bean id="storeQueryProviderBean" class="org.springframework.integration.jdbc.store.channel.PostgresChannelMessageStoreQueryProvider" />

<bean id="messageStoreBean" class="org.springframework.integration.jdbc.store.JdbcChannelMessageStore">
    <property name="dataSource" ref="messageStoreDataSource" />
    <property name="channelMessageStoreQueryProvider" ref="storeQueryProviderBean" />
    <property name="region" value="region" />
</bean>
It looks like PostgreSQL doesn't guarantee exclusive reading with transactions and LIMIT 1 FOR UPDATE.
Anyway, that WARN is just a note that some other process has already removed the message. Nothing is duplicated if the other process is similar to that poller:
public Message<?> pollMessageFromGroup(Object groupId) {
    final String key = getKey(groupId);
    final Message<?> polledMessage = this.doPollForMessage(key);
    if (polledMessage != null) {
        if (!this.doRemoveMessageFromGroup(groupId, polledMessage)) {
            return null;
        }
    }
    return polledMessage;
}
You see: if the message was not removed (another instance got it first), we return null, so there is nothing to poll at the moment.
You can silence the warning by setting the logging category org.springframework.integration.jdbc.store.JdbcChannelMessageStore to the ERROR level in your logging config.
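For example, a minimal sketch with Log4j 1.x (which the other snippets on this page use); in practice you would normally configure this in log4j.properties or logback.xml rather than in code:

import org.apache.log4j.Level;
import org.apache.log4j.Logger;

public class SilenceStoreWarning {

    public static void main(String[] args) {
        // Only ERROR and above from this category will be logged,
        // so the "was not deleted" WARN disappears.
        Logger.getLogger("org.springframework.integration.jdbc.store.JdbcChannelMessageStore")
              .setLevel(Level.ERROR);
    }
}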

Spring Integration - Scheduling Job from configuration file

I'm using Spring Integration to parse an XML file, and I need to create a thread (each one with a different rate) for each tag.
Right now (with the help of many users here :)) I'm able to split the XML by tag and then route each piece to the appropriate service-activator.
This works great, but I'm not able to route to a channel that creates "a thread" and then executes the operations. Right now I have the following configuration, and in my mind (I don't know if it is correct...) the flow is:
Split tag -> Route to the appropriate channel -> Start a thread (from the tag configuration) -> Execute the operation
This is my actual configuration that splits the tags and routes them to the channels.
The router should not route directly to a channel, but schedule the messages.
In the first instance it will be enough to route into a pool with a fixed rate; later I will use XPath to get the attribute and replace this "fixed" rate with the correct value.
I've tried many solutions to create this flow, but each one fails or does not compile :(
<context:component-scan base-package="it.mypkg" />

<si:channel id="rootChannel" />

<si-xml:xpath-splitter id="mySplitter" input-channel="rootChannel" output-channel="routerChannel" create-documents="true">
    <si-xml:xpath-expression expression="//service" />
</si-xml:xpath-splitter>

<si-xml:xpath-router id="router" input-channel="routerChannel" evaluate-as-string="true">
    <si-xml:xpath-expression expression="concat(name(./node()), 'Channel')" />
</si-xml:xpath-router>

<si:service-activator input-channel="serviceChannel" output-channel="endChannel">
    <bean class="it.mypkg.Service" />
</si:service-activator>
UPDATE:
Using the configuration below, one service should run a task every 10 seconds (id=service1) and the other every 5 seconds (id=service2). In the same way I can have another tag that is handled by another class (because it will have different behaviour):
<root>
    <service id="service1" interval="10000" />
    <service id="service2" interval="5000" />
    <activity id="activity1" interval="50000" />
</root>
I will have a general class (Service) that handles the service tags; it completes some operation and then "returns me" the value so I can route it to another channel.
public class Service {
    public int execute() {
        // Execute the task and return the value to continue the "chain"
        return 0; // placeholder result
    }
}
It's not at all clear what you mean; you split a tag and route it, but want to "schedule" it at a rate given in the XML. It's not clear what you mean by "schedule" here - normally each message is processed once, not multiple times on a schedule.
As I said, I don't understand what you need to do, but a smart poller might be suitable.
Another possibility is the delayer, where the amount of the delay can be derived from the message.
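For example, a hypothetical delayer whose delay is taken from a message header (the channel and header names here are placeholders, not something from your configuration):

<int:delayer id="delayByMessage" input-channel="delayIn" output-channel="delayOut"
        expression="headers['delay']"/>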
EDIT
Since your "services" don't seem to take any input data, it looks like you simply need to configure an <inbound-channel-adapter/> for each service and then start it, based on the arguments in the XML.
<int:inbound-channel-adapter id="service1" channel="foo"
        auto-startup="false"
        ref="service1Bean" method="execute">
    <int:poller fixed-delay="1000" />
</int:inbound-channel-adapter>
Note auto-startup="false".
Now, in the code that receives the split:
@Autowired
SourcePollingChannelAdapter service1;

...

public void startService1(Node node) {
    ...
    service1.setTrigger(new PeriodicTrigger(...));
    service1.start();
    ...
}
I don't know if this is the right way to implement the flow, but I've written the following code:
applicationContext.xml

<context:component-scan base-package="it.mypkg" />

<!-- Expression to extract the interval from the XML tag -->
<si-xml:xpath-expression id="selectIntervalXpath" expression="//*/@interval" />

<si:channel id="rootChannel" />

<!-- Split each tag and redirect to the router -->
<si-xml:xpath-splitter id="mySplitter" input-channel="rootChannel" output-channel="routerChannel" create-documents="true">
    <si-xml:xpath-expression expression="//service|//activity" />
</si-xml:xpath-splitter>

<!-- Route each tag to the appropriate channel -->
<si-xml:xpath-router id="router" input-channel="routerChannel" evaluate-as-string="true">
    <si-xml:xpath-expression expression="concat(name(./node()), 'Channel')" />
</si-xml:xpath-router>

<!-- Activator for the service tag -->
<si:service-activator input-channel="serviceChannel" method="schedule">
    <bean class="it.mypkg.Service" />
</si:service-activator>

<!-- Activator for the activity tag -->
<si:service-activator input-channel="activityChannel" method="schedule">
    <bean class="it.mypkg.Activity" />
</si:service-activator>

<!-- Task scheduler -->
<task:scheduler id="taskScheduler" pool-size="10"/>
Each tag handler extends an Operation class (to avoid duplicating the bean injection):
Operation.java
public abstract class Operation {

    protected TaskScheduler taskScheduler;
    protected XPathExpression selectIntervalXpath;

    public abstract void schedule(Node document);

    @Autowired
    public void setTaskScheduler(TaskScheduler taskScheduler) {
        this.taskScheduler = taskScheduler;
    }

    public TaskScheduler getTaskScheduler() {
        return this.taskScheduler;
    }

    @Autowired
    public void setSelectIntervalXpath(XPathExpression selectIntervalXpath) {
        this.selectIntervalXpath = selectIntervalXpath;
    }

    public XPathExpression getSelectIntervalXPath() {
        return this.selectIntervalXpath;
    }
}
And an example of the Service class (which handles all the service tags provided in the .xml):
public class Service extends Operation {

    private static final Logger log = Logger.getLogger(Service.class);

    @Override
    public void schedule(Node document) {
        log.debug("Scheduling Service");
        long interval = Long.parseLong(this.selectIntervalXpath.evaluateAsString(document));
        this.taskScheduler.scheduleAtFixedRate(new ServiceRunner(), interval);
    }

    private class ServiceRunner implements Runnable {
        public void run() {
            log.debug("Running...");
        }
    }
}
Now, to continue my flow, I will need to find a way to redirect the output of each job back into Spring Integration (applicationContext.xml).
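One possible way (a sketch, untested) to feed each job's output back into the flow is to inject a channel and send the result from the scheduled task. The channel name resultChannel and the publisher class are placeholders of mine, not part of the configuration above; package names assume Spring Integration 4.x:

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.integration.support.MessageBuilder;
import org.springframework.messaging.MessageChannel;

public class ServiceResultPublisher {

    @Autowired
    private MessageChannel resultChannel; // e.g. <si:channel id="resultChannel"/>

    // Called from ServiceRunner.run() with the value returned by execute()
    public void publish(int result) {
        resultChannel.send(MessageBuilder.withPayload(result).build());
    }
}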

Spring Integration with ftp to perform operation using file

I want to read a file from FTP and store it locally, and after storing it locally I want to process that file in Java code. After processing the file successfully I want to move it to another directory. How do I do this efficiently using Spring Integration with FTP?
public class FtpFileHandler {

    public File ftpFileUserHandler(@Header("timestamp") String timestamp, File file) {
        try {
            String filename = file.getName();
            String extension = FilenameUtils.getExtension(filename);
            // ... process the file ...
            return file;
        } catch (IOException | org.json.simple.parser.ParseException e) {
            Utility.exceptionLogger(e);
            return file;
        }
    }
}
<bean id="ftpFileHandlerService" class="com.aaa.clear.integration.service.impl.FtpFileHandler" />

<!-- HR Integration Start -->
<int-ftp:inbound-channel-adapter id="ftpInbound"
        session-factory="ftpSessionFactory"
        charset="UTF-8"
        auto-create-local-directory="true"
        delete-remote-files="true"
        filename-regex=".*\.(txt)$"
        remote-directory="/FTP Test"
        remote-file-separator="/"
        preserve-timestamp="true"
        temporary-file-suffix=".writing"
        local-directory="#{systemProperties['aaa']}/clear/integration/download/user">
    <int:poller fixed-rate="180000" />
</int-ftp:inbound-channel-adapter>

<int:service-activator input-channel="ftpInbound" output-channel="ftpOutbound"
        ref="ftpFileHandlerService" method="ftpFileUserHandler" />

<int-ftp:outbound-channel-adapter id="ftpOutbound"
        session-factory="ftpSessionFactory"
        auto-create-directory="true"
        remote-directory="/FTP Test/processed/user"/>
Read the documentation
Check out the samples here and here
Come back if you have a specific question

Duplicate call in select-sql-parameter-source

I'm working with a dynamic query, using select-sql-parameter-source to look up the information that I need.
This is my configuration:
<int-jdbc:inbound-channel-adapter
        query="SELECT * FROM CUSTOMER WHERE CUSTOMER.LASTUPDATE_ACTIVE < TO_DATE(:last_process_date,'YYYY-MM-DD HH24:Mi:SS')"
        channel="headerEnricher.customerBR01"
        update=""
        row-mapper="customerRowMapper"
        data-source="jdbcTemplate"
        max-rows-per-poll="0"
        select-sql-parameter-source="parameterSource.customerBR01">
    <!-- Cron Time -->
    <int:poller fixed-rate="50" time-unit="SECONDS"/>
</int-jdbc:inbound-channel-adapter>
<!-- This is to get the last process date -->
<bean id="parameterSource.customerBR01" factory-bean="parameterSourceFactory.customerBR01"
        factory-method="createParameterSourceNoCache">
    <constructor-arg value="" />
</bean>

<bean id="parameterSourceFactory.customerBR01"
        class="org.springframework.integration.jdbc.ExpressionEvaluatingSqlParameterSourceFactory">
    <property name="parameterExpressions">
        <map>
            <!-- Here we get the last process date -->
            <entry key="last_process_date" value="@hsqlHistoricProcessServiceDateDAO.getLastProcessDate(3,1,'CUSTOMER')" />
        </map>
    </property>
</bean>
I noticed that the logging appeared twice, so I changed the code of this function:
hsqlHistoricProcessServiceDateDAO.getLastProcessDate
to return only a counter variable. The code of the function is the following:
private int contador = 0;

public String getLastProcessDate(Integer country, Integer business, String tableName) {
    contador++;
    System.out.println("Contador " + contador);
    return Integer.toString(contador);
}
And its output is:
Contador 1
Contador 2
So the method is called twice, and I need only one call, because in the "real code" all my logging appears twice as a result.
For your use case, you don't need to disable the cache; use this instead...
<bean id="parameterSource.customerBR01"
        factory-bean="parameterSourceFactory.customerBR01"
        factory-method="createParameterSource">
    <constructor-arg value="" />
</bean>
The ...NoCache version is needed when the same key is used in multiple parameters and you want each one to be re-evaluated.
Disabling the cache has this additional side effect because the getValue() method is called twice for each use of the key: one call comes from NamedParameterUtils.substituteNamedParameters(), the second from NamedParameterUtils.buildValueArray().

File inbound-channel-adapter spring integration for Multiple Files aggregation into one master File for Job processing

I have written code to combine multiple files into one single master file.
The issue is with the int:transformer, where I receive one file at a time even though I have collected a List of Files in the composite filter of the file inbound-channel-adapter. The List<File> size in the composite filter is correct, but in the transformer bean the list size is always one; it never matches the list the filter collected.
Here is my config:
<!-- Auto Wiring -->
<context:component-scan base-package="com.nt.na21.nam.integration.*" />

<!-- intercept and log every message -->
<int:logging-channel-adapter id="logger" level="DEBUG" />
<int:wire-tap channel="logger" />

<!-- Aggregating the processed output for OSS processing -->
<int:channel id="networkData" />
<int:channel id="requests" />

<int-file:inbound-channel-adapter id="pollProcessedNetworkData"
        directory="file:${processing.files.directory}"
        filter="compositeProcessedFileFilter"
        channel="networkData">
    <int:poller default="true" cron="*/20 * * * * *" />
</int-file:inbound-channel-adapter>

<bean id="compositeProcessedFileFilter"
        class="com.nt.na21.nam.integration.file.filter.CompositeFileListFilterForBaseLine" />

<int:transformer id="aggregateNetworkData"
        input-channel="networkData" output-channel="requests">
    <bean id="networkData" class="com.nt.na21.nam.integration.helper.CSVFileAggregator" />
</int:transformer>
CompositeFileListFilterForBaseLine:
public class CompositeFileListFilterForBaseLine implements FileListFilter<File> {

    private final static Logger LOG = Logger.getLogger(CompositeFileListFilterForBaseLine.class);

    @Override
    public List<File> filterFiles(File[] files) {
        List<File> filteredFile = new ArrayList<File>();
        int index;
        String fetchedFileName = null;
        String fileCreatedDate = null;
        String todayDate = DateHelper.toddMM(new Date());
        LOG.debug("Date - dd-MM: " + todayDate);

        for (File f : files) {
            fetchedFileName = StringUtils.removeEnd(f.getName(), ".csv");
            index = fetchedFileName.indexOf("_");
            // Add one to the index to skip the underscore
            fileCreatedDate = fetchedFileName.substring(index + 1);
            // Format the created-file date
            fileCreatedDate = DateHelper.formatFileNameDateForAggregation(fileCreatedDate);
            LOG.debug("file created date: " + fileCreatedDate + " today Date: " + todayDate);
            if (fileCreatedDate.equalsIgnoreCase(todayDate)) {
                filteredFile.add(f);
                LOG.debug("File added to List of File: " + f.getAbsolutePath());
            }
        }
        LOG.debug("SIZE: " + filteredFile.size());
        LOG.debug("filterFiles method end.");
        return filteredFile;
    }
}
The class file for CSVFileAggregator:

public class CSVFileAggregator {

    private final static Logger LOG = Logger.getLogger(CSVFileAggregator.class);

    private int snePostion;

    protected String masterFileSourcePath = null;

    public File handleAggregateFiles(List<File> files) throws IOException {
        LOG.debug("materFileSourcePath: " + masterFileSourcePath);
        LinkedHashSet<String> allAttributes = null;
        Map<String, LinkedHashSet<String>> allAttrBase = null;
        Map<String, LinkedHashSet<String>> allAttrDelta = null;
        LOG.info("Aggregator releasing [" + files.size() + "] files");
        // ... (rest of the aggregation logic omitted here)
    }
}
Log Output:
INFO : com.nt.na21.nam.integration.aggregator.NetFileAggregatorClient - NetFileAggregator context initialized. Polling input folder...
INFO : com.nt.na21.nam.integration.aggregator.NetFileAggregatorClient - Input directory is: D:\Projects\csv\processing
DEBUG: com.nt.na21.nam.integration.file.filter.CompositeFileListFilterForBaseLine - Date - dd-MM: 0103
DEBUG: com.nt.na21.nam.integration.file.filter.CompositeFileListFilterForBaseLine - file created date: 0103 today Date: 0103
DEBUG: com.nt.na21.nam.integration.file.filter.CompositeFileListFilterForBaseLine - File added to List of File: D:\Projects\NA21\NAMworkspace\na21_nam_integration\csv\processing\file1_base_0103.csv
DEBUG: com.nt.na21.nam.integration.file.filter.CompositeFileListFilterForBaseLine - file created date: 0103 today Date: 0103
DEBUG: com.nt.na21.nam.integration.file.filter.CompositeFileListFilterForBaseLine - File added to List of File: D:\Projects\NA21\NAMworkspace\na21_nam_integration\csv\processing\file2_base_0103.csv
DEBUG: com.nt.na21.nam.integration.file.filter.CompositeFileListFilterForBaseLine - **SIZE: 2**
DEBUG: com.nt.na21.nam.integration.file.filter.CompositeFileListFilterForBaseLine - filterFiles method end.
DEBUG: org.springframework.integration.file.FileReadingMessageSource - Added to queue: [csv\processing\file1_base_0103.csv, csv\processing\file2_base_0103.csv]
INFO : org.springframework.integration.file.FileReadingMessageSource - Created message: [GenericMessage [payload=csv\processing\file2_base_0103.csv, headers={timestamp=1425158920029, id=cb3c8505-0ee5-7476-5b06-01d14380e24a}]]
DEBUG: org.springframework.integration.endpoint.SourcePollingChannelAdapter - Poll resulted in Message: GenericMessage [payload=csv\processing\file2_base_0103.csv, headers={timestamp=1425158920029, id=cb3c8505-0ee5-7476-5b06-01d14380e24a}]
DEBUG: org.springframework.integration.channel.DirectChannel - preSend on channel 'networkData', message: GenericMessage [payload=csv\processing\file2_base_0103.csv, headers={timestamp=1425158920029, id=cb3c8505-0ee5-7476-5b06-01d14380e24a}]
DEBUG: org.springframework.integration.handler.LoggingHandler - org.springframework.integration.handler.LoggingHandler#0 received message: GenericMessage [payload=csv\processing\file2_base_0103.csv, headers={timestamp=1425158920029, id=cb3c8505-0ee5-7476-5b06-01d14380e24a}]
DEBUG: org.springframework.integration.handler.LoggingHandler - csv\processing\file2_base_0103.csv
DEBUG: org.springframework.integration.channel.DirectChannel - postSend (sent=true) on channel 'logger', message: GenericMessage [payload=csv\processing\file2_base_0103.csv, headers={timestamp=1425158920029, id=cb3c8505-0ee5-7476-5b06-01d14380e24a}]
DEBUG: org.springframework.integration.transformer.MessageTransformingHandler - org.springframework.integration.transformer.MessageTransformingHandler#606f8b2b received message: GenericMessage [payload=csv\processing\file2_base_0103.csv, headers={timestamp=1425158920029, id=cb3c8505-0ee5-7476-5b06-01d14380e24a}]
DEBUG: com.nt.na21.nam.integration.helper.CSVFileAggregator - materFileSourcePath: null
INFO : com.nt.na21.nam.integration.helper.CSVFileAggregator - **Aggregator releasing [1] files**
Can someone help me identify the issue with the filter, and why the list it builds is not collected for the transformation?
Thanks in advance.
The issue is with int:aggregator, as I am not sure how to invoke it. I used it earlier in my design, but it never got executed at all. Thanks for the quick response.
For this problem I have written a FileScaner utility that scans all the files inside the folder, and the aggregation works perfectly.
Please find below the config with the aggregator that didn't work; hence I split the design into two pollers: the first produces all the CSV files and the second collects and aggregates them.
<!-- Auto Wiring -->
<context:component-scan base-package="com.bt.na21.nam.integration.*" />

<!-- intercept and log every message -->
<int:logging-channel-adapter id="logger" level="DEBUG" />
<int:wire-tap channel="logger" />

<int:channel id="fileInputChannel" datatype="java.io.File" />
<int:channel id="error" />
<int:channel id="requestsCSVInput" />

<int-file:inbound-channel-adapter id="pollNetworkFile"
        directory="file:${input.files.directory}" channel="fileInputChannel"
        filter="compositeFileFilter" prevent-duplicates="true">
    <int:poller default="true" cron="*/20 * * * * *" error-channel="error" />
</int-file:inbound-channel-adapter>

<bean id="compositeFileFilter"
        class="com.nt.na21.nam.integration.file.filter.CompositeFileListFilterForTodayFiles" />

<int:transformer id="transformInputZipCSVFileIntoCSV"
        input-channel="fileInputChannel" output-channel="requestsCSVInput">
    <bean id="transformZipFile"
            class="com.nt.na21.nam.integration.file.net.NetRecordFileTransformation" />
</int:transformer>

<int:router ref="docTypeRouter" input-channel="requestsCSVInput"
        method="resolveObjectTypeChannel" />

<int:channel id="Vlan" />
<int:channel id="VlanShaper" />
<int:channel id="TdmPwe" />

<bean id="docTypeRouter"
        class="com.nt.na21.nam.integration.file.net.DocumentTypeMessageRouter" />

<int:service-activator ref="vLanMessageHandler" output-channel="newContentItemNotification"
        input-channel="Vlan" method="handleFile" />
<bean id="vLanMessageHandler" class="com.nt.na21.nam.integration.file.handler.VLanRecordsHandler" />

<int:service-activator ref="VlanShaperMessageHandler" output-channel="newContentItemNotification"
        input-channel="VlanShaper" method="handleFile" />
<bean id="VlanShaperMessageHandler" class="com.nt.na21.nam.integration.file.handler.VlanShaperRecordsHandler" />

<int:service-activator ref="PweMessageHandler" output-channel="newContentItemNotification"
        input-channel="TdmPwe" method="handleFile" />
<bean id="PweMessageHandler" class="com.nt.na21.nam.integration.file.handler.PseudoWireRecordsHandler" />

<int:channel id="newContentItemNotification" />

<!-- Aggregate the records in one place for OSS output -->
<int:aggregator input-channel="newContentItemNotification" method="aggregate"
        ref="netRecordsResultAggregator" output-channel="net-records-aggregated-reply"
        message-store="netRecordsResultMessageStore"
        send-partial-result-on-expiry="true" />

<int:channel id="net-records-aggregated-reply" />
<bean id="netRecordsResultAggregator" class="com.nt.na21.nam.integration.aggregator.NetRecordsResultAggregator" />

<!-- Define a store for our network-records results and set up a reaper that will
     periodically expire those results. -->
<bean id="netRecordsResultMessageStore" class="org.springframework.integration.store.SimpleMessageStore" />

<int-file:outbound-channel-adapter id="filesOut"
        directory="file:${output.files.directory}"
        delete-source-files="true" />
The code works fine up to the point where messages are routed to the channels below:

<int:channel id="Vlan" />
<int:channel id="VlanShaper" />
<int:channel id="TdmPwe" />

The handlers for those channels return a LinkedHashSet containing CSV data, and I need to aggregate and merge all of the LinkedHashSet vAllAttributes to get the master output CSV file:

List<String> masterList = new ArrayList<String>(vAllAttributes);
Collections.sort(masterList);
Well, it looks like you misunderstood the <int-file:inbound-channel-adapter> behaviour a bit. Its nature is to produce one file per message to the channel, and that doesn't depend on the logic of the FileListFilter. The flow is like this:

1. The FileReadingMessageSource uses a DirectoryScanner to retrieve files from the provided directory into an internal toBeReceived queue.
2. Since we scan the directory for files, the design of the DirectoryScanner looks like List<File> listFiles(File directory). I guess this is what led you astray.
3. After that, the filter is applied to the original file list and returns only the appropriate files.
4. They are stored in the toBeReceived queue.
5. And only after that does the FileReadingMessageSource poll an item from the queue to build a message for the output channel.
To achieve your aggregation requirement you really should use an <aggregator> between the <int-file:inbound-channel-adapter> and your <int:transformer>.
You can mark the <poller> of the <int-file:inbound-channel-adapter> with max-messages-per-poll="-1" to really poll all your files during a single scheduled task. But anyway there will be as many messages as your filter returns files.
After that you must accept some tricks for the <aggregator>:
correlationKey - to allow your file messages to be combined into a single MessageGroup, so that a single message is released for the further <transformer>. Since we don't have any context from the <int-file:inbound-channel-adapter>, but we know that all messages are provided by a single polling task within one scheduled Thread (you don't use a task-executor on the <poller>), we can simply use a correlationKey like:
correlation-strategy-expression="T(Thread).currentThread().id"
But that is not enough, because we still have to produce a single message at the end somehow. Unfortunately we don't know the number of files (although you could convey it via a ThreadLocal from your custom FileListFilter) to allow the ReleaseStrategy to return true for the aggregate phase; hence we never get normal group completion. But we can force the release of uncompleted groups from the aggregator by using a MessageGroupStoreReaper or the group-timeout on the <aggregator>.
In addition to the previous point, you should supply these options on the <aggregator>:
send-partial-result-on-expiry="true"
expire-groups-upon-completion="true"
And that's all. There is no reason to provide any custom aggregation function (ref/method or expression), because the default one just builds a single message with the List of payloads from all messages in the group, and that is appropriate for your CSVFileAggregator. Although you could even drop that <transformer> and use the CSVFileAggregator as the aggregation function.
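For example, an untested sketch of such an <aggregator> placed in front of your <transformer>; the channel name aggregatedNetworkData and the group-timeout value are placeholders, not prescriptions:

<int:aggregator input-channel="networkData" output-channel="aggregatedNetworkData"
        correlation-strategy-expression="T(Thread).currentThread().id"
        send-partial-result-on-expiry="true"
        expire-groups-upon-completion="true"
        group-timeout="5000"/>

<int:channel id="aggregatedNetworkData"/>

<int:transformer id="aggregateNetworkData"
        input-channel="aggregatedNetworkData" output-channel="requests">
    <bean class="com.nt.na21.nam.integration.helper.CSVFileAggregator"/>
</int:transformer>

The default aggregation function then hands a List<File> to CSVFileAggregator.handleAggregateFiles().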
Hope I am clear.
