Spring Integration: create dynamic directories using ftp:outbound-adapter - spring-integration

We would like to be able to change the FTP directory on a channel after the channel has been created. In our particular use case, the subdirectory for an FTP put is determined at runtime. For example, we have daily reports uploaded by users, and they should be stored on the FTP server in day-wise folders, e.g. test/reports/27-11-2012/abc.pdf, test/reports/28-11-2012/abc.pdf, etc.
Something like this:
<int-ftp:outbound-channel-adapter id="ftpOutbound"
    channel="ftpChannel"
    remote-directory="remoteDirectoryPath"
    session-factory="ftpClientFactory"/>
where remoteDirectoryPath should be appended at runtime.
Can anybody give us a solution?

Use remote-directory-expression.
#beanName.method() is currently not available in this expression; you will need to use SpEL for the directory generation, for example:
"'test' + T(java.io.File).separator + new java.text.SimpleDateFormat('dd-MM-yyyy').format(new java.util.Date())"
(Note: in SimpleDateFormat patterns, lowercase dd is day-of-month; uppercase DD means day-of-year.)
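A quick sanity check of the date part of that expression in plain Java (the test/reports folder layout mirrors the question; the helper name is mine):

```java
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.GregorianCalendar;

public class DatePatternDemo {

    // Builds the day-wise remote directory the same way the SpEL
    // expression does, using '/' as the FTP path separator.
    static String remoteDir(Date date) {
        // lowercase dd = day-of-month; uppercase DD would be day-of-year
        return "test/reports/" + new SimpleDateFormat("dd-MM-yyyy").format(date);
    }

    public static void main(String[] args) {
        Date date = new GregorianCalendar(2012, 10, 27).getTime(); // 27 Nov 2012 (month is 0-based)
        System.out.println(remoteDir(date)); // test/reports/27-11-2012
    }
}
```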

You can assign a directory/path at runtime in ftp:outbound-channel-adapter.
I am copying the configuration here so you can check it out. This is working for me.
XML side:
<int-ftp:outbound-channel-adapter id="ftpOutboundAdapter"
    session-factory="ftpSessionFactory"
    channel="sftpOutboundChannel"
    remote-directory-expression="@targetDir.get()"
    remote-filename-generator="fileNameGenerator"/>
<bean id="targetDir" class="java.util.concurrent.atomic.AtomicReference">
    <constructor-arg value="D:\PATH\"/>
</bean>
In this block, remote-directory-expression="@targetDir.get()" is what sets the directory/path at runtime (in SpEL, @targetDir is a reference to the bean).
Java side:
AtomicReference<String> targetDir = appContext.getBean("targetDir", AtomicReference.class);
targetDir.set("E:\\PATH\\"); // backslashes must be escaped in Java source
This way you can set your path at runtime.
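The moving part of that answer is just a shared AtomicReference; here is a minimal plain-Java sketch of the idea (no Spring context involved):

```java
import java.util.concurrent.atomic.AtomicReference;

public class TargetDirDemo {
    public static void main(String[] args) {
        // Stands in for the "targetDir" bean: the adapter evaluates the
        // expression for every message, so updating the reference changes
        // the remote directory for all subsequent sends.
        AtomicReference<String> targetDir = new AtomicReference<>("D:\\PATH\\");
        System.out.println(targetDir.get()); // D:\PATH\

        targetDir.set("E:\\PATH\\"); // runtime switch, e.g. from application code
        System.out.println(targetDir.get()); // E:\PATH\
    }
}
```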

Related

Nlog / event-properties: how to hint NLog to ignore/skip empty/null properties from the final log entry

Basically the title itself kind of explains what I'm trying to achieve, but in greater detail:
Let's say one has an XML setup similar to the following for the layout:
layout="<log level='${level:lowerCase=True}' time='${longdate:universalTime=true}' myCustomProperty1='${event-properties:item=myCustomProperty1}' myCustomProperty2='${event-properties:item=myCustomProperty2}'>${newline}
...."
Now when myCustomProperty1 is set to, let's say, 'blah1' but myCustomProperty2 is not added to the eventInfo.Properties collection, the resulting entry looks like the following:
<log level='blah' time='blah' myCustomProperty1='blah1' myCustomProperty2=''>
...
The question is: what can be done (preferably in the config file) to exclude the myCustomProperty2 attribute from the finally rendered result, so that the output looks like the following:
<log level='blah' time='blah' myCustomProperty1='blah1'>
...
Here is the gotcha: the same logger is used by multiple threads, so I can't simply alter the target's layout configuration at runtime since it may negatively affect the rest of the threads.
Thank you in advance for your suggestions.
-K
You could try using when:
<variables>
    <variable name="var_myCustomProperty1" value="${when:when=length('${event-properties:item=myCustomProperty1}')>0:Inner= myCustomProperty1='${event-properties:item=myCustomProperty1}'}"/>
    <variable name="var_myCustomProperty2" value="${when:when=length('${event-properties:item=myCustomProperty2}')>0:Inner= myCustomProperty2='${event-properties:item=myCustomProperty2}'}"/>
</variables>
<targets>
<target name="test" type="Console" layout="<log level='${level:lowerCase=True}' time='${longdate:universalTime=true}'${var_myCustomProperty1}${var_myCustomProperty2} />" />
</targets>
NLog 4.6 will include the XmlLayout, which might make things easier:
https://github.com/NLog/NLog/pull/2670
Alternatively, you can use the JsonLayout if XML output is not a requirement (renderEmptyObject="false").

Spring Integration File Outbound Channel Adapter and file last modified date

I'm trying to make a File Outbound Channel Adapter write a file whose last-modified date attribute is set to a custom value instead of the system current time.
According to the documentation (http://docs.spring.io/spring-integration/docs/4.3.11.RELEASE/reference/html/files.html#file-timestamps), I'm supposed to set the preserve-timestamp attribute to true on the outbound adapter and set the file_setModified header to the desired timestamp in the messages.
Anyway, I made several attempts without success.
This is a code snippet showing what I'm doing right now:
<int:inbound-channel-adapter
channel="msg.channel"
expression="'Hello'">
<int:poller fixed-delay="1000"/>
</int:inbound-channel-adapter>
<int:header-enricher
input-channel="msg.channel"
output-channel="msgEnriched.channel">
<int:header
name="file_setModified"
expression="new Long(1473897600)"/>
</int:header-enricher>
<int-file:outbound-channel-adapter
id="msgEnriched.channel"
preserve-timestamp="true"
directory="/tmp/foo"/>
what's wrong with that?
(using Spring Integration 4.3.11)
The timestamp value is overridden if your payload is a File:
Object timestamp = requestMessage.getHeaders().get(FileHeaders.SET_MODIFIED);
...
if (payload instanceof File) {
    resultFile = handleFileMessage((File) payload, tempFile, resultFile);
    timestamp = ((File) payload).lastModified();
}
...
if (this.preserveTimestamp) {
    if (timestamp instanceof Number) {
        resultFile.setLastModified(((Number) timestamp).longValue());
    }
}
To avoid that override and really benefit from file_setModified, you should convert the File from the <int:inbound-channel-adapter> to its InputStream:
<transformer expression="new java.io.FileInputStream(payload)"/>
before the <int-file:outbound-channel-adapter>.
The documentation warns about this, though:
"For File payloads, this will transfer the timestamp from the inbound file to the outbound (regardless of whether a copy was required)"
UPDATE
I have just tested your use case, and all the files in my /tmp/out directory have the proper custom last-modified time.
What am I missing?
Maybe that 1473897600 (the year 1970) is wrong for your operating system?
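One detail worth checking (my assumption, not something stated in the thread): File.setLastModified() takes milliseconds since the epoch, so a header value of 1473897600 (which is September 2016 in epoch seconds) lands in January 1970 when read as milliseconds:

```java
import java.util.Date;

public class TimestampUnitsDemo {
    public static void main(String[] args) {
        long value = 1473897600L; // 2016-09-15 as epoch seconds
        // Interpreted as milliseconds (what setLastModified expects),
        // the same number is only about 17 days after the epoch:
        System.out.println(new Date(value));        // a date in January 1970
        System.out.println(new Date(value * 1000)); // a date in September 2016
    }
}
```

So the header probably needs to be the seconds value multiplied by 1000.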
UPDATE
OK! The problem is that preserve-timestamp isn't applied to the target component when that XML is parsed: https://jira.spring.io/browse/INT-4324
The workaround for your use case is:
<int:outbound-channel-adapter id="msgEnriched.channel">
    <bean class="org.springframework.integration.file.FileWritingMessageHandler">
        <constructor-arg value="/tmp/foo"/>
        <property name="preserveTimestamp" value="true"/>
        <property name="expectReply" value="false"/>
    </bean>
</int:outbound-channel-adapter>
instead of that <int-file:outbound-channel-adapter> definition.

inbound sftp channel adapter custom filter not accepting same file again

I have a very simple custom filter for an inbound sftp channel adapter, where I just check whether the file extension is in the list of accepted extensions. If so, it returns true and should allow the file to be processed.
What is happening: the first time a file is processed it works fine. If the same file is dropped on my sftp server again, it reaches the filter, which returns true (meaning the file is accepted), yet the message is not put on the downstream queue. Here is what my sample config looks like:
<int-sftp:inbound-channel-adapter id="sftpAdapter"
    channel="ftpChannel"
    session-factory="sftpSessionFactory"
    local-directory="c:\\temp"
    remote-directory="//test//inbound"
    remote-file-separator="/"
    auto-create-local-directory="true"
    delete-remote-files="true"
    filter="customfilter"
    preserve-timestamp="true">
<int:poller cron="0/5 * * * * *" max-messages-per-poll="1"/>
</int-sftp:inbound-channel-adapter>
That's because there is one more FileListFilter in the AbstractInboundFileSynchronizingMessageSource:
private volatile FileListFilter<File> localFileListFilter = new AcceptOnceFileListFilter<File>();
Since you already guarantee the duplicate logic with your filter="customfilter", you should configure the local-filter:
<int-sftp:inbound-channel-adapter id="sftpAdapter"
channel="ftpChannel"
....
local-filter="acceptAllFileFilter"/>
<bean id="acceptAllFileFilter" class="org.springframework.integration.file.filters.AcceptAllFileListFilter"/>
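To see why the second drop is swallowed: the default local filter remembers every file it has already accepted. A minimal plain-Java sketch of that accept-once behavior (just the idea, not the actual AcceptOnceFileListFilter implementation):

```java
import java.util.HashSet;
import java.util.Set;

public class AcceptOnceDemo {
    private final Set<String> seen = new HashSet<>();

    // A given file name passes the filter only the first time it is offered.
    public boolean accept(String fileName) {
        return seen.add(fileName); // Set.add() returns false for duplicates
    }

    public static void main(String[] args) {
        AcceptOnceDemo filter = new AcceptOnceDemo();
        System.out.println(filter.accept("report.xml")); // true:  first time
        System.out.println(filter.accept("report.xml")); // false: filtered out
    }
}
```

Configuring AcceptAllFileListFilter as the local-filter removes exactly this check on the local side.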

aggregator release strategy depend on another service activator running

I understand how aggregating based on size works, but I also want the release strategy to depend on whether another step in the pipeline is still running. The idea is that I move files to a certain dir "source", aggregate enough files, then move them from "source" to "stage", and then process the staged files. While this process is running I don't want to put more files into stage, but I do want to keep adding files to the source folder (that part is handled by a dispatcher channel connected to a file inbound adapter before the aggregator).
<int:aggregator id="filesBuffered"
input-channel="sourceFilesProcessed"
output-channel="stagedFiles"
release-strategy-expression="size() == 10"
correlation-strategy-expression="'mes-group'"
expire-groups-upon-completion="true"
/>
<int:channel id="stagedFiles" />
<int:service-activator input-channel="stagedFiles"
output-channel="readyForMes"
ref="moveToStage"
method="move" />
So as you can see, I don't want to release the aggregated messages while an existing invocation of the moveToStage service activator is running.
I thought about making the stagedFiles channel a queue channel, but that doesn't seem right, because I want the files passed to moveToStage as a Collection, not one file at a time, which is what I assume a queue channel would do. Instead I want to reach a threshold, e.g. 10 files, pass those to stagedFiles so that moveToStage can process them, and until that step is done have the aggregator keep aggregating files and only then release everything it has accumulated.
Thanks
I suggest you have a flag such as an AtomicBoolean bean, toggle it from your moveToStage#move, and check its state from:
release-strategy-expression="size() >= 10 and @stagingFlag.get()"
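A plain-Java sketch of that idea (the bean name stagingFlag and the move signature are my assumptions): the flag is true while staging is idle, move flips it off while it works, and the release strategy only fires when both the size threshold and the flag agree.

```java
import java.util.List;
import java.util.concurrent.atomic.AtomicBoolean;

public class StagingFlagDemo {
    // true while no staging run is in progress; stands in for the stagingFlag bean
    static final AtomicBoolean stagingFlag = new AtomicBoolean(true);

    // What the release strategy expression evaluates per message group
    static boolean canRelease(int groupSize) {
        return groupSize >= 10 && stagingFlag.get();
    }

    // Stands in for moveToStage#move: blocks further releases while staging runs
    static void move(List<String> files) {
        stagingFlag.set(false);
        try {
            // ... copy files from "source" to "stage" here ...
        } finally {
            stagingFlag.set(true);
        }
    }

    public static void main(String[] args) {
        System.out.println(canRelease(10)); // true
        stagingFlag.set(false);             // a staging run is in progress
        System.out.println(canRelease(10)); // false
    }
}
```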

Spring Integration to Iterate through a list of files or records

I am using spring integration to download files and to process them.
<int-sftp:inbound-channel-adapter channel="FileDownloadChannel"
session-factory="SftpSessionFactory"
remote-directory="/home/sshaji/from_disney/files"
filter = "modifiedFileListFilter"
local-directory="/home/sshaji/to_disney/downloads"
auto-create-local-directory="true" >
<integration:poller cron="*/10 * * * * *" default="true"/>
</int-sftp:inbound-channel-adapter>
<integration:transformer input-channel="FileDownloadChannel"
ref="ErrorTransformer"
output-channel="EndChannel"/>
<integration:router input-channel="FileErrorProcessingChannel"
expression="payload.getErrorCode() > 0">
<integration:mapping value="true" channel="ReportErrorChannel"/>
<integration:mapping value="false" channel="FilesBackupChannel"/>
</integration:router>
The int-sftp:inbound-channel-adapter is used to download files from the sftp server.
It downloads about 6 files, all XML files.
The transformer iterates over all 6 files and checks whether they have an error tag.
If there is an error tag, it sets the error code to 1; otherwise it is set to 0.
When the flow comes out of the transformer and reaches the router, I want the files whose error code is set to 1 to move to a specific folder (Error), and those with error code 0 to move to another folder (NoError).
Currently the transformer returns a list "fileNames" which contains the error code and file name of all 6 files.
How can I check the error code for each file using the router, and then route that particular file accordingly?
Pseudocode for my problem:
for (int i = 0; i < fileNames.size(); i++) {
    if (fileNames.get(i).getErrorCode() == 1) {
        moveToErrorFolder(fileNames.get(i).getName());
    } else {
        moveToNoErrors(fileNames.get(i).getName());
    }
}
How can I achieve this using Spring Integration? If it's not possible, is there any workaround for it?
I hope it's clear now; I am sorry for not providing enough details last time.
Also, in the int-sftp:inbound-channel-adapter I have hard-coded the "remote-directory" and "local-directory" fields to specific folders in the system. Can I refer to these from a bean property or from a constant value?
I need to configure these values based on a config.xml file; is that possible?
I am new to Spring Integration. Please help me.
Thanks in advance.
It's still not clear to me what you mean by "The transformer iterates all the 6 files".
Each file will be passed to the transformer in a single message, so I don't see how it can emit a list of 6.
It sounds like you need an <aggregator/> with a correlation-strategy-expression="'foo'" and release-strategy-expression="size() == 6". This would aggregate each single File into a list of File and pass it to your transformer. It then transforms it to a list of your status objects containing the filename and error code.
Finally, you would add a <splitter/> which would split the list into separate FileName messages to send to the router.
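In plain Java, what the router does after the splitter amounts to this per-message decision (FileStatus is a hypothetical stand-in for your status object):

```java
import java.util.List;

public class ErrorRoutingDemo {

    // Hypothetical stand-in for the status object the transformer emits
    static class FileStatus {
        private final String name;
        private final int errorCode;

        FileStatus(String name, int errorCode) {
            this.name = name;
            this.errorCode = errorCode;
        }

        String getName() { return name; }
        int getErrorCode() { return errorCode; }
    }

    // Mirrors the router expression payload.getErrorCode() > 0:
    // each split-out message is routed on its own error code.
    static String route(FileStatus file) {
        return file.getErrorCode() > 0 ? "Error" : "NoError";
    }

    public static void main(String[] args) {
        List<FileStatus> files = List.of(
                new FileStatus("a.xml", 1),
                new FileStatus("b.xml", 0));
        for (FileStatus f : files) {
            System.out.println(f.getName() + " -> " + route(f)); // a.xml -> Error, b.xml -> NoError
        }
    }
}
```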
You can use normal Spring property placeholders for the directory attributes ${some.property} or SpEL to use a property of another bean #{someBean.remoteDir}.