How to retrieve an InputStream from SmbRemoteFileTemplate? - spring-integration

I'm using Spring Integration for SMB to store and retrieve files from a Windows server.
In cases where I want to retrieve a file from the server, I found the get method, which receives a lambda to handle the InputStream. However, I need to return that stream from my own method, and I would rather not store the file locally and then return an InputStream over it.
Is there any alternative for this matter?
Thank you all.
My code is like this:
@Override
protected InputStream readMetadataFile(final String filename) throws FileNotFoundException {
    final File inputFile = new File(filename);
    if (this.smbRemoteFileTemplate.exists(filename)) {
        this.smbRemoteFileTemplate.get(filename, in -> FileUtils.copyInputStreamToFile(in, inputFile));
        return new FileInputStream(inputFile);
    }
    return null;
}
PS: could someone with reputation greater than 1500 create the tag "spring-integration-smb"? Thanks again.

The RemoteFileTemplate is based on the SessionFactory and there is an API like this:
/**
 * Obtain a raw Session object. User must close the session when it is no longer
 * needed.
 * @return a session.
 * @since 4.3
 */
Session<F> getSession();
That Session has this one for you:
/**
 * Retrieve a remote file as a raw {@link InputStream}.
 * @param source The path of the remote file.
 * @return The raw inputStream.
 * @throws IOException Any IOException.
 * @since 3.0
 */
InputStream readRaw(String source) throws IOException;
Let's hope that this path is enough for your use case!
Note that you are responsible for closing this InputStream after use.
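Putting the two APIs together, a minimal sketch of the method from the question might look like this (assumptions: the SMB flavor of Session<SmbFile>, and the signature widened to throw IOException; the FilterInputStream wrapper is an illustration, not part of the framework — the point is that the Session must stay open until the stream is fully consumed):

```java
// Sketch: return the raw InputStream and tie the Session lifecycle to it,
// since the session must remain open while the stream is being read.
@Override
protected InputStream readMetadataFile(final String filename) throws IOException {
    final Session<SmbFile> session = this.smbRemoteFileTemplate.getSession();
    if (!session.exists(filename)) {
        session.close();
        return null;
    }
    final InputStream raw = session.readRaw(filename);
    // Closing the returned stream finalizes the raw read and closes the session.
    return new FilterInputStream(raw) {
        @Override
        public void close() throws IOException {
            try {
                super.close();
                session.finalizeRaw();
            }
            finally {
                session.close();
            }
        }
    };
}
```

The caller then simply closes the returned stream when done, and the session is released with it.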

Related

using TransactionInterceptor throw java.lang.IllegalStateException: Cannot apply reactive transaction to non-reactive return type: void

my code is:
@Bean(R2DBC_TRANSACTION_INTERCEPTOR)
public TransactionInterceptor R2DBCTransactionInterceptor(ReactiveTransactionManager reactiveTransactionManager) {
    return new TransactionInterceptorBuilder(true)
            .transactionManager(reactiveTransactionManager)
            .isolation(Isolation.READ_COMMITTED)
            .propagation(Propagation.REQUIRES_NEW)
            .build();
}
@ServiceActivator(inputChannel = "create invoice",
        adviceChain = {IntegrationConfiguration.R2DBC_TRANSACTION_INTERCEPTOR})
public Mono<Invoice> createInvoice(Invoice invoice) {
    return invoiceRepository.save(invoice);
}
I get the exception java.lang.IllegalStateException: Cannot apply reactive transaction to non-reactive return type: void
at org.springframework.transaction.interceptor.TransactionAspectSupport.lambda$invokeWithinTransaction$0(TransactionAspectSupport.java:348) ~[spring-tx-5.2.8.RELEASE.jar:5.2.8.RELEASE]
Even though no exception occurs during the message-passing process, the createInvoice method is invoked with that exception.
Is there a demo for this situation?
Is my configuration wrong?
new TransactionInterceptorBuilder(true)
See its JavaDocs:
* When the {@code handleMessageAdvice} option is in use, this builder produces
* a {@link TransactionHandleMessageAdvice} instance.
Then we go to the mentioned TransactionHandleMessageAdvice:
* When this {@link org.aopalliance.aop.Advice}
* is used from the {@code request-handler-advice-chain}, it is applied
* to the {@link org.springframework.messaging.MessageHandler#handleMessage}
* (not to the
* {@link org.springframework.integration.handler.AbstractReplyProducingMessageHandler.RequestHandler#handleRequestMessage}),
* therefore the entire downstream process is wrapped in the transaction.
Now if we take a look into that MessageHandler#handleMessage, we'll see:
void handleMessage(Message<?> message) throws MessagingException;
So, that's where that transaction interceptor is injected and that's where your solution fails because of that void return from the handleMessage().
You really need to think about whether you definitely need a transaction (especially a reactive one) for the whole sub-flow under that @ServiceActivator...
I mean, probably just new TransactionInterceptorBuilder() would be enough to make your @ServiceActivator transactional, and already with that reactive TX manager, since you do indeed return a Mono there.
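Concretely, a sketch of that adjustment (the same bean from the question, minus the true flag, so a plain TransactionInterceptor is produced and applied to handleRequestMessage(), whose Mono return type supports a reactive transaction):

```java
// Sketch: build the interceptor WITHOUT handleMessageAdvice (no-arg builder),
// so it wraps handleRequestMessage(), which returns the Mono, rather than
// the void handleMessage().
@Bean(R2DBC_TRANSACTION_INTERCEPTOR)
public TransactionInterceptor r2dbcTransactionInterceptor(ReactiveTransactionManager reactiveTransactionManager) {
    return new TransactionInterceptorBuilder()
            .transactionManager(reactiveTransactionManager)
            .isolation(Isolation.READ_COMMITTED)
            .propagation(Propagation.REQUIRES_NEW)
            .build();
}
```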

StreamingMessageSource keeps firing when a filter is applied

I am trying to poll an FTP directory for a certain kind of file. Polling the directory works, but whenever I try to apply a filter to select files by extension, the message source keeps spamming messages about the file with no regard for the polling delay. Without the filters everything works fine; once I enable them, my application authenticates with the FTP server, downloads the file, and sends the message nonstop, over and over again. I have the following beans:
/**
 * Factory that creates the remote connection.
 *
 * @return DefaultSftpSessionFactory
 */
@Bean
public DefaultSftpSessionFactory sftpSessionFactory(@Value("${ftp.host}") String ftpHost,
        @Value("${ftp.port}") int ftpPort,
        @Value("${ftp.user}") String ftpUser,
        @Value("${ftp.pass}") String ftpPass) {
    DefaultSftpSessionFactory factory = new DefaultSftpSessionFactory();
    factory.setAllowUnknownKeys(true);
    factory.setHost(ftpHost);
    factory.setPort(ftpPort);
    factory.setUser(ftpUser);
    factory.setPassword(ftpPass);
    return factory;
}
/**
 * Template to handle remote files.
 *
 * @param sessionFactory SessionFactory bean
 * @return SftpRemoteFileTemplate
 */
@Bean
public SftpRemoteFileTemplate fileTemplate(DefaultSftpSessionFactory sessionFactory) {
    SftpRemoteFileTemplate template = new SftpRemoteFileTemplate(sessionFactory);
    template.setAutoCreateDirectory(true);
    template.setUseTemporaryFileName(false);
    return template;
}
/**
 * To listen to multiple directories, declare multiples of this bean with the same inbound channel.
 *
 * @param fileTemplate FileTemplate bean
 * @return MessageSource
 */
@Bean
@InboundChannelAdapter(channel = "deeplinkAutomated", poller = @Poller(fixedDelay = "6000", maxMessagesPerPoll = "-1"))
public MessageSource inboundChannelAdapter(SftpRemoteFileTemplate fileTemplate) {
    SftpStreamingMessageSource source = new SftpStreamingMessageSource(fileTemplate);
    source.setRemoteDirectory("/upload");
    source.setFilter(new CompositeFileListFilter<>(
            Arrays.asList(new AcceptOnceFileListFilter<>(), new SftpSimplePatternFileListFilter("*.trg"))
    ));
    return source;
}
/**
 * Listener that activates on new messages on the specified input channel.
 *
 * @return MessageHandler
 */
@Bean
@ServiceActivator(inputChannel = "deeplinkAutomated")
public MessageHandler handler(JobLauncher jobLauncher, Job deeplinkBatch) {
    return message -> {
        Gson gson = new Gson();
        SFTPFileInfo info = gson.fromJson((String) message.getHeaders().get("file_remoteFileInfo"), SFTPFileInfo.class);
        System.out.println("File to download: " + info.getFilename().replace(".trg", ".xml"));
    };
}
I think the AcceptOnceFileListFilter is not suitable for SFTP files: the LsEntry instances returned by a fresh listing don't match the ones previously stored in the HashSet, because their hashes are different.
Consider using a SftpPersistentAcceptOnceFileListFilter instead.
Also, it would be better to configure the DefaultSftpSessionFactory with isSharedSession, to avoid session re-creation on each polling task:
/**
 * @param isSharedSession true if the session is to be shared.
 */
public DefaultSftpSessionFactory(boolean isSharedSession) {
You don't have a 6-second delay between calls because you have maxMessagesPerPoll = "-1". That means: keep polling the remote directory as long as files are there. In your case, with the AcceptOnceFileListFilter, you always end up with the same file, for the hash reason above.
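Putting both suggestions together, a sketch of the adjusted beans might look like this (the SimpleMetadataStore and the "sftp-" key prefix are illustrative choices, not from the question; in production a persistent metadata store would survive restarts):

```java
// Shared session: avoids re-creating the SFTP session on every poll.
@Bean
public DefaultSftpSessionFactory sftpSessionFactory(/* ...same @Value params as before... */) {
    DefaultSftpSessionFactory factory = new DefaultSftpSessionFactory(true);
    // ...host/port/user/password configured as in the question...
    return factory;
}

@Bean
@InboundChannelAdapter(channel = "deeplinkAutomated", poller = @Poller(fixedDelay = "6000", maxMessagesPerPoll = "-1"))
public MessageSource<InputStream> inboundChannelAdapter(SftpRemoteFileTemplate fileTemplate) {
    SftpStreamingMessageSource source = new SftpStreamingMessageSource(fileTemplate);
    source.setRemoteDirectory("/upload");
    // Persistent accept-once filter keyed by file name, so freshly listed
    // LsEntry instances (with different hashes) are still recognized as seen.
    source.setFilter(new CompositeFileListFilter<>(Arrays.asList(
            new SftpPersistentAcceptOnceFileListFilter(new SimpleMetadataStore(), "sftp-"),
            new SftpSimplePatternFileListFilter("*.trg"))));
    return source;
}
```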

Spring Integration: How to dynamically create subdir on sftp using IntegrationFlow

I have a use case for transferring files to SFTP under certain subdirectories that are created dynamically.
I got this working using a custom SftpMessageHandler method and a Gateway, but the issue with that approach was that it did not delete local temp files after a successful upload.
To solve that, I am now using an IntegrationFlow along with an expression advice (as below). This does remove the local files, but I don't know how to create remote subdirectories dynamically. I read about the remote directory expression, but I'm not sure how to use or implement it.
Has anyone resolved this issue? Any help is appreciated!
@Bean
public IntegrationFlow sftpOutboundFlow() {
    return IntegrationFlows.from("toSftpChannel")
            .handle(Sftp.outboundAdapter(this.sftpSessionFactory())
                    .remoteFileSeparator("/")
                    .useTemporaryFileName(false)
                    .remoteDirectory("/temp"), c -> c.advice(expressionAdvice(c)))
            .get();
}
@Bean
public Advice expressionAdvice(GenericEndpointSpec<FileTransferringMessageHandler<ChannelSftp.LsEntry>> c) {
    ExpressionEvaluatingRequestHandlerAdvice advice = new ExpressionEvaluatingRequestHandlerAdvice();
    advice.setOnSuccessExpressionString("payload.delete()");
    advice.setOnFailureExpressionString("payload + ' failed to upload'");
    advice.setTrapException(true);
    return advice;
}
@MessagingGateway
public interface UploadGateway {
    @Gateway(requestChannel = "toSftpChannel")
    void upload(File file);
}
The Sftp.outboundAdapter() has these options for the remote directory:
/**
 * Specify a remote directory path.
 * @param remoteDirectory the remote directory path.
 * @return the current Spec
 */
public S remoteDirectory(String remoteDirectory) {
}
/**
 * Specify a remote directory path SpEL expression.
 * @param remoteDirectoryExpression the remote directory expression
 * @return the current Spec
 */
public S remoteDirectoryExpression(String remoteDirectoryExpression) {
}
/**
 * Specify a remote directory path {@link Function}.
 * @param remoteDirectoryFunction the remote directory {@link Function}
 * @param <P> the expected payload type.
 * @return the current Spec
 */
public <P> S remoteDirectory(Function<Message<P>, String> remoteDirectoryFunction) {
}
So, if the story is about a dynamic sub-directory, you can choose remoteDirectoryExpression() or remoteDirectory(Function) and calculate a target path against the message or some bean in the application context.
For example:
.remoteDirectoryExpression("'rootDir/' + headers.subDir")
Also bear in mind that for directories which don't exist yet you need to configure .autoCreateDirectory(true), too.
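Applied to the flow from the question, a sketch might look like this (the subDir header is an assumption; it would be set by whatever sends to toSftpChannel):

```java
@Bean
public IntegrationFlow sftpOutboundFlow() {
    return IntegrationFlows.from("toSftpChannel")
            .handle(Sftp.outboundAdapter(this.sftpSessionFactory())
                    .remoteFileSeparator("/")
                    .useTemporaryFileName(false)
                    // Resolve the target sub-directory per message from a header
                    // (the 'subDir' header is hypothetical, for illustration).
                    .remoteDirectoryExpression("'/temp/' + headers.subDir")
                    // Create the directory on the server if it does not exist yet.
                    .autoCreateDirectory(true), c -> c.advice(expressionAdvice(c)))
            .get();
}
```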

How to get JSON from a URL

I need to read a URL with the SSJS function fromJson().
For example, the Data access API for a Notes view:
http://{host}/{database}/api/data/collections/name/{name}
How can I do this?
P.S. I think (I don't know if it is true) that if I use Java code (for example, the URLReader class from this blogger), I lose authors/readers functionality, because it is my server and not the current user that executes the reading of the stream?
I'll explain why I'm trying to understand this...
I need to use the jQuery DataTables plugin in my app.
I need complete server-side processing because I have over 10,000 documents in each view.
This jQuery plugin sends parameters to a specific URL (I think my XAgent), so I plan to create an XAgent that reads these parameters and produces the JSON API data as output.
This is because I need a fast response.
The solution from Oliver Busse is very slow because it loads all the entries of my view into JSON (I have many entries), and I wait 30-40 seconds for this operation.
I gather from the PS that you're specifically looking to fetch JSON on the server from itself, while retaining user authentication information. Sven's post there does a bit of that, but I think that the most reliable way would be to grab the Authorization and Cookie headers from the request and then pass them along in your URL request. This answer has a good basis for doing this with Java. You could expand that to do something like this (which, granted, I haven't tested, but it's a starting point):
HttpServletRequest req = (HttpServletRequest) FacesContext.getCurrentInstance().getExternalContext().getRequest();
String authorization = req.getHeader("Authorization");
String cookie = req.getHeader("Cookie");
URL myURL = new URL("http://foo.com");
HttpURLConnection myURLConnection = (HttpURLConnection) myURL.openConnection();
if (StringUtil.isNotEmpty(authorization)) {
    myURLConnection.setRequestProperty("Authorization", authorization);
}
if (StringUtil.isNotEmpty(cookie)) {
    myURLConnection.setRequestProperty("Cookie", cookie);
}
myURLConnection.setRequestMethod("GET");
myURLConnection.setDoInput(true);
myURLConnection.setDoOutput(true);
myURLConnection.connect();
InputStream is = null;
try {
    is = myURLConnection.getInputStream();
    String result = StreamUtil.readString(is);
} finally {
    StreamUtil.close(is);
    myURLConnection.disconnect();
}
Ideally, you would also fetch the server host name, protocol, and port from the request.
Eric's comment is also wise: if this is something you can do with the normal classes, that's going to be more flexible and less problem-prone, due to how fiddly server-self HTTP calls can be.
As I mentioned in my comment, this approach forces your call to go through a client-side call to the Domino Data Service and otherwise complicates what would normally be a simple handle on a View (and its contents) via a cross-NSF call (e.g. var vwEnt:ViewEntryCollection = session.getDatabase("serverName", "path/myDb.nsf").getView("viewName").getAllEntries();).
As a blog post of mine previously outlines, you can definitely achieve this as Jesse's answer (curse you fast typer Jesse!) outlines. Something I include in my "grab bag" of tools is a Java class that's a starting point for getting JSON formatted content. Here's the link (here's one with basic authorization in a request header) and the class I generally start from:
package com.eric.awesome;
import java.net.URL;
import java.net.URLConnection;
import java.io.BufferedReader;
import com.google.gson.*;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.IOException;
import java.net.MalformedURLException;
import org.apache.commons.validator.routines.*;
/**
 * Class with a single, public, static method to provide a REST consumer
 * which returns data as a JsonObject.
 *
 * @author Eric McCormick, @edm00se
 *
 */
public class CustJsonConsumer {
    /**
     * Method for issuing an HTTP GET request against a RESTful URL data source.
     *
     * @param myUrlStr the URL of the REST endpoint
     * @return JsonObject containing the data from the REST response.
     * @throws IOException
     * @throws MalformedURLException
     * @throws ParseException
     */
    public static JsonObject GetMyRestData(String myUrlStr) throws IOException, MalformedURLException {
        JsonObject myRestData = new JsonObject();
        try {
            UrlValidator defaultValidator = new UrlValidator();
            if (defaultValidator.isValid(myUrlStr)) {
                URL myUrl = new URL(myUrlStr);
                URLConnection urlCon = myUrl.openConnection();
                urlCon.setConnectTimeout(5000);
                InputStream is = urlCon.getInputStream();
                InputStreamReader isR = new InputStreamReader(is);
                BufferedReader reader = new BufferedReader(isR);
                StringBuffer buffer = new StringBuffer();
                String line = "";
                while ((line = reader.readLine()) != null) {
                    buffer.append(line);
                }
                reader.close();
                JsonParser parser = new JsonParser();
                myRestData = (JsonObject) parser.parse(buffer.toString());
                return myRestData;
            } else {
                myRestData.addProperty("error", "URL failed validation by Apache Commons URL Validator");
                return myRestData;
            }
        } catch (MalformedURLException e) {
            e.printStackTrace();
            myRestData.addProperty("error", e.toString());
            return myRestData;
        } catch (IOException e) {
            e.printStackTrace();
            myRestData.addProperty("error", e.toString());
            return myRestData;
        }
    }
}
To invoke it from SSJS as a POJO, you would want to do something like:
importPackage(com.eric.awesome);
var urlStr = "http://{host}/{database}/api/data/collections/name/{name}";
var myStuff:JsonObject = CustJsonConsumer.GetMyRestData(urlStr);

Apache POI: SXSSFWorkbook.dispose() does not exist

I'm using apache's POI API to write XLSX files. Since I need to write big files, I'm using the Streaming API (SXSSF).
To do this, I'm following this guide. Note that at the end of the example there's a call to
wb.dispose()
This wb instance refers to an SXSSFWorkbook instance. I'm using the same in my code, but it complains that the dispose method does not exist. I downloaded the source code and the method isn't there. However, going to their SVN and checking that class's code, we can see the method there:
https://svn.apache.org/repos/asf/poi/trunk/src/ooxml/java/org/apache/poi/xssf/streaming/SXSSFWorkbook.java
I already tried to recompile their code but I get a lot of errors...
Apache POI 3.8 (the latest stable release at the time) creates a temporary XML file for each sheet (when using SXSSF) but does not give you the option to delete these files. This makes the API awkward to use: if I'm exporting 600 MB of data, I'll end up with two 600 MB files, one of which will sit in the temporary folder until it's deleted.
Digging into the code, we see that the class SXSSFSheet has an instance of SheetDataWriter. This last class is responsible for writing and maintaining the temporary file, represented by its File instance. Accessing this object would allow us to delete the file.
All these instances are private, so, in theory, you cannot access them. However, through reflection, we can access the File instance to delete these useful but annoying files!
The following two methods do this. Calling deleteSXSSFTempFiles deletes all temporary files of that workbook.
/**
 * Returns a private attribute of a class.
 * @param containingClass The class that contains the private attribute to retrieve
 * @param fieldToGet The name of the attribute to get
 * @return The private attribute
 * @throws NoSuchFieldException
 * @throws IllegalAccessException
 */
public static Object getPrivateAttribute(Object containingClass, String fieldToGet) throws NoSuchFieldException, IllegalAccessException {
    //get the field of the containingClass instance
    Field declaredField = containingClass.getClass().getDeclaredField(fieldToGet);
    //set it as accessible
    declaredField.setAccessible(true);
    //access it
    Object get = declaredField.get(containingClass);
    //return it!
    return get;
}
/**
 * Deletes all temporary files of the SXSSFWorkbook instance.
 * @param workbook
 * @throws NoSuchFieldException
 * @throws IllegalAccessException
 */
public static void deleteSXSSFTempFiles(SXSSFWorkbook workbook) throws NoSuchFieldException, IllegalAccessException {
    int numberOfSheets = workbook.getNumberOfSheets();
    //iterate through all sheets (each sheet has a temp file)
    for (int i = 0; i < numberOfSheets; i++) {
        Sheet sheetAt = workbook.getSheetAt(i);
        //delete only if the sheet is written by stream
        if (sheetAt instanceof SXSSFSheet) {
            SheetDataWriter sdw = (SheetDataWriter) getPrivateAttribute(sheetAt, "_writer");
            File f = (File) getPrivateAttribute(sdw, "_fd");
            try {
                f.delete();
            } catch (Exception ex) {
                //could not delete the file
            }
        }
    }
}
As of 2012-12-03, POI 3.9 is available as a stable release. The dispose() method is available in SXSSFWorkbook in this release.
(Of course, this was not the case when the question was asked.)
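With POI 3.9 or later, no reflection is needed; a minimal sketch of the streaming-write-then-dispose pattern (the file name and row count are illustrative):

```java
// Sketch: stream a large workbook to disk, then dispose of the temporary
// sheet files SXSSF created behind the scenes.
SXSSFWorkbook wb = new SXSSFWorkbook(100); // keep only 100 rows in memory
try (FileOutputStream out = new FileOutputStream("big.xlsx")) {
    Sheet sheet = wb.createSheet("data");
    for (int r = 0; r < 100_000; r++) {
        sheet.createRow(r).createCell(0).setCellValue(r);
    }
    wb.write(out);
} finally {
    wb.dispose(); // deletes the temporary files backing each SXSSFSheet
}
```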