I have a use case for transferring files to SFTP under certain subdirectories that are created dynamically.
I got this working using a custom SftpMessageHandler method and a Gateway, but the issue with that approach was that it did not delete the local temp files after a successful upload.
To solve that, I am now using an IntegrationFlow together with an expression Advice (as below). This does remove the local files, but I don't know how to create remote subdirectories dynamically. I read about the remote directory expression, but I'm not sure how to use/implement it.
Has anyone resolved this issue? Any help is appreciated!
@Bean
public IntegrationFlow sftpOutboundFlow() {
return IntegrationFlows.from("toSftpChannel")
.handle(Sftp.outboundAdapter(this.sftpSessionFactory())
.remoteFileSeparator("/")
.useTemporaryFileName(false)
.remoteDirectory("/temp"), c -> c.advice(expressionAdvice(c)))
.get();
}
@Bean
public Advice expressionAdvice(GenericEndpointSpec<FileTransferringMessageHandler<ChannelSftp.LsEntry>> c) {
ExpressionEvaluatingRequestHandlerAdvice advice = new ExpressionEvaluatingRequestHandlerAdvice();
advice.setOnSuccessExpressionString("payload.delete()");
advice.setOnFailureExpressionString("payload + ' failed to upload'");
advice.setTrapException(true);
return advice;
}
@MessagingGateway
public interface UploadGateway {
@Gateway(requestChannel = "toSftpChannel")
void upload(File file);
}
The Sftp.outboundAdapter() has these options for the remote directory:
/**
* Specify a remote directory path.
* @param remoteDirectory the remote directory path.
* @return the current Spec
*/
public S remoteDirectory(String remoteDirectory) {
}
/**
* Specify a remote directory path SpEL expression.
* @param remoteDirectoryExpression the remote directory expression
* @return the current Spec
*/
public S remoteDirectoryExpression(String remoteDirectoryExpression) {
}
/**
* Specify a remote directory path {@link Function}.
* @param remoteDirectoryFunction the remote directory {@link Function}
* @param <P> the expected payload type.
* @return the current Spec
*/
public <P> S remoteDirectory(Function<Message<P>, String> remoteDirectoryFunction) {
}
So, if the story is about a dynamic sub-directory, you can choose remoteDirectoryExpression() or remoteDirectory(Function) and calculate the target path against the message or some bean in the application context.
For example:
.remoteDirectoryExpression("'rootDir/' + headers.subDir")
Also bear in mind that for directories which do not exist yet you need to configure .autoCreateDirectory(true) as well.
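For example, a minimal sketch of your flow and gateway (the subDir header here is just an illustrative name; use whatever header or payload property carries your target path):
@Bean
public IntegrationFlow sftpOutboundFlow() {
    return IntegrationFlows.from("toSftpChannel")
            .handle(Sftp.outboundAdapter(this.sftpSessionFactory())
                            .remoteFileSeparator("/")
                            .useTemporaryFileName(false)
                            // resolve the target directory from a message header
                            .remoteDirectoryExpression("'/temp/' + headers.subDir")
                            // create missing remote sub-directories on demand
                            .autoCreateDirectory(true),
                    c -> c.advice(expressionAdvice(c)))
            .get();
}
@MessagingGateway
public interface UploadGateway {
    @Gateway(requestChannel = "toSftpChannel")
    void upload(@Payload File file, @Header("subDir") String subDir);
}
This way each message can target its own sub-directory, and the ExpressionEvaluatingRequestHandlerAdvice still deletes the local file after a successful upload.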
I was implementing Azure AD B2C in multiple-account mode and was reading the sample files. Why is there a configuration class which states:
"If you'd like to use your own app registration, you will also need to update B2CConfiguration.java to match with your configuration json file."
Doesn't that seem to defeat the purpose of having a configuration file? Shouldn't the values be accessible through the module somehow as long as the configuration file is provided?
This code shows how the JSON configuration file is loaded:
// Creates a MultipleAccountPublicClientApplication object with res/raw/auth_config_b2c.json
PublicClientApplication.createMultipleAccountPublicClientApplication(getContext(),
R.raw.auth_config_b2c,
new IPublicClientApplication.IMultipleAccountApplicationCreatedListener() {
@Override
public void onCreated(IMultipleAccountPublicClientApplication application) {
b2cApp = application;
loadAccounts();
}
@Override
public void onError(MsalException exception) {
displayError(exception);
removeAccountButton.setEnabled(false);
runUserFlowButton.setEnabled(false);
acquireTokenSilentButton.setEnabled(false);
}
});
And B2CConfiguration shows:
/**
* Name of your B2C tenant hostname.
*/
final static String azureAdB2CHostName = "fabrikamb2c.b2clogin.com";
/**
* Name of your B2C tenant.
*/
final static String tenantName = "fabrikamb2c.onmicrosoft.com";
/**
* Returns an authority for the given policy name.
*
* @param policyName name of a B2C policy.
*/
public static String getAuthorityFromPolicyName(final String policyName) {
return "https://" + azureAdB2CHostName + "/tfp/" + tenantName + "/" + policyName + "/";
}
/**
* Returns an array of scopes you wish to acquire as part of the returned token result.
* These scopes must be added in your B2C application page.
*/
public static List<String> getScopes() {
return Arrays.asList(
"https://fabrikamb2c.onmicrosoft.com/helloapi/demo.read");
}
All of these values are in the configuration file, except for scopes.
Is there another option here so I don't need to hard code configuration information?
The configuration details (like the tenant id and policy name) can't be rendered dynamically.
In the B2CConfiguration.java file, if you look at the comments section, it is mentioned that the values in this class have to match the JSON configuration file (auth_config_b2c.json).
I'm using Spring Integration for SMB to store and retrieve files from a Windows server.
In the cases where I want to retrieve a file from the server, I found the method "get", which receives a lambda to handle the InputStream, but I need to return this element and I wouldn't like to store it locally and then return the InputStream.
Is there any alternative for this?
Thank you all.
My code is like this:
@Override
protected InputStream readMetadataFile(final String filename) throws FileNotFoundException {
final File inputFile = new File(filename);
if (this.smbRemoteFileTemplate.exists(filename)) {
this.smbRemoteFileTemplate.get(filename, in -> FileUtils.copyInputStreamToFile(in, inputFile));
return new FileInputStream(inputFile);
}
return null;
}
PS: could any mate with a reputation greater than 1500 create the tag "spring-integration-smb"? Thanks again.
The RemoteFileTemplate is based on the SessionFactory and there is an API like this:
/**
* Obtain a raw Session object. User must close the session when it is no longer
* needed.
* @return a session.
* @since 4.3
*/
Session<F> getSession();
That Session has this one for you:
/**
* Retrieve a remote file as a raw {@link InputStream}.
* @param source The path of the remote file.
* @return The raw inputStream.
* @throws IOException Any IOException.
* @since 3.0
*/
InputStream readRaw(String source) throws IOException;
Let's hope that this path is enough for your use-case!
Note that you are responsible for closing this InputStream after use.
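For example, your readMetadataFile() could become something like this sketch (it assumes the smbRemoteFileTemplate from your snippet; the throws clause widens to IOException because readRaw() can throw it, and the session stays open until the caller closes the returned stream):
@Override
protected InputStream readMetadataFile(final String filename) throws IOException {
    if (this.smbRemoteFileTemplate.exists(filename)) {
        // raw session: must stay open while the remote stream is being consumed
        final Session<?> session = this.smbRemoteFileTemplate.getSession();
        return new FilterInputStream(session.readRaw(filename)) {

            @Override
            public void close() throws IOException {
                try {
                    super.close();
                }
                finally {
                    // release the session only once the stream has been fully read
                    session.close();
                }
            }
        };
    }
    return null;
}
(Error handling around readRaw() is omitted for brevity.)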
I am using the SFTP outbound adapter to transfer files generated in an ItemWriter to an SFTP server successfully.
Following is the Java DSL config for my SFTP outbound adapter.
@Bean
public IntegrationFlow sftpOutboundFlow() {
return IntegrationFlows.from("toSftpChannel")
.handle(Sftp.outboundAdapter(delegatingSessionFactory(sessionFactoryLocator), FileExistsMode.REPLACE)
.useTemporaryFileName(false)
.fileNameExpression("headers['" + FileHeaders.FILENAME + "']")
.remoteDirectoryExpression("headers.path")
.autoCreateDirectory(true),
c -> c.advice(expressionAdvice(c))) .get();
}
/**
* Advice to remove local files after successful upload
*
* @param c
* @return
*/
@Bean
public Advice expressionAdvice(GenericEndpointSpec<FileTransferringMessageHandler<ChannelSftp.LsEntry>> c) {
ExpressionEvaluatingRequestHandlerAdvice advice = new ExpressionEvaluatingRequestHandlerAdvice();
advice.setOnSuccessExpressionString("payload.delete()");
advice.setOnFailureExpressionString("payload + ' failed to upload'");
advice.setTrapException(true);
return advice;
}
/**
* Channel for uploading files
*
*
*/
@MessagingGateway
public interface LettersUploadGateway {
@Gateway(requestChannel = "toSftpChannel")
void upload(@Payload File file, @Header("path") String path);
}
Update:
From the ItemWriter I am calling the gateway's upload method to transfer files as below:
lettersGateway.upload(fileName, batchConfiguration.getLettersDirectory());
The directory specified is the remote directory to which I want the files to be transferred.
In this process, I have noticed that temporary local files are created in my project root folder (they are deleted later, after a successful SFTP transfer). Is there a way to change the local temporary file location to something like "c:/temp/"?
Thanks for your help.
I am trying to poll an FTP directory for a certain kind of file. Polling the directory works, but whenever I try to apply a filter to select files by extension, the message source keeps spamming messages about the file with no regard to the polling delay. Without the filters everything works fine; once I enable them, my application authenticates with the FTP server, downloads the file, and sends the message nonstop, over and over again. I have the following beans:
/**
* Factory that creates the remote connection
*
* @return DefaultSftpSessionFactory
*/
@Bean
public DefaultSftpSessionFactory sftpSessionFactory(@Value("${ftp.host}") String ftpHost,
@Value("${ftp.port}") int ftpPort,
@Value("${ftp.user}") String ftpUser,
@Value("${ftp.pass}") String ftpPass) {
DefaultSftpSessionFactory factory = new DefaultSftpSessionFactory();
factory.setAllowUnknownKeys(true);
factory.setHost(ftpHost);
factory.setPort(ftpPort);
factory.setUser(ftpUser);
factory.setPassword(ftpPass);
return factory;
}
/**
* Template to handle remote files
*
* @param sessionFactory SessionFactory bean
* @return SftpRemoteFileTemplate
*/
@Bean
public SftpRemoteFileTemplate fileTemplate(DefaultSftpSessionFactory sessionFactory) {
SftpRemoteFileTemplate template = new SftpRemoteFileTemplate(sessionFactory);
template.setAutoCreateDirectory(true);
template.setUseTemporaryFileName(false);
return template;
}
/**
* To listen to multiple directories, declare multiples of this bean with the same inbound channel
*
* @param fileTemplate FileTemplate bean
* @return MessageSource
*/
@Bean
@InboundChannelAdapter(channel = "deeplinkAutomated", poller = @Poller(fixedDelay = "6000", maxMessagesPerPoll = "-1"))
public MessageSource inboundChannelAdapter(SftpRemoteFileTemplate fileTemplate) {
SftpStreamingMessageSource source = new SftpStreamingMessageSource(fileTemplate);
source.setRemoteDirectory("/upload");
source.setFilter(new CompositeFileListFilter<>(
Arrays.asList(new AcceptOnceFileListFilter<>(), new SftpSimplePatternFileListFilter("*.trg"))
));
return source;
}
/**
* Listener that activates on new messages on the specified input channel
*
* @return MessageHandler
*/
@Bean
@ServiceActivator(inputChannel = "deeplinkAutomated")
public MessageHandler handler(JobLauncher jobLauncher, Job deeplinkBatch) {
return message -> {
Gson gson = new Gson();
SFTPFileInfo info = gson.fromJson((String) message.getHeaders().get("file_remoteFileInfo"), SFTPFileInfo.class);
System.out.println("File to download: " + info.getFilename().replace(".trg", ".xml"));
};
}
I think the AcceptOnceFileListFilter is not suitable for SFTP files: the returned LsEntry instances don't match the ones previously stored in the HashSet, simply because their hashes are different.
Consider using a SftpPersistentAcceptOnceFileListFilter instead.
Also, it would be better to configure the DefaultSftpSessionFactory with isSharedSession:
/**
* @param isSharedSession true if the session is to be shared.
*/
public DefaultSftpSessionFactory(boolean isSharedSession) {
This avoids recreating the session on each polling task.
Also, you don't have a 6-second delay between calls because you have maxMessagesPerPoll = "-1". That means: keep polling remote files as long as there are files in the remote directory. In your case, with the AcceptOnceFileListFilter, you always end up with the same file for the hash reason described above.
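A sketch of the adjusted beans (the SimpleMetadataStore and the "trg-" key prefix are just illustrative choices; in production you would usually plug in a persistent ConcurrentMetadataStore):
@Bean
public DefaultSftpSessionFactory sftpSessionFactory(@Value("${ftp.host}") String ftpHost,
        @Value("${ftp.port}") int ftpPort,
        @Value("${ftp.user}") String ftpUser,
        @Value("${ftp.pass}") String ftpPass) {
    // 'true' = shared session, so it is not recreated on every poll
    DefaultSftpSessionFactory factory = new DefaultSftpSessionFactory(true);
    factory.setAllowUnknownKeys(true);
    factory.setHost(ftpHost);
    factory.setPort(ftpPort);
    factory.setUser(ftpUser);
    factory.setPassword(ftpPass);
    return factory;
}
@Bean
@InboundChannelAdapter(channel = "deeplinkAutomated", poller = @Poller(fixedDelay = "6000", maxMessagesPerPoll = "-1"))
public MessageSource<InputStream> inboundChannelAdapter(SftpRemoteFileTemplate fileTemplate) {
    SftpStreamingMessageSource source = new SftpStreamingMessageSource(fileTemplate);
    source.setRemoteDirectory("/upload");
    source.setFilter(new CompositeFileListFilter<>(Arrays.asList(
            new SftpSimplePatternFileListFilter("*.trg"),
            // remembers processed files in a metadata store instead of comparing LsEntry instances
            new SftpPersistentAcceptOnceFileListFilter(new SimpleMetadataStore(), "trg-"))));
    return source;
}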
I have a case where I read a file and convert the content to a String. Then I split the string into multiple payloads and send those payloads individually to a queue. I want to use a JmsTransactionManager so that either all messages are sent or none at all.
When the TX is successful I want to move the file to an Archive folder, otherwise move it to a Failed folder. I have read that I can use a TransactionSynchronizationFactory to accomplish this, but in combination with a JmsTransactionManager the file is not moved. If I use a PseudoTransactionManager, the file is moved, but I lose my JMS transaction.
I have made a simplified version to reproduce the issue. (The content of the file in this case is a simple comma-separated list of values.)
@Bean
public IntegrationFlow fileInboundAdaptor() {
return IntegrationFlows
.from(s -> s.file(new File(INBOUND_PATH))
.patternFilter("*.txt"),
e -> e.poller(Pollers.fixedDelay(5000)
.transactionSynchronizationFactory(transactionSynchronizationFactory())
.transactional(new JmsTransactionManager(connectionFactory))
)
)
.transform(Transformers.fileToString())
.split(s -> s.applySequence(false).get().getT2().setDelimiters(","))
.handle((GenericHandler<String>) (payload, headers) -> {
jmsTemplate.send("SOME_QUEUE", (Session session) -> session.createTextMessage(payload));
return payload;
})
.channel(MessageChannels.queue("fileReadingResultChannel"))
.get();
}
The transactionSynchronizationFactory looks like this:
@Bean
public TransactionSynchronizationFactory transactionSynchronizationFactory() {
ExpressionParser parser = new SpelExpressionParser();
ExpressionEvaluatingTransactionSynchronizationProcessor syncProcessor
= new ExpressionEvaluatingTransactionSynchronizationProcessor();
syncProcessor.setBeanFactory(applicationContext.getAutowireCapableBeanFactory());
syncProcessor.setAfterCommitExpression(parser.parseExpression(
"payload.renameTo(new java.io.File('test/archive' " +
" + T(java.io.File).separator + 'ARCHIVE-' + payload.name))"));
syncProcessor.setAfterRollbackExpression(parser.parseExpression(
"payload.renameTo(new java.io.File('test/fail' " +
" + T(java.io.File).separator + 'FAILED-' + payload.name))"));
return new DefaultTransactionSynchronizationFactory(syncProcessor);
}
So my question is: does the TransactionSynchronizationFactory only work with a PseudoTransactionManager, or is it supposed to work with a JmsTransactionManager as well?
Solution
I needed to set the transactionSynchronization on the JmsTransactionManager. Something like this:
public JmsTransactionManager transactionManager() {
JmsTransactionManager jmsTransactionManager = new JmsTransactionManager(connectionFactory);
jmsTransactionManager.setTransactionSynchronization(AbstractPlatformTransactionManager.SYNCHRONIZATION_ON_ACTUAL_TRANSACTION);
return jmsTransactionManager;
}
Well, I think your issue is here:
/**
* Create a new JmsTransactionManager for bean-style usage.
* <p>Note: The ConnectionFactory has to be set before using the instance.
* This constructor can be used to prepare a JmsTemplate via a BeanFactory,
* typically setting the ConnectionFactory via setConnectionFactory.
* <p>Turns off transaction synchronization by default, as this manager might
* be used alongside a datastore-based Spring transaction manager like
* DataSourceTransactionManager, which has stronger needs for synchronization.
* Only one manager is allowed to drive synchronization at any point of time.
* @see #setConnectionFactory
* @see #setTransactionSynchronization
*/
public JmsTransactionManager() {
setTransactionSynchronization(SYNCHRONIZATION_NEVER);
}
So, you have to switch it on manually via setTransactionSynchronization(AbstractPlatformTransactionManager.SYNCHRONIZATION_ALWAYS);
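A sketch of a transaction manager bean with synchronization switched back on, which you can then reference from the poller instead of creating a new JmsTransactionManager inline:
@Bean
public JmsTransactionManager transactionManager() {
    JmsTransactionManager jmsTransactionManager = new JmsTransactionManager(connectionFactory);
    // JmsTransactionManager turns synchronization off by default, so re-enable it explicitly
    jmsTransactionManager.setTransactionSynchronization(
            AbstractPlatformTransactionManager.SYNCHRONIZATION_ALWAYS);
    return jmsTransactionManager;
}
The poller in the flow can then be configured as e.poller(Pollers.fixedDelay(5000).transactionSynchronizationFactory(transactionSynchronizationFactory()).transactional(transactionManager())).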