How to get response from uploading file to sftp? - spring-integration

How can I get a response if a file is uploaded successfully to SFTP? This is the code I have:
@Bean
public SessionFactory<LsEntry> axisSftpSessionFactory() {
    DefaultSftpSessionFactory factory = new DefaultSftpSessionFactory(true);
    factory.setHost(axisSftpProperties.getSftpHost());
    factory.setPort(axisSftpProperties.getSftpPort());
    factory.setUser(axisSftpProperties.getSftpUser());
    factory.setPassword(axisSftpProperties.getSftpPassword());
    factory.setAllowUnknownKeys(true);
    return new CachingSessionFactory<>(factory);
}
/**
 * Handler message handler.
 *
 * @return the message handler
 */
@Bean
@ServiceActivator(inputChannel = TO_SFTP_CHANNEL)
public MessageHandler handler() {
    SftpMessageHandler handler = new SftpMessageHandler(axisSftpSessionFactory());
    handler.setRemoteDirectoryExpression(new LiteralExpression(axisSftpProperties.getSftpRemoteDirectory()));
    handler.setFileNameGenerator(message -> (String) message.getHeaders().get(FILENAME));
    return handler;
}
@Component
@MessagingGateway
public interface UploadGateway {

    @Gateway(requestChannel = TO_SFTP_CHANNEL)
    void upload(@Header(FILENAME) String filename, @Payload byte[] bytes);
}
The idea here is to catch any error if the file is not successfully uploaded to SFTP, so that the upload can be retried.
If I use SftpOutboundGateway, how can I set up the remote directory path?
SftpOutboundGateway gateway = new SftpOutboundGateway(sessionFactory(), "put", "payload");

See the documentation:
The message payload resulting from a put operation is a String that contains the full path of the file on the server after transfer.
Since you have a void return, it is discarded.
So...
String upload(File file);
EDIT
When using the gateway, the third constructor argument is the expression for the file name; the remote directory is provided in the remote file template...
#Bean
public SftpRemoteFileTemplate template() {
SftpRemoteFileTemplate template = new SftpRemoteFileTemplate(sessionFactory());
template.setRemoteDirectoryExpression(new LiteralExpression("foo/test"));
return template;
}
and
new SftpOutboundGateway(template(), "put", "headers['file_name']")
and
System.out.println(gate.upload("foo.txt", "foo".getBytes()));
prints
foo/test/foo.txt
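Putting the answer together: a minimal sketch (untested, reusing the session factory, properties, and the TO_SFTP_CHANNEL / FILENAME constants from the question; the bean name sftpPutGateway is made up) that replaces the SftpMessageHandler with the gateway so a reply reaches the messaging gateway:

@Bean
public SftpRemoteFileTemplate sftpTemplate() {
    SftpRemoteFileTemplate template = new SftpRemoteFileTemplate(axisSftpSessionFactory());
    template.setRemoteDirectoryExpression(new LiteralExpression(axisSftpProperties.getSftpRemoteDirectory()));
    return template;
}

@Bean
@ServiceActivator(inputChannel = TO_SFTP_CHANNEL)
public MessageHandler sftpPutGateway() {
    // "put" replies with the full remote path of the transferred file
    return new SftpOutboundGateway(sftpTemplate(), "put", "headers['" + FILENAME + "']");
}

With that in place, the UploadGateway method can declare String as its return type and receive the remote path; if the transfer fails, the exception propagates to the caller so the upload can be retried.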

Related

How to poll for multiple files at once with Spring Integration with WebFlux?

I have the configuration below for file monitoring using Spring Integration and WebFlux.
It works well, but if I drop in 100 files it will pick up one file at a time with a 10 second gap between the "Received a notification of new file" log messages.
How do I poll for multiple files at once, so I don't have to wait 1000 seconds for all my files to finally register?
@Configuration
@EnableIntegration
public class FileMonitoringConfig {

    private static final Logger logger =
            LoggerFactory.getLogger(FileMonitoringConfig.class.getName());

    @Value("${monitoring.folder}")
    private String monitoringFolder;

    @Value("${monitoring.polling-in-seconds:10}")
    private int pollingInSeconds;

    @Bean
    Publisher<Message<Object>> myMessagePublisher() {
        return IntegrationFlows.from(
                        Files.inboundAdapter(new File(monitoringFolder))
                                .useWatchService(false),
                        e -> e.poller(Pollers.fixedDelay(pollingInSeconds, TimeUnit.SECONDS)))
                .channel(myChannel())
                .toReactivePublisher();
    }

    @Bean
    Function<Flux<Message<Object>>, Publisher<Message<Object>>> myReactiveSource() {
        return flux -> myMessagePublisher();
    }

    @Bean
    FluxMessageChannel myChannel() {
        return new FluxMessageChannel();
    }

    @Bean
    @ServiceActivator(
            inputChannel = "myChannel",
            async = "true",
            reactive = @Reactive("myReactiveSource"))
    ReactiveMessageHandler myMessageHandler() {
        return new ReactiveMessageHandler() {

            @Override
            public Mono<Void> handleMessage(Message<?> message) throws MessagingException {
                return Mono.fromFuture(doHandle(message));
            }

            private CompletableFuture<Void> doHandle(Message<?> message) {
                return CompletableFuture.runAsync(
                        () -> {
                            logger.info("Received a notification of new file: {}", message.getPayload());
                            File file = (File) message.getPayload();
                        });
            }
        };
    }
}
The Inbound Channel Adapter polls a single data record from the source per poll cycle.
Consider adding maxMessagesPerPoll(-1) to your poller() configuration.
See more in docs: https://docs.spring.io/spring-integration/docs/current/reference/html/core.html#channel-adapter-namespace-inbound
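For example (a sketch of the suggested change, everything else left as-is), the poller in myMessagePublisher() becomes:

e -> e.poller(Pollers.fixedDelay(pollingInSeconds, TimeUnit.SECONDS)
        .maxMessagesPerPoll(-1))

With -1, each poll cycle drains every file currently available instead of emitting one file per cycle.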

Spring Kafka outboundChannelAdapter's control does not return back in the integration flow

After a message is sent, it gets published to the Kafka topic, but the Message from KafkaSuccessTransformer does not make it back to the REST controller. I am trying to return the message as-is if it is sent successfully, but nothing after the Kafka handler seems to be invoked.
@MessagingGateway
public interface MyGateway<String, Message<?>> {

    @Gateway(requestChannel = "enrollChannel")
    Message<?> sendMsg(@Payload String payload);
}
------------------------
@RestController
public class Controller {

    MyGateway<String, Message<?>> myGateway;

    @PostMapping
    public Message<?> send(@RequestBody String request) throws Exception {
        Message<?> resp = myGateway.sendMsg(request);
        log.info("I am back"); // control doesn't come to this point
        return resp;
    }
}
--------------------------
@Component
public class MyIntegrationFlow {

    KafkaSuccessTransformer stransformer;

    @Bean
    public MessageChannel enrollChannel() {
        return new DirectChannel();
    }

    @Bean
    public MessageChannel kafkaSuccessChannel() {
        return new DirectChannel();
    }

    @Bean
    public IntegrationFlow enrollIntegrationFlow() {
        return IntegrationFlows.from("enrollChannel")
                // another transformer which turns the string to Message<?>
                .handle(Kafka.outboundChannelAdapter(kafkaTemplate) // kafkaTemplate has the necessary config
                        .topic("topic1")
                        .messageKey(messageKeyFunction -> messageKeyFunction.getHeaders()
                                .get("key1"))
                        .sendSuccessChannel("kafkaSuccessChannel"));
    }

    @Bean
    public IntegrationFlow successfulKafkaSends() {
        return f -> IntegrationFlows.from("kafkaSuccessChannel").transform(stransformer);
    }
}
--------------
@Component
public class KafkaSuccessTransformer {

    @Transformer
    public Message<?> transform(Message<?> message) {
        log.info("Message is sent to Kafka");
        return message; // control comes here but does not return to the REST controller
    }
}
Channel adapters are for one-way traffic; there is no result.
Add a publishSubscribe channel with two subflows; the second one can be just a bridge to nowhere - .bridge() ends the flow. It will then return the outbound message to the gateway.
See https://docs.spring.io/spring-integration/docs/current/reference/html/dsl.html#java-dsl-subflows
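A rough sketch of that suggestion (not the poster's exact code; the Kafka spec details are elided):

@Bean
public IntegrationFlow enrollIntegrationFlow() {
    return IntegrationFlows.from("enrollChannel")
            .publishSubscribeChannel(pubSub -> pubSub
                    // first subscriber: publish to Kafka
                    .subscribe(sub -> sub
                            .handle(Kafka.outboundChannelAdapter(kafkaTemplate)
                                    .topic("topic1"))))
            // second subscriber: a bridge to nowhere - the outbound message is returned
            // to the gateway's reply channel
            .bridge()
            .get();
}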
Per Artem:
Something is off in the configuration or code. The logic is like this: processSendResult(message, producerRecord, sendFuture, getSendSuccessChannel());. Then: getMessageBuilderFactory().fromMessage(message). So, the replyChannel header is present in this "success" message. Therefore that transform(stransformer) should really produce its return to the replyChannel for a gateway in the beginning. Only the problem could be in the KafkaSuccessTransformer code where it does not copy request message headers for reply message. Please, share its whole code.

It is not possible for an entity that requires sessions to create a non-sessionful message receiver

Listener
@JmsListener(destination = "${servicebus.entities.acsTopicToListen.entityName}", containerFactory = "topicJmsListenerContainerFactory", subscription = "${servicebus.entities.acsTopicToListen.subscriptionName}")
public void run(byte[] message, Session session) throws Exception {
    try {
        acsDataHandler.messageProcessor(new String(message));
    } catch (Exception ex) {
        LOGGER.error("Exception thrown while listening to acsDataTopic...." + ex.getMessage());
        exceptionHelper.handleTransformError(INTERNAL_SERVER_ERROR, "Error from AcsDataReceiver listen()",
                ACS0001.name(), ex);
    }
}
Configuration
@Bean
public ConnectionFactory schedulerConnectionFactory(ServicebusConnectionProperties serviceBusJMSProperties) {
    final String connectionString = serviceBusJMSProperties.getConnectionString();
    final String clientId = serviceBusJMSProperties.getTopiClientId();
    final int idleTimeout = serviceBusJMSProperties.getIdleTimeout();
    final ServiceBusKey serviceBusKey = ConnectionStringResolver.getServiceBusKey(connectionString);
    final String host = serviceBusKey.getHost();
    final String sasKeyName = serviceBusKey.getSharedAccessKeyName();
    final String sasKey = serviceBusKey.getSharedAccessKey();
    final String remoteUri = String.format(AMQP_URI_FORMAT, host, idleTimeout);
    final JmsConnectionFactory jmsConnectionFactory = new JmsConnectionFactory();
    jmsConnectionFactory.setRemoteURI(remoteUri);
    jmsConnectionFactory.setClientID(clientId);
    jmsConnectionFactory.setUsername(sasKeyName);
    jmsConnectionFactory.setPassword(sasKey);
    return new CachingConnectionFactory(jmsConnectionFactory);
}

@Bean
public Destination destination() {
    return new JmsTopic(destination);
}

@Bean
public JmsTemplate jmsTemplate(ConnectionFactory jmsConnectionFactory, Destination destination) {
    final JmsTemplate jmsTemplate = new JmsTemplate();
    jmsTemplate.setConnectionFactory(jmsConnectionFactory);
    jmsTemplate.setMessageIdEnabled(true);
    jmsTemplate.setDefaultDestination(destination);
    return jmsTemplate;
}

@Bean
public JmsListenerContainerFactory<?> topicJmsListenerContainerFactory(ConnectionFactory connectionFactory) {
    final DefaultJmsListenerContainerFactory jmsListenerContainerFactory = new DefaultJmsListenerContainerFactory();
    jmsListenerContainerFactory.setConnectionFactory(connectionFactory);
    jmsListenerContainerFactory.setSubscriptionDurable(Boolean.TRUE);
    jmsListenerContainerFactory.setSessionAcknowledgeMode(Session.CLIENT_ACKNOWLEDGE);
    return jmsListenerContainerFactory;
}
I am using the Azure Service Bus Spring Boot Starter to connect to a Service Bus topic/subscription that is session-enabled, but it is unable to connect and fails with the message below:
It is not possible for an entity that requires sessions to create a non-sessionful message receiver.
In Java, session support works with the azure-servicebus library: in the QueuesGettingStarted.java example, change queueClient.registerMessageHandler to queueClient.registerSessionHandler and make the related changes.
But in this case, please check:
https://github.com/Azure/azure-service-bus/issues/326#issuecomment-573236250
https://github.com/MicrosoftDocs/azure-dev-docs/issues/285#issuecomment-699573311

Watch remote directory for added files and stream it for reading data over SFTP

I want to watch a remote machine for newly added or unread CSV files. Once files are identified, read them in order of the timestamp contained in the file name. The files should be read by streaming rather than copying them to the local machine. While a file is being read, append _reading to the filename, and append _read once it has been read. The files will be read over SFTP, and I am planning to use Spring Integration SFTP. If there is an error while reading a file, or the data in the file is not as expected, I want to move that file to a sub-directory.
I have tried polling the remote directory and reading one CSV file at a time. Once read, I remove the file from the directory.
<dependency>
    <groupId>org.springframework.integration</groupId>
    <artifactId>spring-integration-sftp</artifactId>
    <version>5.1.0.RELEASE</version>
</dependency>
<dependency>
    <groupId>org.springframework.integration</groupId>
    <artifactId>spring-integration-core</artifactId>
    <version>5.0.6.RELEASE</version>
</dependency>
Spring boot version 2.0.3.RELEASE
@Bean
public SessionFactory<ChannelSftp.LsEntry> sftpSessionFactory() {
    DefaultSftpSessionFactory factory = new DefaultSftpSessionFactory(true);
    factory.setHost(hostname);
    factory.setPort(22);
    factory.setUser(username);
    factory.setPassword(password);
    factory.setAllowUnknownKeys(true);
    return new CachingSessionFactory<ChannelSftp.LsEntry>(factory);
}

@Bean
public MessageSource<InputStream> sftpMessageSource() {
    SftpStreamingMessageSource messageSource = new SftpStreamingMessageSource(template());
    messageSource.setRemoteDirectory(path);
    messageSource.setFilter(compositeFilters());
    return messageSource;
}

public CompositeFileListFilter compositeFilters() {
    return new CompositeFileListFilter()
            .addFilter(new SftpRegexPatternFileListFilter(".*csv"));
}

@Bean
public SftpRemoteFileTemplate template() {
    return new SftpRemoteFileTemplate(sftpSessionFactory());
}
@Bean
public IntegrationFlow sftpOutboundListFlow() {
    return IntegrationFlows.from(this.sftpMessageSource(), e -> e.poller(Pollers.fixedDelay(5, TimeUnit.SECONDS)))
            .handle(Sftp.outboundGateway(template(), NLST, path).options(Option.RECURSIVE))
            .filter(compositeFilters())
            .transform(sorter())
            .split()
            .handle(Sftp.outboundGateway(template(), GET, "headers['file_remoteDirectory'] + headers['file_remoteFile']").options(STREAM))
            .transform(csvToPojoTransformer())
            .handle(service())
            .handle(Sftp.outboundGateway(template(), MV, "headers['file_remoteDirectory'] + headers['file_remoteFile'] + '_read'"))
            .handle(after())
            .get();
}
@Bean
public MessageHandler sorter() {
    return new MessageHandler() {

        @Override
        public void handleMessage(Message<?> message) throws MessagingException {
            List<String> fileNames = (List<String>) message.getPayload();
            Collections.sort(fileNames);
        }
    };
}

@Bean
public MessageHandler csvToPojoTransformer() {
    return new MessageHandler() {

        @Override
        public void handleMessage(Message<?> message) throws MessagingException {
            InputStream streamData = (InputStream) message.getPayload();
            convertStreamtoObject(streamData, Class.class);
        }
    };
}

public List<?> convertStreamtoObject(InputStream inputStream, Class clazz) {
    HeaderColumnNameMappingStrategy ms = new HeaderColumnNameMappingStrategy();
    ms.setType(clazz);
    Reader reader = new InputStreamReader(inputStream);
    CsvToBean cb = new CsvToBeanBuilder(reader)
            .withType(clazz)
            .withMappingStrategy(ms)
            .withSkipLines(0)
            .withSeparator('|')
            .withThrowExceptions(true)
            .build();
    return cb.parse();
}

@Bean
public MessageHandler service() {
    return new MessageHandler() {

        @Override
        public void handleMessage(Message<?> message) throws MessagingException {
            List<Class> csvDataAsListOfPojo = (List<Class>) message.getPayload();
            // use this
        }
    };
}
@Bean
public ExpressionEvaluatingRequestHandlerAdvice after() {
    ExpressionEvaluatingRequestHandlerAdvice advice = new ExpressionEvaluatingRequestHandlerAdvice();
    advice.setSuccessChannelName("success.input");
    advice.setOnSuccessExpressionString("payload + ' was successful'");
    advice.setFailureChannelName("failure.input");
    advice.setOnFailureExpressionString("payload + ' was bad, with reason: ' + #exception.cause.message");
    advice.setTrapException(true);
    return advice;
}

@Bean
public IntegrationFlow success() {
    return f -> f.handle(System.out::println);
}

@Bean
public IntegrationFlow failure() {
    return f -> f.handle(System.out::println);
}
Updated Code
For complex scenarios (list, move, fetch, remove, etc), you should use SFTP remote file gateways instead.
The SFTP outbound gateway provides a limited set of commands that let you interact with a remote SFTP server:
ls (list files)
nlst (list file names)
get (retrieve a file)
mget (retrieve multiple files)
rm (remove file(s))
mv (move and rename file)
put (send a file)
mput (send multiple files)
Or use the SftpRemoteFileTemplate directly from your code.
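For illustration only (not part of the original answer; the remote paths here are made up, and template() is the bean from the question), using the template directly might look something like this:

SftpRemoteFileTemplate sftpTemplate = template();

// list remote file names within a single SFTP session
sftpTemplate.execute(session -> {
    String[] names = session.listNames("remote/dir");
    // decide here which files still need to be fetched
    return null;
});

// mark a file as processed by renaming it on the server
sftpTemplate.rename("remote/dir/data.csv", "remote/dir/data.csv_read");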
EDIT
In response to your comments, you need something like this:
Inbound Channel Adapter (with poller) - returns directory name
LS Gateway
Filter (remove any files already fetched)
Transformer (sort the list)
Splitter
GET Gateway(stream option)
Transformer (csv to POJO)
Service (process POJO)
If you add
RM Gateway
after your service (to remove the remote file), you don't need the filter step.
You might find the Java DSL simpler for assembling this flow...
@Bean
public IntegrationFlow flow() {
    return IntegrationFlows.from(() -> "some/dir", e -> e.poller(...))
            .handle(...) // LS Gateway
            .filter(...)
            .transform(sorter())
            .split()
            .handle(...) // GET Gateway
            .transform(csvToPojoTransformer())
            .handle(myService())
            .get();
}
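If the remote file should be renamed (rather than removed) after processing, a rough, untested variant of the mv step, reusing the header names and static imports from the question's code, might be:

.handle(Sftp.outboundGateway(template(), MV,
            "headers['file_remoteDirectory'] + headers['file_remoteFile']")
        // the rename target comes from renameExpression; here it just appends '_read'
        .renameExpression("headers['file_remoteDirectory'] + headers['file_remoteFile'] + '_read'"))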

Returning a generated file and then deleting it off the server

I have a ServiceStack Service, and the service generates a .zip file then returns it via:
result = new HttpResult(new FileInfo(zipFileName), asAttachment: false);
followed by (later)
Directory.Delete(dir); // the directory containing the zip file
return result;
The problem I have is that I now want to delete the generated file, but I can't because it's still busy; the delete fails with an access violation.
What's the best way to handle this? Is there a way to write the whole contents to the response stream, which would free up the directory?
There are a number of different ways to return binary responses which can be seen in the ImageService: e.g. you can:
return byte[], Stream, IStreamWriter from your Service which get written directly to the response
wrap byte[], Stream responses in a HttpResult to also customize the HTTP Response headers
write directly to the base.Response in your service
return a custom result
Here's a custom Result example that implements IStreamWriter which writes the file to the response stream and deletes the parent directory of the containing file in the Dispose() method:
public class ZipFileResult : IDisposable, IStreamWriter, IHasOptions
{
    private readonly FileInfo fileInfo;

    public ZipFileResult(FileInfo zipInfo, string contentType="application/zip")
    {
        fileInfo = zipInfo;
        Options = new Dictionary<string, string> {
            { HttpHeaders.ContentType, contentType }
        };
    }

    public void WriteTo(Stream responseStream)
    {
        using (var fs = fileInfo.OpenRead())
        {
            fs.WriteTo(responseStream);
            return;
        }
    }

    public void Dispose()
    {
        Directory.Delete(fileInfo.DirectoryName);
    }

    public IDictionary<string, string> Options { get; set; }
}
