Transferring a PDF file from an SFTP server (CompleteFTP) with Spring Integration

I tried to run the example at https://github.com/spring-projects/spring-integration-samples/tree/master/basic/sftp
I tried to transfer files from a remote to a local directory with the example given in the release (SftpInboundReceiveSample.java).
To do so, I installed a real SFTP server (CompleteFTP), created an account, generated a private/public key pair for the user, and then tried to launch the sample test class.
My goal was to retrieve binary PDF files from the remote SFTP server and store them in a local directory.
Unfortunately, the sample program gets stuck when trying to receive the first .pdf file.
That is: a temporary test1.pdf.writing file is created, but the file is never completely downloaded,
so after a while the test ends with an error. Can somebody help me determine what to do, please?
Below is the configuration I use in my Spring XML file:
<int-sftp:inbound-channel-adapter id="sftpInbondAdapter"
        auto-startup="false"
        channel="receiveChannel"
        session-factory="sftpSessionFactory"
        local-directory="file:local-mumuFolder"
        remote-directory="mumuFolder"
        auto-create-local-directory="true"
        delete-remote-files="false"
        filename-regex=".*\.pdf$">
    <int:poller fixed-rate="1000" max-messages-per-poll="1"/>
</int-sftp:inbound-channel-adapter>
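For reference, here is a minimal sketch (in Java configuration) of how the sftpSessionFactory referenced above could be defined; the host, port, user, and key path are placeholder assumptions, not values from the original setup:

import org.springframework.core.io.FileSystemResource;
import org.springframework.integration.sftp.session.DefaultSftpSessionFactory;

@Bean
public DefaultSftpSessionFactory sftpSessionFactory() {
    // All values below are placeholders for the real CompleteFTP account details.
    DefaultSftpSessionFactory factory = new DefaultSftpSessionFactory();
    factory.setHost("localhost");
    factory.setPort(22);
    factory.setUser("mumu");
    factory.setPrivateKey(new FileSystemResource("/path/to/private_key")); // key generated for the user
    factory.setAllowUnknownKeys(true); // acceptable for a local test, not for production
    return factory;
}

If the private key is protected by a passphrase, setPrivateKeyPassphrase(...) would have to be set as well.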
The sample Java code stops here:
public class SftpInboundReceiveSample {

    @Test
    public void runDemo(){
        ConfigurableApplicationContext context =
                new ClassPathXmlApplicationContext("/META-INF/spring/integration/SftpInboundReceiveSample-context.xml", this.getClass());
        RemoteFileTemplate<LsEntry> template = null;
        /*String file1 = "a.txt";
        String file2 = "b.txt";
        String file3 = "c.bar";
        new File("local-dir", file1).delete();
        new File("local-dir", file2).delete();*/
        try {
            PollableChannel localFileChannel = context.getBean("receiveChannel", PollableChannel.class);
            @SuppressWarnings("unchecked")
            SessionFactory<LsEntry> sessionFactory = context.getBean(CachingSessionFactory.class);
            template = new RemoteFileTemplate<LsEntry>(sessionFactory);
            //SftpTestUtils.createTestFiles(template, file1, file2, file3);
            SourcePollingChannelAdapter adapter = context.getBean(SourcePollingChannelAdapter.class);
            adapter.start();
            Message<?> received = localFileChannel.receive();

The localFileChannel.receive() call blocks indefinitely at this point.
Can somebody please help me?
Thanks and regards

Related

How to populate an unknown Cucumber or SpecFlow step argument that can only be generated during execution?

Given I executed some steps, then on a certain step I get a value from a database cell. Since this value is unknown prior to execution, I cannot use any binding or table value defined in the feature file. Is there any way to populate this value into the step definition so that it shows up in the report?
For example, a feature file:
Given I drop the file to the server's UNC path
When the file is processed successfully
Then a new account is loaded as (.*) (this is the number generated at runtime)
The account can only be known at the last step, through a connection to the database. Is there any way to put it into the step definition so that later it shows as:
Then a new account is loaded as 100051359
What you want to do is not possible with SpecFlow. However, you can still get a good test out of this, but you will likely need to share data between steps using the ScenarioContext.
The step that processes the file will need to know the newly loaded account Id. Then that step can put that account Id in the ScenarioContext:
[Binding]
public class FileSteps
{
    private readonly ScenarioContext scenario;

    public FileSteps(ScenarioContext scenario)
    {
        this.scenario = scenario;
    }

    [When(@"the file is processed successfully")]
    public void WhenTheFileIsProcessedSuccessfully()
    {
        var account = // process the file
        scenario.Set(account.Id, "AccountId");
    }
}
Later when making the assertion, get the account Id from the scenario context before making your assertion:
[Binding]
public class AccountSteps
{
    private readonly ScenarioContext scenario;

    public AccountSteps(ScenarioContext scenario)
    {
        this.scenario = scenario;
    }

    [Then(@"a new account is loaded")]
    public void ThenANewAccountIsLoaded()
    {
        var account = accountRepository.Find(scenario.Get<int>("AccountId"));
        // Assert something about the account
    }
}
And your test becomes:
Scenario: ...
Given I drop the file to the server's UNC path
When the file is processed successfully
Then a new account is loaded

SAP Hybris: how to convert a CSV file to a media item in hMC

I am a beginner with SAP Hybris. I created a CronJob that works perfectly: it returns all the products with status approved and generates a CSV file locally under C://...
But I want to create or convert my CSV file into a media item in hMC. Can someone help me?
I have already gone through the Hybris wiki, but I didn't understand it.
Thank you all!
To achieve this, you only need to create your Media object and attach your file to the created object, something like:
private MediaModel createMedia(final File file) throws MediaIOException, IllegalArgumentException, FileNotFoundException
{
    final CatalogVersionModel catalogVersion = catalogVersionService.getCatalogVersion("MY_MEDIA_CATALOG", "VERSION");
    MediaModel mediaModel;
    try
    {
        // Reuse the media if one with this code already exists in the catalog version
        mediaModel = mediaService.getMedia(catalogVersion, file.getName());
    }
    catch (final UnknownIdentifierException e)
    {
        // Otherwise create a fresh one
        mediaModel = modelService.create(MediaModel.class);
    }
    mediaModel.setCode(file.getName());
    mediaModel.setCatalogVersion(catalogVersion);
    mediaModel.setMime("text/csv");
    mediaModel.setRealFileName(file.getName());
    modelService.save(mediaModel);
    // Attach the CSV content to the saved media
    mediaService.setStreamForMedia(mediaModel, new FileInputStream(file));
    // Remove the local file
    FileUtils.removeFile(file);
    return mediaModel;
}
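For illustration, a minimal sketch of how this helper could be called from the CronJob once the CSV has been generated; the file path and the surrounding performable are assumptions, not part of the original answer:

// Hypothetical call site inside the CronJob's performable; the path is a placeholder.
@Override
public PerformResult perform(final CronJobModel cronJob)
{
    final File csv = new File("C:/exports/approved-products.csv"); // placeholder for the file the job writes
    try
    {
        final MediaModel media = createMedia(csv); // the helper shown above
        return new PerformResult(CronJobResult.SUCCESS, CronJobStatus.FINISHED);
    }
    catch (final FileNotFoundException e)
    {
        return new PerformResult(CronJobResult.ERROR, CronJobStatus.ABORTED);
    }
}

The media then shows up in hMC under the catalog version passed to catalogVersionService.getCatalogVersion(...).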

Spring Integration - FileSystemPersistentAcceptOnceFileListFilter filtering files with same name but different timestamp

I have a Spring integration application that does some processing on a file once it exists within a local directory. After it processes the file, it moves the file to a processed directory.
Some time later, a new file appears in that same local directory with the same file name but different content and time stamp. The application should once again process the file and then move it to a processed directory... but a message is never generated. Here's the config:
@Bean
@InboundChannelAdapter(value = "dailyFileInputChannel", poller = @Poller(maxMessagesPerPoll = "1", fixedDelay = "${load.manualPollingInterval}"))
public MessageSource<File> messageSource(ApplicationProperties applicationProperties) {
    FileReadingMessageSource source = new FileReadingMessageSource();
    source.setDirectory(applicationProperties.getLoadDirectory());
    CompositeFileListFilter<File> compositeFileListFilter = new CompositeFileListFilter<File>();
    compositeFileListFilter.addFilter(new LastModifiedFileListFilter());
    FileSystemPersistentAcceptOnceFileListFilter persistent = new FileSystemPersistentAcceptOnceFileListFilter(store(), "dailyfilesystem");
    persistent.setFlushOnUpdate(true);
    compositeFileListFilter.addFilter(persistent);
    compositeFileListFilter.addFilter(new SimplePatternFileListFilter(applicationProperties.getFileNamePattern()));
    source.setFilter(compositeFileListFilter);
    return source;
}

@Bean
public PropertiesPersistingMetadataStore store() {
    PropertiesPersistingMetadataStore store = new PropertiesPersistingMetadataStore();
    store.setBaseDirectory(applicationProperties.getProcessedStoreDirectory());
    store.setFileName(applicationProperties.getProcessedStoreFile());
    return store;
}

@Bean
@ServiceActivator(inputChannel = "dailyFileInputChannel")
public MessageHandler handler() {
    // return a handler that processes and moves the file
}
I do not want the application to process a file with the same name and the same modified time stamp. How can I ensure the application still processes files with the same name but different time stamps?
Use a ChainFileListFilter instead of a CompositeFileListFilter.
The latter presents all files to each filter, so if the LastModifiedFileListFilter filters a file on the first attempt (while the FileSystemPersistentAcceptOnceFileListFilter passes it, and records it), the composite filter filters the file; on the next attempt it will be filtered again, even though it now passes the first filter.
The ChainFileListFilter won't pass a file filtered by the LastModifiedFileListFilter to the next filter.
This was a recent "fix" in 4.3.7 (see the JIRA issue).
The current version is 4.3.8.
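For illustration, a minimal sketch of the change inside the messageSource() bean from the question, keeping the same filters in the same order; only the composite class is swapped for ChainFileListFilter:

// Inside messageSource(...) from the question: swap CompositeFileListFilter
// for ChainFileListFilter, so a file rejected by an earlier filter is never
// seen (and never recorded) by the filters after it.
ChainFileListFilter<File> chainFileListFilter = new ChainFileListFilter<File>();
chainFileListFilter.addFilter(new LastModifiedFileListFilter());
FileSystemPersistentAcceptOnceFileListFilter persistent =
        new FileSystemPersistentAcceptOnceFileListFilter(store(), "dailyfilesystem");
persistent.setFlushOnUpdate(true);
chainFileListFilter.addFilter(persistent);
chainFileListFilter.addFilter(new SimplePatternFileListFilter(applicationProperties.getFileNamePattern()));
source.setFilter(chainFileListFilter);

With this wiring, a file that is too new for the LastModifiedFileListFilter is not recorded by the persistent filter, so it can still be picked up on a later poll.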

Same file gets picked up again and again in spring-ftp but with different names

I have a spring input channel defined like this
<file:inbound-channel-adapter prevent-duplicates="false"
        id="inpChannel"
        directory="file:/Users/abhisheksingh/req"
        auto-startup="true">
    <int:poller id="poller" fixed-delay="1000" />
</file:inbound-channel-adapter>
<int:service-activator input-channel="inpChannel" ref="inpHandler" />
An example file name is TEST.SQQ; SQQ is the file format the client uses when placing files on the FTP server. However, I see that the same file is picked up by the Spring FTP adapter again and again with a different file name each time. The first time it is TEST.SQQ, the next time it is TEST.SQQ-20170204.PQQ, and the time after that it is TEST.SQQ-20170204.PQQ.20170304.PQQ. This keeps on continuing. I have a filter on my end which checks the name of the file already processed, but since the file name being polled is different each time, all of these files are picked up for processing.
This is my ftp adapter -
<int-ftp:inbound-channel-adapter id="sqqFtpInbound"
channel="ftpChannel"
session-factory="sqqFtpClientFactory"
auto-create-local-directory="true"
delete-remote-files="false"
local-filter="acceptAllFileListFilter"
local-directory="file:/Users/abhisheksingh/ddrive/everge_ws/sqqReq" auto-startup="true" >
<int:poller id="poller" fixed-delay="1000" />
</int-ftp:inbound-channel-adapter>
Here is my FTP server directory: [screenshot not reproduced]
Here is my local directory: [screenshot not reproduced]
I don't understand why the same file gets picked up again and again. I would appreciate some help!
This is my file list filter code.
public class TestFileListFilter<F> extends AbstractFileListFilter<F> {

    private static final Logger log = LoggerFactory.getLogger(TestFileListFilter.class);

    @Override
    protected boolean accept(F file) {
        File f = (File) file;
        if (f.getAbsolutePath().contains(".PQQ")) {
            String newDir = "/Users/abhisheksingh/ddrive/sample/pqqReq/";
            String archiveLocation = "/Users/abhisheksingh/ddrive/sample/pqqArchive/";
            String fullName = archiveLocation + f.getName();
            log.info("Check if the file has already been processed " + fullName);
            File fl = new File(fullName);
            final File dir = new File(archiveLocation);
            for (final File child : dir.listFiles()) {
                String archiveName = FilenameUtils.getBaseName(child.getName());
                String inputName = FilenameUtils.getBaseName(fl.getName());
                log.info("Archive file name is " + archiveName);
                log.info("Input file name is " + inputName);
                if (inputName.contains(archiveName)) {
                    log.info("The file is already processed " + inputName);
                }
            }
            if (fl.exists()) {
                log.error("PQQ file has already been processed.");
                removeFile(f); // custom helper, not shown in the post
                return false;
            } else {
                log.info("PQQ File received " + f.getAbsolutePath());
            }
            moveFile(f, newDir); // custom helper, not shown in the post
            return true;
        }
        return false; // added so the snippet compiles: non-.PQQ files are not accepted
    }
}
I think your custom local-filter relies on a fact that doesn't hold: it assumes the files arriving from the remote store are already unique.
You have to ensure that yourself, because it isn't switched on by default.
For this purpose, consider adding a filter option to the <int-ftp:inbound-channel-adapter> that references an AcceptOnceFileListFilter or an FtpPersistentAcceptOnceFileListFilter (a sketch follows below).
We have a JIRA issue on the matter.
Please confirm that this is exactly the issue you are hitting, and we may revise the priority of that ticket and fix it soon.
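A minimal sketch of such a filter as a Java-config bean; the metadata store location and the key prefix are illustrative assumptions, and the filter would be referenced from the adapter with filter="sqqFtpFilter":

@Bean
public PropertiesPersistingMetadataStore ftpMetadataStore() {
    // Persists the names of already-fetched remote files across restarts.
    PropertiesPersistingMetadataStore store = new PropertiesPersistingMetadataStore();
    store.setBaseDirectory("/Users/abhisheksingh/ddrive/metadata"); // placeholder path
    return store;
}

@Bean
public FtpPersistentAcceptOnceFileListFilter sqqFtpFilter(PropertiesPersistingMetadataStore ftpMetadataStore) {
    // Rejects any remote file whose name was already recorded in the store,
    // so the same remote file (e.g. TEST.SQQ) is fetched only once.
    return new FtpPersistentAcceptOnceFileListFilter(ftpMetadataStore, "sqqFtp-");
}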

SSIS script delete or replace Sharepoint library file

I'm trying to use an SSIS script task to copy an Excel file into a SharePoint library when a file with the same name already exists there. It doesn't matter whether it deletes the old file first and then copies the new one, or just copies and replaces it. I can't figure out how to delete the old file, and the copy won't happen until the old one is gone. So far I have:
/*
Microsoft SQL Server Integration Services Script Task
Write scripts using Microsoft Visual C# 2008.
The ScriptMain is the entry point class of the script.
*/
using System;
using System.Data;
using Microsoft.SqlServer.Dts.Runtime;
using System.Windows.Forms;
using System.IO;

namespace ST_be5f0817a6b54483a96a8c9e79402175.csproj
{
    [System.AddIn.AddIn("ScriptMain", Version = "1.0", Publisher = "", Description = "")]
    public partial class ScriptMain : Microsoft.SqlServer.Dts.Tasks.ScriptTask.VSTARTScriptObjectModelBase
    {
        #region VSTA generated code
        enum ScriptResults
        {
            Success = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Success,
            Failure = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Failure
        };
        #endregion

        /*
        The execution engine calls this method when the task executes.
        To access the object model, use the Dts property. Connections, variables, events,
        and logging features are available as members of the Dts property as shown in the following examples.
        To reference a variable, call Dts.Variables["MyCaseSensitiveVariableName"].Value;
        To post a log entry, call Dts.Log("This is my log text", 999, null);
        To fire an event, call Dts.Events.FireInformation(99, "test", "hit the help message", "", 0, true);
        To use the connections collection use something like the following:
        ConnectionManager cm = Dts.Connections.Add("OLEDB");
        cm.ConnectionString = "Data Source=localhost;Initial Catalog=AdventureWorks;Provider=SQLNCLI10;Integrated Security=SSPI;Auto Translate=False;";
        Before returning from this method, set the value of Dts.TaskResult to indicate success or failure.
        To open Help, press F1.
        */
        public void Main()
        {
            string fileDir = (string)Dts.Variables["fileDir"].Value;
            string SPDir = (string)Dts.Variables["SPDir"].Value;

            // Delete the old file from the SharePoint library, if present
            if (File.Exists(Path.Combine(SPDir, "filename.CSV")))
            {
                File.Delete(Path.Combine(SPDir, "filename.CSV"));
            }

            // Copy the new file into the library
            File.Copy(Path.Combine(fileDir, "filename.CSV"), Path.Combine(SPDir, "filename.CSV"));

            Dts.TaskResult = (int)ScriptResults.Success;
        }
    }
}
I think I could delete the file by running a data flow component with the SharePoint list connector before the script and have the script just copy the file over, but I'm trying to avoid that many components and connections, and that method generally sounds more complicated.
Any help or advice would be welcome.
Fixed this issue by changing the relevant line to:
File.Copy(Path.Combine(fileDir, "filename.CSV"), Path.Combine(SPDir, "filename.CSV"), true);
The boolean argument specifies whether to overwrite the existing file; it is "false" if not specified.
