Refresh / Reinitialize a Spring Bean

I need to refresh / reinitialize a Spring bean. My code snippet is as follows:

@Bean
@Qualifier("service")
Service myService() {
    return new Service(file1, file2);
}

@Scheduled(cron = "${cronExpression}")
public void downloadData() {
    file1 = loadFile1();
    file2 = loadFile2();
    // here I want to refresh the bean with the updated values of file1, file2
}
The cron expression fires once a day, say at 11:00 a.m., and fetches the latest files (file1, file2) from GCP. After it fetches the latest files I want to pass them as arguments and recreate the bean. The Service class is in another module.
How do I achieve this?
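(Not from the original thread; a minimal sketch of one common approach, with hypothetical names.) Rather than destroying and re-registering the singleton, keep the bean itself stable and atomically swap the state it wraps; the scheduled job then just calls refresh():

import java.io.File;
import java.util.concurrent.atomic.AtomicReference;

// Hypothetical wrapper (the file type is assumed to be java.io.File):
// the Spring singleton itself never changes; only the Service it
// delegates to is replaced when fresh files arrive.
public class RefreshableService {

    private final AtomicReference<Service> delegate = new AtomicReference<>();

    public RefreshableService(File file1, File file2) {
        refresh(file1, file2);
    }

    // Call this from the @Scheduled downloadData() method after the new
    // files have been fetched from GCP.
    public void refresh(File file1, File file2) {
        delegate.set(new Service(file1, file2));
    }

    public Service current() {
        return delegate.get();
    }
}

The @Bean method would return a RefreshableService built from the initial files, and downloadData() would simply call refresh(loadFile1(), loadFile2()) on the injected instance; no bean re-registration is needed.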

Related

How to populate an unknown Cucumber or SpecFlow step argument that can only be generated during execution?

Suppose I execute some steps, and on a certain step I get a value from a database cell. Since this value is unknown prior to execution, I cannot use any binding or table value defined in the feature file. Is there any way to populate this value into the step definition so that it shows up in the report?
For example, a feature file:
Given I drop the file to the server's UNC path
When the file is processed successfully
Then a new account is loaded as (.*) (this is the number generated at runtime)
The account can only be known at the last step, through a connection to the database. Is there any way to put it into the step definition so that later it shows as:
Then a new account is loaded as 100051359
What you want to do is not possible with SpecFlow. However, you can still get a good test out of this, but you will likely need to share data between steps using the ScenarioContext.
The step that processes the file will need to know the newly loaded account Id. Then that step can put that account Id in the ScenarioContext:
[Binding]
public class FileSteps
{
    private readonly ScenarioContext scenario;

    public FileSteps(ScenarioContext scenario)
    {
        this.scenario = scenario;
    }

    [When(@"the file is processed successfully")]
    public void WhenTheFileIsProcessedSuccessfully()
    {
        var account = ProcessFile(); // hypothetical helper that processes the dropped file
        scenario.Set(account.Id, "AccountId");
    }
}
Later, get the account Id back out of the scenario context before making your assertion:
[Binding]
public class AccountSteps
{
    private readonly ScenarioContext scenario;
    private readonly AccountRepository accountRepository = new AccountRepository(); // hypothetical repository

    public AccountSteps(ScenarioContext scenario)
    {
        this.scenario = scenario;
    }

    [Then(@"a new account is loaded")]
    public void ThenANewAccountIsLoaded()
    {
        var account = accountRepository.Find(scenario.Get<int>("AccountId"));
        // Assert something about the account
    }
}
And your test becomes:
Scenario: ...
Given I drop the file to the server's UNC path
When the file is processed successfully
Then a new account is loaded

Spring Integration - FileSystemPersistentAcceptOnceFileListFilter filtering files with same name but different timestamp

I have a Spring integration application that does some processing on a file once it exists within a local directory. After it processes the file, it moves the file to a processed directory.
Some time later, a new file appears in that same local directory with the same file name but different content and time stamp. The application should once again process the file and then move it to a processed directory... but a message is never generated. Here's the config:
@Bean
@InboundChannelAdapter(value = "dailyFileInputChannel", poller = @Poller(maxMessagesPerPoll = "1", fixedDelay = "${load.manualPollingInterval}"))
public MessageSource<File> messageSource(ApplicationProperties applicationProperties) {
    FileReadingMessageSource source = new FileReadingMessageSource();
    source.setDirectory(applicationProperties.getLoadDirectory());
    CompositeFileListFilter<File> compositeFileListFilter = new CompositeFileListFilter<File>();
    compositeFileListFilter.addFilter(new LastModifiedFileListFilter());
    FileSystemPersistentAcceptOnceFileListFilter persistent =
            new FileSystemPersistentAcceptOnceFileListFilter(store(), "dailyfilesystem");
    persistent.setFlushOnUpdate(true);
    compositeFileListFilter.addFilter(persistent);
    compositeFileListFilter.addFilter(new SimplePatternFileListFilter(applicationProperties.getFileNamePattern()));
    source.setFilter(compositeFileListFilter);
    return source;
}

@Bean
public PropertiesPersistingMetadataStore store() {
    PropertiesPersistingMetadataStore store = new PropertiesPersistingMetadataStore();
    store.setBaseDirectory(applicationProperties.getProcessedStoreDirectory());
    store.setFileName(applicationProperties.getProcessedStoreFile());
    return store;
}

@Bean
@ServiceActivator(inputChannel = "dailyFileInputChannel")
public MessageHandler handler() {
    // return a handler that processes and moves the file
}
I do not want the application to process a file with the same name and the same modified timestamp twice. How can I ensure, though, that the application still processes files with the same name but different timestamps?
Use a ChainFileListFilter instead of a CompositeFileListFilter.
The latter presents all files to every filter, so if the LastModifiedFileListFilter rejects a file on the first attempt while the FileSystemPersistentAcceptOnceFileListFilter accepts it (and records it as seen), the composite filter rejects the file overall; on the next attempt the accept-once filter rejects it again, even though it now passes the first filter.
The ChainFileListFilter won't pass a file filtered by the LastModifiedFileListFilter to the next filter.
This was a recent "fix" (in 4.3.7 JIRA here).
The current version is 4.3.8.
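A minimal sketch of the adapted messageSource bean (same filters as in the question, only the composite swapped for a chain; ChainFileListFilter lives in org.springframework.integration.file.filters):

@Bean
@InboundChannelAdapter(value = "dailyFileInputChannel", poller = @Poller(maxMessagesPerPoll = "1", fixedDelay = "${load.manualPollingInterval}"))
public MessageSource<File> messageSource(ApplicationProperties applicationProperties) {
    FileReadingMessageSource source = new FileReadingMessageSource();
    source.setDirectory(applicationProperties.getLoadDirectory());
    // A chain stops at the first filter that rejects a file, so a file
    // rejected by LastModifiedFileListFilter is never shown to the
    // accept-once filter and can still be picked up on a later poll.
    ChainFileListFilter<File> chain = new ChainFileListFilter<>();
    chain.addFilter(new LastModifiedFileListFilter());
    FileSystemPersistentAcceptOnceFileListFilter persistent =
            new FileSystemPersistentAcceptOnceFileListFilter(store(), "dailyfilesystem");
    persistent.setFlushOnUpdate(true);
    chain.addFilter(persistent);
    chain.addFilter(new SimplePatternFileListFilter(applicationProperties.getFileNamePattern()));
    source.setFilter(chain);
    return source;
}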

Spring Aggregation Group

I created an aggregator service as below:
@EnableBinding(Processor.class)
class Configuration {

    @Autowired
    Processor processor;

    @ServiceActivator(inputChannel = Processor.INPUT)
    @Bean
    public MessageHandler aggregator() {
        AggregatingMessageHandler aggregatingMessageHandler =
                new AggregatingMessageHandler(new DefaultAggregatingMessageGroupProcessor(),
                        new SimpleMessageStore(10));
        //AggregatorFactoryBean aggregatorFactoryBean = new AggregatorFactoryBean();
        //aggregatorFactoryBean.setMessageStore();
        aggregatingMessageHandler.setOutputChannel(processor.output());
        //aggregatorFactoryBean.setDiscardChannel(processor.output());
        aggregatingMessageHandler.setSendPartialResultOnExpiry(true);
        aggregatingMessageHandler.setSendTimeout(1000L);
        aggregatingMessageHandler.setCorrelationStrategy(new ExpressionEvaluatingCorrelationStrategy("requestType"));
        aggregatingMessageHandler.setReleaseStrategy(new MessageCountReleaseStrategy(3)); //ExpressionEvaluatingReleaseStrategy("size() == 5")
        aggregatingMessageHandler.setExpireGroupsUponCompletion(true);
        aggregatingMessageHandler.setGroupTimeoutExpression(new ValueExpression<>(3000L)); //size() ge 2 ? 5000 : -1
        aggregatingMessageHandler.setExpireGroupsUponTimeout(true);
        return aggregatingMessageHandler;
    }
}
Now I want to release the group as soon as a new group is created, so I only have one group at a time.
To be more specific, I receive two types of requests, 'PUT' and 'DEL'. I want to keep aggregating per the above rules, but as soon as I receive a request type other than the one I am currently aggregating, I want to release the current group and start aggregating the new type.
The reason is that these requests are sent to another party that doesn't support having PUT and DEL requests at the same time, and I can't delay any DEL request, as the sequence between PUT and DEL is important.
I understand that I need to create a custom release POJO, but will I be able to check the current groups?
For example, if I receive 6 messages like below:
PUT PUT PUT DEL DEL PUT
they should be aggregated as:
3 PUT
2 DEL
1 PUT
OK. Thank you for sharing more info.
Yes, your custom ReleaseStrategy can check the message type and return true to trigger completion of the group.
As long as you have only a static correlationKey, only one group is ever in the store. When your message reaches the ReleaseStrategy, there isn't much magic: just check the current group for the completion signal. Since there are no other groups in the store, no complex release logic is needed.
You should add expireGroupsUponCompletion = true to let the group be removed after completion, so the next message forms a new group for the same correlationKey.
UPDATE
Thank you for the further info!
So, yes, your original PoC is good. Even a static correlationKey is fine, since you are just going to collect incoming messages into batches.
Your custom ReleaseStrategy should analyze the MessageGroup for a message with a different key and return true in that case.
The custom MessageGroupProcessor should filter the message with the different key out of the output List and send it back to the aggregator, letting it form a new group for a sequence of its key.
I ended up implementing the ReleaseStrategy below, as I found it simpler than removing the message and queuing it again.
class MessageCountAndOnlyOneGroupReleaseStrategy implements org.springframework.integration.aggregator.ReleaseStrategy {

    private final int threshold;
    private final MessageGroupProcessor messageGroupProcessor;
    private MessageGroup currentGroup;

    public MessageCountAndOnlyOneGroupReleaseStrategy(int threshold, MessageGroupProcessor messageGroupProcessor) {
        super();
        this.threshold = threshold;
        this.messageGroupProcessor = messageGroupProcessor;
    }

    @Override
    public boolean canRelease(MessageGroup group) {
        if (currentGroup == null) {
            currentGroup = group;
        }
        if (!group.getGroupId().equals(currentGroup.getGroupId())) {
            // A new group has started: flush the previous one and keep
            // collecting messages for the new group.
            messageGroupProcessor.processMessageGroup(currentGroup);
            currentGroup = group;
            return false;
        }
        return group.size() >= this.threshold;
    }
}
Note that I used new HeaderAttributeCorrelationStrategy("request_type") instead of just FOO for the CorrelationStrategy.
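For completeness, a hedged sketch (assumed, not from the thread) of plugging this strategy into the aggregator bean from the question; groupProcessor stands for whatever MessageGroupProcessor should flush the superseded group:

AggregatingMessageHandler aggregatingMessageHandler =
        new AggregatingMessageHandler(new DefaultAggregatingMessageGroupProcessor(),
                new SimpleMessageStore(10));
aggregatingMessageHandler.setCorrelationStrategy(new HeaderAttributeCorrelationStrategy("request_type"));
// groupProcessor is assumed: it must not only build the output message
// for the superseded group but also dispatch it downstream.
aggregatingMessageHandler.setReleaseStrategy(
        new MessageCountAndOnlyOneGroupReleaseStrategy(3, groupProcessor));
aggregatingMessageHandler.setExpireGroupsUponCompletion(true);
aggregatingMessageHandler.setOutputChannel(processor.output());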

Transferring a PDF file from an sFTP server (Complete FTP) with Spring Integration

I tried to execute the example at https://github.com/spring-projects/spring-integration-samples/tree/master/basic/sftp
I tried to transfer files from a remote to a local directory with the example given in the release (SftpInboundReceiveSample.java).
To do so, I installed a real sFTP server (Complete FTP), created an account, generated a private/public key pair for the user, and then tried to launch the sample test class.
My goal was to retrieve PDF binary files from the remote sFTP server and store them in a local directory.
Unfortunately the sample program remains stuck when trying to receive the first .pdf file.
That is: a temporary test1.pdf.writing file is created, but the file is never completely downloaded, so after a while the test ends with an error. Can somebody help me determine what to do, please?
Below is the configuration I use in my Spring .xml file:
<int-sftp:inbound-channel-adapter id="sftpInbondAdapter"
        auto-startup="false"
        channel="receiveChannel"
        session-factory="sftpSessionFactory"
        local-directory="file:local-mumuFolder"
        remote-directory="mumuFolder"
        auto-create-local-directory="true"
        delete-remote-files="false"
        filename-regex=".*\.pdf$">
    <int:poller fixed-rate="1000" max-messages-per-poll="1"/>
</int-sftp:inbound-channel-adapter>
The sample Java code stops here:
public class SftpInboundReceiveSample {

    @Test
    public void runDemo() {
        ConfigurableApplicationContext context =
                new ClassPathXmlApplicationContext("/META-INF/spring/integration/SftpInboundReceiveSample-context.xml", this.getClass());
        RemoteFileTemplate<LsEntry> template = null;
        /*String file1 = "a.txt";
        String file2 = "b.txt";
        String file3 = "c.bar";
        new File("local-dir", file1).delete();
        new File("local-dir", file2).delete();*/
        try {
            PollableChannel localFileChannel = context.getBean("receiveChannel", PollableChannel.class);
            @SuppressWarnings("unchecked")
            SessionFactory<LsEntry> sessionFactory = context.getBean(CachingSessionFactory.class);
            template = new RemoteFileTemplate<LsEntry>(sessionFactory);
            //SftpTestUtils.createTestFiles(template, file1, file2, file3);
            SourcePollingChannelAdapter adapter = context.getBean(SourcePollingChannelAdapter.class);
            adapter.start();
            Message<?> received = localFileChannel.receive();

(The localFileChannel.receive() call blocks indefinitely.)
Can somebody please help me?
Thanks and regards
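Side note (not from the original post): independent of the root cause, PollableChannel.receive() with no arguments blocks forever. Giving the test a receive timeout makes it fail fast instead of hanging:

// receive(long timeout) returns null if nothing arrives in time, so the
// test can fail with a clear message instead of blocking indefinitely.
// (assertNotNull is JUnit's org.junit.Assert.assertNotNull)
Message<?> received = localFileChannel.receive(30000);
assertNotNull("No file received within 30 seconds", received);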

How to create a timer counter in JSF

I want to create an online test system, so I need a timer. When the user starts the test, the timer starts counting down. But when the user goes to the next question or refreshes the page, the timer must keep running. How can I do that? Any suggestions?
You can use the PrimeFaces Poll component.
The remaining time is always kept on the server.
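(A hedged sketch of the poll-based variant, with assumed bean and component ids.) The page re-reads the server-side value every second, so a refresh or navigation cannot reset the countdown:

Time remaining: <h:outputText id="remaining" value="#{userData.secondsRemaining()}"/>
<p:poll interval="1" update="remaining"/>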
You can also use PrimeFaces Extensions; they have a timer in their API.
Example: put the start time of the test into a @SessionScoped bean, then calculate the seconds remaining, which is used as input to the <pe:timer> element:
@SessionScoped
public class UserData {

    // Assumed test length; adjust as needed.
    private static final int TEST_DURATION_SECONDS = 3600;

    private Date testStart;

    /* getters and setters for testStart */

    public int secondsRemaining() {
        // seconds remaining until the end of the test, measured from testStart
        long elapsed = (System.currentTimeMillis() - testStart.getTime()) / 1000;
        return (int) Math.max(0, TEST_DURATION_SECONDS - elapsed);
    }
}
and then your test.xhtml would contain the element:
Time remaining: <pe:timer format="HH:mm:ss"
timeout="#{userData.secondsRemaining()}"/>
