Spring Boot scheduler run in multiple threads - multithreading

I have more than five methods inside a scheduler. Each one triggers a SOAP request, fetches data from a remote system, and saves it to MongoDB via Spring Data.
I don't want these scheduled methods to run sequentially, and if one fails it should not affect the other methods in the scheduler. How can I make each method run in a separate thread?
@Configuration
@EnableScheduling
public class Scheduler {

    private final Logger LOGGER = LoggerFactory.getLogger(Scheduler.class);

    @Autowired
    private MaterialRepository materialRepository;

    @Autowired
    private CustomerRepository customerRepository;

    // fires every 1 minute
    // method 1
    @Scheduled(cron = "0 * * * * ?")
    public void getMaterial() {
        // get data and save to db
    }

    // Using SOAP web services, fetching the values from SAP and saving to MongoDB
    // fires every 1 minute
    // method 2
    @Scheduled(cron = "0 * * * * ?")
    public void getCustomers() {
        // get data and save to db
    }

    @Bean
    public TaskScheduler taskScheduler() {
        return new ConcurrentTaskScheduler(); // single threaded by default
    }
}
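The taskScheduler bean above is why everything runs on one thread. A minimal sketch (not part of the question) of swapping it for a ThreadPoolTaskScheduler, so each @Scheduled method can fire on its own worker thread and a failure in one job does not hold up the others:

import org.springframework.scheduling.TaskScheduler;
import org.springframework.scheduling.concurrent.ThreadPoolTaskScheduler;

@Bean
public TaskScheduler taskScheduler() {
    ThreadPoolTaskScheduler scheduler = new ThreadPoolTaskScheduler();
    scheduler.setPoolSize(6);                    // assumption: roughly one thread per scheduled method
    scheduler.setThreadNamePrefix("sap-sync-");  // hypothetical prefix, handy when reading logs
    return scheduler;
}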

Related

How to notify other RestControllers that a @Scheduled method is running in Spring Boot

I am using a @Scheduled cron job on a method in Controller A:

Controller A {
    @Scheduled(cron = "0 0/2 * * * *", zone = "GMT")
    public void methodA() {
        // Do something
    }
}

Is there a way my ControllerB can be notified while methodA() is busy?

Controller B {
    @Scheduled
    public void methodB() {
        // I want to be informed that a scheduled job is running
    }
}
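No answer is recorded here, but one common pattern is to publish the running state through a small shared bean that both controllers inject. The names below are hypothetical and only sketch the idea:

import java.util.concurrent.atomic.AtomicBoolean;
import org.springframework.stereotype.Component;

@Component
public class ScheduledJobState {                 // hypothetical helper bean
    private final AtomicBoolean methodARunning = new AtomicBoolean(false);

    public void setMethodARunning(boolean running) { methodARunning.set(running); }
    public boolean isMethodARunning() { return methodARunning.get(); }
}

// In Controller A (jobState is an injected ScheduledJobState)
@Scheduled(cron = "0 0/2 * * * *", zone = "GMT")
public void methodA() {
    jobState.setMethodARunning(true);
    try {
        // Do something
    } finally {
        jobState.setMethodARunning(false);
    }
}

// In Controller B
@Scheduled(fixedDelay = 30000)
public void methodB() {
    if (jobState.isMethodARunning()) {
        // a scheduled job is running; react accordingly
    }
}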

Issue using @Transactional annotation above an integration test class in a multithreading environment

When I run an integration test for code that calls a JPA repository from a new thread, I only get the data that was populated while starting the PostgreSQLContainer, and I can't see the data from the script declared above the test class (@Sql(scripts = "data.sql")).
But when I remove the @Transactional annotation from the test, I can get data both from the test's SQL script and from the test container.
My question: is it possible to get the data from the test script in a multithreaded environment without removing the @Transactional annotation?
Thank you for your answer!
Application stack: Spring Boot 2.1 + Testcontainers PostgreSQL 1.10.3 + JUnit 4.12
DB testcontainers config
@TestConfiguration
public class DatabaseTestConfig {

    private static JdbcDatabaseContainer PSQL;

    static {
        PSQL = (PostgreSQLContainer) new PostgreSQLContainer("mdillon/postgis:9.4")
                .withUsername("test")
                .withPassword("test")
                .withDatabaseName("test");
        PSQL.start();
        Arrays.asList("main_data.sql")
                .forEach(DatabaseTestConfig::restoreDump);
        /*
        set db properties
        */
    }

    // must be static to be usable as the method reference DatabaseTestConfig::restoreDump
    private static void restoreDump(String fileName) {
        /*
        insert sql data
        PSQL.copyFileToContainer(fileName)...
        */
    }
}
Base Integration Test class
@RunWith(SpringRunner.class)
@SpringBootTest(classes = { DatabaseTestConfig.class, ProjectApplication.class })
@ActiveProfiles("test-int")
@AutoConfigureMockMvc
@Sql(scripts = "classpath:extra_data.sql") // insert some extra data for all integration tests
public abstract class AbstractIntTest {

    @Autowired
    protected MockMvc mockMvc;
}

Integration test that calls the service where everything happens:

@Transactional
public class SomeIntegrationTest extends AbstractIntTest {

    @Before
    public void setUp() throws IOException {
        //...
    }

    @Test
    public void callServiceTest() throws Exception {
        // mockMvc.perform(post(ENDPOINT_URL)...
    }
}
Service with simplified logic
@Service
@AllArgsConstructor
public class SomeService {

    private final SomeJpaReporistory repo;
    private final ExecutorService executor;

    @Override
    @Transactional
    public SomeData call() {
        return CompletableFuture.supplyAsync(() -> {
            return repo.findAll();
        }, executor).exceptionally(e -> {
            throw new BadRequestException(e.getMessage());
        });
    }
}
When you make the test transactional, the SQL queries in extra_data.sql are performed in a transaction. That transaction is bound to a particular thread and is begun before execution of the test method and rolled back after the test method has completed:
1. Begin transaction
2. Execute extra_data.sql
3. Invoke test method
4. Roll back transaction
In step 3 you are calling repo.findAll() on a separate thread due to your service's use of supplyAsync. As a transaction is bound to a particular thread, this findAll() call is not part of the transaction in which extra_data.sql was executed. To be able to read the data added by extra_data.sql, it would have to be able to read uncommitted changes and perform a dirty read. Postgres does not support the read uncommitted isolation level so this isn't possible.
You'll need to revisit how you're populating your database with test data or your use of transactions in your tests. Perhaps you could apply extra_data.sql to the database in the same manner as main_data.sql so that it's always in place before any tests are executed and before any transactions are begun.
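One way to follow that suggestion (a sketch against the DatabaseTestConfig shown above) is to load extra_data.sql in the same static initializer as main_data.sql and drop the @Sql annotation from the base test class, so the data is committed before any test transaction starts:

static {
    PSQL = (PostgreSQLContainer) new PostgreSQLContainer("mdillon/postgis:9.4")
            .withUsername("test")
            .withPassword("test")
            .withDatabaseName("test");
    PSQL.start();
    // Both dumps are restored (and committed) before any test transaction begins,
    // so the data is visible from any thread.
    Arrays.asList("main_data.sql", "extra_data.sql")
            .forEach(DatabaseTestConfig::restoreDump);
}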
This is how I've solved this problem:
@Test
@Transactional
@Sql(scripts = "/db/extra_data.sql",
        config = @SqlConfig(transactionMode = SqlConfig.TransactionMode.ISOLATED))
void test() {
    // extra_data.sql is executed before this test is run.
}

StreamingMessageSource keeps firing when a filter is applied

I am trying to poll an FTP directory for a certain kind of file. Polling the directory works, but whenever I apply a filter to select files by extension, the message source keeps spamming messages about the file with no regard for the polling delay. Without the filters everything works fine; once I enable them, my application authenticates with the FTP server, downloads the file, and sends the message nonstop, over and over again. I have the following beans:
/**
 * Factory that creates the remote connection
 *
 * @return DefaultSftpSessionFactory
 */
@Bean
public DefaultSftpSessionFactory sftpSessionFactory(@Value("${ftp.host}") String ftpHost,
                                                    @Value("${ftp.port}") int ftpPort,
                                                    @Value("${ftp.user}") String ftpUser,
                                                    @Value("${ftp.pass}") String ftpPass) {
    DefaultSftpSessionFactory factory = new DefaultSftpSessionFactory();
    factory.setAllowUnknownKeys(true);
    factory.setHost(ftpHost);
    factory.setPort(ftpPort);
    factory.setUser(ftpUser);
    factory.setPassword(ftpPass);
    return factory;
}

/**
 * Template to handle remote files
 *
 * @param sessionFactory SessionFactory bean
 * @return SftpRemoteFileTemplate
 */
@Bean
public SftpRemoteFileTemplate fileTemplate(DefaultSftpSessionFactory sessionFactory) {
    SftpRemoteFileTemplate template = new SftpRemoteFileTemplate(sessionFactory);
    template.setAutoCreateDirectory(true);
    template.setUseTemporaryFileName(false);
    return template;
}

/**
 * To listen to multiple directories, declare multiples of this bean with the same inbound channel
 *
 * @param fileTemplate FileTemplate bean
 * @return MessageSource
 */
@Bean
@InboundChannelAdapter(channel = "deeplinkAutomated", poller = @Poller(fixedDelay = "6000", maxMessagesPerPoll = "-1"))
public MessageSource inboundChannelAdapter(SftpRemoteFileTemplate fileTemplate) {
    SftpStreamingMessageSource source = new SftpStreamingMessageSource(fileTemplate);
    source.setRemoteDirectory("/upload");
    source.setFilter(new CompositeFileListFilter<>(
            Arrays.asList(new AcceptOnceFileListFilter<>(), new SftpSimplePatternFileListFilter("*.trg"))
    ));
    return source;
}

/**
 * Listener that activates on new messages on the specified input channel
 *
 * @return MessageHandler
 */
@Bean
@ServiceActivator(inputChannel = "deeplinkAutomated")
public MessageHandler handler(JobLauncher jobLauncher, Job deeplinkBatch) {
    return message -> {
        Gson gson = new Gson();
        SFTPFileInfo info = gson.fromJson((String) message.getHeaders().get("file_remoteFileInfo"), SFTPFileInfo.class);
        System.out.println("File to download: " + info.getFilename().replace(".trg", ".xml"));
    };
}
I think AcceptOnceFileListFilter is not suitable for SFTP files: the returned LsEntry objects don't match the ones previously stored in the HashSet, simply because their hashes are different!
Consider using a SftpPersistentAcceptOnceFileListFilter instead.
Also, it would be better to configure the DefaultSftpSessionFactory with isSharedSession:

/**
 * @param isSharedSession true if the session is to be shared.
 */
public DefaultSftpSessionFactory(boolean isSharedSession) {

to avoid session re-creation on each polling task.
You don't have a 6-second delay between calls because you have maxMessagesPerPoll = "-1". That means: keep polling remote files as long as they are present in the remote directory. In your case, because of the AcceptOnceFileListFilter hash issue, you always end up with the same file.
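Putting those suggestions together, a sketch of the adjusted beans (the in-memory SimpleMetadataStore is just an example; a persistent ConcurrentMetadataStore would survive restarts):

@Bean
public DefaultSftpSessionFactory sftpSessionFactory(@Value("${ftp.host}") String ftpHost,
                                                    @Value("${ftp.port}") int ftpPort,
                                                    @Value("${ftp.user}") String ftpUser,
                                                    @Value("${ftp.pass}") String ftpPass) {
    // 'true' enables a shared session, avoiding session re-creation on every poll
    DefaultSftpSessionFactory factory = new DefaultSftpSessionFactory(true);
    factory.setAllowUnknownKeys(true);
    factory.setHost(ftpHost);
    factory.setPort(ftpPort);
    factory.setUser(ftpUser);
    factory.setPassword(ftpPass);
    return factory;
}

@Bean
@InboundChannelAdapter(channel = "deeplinkAutomated", poller = @Poller(fixedDelay = "6000"))
public MessageSource<InputStream> inboundChannelAdapter(SftpRemoteFileTemplate fileTemplate) {
    SftpStreamingMessageSource source = new SftpStreamingMessageSource(fileTemplate);
    source.setRemoteDirectory("/upload");
    // Persistent accept-once filter keyed by file name, so re-listed LsEntry objects
    // with different hashes are still recognized as already processed.
    // maxMessagesPerPoll is left at its default (1 for inbound channel adapters),
    // so the 6-second delay applies between polls.
    source.setFilter(new CompositeFileListFilter<>(Arrays.asList(
            new SftpPersistentAcceptOnceFileListFilter(new SimpleMetadataStore(), "sftp-"),
            new SftpSimplePatternFileListFilter("*.trg"))));
    return source;
}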

How to get a file daily via SFTP using Spring Integration with Java config?

I need to get a file daily via SFTP. I would like to use Spring Integration with Java config. The file is generally available at a specific time each day. The application should try to get the file near that time each day. If the file is not available, it should continue to retry for x attempts. After x attempts, it should send an email to let the admin know that the file is still not available on the SFTP site.
One option is to use SftpInboundFileSynchronizingMessageSource. In the MessageHandler, I can kick off a job to process the file. However, I really don't need synchronization with the remote file system; after all, it is a scheduled delivery of the file. Plus, I need to delay at most 15 minutes before the next retry, and polling every 15 minutes seems a bit overkill for a daily file. I guess I could use this, but I would need some mechanism to send an email after a certain time has elapsed and no file was received.
The other option seems to be using the get command of the SFTP Outbound Gateway, but the only examples I can find use XML config.
Update
Adding code after using the help provided in Artem Bilan's answer below:
Configuration class:
@Bean
@InboundChannelAdapter(autoStartup = "true", channel = "sftpChannel", poller = @Poller("pollerMetadata"))
public SftpInboundFileSynchronizingMessageSource sftpMessageSource(ApplicationProperties applicationProperties, PropertiesPersistingMetadataStore store) {
    SftpInboundFileSynchronizingMessageSource source =
            new SftpInboundFileSynchronizingMessageSource(sftpInboundFileSynchronizer(applicationProperties));
    source.setLocalDirectory(new File("ftp-inbound"));
    source.setAutoCreateLocalDirectory(true);
    FileSystemPersistentAcceptOnceFileListFilter local = new FileSystemPersistentAcceptOnceFileListFilter(store, "test");
    source.setLocalFilter(local);
    source.setCountsEnabled(true);
    return source;
}

@Bean
public PollerMetadata pollerMetadata() {
    PollerMetadata pollerMetadata = new PollerMetadata();
    List<Advice> adviceChain = new ArrayList<Advice>();
    adviceChain.add(retryCompoundTriggerAdvice());
    pollerMetadata.setAdviceChain(adviceChain);
    pollerMetadata.setTrigger(compoundTrigger());
    return pollerMetadata;
}

@Bean
public RetryCompoundTriggerAdvice retryCompoundTriggerAdvice() {
    return new RetryCompoundTriggerAdvice(compoundTrigger(), secondaryTrigger());
}

@Bean
public CompoundTrigger compoundTrigger() {
    CompoundTrigger compoundTrigger = new CompoundTrigger(primaryTrigger());
    return compoundTrigger;
}

@Bean
public Trigger primaryTrigger() {
    return new CronTrigger("*/60 * * * * *");
}

@Bean
public Trigger secondaryTrigger() {
    return new PeriodicTrigger(10000);
}

@Bean
@ServiceActivator(inputChannel = "sftpChannel")
public MessageHandler handler(PropertiesPersistingMetadataStore store) {
    return new MessageHandler() {

        @Override
        public void handleMessage(Message<?> message) throws MessagingException {
            System.out.println(message.getPayload());
            store.flush();
        }
    };
}
RetryCompoundTriggerAdvice class:
public class RetryCompoundTriggerAdvice extends AbstractMessageSourceAdvice {

    private final CompoundTrigger compoundTrigger;
    private final Trigger override;
    private int count = 0;

    public RetryCompoundTriggerAdvice(CompoundTrigger compoundTrigger, Trigger overrideTrigger) {
        Assert.notNull(compoundTrigger, "'compoundTrigger' cannot be null");
        this.compoundTrigger = compoundTrigger;
        this.override = overrideTrigger;
    }

    @Override
    public boolean beforeReceive(MessageSource<?> source) {
        return true;
    }

    @Override
    public Message<?> afterReceive(Message<?> result, MessageSource<?> source) {
        if (result == null && count <= 5) {
            count++;
            this.compoundTrigger.setOverride(this.override);
        }
        else {
            this.compoundTrigger.setOverride(null);
            if (count > 5) {
                // send email
            }
            count = 0;
        }
        return result;
    }
}
Since Spring Integration 4.3 there is CompoundTrigger:
 * A {@link Trigger} that delegates the {@link #nextExecutionTime(TriggerContext)}
 * to one of two Triggers. If the {@link #setOverride(Trigger) override} trigger is
 * {@code null}, the primary trigger is invoked; otherwise the override trigger is
 * invoked.
With the combination of CompoundTriggerAdvice:
 * An {@link AbstractMessageSourceAdvice} that uses a {@link CompoundTrigger} to adjust
 * the poller - when a message is present, the compound trigger's primary trigger is
 * used to determine the next poll. When no message is present, the override trigger is
 * used.
it can be used to achieve your task:
The primaryTrigger can be a CronTrigger to run the task only once a day.
The override could be a PeriodicTrigger with the desired short period between retries.
For the retry logic, you can use one more Advice for the poller, or just extend that CompoundTriggerAdvice with counting logic to eventually send an email.
Since there is no file, there is no message to kick off the flow, so we have no choice but to dance around the poller infrastructure.

Use the Android Application class to run a background thread that posts data to a web service

Could I write a thread in the Android Application class?
The thread should run every five minutes and post data to a web service.
public class MyApplication extends Application {

    @Override
    public void onCreate() {
        super.onCreate(); // missing in the original snippet
        startUploadGPSTimer();
    }

    private void startUploadGPSTimer() {
        gpsTimerHandler.postDelayed(runnable, 5 * 60 * 1000); // start timer
    }

    private Handler gpsTimerHandler = new Handler();

    private Runnable runnable = new Runnable() {
        public void run() {
            Map<String, String> params = new HashMap<String, String>();
            params.put("latitude", Global.CUR_LATITUDE);
            params.put("longitude", Global.CUR_LONGITUDE);
            WebServiceObj obj = new WebServiceObj("upload",
                    WebServiceMethod.METHOD_UPLOAD_GPS,
                    Utilly.getSoapParams(params));
            SoapService service = null;
            SoapObject result = null;
            service = new SoapService(obj.tag);
            result = service.LoadResult(obj);
            Log.i("post webservrce ", result.toString());
            gpsTimerHandler.postDelayed(this, 5 * 60 * 1000); // reschedule in 5 minutes
        }
    };
}
When my application goes to the background, this thread seems to stop running: the data I post in the thread should eventually end up in the database, but I can't find it there.
Why?
I added some logging around the web-service call and found that the log entries appear only sporadically.
Very strange.
Intent photosIntent = new Intent(this, MyService.class);
// For a Service target, the PendingIntent should be created with getService(), not getBroadcast()
pendingIntent = PendingIntent.getService(getApplicationContext(), 0, photosIntent, PendingIntent.FLAG_UPDATE_CURRENT);
alarmManager = (AlarmManager) getSystemService(ALARM_SERVICE);
// First trigger 5 minutes from now, then repeat every 5 minutes
alarmManager.setRepeating(AlarmManager.RTC_WAKEUP,
        System.currentTimeMillis() + 5 * 60 * 1000,
        5 * 60 * 1000,
        pendingIntent);
MyService is a Service class: put your web-service call in its start callback. The service will then be invoked every 5 minutes.
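A hypothetical skeleton of MyService (the names and body are assumptions, not part of the original answer):

public class MyService extends Service {

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        // Post the GPS data to the web service off the main thread
        new Thread(new Runnable() {
            public void run() {
                // build the SOAP params and call the web service,
                // as in the Runnable from the question above
            }
        }).start();
        return START_NOT_STICKY;
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null;
    }
}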
Hope this helps.
Thanks
