Spring Integration File Inbound Adapter Scan Directory Each Poll - spring-integration

I would like to enhance my current file inbound channel adapter so that it rescans the directory and refreshes the file listing in the queue on each poll.
Below is the XML config for my current file inbound channel adapter:
<int-file:inbound-channel-adapter id="hostFilesOut" channel="hostFileOutChannel"
        directory="${hostfile.dir.out}" prevent-duplicates="false"
        filename-regex="${hostfile.out.filename-regex}">
    <int:poller id="poller" cron="${poller.cron:0,4,8,12,16,20,24,28,32,36,40,44,48,52,56 * * * * * }"
            max-messages-per-poll="1"/>
</int-file:inbound-channel-adapter>
I have tried to create a custom scanner to read the files. However, attaching the scanner to the file inbound channel adapter causes the cron configuration to stop working.
Can someone give advice on this, or is there another way to achieve the same goal?
Thank you.

The FileReadingMessageSource already has such an option:
/**
 * Optional. Set this flag if you want to make sure the internal queue is
 * refreshed with the latest content of the input directory on each poll.
 * <p>
 * By default this implementation will empty its queue before looking at the
 * directory again. In cases where order is relevant it is important to
 * consider the effects of setting this flag. The internal
 * {@link java.util.concurrent.BlockingQueue} that this class is keeping
 * will more likely be out of sync with the file system if this flag is set
 * to <code>false</code>, but it will change more often (causing expensive
 * reordering) if it is set to <code>true</code>.
 *
 * @param scanEachPoll
 *            whether or not the component should re-scan (as opposed to not
 *            rescanning until the entire backlog has been delivered)
 */
public void setScanEachPoll(boolean scanEachPoll) {
However, I'm surprised that we don't have that option exposed in the XML configuration, although it has been there since day one: https://jira.spring.io/browse/INT-583.
Here is a Doc on the matter.
As a workaround you can create a FileReadingMessageSource bean and use it as a ref in the <int:inbound-channel-adapter>. Another way to proceed is Annotations or Java DSL configuration. You can find some samples in the Doc mentioned above.
For the XML support, please raise a JIRA and we will add such an XSD definition. Also, don't hesitate to provide a contribution on the matter!
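For illustration, a rough sketch of that workaround could look like the following. The bean ids are just placeholders; the regex filter is wired in explicitly here because filename-regex and prevent-duplicates are convenience attributes of the <int-file:> namespace, not properties of FileReadingMessageSource itself:
<bean id="hostFilesOutSource"
      class="org.springframework.integration.file.FileReadingMessageSource">
    <property name="directory" value="${hostfile.dir.out}"/>
    <property name="scanEachPoll" value="true"/>
    <property name="filter">
        <bean class="org.springframework.integration.file.filters.RegexPatternFileListFilter">
            <constructor-arg value="${hostfile.out.filename-regex}"/>
        </bean>
    </property>
</bean>

<int:inbound-channel-adapter id="hostFilesOut" ref="hostFilesOutSource"
        channel="hostFileOutChannel">
    <int:poller cron="${poller.cron:0,4,8,12,16,20,24,28,32,36,40,44,48,52,56 * * * * * }"
            max-messages-per-poll="1"/>
</int:inbound-channel-adapter>
With this arrangement the cron poller is still defined on the adapter, while scanEachPoll is set directly on the MessageSource bean.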

Related

Can the http-inbound gateway do something like an interceptor's preHandle and postHandle?

I want to put a UUID into the MDC when the service accepts an HTTP request, because it's convenient for log searching.
I tried inheriting from HttpRequestHandlingMessagingGateway and found that handleRequest() is final, so I can't override it.
So, is there a way to do something when a request is accepted (MDC.put()) and when the response is written (MDC.remove())?
Well, it is not the HttpRequestHandlingMessagingGateway's responsibility to manipulate a request that way.
I think you need to take a look into Web Filter registration: https://www.mkyong.com/spring-mvc/how-to-register-a-servlet-filter-in-spring-mvc/
As a starting point, you can borrow from the existing AbstractRequestLoggingFilter:
 * Base class for {@code Filter}s that perform logging operations before and after a request
 * is processed.
 *
 * <p>Subclasses should override the {@code beforeRequest(HttpServletRequest, String)} and
 * {@code afterRequest(HttpServletRequest, String)} methods to perform the actual logging
 * around the request.
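For example, a minimal sketch of such a filter based on OncePerRequestFilter (which AbstractRequestLoggingFilter itself extends); the class name and the "requestId" MDC key are just illustrative assumptions:
import java.io.IOException;
import java.util.UUID;

import javax.servlet.FilterChain;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.slf4j.MDC;
import org.springframework.web.filter.OncePerRequestFilter;

// Hypothetical filter: put a UUID into the MDC before the request is processed
// and remove it after the response has been written.
public class RequestIdMdcFilter extends OncePerRequestFilter {

    @Override
    protected void doFilterInternal(HttpServletRequest request, HttpServletResponse response,
            FilterChain filterChain) throws ServletException, IOException {
        MDC.put("requestId", UUID.randomUUID().toString());
        try {
            filterChain.doFilter(request, response);
        }
        finally {
            MDC.remove("requestId");
        }
    }
}
Register it like any other servlet filter (see the link above) and map it to the URL pattern served by your HTTP inbound gateway; the finally block guarantees the MDC entry is removed after the response is written.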

How to set up cron dynamically via admin in Magento 2

How can I set up cron dynamically in config.xml (custom module) in Magento 2?
Magento 2 has a different scheme for merging configuration, so you have to create a new file called crontab.xml under the your_custom_module/etc folder. Then you can add your cron config like this one:
<?xml version="1.0"?>
<config xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:noNamespaceSchemaLocation="urn:magento:module:Magento_Cron:etc/crontab.xsd">
    <group id="default">
        <job name="custom_cronjob" instance="YourVenDoerName\CustomModule\Cron\Test" method="execute">
            <schedule>* * * * *</schedule>
        </job>
    </group>
</config>
I will try to make a proposition, though I am not sure it completely answers your question.
config.xml sets a default value for your configuration field defined in system.xml.
So you can have another cron job that runs every minute (* * * * *) and dynamically changes the value set in system.xml. Something like this:
public function __construct(
    \Magento\Framework\App\Config\ConfigResource\ConfigInterface $resourceConfig
) {
    $this->resourceConfig = $resourceConfig;
}

public function execute()
{
    $newvalue = $dynamicvalue;
    $this->resourceConfig->saveConfig(
        'section/group/field',
        $newvalue,
        \Magento\Framework\App\Config\ScopeConfigInterface::SCOPE_TYPE_DEFAULT,
        \Magento\Store\Model\Store::DEFAULT_STORE_ID
    );
}
So, basically, two cron jobs: one that actually does the job you want and one that tweaks its schedule.
You can also tweak its schedule dynamically in an observer, plugin, or some other class, depending on your needs, using the code above.

How to put 25k records to a Kinesis stream, and a test tool to measure it

I have developed a piece of software that writes records to the Amazon Kinesis Stream web service. I am trying to understand whether there is a software tool that will allow me to measure the maximum throughput my code generates to a Kinesis stream for one shard in one second.
Yes, I agree it depends on the hardware configuration too, but for a start I want to know for a general-purpose machine; then I may be able to see the horizontal scalability.
With this I am trying to achieve writing 25k records per second to the Kinesis stream.
Reference : Kinesis http://aws.amazon.com/kinesis/
I believe you can use Apache JMeter for this, as follows:
1. Download and install JMeter.
2. Download the Amazon Kinesis Java Client Library and drop the jars into the JMeter classpath (you can use the /lib folder of your JMeter installation).
3. Using a JSR223 Sampler with "groovy" as the language, and AmazonKinesisRecordProducerSample as a reference, implement the code which will write records to the stream.
See the Beanshell vs JSR223 vs Java JMeter Scripting: The Performance-Off You've Been Waiting For! guide for instructions on installing the "groovy" engine and for scripting best practices.
Thanks for the hints. I have figured out working Groovy code that uses the AWS Java SDK to send records to a Kinesis stream.
Here is the sample code:
/*
* Copyright 2012-2016 Amazon.com, Inc. or its affiliates. All Rights Reserved.
*
* Licensed under the Apache License, Version 2.0 (the "License").
* You may not use this file except in compliance with the License.
* A copy of the License is located at
*
* http://aws.amazon.com/apache2.0
*
* or in the "license" file accompanying this file. This file is distributed
* on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either
* express or implied. See the License for the specific language governing
* permissions and limitations under the License.
*/
import java.nio.ByteBuffer
import java.util.List
import java.util.concurrent.TimeUnit
import com.amazonaws.AmazonClientException
import com.amazonaws.AmazonServiceException
import com.amazonaws.auth.AWSCredentials
import com.amazonaws.auth.profile.ProfileCredentialsProvider
import com.amazonaws.services.kinesis.AmazonKinesisClient
import com.amazonaws.services.kinesis.model.CreateStreamRequest
import com.amazonaws.services.kinesis.model.DescribeStreamRequest
import com.amazonaws.services.kinesis.model.DescribeStreamResult
import com.amazonaws.services.kinesis.model.ListStreamsRequest
import com.amazonaws.services.kinesis.model.ListStreamsResult
import com.amazonaws.services.kinesis.model.PutRecordRequest
import com.amazonaws.services.kinesis.model.PutRecordResult
import com.amazonaws.services.kinesis.model.ResourceNotFoundException
import com.amazonaws.services.kinesis.model.StreamDescription
class AmazonKinesisRecordProducerSample {
    /*
     * Before running the code:
     * Fill in your AWS access credentials in the provided credentials
     * file template, and be sure to move the file to the default location
     * (~/.aws/credentials) where the sample code will load the
     * credentials from.
     * https://console.aws.amazon.com/iam/home?#security_credential
     *
     * WARNING:
     * To avoid accidental leakage of your credentials, DO NOT keep
     * the credentials file in your source directory.
     */
    def kinesis

    def init() {
        /*
         * The ProfileCredentialsProvider will return your [default]
         * credential profile by reading from the credentials file located at
         * (~/.aws/credentials).
         */
        AWSCredentials credentials = new ProfileCredentialsProvider().getCredentials()
        kinesis = new AmazonKinesisClient(credentials)
    }
}
def amazonKinesisRecordProducerSample = new AmazonKinesisRecordProducerSample()
amazonKinesisRecordProducerSample.init()

def myStreamName = "<KINESIS STREAM NAME>"
println("Press CTRL-C to stop.")

// Write records to the stream until this program is aborted.
while (true) {
    def createTime = System.currentTimeMillis()
    def data = '<Data IN STRING FORMAT>'
    def partitionkey = "<PARTITION KEY>"

    def putRecordRequest = new PutRecordRequest()
    putRecordRequest.setStreamName(myStreamName)
    putRecordRequest.setData(ByteBuffer.wrap(String.valueOf(data).getBytes()))
    putRecordRequest.setPartitionKey(partitionkey)

    // putRecord returns the result directly; no need to pre-instantiate PutRecordResult
    def putRecordResult = amazonKinesisRecordProducerSample.kinesis.putRecord(putRecordRequest)
    printf("Successfully put record, partition key : %s, ShardID : %s, SequenceNumber : %s.\n",
            putRecordRequest.getPartitionKey(),
            putRecordResult.getShardId(),
            putRecordResult.getSequenceNumber())
}
Note: this code will work only if the Kinesis stream has already been created and is active. If you need to create the stream before using it, please refer to the code example in the aws-java-sdk src folder.

Spring Integration to Iterate through a list of files or records

I am using Spring Integration to download files and process them.
<int-sftp:inbound-channel-adapter channel="FileDownloadChannel"
        session-factory="SftpSessionFactory"
        remote-directory="/home/sshaji/from_disney/files"
        filter="modifiedFileListFilter"
        local-directory="/home/sshaji/to_disney/downloads"
        auto-create-local-directory="true">
    <integration:poller cron="*/10 * * * * *" default="true"/>
</int-sftp:inbound-channel-adapter>

<integration:transformer input-channel="FileDownloadChannel"
        ref="ErrorTransformer"
        output-channel="EndChannel"/>

<integration:router input-channel="FileErrorProcessingChannel"
        expression="payload.getErrorCode() > 0">
    <integration:mapping value="true" channel="ReportErrorChannel"/>
    <integration:mapping value="false" channel="FilesBackupChannel"/>
</integration:router>
The int-sftp:inbound-channel-adapter is used to download files from the SFTP server. It downloads about 6 files, all XML files.
The transformer iterates over all 6 files and checks whether they contain an error tag. If there is an error tag, it sets the file's error code to 1; otherwise it sets it to 0.
When the flow comes out of the transformer and reaches the router, I want the files whose error code is 1 to be moved to a specific folder (Error), and those whose error code is 0 to be moved to another folder (NoError).
Currently the transformer returns a "list fileNames" which contains the error code and file name for each of the 6 files.
How can I check the error code for each file using the router, and then route that particular file accordingly?
Common logic (pseudocode) for my problem:
for (int i = 0; i < fileNames.length(); i++) {
    if (fileNames[i].getErrorCode() == 1) {
        moveToErrorFolder(fileNames[i].getName());
    } else {
        moveToNoErrors(fileNames[i].getName());
    }
}
How can I achieve this using Spring Integration? If it's not possible, is there any workaround for it?
I hope it is clear now; I am sorry for not providing enough details last time.
Also, in the int-sftp:inbound-channel-adapter I have hard-coded the "remote-directory" and "local-directory" fields to specific folders on the system. Can I refer to these from a bean property or a constant value? I need to configure these values based on a config.xml file; is that possible?
I am new to Spring Integration. Please help me.
Thanks in advance.
It's still not clear to me what you mean by "The transformer iterates all the 6 files".
Each file will be passed to the transformer in a single message, so I don't see how it can emit a list of 6.
It sounds like you need an <aggregator/> with a correlation-strategy-expression="'foo'" and release-strategy-expression="size() == 6". This would aggregate each single File into a list of File and pass it to your transformer. It then transforms it to a list of your status objects containing the filename and error code.
Finally, you would add a <splitter/> which would split the list into separate FileName messages to send to the router.
You can use normal Spring property placeholders for the directory attributes ${some.property} or SpEL to use a property of another bean #{someBean.remoteDir}.
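As an illustration, a rough XML sketch of that aggregator/transformer/splitter arrangement might look like the following; the intermediate channel names are invented for the example, while the correlation and release expressions are the ones mentioned above:
<integration:aggregator input-channel="FileDownloadChannel"
        output-channel="AggregatedFilesChannel"
        correlation-strategy-expression="'foo'"
        release-strategy-expression="size() == 6"/>

<integration:transformer input-channel="AggregatedFilesChannel"
        ref="ErrorTransformer"
        output-channel="TransformedListChannel"/>

<integration:splitter input-channel="TransformedListChannel"
        output-channel="FileErrorProcessingChannel"/>
The default splitter splits the list payload into individual messages, so the existing router on FileErrorProcessingChannel then receives one status object per file and can route it to the Error or NoError flow.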

Theming a field in Drupal 6

I have a custom content type (using CCK) named thewittyshit, and it has a field named field_thewittyshit. I want to theme the field_thewittyshit field for all views. I wrote the following code and saved it in a new file named views-view-field--default--field-thewittyshit-value.tpl.php, but still no change is reflected in any of my views or node displays.
<?php
// $Id: views-view-field.tpl.php,v 1.1 2008/05/16 22:22:32 merlinofchaos Exp $
/**
* This template is used to print a single field in a view. It is not
* actually used in default Views, as this is registered as a theme
* function which has better performance. For single overrides, the
* template is perfectly okay.
*
* Variables available:
* - $view: The view object
* - $field: The field handler object that can process the input
* - $row: The raw SQL result that can be used
* - $output: The processed output that will normally be used.
*
* When fetching output from the $row, this construct should be used:
* $data = $row->{$field->field_alias}
*
* The above will guarantee that you'll always get the correct data,
* regardless of any changes in the aliasing that might happen if
* the view is modified.
*/
?>
<em>
  <?php print $output; ?>
</em>
I want to format the text; in this code I am just making it appear in italics.
The template you are overriding must reside in your theme directory. Find the views-view-field.tpl.php file in the views module folder and copy it into your theme directory, next to your views-view-field--default--field-thewittyshit-value.tpl.php. This should then let you use your file. Also, make sure to clear the cache after you do this.
For debugging purposes, if you are still having trouble, make sure Views is actually using views-view-field--default--field-thewittyshit-value.tpl.php. Click on "Information" next to the Theme option in the Views UI under the "Basic settings" section; the template that is being used will be shown in bold.
