In Spring Batch, data can be passed between steps via the ExecutionContext: you can set details in one step and retrieve them in the next. Is there anything of this sort in Spring Integration?
My use case is that I have to pick up a file from an FTP location, then split it based on certain business logic and then process the pieces. The client id would be derived from the file name, and this client id would be used in the splitter, service activator and aggregator components.
With my newbie level of expertise in Spring, I could not find anything which helps me share state for a particular run. I wanted to know if Spring Integration provides this state-sharing context in some way.
Please let me know if there is a way to do this in Spring Integration.
In Spring Integration applications there is no single ExecutionContext for state sharing. Instead, as Gary Russell mentioned, each message carries all of its information within its payload or its headers.
If you use the Spring Integration Java DSL and want to transport the clientId in a message header, you can use the enrichHeaders transformer. Supplied with a HeaderEnricherSpec, it can accept a function which returns a dynamically determined value for the specified header. For your use case this might look like:
return IntegrationFlows
        .from(/* ftp source */)
        .enrichHeaders(e -> e.headerFunction("clientId", this::deriveClientId))
        ./* split, aggregate, etc. the file according to clientId */
where the deriveClientId method might look something like:
private String deriveClientId(Message<File> fileMessage) {
    String fileName = fileMessage.getHeaders().get(FileHeaders.FILENAME, String.class);
    String clientId = /* some other logic for deriving clientId from */ fileName;
    return clientId;
}
(the FILENAME header is provided by the FTP message source)
When you need to access the clientId header somewhere in the downstream flow you can do it the same way as file name mentioned above:
String clientId = message.getHeaders().get("clientId", String.class);
But make sure that the message still contains such a header, as it could have been lost somewhere among intermediate flow items. This is likely to happen if at some point you construct a message manually and send it further. In order not to lose any headers from the preceding message, you can copy them while building:
Message<PayloadType> newMessage = MessageBuilder
        .withPayload(payloadValue)
        .copyHeaders(precedingMessage.getHeaders())
        .build();
Please note that message headers are immutable in Spring Integration. It means you can't just add or change a header of the existing message. You should create a new message or use HeaderEnricher for that purpose. Examples of both approaches are presented above.
Typically you convey information between components in the message payload itself, or via message headers - see Message Construction and Header Enricher in the reference documentation.
This appears, to me, to be a simple problem that is probably replicated all over the place: a very basic application of the MessageHandlerChain, probably using nothing more than out-of-the-box functionality.
Conceptually, what I need is this:
(1) Polled JDBC reader (sets parameters for integration pass)
|
V
(2) JDBC Reader (uses input from (1) to fetch data to feed through the channel)
|
V
(3) JDBC writer (writes data fetched by (2) to target)
|
V
(4) JDBC writer (writes additional data from the original parameters fetched in (1))
What I think I need is
Flow:
From: JdbcPollingChannelAdapter (setup adapter)
Handler: messageHandlerChain
Handlers (
JdbcPollingChannelAdapter (inbound adapter)
JdbcOutboundGateway (outbound adapter)
JdbcOutboundGateway (cleanup gateway)
)
The JdbcPollingChannelAdapter does not implement the MessageHandler API, so I am at a loss how to read the actual data based on the setup step.
Since the JdbcOutboundGateway does not implement the MessageProducer API, I am at a bit of a loss as to what I need to use for the outbound adapter.
Are there OOB classes I should be using? Or do I need to somehow wrap the two adapters in BridgeHandlers to make this work?
Thanks in advance
EDIT (2)
Additional configuration problem
The setup adapter is pulling a single row back with two timestamp columns. They are being processed correctly by the "enrich headers" piece.
However, when the inbound adapter executes, the framework passes java.lang.Object as the parameters. Not String, not Timestamp, but an actual java.lang.Object as in new Object().
It passes the correct number of objects, but the content and datatypes are lost. Am I correct that the ExpressionEvaluatingSqlParameterSourceFactory needs to be configured?
Message:
GenericMessage [payload=[{startTime=2020-11-18 18:01:34.90944, endTime=2020-11-18 18:01:34.90944}], headers={startTime=2020-11-18 18:01:34.90944, id=835edf42-6f69-226a-18f4-ade030c16618, timestamp=1605897225384}]
SQL in the JdbcOutboundGateway:
Select t.*, w.operation as "ops" from ADDRESS t
Inner join TT_ADDRESS w
on (t.ADDRESSID = w.ADDRESSID)
And (w.LASTUPDATESTAMP >= :payload.from[0].get("startTime") and w.LASTUPDATESTAMP <= :payload.from[0].get("endTime") )
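If that factory is the missing piece, I imagine the wiring would look roughly like this sketch (the parameter names mirror the headers in the message above; everything else is an assumption):

// Sketch: resolve each named SQL parameter with an explicit SpEL expression
// evaluated against the request message, instead of the default placeholder.
ExpressionEvaluatingSqlParameterSourceFactory factory = new ExpressionEvaluatingSqlParameterSourceFactory();
Map<String, String> expressions = new HashMap<>();
expressions.put("startTime", "headers['startTime']");
expressions.put("endTime", "headers['endTime']");
factory.setParameterExpressions(expressions);
inboundAdapter.setRequestSqlParameterSourceFactory(factory); // inboundAdapter is the JdbcOutboundGateway
// the query would then reference :startTime and :endTime directly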
Edit: added the solution's Java DSL configuration
private JdbcPollingChannelAdapter setupAdapter; // select only
private JdbcOutboundGateway inboundAdapter; // select only
private JdbcOutboundGateway insertUpdateAdapter; // update only
private JdbcOutboundGateway deleteAdapter; // update only
private JdbcMessageHandler cleanupAdapter; // update only
setFlow(IntegrationFlows
        .from(setupAdapter, c -> c.poller(Pollers.fixedRate(1000L, TimeUnit.MILLISECONDS).maxMessagesPerPoll(1)))
        .enrichHeaders(h -> h.headerExpression("ALC_startTime", "payload.from[0].get(\"ALC_startTime\")")
                .headerExpression("ALC_endTime", "payload.from[0].get(\"ALC_endTime\")"))
        .handle(inboundAdapter)
        .enrichHeaders(h -> h.headerExpression("ALC_operation", "payload.from[0].get(\"ALC_operation\")"))
        .handle(insertUpdateAdapter)
        .handle(deleteAdapter)
        .handle(cleanupAdapter)
        .get());
flowContext.registration(flow).id(this.getId().toString()).register();
If you would like to carry the original arguments down to the last gateway in your flow, you need to store those arguments in the headers, since after every step the payload of the reply message is going to be different and you won't have the original setup data there any more. That's first.
Second: if you deal with IntegrationFlow and the Java DSL, you don't need to worry about a messageHandlerChain, since conceptually the IntegrationFlow is a chain by itself, but much more advanced.
I'm not sure why you need to use a JdbcPollingChannelAdapter to request data on demand according to the incoming message from the source at the beginning of your flow.
You definitely still need to use a JdbcOutboundGateway for the SELECT-only mode. The updateQuery is optional, so that gateway is just going to perform the SELECT and return data for you in the payload of the reply message.
If your next two steps are just "write" and you don't care about the result, you can probably just take a look at a PublishSubscribeChannel with the two JdbcMessageHandlers as subscribers to it. Without a provided Executor for the PublishSubscribeChannel they are going to be executed one by one.
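For example, a quick sketch of that tail of the flow (the handler variable names are assumptions):

// Both writers subscribe to the same channel; with no Executor configured
// they are invoked sequentially, in subscription order, on the calling thread.
.handle(inboundAdapter)
.publishSubscribeChannel(s -> s
        .subscribe(f -> f.handle(insertJdbcMessageHandler))
        .subscribe(f -> f.handle(cleanupJdbcMessageHandler)))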
I'm configuring some requests programmatically in my test cases; I can set headers, custom properties, teardown scripts, etc. However, I can't find how to set a standard JSON body for my PUT requests.
Is there any possibility from the restMethod class?
So far I've ended up getting the method used:
restService = testRunner.testCase.testSuite.project.getInterfaceAt(0)
resource = restService.getOperationByName(resource_name)
request = resource.getRequestAt(0)
httpMethod = request.getMethod()
if (httpMethod.toString().equals("PUT"))
but then I'm stuck trying to find how to set a standard body for my PUT requests.
I tried the getRequestParts() method but it didn't give me what I expected...
Can anyone help, please?
Thank you,
Alexandre
I've managed this. I had a suite of tests where I wanted to squirt the content of interest into the "bare bones" request. The idea being that I can wrap this in a data-driven test; then, for each row in my data spreadsheet, I pull in the request body for my test. At first I simply pulled the request from a data source value in my spreadsheet, but this became unmanageable.
So, another tactic. In my test data sheet (data source) I stored the file name that contains the payload I want to squirt in.
In the test itself, I put a Groovy step immediately before the step I want to push the payload into.
The Groovy script uses the data source to firstly get the file name containing the payload; I then read the contents of the file.
In the step I want to push the data into, I just use a get from data, e.g. ${groovyStep#result}.
If this doesn't completely make sense, let me know and I'll update with a screenshot when I have access to SoapUI.
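For illustration, the Groovy step might look something like this (the data source property and the project property holding the folder are assumptions):

// Groovy step: look up the payload file named in the current data source row,
// read its contents and return them as this step's result.
def fileName = context.expand('${DataSource#payloadFile}')  // assumed data source property
def payloadDir = context.expand('${#Project#payloadDir}')   // assumed project property
return new File(payloadDir, fileName).text                  // referenced downstream as ${groovyStep#result}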
My Spring Integration (with a HTTP inbound gateway) application has the following flow:
The end-client POSTs an XML input message with a UUID for unique identification (let us call this XML "whole") and other business elements
A Splitter component breaks it down into smaller XMLs with each "part" XML having the same UUID as the "whole" XML
The Aggregator aggregates the parts with some business logic along with a @CorrelationStrategy that uses the UUID of the incoming "part" messages
I am trying to explore a way to clear/flush the Aggregator's inputChannel or reset the Correlation Strategy after the correlation operation to enable different POST requests to reuse the UUIDs in the incoming "whole" XML input messages.
At this point, what is happening is that once a "whole" XML input message is processed successfully, if another HTTP POST sends a "whole" XML input message with the same UUID, the flow hangs, probably because it is unable to understand what to do with the @CorrelationStrategy or is in a stale state in terms of UUIDs.
Frankly, the very definition of UUID means it has to be uniquely generated for every HTTP request, but I just wanted to accommodate the case where the end-client ends up sending the same UUID for every POST.
This is how my Aggregator class and its methods look:
@MessageEndpoint
public class ProductAggregator {
    ...
    ...
    @Aggregator(inputChannel = "compositeMessagesForAggregation",
            outputChannel = "upstreamResponses")
    public Message<BusinessComposite> generateAggregatedResponse(
            List<Message<BusinessComposite>> listOfCompositeMessagesForAggregation) {
        ...
        ...
    }

    @CorrelationStrategy
    public UUID correlateByUUID(Message<BusinessComposite> compositeMessageForAggregation) {
        ...
        ...
    }
}
Thanks in advance!
Sincerely,
Bharath
Set expireGroupsUponCompletion and expireGroupsUponTimeout to true on the aggregator to be able to reuse the same correlation id; otherwise "late arriving" messages are discarded.
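With the Java DSL, for instance, an equivalent aggregator could be configured like this sketch (the correlation header name is an assumption; the question's @CorrelationStrategy extracts the UUID instead):

// Completed groups are removed immediately, so a later request may
// reuse the same UUID without the flow hanging on a stale group.
.aggregate(a -> a
        .correlationStrategy(m -> m.getHeaders().get("uuid"))
        .expireGroupsUponCompletion(true)
        .expireGroupsUponTimeout(true))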
I am using Spring Integration and the Feed Inbound Channel Adapter to process news RSS feeds (which I think is fantastic :). I would also like to consume some additional news feeds which are available via an API into the same channel. The API is just an HTTP endpoint which returns a list of news articles in JSON. The fields are very similar to RSS, i.e. there is a title, description and published date, which could be mapped to the SyndEntry object.
Ideally I want to use the same functionality available in the Feed Inbound Channel Adapter which deals with duplicate entries, etc. Is it possible to customise the Feed Inbound Channel Adapter to process and map the JSON?
Any sample code or pointers would be really helpful.
Well, no. FeedEntryMessageSource is fully based on the Rome Tools library, which deals only with the XML model.
I'm afraid you have to create your own component which will produce SyndEntry instances for those JSON records. That can really be something like an HttpRequestExecutingMessageHandler, based on the RestTemplate with the MappingJackson2HttpMessageConverter, which is present by default anyway.
You can try to configure the HttpRequestExecutingMessageHandler with setExpectedResponseType(SyndFeedImpl.class) and expect an application/json content-type in the response.
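A minimal sketch of that configuration (the endpoint URL and channel name are placeholders):

// Calls the JSON news API and lets the Jackson converter bind the
// response body into a SyndFeedImpl.
@Bean
@ServiceActivator(inputChannel = "newsApiRequests")
public HttpRequestExecutingMessageHandler newsApiHandler() {
    HttpRequestExecutingMessageHandler handler =
            new HttpRequestExecutingMessageHandler("https://example.com/api/news");
    handler.setHttpMethod(HttpMethod.GET);
    handler.setExpectedResponseType(SyndFeedImpl.class);
    return handler;
}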
To achieve the "deals with duplicate entries" part, you can consider using the Idempotent Receiver pattern afterwards, where the MessageSelector should be based on the MetadataStore and really perform logic similar to the one in the FeedEntryMessageSource:
if (lastModifiedDate != null) {
    this.lastTime = lastModifiedDate.getTime();
}
else {
    this.lastTime += 1; //NOSONAR - single poller thread
}
this.metadataStore.put(this.metadataKey, this.lastTime + "");
...
if (entryDate != null && entryDate.getTime() > this.lastTime)
where the entry is the payload (a SyndEntry) of the split result from the mentioned HttpRequestExecutingMessageHandler.
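Such a selector might look like this sketch (the metadata key and the choice of date are assumptions):

// Accepts only entries newer than the last recorded time; the watermark
// is persisted in a MetadataStore, mirroring the FeedEntryMessageSource logic.
public class NewEntrySelector implements MessageSelector {

    private static final String KEY = "newsApi.lastTime";

    private final MetadataStore metadataStore;

    public NewEntrySelector(MetadataStore metadataStore) {
        this.metadataStore = metadataStore;
    }

    @Override
    public boolean accept(Message<?> message) {
        SyndEntry entry = (SyndEntry) message.getPayload();
        Date date = entry.getUpdatedDate() != null ? entry.getUpdatedDate() : entry.getPublishedDate();
        String lastTime = this.metadataStore.get(KEY);
        long last = (lastTime == null) ? 0 : Long.parseLong(lastTime);
        if (date != null && date.getTime() > last) {
            this.metadataStore.put(KEY, Long.toString(date.getTime()));
            return true;
        }
        return false;
    }
}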
Let's consider a REST endpoint which receives a JSON object. One of the JSON fields is a String, so I want to validate that no malicious text is received.
@ValidateRequest
public interface RestService {

    @POST
    @Consumes(APPLICATION_JSON)
    @Path("endpoint")
    void postData(@Valid @NotNull Data data);
}

public class Data {

    @ValidString
    private String s;

    // get, set methods
}
I'm using the Bean Validation framework via @ValidString to delegate the validation to the ESAPI library.
@Override
public boolean isValid(String value, ConstraintValidatorContext context) {
    return ESAPI.validator().isValidInput(
            "String validation",
            value,
            this.constraint.type(),
            this.constraint.maxLength(),
            this.constraint.nullable(),
            true);
}
This method canonicalizes the value (i.e. reduces any encoded forms to their simplest representation) and then validates it against a regular expression provided in the ESAPI config. The regex is not that important to the question, but it mostly whitelists 'safe' characters.
All good so far. However, on a few occasions, I need to accept 'less safe' characters like %, ", <, >, etc., because the incoming text is from an end user's free-text input field.
Is there a known pattern for this kind of String sanitization? What kind of text can cause server-side problems if SQL queries are considered safe (e.g. using bind variables)? What if the user wants to store <script>alert("Hello")</script> as his description, which at some point will be sent back to the client? Do I store that in the DB? Is that a client-side concern?
When dealing with text coming from the user, best practice is to whitelist only known character sets, as you stated. But that is not the whole solution, since there are times when that will not work; as you pointed out, sometimes "dangerous" characters are part of the valid character set.
When this happens you need to be very vigilant in how you handle the data. What I, as well as the commenters, recommend is to keep the data from the user in its original state as long as possible. Dealing with the data securely means using the proper functions for the target domain/output.
SQL
When putting free-format strings into a SQL database, best practice is to use prepared statements (in Java this is the PreparedStatement object) or an ORM that automatically parameterizes the data.
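For instance (table and column names are made up):

// The user's text is bound as a parameter value, never concatenated into
// the SQL string, so it cannot change the structure of the statement.
String sql = "INSERT INTO user_comment (user_id, body) VALUES (?, ?)";
try (PreparedStatement ps = connection.prepareStatement(sql)) {
    ps.setLong(1, userId);
    ps.setString(2, userText); // the free-text input, stored as-is
    ps.executeUpdate();
}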
To read more on SQL injection attacks and other forms of injection attacks (XML, LDAP, etc.), I recommend OWASP's Top 10 - A1 Injection.
XSS
You also mentioned what to do when outputting this data to the client. In this case you want to make sure you HTML-encode the output for the proper context, aka contextual output encoding. ESAPI has the Encoder class/interface for this. The important thing to note is in which context (HTML body, HTML attribute, JavaScript, URL, etc.) the data will be output; each area is encoded differently.
Take for example the input: <script>alert('Hello World');<script>
Sample Encoding Outputs:
HTML: &lt;script&gt;alert(&#x27;Hello World&#x27;);&lt;script&gt;
JavaScript: \u003cscript\u003ealert(\u0027Hello World\u0027);\u003cscript\u003e
URL: %3Cscript%3Ealert%28%27Hello%20World%27%29%3B%3Cscript%3E
Form URL: %3Cscript%3Ealert%28%27Hello+World%27%29%3B%3Cscript%3E
CSS: \00003Cscript\00003Ealert\000028\000027Hello\000020World\000027\000029\00003B\00003Cscript\00003E
XML: &lt;script&gt;alert(&#x27;Hello World&#x27;);&lt;script&gt;
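In code, that boils down to picking the Encoder method that matches the output context, e.g. (a sketch; the enclosing method would declare the checked EncodingException thrown by encodeForURL):

Encoder encoder = ESAPI.encoder();
String input = "<script>alert('Hello World');<script>";
String html = encoder.encodeForHTML(input);     // HTML body
String js = encoder.encodeForJavaScript(input); // JavaScript string literal
String url = encoder.encodeForURL(input);       // URL parameter; throws EncodingException
String css = encoder.encodeForCSS(input);       // CSS value
String xml = encoder.encodeForXML(input);       // XML text node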
For more reading on XSS look at OWASP Top 10 - A3 Cross-Site Scripting (XSS)