How can I inject SQL into the query property of int-jdbc:outbound-gateway?
Background:
I receive an AMQP message containing a where clause, which is used to query a table and run some logic. The where clause could be anything, e.g. state in ('ca','ma') or zipcode = '01760'.
Is there an example on using int-jdbc:outbound-gateway passing a query that can change based on the received message?
For example:
we receive amqp message:
1: {"whereClause":"State in ('ca','ma')"}
2: {"whereClause":"id = 1"}
How can I inject it into the query property of int-jdbc:outbound-gateway, as below?
query="SELECT id FROM account where State in ('ca','ma')"
query="SELECT id FROM account where id = 1"
No, you can't do that with out-of-the-box Spring Integration JDBC components.
The query property is final and can't be changed at runtime.
Consider using a <service-activator> with direct JdbcTemplate.query() usage instead.
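A minimal sketch of that idea (the class, method, and channel wiring are assumptions, not anything prescribed by Spring Integration): the service activator receives the payload, builds the statement, and hands it to JdbcTemplate.query(). Be aware that concatenating a caller-supplied where clause is a textbook SQL injection vector, so validate or whitelist the clause first.

```java
// Hypothetical handler POJO for a <service-activator>; only buildQuery is
// shown runnable, the JdbcTemplate call is sketched in a comment.
public class DynamicQueryService {

    // Build the full statement from the where clause carried by the message.
    // WARNING: a caller-supplied clause must be validated/whitelisted,
    // otherwise this is open to SQL injection.
    public static String buildQuery(String whereClause) {
        return "SELECT id FROM account WHERE " + whereClause;
    }

    /* In the activator method you would then run it, e.g.:
       List<Long> ids = jdbcTemplate.query(buildQuery(whereClause),
               (rs, rowNum) -> rs.getLong("id"));
    */
}
```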
So I am trying to write a query for Azure Application Insights logs.
Until now, I logged custom events, so all of the properties which I wanted to show could be found in the event's customDimensions. This was easy to query, it looked something like this:
customEvents |
project
name,
Endpoint = customDimensions.Endpoint,
Context = customDimensions.Context,
...
Response = customDimensions.Response
This was fine, but now there are cases where the customDimensions.Response's value is longer than 8192 characters, which is the limit of these custom properties. For this reason, I removed the Response property, and added an EventId property instead, which is a unique Id representing each event.
The responses are now stored as traces, because the trace message limit is 32k instead of 8.
In order to be able to identify which response belongs to which event, I added an EventId property to these traces too, giving it the same value as its custom event's.
Now I am trying to write a query which could retrieve these, project the same fields it did before from customEvents, and also the Response (message) from traces, joining them on the EventId property stored in customDimensions.
Please point me in the right direction.
So you want to join data from customEvents with the traces? Just use the join operator like this:
customEvents | project
name,
Endpoint = customDimensions.Endpoint,
Context = customDimensions.Context,
eventId = tostring(customDimensions.EventId)
| join kind=leftouter
(traces | project message, eventId = tostring(customDimensions.EventId)) on eventId
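If you also want the trace's message surfaced under the old Response name, a variation of the same query (column names assumed from the question) could be:

```kusto
customEvents
| project
    name,
    Endpoint = customDimensions.Endpoint,
    Context = customDimensions.Context,
    eventId = tostring(customDimensions.EventId)
| join kind=leftouter (
    traces
    | project Response = message, eventId = tostring(customDimensions.EventId)
  ) on eventId
| project-away eventId1
```

The join duplicates the key from the right-hand table as eventId1; project-away drops that extra column.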
Sorry if this is a bit vague or rambly, I'm still getting to grips with Data Factory and a lot of it seems a bit obtuse...
What I want to do is query my Cosmos Database for a list of Ids of records that need to be updated. For each of these records, I want to call a REST API using the Id (i.e. /Record/{Id}/Details)
I've created a Data Flow that took a string as a parameter and then called the REST API fine.
I then made a pipeline using a Lookup with a query (select c.RecordId from c where...) and passed that into a ForEach with Items set to @activity('Lookup1').output.value.
I then set the ForEach's activity to my Data Flow. From research, I think I'm supposed to set the parameter value to "@item().RecordId", but that gives the error "parameter [name] does not match parameter type 'string'".
I can change the type of the parameter to any (and use toString([parameter]) to cast it), and when I try to debug it passes the parameter in, but then it fails with "Job failed due to reason: at (Line 2/Col 14): Datatype any not found".
I'm not sure what the solution is. Is there a way to cast the result of the lookup to an integer or string? Is there a way to narrow an any down? Is there a better way than toString() that would work? Is there a better way than ForEach?
I tried to reproduce a scenario similar to what you are trying.
My sample data in cosmos
To query the Cosmos database for a list of ids and call a REST API with the id for each of these records, I first took a Lookup activity in Data Factory and selected the ids where last_name is Bluth.
Its output and settings are as below:
Then I passed the output of lookup activity to For-each activity.
Then, inside the ForEach activity, I created a Data Flow activity whose source is the REST API. My REST API to call a specific user is https://reqres.in/api/users/2, so I gave the base URL as https://reqres.in/api/users.
Then I created a parameter called demoId of datatype string, and in the relative URL I gave the dynamic value @dataset().demoId.
After this I set the source parameter value to @item().id, since after https://reqres.in/api/users only the id needs to be appended; in your case you can try Record/@{item().id}/Details.
For each id, it successfully passes the id to the REST API and fetches the data:
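For reference, the relevant ForEach fragment of the pipeline JSON would look roughly like this (activity and parameter names are assumptions based on the question; the nested activity is elided):

```json
{
  "name": "ForEach1",
  "type": "ForEach",
  "typeProperties": {
    "items": {
      "value": "@activity('Lookup1').output.value",
      "type": "Expression"
    },
    "activities": [
      "... the Data Flow activity, with its string parameter set from @item().id ..."
    ]
  }
}
```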
I am using Spring Integration and the Feed Inbound Channel Adapter to process news RSS feeds (which I think is fantastic :). I would also like to consume some additional news feeds which are available by an API into the same channel. The API is just a HTTP endpoint which returns a list of news articles in JSON. The fields are very similar to RSS i.e. there is a title, description, published date which could be mapped to the SyndEntry object.
Ideally I want to use the same functionality available in the Feed Inbound Channel Adapter, which deals with duplicate entries etc. Is it possible to customise the Feed Inbound Channel Adapter to process and map the JSON?
Any sample code or pointers would be really helpful.
Well, no. FeedEntryMessageSource is fully based on the Rome Tools library, which deals only with the XML model.
I'm afraid you'll have to create your own component which produces SyndEntry instances for those JSON records. That can be something like HttpRequestExecutingMessageHandler, based on RestTemplate with the MappingJackson2HttpMessageConverter which is present by default anyway.
You can try to configure HttpRequestExecutingMessageHandler to the setExpectedResponseType(SyndFeedImpl.class) and expect to have application/json content-type in the response.
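In XML that could look roughly like this (channel names and the URL are assumptions; expected-response-type is the XML equivalent of setExpectedResponseType):

```xml
<int-http:outbound-gateway url="https://example.com/api/news"
                           http-method="GET"
                           request-channel="newsApiRequests"
                           reply-channel="feedChannel"
                           expected-response-type="com.rometools.rome.feed.synd.SyndFeedImpl"/>
```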
To achieve the "deals with duplicate entries" part, consider using the Idempotent Receiver pattern afterwards, where the MessageSelector is based on a MetadataStore and performs logic similar to that in FeedEntryMessageSource:
if (lastModifiedDate != null) {
    this.lastTime = lastModifiedDate.getTime();
}
else {
    this.lastTime += 1; //NOSONAR - single poller thread
}
this.metadataStore.put(this.metadataKey, this.lastTime + "");
...
if (entryDate != null && entryDate.getTime() > this.lastTime)
where entry is the payload (SyndEntry) of the split result from the aforementioned HttpRequestExecutingMessageHandler.
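A sketch of that Idempotent Receiver configuration (the bean names, endpoint id, and key expression are assumptions; SyndEntry exposes a uri property that can serve as the deduplication key):

```xml
<bean id="metadataStore"
      class="org.springframework.integration.metadata.SimpleMetadataStore"/>

<int:idempotent-receiver id="entryDeduplicator"
                         endpoint="newsConsumerEndpoint"
                         metadata-store="metadataStore"
                         key-expression="payload.uri"
                         discard-channel="duplicateEntries"/>
```

Entries whose key is already present in the store are routed to the discard channel instead of being processed again.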
We have a clustered environment with 2 nodes on an Oracle WebLogic 10.3.6 server, using round-robin.
I have a service which gets messages from an external system and puts them in the database (Oracle DB).
I am using a jdbc-inbound-channel-adapter to convert these messages and pass them to the channels.
To have each message processed only once, I am planning to add a column (NODE_NAME) to the DB table. The service which gets the message from the external system also sets this column to the node name (weblogic.Name). If I then filter on NODE_NAME in the SELECT query of the jdbc-inbound-adapter, each message is processed only once.
I.e. if Service1 (on Node1) saves the message in the DB, then inbound-adapter1 (on Node1) passes the message to the channel.
Example:
<si-jdbc:inbound-channel-adapter id="jdbcInboundAdapter"
channel="queueChannel" data-source="myDataSource"
auto-startup="true"
query="SELECT * FROM STAGE_TABLE WHERE STATUS='WAITING' and NODE_NAME = '${weblogic.Name}'"
update="UPDATE STAGE_TABLE SET STATUS='IN_PROGRESS' WHERE ID IN (:Id)"
max-rows-per-poll="100" row-mapper="rowMapper"
update-per-row="true">
<si:poller fixed-rate="5000">
<si:advice-chain>
<ref bean="txAdvice"/>
<ref bean="inboundAdapterConfiguration"/>
</si:advice-chain>
</si:poller>
</si-jdbc:inbound-channel-adapter>
Is this a good design?
A second approach: use the SELECT below in the jdbc-inbound-adapter, but I am guessing this would fail as I am using an Oracle database.
SELECT * FROM TABLE WHERE STATUS='WAITING' FOR UPDATE SKIP LOCKED
It would be great if some one could point me in the right direction.
Actually, FOR UPDATE SKIP LOCKED is precisely an Oracle feature - https://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:2060739900346201280
If you are in doubt, here is the code from Spring Integration that relies on it: https://github.com/spring-projects/spring-integration/blob/master/spring-integration-jdbc/src/main/java/org/springframework/integration/jdbc/store/channel/OracleChannelMessageStoreQueryProvider.java#L39
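So the second approach could be as simple as changing the query in your existing adapter (a sketch reusing your own attribute values, minus the NODE_NAME filter):

```xml
<si-jdbc:inbound-channel-adapter id="jdbcInboundAdapter"
    channel="queueChannel" data-source="myDataSource"
    auto-startup="true"
    query="SELECT * FROM STAGE_TABLE WHERE STATUS='WAITING' FOR UPDATE SKIP LOCKED"
    update="UPDATE STAGE_TABLE SET STATUS='IN_PROGRESS' WHERE ID IN (:Id)"
    max-rows-per-poll="100" row-mapper="rowMapper"
    update-per-row="true"/>
```

Note that the poll must run inside a transaction (as your txAdvice already arranges), so the row locks acquired by FOR UPDATE are held until the UPDATE commits and the other node keeps skipping those rows.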
I'm trying to persist spring-security-acl domain objects in mongodb using grails mongo plugin. While executing following line of code
aclUtilService.addPermission Phone.class, phoneInstance.id, new PrincipalSid(username), BasePermission.ADMINISTRATION
I'm getting following error:
String-based queries like [executeQuery] are currently not supported in this implementation of GORM. Use criteria instead.. Stacktrace follows:
Message: String-based queries like [executeQuery] are currently not supported in this implementation of GORM. Use criteria instead.
Any thoughts?
Grails Configuration Details:
app.grails.version=2.0.3
app.name=eateri
app.servlet.version=2.5
app.version=0.1
plugins.mongodb=1.0.0.RC5
plugins.spring-security-acl=1.1
plugins.spring-security-core=1.2.7.2
As @sudhir mentioned, there are some methods in aclService that use the HQL executeQuery method, like:
protected AclObjectIdentity retrieveObjectIdentity(ObjectIdentity oid) {
return AclObjectIdentity.executeQuery(
"FROM AclObjectIdentity " +
"WHERE aclClass.className = :className " +
" AND objectId = :objectId",
[className: oid.type,
objectId: oid.identifier])[0]
}
But the MongoDB GORM plugin does not support HQL, so the call path through which your code hits the HQL error is:
aclUtilService.addPermission -> aclService.createAcl ->
retrieveObjectIdentity
Two other aclService methods also use HQL:
deleteEntries, findChildren
So, a simple solution is to store the ACL objects in MySQL and run Hibernate alongside the MongoDB GORM plugin.
Another option is to override these three aclService methods with metaprogramming.
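As an untested sketch of the metaprogramming route for retrieveObjectIdentity (Groovy pseudocode; the bean lookup and domain classes are assumptions based on the plugin's snippet above): since MongoDB cannot join, the HQL join is split into two dynamic-finder lookups, which the MongoDB GORM implementation does support.

```groovy
// Pseudocode sketch (untested): override the service method so it avoids HQL.
def aclService = grailsApplication.mainContext.getBean('aclService')
aclService.metaClass.retrieveObjectIdentity = { ObjectIdentity oid ->
    // Two queries instead of one HQL join: resolve the class row first,
    // then the object identity by class + objectId.
    def aclClass = AclClass.findByClassName(oid.type)
    aclClass ? AclObjectIdentity.findByAclClassAndObjectId(aclClass, oid.identifier) : null
}
// deleteEntries and findChildren would need the same kind of treatment.
```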