SqlQuerySpec object as a parameter for StoredProcedure in Azure DocumentDb

Can we pass a SqlQuerySpec object as a parameter to a stored procedure in DocumentDB? This would give us the flexibility to send parameterized SQL text and parameters to the procedure. If this is not possible, I would like to know whether it is possible to access the complete SQL text from the SqlQuerySpec.
Thank you,
Soma.

The server-side API takes the same JSON string as the REST and Node.js APIs. The SqlQuerySpec type in the .NET SDK actually translates to this format before sending it over the wire when executing a regular query.
So, here are two ways you can compose the parameterized query from within your stored procedure.
In the package of parameters that you send to your stored procedure, include a JSON string like this:
```
{
    "query": "SELECT * FROM books b WHERE (b.Author.Name = @name)",
    "parameters": [
        { "name": "@name", "value": "Herman Melville" }
    ]
}
```
You may be able to pass the string directly into a call to Collection.queryDocuments(), but you may have to run it through JSON.parse(<your string>) first.
If you know the shape of your query is going to be the same for every call to this stored procedure, then you can send each query parameter in as a parameter of the stored procedure and compose the filterQuery object in the JavaScript of your stored procedure.
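Here is a minimal sketch of the first approach, assuming a stored procedure parameter named querySpecString (the name is illustrative, not part of any SDK); the calls used are the standard server-side Collection API:

```
function executeQuerySpec(querySpecString) {
    var collection = getContext().getCollection();
    var response = getContext().getResponse();

    // The parameter may arrive as a string, so parse it into an object first.
    var querySpec = typeof querySpecString === "string"
        ? JSON.parse(querySpecString)
        : querySpecString;

    // queryDocuments accepts either a plain SQL string or a
    // { query, parameters } spec like the one shown above.
    var accepted = collection.queryDocuments(
        collection.getSelfLink(),
        querySpec,
        {},
        function (err, documents) {
            if (err) throw err;
            response.setBody(documents);
        });

    if (!accepted) throw new Error("The query was not accepted by the server.");
}
```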

Related

Starting azure data factory with incoming parameter which is a json

I want to start my data factory with parameters. For that I defined a parameter "param" in my factory. The parameter is of type string and contains a JSON object like this:
{
    "url": "http://mySpecialFile.csv",
    "name": "testing",
    "destination": "farAway"
}
Now how can I access these values in my factory? Is it necessary to define a variable for each attribute? And can I, for example, extract the value of url? I tried it like this:
(@json(@pipeline().parameters.param).url)
But this does not work. Will this write the value into the variable?
Is there a better way to access the incoming JSON object? Also, if I define a variable for each JSON entry, I need to change my factory whenever the JSON changes.
Thanks for any advice
In Azure Data Factory, pipeline parameters have a data type called Object for JSON-type data.
Here I created a pipeline parameter named json with the Object data type and your sample value.
To access a particular value from this parameter, you can use a dynamic expression like @pipeline().parameters.json['value_which_you_want'],
e.g. @pipeline().parameters.json['url'], @pipeline().parameters.json['name'], @pipeline().parameters.json['destination']
And it fetches the proper data.

In Azure Data Factory, how do I pass the Index of a ForEach as a parameter properly

Sorry if this is a bit vague or rambly, I'm still getting to grips with Data Factory and a lot of it seems a bit obtuse...
What I want to do is query my Cosmos Database for a list of Ids of records that need to be updated. For each of these records, I want to call a REST API using the Id (i.e. /Record/{Id}/Details)
I've created a Data Flow that took a string as a parameter and then called the REST API fine.
I then made a pipeline using a Lookup with a query (select c.RecordId from c where...) and pass that into a ForEach with Items set to @activity('Lookup1').output.value
I then set up the activity of the ForEach to be my Data flow. From research, I think I'm supposed to set the parameter value to "@item().RecordId", but that gives an error "parameter [name] does not match parameter type 'string'".
I can change the type of the parameter to any (and use toString([parameter]) to cast it), and then when I try to debug it passes the parameter in, but it gives an error of "Job failed due to reason: at (Line 2/Col 14): Datatype any not found".
I'm not sure what the solution is. Is there a way to cast the result of the lookup to an integer or string? Is there a way to narrow an any down? Is there a better way than toString() that would work? Is there a better way than ForEach?
I tried to reproduce a scenario similar to what you are trying.
My sample data in cosmos
To query the Cosmos database for a list of ids and call a REST API using the id for each of these records:
First, I took a Lookup activity in Data Factory and selected the ids where last_name is Bluth.
Its output and settings are as below:
Then I passed the output of the Lookup activity to the ForEach activity.
Inside the ForEach activity I created a Data flow activity, and for its data source I used the REST API. My REST API call for a specific user is https://reqres.in/api/users/2, so I gave the base URL as https://reqres.in/api/users.
Then I created a parameter called demoId with data type string, and in the relative URL I gave the dynamic value @dataset().demoId
After this I set the value of the source parameter to @item().id, since after https://reqres.in/api/users only the id needs to be provided to get the data. In your case you can try Record/@{item().id}/Details.
For each id, it successfully passes the id to the REST API and fetches the data:

JMeter: Connect to PostgreSQL in JSR223 using Groovy and then compare values from multiple tables in DB with API response

Sorry for the long post, but I really need some guidance here. I need to compare values from an API response with the values from multiple tables in the DB.
Currently, I am doing it as follows:
Use a JDBC Connection Configuration to connect to the Postgres DB and then use the JDBC Sampler to execute queries. I use it three times to query 3 different tables. I store this data in variables (let's call them DBVariables). Please see this image for the current JMeter setup: https://i.stack.imgur.com/GZJyF.png
In a JSR223 Assertion, I have written code that takes data from various DBVariables and compares it against the API response.
However, my issue is the API response can have an array of records and then nested arrays inside each (please see API response sample below). And these array elements can be sorted in any order. This is where I have issues.
I was wondering what would be the most efficient way of writing this JSR Assertion to validate all data elements returned by the API are the same as what is in the DB.
I am very new to Groovy, but I think if I can query the DB inside the JSR223 Assertion (instead of using the JDBC Sampler), then the comparison can be done by storing the API response in one map and the DB response in another, then sorting both and comparing the items.
My questions are:
How can I connect to PostgreSQL using Groovy and then execute query statements? I have not done that before and was hoping someone could provide sample code.
How can I store the API response and DB responses in maps, sort them, and compare them in Groovy?
The API response is of the following type:
{
    "data": {
        "response": {
            "employeeList": [
                {
                    "employeeNumber": "11102",
                    "addressList": [
                        {
                            "addrType": "Home",
                            "street_1": "123 Any street"
                        },
                        {
                            "addrType": "Alternate",
                            "street_1": "123 Any street"
                        }
                    ],
                    "departmentList": [
                        {
                            "deptName": "IT"
                        },
                        {
                            "deptName": "Finance"
                        },
                        {
                            "deptName": "IT"
                        }
                    ]
                },
                {
                    "employeeNumber": "11103",
                    "addressList": [
                        {
                            "addrType": "Home",
                            "street_1": "123 Any street"
                        },
                        {
                            "addrType": "Alternate",
                            "street_1": "123 Any street"
                        }
                    ],
                    "departmentList": [
                        {
                            "deptName": "IT"
                        },
                        {
                            "deptName": "Finance"
                        },
                        {
                            "deptName": "IT"
                        }
                    ]
                }
            ]
        }
    }
}
Have you seen the Working with a relational database chapter of the Groovy documentation? Alternatively, you can obtain a Connection instance from the JDBC Connection Configuration element like:
def connection = org.apache.jmeter.protocol.jdbc.config.DataSourceElement.getConnection('your-pool-name')
With regards to "sort": there is the DefaultGroovyMethods class, which provides a sort() function for any "sortable" entity. With regards to "compare": we don't know what the object from the database looks like, hence we cannot provide a comprehensive solution.
Maybe an easier option would be converting the response from the JDBC Sampler to JSON using JsonBuilder; once you have two JSON structures, use a library like JSONassert, which doesn't care about order and depth.
You haven't asked, but if you're "very new to groovy", maybe it is worth extracting the individual values from the API using the JSON Extractor, doing the same for the database with the JDBC elements, and comparing individual JMeter Variables using the Response Assertion?

Dynamic REST calls in Azure Synapse Pipeline

I am making a call to a REST API with Azure Synapse and the return dataset looks something like this:
{
    "links": [
        {
            "rel": "next",
            "href": "[myRESTendpoint]?limit=1000&offset=1000"
        },
        {
            "rel": "last",
            "href": "[myRESTendpoint]?limit=1000&offset=60000"
        },
        {
            "rel": "self",
            "href": "[myRESTendpoint]"
        }
    ],
    "count": 1000,
    "hasMore": true,
    "items": [
        {
            "links": [],
            "closedate": "6/16/2014",
            "id": "16917",
            "number": "62000",
            "status": "H",
            "tranid": "0062000"
        },...
    ],
    "offset": 0,
    "totalResults": 60316
}
I am familiar with making a REST call to a single endpoint that can return all the data with a single call using a Synapse pipeline, but this particular REST endpoint has a hard limit of 1000 records per call. It does, however, return a property named "hasMore".
Is there a way to recursively make rest calls in a Synapse pipeline until the "hasMore" property equals false?
The end goal of this is to sink data to either a dedicated SQL pool or into ADLS2 and transform from there.
I tried to achieve the same scenario using Azure Data Factory, which seems more appropriate and makes it easier to achieve the stated end goal of sinking the data to either a dedicated SQL pool or ADLS2 and transforming from there.
Since you have to hit the endpoint repeatedly to fetch 1000 records at a time, you can set up the REST connector's built-in pagination if the response header or response body contains the URL for the next page.
You won't be able to use that functionality if the next-page link or query parameter isn't included in the response headers or body.
Alternatively, you can use loop logic around the Copy activity:
Create two parameters in the REST connector.
Fill in the parameters for the REST connector's relative URL.
Using the Set Variable activity, increase the value of the offset variable in a loop; for each cycle, the URL for the Copy activity is set dynamically. If you want to loop or iterate, you may use the Until activity.
Alternative:
In my experience, the REST connector's pagination is quite rigid, so I usually put the Copy activity within a loop to have more control, e.g. a ForEach loop.
For those following the thread, I used IpsitaDash-MT's suggestion of using the ForEach loop. In the case of this API, each call returns a property named "totalResults". Here are the steps I used to achieve what I was looking to do:
Make a dummy call to the API to get the "totalResults" property. This is just a call to return the number of results I am looking to get. In the case of this API, the body of the request is a SQL statement, so when the dummy request is made I only ask for the IDs of the results I am looking to get.
SQL statement example
I then take the "totalResults" property from that request and set a dynamic value in the "Items" field of the ForEach loop like this:
@range(0,add(div(sub(int(activity('Get Pages Customers').output.totalResults),mod(int(activity('Get Pages Customers').output.totalResults),1000)),1000),1))
NOTE: The API only allows pages of 1000 results, so I do some math to get a range of page numbers. I also have to add 1 to the final result to include the last page.
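As a sanity check, here is what that expression evaluates to for the totalResults value of 60316 shown in the sample response above, traced as a plain JavaScript sketch (JavaScript is used only for illustration; the pipeline itself evaluates the ADF expression functions):

```
// Trace the "Items" expression with totalResults = 60316 (from the sample response).
var totalResults = 60316;
var pageSize = 1000; // the API's hard page limit

// add(div(sub(totalResults, mod(totalResults, pageSize)), pageSize), 1)
var pageCount = (totalResults - (totalResults % pageSize)) / pageSize + 1; // 61

// range(0, pageCount) yields [0, 1, ..., 60]; multiplying each index by 1000
// gives the offsets 0, 1000, ..., 60000 used in the relative URL.
var offsets = [];
for (var i = 0; i < pageCount; i++) offsets.push(i * pageSize);

console.log(pageCount);                   // 61
console.log(offsets[offsets.length - 1]); // 60000, matching the "last" link above
```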
ForEach Loop Settings
The API takes two parameters, "limit" and "offset". Since I want all of the data, there is no reason to set limit to anything other than 1000 (the maximum allowed). The offset parameter can be set to any number greater than or equal to 0 and less than or equal to "totalResults" - "limit". So I take the range established in step 2 and multiply it by 1000 to set the offset parameter in the URL.
Setting the offset parameter in the copy data activity
Dynamic value of the Relative URL in the REST connector
NOTE: I found it better to sink the data as JSON into ADLS2 first rather than into a dedicated SQL pool due to the Lookup feature.
Since Synapse does not allow nested ForEach loops, I run the data through a data flow to format the data and check for duplicates and updates.
When the data flow completes, it kicks off a Lookup activity to get the data that was just processed and passes it into a new pipeline, which uses another ForEach loop to get the child data for each parent ID.
Data Flow and Lookup for child data pipeline

CRM 2011 JavaScript How to access data stored in an entity passed from a lookup control?

As the question suggests, I need to find out how to access entity data that has been passed into a JavaScript function via a lookup.
JavaScript Code Follows:
// function to generate the correct Weighting Value when these parameters change
function TypeAffectedOrRegionAffected_OnChanged(ExecutionContext, Type, Region, Weighting, Potential) {
    var type = Xrm.Page.data.entity.attributes.get(Type).getValue();
    var region = Xrm.Page.data.entity.attributes.get(Region).getValue();
    // if we have values for both fields
    if (type != null && region != null) {
        // create the weighting variable
        var weighting = type[0].name.substring(4) + "-" + region;
        // recreate the Weighting Value
        Xrm.Page.data.entity.attributes.get(Weighting).setValue(weighting);
    }
}
As you can see in the following line, using the name property I can access the name of my Type entity's record.
// create the weighting variable
var weighting = type[0].name.substring(4) + "-" + region;
I am now looking for a way to access the values stored inside my type object. It has the following fields: new_type, new_description, new_value and new_kind.
I guess I'm looking for something like this:
// use value of entity to assign to our form field
Xrm.Page.data.entity.attributes.get(Potential).setValue(type[0].getAttribute("new_value"));
Thanks in advance for any help.
Regards,
Comic
REST OData calls are definitely the way to go in this case. You already have the id, and you just need to retrieve some additional values. Here is a sample to get you started. The hardest part of working with OData, IMHO, is creating the request URLs. There are a couple of tools that you can find on CodePlex, but my favorite is actually LinqPad. Just connect to your org's OData URL, and it'll retrieve all of your entities and allow you to write a LINQ statement that will generate the URL for you, which you can test right in the browser.
For your instance, it'll look something like this (it is case sensitive, so double-check that if it doesn't work):
"OdataRestURL/TypeSet(guid'" + type[0].id.replace(/{/gi, "").replace(/}/gi, "") + "')?$select=new_type,new_description,new_value,new_kind"
Replace OdataRestURL with whatever your OData REST endpoint is, and you should be all set.
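For illustration, a hedged JavaScript sketch of that retrieval might look like the following. The entity set name new_typeSet is an assumption based on your new_ prefix, so adjust it to your actual schema name:

```
// Hypothetical sketch: fetch the extra fields for the looked-up record via the
// CRM 2011 OData endpoint. "new_typeSet" is an assumed entity set name.
function retrieveTypeDetails(typeId, onSuccess) {
    var serverUrl = Xrm.Page.context.getServerUrl(); // CRM 2011 API
    var url = serverUrl +
        "/XRMServices/2011/OrganizationData.svc/new_typeSet(guid'" +
        typeId.replace(/[{}]/g, "") +
        "')?$select=new_type,new_description,new_value,new_kind";

    var req = new XMLHttpRequest();
    req.open("GET", url, true);
    req.setRequestHeader("Accept", "application/json");
    req.onreadystatechange = function () {
        if (req.readyState === 4 && req.status === 200) {
            // The CRM 2011 OData payload wraps the record in a "d" property.
            onSuccess(JSON.parse(req.responseText).d);
        }
    };
    req.send();
}

// Usage inside the OnChanged handler, once the lookup value is known:
// retrieveTypeDetails(type[0].id, function (record) {
//     Xrm.Page.data.entity.attributes.get(Potential).setValue(record.new_value);
// });
```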
Yes, Guido Preite is right. You need to retrieve the entity by the id that comes from the lookup via REST (sync or async) and then get the JSON object. To keep the returned object light, you can specify which fields should come back as part of the JSON. Then you can access the fields you want.
