I'm using an Execute SQL Query action in a Logic App.
The returned result is composed of 1..n tables (one per SELECT).
I want to create a CSV table and send it over TFS.
The issue I'm having is that the tables are elements of the resultset, not items of an array.
Is there some way to perform a ForEach action on the resultset elements (i.e. 'Table1', 'Table2', etc.)?
Given that data format, it seems this is not supported by the ForEach action in Logic Apps.
If an Azure Function is acceptable, I recommend using one to process the data with your own custom logic.
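For instance, here is a minimal sketch (the function name and payload shape are assumptions based on the format described above) of an HTTP-triggered function that flattens the per-table properties into a single array the Logic App can iterate with ForEach:

using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Newtonsoft.Json.Linq;

public static class FlattenResultSets
{
    [FunctionName("FlattenResultSets")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req)
    {
        // Body is the Execute SQL Query output: { "Table1": [...], "Table2": [...], ... }
        var resultSets = JObject.Parse(await new StreamReader(req.Body).ReadToEndAsync());

        // Turn each TableN property into an array element: [{ name, rows }, ...]
        var tables = new JArray();
        foreach (var table in resultSets.Properties())
            tables.Add(new JObject { ["name"] = table.Name, ["rows"] = table.Value });

        return new OkObjectResult(tables); // an array the ForEach action can loop over
    }
}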
A ForEach on a resultset will return a JSON object for each row.
I couldn't find any option in the designer view to extract the value, but you can achieve the same in code view by assigning the following code to your variable. In the snippet below, MEETINGID is my column name.
"#{items('For_each')?['MEETINGID']}"
"imeetingid": "#{items('For_each')?['MEETINGID']}"
In the designer view, you can use a Parse JSON action after the SQL query step.
Then you can use a For each to reach each database record, because the SQL resultset is returned as a JSON object in the Logic App.
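For illustration, the Execute SQL Query output is shaped roughly like this (the column names here are invented), which is why a Parse JSON step followed by a For each over ResultSets.Table1 reaches each record:

{
    "ResultSets": {
        "Table1": [
            { "MEETINGID": 1, "SUBJECT": "Kickoff" },
            { "MEETINGID": 2, "SUBJECT": "Review" }
        ]
    }
}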
I have a very large Excel file, and it crashes every time I try to find some data in it, so I'm planning to store the file in Azure Blob Storage or another database and write an Azure Function in Python to fetch the data from Blob/DB.
I have 1000 columns and need a dynamic query so that the end user can pick any of the 1000 columns to get data from the Excel file stored in Blob/DB.
Could someone please suggest a solution? Which DB would be best, and which Python library should I use?
I will trigger the Azure Function from Azure API Management.
Have a look at the Microsoft example docs for querying Azure Cosmos DB using the Python SDK here. Once we know how to hardcode an Azure Cosmos query with Python, we only need to make it dynamic, meaning we let the user specify the column name.
One way to do that is to create an Azure Function that performs the database query and let the user specify the column name in the URL query parameters.
As an example, imagine we have an Azure function that is hosted at
https://my-function-app.azurewebsites.com/api/httptrigger1
We would let the user specify which column they would like data from by appending it to the URL as a query parameter. For example:
https://my-function-app.azurewebsites.com/api/httptrigger1?MyColumn=Column1
An Azure Function can read the URL query parameters; they are part of the incoming request object, so you can use them to dynamically build your SQL statement:
var myColumn = req.Query["MyColumn"]; // Get the column name from the URL query params
var query = $"SELECT {myColumn} FROM MyTable"; // Build the dynamic query
This would allow your user to select their own column using the URL!
Note: Be sure to sanitize and parameterize those user inputs to protect against SQL injection.
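Since a column name can't be passed as a SQL parameter, one common approach is an allow-list check before building the statement. A minimal sketch (the column list and names are assumptions):

// Hypothetical allow-list guarding the user-supplied column name
// (requires using System.Collections.Generic)
var allowedColumns = new HashSet<string> { "Column1", "Column2", "Column3" };
var myColumn = req.Query["MyColumn"].ToString();
if (!allowedColumns.Contains(myColumn))
    return new BadRequestObjectResult($"Unknown column: {myColumn}");
var query = $"SELECT {myColumn} FROM MyTable"; // safe to interpolate after the check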
I have a pipeline that includes a simple copy task that reads data from an SFTP source and writes to a table within a server. I have successfully parameterized the pipeline to prompt for which server and table I want to use at runtime, but I want to specify a list of server/table pairs in a table that is accessed by a lookup task for use as parameters, instead of needing to manually enter the server/table each time. For now it's only three combinations of servers and tables, but that number should be able to flex as needed.
The issue I'm running into is that when I try to specify the array variable as my parameter in the lookup task within a ForEach loop, the pipeline fails, telling me I need to specify an integer in the value array. I understand what it's telling me, but it doesn't seem logical that I'd have to specify '0', '1', '2' and so on each time.
How do I just let it iterate through the server and table pairs until there aren't any more to process? I'm not sure of the exact syntax, but there has to be a way to tell it to run the pipeline once with this server and table, again with a different server and table, and so on until no more pairs are found in the table.
Not sure if it matters, but I am on the Data Flow preview and using ADF v2.
https://learn.microsoft.com/en-us/azure/data-factory/control-flow-for-each-activity#iteration-expression-language
I guess you want to access the current iteration item, which is item() in the ADF expression language.
If you append a ForEach activity after a Lookup activity and put the output of the Lookup activity in the Items field of the ForEach activity, then item() refers to the current item of the Lookup output.
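For example, with a hypothetical Lookup activity named LookupServerTablePairs whose table has ServerName and TableName columns, the ForEach activity would be wired up like this, and the inner copy activity would reference @item().ServerName and @item().TableName as its parameters:

{
    "name": "ForEachPair",
    "type": "ForEach",
    "typeProperties": {
        "items": {
            "value": "@activity('LookupServerTablePairs').output.value",
            "type": "Expression"
        }
    }
}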
I am trying to access an integer value from a CRM entity using Parse JSON in Logic Apps. It is returning a null value instead of the integer value.
"territoryinteger": "#{items('For_each')?['_vcm_territoryid_value#OData.Community.Display.V1.FormattedValue']}",
Any help is appreciated.
Thanks
If you look at the Runs History (under the Logic App Overview) and drill down to the Get Record or Get List action block, you can see the JSON output that was retrieved and visually verify whether the query is returning a value.
I have a query like:
SELECT
*
INTO [documentdb]
FROM
[iothub]
TIMESTAMP BY eventenqueuedutctime
I need to use * because the data is dynamic and doesn't have a specific schema. The problem is that the IoT Hub system information is also written to documentdb by this query. Is there any way to exclude the IoT Hub system information?
Thanks.
This is not possible currently, but it will be possible in job compatibility level 1.2 in the near future. For now, one workaround is to create a post-trigger in Cosmos DB to remove this property from the document.
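A rough sketch of such a post-trigger (untested; note that Cosmos DB triggers don't fire automatically, they must be requested on each insert, e.g. via the SDK's post-trigger option):

// Post-trigger sketch: strip the IoTHub property from the document that was just written
function removeIoTHubInfo() {
    var response = getContext().getResponse();
    var doc = response.getBody(); // the created document
    if (doc.IoTHub !== undefined) {
        delete doc.IoTHub;
        __.replaceDocument(doc._self, doc, function (err) {
            if (err) throw err;
        });
    }
}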
To answer your question: the Azure Stream Analytics service doesn't have built-in support for excluding columns from dynamic data (the IoT Hub information), but we can achieve this with a UDF. Here is more info on UDFs.
A UDF can delete the column from the input data and return the updated JSON to us.
There are basically two steps to achieve this:
Create a JavaScript UDF.
Go to Functions in the left-hand navigation (below Inputs).
Click Add --> JavaScript UDF.
Give the function the alias removeiothubinfo.
Keep the output type as any.
Copy-paste the following code into the function definition.
// Remove the IoTHub system-properties column and return the updated record
function main(input) {
    delete input['IoTHub'];
    return input;
}
Click on Save
Update the query.
Go to the query editor and copy-paste the following query:
WITH NewInput AS
(
SELECT
udf.removeiothubinfo(iothub) AS UpdatedJson
FROM
[iothub]
)
SELECT
UpdatedJson.*
INTO
[documentdb]
FROM
NewInput
Click on Save
I suggest testing your query before running the job by uploading a sample file containing JSON with a similar structure.
Edit: even in job compatibility level 1.2 there is no additional functionality to achieve this. Check this out for more info.
As @chetangm said in his answer, no such filtering mechanism is supported in ASA so far. Yes, you could create a trigger in Cosmos DB, however it needs to be invoked explicitly from SDK code or the REST API; it won't be triggered automatically.
I'll offer another workaround: an Azure Function with a Cosmos DB trigger. It executes when data is added to or changed in Azure Cosmos DB, and you just need to remove the fields you don't want in the function code.
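A rough sketch of that approach (untested; the database, collection, and connection names are placeholders, and it assumes the Microsoft.Azure.WebJobs.Extensions.CosmosDB binding):

using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.Azure.Documents;
using Microsoft.Azure.Documents.Client;
using Microsoft.Azure.WebJobs;
using Newtonsoft.Json.Linq;

public static class StripIoTHubInfo
{
    [FunctionName("StripIoTHubInfo")]
    public static async Task Run(
        [CosmosDBTrigger("mydb", "documentdb",
            ConnectionStringSetting = "CosmosConnection",
            LeaseCollectionName = "leases",
            CreateLeaseCollectionIfNotExists = true)] IReadOnlyList<Document> changes,
        [CosmosDB("mydb", "documentdb",
            ConnectionStringSetting = "CosmosConnection")] DocumentClient client)
    {
        foreach (var doc in changes)
        {
            var json = JObject.Parse(doc.ToString()); // Document serializes to its JSON
            // Replace only when the property is present, so documents we have
            // already cleaned don't re-enter the change feed in a loop.
            if (json.Remove("IoTHub"))
                await client.ReplaceDocumentAsync(doc.SelfLink, json);
        }
    }
}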
Does anyone know how data is being retrieved from Table storage?
var result = ctx.CreateQuery<Contact>("Contacts")
    .Where(x => x.PartitionKey == "key")
    .Take(50)
    .AsTableServiceQuery<Contact>()
    .Execute();

foreach (var item in result)
{
    Console.WriteLine(item.FirstName);
}
Does it get all the items from storage and then loop through them, or does it fetch each item separately?
Take a look at the following links.
This one talks about the basics of table storage - http://msdn.microsoft.com/en-us/magazine/ff796231.aspx
This one covers more than you are asking about, but there are some How To code examples that might be useful for querying table storage - http://www.windowsazure.com/en-us/develop/net/how-to-guides/table-services/
I also recommend this video from the PDC. It's a deep dive into tables and queues in Azure. - http://www.microsoftpdc.com/2009/svc09
You could check this yourself with Fiddler. The Table service is a REST service: the CreateQuery() method builds a REST query, executes an HTTP call, and then parses the result, which is XML containing all the entities for the query (limited to 1000 per request, with continuation tokens if the result set is larger). All the items are in the result XML; there is no reason to query every single item separately.
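For illustration, the query above would translate into a single REST request roughly like this (the account name is invented); when more than 1000 entities match, the response carries x-ms-continuation-* headers for fetching the next page:

GET https://myaccount.table.core.windows.net/Contacts()?$filter=PartitionKey%20eq%20'key'&$top=50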