Unable to query Azure Table Storage using Azure CLI

I want to filter the entries in my Azure Storage Table, whose structure looks like the following. I want to filter the entries based on a given Id, for example JD.98755. How can we achieve this?
{
  "items": [
    {
      "selectionId": {
        "Id": "JD.98755",
        "status": 0
      },
      "Consortium": "xxxxxx",
      "CreatedTime": "2019-09-06T09:34:07.551260+00:00",
      "RowKey": "yyyyyy",
      "PartitionKey": "zzzzzz-zzzzz-zz-zzzzz-zz",
      "Timestamp": "2019-09-06T09:41:34.660306+00:00",
      "etag": "W/\"datetime'2019-09-06T09%3A41%3A34.6603060Z'\""
    }
  ],
  "nextMarker": {}
}
I can filter on other elements, like Consortium, using the query below, but not on the Id:
az storage entity query -t test --account-name zuhdefault --filter "Consortium eq 'test'"
I tried something like the following to filter based on the given Id, but it returned no results:
az storage entity query -t test --account-name zuhdefault --filter "Id eq 'JD.98755'"
{
  "items": [],
  "nextMarker": {}
}

I do agree with @Gaurav Mantri, and one other approach you can use is the following. I have reproduced this in my environment and got the expected results.
Firstly, you need to store the output of the command (run without a filter) in a variable; I stored the output in $x.
Then you can convert the output from JSON:
$r = $x | ConvertFrom-Json
Then you can filter the items on the nested Id value to get the items with Id JD.98755, as in the sketch below.
If you have more data, store the next page of output in a variable, convert it into objects using ConvertFrom-Json, and repeat the steps above.
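Putting it together, a minimal PowerShell sketch of this approach, assuming selectionId comes back as a nested object as shown in the question's output:

# Query the table without a server-side filter and capture the JSON output
$x = az storage entity query -t test --account-name zuhdefault -o json

# Convert the JSON text into PowerShell objects
$r = $x | ConvertFrom-Json

# Filter client-side on the nested Id
# (if selectionId is stored as a serialized JSON string instead, parse it first:
#  ($_.selectionId | ConvertFrom-Json).Id)
$r.items | Where-Object { $_.selectionId.Id -eq 'JD.98755' }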

The reason you are not getting any data back is that Azure Table Storage is a simple key/value store and you are storing JSON there (in all likelihood, the SDK serialized the JSON data and stored it as a string in Table Storage).
Considering there is no top-level property named Id, you will not be able to search on it.
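In all likelihood the stored entity actually looks something like this sketch, with selectionId held as one opaque string property rather than as nested keys:
{
  "PartitionKey": "zzzzzz-zzzzz-zz-zzzzz-zz",
  "RowKey": "yyyyyy",
  "Consortium": "xxxxxx",
  "selectionId": "{\"Id\": \"JD.98755\", \"status\": 0}"
}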
If you need to store JSON documents, one option is to make use of Cosmos DB (with the SQL API) instead of Table Storage. The other option would be to flatten your JSON so that you store it as key/value pairs. In this scenario, your data would look something like:
{
  "selectionId_Id": "JD.98755",
  "selectionId_status": 0,
  "Consortium": "xxxxxx",
  "CreatedTime": "2019-09-06T09:34:07.551260+00:00",
  "RowKey": "yyyyyy",
  "PartitionKey": "zzzzzz-zzzzz-zz-zzzzz-zz",
  "Timestamp": "2019-09-06T09:41:34.660306+00:00",
  "etag": "W/\"datetime'2019-09-06T09%3A41%3A34.6603060Z'\""
}
Then you should be able to filter by selectionId_Id.
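For example, reusing the table and account names from the question, the original filter syntax then works against the flattened property name:
az storage entity query -t test --account-name zuhdefault --filter "selectionId_Id eq 'JD.98755'"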

Related

Azure Storage Explorer - Save results of query into a variable to be used in another query

Is it possible to save the query results into a variable, then use that variable to make a separate query in Azure Storage Explorer?
i.e.:
First Query
Select C.ID from C where C.Temp <> '' (this will return a set of IDs, which I need to save into a variable)
Second Query
Select * From C Where C.Temp2 IN (#Variable)
Edited to include sample data:
First set of data:
{
  "id": <string>,
  "Temp": <string>,
  "name": <string>,
  "Address": <string>,
  "City": <string>
}
Second set of data:
{
  "id": <string>,
  "Temp2": <string>,
  "name2": <string>,
  "Address2": <string>,
  "City2": <string>
}
Azure Storage Explorer supports storing structured, non-relational data.
As per my understanding, Azure Storage Explorer doesn't support sub-queries or parameterized queries; it supports basic filter and sort operations.
To perform complex operations/queries, you may need to use the Azure Storage SDK instead.
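Storage Explorer itself cannot chain queries, but the two-step pattern can be scripted outside it. A rough PowerShell sketch using the Azure CLI (the table and account names here are hypothetical, and since Table storage OData has no IN operator, the second filter is built as a chain of or clauses):

# First query: collect the ids where Temp is not empty
$first = az storage entity query -t tableA --account-name myaccount --filter "Temp ne ''" -o json | ConvertFrom-Json

# Build an OData filter equivalent to IN (...) from the collected ids
$clauses = $first.items | ForEach-Object { "Temp2 eq '$($_.id)'" }
$filter = $clauses -join ' or '

# Second query: reuse the collected values
az storage entity query -t tableB --account-name myaccount --filter $filter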
References:
Table-storage-overview
Storage-explorer
Supported query operators-LINQ

Save Log Analytics API results to Azure Table Storage

I am starting to learn Azure Logic Apps, and my first task is to store the result of a specific Kusto query by calling the Log Analytics API at https://api.loganalytics.io/v1/workspaces/{guid}/query.
Currently, I can successfully call the Log Analytics API using the HTTP action in a Logic App, and this is a sample of what it returns:
{
  "tables": [
    {
      "name": "PrimaryResult",
      "columns": [
        {
          "name": "UserPrincipalName",
          "type": "string"
        },
        {
          "name": "OperationName",
          "type": "string"
        },
        {
          "name": "ResultDescription",
          "type": "string"
        },
        {
          "name": "AuthMethod",
          "type": "string"
        },
        {
          "name": "TimeGenerated",
          "type": "string"
        }
      ],
      "rows": [
        [
          "first.name@email.com",
          "Fraud reported - no action taken",
          "Successfully reported fraud",
          "Phone call approval (Authentication phone)",
          "22-01-03 [09:01:03 AM]"
        ],
        [
          "last.name@email.com",
          "Fraud reported - no action taken",
          "Successfully reported fraud",
          "Phone call approval (Authentication phone)",
          "22-02-19 [01:28:29 AM]"
        ]
      ]
    }
  ]
}
From this result, I'm stuck on how I should iterate over the rows property of the JSON result and save that data to Azure Table Storage, matching each value to the corresponding entry in the columns property.
E.g.,
| UserPrincipalName    | OperationName                    | ResultDescription           | AuthMethod                                 | TimeGenerated          |
|----------------------|----------------------------------|-----------------------------|--------------------------------------------|------------------------|
| first.name@email.com | Fraud reported - no action taken | Successfully reported fraud | Phone call approval (Authentication phone) | 22-01-03 [09:01:03 AM] |
| last.name@email.com  | Fraud reported - no action taken | Successfully reported fraud | Phone call approval (Authentication phone) | 22-02-19 [01:28:29 AM] |
Hope someone can guide me on how to achieve this.
TIA!
You can use a Parse JSON action to extract the inner data from the output provided, and then use an Insert or Replace Entity action inside a For each loop. Here is a screenshot of my Logic App:
[Screenshot: Logic App designer]
And in my storage account:
[Screenshot: resulting entities in the storage account]
UPDATED ANSWER
Instead of directly using Insert or Replace Entity, I have initialised two variables and then used Insert or Merge Entity. One variable iterates over the rows and the other iterates over the columns using Until loops, fetching the required values from tables. Here is a screenshot of my Logic App:
[Screenshot: Logic App designer with the two Until loops]
In the first Until loop, the iteration continues until the rows variable equals the number of rows. Below is the expression:
length(body('Parse_JSON')?['tables']?[0]?['rows'])
In the second Until loop, the iteration continues until the columns variable equals the number of columns. Below is the expression:
length(body('Parse_JSON')?['tables']?[0]?['columns'])
Below is the expression I'm using in the Insert or Merge Entity action's entity field:
{
  "@{body('Parse_JSON')?['tables']?[0]?['columns']?[variables('columns')]?['name']}": "@{body('Parse_JSON')?['tables']?[0]?['rows']?[variables('rows')]?[variables('columns')]}"
}
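For the sample response, the first pass (rows = 0, columns = 0) would therefore produce a merge payload like this sketch, and each further pass of the inner loop merges one more column value into the same entity:
{
  "UserPrincipalName": "first.name@email.com"
}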
RESULTS:
[Screenshot: resulting entities in the storage table]
Firstly, use a Parse JSON action and load your JSON in as a sample to generate the schema.
Then use a For each (renamed accordingly) to traverse the rows; this will automatically generate an outer For each for the tables.
This is the trickier part: you need to generate a payload that contains your data, with some specific keys that you can then identify in your storage table.
This is my test flow with your data:
[Screenshot: test Logic App flow]
... and this is the end result in Storage Explorer:
[Screenshot: resulting entities in Storage Explorer]
The JSON within the entity field in the Insert Entity action looks like this ...
{
  "Data": "@{items('For_Each_Row')}",
  "PartitionKey": "@{guid()}",
  "RowKey": "@{guid()}"
}
I simply used GUIDs to make it work, but you'd want to come up with some kind of key from your own data to make it much more rational; maybe the date field or something similar.
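For comparison, the same rows-to-columns mapping can be scripted outside a Logic App. A minimal PowerShell sketch (the response is read from a local file, the table name logdata and account name myaccount are hypothetical, and GUIDs stand in for a real key scheme as above):

# Load a saved Log Analytics response with the shape shown in the question
$resp = Get-Content response.json -Raw | ConvertFrom-Json
$table = $resp.tables[0]

foreach ($row in $table.rows) {
    # Pair each row value with its column name, like the rows/columns counters above
    $props = for ($i = 0; $i -lt $table.columns.Count; $i++) {
        "$($table.columns[$i].name)=$($row[$i])"
    }
    # Insert one entity per row
    az storage entity insert -t logdata --account-name myaccount --entity `
        "PartitionKey=$([guid]::NewGuid())" "RowKey=$([guid]::NewGuid())" $props
}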

Convert JSON objects to array using Azure data flow

I'm using Azure Data Flow with a Union to combine two sources, so the union contains JSON documents. Is there a way to convert these JSON documents to an array of documents?
Union contains:
{"key":1,"value":"test8"}
{"key":2,"value":"test6"}
{"key":3,"value":"test3"}
What I'm looking for is a way to get a format like this:
[
  {
    "key": 1,
    "value": "test8"
  },
  {
    "key": 2,
    "value": "test6"
  },
  {
    "key": 3,
    "value": "test3"
  }
]
Thanks for your help.
You could use an Aggregate transformation with the collect expression to combine all the JSON documents and pass them to a sink with a JSON dataset. But this will not output exactly the result you are looking for: it nests the documents under the aggregated column name in the output, as shown below.
Aggregate:
Column1: collect(@(key=key,value=value))
Data flow Output:
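With the three sample documents, that output would look roughly like this:
{
  "Column1": [
    { "key": 1, "value": "test8" },
    { "key": 2, "value": "test6" },
    { "key": 3, "value": "test3" }
  ]
}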
As an alternative, you can copy the union JSON documents to storage and then use a Copy data activity to convert the JSON documents to an array of documents.
Output:
[Screenshot: copied output as an array of documents]

How to archive old CosmosDB data to Azure Table using Azure Data Factory when CosmosDB collection documents have different properties?

I'm trying to archive old data from CosmosDB into Azure Tables but I'm very new to Azure Data Factory and I'm not sure what would be a good approach to do this. At first, I thought that this could be done with a Copy Activity but because the properties from my documents stored in the CosmosDB source vary, I'm getting mapping issues. Any idea on what would be a good approach to tackle this archiving process?
Basically, the way I want to store the data is to copy the document root properties as they are, and store the nested JSON as a serialized string.
For example, if I wanted to archive these 2 documents :
[
  {
    "identifier": "1st Guid here",
    "Contact": {
      "Name": "John Doe",
      "Age": 99
    }
  },
  {
    "identifier": "2nd Guid here",
    "Distributor": {
      "Name": "Jane Doe",
      "Phone": {
        "Number": "12345",
        "IsVerified": true
      }
    }
  }
]
I'd like these documents to be stored in Azure Table like this:
| identifier      | Contact                                   | Distributor                                                                     |
|-----------------|-------------------------------------------|---------------------------------------------------------------------------------|
| "1st Guid here" | "{ \"Name\": \"John Doe\", \"Age\": 99 }" | null                                                                            |
| "2nd Guid here" | null                                      | "{\"Name\":\"Jane Doe\",\"Phone\":{\"Number\":\"12345\",\"IsVerified\":true}}"  |
Is this possible with the Copy Activity?
I tried using the Mapping tab inside the Copy activity, but when I try to run it I get an error saying that the data type for one of the nested JSON columns, which is not present in the first row, cannot be inferred.
Please follow my configuration in the Mapping tab:
[Screenshot: Copy activity mapping configuration]
Test output with your sample data:
[Screenshot: resulting rows in the Azure Table]
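Outside ADF, the intended transformation itself can be illustrated with a small PowerShell sketch that keeps root properties as they are and serializes nested objects to strings (the documents are read from a hypothetical local export file, docs.json):

# Load the exported documents (the two-document sample from the question)
$docs = Get-Content docs.json -Raw | ConvertFrom-Json

# Flatten: scalars stay as-is, nested objects become serialized JSON strings
$rows = foreach ($d in $docs) {
    $row = [ordered]@{}
    foreach ($p in $d.PSObject.Properties) {
        if ($p.Value -is [System.Management.Automation.PSCustomObject]) {
            $row[$p.Name] = $p.Value | ConvertTo-Json -Compress -Depth 10
        } else {
            $row[$p.Name] = $p.Value
        }
    }
    [pscustomobject]$row
}

# Each row now matches the desired table layout from the question
$rows | Format-List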

'GET MetaData' Activity output to SQL table using ADF v2

I'm using the Get Metadata activity in my pipelines to get all the folders, child items, and item types. But this activity gives its output in JSON format, which I'm unable to store in a variable so that I can iterate through the items. I need to store the folder metadata in a SQL table.
A sample output of the Get Metadata activity is below:
{
  "itemName": "ParentFolder",
  "itemType": "Folder",
  "childItems": [
    {
      "name": "ChildFolder1",
      "type": "Folder"
    },
    {
      "name": "ChildFolder2",
      "type": "Folder"
    },
    {
      "name": "ChildFolder3",
      "type": "Folder"
    }
  ],
  "effectiveIntegrationRuntime": "DefaultIntegrationRuntime (North Europe)",
  "executionDuration": 187
}
Can someone help me store the above JSON output of the Get Metadata activity in a SQL table like below?
The easiest way to do this is to pass the Get Metadata output as a string to a stored procedure and parse it in your SQL database using OPENJSON.
This is how to convert the output to a string:
@string(activity('Get Metadata').output)
Now you just pass that to a stored proc and then use OPENJSON to parse it.
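A minimal sketch of that parse step, driven from PowerShell here purely for illustration (the server, database, and dbo.FolderMetadata table are hypothetical; in practice the JSON string would arrive as a stored procedure parameter rather than being interpolated):

# The stringified Get Metadata output, loaded from a file for this sketch
$meta = Get-Content metadata.json -Raw

# OPENJSON expands the childItems array into name/type rows
$sql = @"
INSERT INTO dbo.FolderMetadata ([name], [type])
SELECT [name], [type]
FROM OPENJSON(N'$($meta -replace "'", "''")', '`$.childItems')
WITH ([name] nvarchar(400), [type] nvarchar(50));
"@

# Requires the SqlServer PowerShell module
Invoke-Sqlcmd -ServerInstance 'myserver.database.windows.net' -Database 'mydb' -Query $sql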
I have seen many others do this using an ADF ForEach; however, if you have thousands of files/folders you will end up paying a lot for this method over time (each loop iteration counts as an activity run).
