This is a sample JSON input packet. I'm writing transformation queries to extract the data, and they are working fine.
[{
"source": "xda",
"data":
[{
"masterTag": "UNIFY",
"speed": 180
}],
"EventEnqueuedUtcTime": "2018-07-20T19:28:18.5230000Z"
},
{
"source": "xda",
"data": [{
"masterTag": "UNIFY",
"speed": 214
}],
"EventEnqueuedUtcTime": "2018-07-20T19:28:20.5550000Z"
}
]
However, a custom property named "proFilter" has been added to the message object when it is sent to IoT Hub. It is not inside the payload but on the message object itself. I can read this property from an Azure Function, but I'm not sure how to get it in a Stream Analytics transformation query. Is there any way I can get it?
Basic transformation query:
WITH data AS
(
SELECT
source,
GetArrayElement(data,0) as data_packet
FROM input
)
SELECT
source,
data_packet.masterTag
INTO
output
FROM data
Include the following function in your SELECT statement:
GetMetadataPropertyValue(input, '[User].[proFilter]') AS proFilter
If you are interested in retrieving all your custom properties as a record, you can use
GetMetadataPropertyValue(input, '[User]') AS userprops
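Putting this together with the basic query above, a sketch (untested; it assumes the same input/output names as in the question):

```sql
WITH data AS
(
    SELECT
        source,
        GetArrayElement(data, 0) AS data_packet,
        -- surface the custom user property alongside the payload fields
        GetMetadataPropertyValue(input, '[User].[proFilter]') AS proFilter
    FROM input
)
SELECT
    source,
    data_packet.masterTag,
    proFilter
INTO
    output
FROM data
```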
See the GetMetadataPropertyValue documentation for further reference.
I have output from one of the tasks in a Logic App:
{
"headers": {
"Connection": "close",
"Content-Type": "application/json"
},
"body": {
"systemAlertId": "....",
"endTimeUtc": null,
"entities": [
{
"$id": "us_1",
"hostName": "...",
"azureID": "someID",
"type": "host"
},
{
"$id": "us_2",
"address": "fwdedwedwedwed",
"location": {
"countryCode": ""
},
"type": "ip"
}
]
}
}
I need to initialize a variable named resourceID that contains the value someID, which is read from the example above.
The value someID will always be found in the first member of the entities array, so I guess I need to use the first() function.
Any idea what the expression for the Initialize variable action should look like?
Thanks
Considering the data you are receiving from the HTTP trigger, I used a Parse JSON action to get at the inner values of the JSON. Here is how you can do it.
Now you can initialize resourceID using the 'Initialize variable' action and set its value to azureID as per your requirement.
Have a look at the Parse JSON action.
To reference or access properties in JavaScript Object Notation (JSON) content, you can create user-friendly fields or tokens for those properties by using the Parse JSON action. That way, you can select those properties from the dynamic content list when you specify inputs for your logic app. For this action, you can either provide a JSON schema or generate a JSON schema from your sample JSON content or payload.
With the information in the JSON available in an object, you can more easily access it.
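For example, assuming the Parse JSON action is named Parse_JSON (adjust the name to match your workflow), the value of the Initialize variable action could use an expression like:

first(body('Parse_JSON')?['entities'])?['azureID']

first() takes the first member of the entities array, and the ?[] operators return null instead of failing if a property is missing.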
So, I want to capture Administrative events sent by Azure to an Event Hub with a Stream Analytics job and forward only the events that match a specific criterion to an Azure Function. The events come in an object like this (heavily trimmed for simplicity):
{
"records": [
{
"resourceId": "<resource_path>",
"operationName": "MICROSOFT.COMPUTE/VIRTUALMACHINES/WRITE"
},
{
"time": "2021-03-19T19:19:56.0639872Z",
"operationName": "MICROSOFT.COMPUTE/VIRTUALMACHINES/WRITE",
"category": "Administrative",
"resultType": "Accept",
"resultSignature": "Accepted.Created",
"properties": {
"statusCode": "Created",
"serviceRequestId": "<trimmed>",
"eventCategory": "Administrative",
"message": "Microsoft.Compute/virtualMachines/write",
"hierarchy": "<trimmed>"
},
"tenantId": "<trimmed>"
}
],
"EventProcessedUtcTime": "2021-03-19T19:25:21.1471185Z",
"PartitionId": 1,
"EventEnqueuedUtcTime": "2021-03-19T19:20:43.9080000Z"
}
I want to filter the query based on these criteria: records[0].operationName = 'MICROSOFT.COMPUTE/VIRTUALMACHINES/WRITE' AND records[1].properties.statusCode = 'Created'. To achieve that, I began with the following query, which returns the record but lacks one of the criteria I need to match (statusCode):
SELECT
records
INTO
[output]
FROM
[input]
WHERE
GetArrayElement(records, 0).operationName = 'MICROSOFT.COMPUTE/VIRTUALMACHINES/WRITE'
Trying the query below doesn't work (it returns 0 matches):
SELECT
records
INTO
[output]
FROM
[input]
WHERE
GetArrayElement(records, 0).operationName = 'MICROSOFT.COMPUTE/VIRTUALMACHINES/WRITE'
AND GetArrayElement(records, 1).properties.statusCode = 'OK'
Does anyone have a clue about this?
Found out the solution! I need to use GetRecordPropertyValue, like so:
SELECT
records
INTO
[output]
FROM
[input]
WHERE
GetArrayElement(records, 0).operationName = 'MICROSOFT.COMPUTE/VIRTUALMACHINES/WRITE'
AND GetRecordPropertyValue(GetArrayElement(records, 1).properties, 'statusCode') = 'Created'
Looks a bit clumsy to me, but it worked!
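If the goal is to match any record in the array that carries both properties (rather than fixed indices 0 and 1), an alternative sketch is to flatten the array with CROSS APPLY GetArrayElements and filter per record (untested; adjust input/output names to your job):

```sql
SELECT
    r.ArrayValue AS record
INTO
    [output]
FROM
    [input]
-- one output row per element of the records array
CROSS APPLY GetArrayElements(records) AS r
WHERE
    r.ArrayValue.operationName = 'MICROSOFT.COMPUTE/VIRTUALMACHINES/WRITE'
    AND GetRecordPropertyValue(r.ArrayValue.properties, 'statusCode') = 'Created'
```

Note this changes the semantics slightly: it tests each record on its own instead of pairing criteria across two different array positions.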
IoT Edge v2 with the modbus module sends data to IoT Hub in the format of:
[
{
"DisplayName": "Voltage",
"HwId": "",
"Address": "400001",
"Value": "200",
"SourceTimestamp": "2019-01-03 23:40:24"
},
{
"DisplayName": "Voltage",
"HwId": "",
"Address": "400002",
"Value": "24503",
"SourceTimestamp": "2019-01-03 23:40:24"
},
...
]
I want to convert this array to rows using a Stream Analytics query containing CROSS APPLY GetArrayElements(), but this function requires an array name. Obviously there is no name. Any suggestions?
https://learn.microsoft.com/en-us/stream-analytics-query/getarrayelements-azure-stream-analytics
https://learn.microsoft.com/en-us/azure/stream-analytics/stream-analytics-parsing-json
Yes, it needs an array name. CROSS APPLY GetArrayElements() is used for nested arrays.
Example:
[{
"source": "xda",
"data":
[{
"masterTag": "UNIFY1",
"speed": 180
},
{
"masterTag": "UNIFY2",
"speed": 180
}],
"EventEnqueuedUtcTime": "2018-07-20T19:28:18.5230000Z"
},
{
"source": "xda",
"data": [{
"masterTag": "UNIFY3",
"speed": 214
},
{
"masterTag": "UNIFY4",
"speed": 180
}],
"EventEnqueuedUtcTime": "2018-07-20T19:28:20.5550000Z"
}
]
You could use the SQL below to convert it to rows:
SELECT
jsoninput.source,
arrayElement.ArrayValue.masterTag
INTO
output
FROM jsoninput
CROSS APPLY GetArrayElements(jsoninput.data) AS arrayElement
However, the input data you provided here is a pure array. If you want to convert this array to rows, just use:
select jsoninput.* from jsoninput
You don't have to use GetArrayElements. Selecting JSON array as the input format is enough: Stream Analytics reads each object in the array as a record. The same applies to line- or whitespace-separated JSON objects; each object is read as a record.
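For example, with the line-separated variant, each line below would arrive as its own record, with no array handling needed in the query:

```json
{"DisplayName": "Voltage", "Address": "400001", "Value": "200", "SourceTimestamp": "2019-01-03 23:40:24"}
{"DisplayName": "Voltage", "Address": "400002", "Value": "24503", "SourceTimestamp": "2019-01-03 23:40:24"}
```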
I am trying to use Azure Search Service for querying data based on spatial data.
I want to filter data based on geography. The query tried out in Search explorer is search=*&$filter=geo.distance(geolocation, geography'POINT(9.2869001 47.3532887)') le 50
The type defined for the geolocation field is Edm.GeographyPoint, and the attributes set are filterable and retrievable.
But I am not getting results; instead I get the message "Expected a JSON object, array or literal" in the result window.
What is wrong with the query here?
I am using the Standard plan for Azure, and the API version used is 2016-09-01.
Does your geolocation field have a format similar to the one below?
"location": {
  "type": "Point",
  "coordinates": [ -121.355, 47.71 ],
  "crs": { "type": "name", "properties": { "name": "EPSG:4326" } }
}
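For reference, an Edm.GeographyPoint value in an indexed document is a GeoJSON Point, with coordinates ordered longitude first, then latitude (the crs block above is optional). A minimal sketch of a document field:

```json
{
  "geolocation": {
    "type": "Point",
    "coordinates": [ 9.2869001, 47.3532887 ]
  }
}
```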
I have set up my application to consume the Content Delivery API using the Contentful SDK. It was all hunky-dory until now, when I realized the field type for each field in the content model is missing from the API response.
Am I missing something? I am providing more details about the API and its response below.
API response
The issue is that if I don't know the field type, I would have to ask the content writers to stick to a specific template and order of fields, instead of rendering the fields dynamically while parsing the response.
Please help!
You do not get the field types in the response, but you do get the content type id. It is expected that you already know what type of fields a specific content type contains.
The content type id can be found in the sys.contentType.sys.id property of each entry. With this information you could select which template to render.
If you still need to dynamically decide how to render based on the type of a field you would have to resort to the typeof operator to check what the type of each field is. You would lose out on the possibility to differentiate between specific Contentful properties though as they would all be returned as object.
You could also call the content type endpoint to fetch the entire content model from the Contentful API. http://cdn.contentful.com/spaces/space-id/content_types/
This would give you the fields each content type contains and the type of each field, in the following structure:
{
"sys": {
// sys properties
},
"displayField": "productName",
"name": "Product",
"description": null,
"fields": [
{
"id": "productName",
"name": "Product name",
"type": "Text",
"localized": true,
"required": true,
"disabled": false,
"omitted": false
},
{
"id": "slug",
"name": "Slug",
"type": "Symbol",
"localized": false,
"required": false,
"disabled": false,
"omitted": false
},
// further fields
]
}
It would result in multiple API calls to get the information you want though.
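If you do fetch the content model, a small sketch (Python; the helper name and sample payload are illustrative, not part of the Contentful SDK) of turning that response into a field-id to field-type lookup, so entries can be rendered dynamically:

```python
# Build a {field_id: field_type} lookup from a content type payload
# shaped like the example above, so each entry field can be dispatched
# to a renderer without hard-coding the template order.
def field_type_map(content_type):
    return {f["id"]: f["type"] for f in content_type.get("fields", [])}

# Trimmed sample payload mirroring the structure shown above.
product_type = {
    "displayField": "productName",
    "name": "Product",
    "fields": [
        {"id": "productName", "name": "Product name", "type": "Text"},
        {"id": "slug", "name": "Slug", "type": "Symbol"},
    ],
}

print(field_type_map(product_type))  # {'productName': 'Text', 'slug': 'Symbol'}
```

The lookup can be cached per content type id, so the extra content-type API calls happen once rather than per entry.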