Azure Automation job OData $filter

Background
I have an application which gets the list of runbook jobs using the REST API.
I would like to apply $filter on properties/parameters (see the JSON at the end). parameters is of type IDictionary<string,string>.
Issue
$filter works fine for most of the properties but fails for the IDictionary<string,string> parameters property.
This is what I am trying:
https://management.azure.com/subscriptions/XXX/resourceGroups/XXX/providers/Microsoft.Automation/automationAccounts/XXX/jobs?$filter=properties/parameters/any(keyValue: keyValue/owner eq 'Adam@test.com')&api-version=2015-10-31
The request fails with the error:
{
  "code": "BadRequest",
  "message": "Could not find a property named 'owner' on type 'System.Collections.Generic.KeyValuePair_2OfString_String'."
}
Question
Is it possible to filter the way I am trying?
If yes, then what am I doing wrong?
JSON response on which I want to apply the filter:
"value": [
{
"id": "/subscriptions/XXX/resourceGroups/XXX/providers/Microsoft.Automation/automationAccounts/XXX/jobs/XXX",
"properties": {
"jobId": "XXX",
"runbook": {
"name": "HelloWorldRunbook"
},
"schedule": null,
"provisioningState": "Succeeded",
"status": "Completed",
"creationTime": "2018-06-17T05:44:12.197+00:00",
"startTime": "2018-06-17T05:44:21.227+00:00",
"lastModifiedTime": "2018-06-17T05:44:43.43+00:00",
"endTime": "2018-06-17T05:44:43.43+00:00",
"jobScheduleId": "7fc134ac-d8bd-464e-b041-6e6b50f83f0c",
"runOn": null,
"parameters": {
"Owner": "Adam#test.com",
"mailBox": "test_mailbox#test.com"
}
}
............removed for brevity

When I try to run the List Jobs By Automation Account REST API, there is no properties/parameters returned in the response body.
...
"parameters": {
  "Owner": "Adam@test.com",
  "mailBox": "test_mailbox@test.com"
}
...
But according to the response JSON format you mentioned, the parameters property is an object, not a dictionary, so you could use $filter=properties/parameters/Owner eq 'Adam@test.com' to do that, provided the parameters property exists.
GET https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Automation/automationAccounts/{automationAccountName}/jobs?api-version=2015-10-31&$filter=properties/parameters/Owner eq 'Adam@test.com'
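If the request is issued from a raw HTTP client rather than a tool that encodes for you, the $filter value may need to be percent-encoded (spaces, quotes and the @ sign in particular). A hedged sketch of the same request with the filter encoded, placeholder names unchanged:

GET https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Automation/automationAccounts/{automationAccountName}/jobs?api-version=2015-10-31&$filter=properties%2Fparameters%2FOwner%20eq%20%27Adam%40test.com%27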

Related

Azure Data Factory with SP Activity - Debug and Publish fails

I've created an Azure Data Factory pipeline with one simple Stored Procedure Activity which is supposed to fetch data from a stored procedure residing in Azure SQL DB. The stored procedure accepts one input parameter. I've published these changes already.
When I click Validate, I get the error below, which gives me hardly any information:
{
  "code": "BadRequest",
  "message": null,
  "target": "pipeline//runid/dcb92f70-0a4b-4be1-943b-5ggn68365tyc",
  "details": null,
  "error": null
}
When I click Trigger now, it just says 'Failed to run pipeline' without any more details.
My pipeline JSON is given below:
{
  "name": "GetPopulationRecordsForAnalysis",
  "properties": {
    "description": "Gets Population Records",
    "activities": [
      {
        "name": "GetPopulationRecords",
        "type": "SqlServerStoredProcedure",
        "dependsOn": [],
        "policy": {
          "timeout": "7.00:00:00",
          "retry": 0,
          "retryIntervalInSeconds": 30,
          "secureOutput": false,
          "secureInput": false
        },
        "userProperties": [],
        "typeProperties": {
          "storedProcedureName": "[dbo].[usp_GetPopulationRecords]",
          "storedProcedureParameters": {
            "@countryID": {
              "value": "48",
              "type": "Int64"
            }
          }
        },
        "linkedServiceName": {
          "referenceName": "AzureSqlLinkedService",
          "type": "LinkedServiceReference"
        }
      }
    ],
    "annotations": [],
    "lastPublishTime": "2022-08-02T13:37:27Z"
  },
  "type": "Microsoft.DataFactory/factories/pipelines"
}
What am I doing wrong here?
I have figured out the issue now. The first mistake I was making was giving the complete SP name with the schema, '[' characters and all; usp_GetPopulationRecords works just fine. The second was adding an extra '@' character before my input parameter, the way we do when running it in SQL Server. That is not required here; countryID alone works fine. Hope my answer helps.
When we connect to the linked service in the Settings tab, the "Stored Procedure name" dropdown will populate with the names of the stored procedures present in the database.
We should select the required stored procedure, and upon clicking Import it will display the parameter Name, Type and Value (supply the Value).
We need not give '@' before the parameter name.
The corresponding JSON will look like the sketch below:
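A sketch of the corrected typeProperties block, reconstructed from the pipeline JSON in the question, with the schema brackets and the '@' prefix removed:

"typeProperties": {
  "storedProcedureName": "usp_GetPopulationRecords",
  "storedProcedureParameters": {
    "countryID": {
      "value": "48",
      "type": "Int64"
    }
  }
}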
This will be validated successfully in ADF.

How to purge records from Application Insights bucket?

I used to follow the instructions from https://learn.microsoft.com/en-us/rest/api/application-insights/components/purge, but now it produces the following error:
Purge operation on a workspace-based Application Insights resource is not supported. Instead, please run purge operation directly on the workspace, scoped to specific resource id. See more at https://aka.ms/purgeai
OK, so https://aka.ms/purgeai has the new instructions, but it does not explain how to "scope the operation to the specific resource id", because the records are still in an AI bucket, not the workspace.
So, how do I purge from AI in the new world?
EDIT 1
I should have been more clear. I do not want to purge all the records. For example, consider the following purge request body, which was enough in the past:
{
  "table": "customEvents",
  "filters": [
    {
      "column": "customMeasurements",
      "key": "seq",
      "operator": "==",
      "value": "''"
    }
  ]
}
It could be used to purge all the records with customMeasurements.seq being empty. Now how do I compose it with the _ResourceId field?
When giving the following request body:
{
  "table": "customEvents",
  "filters": [
    {
      "column": "customMeasurements",
      "key": "seq",
      "operator": "==",
      "value": "''"
    },
    {
      "column": "_ResourceId",
      "operator": "==",
      "value": "/subscriptions/.../resourcegroups/.../providers/microsoft.insights/components/..."
    }
  ]
}
The request fails with:
{
  "error": {
    "message": "The request had some invalid properties",
    "code": "BadArgumentError",
    "correlationId": "...",
    "innererror": {
      "code": "QueryValidationError",
      "message": "'customEvents' is not a valid table"
    }
  }
}
EDIT 2
There is still a problem. Indeed, consider the following query:
AppEvents
| where Measurements.seq == '' and _ResourceId == '/subscriptions/d...8/resourcegroups/app508-re-mark-test/providers/microsoft.insights/components/app508-re-mark-test-ai'
| summarize count()
This query works and returns some count.
Now I want to use the same condition to purge records in the same workspace:
{
  "table": "AppEvents",
  "filters": [
    {
      "column": "Measurements",
      "key": "seq",
      "operator": "==",
      "value": "''"
    },
    {
      "column": "_ResourceId",
      "operator": "==",
      "value": "/subscriptions/d...8/resourcegroups/app508-re-mark-test/providers/microsoft.insights/components/app508-re-mark-test-ai"
    }
  ]
}
But I get back the following error:
{
  "error": {
    "message": "The request had some invalid properties",
    "code": "BadArgumentError",
    "correlationId": "c08bb441-a316-4989-a527-202a101e515d",
    "innererror": {
      "code": "QueryValidationError",
      "message": "Unsupported column '_ResourceId'"
    }
  }
}
Note that omitting _ResourceId works, but I do want to be able to use it.
What am I missing?
Thank you for your question, and feedback.
What you will need to do is filter by the specific resource id, in addition to any other required filter.
Ex:
{
  "column": "_ResourceId",
  "operator": "==",
  "value": "/subscriptions/12341234-1234-1234-1234-123412341234/resourceGroups/SomeResourceGroup/providers/microsoft.insights/components/AppInsightResource"
}
Please note: the table names have changed between classic Application Insights and workspace-based Application Insights. For example, in your case the table name should be AppEvents instead of customEvents.
You can review this link for other table name changes.
We will also modify the articles to reflect this information.
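Putting this together, a hedged sketch of the purge request the answer describes, issued directly against the Log Analytics workspace rather than the Application Insights resource (the URI shape and api-version follow the workspace purge REST API; all names are placeholders):

POST https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.OperationalInsights/workspaces/{workspaceName}/purge?api-version=2020-08-01

{
  "table": "AppEvents",
  "filters": [
    {
      "column": "Measurements",
      "key": "seq",
      "operator": "==",
      "value": "''"
    },
    {
      "column": "_ResourceId",
      "operator": "==",
      "value": "/subscriptions/12341234-1234-1234-1234-123412341234/resourceGroups/SomeResourceGroup/providers/microsoft.insights/components/AppInsightResource"
    }
  ]
}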

Azure Data Factory Copy Activity error mapping JSON to SQL

I have an Azure Data Factory Copy Activity that is using a REST request to elastic search as the Source and attempting to map the response to a SQL table as the Sink. Everything works fine except when it attempts to map the data field that contains the dynamic JSON. I get the following error:
{
  "errorCode": "2200",
  "message": "ErrorCode=UserErrorUnsupportedHierarchicalComplexValue,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=The retrieved type of data JObject with value {\"name\":\"department\"} is not supported yet, please either remove the targeted column or enable skip incompatible row to skip them.,Source=Microsoft.DataTransfer.Common,'",
  "failureType": "UserError",
  "target": "CopyContents_Paged",
  "details": []
}
Here's an example of my mapping configuration:
"type": "TabularTranslator",
"mappings": [
{
"source": {
"path": "['_source']['id']"
},
"sink": {
"name": "ContentItemId",
"type": "String"
}
},
{
"source": {
"path": "['_source']['status']"
},
"sink": {
"name": "Status",
"type": "Int32"
}
},
{
"source": {
"path": "['_source']['data']"
},
"sink": {
"name": "Data",
"type": "String"
}
}
],
"collectionReference": "$['hits']['hits']"
}
The JSON in the data object is dynamic so I'm unable to do an explicit mapping for the nested fields within it. That's why I'm trying to just store the entire JSON object under data in a column of a SQL table.
How can I adjust my mapping configuration to allow this to work properly?
I posted this question on the MSDN forums and I was told that if you are using a tabular sink you can set the option "mapComplexValuesToString": true and it should allow complex JSON properties to be mapped correctly. This resolved my ADF Copy Activity issue.
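For reference, a minimal sketch of where that option sits, based on the mapping configuration from the question (the translator wrapper is how the mapping appears inside the copy activity's typeProperties):

"translator": {
  "type": "TabularTranslator",
  "mappings": [
    ...
  ],
  "collectionReference": "$['hits']['hits']",
  "mapComplexValuesToString": true
}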
I had the same problem a few days ago. You need to convert your JSON object to a JSON string; that will solve your mapping problem (UserErrorUnsupportedHierarchicalComplexValue).
Try it and tell me if it also resolves your error.

Stream Analytics Job deployed as Azure Resource Manager (ARM) template

I am trying to set up an output Event Hub for a Stream Analytics Job defined as a JSON template. Without the output bit the template is successfully deployed, however when adding the output definition it fails with:
Deployment failed. Correlation ID: <SOME_UUID>. {
  "code": "BadRequest",
  "message": "The JSON provided in the request body is invalid. Property 'eventHubName' value 'parameters('eh_name')' is not acceptable.",
  "details": {
    "code": "400",
    "message": "The JSON provided in the request body is invalid. Property 'eventHubName' value 'parameters('eh_name')' is not acceptable.",
    "correlationId": "<SOME_UUID>",
    "requestId": "<SOME_UUID>"
  }
}
I've defined the ARM template as:
{
  "$schema": "http://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "location": {
      "type": "string",
      "defaultValue": "westeurope"
    },
    "hubName": {
      "type": "string",
      "defaultValue": "fooIotHub"
    },
    "eh_name": {
      "defaultValue": "fooEhName",
      "type": "String"
    },
    "eh_namespace": {
      "defaultValue": "fooEhNamespace",
      "type": "String"
    },
    "streamAnalyticsJobName": {
      "type": "string",
      "defaultValue": "fooStreamAnalyticsJobName"
    }
  },
  "resources": [{
    "type": "Microsoft.StreamAnalytics/StreamingJobs",
    "apiVersion": "2016-03-01",
    "name": "[parameters('streamAnalyticsJobName')]",
    "location": "[resourceGroup().location]",
    "properties": {
      "sku": {
        "name": "standard"
      },
      "outputErrorPolicy": "Drop",
      "eventsOutOfOrderPolicy": "adjust",
      "eventsOutOfOrderMaxDelayInSeconds": 0,
      "eventsLateArrivalMaxDelayInSeconds": 86400,
      "inputs": [{
        "Name": "IoTHubInputLable",
        "Properties": {
          "DataSource": {
            "Properties": {
              "iotHubNamespace": "[parameters('hubName')]",
              "sharedAccessPolicyKey": "[listkeys(resourceId('Microsoft.Devices/IotHubs/IotHubKeys',parameters('hubName'), 'iothubowner'),'2016-02-03').primaryKey]",
              "sharedAccessPolicyName": "iothubowner",
              "endpoint": "messages/events"
            },
            "Type": "Microsoft.Devices/IotHubs"
          },
          "Serialization": {
            "Properties": {
              "Encoding": "UTF8"
            },
            "Type": "Json"
          },
          "Type": "Stream"
        }
      }],
      "transformation": {
        "name": "Transformation",
        "properties": {
          "streamingUnits": 1,
          "query": "<THE SQL-LIKE CODE FOR THE JOB QUERY>"
        }
      },
      "outputs": [{
        "name": "EventHubOutputLable",
        "properties": {
          "dataSource": {
            "type": "Microsoft.ServiceBus/EventHub",
            "properties": {
              "eventHubName": "parameters('eh_name')",
              "serviceBusNamespace": "parameters('eh_namespace')",
              "sharedAccessPolicyName": "RootManageSharedAccessKey"
            }
          },
          "serialization": {
            "Properties": {
              "Encoding": "UTF8"
            }
          }
        }
      }]
    }
  }]
}
Checking here https://learn.microsoft.com/en-us/azure/templates/microsoft.streamanalytics/streamingjobs
it looks like the structure of the JSON for the output matches the expected one (with the properties field along with the type).
I figured out those "Event Hub properties" in the Chrome browser using Developer Tools, checking the details of the HTTP request "GetOutputs"; otherwise I am not sure where I could see how to specify those properties. The structure looks quite similar to the one for the input IoT Hub (which works), which in that case uses different labels for the properties related to the IoT Hub details.
Checking this blog post https://blogs.msdn.microsoft.com/david/2017/07/20/building-azure-stream-analytics-resources-via-arm-templates-part-2-the-template/
the output part relates to Power BI, and it looks like a different structure is used for the properties: outputPowerBISource. So I tried to use the field outputEventHubSource for the Event Hub (from the checks using Chrome Developer Tools) instead of properties, but then I get this error:
Deployment failed. Correlation ID: <SOME_UUID>. {
  "code": "BadRequest",
  "message": "The JSON provided in the request body is invalid. Required property 'properties' not found in JSON. Path '', line 1, position 1419.",
  "details": {
    "code": "400",
    "message": "The JSON provided in the request body is invalid. Required property 'properties' not found in JSON. Path '', line 1, position 1419.",
    "correlationId": "<SOME_UUID>",
    "requestId": "<SOME_UUID>"
  }
}
The command I am using to deploy this template is the Azure CLI (from a Linux Debian machine):
az group deployment create \
--name "deployStreamAnalyticsJobs" \
--resource-group "MyRGName" \
--template-file ./templates/stream-analytics-jobs.json
How do I specify an output in an Azure Resource Manager (ARM) template for a Stream Analytics Job?
Any property that contains a parameter (or any expression that needs to be evaluated) must contain square brackets, e.g.
"eventHubName": "[parameters('eh_name')]",
"serviceBusNamespace": "[parameters('eh_namespace')]",
Otherwise the literal value in the quotes is used.
Does that help?
I found out all parameters need to be wrapped in square brackets (as pointed out in the other answer to this question).
Also, to dynamically retrieve the shared access policy key (or any other parameter of an existing resource like the Event Hub), a combination of functions like listKeys and resourceId must be used; see below for a full example of an Event Hub described as an output for a Stream Analytics Job.
In detail:
the parameters defined for eventHubName and serviceBusNamespace must be evaluated using square brackets (see how I defined those parameters in the JSON example in the body of the question above),
the shared access policy name could be either a hardcoded string or a parameter, as for sharedAccessPolicyName, while the key should be dynamically retrieved using "sharedAccessPolicyKey": "[listKeys(resourceId('Microsoft.EventHub/namespaces/eventhubs/authorizationRules', parameters('eh_namespace'), parameters('eh_name'), 'RootManageSharedAccessKey'),'2017-04-01').primaryKey]" (this is sensitive data and it should be protected, avoiding hardcoding it as a plain string).
The following JSON configuration shows an existing Event Hub defined as an output defined for the Stream Analytics Job:
"outputs": [{
"Name": "EventHubOutputLable",
"Properties": {
"DataSource": {
"Type": "Microsoft.ServiceBus/EventHub",
"Properties": {
"eventHubName": "[parameters('eh_name')]",
"serviceBusNamespace": "[parameters('eh_namespace')]",
"sharedAccessPolicyKey": "[listKeys(resourceId('Microsoft.EventHub/namespaces/eventhubs/authorizationRules', parameters('eh_namespace'), parameters('eh_name'), 'RootManageSharedAccessKey'),'2017-04-01').primaryKey]",
"sharedAccessPolicyName": "RootManageSharedAccessKey"
}
},
"Serialization": {
"Properties": {
"Encoding": "UTF8"
},
"Type": "Json"
}
}
}]

How do I use Azure Key Vault secret in linked template

I'm trying to create an Automation variable from a Key Vault secret. I assume I can do the same thing that is currently done in the main template for retrieving the Windows password, but it fails with the non-descriptive error below. Not sure what to do next to troubleshoot.
Error
{
  "code": "BadRequest",
  "message": "{\"Message\":\"The request is invalid.\",\"ModelState\":{\"variable.properties.value\":[\"An error has occurred.\"]}}"
}
Template
{
  "name": "mystring",
  "type": "variables",
  "apiVersion": "2015-10-31",
  "dependsOn": [
    "[concat('Microsoft.Automation/automationAccounts/', parameters('AutomationAccountName'))]"
  ],
  "properties": {
    "value": {
      "reference": {
        "keyVault": {
          "id": "[resourceId(subscription().subscriptionId, 'Utility-RG', 'Microsoft.KeyVault/vaults', 'MyKeyVault')]"
        },
        "secretName": "WindowsPasswordSecret"
      }
    },
    "description": "test var",
    "isEncrypted": false
  }
}
That error is indeed helpful. While I have no idea what went wrong there, I can tell you how to work around it: you need to pass the data from the Key Vault to the template (as an input parameter), not to the resource, and in the template use the parameter to assign the value to the object in question. A sketch follows the reference below.
Reference: https://github.com/4c74356b41/bbbb-is-the-word/blob/master/_arm/parent.json#L151
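A hedged sketch of that workaround, with illustrative names (linkedTemplateUri, linkedDeployment and windowsPassword are not from the original post): in the parent template, resolve the secret via a Key Vault reference in the nested deployment's parameters, since Key Vault references are only evaluated in deployment parameters, not in resource properties:

"resources": [{
  "type": "Microsoft.Resources/deployments",
  "apiVersion": "2017-05-10",
  "name": "linkedDeployment",
  "properties": {
    "mode": "Incremental",
    "templateLink": {
      "uri": "[parameters('linkedTemplateUri')]",
      "contentVersion": "1.0.0.0"
    },
    "parameters": {
      "windowsPassword": {
        "reference": {
          "keyVault": {
            "id": "[resourceId(subscription().subscriptionId, 'Utility-RG', 'Microsoft.KeyVault/vaults', 'MyKeyVault')]"
          },
          "secretName": "WindowsPasswordSecret"
        }
      }
    }
  }
}]

The linked template then declares windowsPassword as a securestring parameter and assigns it to the Automation variable with "value": "[parameters('windowsPassword')]".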
