I am pulling Recommendations from the Azure Advisor REST API and am not able to retrieve the extendedProperties values.
Specifically, I am looking for savings data from Recommendations of the Cost category.
In the following video at 58 seconds there is an example of the expected response.
https://www.youtube.com/watch?v=hAxrdmOAB8s
Are there specific permissions necessary to give my account in order to pull the data, or is the API not capable of supplying the values?
I am able to see the data in the portal, but the extendedProperties property is always empty.
I'm supposing you're trying the Recommendations - List API.
Essentially, extended properties expose additional information about a recommendation from Azure Advisor.
AFAIK, they need not be present for every recommendation, and listing them shouldn't require additional privileges. It could simply be that the types of recommendations you are receiving don't have any to show.
Here is a sample response that I received that has a mix of both:
[
  {
    "properties": {
      "category": "Cost",
      "impact": "Medium",
      "impactedField": "Microsoft.Network/publicIPAddresses",
      "impactedValue": "foo",
      "lastUpdated": "2020-03-20T14:10:24.6928024Z",
      "recommendationTypeId": "1b4dd958-c202-47af-af97-99bfc98376a5",
      "shortDescription": {
        "problem": "Delete Public IP address not associated to a running Azure resource",
        "solution": "Delete Public IP address not associated to a running Azure resource"
      },
      "extendedProperties": {}
    },
    "id": "xxx",
    "type": "Microsoft.Advisor/recommendations",
    "name": "xxx"
  },
  {
    "properties": {
      "category": "Cost",
      "impact": "Medium",
      "impactedField": "Microsoft.Sql/servers/databases",
      "impactedValue": "bar",
      "lastUpdated": "2020-03-20T13:27:35.8394386Z",
      "recommendationTypeId": "b83241d3-47ba-4603-8d5a-a1b3331e74f4",
      "shortDescription": {
        "problem": "Right-size underutilized SQL Databases",
        "solution": "Right-size underutilized SQL Databases"
      },
      "extendedProperties": {
        "ServerName": "fooserver",
        "DatabaseName": "fooDB",
        "IsInReplication": "1",
        "ResourceGroup": "xyz",
        "DatabaseSize": "6",
        "Region": "East US 2",
        "ObservationPeriodStartDate": "03/04/2020 00:00:00",
        "ObservationPeriodEndDate": "03/19/2020 00:00:00",
        "Recommended_DTU": "10",
        "Recommended_SKU": "S0",
        "HasRecommendation": "true"
      }
    }
  }
]
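If you want to sanity-check this from code, here's a minimal sketch (my illustration, not from the original post) of calling the list endpoint filtered to the Cost category. The subscription ID and token are placeholders, and 2017-04-19 is one published api-version of this API; adjust both to your environment.

// Minimal sketch: list Advisor recommendations in the Cost category and
// print their extendedProperties. subscriptionId and token are placeholders.
const subscriptionId = "00000000-0000-0000-0000-000000000000";
const token = "<Azure AD bearer token with at least Reader on the subscription>";

async function listCostRecommendations(): Promise<void> {
  const url =
    `https://management.azure.com/subscriptions/${subscriptionId}` +
    `/providers/Microsoft.Advisor/recommendations` +
    `?api-version=2017-04-19&$filter=${encodeURIComponent("Category eq 'Cost'")}`;

  const response = await fetch(url, {
    headers: { Authorization: `Bearer ${token}` },
  });
  const body = await response.json();

  for (const rec of body.value ?? []) {
    // extendedProperties can legitimately be {} for some recommendation types
    console.log(rec.properties.recommendationTypeId, rec.properties.extendedProperties);
  }
}

listCostRecommendations().catch(console.error);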
I'm testing out using GitHub and GitHub Actions to do policy as code for Azure. I have been successful following Microsoft's tutorials, where you export the policy you want to manage from the Azure portal to GitHub. This works fine, and I'm able to edit and run the workflows to update Azure with changes to policies.
What I'd like to know is, can you create NEW policies in GitHub and push them to Azure? It seems that you need to first export a custom policy from Azure into GitHub, then you can manage that policy. I say this because when I create a new policy and a workflow for that policy I get the following error in GitHub from the workflow:
> Did not find any policies to create/update. No policy files match the
> given patterns or no changes were detected.
The policy I have in the folder is called "policy.json".
I also see:
> Error occured while reading policy in path :
> policies/global_tagging_policy. Error : Error: Path :
> policies/global_tagging_policy. Property id is missing from the policy
> definition. Please add id to the definition file.
That leads me to believe I need an ID before I can push a policy, which says to me that Azure must have assigned one... I can't just make one up.
This is the policy I'm trying to push - just a tagging policy for testing. I don't have an ID in there; I read that you don't need to add one and that Azure would do it for you. Am I wrong?
{
  "properties": {
    "displayName": "test-policy",
    "description": "this is a test policy",
    "mode": "indexed",
    "parameters": {
      "tagName": {
        "type": "String",
        "metadata": {
          "displayName": "Tag Name",
          "description": "Name of the tag, such as 'environment'"
        }
      },
      "tagValue": {
        "type": "String",
        "metadata": {
          "displayName": "Tag Value",
          "description": "Value of the tag, such as 'production'"
        }
      }
    }
  },
  "policyRule": {
    "if": {
      "allOf": [
        {
          "field": "type",
          "equals": "Microsoft.Resources/subscriptions/resourceGroups"
        },
        {
          "field": "[concat('tags[', parameters('tagName'), ']')]",
          "exists": "false"
        }
      ]
    },
    "then": {
      "effect": "modify",
      "details": {
        "roleDefinitionIds": [
          "/providers/microsoft.authorization/roleDefinitions/b24988ac-6180-42a0-ab88-20f7382dd24c"
        ],
        "operations": [
          {
            "operation": "add",
            "field": "[concat('tags[', parameters('tagName'), ']')]",
            "value": "[parameters('tagValue')]"
          }
        ]
      }
    }
  }
}
This tripped me up too, so I did some exploring of the APIs and files. I've written about this in greater detail here.
To create a custom Policy, Initiative or Assignment file using GitHub Actions, you'll need to generate an id, name & type at the root of the JSON.
The name property needs to be unique at the scope where you assign it; I use GUIDs for this, but you don't have to. Bear in mind that if you define/assign at the Management Group scope, the name needs to be 24 characters or less.
The type denotes the type of file; the options are:
Microsoft.Authorization/policyDefinitions --> Policies
Microsoft.Authorization/policySetDefinitions --> Initiatives
Microsoft.Authorization/policyAssignments --> Assignments
The id is a bit more complex, and is a concatenation of the name and type values with other values mixed in.
The prefix depends on the scope at which you want to define your Policy/Initiative/Assignment.
For Management Groups it would be:
/providers/Microsoft.Management/managementGroups/00000000-0000-0000-0000-000000000000
Subscriptions would be:
/subscriptions/00000000-0000-0000-0000-000000000000
Resource Groups:
/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/myResourceGroup
In all cases, this is followed by: providers
Next is the type value, so whatever you've used for that, use it again here.
Finally, the last segment of the id is the same value you've used for the name property.
In one line, that is:
/{scope}/providers/{type}/{name}
So as an example:
Policy Definition scoped at a Management Group
{
  "id": "/providers/Microsoft.Management/managementGroups/00000000-0000-0000-0000-000000000000/providers/Microsoft.Authorization/policyDefinitions/5f44e572-5d2d-4edf-9d61",
  "name": "5f44e572-5d2d-4edf-9d61",
  "type": "Microsoft.Authorization/policyDefinitions",
  "properties": {}
}
Policy Definition scoped at a Subscription
{
  "id": "/subscriptions/00000000-0000-0000-0000-000000000000/providers/Microsoft.Authorization/policyDefinitions/8e4a8c58-1938-4467-8698",
  "name": "8e4a8c58-1938-4467-8698",
  "type": "Microsoft.Authorization/policyDefinitions",
  "properties": {}
}
Initiative scoped at a Management Group
{
  "id": "/providers/Microsoft.Management/managementGroups/00000000-0000-0000-0000-000000000000/providers/Microsoft.Authorization/policySetDefinitions/be09f23f-0252-4d8a-a805",
  "name": "be09f23f-0252-4d8a-a805",
  "type": "Microsoft.Authorization/policySetDefinitions",
  "properties": {}
}
Initiative scoped at a Subscription
{
  "id": "/subscriptions/00000000-0000-0000-0000-000000000000/providers/Microsoft.Authorization/policySetDefinitions/8e4a8c58-1938-4467-8698",
  "name": "8e4a8c58-1938-4467-8698",
  "type": "Microsoft.Authorization/policySetDefinitions",
  "properties": {}
}
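Putting the pattern together, here is a small helper of my own (a sketch, not part of the Actions tooling) that builds the id from the scope, type and name described above:

// Builds the id following the /{scope}/providers/{type}/{name} pattern.
type PolicyFileType =
  | "Microsoft.Authorization/policyDefinitions"
  | "Microsoft.Authorization/policySetDefinitions"
  | "Microsoft.Authorization/policyAssignments";

function buildPolicyId(scope: string, type: PolicyFileType, name: string): string {
  // scope examples:
  //   /providers/Microsoft.Management/managementGroups/<managementGroupId>
  //   /subscriptions/<subscriptionId>
  //   /subscriptions/<subscriptionId>/resourceGroups/<resourceGroupName>
  return `${scope}/providers/${type}/${name}`;
}

// Example: a Policy Definition scoped at a Subscription
const policyId = buildPolicyId(
  "/subscriptions/00000000-0000-0000-0000-000000000000",
  "Microsoft.Authorization/policyDefinitions",
  "8e4a8c58-1938-4467-8698"
);
// => "/subscriptions/00000000-0000-0000-0000-000000000000/providers/Microsoft.Authorization/policyDefinitions/8e4a8c58-1938-4467-8698"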
I have a scenario: I want to build an Azure Logic App that has to get documents from various folders in SharePoint, process them, and send an email notification. My confusion is: how can I give multiple input folder paths?
I'm going to make an assumption about your architecture in my answer. I'm assuming you want to process multiple files in different sites within the same SharePoint tenant, so not across tenants.
To achieve what you're asking for, I created a Parse JSON action which takes in the following structure (as an example, obviously the structure is the key point here, not the data) ...
Scenario 1 - Specific Files
[
  {
    "SiteName": "ExampleSolution",
    "FileName": "/Shared Documents/General/Book.xlsx"
  },
  {
    "SiteName": "TestSite",
    "FileName": "/Shared Documents/Test Folder/Document.docx"
  }
]
You need to authenticate to the SharePoint tenant with the appropriate user.
Then, in a For Each action, loop through each item and retrieve the contents of each document using the Get file content using path action.
Site Address = concat('https://yourtenant.sharepoint.com/sites/', items('For_each')?['SiteName'])
File Path = File Name (from Dynamic Content)
It will then retrieve the contents dynamically using those expressions (the original answer included screenshots of the retrieved contents: File 1, an Excel document, and File 2, a Word document).
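In code terms, the loop evaluates something like the following sketch (illustrative only; yourtenant is a placeholder, and the actual retrieval is done by the connector action):

// Illustrative sketch of what the For Each does with each item.
interface FileRef {
  SiteName: string;
  FileName: string;
}

const files: FileRef[] = [
  { SiteName: "ExampleSolution", FileName: "/Shared Documents/General/Book.xlsx" },
  { SiteName: "TestSite", FileName: "/Shared Documents/Test Folder/Document.docx" },
];

for (const item of files) {
  // Mirrors: concat('https://yourtenant.sharepoint.com/sites/', items('For_each')?['SiteName'])
  const siteAddress = `https://yourtenant.sharepoint.com/sites/${item.SiteName}`;
  // "Get file content using path" is then called with
  // Site Address = siteAddress and File Path = item.FileName.
  console.log(siteAddress, item.FileName);
}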
Scenario 2 - All Files
If you want to do it for all files, just change it up slightly ...
[
  {
    "FolderName": "/Shared Documents/General",
    "SiteName": "ExampleSolution"
  },
  {
    "FolderName": "/Shared Documents/Test Folder",
    "SiteName": "TestSite"
  }
]
Site Address = concat('https://yourtenant.sharepoint.com/sites/', items('For_each')?['SiteName'])
File Identifier = Folder Name (from Dynamic Content)
Output - Folder 1
[
  {
    "Id": "%252fShared%2bDocuments%252fGeneral%252fBook.xlsx",
    "Name": "Book.xlsx",
    "DisplayName": "Book.xlsx",
    "Path": "/Shared Documents/General/Book.xlsx",
    "LastModified": "2021-12-24T02:56:14Z",
    "Size": 15330,
    "MediaType": "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet",
    "IsFolder": false,
    "ETag": "\"{23948609-0DA0-43E0-994C-2703FEEC8567},7\"",
    "FileLocator": "dataset=aHR0cHM6Ly9icmFka2RpeG9uLnNoYXJlcG9pbnQuY29tL3NpdGVzL0V4YW1wbGVTb2x1dGlvbg==,id=JTI1MmZTaGFyZWQlMmJEb2N1bWVudHMlMjUyZkdlbmVyYWwlMjUyZkJvb2sueGxzeA==",
    "LastModifiedBy": null
  },
  {
    "Id": "%252fShared%2bDocuments%252fGeneral%252fTest%2bDocument.docx",
    "Name": "Test Document.docx",
    "DisplayName": "Test Document.docx",
    "Path": "/Shared Documents/General/Test Document.docx",
    "LastModified": "2021-12-30T11:49:28Z",
    "Size": 17959,
    "MediaType": "application/vnd.openxmlformats-officedocument.wordprocessingml.document",
    "IsFolder": false,
    "ETag": "\"{7A3C7133-02FC-4A63-9A58-E11A815AB351},8\"",
    "FileLocator": "dataset=aHR0cHM6Ly9icmFka2RpeG9u etc",
    "LastModifiedBy": null
  },
  {
    "Id": "%252fShared%2bDocuments%252fGeneral%252fHierarchy.xlsx",
    "Name": "Hierarchy.xlsx",
    "DisplayName": "Hierarchy.xlsx",
    "Path": "/Shared Documents/General/Hierarchy.xlsx",
    "LastModified": "2022-01-07T02:49:38Z",
    "Size": 41719,
    "MediaType": "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet",
    "IsFolder": false,
    "ETag": "\"{C919454C-48AB-4897-AD8C-E3F873B52E50},72\"",
    "FileLocator": "dataset=aHR0cHM6Ly9icmFka2RpeG9uL etc",
    "LastModifiedBy": null
  }
]
Output - Folder 2
[
  {
    "Id": "%252fShared%2bDocuments%252fTest%2bFolder%252fTest.xlsx",
    "Name": "Test.xlsx",
    "DisplayName": "Test.xlsx",
    "Path": "/Shared Documents/Test Folder/Test.xlsx",
    "LastModified": "2022-01-09T11:08:31Z",
    "Size": 17014,
    "MediaType": "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet",
    "IsFolder": false,
    "ETag": "\"{CCF71CE7-89E7-4F89-B5CB-0F078E22C951},163\"",
    "FileLocator": "dataset=aHR0cHM6Ly9icmFka2RpeG9u etc",
    "LastModifiedBy": null
  },
  {
    "Id": "%252fShared%2bDocuments%252fTest%2bFolder%252fDocument.docx",
    "Name": "Document.docx",
    "DisplayName": "Document.docx",
    "Path": "/Shared Documents/Test Folder/Document.docx",
    "LastModified": "2022-01-09T11:08:16Z",
    "Size": 17293,
    "MediaType": "application/vnd.openxmlformats-officedocument.wordprocessingml.document",
    "IsFolder": false,
    "ETag": "\"{317C5767-04EC-4264-A58B-27A3FA8E4DF3},3\"",
    "FileLocator": "dataset=aHR0cHM6Ly9icmFka2RpeG etc",
    "LastModifiedBy": null
  }
]
From here, just process each file individually using one of the file actions, like in the first scenario above.
Note: you'll need to work through subfolders and recursion yourself; there doesn't appear to be an easy way to do that with the built-in actions. A rough sketch of the idea follows.
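By way of illustration only, the traversal you'd have to build looks something like this; listFolder is a hypothetical helper standing in for however you enumerate a folder (e.g. the SharePoint "List folder" action or an Azure Function):

// Hypothetical depth-first traversal; listFolder is a stand-in you would
// implement yourself, since the connector does not recurse for you.
interface Entry {
  Path: string;
  IsFolder: boolean;
}

async function walk(
  siteAddress: string,
  folderPath: string,
  listFolder: (site: string, folder: string) => Promise<Entry[]>
): Promise<string[]> {
  const entries = await listFolder(siteAddress, folderPath);
  const filePaths: string[] = [];
  for (const entry of entries) {
    if (entry.IsFolder) {
      // Recurse into subfolders
      filePaths.push(...(await walk(siteAddress, entry.Path, listFolder)));
    } else {
      filePaths.push(entry.Path);
    }
  }
  return filePaths;
}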
You've provided very little information, but this should be enough for you to adapt accordingly.
Also, I strongly recommend you use a means other than a hardcoded JSON document in the action itself. There are far better options for housing that information which wouldn't require updating the action every time you want to add or delete a file.
The concept of the loop and the expressions is the most important part to grasp, as they will give you what you want.
Current situation: I currently have a working web app bot with LUIS integration (Node.js). I want to add QnA Maker to the bot. I have created a QnA Maker instance via the Azure Bot Service and created a knowledge base for it to use.
Issue: When adding the QnA Maker details to the bot and running with nodemon ./index.js, I get the error "Error: The encrypted value is not a valid format". I've tested, and this error is thrown when it tries to read the hostname value during
botConfig = BotConfiguration.loadSync(BOT_FILE, process.env.botFileSecret);
When pasting the hostname into the browser, Azure shows me a "Your App Service app is up and running" page, indicating the hostname is fine.
Questions:
How do I debug this further? Could it be something to do with how the QnA Maker is set up?
Both the QnA Maker and the knowledge base are published - is there something I have to add manually to the bot's config via the Azure portal to get it to recognise the QnA Maker?
A lot of the documentation is based on v3 of the Bot Framework, and I have no idea if it's still applicable.
QnA snippet in Bot file (some values omitted, not sure how sensitive they are):
{
  "type": "qna",
  "name": "pathqna",
  "KbId": "OMITTED",
  "subscriptionId": "OMITTED",
  "endpointKey": "OMITTED",
  "hostname": "https://pathqna.azurewebsites.net",
  "id": "7"
}
Documentation I've looked at:
https://learn.microsoft.com/en-gb/azure/bot-service/bot-builder-tutorial-dispatch?view=azure-bot-service-4.0&tabs=javascript
https://learn.microsoft.com/en-us/azure/cognitive-services/qnamaker/tutorials/create-qna-bot
https://github.com/Microsoft/botbuilder-tools/blob/master/packages/MSBot/docs/sample-bot-file.json
https://learn.microsoft.com/en-us/azure/cognitive-services/QnAMaker/how-to/troubleshooting-runtime#how-to-get-latest-qnamaker-runtime-updates
Full bot file with the empty padlock value (all OMITTED values have real keys and name has been changed to Test):
{
  "name": "Test",
  "padlock": "",
  "version": "2.0",
  "services": [
    {
      "tenantId": "OMITTED",
      "subscriptionId": "OMITTED",
      "resourceGroup": "OMITTED",
      "serviceName": "OMITTED",
      "type": "abs",
      "name": "OMITTED",
      "id": "1"
    },
    {
      "connectionString": "OMITTED",
      "tenantId": "OMITTED",
      "subscriptionId": "OMITTED",
      "resourceGroup": "OMITTED",
      "serviceName": "patha048",
      "type": "blob",
      "id": "2"
    },
    {
      "appId": "OMITTED",
      "appPassword": "OMITTED",
      "endpoint": "http://localhost:3978/api/messages",
      "type": "endpoint",
      "name": "development",
      "id": "3"
    },
    {
      "appId": "OMITTED",
      "appPassword": "OMITTED",
      "endpoint": "https://path-a048.azurewebsites.net/api/messages",
      "type": "endpoint",
      "name": "production",
      "id": "4"
    },
    {
      "instrumentationKey": "OMITTED",
      "applicationId": "OMITTED",
      "apiKeys": {},
      "tenantId": "OMITTED",
      "subscriptionId": "OMITTED",
      "resourceGroup": "OMITTED",
      "serviceName": "Patht6r6m4",
      "type": "appInsights",
      "id": "5"
    },
    {
      "appId": "OMITTED",
      "authoringKey": "OMITTED",
      "version": "0.1",
      "region": "westus",
      "type": "luis",
      "name": "BasicBotLuisApplication",
      "id": "6"
    },
    {
      "type": "qna",
      "name": "pathqna",
      "id": "7",
      "kbId": "OMITTED",
      "subscriptionKey": "OMITTED",
      "endpointKey": "OMITTED",
      "hostname": "https://pathqna.azurewebsites.net"
    }
  ]
}
Found the solution - use the msbot CLI to add the QnA Maker instead of adding it manually, as the file is encrypted and loses its decryption if you don't use msbot/the Emulator shrug
I removed the QnA snippet and ran this command (I've substituted generic values to preserve the real ones):
msbot connect qna --secret <botFileSecret> --name pathqna --kbId <KB-ID> --subscriptionKey <SUB_KEY> --endpointKey <ENDPOINT_KEY> --hostname "https://pathqna.azurewebsites.net" --bot Path.bot
This preserved the padlock value and added it successfully.
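As a quick sanity check afterwards, here is a short sketch (assuming the botframework-config package the bot's index.js already uses) that confirms the secret and padlock line up by loading the file directly; Path.bot and botFileSecret mirror the values from the question:

// Sketch: verify the .bot file decrypts with the given secret.
import { BotConfiguration } from "botframework-config";

try {
  const botConfig = BotConfiguration.loadSync("./Path.bot", process.env.botFileSecret);
  // If we get here, the padlock/secret pair is valid.
  console.log("Decrypted OK:", botConfig.services.map((s) => `${s.type}:${s.name}`));
} catch (err) {
  console.error("Secret/padlock mismatch:", (err as Error).message);
}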
Although the information you have provided is not enough to pin down a solution, please check the following steps.
"Error: The encrypted value is not a valid format"
Please check your bot secret keys once again.
Then, in your bot file, try removing the padlock value:
"padlock": ""
Also, I assume that in your actual code you have replaced 'OMITTED' with the real keys you obtained from the QnA Maker portal.
Provide a screenshot of the error if possible.
Guys, I couldn't find a similar question, so I'm asking here.
We have a client for the Microsoft REST API, and we normally receive consumed usage for multiple subscriptions.
But there's a problematic point.
There are some resource types which are billed depending on the consumed volume. Each of these has its own resource ID. For example, for blob storage there are at least 3 different IDs depending on the consumed amount (which, I suspect, is billed differently).
The question is: am I right in presuming that when the end user (our customer) exceeds the amount of resources allocated for a particular usage resource ID, the next report will contain a different resource ID for the same resource?
Here's the REST response I'm talking about:
{
  "usageStartTime": "2017-06-07T17:00:00-07:00",
  "usageEndTime": "2017-06-08T17:00:00-07:00",
  "resource": {
    "id": "8767aeb3-6909-4db2-9927-3f51e9a9085e", // I'm talking about this one
    "name": "Storage Admin",
    "category": "Storage",
    "subcategory": "Block Blob",
    "region": "Azure Stack"
  },
  "quantity": 0.217790327034891,
  "unit": "1 GB/Hr",
  "infoFields": {},
  "instanceData": {
    "resourceUri": "/subscriptions/ab7e2384-eeee-489a-a14f-1eb41ddd261d/resourcegroups/system.local/providers/Microsoft.Storage/storageaccounts/srphealthaccount",
    "location": "azurestack",
    "partNumber": "",
    "orderNumber": "",
    "additionalInfo": {
      "azureStack.MeterId": "09F8879E-87E9-4305-A572-4B7BE209F857",
      "azureStack.SubscriptionId": "dbd1aa30-e40d-4436-b465-3a8bc11df027",
      "azureStack.Location": "local",
      "azureStack.EventDateTime": "06/05/2017 06:00:00"
    },
    "attributes": {
      "objectType": "AzureUtilizationRecord"
    }
  }
}
I have created an Activity Log Alert in Azure that does a custom log search against an Application Insights instance.
The alert is working, and the action groups are notified through the channels I have set up.
The problem I'm having is creating that alert in the ARM template we are using to deploy the resources.
When looking at the automation script in the portal, the alerts (microsoft.insights/scheduledqueryrules) are left out and not visible.
I can't find any information online on how to write the condition in the template so it works with a custom log search.
Any suggestions on where to find info on how to write the condition, or on how to extract the template from the portal for those alerts?
This is an ARM template part that creates an alert with a scheduled query. It also adds an array of action groups that get notified when the alert is triggered:
{
  "name": "[parameters('scheduleQueryMonitorApplicationError')]",
  "type": "microsoft.insights/scheduledqueryrules",
  "apiVersion": "2018-04-16",
  "location": "[resourceGroup().location]",
  "tags": {
    "[concat('hidden-link:', resourceGroup().id, '/resourceGroups/', parameters('resourceGroupName'), '/providers/microsoft.insights/components/', parameters('applicationInsightsName'))]": "Resource"
  },
  "properties": {
    "description": "[parameters('scheduleQueryMonitorApplicationError')]",
    "enabled": "true",
    "source": {
      "query": "traces | where severityLevel == 3",
      "queryType": "ResultCount",
      "dataSourceId": "[resourceId('microsoft.insights/components', parameters('applicationInsightsName'))]"
    },
    "schedule": {
      "frequencyInMinutes": 5,
      "timeWindowInMinutes": 5
    },
    "action": {
      "odata.type": "Microsoft.WindowsAzure.Management.Monitoring.Alerts.Models.Microsoft.AppInsights.Nexus.DataContracts.Resources.ScheduledQueryRules.AlertingAction",
      "severity": "3",
      "aznsAction": {
        "actionGroup": "[array( resourceId('microsoft.insights/actiongroups', parameters('actionGroupName')) )]"
      },
      "trigger": {
        "threshold": 1,
        "thresholdOperator": "GreaterThan"
      }
    }
  },
  "dependsOn": [
    "[resourceId('microsoft.insights/components', parameters('applicationInsightsName'))]"
  ]
},
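For what it's worth, this fragment goes inside the resources array of your template; deploying with the usual tooling (for example az deployment group create or New-AzResourceGroupDeployment) should then create the alert rule alongside your other resources.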
Please see this Stack Overflow thread, where a similar question was asked. Elfocrash mentions that he wrote a blog post about it, explaining how it works. I tried his method and it works.