I am trying to retrieve the retail prices of Azure Storage, and I am following this document:
https://learn.microsoft.com/en-us/rest/api/cost-management/retail-prices/azure-retail-prices#api-endpoint
In that document I can only find prices for virtual machines, compute, and so on. When I try to use the Retail Prices API to get retail prices for a storage account, with the changes I made based on the document, the response comes back as null.
API endpoint: https://prices.azure.com/api/retail/prices
(https://i.imgur.com/14MfQN8.png)
I tried to reproduce this in my environment and initially got the same null value.
You can use the API endpoint below for storage accounts:
https://prices.azure.com/api/retail/prices?api-version=2021-10-01-preview&$filter=serviceFamily eq 'Storage'
When I pass the filter serviceFamily eq 'Storage', I can get the retail prices of Azure Storage successfully.
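For completeness, here is a minimal Python sketch (using the requests library) of the same call; it applies the serviceFamily filter and api-version from the endpoint above and follows the NextPageLink field to page through results:

import requests

# Public, unauthenticated Azure Retail Prices endpoint; results are paged
# via the NextPageLink field in each response.
url = "https://prices.azure.com/api/retail/prices"
params = {
    "api-version": "2021-10-01-preview",
    "$filter": "serviceFamily eq 'Storage'",
}

while url:
    data = requests.get(url, params=params).json()
    for item in data.get("Items", []):
        print(item["armRegionName"], item["meterName"], item["retailPrice"])
    url = data.get("NextPageLink")  # later pages already include the filter
    params = None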
We've been using APIM for years; it's generally not a bad platform, pretty stable and reliable. We recently onboarded a fairly big customer and, true to Murphy's law, APIM went down for almost an hour on one of the first days. Which, obviously, made no one happy.
APIM has been fine and dandy before and after the incident, but the Health history only goes back 4 weeks. It would help to show logs demonstrating that it was an outlier event. Is there a way to get the Health history from months or years back?
How about using the ADF service? I have tried using ADF; it is possible, and here are the implementation details.
Using an ADF Copy Data activity, I configured the Health History to be stored in Azure Blob Storage. Here is the workaround I used:
My APIM Instance Health History in the Portal:
Created an APIM instance > Added a few Function APIs > Tested
Created a Storage account > Blob Storage > Container
Created an ADF pipeline > Copy Data activity >
Source: Selected the REST API of the Azure resource and provided the API URL with its authorization header:
Sink: Added the Blob Storage as the dataset > Selected the container > Gave the sample file name TestJson.json, as I had selected the JSON format while mapping.
Mappings: Clicked Import Schemas and added the respective keys to the schema cells:
Note: You can set Import Schemas to None so that all the data is added automatically.
Validate/Run the Pipeline:
Result - Azure Resource Health History in the Blob Storage:
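Outside of ADF, a rough Python sketch of what a single run of this Copy Data activity does might look like the following; I am assuming the REST source is the Resource Health availabilityStatuses API, and the resource ID, api-version, connection string, and container/blob names are placeholders:

import json
import requests
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobClient

# Resource ID of the APIM instance whose health history we want (placeholder).
resource_id = "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.ApiManagement/service/<apim-name>"

# Acquire an ARM token (same role as the Authorization header in the ADF REST source).
token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token

# List historical availability statuses for the resource (Resource Health API).
url = f"https://management.azure.com{resource_id}/providers/Microsoft.ResourceHealth/availabilityStatuses"
resp = requests.get(url, params={"api-version": "2018-07-01"},
                    headers={"Authorization": f"Bearer {token}"})
resp.raise_for_status()

# Persist the raw JSON to blob storage, as the Copy Data activity's sink does.
blob = BlobClient.from_connection_string("<storage-connection-string>",
                                         container_name="healthhistory",
                                         blob_name="TestJson.json")
blob.upload_blob(json.dumps(resp.json(), indent=2), overwrite=True)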
I want to get the fully qualified instance ID (e.g. "/subscriptions/9xxxxxx5-6xxe-4xxc-8xx4-2xxxxxxxxx5/resourceGroups/test/providers/Microsoft.Compute/virtualMachines/vm-test") which is stored in a storage account table in Azure.
I have enabled guest-level monitoring on my virtual machine and exported metrics to a storage account table. In that table, the instance ID column (PARTITIONKEY) looks like the following.
":002Fsubscriptions:002F9xxxxxx5:002D6xxe:002D4xxc:002D8xx4:002D2xxxxxxxxx5:002FresourceGroups:002Ftest:002Fproviders:002FMicrosoft:002ECompute:002FvirtualMachines:002Fvm:002Dtest"
I am not sure how to convert the PARTITIONKEY column value back into an instance ID like that.
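The PARTITIONKEY appears to encode each reserved character as a colon followed by its four-digit hex ASCII code (:002F for '/', :002D for '-', :002E for '.'). A small Python sketch that reverses this, assuming that encoding holds throughout:

import re

def decode_partition_key(key: str) -> str:
    # Replace every ":XXXX" escape with the character whose hex code is XXXX,
    # e.g. ":002F" -> "/", ":002D" -> "-", ":002E" -> ".".
    return re.sub(r":([0-9A-Fa-f]{4})", lambda m: chr(int(m.group(1), 16)), key)

encoded = (":002Fsubscriptions:002F9xxxxxx5:002D6xxe:002D4xxc:002D8xx4:002D2xxxxxxxxx5"
           ":002FresourceGroups:002Ftest:002Fproviders:002FMicrosoft:002ECompute"
           ":002FvirtualMachines:002Fvm:002Dtest")
print(decode_partition_key(encoded))
# /subscriptions/9xxxxxx5-6xxe-4xxc-8xx4-2xxxxxxxxx5/resourceGroups/test/providers/Microsoft.Compute/virtualMachines/vm-test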
However, for your purpose of getting VM memory-related metrics, it is recommended to use Log Analytics. Search for the Log Analytics workspace resource in the Azure portal, narrow down to your specific VM scope, and then run a query such as:
Perf
| where ObjectName == "Memory"
Alternatively, you can execute an Analytics query using the Query - Get REST operation.
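A rough sketch of calling that API from Python (the workspace GUID is a placeholder, and I am assuming the api.loganalytics.io v1 query endpoint with an AAD token):

import requests
from azure.identity import DefaultAzureCredential

workspace_id = "<log-analytics-workspace-guid>"  # placeholder
token = DefaultAzureCredential().get_token("https://api.loganalytics.io/.default").token

# Run the same Perf query against the Log Analytics Query API ("Query - Get" style).
resp = requests.get(
    f"https://api.loganalytics.io/v1/workspaces/{workspace_id}/query",
    params={"query": 'Perf | where ObjectName == "Memory"'},
    headers={"Authorization": f"Bearer {token}"},
)
resp.raise_for_status()
print(resp.json()["tables"][0]["rows"][:5])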
For more information, you could read these docs.
https://learn.microsoft.com/en-us/azure/azure-monitor/log-query/get-started-portal
https://learn.microsoft.com/en-us/azure/azure-monitor/log-query/log-query-overview
Hope this helps.
I already posted my problem here, and they suggested I post it here.
I am trying to export data from Azure ML to Azure Storage, but I am getting this error:
Error writing to cloud storage: The remote server returned an error: (400) Bad Request.. Please check the url. . ( Error 0151 )
My blob storage configuration is Storage V2 / Standard with Require secure transfer set to Enabled.
If I set Require secure transfer to Disabled, the export works fine.
How can I export data to my blob storage with Require secure transfer set to Enabled?
According to the official tutorial Export to Azure Blob Storage, there are two authentication types for exporting data to Azure Blob Storage: SAS and Account. They are described as follows.
For Authentication type, choose Public (SAS URL) if you know that the storage supports access via a SAS URL.
A SAS URL is a special type of URL that can be generated by using an Azure storage utility, and is available for only a limited time. It contains all the information that is needed for authentication and download.
For URI, type or paste the full URI that defines the account and the public blob.
For private accounts, choose Account, and provide the account name and the account key, so that the experiment can write to the storage account.
Account name: Type or paste the name of the account where you want to save the data. For example, if the full URL of the storage account is http://myshared.blob.core.windows.net, you would type myshared.
Account key: Paste the storage access key that is associated with the account.
I tried a simple module combination, as shown in the figure and Python code below, to reproduce the issue you hit.
import pandas as pd

def azureml_main(dataframe1 = None, dataframe2 = None):
    # Return a small sample dataframe to feed into the Export Data module.
    dataframe1 = pd.DataFrame(data={'col1': [1, 2], 'col2': [3, 4]})
    return dataframe1,
When I tried the Account authentication type with my Blob Storage V2 account, I got the same issue as yours, with error code Error 0151, shown below after clicking the View error log button under the View output log link.
Error 0151
There was an error writing to cloud storage. Please check the URL.
This error in Azure Machine Learning occurs when the module tries to write data to cloud storage but the URL is unavailable or invalid.
Resolution
Check the URL and verify that it is writable.
Exception Messages
Error writing to cloud storage (possibly a bad url).
Error writing to cloud storage: {0}. Please check the url.
Based on the error description above, the error appears to be caused by the Export Data module generating an incorrect blob URL with SAS from the account information. I suspect the module code is old and not compatible with the new V2 storage API or API version. You can report it at feedback.azure.com.
However, when I switched to the SAS authentication type and entered a blob URL with a SAS query string for my container, generated via the Azure Storage Explorer tool as below, it worked fine.
Fig 1: Right-click the container of your Blob Storage account and click Get Shared Access Signature.
Fig 2: Enable the Write permission (UTC timezone recommended) and click the Create button.
Fig 3: Copy the Query string value and build a blob URL with the container SAS query string, like https://<account name>.blob.core.windows.net/<container name>/<blob name><query string>
Note: The blob must not already exist in the container, otherwise an Error 0057 will be raised.
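If you prefer to build that SAS URL in code rather than in Storage Explorer, here is a minimal sketch with the azure-storage-blob (v12) SDK; the account name, key, container and blob names are placeholders, and this only illustrates the URL format from Fig 3:

from datetime import datetime, timedelta, timezone
from azure.storage.blob import generate_container_sas, ContainerSasPermissions

account_name = "<account name>"      # placeholder
account_key = "<account key>"        # placeholder
container_name = "<container name>"  # placeholder
blob_name = "<blob name>"            # must not already exist (see note above)

# Container-level SAS with Write permission, expiring in 1 day (UTC).
sas = generate_container_sas(
    account_name=account_name,
    container_name=container_name,
    account_key=account_key,
    permission=ContainerSasPermissions(write=True),
    expiry=datetime.now(timezone.utc) + timedelta(days=1),
)

# Blob URL with the container SAS query string, as in Fig 3.
url = f"https://{account_name}.blob.core.windows.net/{container_name}/{blob_name}?{sas}"
print(url)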
Here is the link to the template format reference for creating an Azure Automation account.
What are sku, sku.family, and sku.capacity for an Automation account?
https://learn.microsoft.com/en-us/azure/templates/microsoft.automation/automationaccounts/jobschedules
Normally we can create an Azure Automation account without the sku.family and sku.capacity properties, but what are the possible values that can be passed for these properties when creating the ARM template?
What could sku.family be, and what is its corresponding sku.capacity?
What is the use of sku.family and sku.capacity in general for Azure resources?
In the link you mentioned, we can find the Sku object, as in the picture.
We can also get information about an automation account; for a Free SKU, the response looks like this:
"properties":{
"sku":{
"name":"Free",
"family":null,
"capacity":null
},
Moreover, you can get more details from this relevant schema for Automation
https://raw.githubusercontent.com/Azure/azure-resource-manager-schemas/master/schemas/2015-10-31/Microsoft.Automation.json
Hope this helps.
In Azure Automation, sku.family and sku.capacity are not required and are more or less placeholders; "null" is accepted.
sku.name is the only required property, of type SkuNameEnum, with two possible values: "Basic" and "Free".
Here are two sample responses showing all three Sku object properties; you will notice family and capacity are both null.
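To illustrate, here is a hedged sketch with the azure-mgmt-automation Python SDK, assuming its Sku model mirrors the REST object above; the subscription, resource group, account name, and location are placeholders:

from azure.identity import DefaultAzureCredential
from azure.mgmt.automation import AutomationClient
from azure.mgmt.automation.models import AutomationAccountCreateOrUpdateParameters, Sku

client = AutomationClient(DefaultAzureCredential(), "<subscription-id>")

# Only sku.name is required; family and capacity can be left as None (null).
account = client.automation_account.create_or_update(
    resource_group_name="<resource-group>",
    automation_account_name="<automation-account>",
    parameters=AutomationAccountCreateOrUpdateParameters(
        location="<location>",
        sku=Sku(name="Free"),  # or "Basic"
    ),
)
print(account.sku.name, account.sku.family, account.sku.capacity)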
For your reference, there's a public-facing API Browser page (in preview) that enumerates all supported REST APIs, including API properties, descriptions, usage, and response samples. Simply enter the Azure service you would like to explore in the dropdown and drill through the listed APIs. In most cases, it includes a "Try it" workflow to test the API call.
Hope this helps.
I am trying to create an ARM template to deploy an Azure Log Analytics workspace. The template works fine, except it needs to know which SKUs are valid for the target subscription: PerGB2018 for new subscriptions, or one of the older SKUs for non-migrated subscriptions.
Pricing models are detailed here:
https://learn.microsoft.com/en-gb/azure/monitoring-and-diagnostics/monitoring-usage-and-estimated-costs#new-pricing-model-and-operations-management-suite-subscription-entitlements
Available SKUs for workspace creation are listed here:
https://learn.microsoft.com/en-us/rest/api/loganalytics/workspaces/createorupdate
I don't know how to identify which ones are valid for the specific subscription prior to deployment, and I end up with errors and failing deployments when the default I pick is not valid. I cannot assume the person or system calling the template will understand, or have access to, the correct set of pricing SKUs. PerGB2018 cannot be used on non-migrated subscriptions, so it cannot be my default.
Can anyone share a method for determining which SKUs will work BEFORE trying to deploy and thus avoiding an error? I have checked the Monitor and Billing APIs in case it is listed there but cannot see anything, and the network calls from the portal page don't offer much insight :(
My preference is to avoid PowerShell as the rest of the deployment uses BASH to request deployment information and build out the parameter files.
Thank You
Inevitably, after asking the question I have had a breakthrough. The Bash script below uses Azure CLI 2 to get an AAD access token and store it in token. Next we grab the subscription ID and store it in subscriptionId.
Once we have the subscription ID and a valid access token, we use curl to call an API endpoint which returns the date of migration to the new pricing model.
token=$(az account get-access-token | jq ".accessToken" -r)
subscriptionId=$(az account show | jq ".id" -r)
optedIn=$(curl -s -X POST -H "Authorization:Bearer $token" -H "Content-Length:0" "https://management.azure.com/subscriptions/$subscriptionId/providers/microsoft.insights/listmigrationdate?api-version=2017-10-01" | jq ".optedInDate" -r)
My understanding is that a null value for optedInDate means the subscription is still on the legacy pricing SKUs.
Shout if you disagree or have a better answer!