Does the linked service support dynamic JSON in Azure Data Factory?

Currently, I'm trying to set up a dynamic Key Vault linked service.
Unfortunately, whatever I try, I'm not able to successfully test the connection.
{
  "name": "AzureKeyVault1",
  "properties": {
    "type": "AzureKeyVault",
    "typeProperties": {
      "baseUrl": {
        "value": "@concat('https://vault.azure.net','/')",
        "type": "Expression"
      }
    }
  }
}
The above code using concat is not a real use case, just a way to test whether dynamic JSON is possible for a linked service.
I was expecting (based on the documentation) that I could make the baseUrl property dynamic. Am I using the right formatting?
I get the following error:
Error: Error: Can't get property concat

Wim, based on the official documentation, parameterizing linked services is only supported for a limited set of services, which does not include Azure Key Vault.
You could submit feedback here to push for the Azure Data Factory improvements you want.

@Wim
To answer your question: yes, this is the correct formatting.
I have been able to parameterize a pipeline parameter and then pass it as the baseUrl in a dynamic expression.
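For illustration, a rough sketch of what a parameterized linked service definition generally looks like in ADF. The vaultName parameter is invented here, and (as noted in the other answer) only certain connector types supported linked service parameters at the time, so treat this as a shape reference rather than a working Key Vault definition:
```json
{
  "name": "AzureKeyVault1",
  "properties": {
    "type": "AzureKeyVault",
    "parameters": {
      "vaultName": { "type": "String" }
    },
    "typeProperties": {
      "baseUrl": {
        "value": "@concat('https://', linkedService().vaultName, '.vault.azure.net/')",
        "type": "Expression"
      }
    }
  }
}
```
The linkedService().parameterName syntax is how a linked service references its own parameters in an expression.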

Related

Dynamic Email attachment using Logic Apps via Data Factory

I need to build a generic Logic App with which I can send mail with an attachment.
Is it possible to pass the path and file name as parameters so I can use the same Logic App for different ADF pipelines?
Of course you can use a generic Logic App here. You just need to set up the "When a HTTP request is received" trigger with two parameters, which you can do by specifying its schema (shown below).
schema:
{
  "type": "object",
  "properties": {
    "path": {
      "type": "string"
    },
    "fileName": {
      "type": "string"
    }
  }
}
In the following actions of your Logic App, you can use the parameters path and fileName when you get the file from Azure Data Lake.
Then you can call the Logic App from Azure Data Factory with a "Web" activity.
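A sketch of what that "Web" activity might look like in the pipeline definition. The trigger URL and the parameter values are placeholders; only the body keys (path, fileName) come from the schema above:
```json
{
  "name": "CallLogicApp",
  "type": "WebActivity",
  "typeProperties": {
    "url": "https://prod-00.westus.logic.azure.com/workflows/<id>/triggers/manual/paths/invoke?api-version=2016-10-01",
    "method": "POST",
    "headers": { "Content-Type": "application/json" },
    "body": {
      "path": "container/folder",
      "fileName": "report.csv"
    }
  }
}
```
Because the body is plain JSON, the path and fileName values can themselves be pipeline parameters or expressions, which is what makes the Logic App reusable across pipelines.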

Not able to create a VM using azure Rest API

I am trying to create a VM using the Azure REST API, calling it through Postman.
PUT request:
https://management.dev.azure.com/subscriptions/subscriptionID/resourcegroups/ResourceGroupName/providers/Microsoft.Resources/deployementName/DetDeployment?api-version=2019-05-01
I am using the above REST API with my subscription ID and resourceGroupName.
In the Authorization section, I set the type to Basic Authentication and pass my credentials in the username and password fields. Along with this, I also pass values in the Body section.
{
  "properties": {
    "templateLink": {
      "uri": "https://mystoragename.blob.core.windows.net/templates/VMTemplate.json",
      "contentVersion": "1.0.0.0"
    },
    "parametersLink": {
      "uri": "https://mystoragename.blob.core.windows.net/templates/VMParam.json",
      "contentVersion": "1.0.0.0"
    },
    "mode": "Incremental",
    "debugSetting": {
      "detailLevel": "requestContent, responseContent"
    }
  }
}
Whenever I send this request, I get a 400 Bad Request error, and the message in the body is:
Our services aren't available right now. We're working to restore all services as soon as possible. Please check back soon. 0ddImXQAAAABmya8eHqWDRp1JX69tDGdATUFBMDFFREdFMDIyMABFZGdl
Please tell me what I am doing wrong here. I have been trying this for the last day.
Looks like your resource URL is wrong; it should be https://management.azure.com, not https://management.dev.azure.com.
PUT https://management.azure.com/subscriptions/{subscriptionId}/resourcegroups/{resourceGroupName}/providers/Microsoft.Resources/deployments/{deploymentName}?api-version=2019-05-01
Reference - Deploy resources with Resource Manager templates and Resource Manager REST API
Besides, I notice you are using Basic Authentication; I am not sure that works for the Azure REST API (I think it may not). Even if it does work, if your account is MFA-enabled you will not be able to use it.
So for authentication, I recommend you see this link on getting an access token to call the REST API. Or you could try the easiest way: click "Try it" in the doc, log in with your account, and you will be able to test the REST API just like in Postman. You can also copy the Authorization token and test it in Postman.
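As a sanity check on the two fixes above (correct host and provider path), here is a small Python sketch that builds the deployment PUT endpoint; the subscription, resource group, and deployment names are placeholders:
```python
def deployment_url(subscription_id, resource_group, deployment_name,
                   api_version="2019-05-01"):
    """Build the ARM template-deployment PUT URL.

    Note the host is management.azure.com (not management.dev.azure.com)
    and the provider path is Microsoft.Resources/deployments/<name>.
    """
    return (
        "https://management.azure.com"
        f"/subscriptions/{subscription_id}"
        f"/resourcegroups/{resource_group}"
        "/providers/Microsoft.Resources"
        f"/deployments/{deployment_name}"
        f"?api-version={api_version}"
    )

# The actual request would be sent with a Bearer token from Azure AD,
# not Basic Authentication:
#   PUT <url>
#   Authorization: Bearer <access token>
url = deployment_url("subscriptionID", "ResourceGroupName", "DetDeployment")
```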

Stream Analytics Job -> Data Lake output

I want to set up CI/CD (ARM template) for a Stream Analytics job with an output set to Data Lake Store.
https://learn.microsoft.com/en-us/azure/templates/microsoft.streamanalytics/streamingjobs/outputs#microsoftdatalakeaccounts
The issue comes with refreshToken:
"It is recommended to put a dummy string value here when creating the data source
and then going to the Azure Portal to authenticate the data source
which will update this property with a valid refresh token"
Furthermore, the refresh token expires after 90 days, at which point you need to do "Renew Authorization":
https://learn.microsoft.com/en-us/azure/stream-analytics/stream-analytics-data-lake-output#renew-data-lake-store-authorization
I tried to authorize a Service Principal.
How do I do automatic deployment for ASA with Data Lake?
How do I handle the 90-day token validity issue?
At this time it is not yet possible; sorry for the inconvenience. However, we know this is very important and we will add Service Principal auth in the near future (we are looking at the exact ETA).
In the meantime you need to renew the token manually. This can be done without losing any data by (1) stopping the job, (2) changing the token, then (3) restarting the job from the last time it was stopped.
Let me know if you have any further questions.
As far as I know, MSI-based authentication will be in preview quite soon.
But if you need an immediate solution (e.g. so a VSTS pipeline can run through without error), you can do the following:
Create a template (e.g. with the CICD NuGet package [1])
Manipulate the ARM template <jobName>.JobTemplate.json
Add the output datasource object for the ADLS output object
If you work with Visual Studio, you can get the values quite easily from the ADLS output JSON
It is important to set refreshToken to some fake value
Like the following:
"outputs": [
  {
    "name": "xxx",
    "properties": {
      "serialization": {
        "type": "Json",
        "properties": {
          "encoding": "UTF8",
          "format": "LineSeparated"
        }
      },
      "datasource": {
        "type": "Microsoft.DataLake/Accounts",
        "properties": {
          "accountName": "xxx",
          "tenantId": "xxx-xxx-xxx-xxx-xxx",
          "tokenUserPrincipalName": "xxx@xxx.com",
          "tokenUserDisplayName": "xxx, xxx",
          "filePathPrefix": "xxx/{date}/{time}",
          "dateFormat": "yyyy/MM/dd",
          "timeFormat": "HH",
          "refreshToken": "faketoken"
        }
      }
    }
  },
  ...
Deploy the ARM template
The job will start successfully, but it is still necessary to renew the token; therefore:
Stop the job
Renew the authentication of the ADLS output
Start the job
Resources
[1] CICD NuGet Package
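The template-manipulation step above can also be scripted in the pipeline. A minimal Python sketch, assuming the job template follows the outputs fragment shown earlier (everything else about the surrounding template is an assumption), that injects the fake refreshToken into every Data Lake output:
```python
import json

def fake_refresh_tokens(template_json: str) -> str:
    """Set refreshToken to a placeholder on every Data Lake output so the
    ARM deployment succeeds. The real token must still be renewed in the
    portal afterwards (stop job -> renew authorization -> start job)."""
    template = json.loads(template_json)
    for output in template.get("outputs", []):
        datasource = output.get("properties", {}).get("datasource", {})
        if datasource.get("type") == "Microsoft.DataLake/Accounts":
            datasource["properties"]["refreshToken"] = "faketoken"
    return json.dumps(template, indent=2)

# Tiny example template with one ADLS output and an empty token:
example = json.dumps({
    "outputs": [{
        "name": "adls-out",
        "properties": {
            "datasource": {
                "type": "Microsoft.DataLake/Accounts",
                "properties": {"accountName": "xxx", "refreshToken": ""}
            }
        }
    }]
})
patched = json.loads(fake_refresh_tokens(example))
```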

Azure Machine Learning: What error is this?

I am using a Classic Web Service with a non-default endpoint for an Update Resource activity in Azure Data Factory. This is the error I get:
Screenshot of Error
I didn't find any info on the web and couldn't figure it out myself. This website shows an example that I used, just filling in my values for mlEndpoint, apiKey and updateResourceEndpoint:
{
  "name": "updatableScoringEndpoint2",
  "properties": {
    "type": "AzureML",
    "typeProperties": {
      "mlEndpoint": "https://ussouthcentral.services.azureml.net/workspaces/xxx/services/--scoring experiment--/jobs",
      "apiKey": "endpoint2Key",
      "updateResourceEndpoint": "https://management.azureml.net/workspaces/xxx/webservices/--scoring experiment--/endpoints/endpoint2"
    }
  }
}
There is no mention of a token that needs to be passed...
This error is basically saying the apiKey you provided is invalid for performing the update resource operation. Here is a post for your reference: https://social.msdn.microsoft.com/Forums/azure/en-US/3bb77e37-8860-43c6-bcaa-d6ebd70617b8/retrain-predictive-web-service-programmatically-when-do-not-have-access-to-managementazuremlnet?forum=MachineLearning
Please also note that if you modify your linked service in ADF, remember to re-deploy the pipeline as well so your change is reflected in time.

Dynamically retrieving azure storage account key in ARM template

I am trying to automate creating an API Connection for a storage account in Azure using Resource Manager templates.
I am using the listKeys method in ARM to retrieve the access key of the storage account. I went through this question and it is not working for me.
When I use the method in the outputs section of the template, it is working fine and successfully retrieving and displaying the access key.
"outputs": {
  "listKeysOutput": {
    "type": "string",
    "value": "[listKeys(resourceId('Microsoft.Storage/storageAccounts', parameters('storagename')), providers('Microsoft.Storage', 'storageAccounts').apiVersions[0]).keys[0].value]"
  }
}
However, when I try to use the same function inside a connection resource (as shown below), the template executes without any error. But on accessing the API Connection from the Azure portal, it says 'parameter is missing'.
"parameterValues": {
  "accesskey": "[listKeys(resourceId('Microsoft.Storage/storageAccounts', parameters('storagename')), providers('Microsoft.Storage', 'storageAccounts').apiVersions[0]).keys[0].value]",
  "accountName": "[parameters('storagename')]"
}
Am I missing something here? Or the output of listKeys is not being accepted by the 'accesskey' property?
I had a similar experience a few months ago and resolved it by building a connection string directly and passing that into the connection. The value looked like this:
[concat('DefaultEndpointsProtocol=https;AccountName=', variables('storageConfigs')[0].name, ';AccountKey=', listKeys(resourceId('Microsoft.Storage/storageAccounts/', variables('storageConfigs')[0].name), variables('defaultStorageApiVersion')).key1)]
I used a storage config object as an input, which is why it looks like this; you could replace variables('storageConfigs')[0].name with whatever name or variable function you use in your code. In your case that looks like it would be storagename.
Two things that might be causing the issue:
Ensure the API Connection has a dependency on the storage account
Capitalise the key as "accessKey" (some things in templates are case sensitive)
@Naren, I recommend you use this API to get your storage key:
POST
https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Storage/storageAccounts/{accountName}/listKeys?api-version={api-version}
You could get the same result as the template.
{
  "keys": [
    {
      "keyName": "key1",
      "value": "key1Value",
      "permissions": "FULL"
    },
    {
      "keyName": "key2",
      "value": "key2Value",
      "permissions": "FULL"
    }
  ]
}
Just for your reference:
https://msdn.microsoft.com/en-us/library/mt163589.aspx
A dependency is indeed required, so that the storage account is already created before deployment of the API connection is initiated.
The problem with the OP's template code is the use of accesskey, while the correct parameter name is accessKey (note the capital K) for an Azure Blob API connection resource.
For anyone who struggles with the lack of documentation for the required parameters of API Connection resources, initiate this API call:
https://management.azure.com/subscriptions/<YOUR SUBSCRIPTION ID>/providers/Microsoft.Web/locations/<YOUR LOCATION>/managedApis/<API TYPE>?api-version=2016-06-01
The <API TYPE> should be the API type of the connection to check, for example azureblob, azurequeues or documentdb.
A description of all the expected parameters is returned alongside other descriptive information for that resource.
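Putting the two fixes together (an explicit dependsOn for the storage account and the capitalised accessKey), a rough sketch of the connection resource. The connection name, API version, and structure of the api.id path are illustrative and should be checked against the managedApis call above:
```json
{
  "type": "Microsoft.Web/connections",
  "apiVersion": "2016-06-01",
  "name": "azureblob-connection",
  "location": "[resourceGroup().location]",
  "dependsOn": [
    "[resourceId('Microsoft.Storage/storageAccounts', parameters('storagename'))]"
  ],
  "properties": {
    "api": {
      "id": "[concat(subscription().id, '/providers/Microsoft.Web/locations/', resourceGroup().location, '/managedApis/azureblob')]"
    },
    "parameterValues": {
      "accountName": "[parameters('storagename')]",
      "accessKey": "[listKeys(resourceId('Microsoft.Storage/storageAccounts', parameters('storagename')), providers('Microsoft.Storage', 'storageAccounts').apiVersions[0]).keys[0].value]"
    }
  }
}
```
The dependsOn entry guarantees listKeys is evaluated only after the storage account exists.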