Azure Data Factory - Dynamic Account information - Parameterization of Connection

The documentation demonstrates how to create a parameter for a linked service, but not how to actually pass that parameter in from a dataset or activity. Basically, the connection string comes from a Lookup/ForEach loop and I want to connect to a storage table.
The linked service looks like this; the connection test works when a correct parameter is passed in:
{
  "name": "StatsStorage",
  "properties": {
    "type": "AzureTableStorage",
    "parameters": {
      "connectionString": {
        "type": "String"
      }
    },
    "annotations": [],
    "typeProperties": {
      "connectionString": "#{linkedService().connectionString}"
    }
  }
}
The dataset is the following. I'm struggling to determine how to set the connectionString parameter for the connection. The dataset has two parameters: the connection string from the database and the table name it needs to connect to:
{
  "name": "TestTable",
  "properties": {
    "linkedServiceName": {
      "referenceName": "StatsStorage",
      "type": "LinkedServiceReference"
    },
    "parameters": {
      "ConnectionString": {
        "type": "string"
      },
      "TableName": {
        "type": "string"
      }
    },
    "annotations": [],
    "type": "AzureTable",
    "schema": [],
    "typeProperties": {
      "tableName": {
        "value": "#dataset().TableName",
        "type": "Expression"
      }
    }
  }
}
How do I set the connection string on the connection?

First, you can't make the whole connection string an expression. You need to provide accountName and accountKey separately. Refer to this post for how to do it: How to provide connection string dynamically for azure table storage/blob storage in Azure data factory Linked service
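For reference, the approach in that post is roughly the following: instead of a single connectionString parameter, the linked service takes the account name and key as separate parameters and builds the connection string from them. A minimal sketch (the parameter names accountName and accountKey are illustrative, not taken from the original question):
{
  "name": "StatsStorage",
  "properties": {
    "type": "AzureTableStorage",
    "parameters": {
      "accountName": { "type": "String" },
      "accountKey": { "type": "String" }
    },
    "typeProperties": {
      "connectionString": "DefaultEndpointsProtocol=https;AccountName=@{linkedService().accountName};AccountKey=@{linkedService().accountKey}"
    }
  }
}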
Then, if you are using the ADF UI, it will guide you through providing values for the linked service. For example, if you have two dataset parameters, you could specify them as follows.
If you want to see the JSON code, you can click the code icon in the top-left corner.
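For instance, a hedged sketch of what that JSON view might show for a dataset that forwards its own parameters down to the linked service (assuming the linked service is parameterized with accountName and accountKey as sketched above; the dataset passes values through linkedServiceName.parameters):
{
  "name": "TestTable",
  "properties": {
    "linkedServiceName": {
      "referenceName": "StatsStorage",
      "type": "LinkedServiceReference",
      "parameters": {
        "accountName": { "value": "@dataset().AccountName", "type": "Expression" },
        "accountKey": { "value": "@dataset().AccountKey", "type": "Expression" }
      }
    },
    "parameters": {
      "AccountName": { "type": "string" },
      "AccountKey": { "type": "string" },
      "TableName": { "type": "string" }
    },
    "type": "AzureTable",
    "typeProperties": {
      "tableName": { "value": "@dataset().TableName", "type": "Expression" }
    }
  }
}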
I am using Azure Blob as an example, but Azure Table is almost the same.
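And since the values in the original question come from a Lookup feeding a ForEach, the activity inside the ForEach would typically pass the current item's fields into those dataset parameters; an illustrative fragment (the @item() field names are hypothetical and depend on what the Lookup returns):
"inputs": [
  {
    "referenceName": "TestTable",
    "type": "DatasetReference",
    "parameters": {
      "AccountName": "@item().AccountName",
      "AccountKey": "@item().AccountKey",
      "TableName": "@item().TableName"
    }
  }
]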
Hope it could help.

Related

How to reference Keyvault Secret Tags from ARM template

I have an ARM template which syncs a secret value from a source Key Vault into a destination one.
I also want to sync the secret's tags, but the ARM reference that I use to retrieve 'sourceKV.secret.tags' does not work:
[reference(resourceId('subscriptionId', 'resourceGroup', 'Microsoft.KeyVault/vaults/secrets', 'SourceKV', 'Secret'), '2021-04-01-preview', 'Full').tags.tagName]
Any ideas what the issue could be, or what the correct form is to retrieve tags during ARM template deployment?
These work for me:
"outputs": {
"tags": {
"type": "string",
"value": "[reference('/subscriptions/xxxx/resourceGroups/yyyy/providers/Microsoft.KeyVault/vaults/zzzz/secrets/mysecret', '2022-07-01', 'Full').tags]"
},
"tagValue": {
"type": "string",
"value": "[reference('/subscriptions/xxxx/resourceGroups/yyyy/providers/Microsoft.KeyVault/vaults/zzzz/secrets/mysecret', '2022-07-01', 'Full').tags.hello]"
},
"tagValue2": {
"type": "string",
"value": "[reference(resourceId(subscription().subscriptionId, resourceGroup().name, 'Microsoft.KeyVault/vaults/secrets', 'xxxx', 'mysecret'), '2021-04-01-preview', 'Full').tags.hello]"
}
}
Will result in:
"outputs": {
"tagValue": {
"type": "String",
"value": "world"
},
"tagValue2": {
"type": "String",
"value": "world"
},
"tags": {
"type": "Object",
"value": {
"hello": "world"
}
}
}
It also works with the API version you used. It is important that you use 'Full', otherwise you won't get the tags. Note that you can use this syntax anywhere in your template; I just used it in the outputs because that is good for testing.
As I found out, it is not possible to use the reference function to set the tags property value for a Key Vault secret, as the valid usages state: the reference function only works inside a resource's properties block or in outputs. Since tags are not part of properties, instead of returning the value the reference function just returns the literal string "reference(resource...)".

Data factory SQL connection String with keyvault

I exported an ARM template from Data Factory V2. When importing the template it asks me to manually enter the SQL database connection string. To minimize the human interaction I made the following changes.
{
  "name": "[concat(parameters('factoryName'), '/myFactory')]",
  "type": "Microsoft.DataFactory/factories/linkedServices",
  "apiVersion": "2018-06-01",
  "properties": {
    "type": "AzureSqlDatabase",
    "typeProperties": {
      "connectionString": "[concat('Server=tcp:',parameters('sqlServerName'),'.database.windows.net,1433;Initial Catalog=', parameters('sqlDatabaseName'), ';Persist Security Info=False;User ID=',parameters('sqlServerUserName'),';Password=(password)',';MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30')]",
      "password": {
        "type": "AzureKeyVaultSecret",
        "store": {
          "referenceName": "AzureKeyVault1",
          "type": "LinkedServiceReference"
        },
        "secretName": "sql-password"
      }
    }
  },
  "dependsOn": [
    "[concat(variables('factoryId'), '/linkedServices/AzureKeyVault1')]"
  ]
},
Currently, when deploying to Data Factory V2 and testing the connection to this SQL server, I get:
Cannot connect to SQL Database: 'tcp:mysqlserver.database.windows.net,1433',
Database: 'mydatabase', User: 'admin'. Check the linked service configuration
is correct, and make sure the SQL Database firewall allows the integration runtime to access.
Login failed for user 'admin'., SqlErrorNumber=18456,
If I manually input all the connection details in the portal UI, I can easily connect to the database and test successfully, so it is not a firewall issue.
So I think there could be two issues:
1. How the password from Key Vault is consumed in the connection string. I didn't find much information about this online.
2. When I open the created SQL linked service, I notice the fully qualified domain name is missing; if I manually add it in, the connection works.
[Screenshot: the SQL connection UI]
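(For what it's worth, the usual fix for the first issue is to leave the Password token out of the connection string entirely and let the separate Key Vault-backed password property supply it. A hedged sketch reusing the parameter names above, not a verified template:)
"typeProperties": {
  "connectionString": "[concat('Server=tcp:', parameters('sqlServerName'), '.database.windows.net,1433;Initial Catalog=', parameters('sqlDatabaseName'), ';User ID=', parameters('sqlServerUserName'), ';MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30')]",
  "password": {
    "type": "AzureKeyVaultSecret",
    "store": {
      "referenceName": "AzureKeyVault1",
      "type": "LinkedServiceReference"
    },
    "secretName": "sql-password"
  }
}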
Throwing this out as an alternative answer/approach: store the connection string in its entirety in Key Vault. If you do this, the reference would look like:
{
  "name": "[concat(parameters('factoryName'), '/',parameters('connectionNameAdventureWorks'))]",
  "type": "Microsoft.DataFactory/factories/linkedServices",
  "apiVersion": "2018-06-01",
  "properties": {
    "annotations": [],
    "type": "AzureSqlDatabase",
    "typeProperties": {
      "connectionString": {
        "type": "AzureKeyVaultSecret",
        "store": {
          "referenceName": "[variables('azkDataAnalyticsReferenceName')]",
          "type": "LinkedServiceReference"
        },
        "secretName": "[variables('azkAdventureWorksSecretName')]"
      }
    }
  },
  "dependsOn": [
    "[concat(variables('factoryId'), '/linkedServices/',variables('azkDataAnalyticsReferenceName'))]"
  ]
}
An even more secure approach would be to add the Data Factory as a managed identity and then run a SQL script to add that user to the database. If you do this, there is no need for any credentials to be passed at all.
One downside is that if the Data Factory is deleted and recreated, the managed identity permissions would need to be reassigned to the SQL database.
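For illustration, the managed-identity variant of the linked service would then need no credential block at all; a minimal sketch under the assumption that the factory's identity has already been created as a user in the database (the linked service name AzureSqlDatabaseMsi is made up; the server/database parameter names are reused from above):
{
  "name": "[concat(parameters('factoryName'), '/AzureSqlDatabaseMsi')]",
  "type": "Microsoft.DataFactory/factories/linkedServices",
  "apiVersion": "2018-06-01",
  "properties": {
    "type": "AzureSqlDatabase",
    "typeProperties": {
      "connectionString": "[concat('Server=tcp:', parameters('sqlServerName'), '.database.windows.net,1433;Initial Catalog=', parameters('sqlDatabaseName'), ';Encrypt=True;Connection Timeout=30')]"
    }
  }
}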

Index Out of Range Error When Creating SnowFlake Linked Service in Azure Data Factory

I am passing the required credentials and parameters, but I get the error:
The value of the property 'index' is invalid: 'Index was out of range.
Must be non-negative and less than the size of the collection.
Parameter name: index'. Index was out of range. Must be non-negative
and less than the size of the collection. Parameter name: index
Activity ID: 36a4265d-3607-4472-8641-332f5656661d.
I had the same issue: the password contained a ' and that was causing the trouble. I changed the password to one with no symbols and it works like a charm.
It seems the UI doesn't generate the linked service correctly. Using the Microsoft Docs example JSON, I received the same index error when attempting to create the linked service. If I remove the password from the connection string and add it as a separate property, I am able to successfully generate the linked service.
Microsoft Docs Example (Doesn't Work)
{
  "name": "SnowflakeLinkedService",
  "properties": {
    "type": "Snowflake",
    "typeProperties": {
      "connectionString": "jdbc:snowflake://<accountname>.snowflakecomputing.com/?user=<username>&password=<password>&db=<database>&warehouse=<warehouse>&role=<myRole>"
    },
    "connectVia": {
      "referenceName": "<name of Integration Runtime>",
      "type": "IntegrationRuntimeReference"
    }
  }
}
Working Example
{
  "name": "SnowflakeLinkedService",
  "properties": {
    "type": "Snowflake",
    "typeProperties": {
      "connectionString": "jdbc:snowflake://<accountname>.snowflakecomputing.com/?user=<username>&db=<database>&warehouse=<warehouse>",
      "password": {
        "type": "SecureString",
        "value": "<password>"
      }
    },
    "connectVia": {
      "referenceName": "<name of Integration Runtime>",
      "type": "IntegrationRuntimeReference"
    }
  }
}
We hit this same issue today; it was because our password had an ampersand (&) at the end. This seemed to mess up the connection string, as it contained this:
&password=abc123&&role=MyRole
Changing the password so it did not include an ampersand fixed it.

Stream Analytics Job deployed as Azure Resource Manager (ARM) template

I am trying to set up an output Event Hub for a Stream Analytics Job defined as a JSON template. Without the output section the template deploys successfully, but when I add the output definition it fails with:
Deployment failed. Correlation ID: <SOME_UUID>. {
  "code": "BadRequest",
  "message": "The JSON provided in the request body is invalid. Property 'eventHubName' value 'parameters('eh_name')' is not acceptable.",
  "details": {
    "code": "400",
    "message": "The JSON provided in the request body is invalid. Property 'eventHubName' value 'parameters('eh_name')' is not acceptable.",
    "correlationId": "<SOME_UUID>",
    "requestId": "<SOME_UUID>"
  }
}
I've defined the ARM template as:
{
  "$schema": "http://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "location": {
      "type": "string",
      "defaultValue": "westeurope"
    },
    "hubName": {
      "type": "string",
      "defaultValue": "fooIotHub"
    },
    "eh_name": {
      "defaultValue": "fooEhName",
      "type": "String"
    },
    "eh_namespace": {
      "defaultValue": "fooEhNamespace",
      "type": "String"
    },
    "streamAnalyticsJobName": {
      "type": "string",
      "defaultValue": "fooStreamAnalyticsJobName"
    }
  },
  "resources": [{
    "type": "Microsoft.StreamAnalytics/StreamingJobs",
    "apiVersion": "2016-03-01",
    "name": "[parameters('streamAnalyticsJobName')]",
    "location": "[resourceGroup().location]",
    "properties": {
      "sku": {
        "name": "standard"
      },
      "outputErrorPolicy": "Drop",
      "eventsOutOfOrderPolicy": "adjust",
      "eventsOutOfOrderMaxDelayInSeconds": 0,
      "eventsLateArrivalMaxDelayInSeconds": 86400,
      "inputs": [{
        "Name": "IoTHubInputLable",
        "Properties": {
          "DataSource": {
            "Properties": {
              "iotHubNamespace": "[parameters('hubName')]",
              "sharedAccessPolicyKey": "[listkeys(resourceId('Microsoft.Devices/IotHubs/IotHubKeys',parameters('hubName'), 'iothubowner'),'2016-02-03').primaryKey]",
              "sharedAccessPolicyName": "iothubowner",
              "endpoint": "messages/events"
            },
            "Type": "Microsoft.Devices/IotHubs"
          },
          "Serialization": {
            "Properties": {
              "Encoding": "UTF8"
            },
            "Type": "Json"
          },
          "Type": "Stream"
        }
      }],
      "transformation": {
        "name": "Transformation",
        "properties": {
          "streamingUnits": 1,
          "query": "<THE SQL-LIKE CODE FOR THE JOB QUERY>"
        }
      },
      "outputs": [{
        "name": "EventHubOutputLable",
        "properties": {
          "dataSource": {
            "type": "Microsoft.ServiceBus/EventHub",
            "properties": {
              "eventHubName": "parameters('eh_name')",
              "serviceBusNamespace": "parameters('eh_namespace')",
              "sharedAccessPolicyName": "RootManageSharedAccessKey"
            }
          },
          "serialization": {
            "Properties": {
              "Encoding": "UTF8"
            }
          }
        }
      }]
    }
  }]
}
Checking https://learn.microsoft.com/en-us/azure/templates/microsoft.streamanalytics/streamingjobs, it looks like the structure of the JSON for the output matches the expected one (with the properties field alongside the type).
I figured out those "Event Hub properties" in the Chrome browser using Developer Tools, checking the details of the "GetOutputs" HTTP request; otherwise I am not sure where I could see how to specify those properties. The structure looks quite similar to the one for the IoT Hub input (which is working), just with different labels for the properties related to the IoT Hub details.
Checking this blog post https://blogs.msdn.microsoft.com/david/2017/07/20/building-azure-stream-analytics-resources-via-arm-templates-part-2-the-template/, the output part relates to Power BI and appears to use a different structure for the properties: outputPowerBISource. So for the Event Hub I tried using the field outputEventHubSource (found via the Chrome Developer Tools checks) instead of properties, but then I get this error:
Deployment failed. Correlation ID: <SOME_UUID>. {
  "code": "BadRequest",
  "message": "The JSON provided in the request body is invalid. Required property 'properties' not found in JSON. Path '', line 1, position 1419.",
  "details": {
    "code": "400",
    "message": "The JSON provided in the request body is invalid. Required property 'properties' not found in JSON. Path '', line 1, position 1419.",
    "correlationId": "<SOME_UUID>",
    "requestId": "<SOME_UUID>"
  }
}
The command I am using to deploy this template is the Azure CLI (from a Linux Debian machine):
az group deployment create \
--name "deployStreamAnalyticsJobs" \
--resource-group "MyRGName" \
--template-file ./templates/stream-analytics-jobs.json
How do I specify an output in an Azure Resource Manager (ARM) template for a Stream Analytics Job?
Any property that contains a parameter (or any expression that needs to be evaluated) must be wrapped in square brackets, e.g.
"eventHubName": "[parameters('eh_name')]",
"serviceBusNamespace": "[parameters('eh_namespace')]",
Otherwise the literal value in the quotes is used.
Does that help?
I found out that all parameters need to be wrapped in square brackets (as pointed out in the other answer to this question).
Also, to dynamically retrieve the shared access policy key (or any other property of an existing resource like the Event Hub), a combination of functions like listKeys and resourceId must be used; see below for a full example of an Event Hub described as an output for a Stream Analytics Job.
In detail:
the parameters defined for eventHubName and serviceBusNamespace must be evaluated using square brackets (see how I defined those parameters in the JSON example in the body of the question above);
the shared access policy name can be either a hardcoded string or a parameter (as before) for sharedAccessPolicyName, while the key can be dynamically retrieved using "sharedAccessPolicyKey": "[listKeys(resourceId('Microsoft.EventHub/namespaces/eventhubs/authorizationRules', parameters('eh_namespace'), parameters('eh_name'), 'RootManageSharedAccessKey'),'2017-04-01').primaryKey]" (this is sensitive data and should be protected, so avoid hardcoding it as a plain string).
The following JSON configuration shows an existing Event Hub defined as an output for the Stream Analytics Job:
"outputs": [{
"Name": "EventHubOutputLable",
"Properties": {
"DataSource": {
"Type": "Microsoft.ServiceBus/EventHub",
"Properties": {
"eventHubName": "[parameters('eh_name')]",
"serviceBusNamespace": "[parameters('eh_namespace')]",
"sharedAccessPolicyKey": "[listKeys(resourceId('Microsoft.EventHub/namespaces/eventhubs/authorizationRules', parameters('eh_namespace'), parameters('eh_name'), 'RootManageSharedAccessKey'),'2017-04-01').primaryKey]",
"sharedAccessPolicyName": "RootManageSharedAccessKey"
}
},
"Serialization": {
"Properties": {
"Encoding": "UTF8"
},
"Type": "Json"
}
}
}]

How to set the connection string for a Service Bus Logic App action in an ARM template?

I'm attempting to deploy an Azure Logic App that includes an action to Send a message on a Service Bus using an ARM template.
In addition to deploying the Logic App, the ARM template deploys a Service Bus Namespace, a Queue and two AuthorizationRule (one for sending and one for listening).
I want to dynamically set the connection information for the Send Service Bus Message action to use the Connection string generated for the AuthorizationRule that supports sending.
When I create this in the portal editor (specifying the connection string for sending), I noticed the following is generated in code view...
"Send_message.": {
"conditions": [
{
"dependsOn": "<previous action>"
}
],
"inputs": {
"body": {
"ContentData": "#{encodeBase64(triggerBody())}"
},
"host": {
"api": {
"runtimeUrl": "https://logic-apis-westus.azure-apim.net/apim/servicebus"
},
"connection": {
"name": "#parameters('$connections')['servicebus']['connectionId']"
}
},
"method": "post",
"path": "/#{encodeURIComponent(string('<queuename>'))}/messages"
},
"type": "apiconnection"
}
},
I assume that the connection information is somehow buried in #parameters('$connections')['servicebus']['connectionId']
I then used resources.azure.com to navigate to the Logic App to see if I could get more details on how #parameters('$connections')['servicebus']['connectionId'] is defined.
I found this:
"parameters": {
"$connections": {
"value": {
"servicebus": {
"connectionId": "/subscriptions/<subguid>/resourceGroups/<rgname>/providers/Microsoft.Web/connections/servicebus",
"connectionName": "servicebus",
"id": "/subscriptions/<subguid>/providers/Microsoft.Web/locations/westus/managedApis/servicebus"
}
}
}
}
But I still don't see where the connection string is set.
Where can I set the connection string for the service bus action in an ARM template using something like the following?
[listkeys(variables('sendAuthRuleResourceId'), variables('sbVersion')).primaryConnectionString]
EDIT: Also, I've referred to what seems to be a promising Azure quick start on GitHub (based on the title), but I can't make any sense of it. It appears to use an older schema, 2014-12-01-preview, and the "queueconnector" references an API Gateway. If there is a newer example out there for this scenario, I'd love to see it.
I've recently worked on an ARM template for the deployment of Logic Apps and a Service Bus connection. Here is a sample template for configuring the Service Bus connection string within the "Microsoft.Web/connections" resource type. Hope it helps.
{
  "type": "Microsoft.Web/connections",
  "apiVersion": "2016-06-01",
  "name": "[parameters('connections_servicebus_name')]",
  "location": "centralus",
  "dependsOn": [
    "[resourceId('Microsoft.ServiceBus/namespaces/AuthorizationRules', parameters('ServiceBusNamespace'), 'RootManageSharedAccessKey')]"
  ],
  "properties": {
    "displayName": "ServiceBusConnection",
    "customParameterValues": {},
    "api": {
      "id": "[concat(subscription().id, '/providers/Microsoft.Web/locations/centralus/managedApis/servicebus')]"
    },
    "parameterValues": {
      "connectionString": "[listKeys(resourceId('Microsoft.ServiceBus/namespaces/authorizationRules', parameters('ServiceBusNamespace'), 'RootManageSharedAccessKey'), '2017-04-01').primaryConnectionString]"
    }
  }
}
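To actually consume that connection from the Logic App resource in the same template, the workflow's $connections parameter would typically point at the Microsoft.Web/connections resource above. A hedged sketch reusing the parameter name from that template (the surrounding Logic App resource is omitted):
"parameters": {
  "$connections": {
    "value": {
      "servicebus": {
        "id": "[concat(subscription().id, '/providers/Microsoft.Web/locations/centralus/managedApis/servicebus')]",
        "connectionId": "[resourceId('Microsoft.Web/connections', parameters('connections_servicebus_name'))]",
        "connectionName": "[parameters('connections_servicebus_name')]"
      }
    }
  }
}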
As you know, connections is a resource, so it needs to be created first. Did you refer to this: https://blogs.msdn.microsoft.com/logicapps/2016/02/23/deploying-in-the-logic-apps-preview-refresh/? The quick start link you are referring to is for the older schema.
