How to get/set a parameterized connection string from Key Vault in Azure Data Factory?

I have a parameterized connection string in Azure Data Factory Linked Services as below:
Integrated Security=False;Encrypt=True;Connection Timeout=30;Data Source=xyz;User ID=admin;Password=password;Initial Catalog=#{linkedService().LSDBName};
The value for database is passed from a pipeline variable at runtime.
I want to save this connection string to Azure Key Vault. The issue is that after the value is read from the key vault, the linked service parameter "LSDBName" is no longer dynamically replaced with the actual value, and ADF tries to connect to "#{linkedService().LSDBName}" as the literal database name.
Is there any way to secure a dynamically parameterized connection string in key vault? Or a workaround to achieve this?
Thanks!

If you want to store the entire connection string in Key Vault, then you have to store the connection string in the "Server=myServerAddress;Database=myDataBase;User Id=myUsername;Password=myPassword;" format. Create a separate connection string for each database, store each one in Key Vault as a different secret, and then create a parameterized linked service in ADF that takes the secret name as a parameter.
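As a rough sketch of that setup (all secret and parameter names below are illustrative, not from the original post), you could keep one secret per database:
sql-connstr-SalesDB = Server=xyz;Database=SalesDB;User Id=admin;Password=<password>;
sql-connstr-HrDB    = Server=xyz;Database=HrDB;User Id=admin;Password=<password>;
Then give the Azure SQL linked service a string parameter such as secretName and, assuming the secret name field of the Key Vault reference accepts dynamic content in your setup, set it to @{linkedService().secretName} so the pipeline can pass sql-connstr-SalesDB or sql-connstr-HrDB at runtime.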

My idea is to use a Set Variable activity plus an Azure Function activity.
The first step is to use a Set Variable activity to get the linked service connection string.
The second step is to pass that variable as a parameter into an Azure Function activity, and then use the Azure Key Vault SDK inside the Azure Function to store the connection string value, as in the sketch below.
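Inside the function, the write to Key Vault could look roughly like this C# sketch (the function name, vault URL, secret name and the "connectionString" body property are placeholders; it assumes the Azure.Security.KeyVault.Secrets and Azure.Identity packages and a managed identity with "set" permission on the vault):

using System;
using System.Text.Json;
using System.Threading.Tasks;
using Azure.Identity;
using Azure.Security.KeyVault.Secrets;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class StoreConnectionString
{
    [FunctionName("StoreConnectionString")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req)
    {
        // The ADF Azure Function activity posts a JSON body; here we assume it
        // carries the resolved connection string in a "connectionString" property.
        using JsonDocument body = await JsonDocument.ParseAsync(req.Body);
        string connectionString = body.RootElement.GetProperty("connectionString").GetString();

        // Authenticate with the function's managed identity and write the secret.
        var client = new SecretClient(
            new Uri("https://<your-vault-name>.vault.azure.net/"),
            new DefaultAzureCredential());
        await client.SetSecretAsync("adf-connection-string", connectionString);

        // ADF expects a JSON object back from an Azure Function activity.
        return new OkObjectResult(new { status = "stored" });
    }
}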
Incidentally, I think your connection string is already parameterized, so the security concern is already addressed. You don't have to store it in AKV again, because in ADF we mostly read private information from AKV rather than write information into it. Just my own opinion.

Related

Creating Multiple Environment Parameters for Azure Data Factory Linked Services

I have a requirement where I need to point our DEV Azure Data Factory to a Production Azure SQL database and also have the ability to switch the data source back to the Dev database should we need to.
I've been looking at creating parameters against the linked services but unsure of the best approach.
Should I create parameters as follows and choose the relevant parameters depending on the environment I want to pull data from?
DevFullyQualifiedDomainName
ProdFullyQualifiedDomainName
DevDatabaseName
ProdDatabaseName
DevUserName
ProdUserName
Thanks
Any sort of trigger can also have parameters attached to it. Check out the following example, assuming you have a custom event trigger and SQL Server as a source:
Create a string parameter for the database name field when setting up the SQL Server linked service.
Create a new parameter in the dataset and assign it to that linked service parameter; this dataset parameter is what will carry the trigger data.
A custom event trigger can parse and deliver a custom data payload to your pipeline. You define the pipeline parameters and then populate their values on the trigger's Parameters page. To parse the data payload and pass values to the pipeline parameters, use the format @triggerBody().event.data.keyName.
See the Microsoft documentation:
Reference trigger metadata in pipelines
System variables in custom event trigger
When you use the dataset in a pipeline activity's source, it will prompt you for the dataset parameter. There, use dynamic content and select the pipeline parameter that holds the trigger data; a rough sketch of the expressions follows below.
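A rough sketch of how the pieces could line up (all names here are illustrative, not from the original post):
Linked service parameter:  dbName, used as the database name in the connection
Dataset parameter:         dsDbName, mapped to the linked service's dbName
Pipeline parameter:        databaseName
Trigger Parameters page:   databaseName = @triggerBody().event.data.databaseName
Activity source (dynamic content): dsDbName = @pipeline().parameters.databaseName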
I would suggest using Azure Key Vault for that.
1. Create an Azure Key Vault for each environment (dev, prod, etc.).
2. Create secrets inside both key vaults with the same name but different values. For example, for the database server, create the same secret "database-server" in both the dev and prod key vaults, with the value in each vault being the connection string of the respective server, in the following format:
integrated security=False;encrypt=True;connection timeout=30;data source=<serverName>.database.windows.net;initial catalog=<databaseName>;user id=<userName>;password=<loginPassword>
3. In your Azure Data Factory, create a Key Vault linked service pointing to your key vault.
4. In your Azure Data Factory, create a new Azure SQL Database linked service, selecting the Key Vault linked service created in step 3 and the secret created in step 2.
Now you can easily switch between dev and prod by simply pointing your Key Vault linked service at the desired environment's vault.
Have fun ;)
Reference:
https://learn.microsoft.com/en-us/azure/data-factory/store-credentials-in-key-vault

Building Eventhub connection string with KeyVault secret

I am trying to create an Azure Function app with an Event Hub trigger. The thing is that to connect to Azure Event Hubs, I don't have the full connection string; instead, I have the Event Hub SAS token stored as a secret in a Key Vault.
I would like to know if in the App Settings section there is a way to build the connection string by passing that token that I get from the key vault.
I have a variable KEYVAULT_SAS_SECRET whose value I want to use in another variable within the App Setting.
Would it be possible to reference the KEYVAULT_SAS_SECRET variable to construct the connection string that is stored in a second variable?
Something like this:
Endpoint=sb://some-namespace.servicebus.windows.net/;SharedAccessKeyName=policy;SharedAccessKey=[KEYVAULT_SAS_SECRET];EntityPath=eventhub-topic.
Thank you very much in advance
Unfortunately, it is not possible to reference one Azure Function app setting from within another app setting.
In your particular case you have to construct the Event Hub connection string from the two app settings directly in your code, for example as sketched below.
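A minimal C# sketch of that composition, assuming the SAS key is surfaced to the app as an app setting named KEYVAULT_SAS_SECRET (for example via a Key Vault reference) and using the namespace/policy/entity values from your example:

using System;

public static class EventHubConnection
{
    // Compose the connection string at runtime from the app setting that holds
    // the SAS key; the other segments are the placeholders from the question.
    public static string Build()
    {
        string sasKey = Environment.GetEnvironmentVariable("KEYVAULT_SAS_SECRET");
        return "Endpoint=sb://some-namespace.servicebus.windows.net/;" +
               "SharedAccessKeyName=policy;" +
               $"SharedAccessKey={sasKey};" +
               "EntityPath=eventhub-topic";
    }
}

Note that an Event Hub trigger's Connection property expects the name of an app setting, so a composed string like this is mainly useful for Event Hub clients you create yourself in code.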

Retrieve COSMOS Connection String or Primary Key in javascript running in azure pipeline

I have created an Azure pipeline using the classic editor, and it executes a test.js file. I need to retrieve the Azure Cosmos DB key so that it can be used in the js file.
I tried installing the Cosmos DB Key Retriever extension, but it doesn't show an ADD option in the pipeline.
How can this be resolved? How can the Cosmos key be fetched within the js file?
We strongly suggest using a config.js file to set your app's configurations, including the PRIMARY KEY of Azure Cosmos DB. Check related official documents here: #1, #2, #3.
It seems that you want to avoid writing the key directly in code; in that case you can consider the following:
1. Copy the primary key from the Keys page of your Cosmos DB account in the Azure portal, then create a variable group in Azure DevOps Pipelines to store that value. (Change the variable type to secret!)
Alternatively, you can host that value in Azure Key Vault and then link secrets from an Azure key vault into the current variable group (if you don't want to store the value in the variable group directly).
2. Link the variable group to your current classic pipeline.
3. Then you can use the Replace Tokens task to insert the value of your primary key into the config.js or xx.js file. Run your other tasks after this task so that they can use the key in your js file.
Assuming I have the variable TheKeyOfCosmos to store the primary key,
then specify this format in the config.js file:
key: "#{TheKeyOfCosmos}#",
After running that task, the content in config.js would contain the real key: key: "xxxxxxxxxxxxxxxx",
After the above steps you can test/run your app with the primary key.

Grouping secrets in azure key-vault

I am trying to store secrets in Azure Key Vault. I used the Azure SDK APIs and I can successfully store/retrieve secrets with them. I wanted to know if it's possible to categorise/group a set of secrets under the same tag and store them under some path.
I want to group the secrets used by one service and store them under one storage path, and do the same for other services with separate paths. I couldn't find any way of doing that. Is that possible in Azure Key Vault?
In short: no, this is not possible.
Also: you cannot get secrets and their values in a list. If you want to get a list, you'll only get a SecretItem array and you have to call GetSecret on each secret you want to get the actual value for.
You could, however, implement something like this yourself by defining a template for the name of the secret that incorporates the name of the thing you would like to group on. Something like this:
$"{serviceName}-secrets-{secretName}"
This way, you can filter the list to only hold the secrets for the service you want to get them for and get their values.
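A minimal sketch of that filtering with the newer Azure.Security.KeyVault.Secrets client (the vault URL and the "billing" service name are illustrative; the older SDK's SecretItem listing works the same way conceptually):

using System;
using Azure.Identity;
using Azure.Security.KeyVault.Secrets;

class GroupedSecrets
{
    static void Main()
    {
        var client = new SecretClient(
            new Uri("https://<your-vault-name>.vault.azure.net/"),
            new DefaultAzureCredential());

        string serviceName = "billing";

        // List secret metadata, keep the ones matching the naming convention,
        // then fetch each value individually.
        foreach (SecretProperties props in client.GetPropertiesOfSecrets())
        {
            if (!props.Name.StartsWith($"{serviceName}-secrets-")) continue;

            KeyVaultSecret secret = client.GetSecret(props.Name);
            Console.WriteLine($"{secret.Name} = {secret.Value}");
        }
    }
}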

Implement Key Vault in .NetCore and convert the values to a model in Startup.cs .NetCore application

I have implemented the KeyVault in my .NetCore application. In the Startup.cs, I can get the values using:
var key1Value = Configuration["Key1"];
I want to read all the Key Vault values at once and bind them to a class model, so that the model can be passed to all the services.
My requirement is to avoid writing Configuration["Key"] throughout the application, and instead pass the model to the services using dependency injection.
The Azure Key Vault configuration provider gives an option of reading configuration values into an array for binding to a POCO array.
In general, configuration keys use : as a separator, but Azure Key Vault secret names do not support colons; use a double dash -- instead.
Check out the "bind an array to a class" section in this link and here.
