Retrieve Cosmos DB connection string or primary key in JavaScript running in an Azure pipeline - node.js

I have created an Azure pipeline using the classic editor, and it executes a test.js file. I need to retrieve the Azure Cosmos DB key so that it can be used in the js file.
I tried installing the Cosmos DB Key Retriever extension, but it doesn't show an ADD option in the pipeline.
How can this be resolved? How can the Cosmos DB key be fetched within the js file?

We strongly suggest using a config.js file to set your app's configuration, including the PRIMARY KEY of Azure Cosmos DB. Check the related official documents here: #1, #2, #3.
It seems that you want to avoid writing the key directly in code; in that case you can consider the following:
1. Copy the primary key from this page in the Azure portal, and then create a variable group in Azure DevOps Pipelines to store that value. (Change the variable type to secret!)
You can also choose to host that value in Azure Key Vault and then link secrets from an Azure key vault in the current variable group. (If you don't want to host the value in the variable group directly.)
2. Link the variable group to your current classic pipeline.
3. Use the Replace Tokens task to insert the value of your primary key into config.js (or another xx.js file). Run your other tasks after this task so that the key is available in your js file.
Assuming I have a variable TheKeyOfCosmos that stores the primary key, I then specify this format in the config.js file:
key: "#{TheKeyOfCosmos}#",
After running the Replace Tokens task, that line in config.js contains the real key, e.g. key: "xxxxxxxxxxxxxxxx",
After the above steps you can test/run your app with the primary key.
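For illustration, here is a minimal sketch of how the replaced token could be consumed from a Node.js test file, assuming the @azure/cosmos package and a placeholder account endpoint (both are assumptions, not part of the original setup):

// config.js - the #{TheKeyOfCosmos}# token is replaced with the real key by the Replace Tokens task
module.exports = {
  endpoint: "https://<your-account>.documents.azure.com:443/", // placeholder endpoint
  key: "#{TheKeyOfCosmos}#",
};

// test.js - reads the key from config.js and creates a Cosmos DB client
const { CosmosClient } = require("@azure/cosmos");
const config = require("./config");

const client = new CosmosClient({ endpoint: config.endpoint, key: config.key });

async function main() {
  // List database ids as a simple smoke test that the key works
  const { resources } = await client.databases.readAll().fetchAll();
  console.log(resources.map((db) => db.id));
}

main().catch((err) => console.error(err));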

Related

SqlPackage.exe deploy DACPAC with always encrypted and Key vault

I am trying to deploy a DACPAC using an Azure release pipeline.
Following are the methods I used:
Using the Azure SQL DACPAC task - with this approach, the following set of params is passed as additional properties for the deployment:
AzureKeyVaultAuthMethod:ClientIdSecret /ClientId:'$(SERVICEPRINCIPALID)' /Secret:'$(SERVICEPRINCIPALKEY)'. On enabling the diagnostics log, I get the following error:
SqlPackage build version - 16.0.6161.0
Using a PowerShell script - while using a PowerShell script to do the DACPAC deployment, I get the following error:
Failed to decrypt a column encryption key. Invalid key store provider name: 'AZURE_KEY_VAULT'. A key store provider name must denote either a system key store provider or a registered custom key store provider. Valid system key store provider names are: 'MSSQL_CERTIFICATE_STORE', 'MSSQL_CNG_STORE', 'MSSQL_CSP_PROVIDER'. Valid (currently registered) custom key store provider names are: . Please verify key store provider information in column master key definitions in the database, and verify all custom key store providers used in your application are registered properly.
Logs attached here
SqlPackage build version - 15.0.5472.2
As a pre-requisite before deploying the DACPAC, I create the keys using a PowerShell script and then insert them into the DB. Contents of the script below.
All the deployments happen through a service principal, which has admin-level access to all the resources in Azure AD.
Am I missing any steps here before the deployment of the DACPAC through the release pipeline?
Thanks,
Nandan

Azure DevOps CI/CD Pipelines for the Azure SQL Always Encrypted database Issues

During the setup of the Azure DevOps CI/CD pipelines for an Azure SQL Always Encrypted database:
Example: Table1 consists of 5 columns; out of the 5 columns, Column1 and Column2 are encrypted.
The Always Encrypted setting is enabled in the connection string.
The dacpac file is created successfully without any issues, and I am able to view Table1.
The issue is observed while inserting data into Table1 using transaction data.
Error message: Encryption scheme mismatch for columns/variables
The same code works fine if I execute this dacpac file manually in SSMS.
The error is displayed if the dacpac is executed through SSDT or the CI/CD pipelines.
Please let me know your thoughts about this issue.
A CI/CD pipeline and a dacpac working together are usually complex when Always Encrypted is enabled. Please check whether the below points can narrow down the issue.
Usually the certificate for the Column Master Key is stored on the client machine, not on the SQL Server machine. If that is the case, you are not able to insert data into a table with an Always Encrypted column; do the Column Master Key configuration.
(Hope you already knew, but just for your info, the mismatch error in SSMS can be solved this way.)
According to permissions-for-publishing-a-dac-package-if-always-encrypted:
To publish a DAC package if Always Encrypted is set up in the DACPAC and/or in the target database, you might need some or all of the below permissions, depending on the differences between the schema in the DACPAC and the target database schema:
ALTER ANY COLUMN MASTER KEY, ALTER ANY COLUMN ENCRYPTION KEY, VIEW ANY COLUMN MASTER KEY DEFINITION, VIEW ANY COLUMN ENCRYPTION KEY DEFINITION
Also note that Azure SQL is a PaaS service, which means it receives updates transparently and relatively often, with a new compatibility level. Try updating your SSDT version. Always Encrypted is supported in all editions of SQL Server Database V12.
Always Encrypted uses two types of cryptographic keys: column encryption keys (CEKs) and column master keys (CMKs). See developing databases using always encrypted.
Make sure variable declaration and value assignment are performed on the same line.
Example:
DECLARE @OPERATION_ID int = 4
DECLARE @PARAMETER_NAME varchar(100) = 'xyz'
Try storing the value to be inserted in a variable or result set in the application, and then insert the data from that result set into SQL Server.
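As a rough illustration of that advice from application code, here is a minimal parameterized-insert sketch using the mssql Node.js package; the server, database and credentials are placeholders, and it assumes your driver/connection is set up for Always Encrypted (the table and column names are just the ones from the question):

const sql = require("mssql");

// Placeholder connection settings - adjust for your environment
const config = {
  server: "<your-server>.database.windows.net",
  database: "<your-database>",
  user: "<your-user>",
  password: process.env.SQL_PASSWORD,
  options: { encrypt: true },
};

async function insertRow() {
  const pool = await sql.connect(config);
  // Parameterized insert: values are bound as typed parameters instead of
  // being concatenated into the T-SQL text
  await pool
    .request()
    .input("OPERATION_ID", sql.Int, 4)
    .input("PARAMETER_NAME", sql.VarChar(100), "xyz")
    .query("INSERT INTO Table1 (Column1, Column2) VALUES (@OPERATION_ID, @PARAMETER_NAME)");
  await pool.close();
}

insertRow().catch((err) => console.error(err));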
Also see
azure-encryption-server-side-client-side-azure-key-vault
create-and-store-column-master-keys-always-encrypted
ci-with-a-sql-always-encrypted-column

Empty error while executing SSIS package in Azure Data Factory

I have created a simple SSIS project, and in this project I have a package that deletes a particular file in the Downloads folder.
I deployed this project to Azure. When I try to execute this package using Azure Data Factory, the pipeline fails with an empty error (I am attaching the screenshot here).
What I have done to try to fix this error:
I added a self-hosted IR to the Azure-SSIS IR as the proxy to access the on-premises data.
Set ConnectByProxy to True.
Converted the project to the Project Deployment Model.
Please help me fix this error, and if you need more details then just leave a comment.
Windows Authentication:
To access data stores such as SQL Servers/file shares on-premises or Azure Files, check the Windows authentication check box.
If this check box is selected, fill in the Domain, Username, and Password fields with the values for your package execution credentials. To access Azure Files, for example, the domain is Azure, the username is <storage account name>, and the password is <storage account key>.
Using the secrets stored in your Azure Key Vault
As a substitute, you can leverage secrets from your Azure Key Vault as values. To do so, select the AZURE KEY VAULT check box next to them. Create a new key vault linked service or choose or edit an existing one, and then choose your value's secret name and version. When creating or editing your key vault linked service, you can pick or edit an existing key vault or create a new one. If you haven't previously done so, grant the Data Factory managed identity access to your key vault. You may also directly enter your secret in the format <key vault linked service name>/<secret name>/<secret version>.
Note: if you are using Windows authentication, there are four methods to access data stores with Windows authentication from SSIS packages running on your Azure-SSIS IR: Access data stores and file shares with Windows authentication from SSIS packages in Azure | Docs.
Make sure your setup falls under one of these methods, otherwise it could potentially fail at run time.

Creating Multiple Environment Parameters for Azure Data Factory Linked Services

I have a requirement where I need to point our DEV Azure Data Factory to a Production Azure SQL database and also have the ability to switch the data source back to the Dev database should we need to.
I've been looking at creating parameters against the linked services but unsure of the best approach.
Should I create parameters as follows and choose the relevant parameters depending on the environment I want to pull data from?
DevFullyQualifiedDomainName
ProdFullyQualifiedDomainName
DevDatabaseName
ProdDatabaseName
DevUserName
ProdUserName
Thanks
Any sort of trigger can also have parameters attached to it. Check out the following example, assuming you have a custom event trigger and SQL Server as a source:
Create a string parameter for the database name field while establishing the SQL Server linked service as a dataset.
Create a new parameter in the dataset and assign that dataset parameter to the same linked service parameter; it will be used to store the trigger data.
A custom event trigger can parse and deliver a custom data payload to your pipeline. You define the pipeline parameters and then populate the values on the Parameters page. To parse the data payload and pass values to the pipeline parameters, use the format @triggerBody().event.data.keyName.
As per the official Microsoft documents, which can be referred to:
Reference trigger metadata in pipelines
System variables in custom event trigger
When you use a pipeline activity in a source, it will ask you for a dataset parameter. In this case, use dynamic content and choose the parameter containing the trigger data.
I would suggest using Azure Key Vault for that.
1. Create an Azure Key Vault for each environment (dev, prod, etc.).
2. Create secrets inside both key vaults with the same name but different values. For example, for the database server name, create the same secret "database-server" in both the dev and prod key vaults, but with the correct value representing the connection string of the dev and prod server respectively, in the following format:
integrated security=False;encrypt=True;connection timeout=30;data source=<serverName>.database.windows.net;initial catalog=<databaseName>;user id=<userName>;password=<loginPassword>
3. In your Azure Data Factory, create a Key Vault linked service pointing to your key vault.
4. In your Azure Data Factory, create a new Azure SQL Database linked service, selecting the Key Vault created in step 1 and the secret created in step 2.
Now you can easily switch between dev and prod by simply adjusting your Key Vault linked service to point to the desired environment.
Have fun ;)
Reference:
https://learn.microsoft.com/en-us/azure/data-factory/store-credentials-in-key-vault
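For completeness, the same per-environment secret can also be read from application code, so only the vault you point at decides the environment. A minimal Node.js sketch, assuming the @azure/identity and @azure/keyvault-secrets packages; the vault URL and environment variable below are placeholders, and database-server is the secret name from the steps above:

const { DefaultAzureCredential } = require("@azure/identity");
const { SecretClient } = require("@azure/keyvault-secrets");

// Point this at the dev or prod vault to switch environments (placeholder URL)
const vaultUrl = process.env.KEY_VAULT_URL || "https://<dev-or-prod-vault>.vault.azure.net";
const client = new SecretClient(vaultUrl, new DefaultAzureCredential());

async function getConnectionString() {
  // "database-server" holds the environment-specific connection string
  const secret = await client.getSecret("database-server");
  return secret.value;
}

getConnectionString().then((cs) => console.log("Loaded a connection string of length", cs.length));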

How do I create hierarchical data structures in Azure Key Vaults

I need a way to store hierarchical data in Azure Key Vaults so that I have a structure similar to:
AppName
/Prod
/Data
/Test
/Data
AppName2
/Prod
/Data
...
As far as I can tell I can only store a flat data structure. I am looking to be able to store data similar to Vault by HashiCorp which allows hierarchies.
For instance, in Vault by HashiCorp, I can get data using a 'path': "app/test/TestConnection" and I get the value at the endpoint of the path: TestConnection.
Any suggestion for alternatives would be fine or instruction on how to do what I need to do with Key Vault.
Thanks
Update
I tried some of the suggestions: MySettings--SomeSection--SecretThing and multiple vaults, and neither works in the manner I need as described above. Not faulting the input, but what I want to do just is not available in Key Vault.
@juunas Turns out that your suggestion may be the best solution. I only just discovered in another article that MySettings--SomeSection--Secret translates into something similar to this in .NET Core:
MySettings: {
  SomeSection: "Secret"
}
Since my client wants to use Key Vault, we are probably going to go with storing JSON-structured data in a single secret per application.
Any other suggestions are welcome
Key Vault does not support hierarchies for secrets.
To emulate structure, you can do something similar to what .NET Core does with its Key Vault configuration provider. You can specify a secret with a name like Settings--SomeCategory--SomeValue, and it'll correspond to the following JSON when loaded:
{
  "Settings": {
    "SomeCategory": {
      "SomeValue": "value goes here"
    }
  }
}
So essentially you can use a separator to emulate the structure, similar also to how Azure Blob Storage emulates folders.
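If you load the secrets from Node.js rather than through the .NET Core configuration provider, you can rebuild the same structure client-side. A minimal sketch, assuming the @azure/keyvault-secrets and @azure/identity packages, a placeholder vault URL, and secret names that follow the -- convention above:

const { DefaultAzureCredential } = require("@azure/identity");
const { SecretClient } = require("@azure/keyvault-secrets");

// Placeholder vault URL
const client = new SecretClient("https://<your-vault>.vault.azure.net", new DefaultAzureCredential());

async function loadHierarchy() {
  const tree = {};
  for await (const props of client.listPropertiesOfSecrets()) {
    const { value } = await client.getSecret(props.name);
    // Split "Settings--SomeCategory--SomeValue" into a nested path
    const path = props.name.split("--");
    let node = tree;
    for (const part of path.slice(0, -1)) {
      node = node[part] = node[part] || {};
    }
    node[path[path.length - 1]] = value;
  }
  return tree; // e.g. { Settings: { SomeCategory: { SomeValue: "value goes here" } } }
}

loadHierarchy().then((t) => console.log(JSON.stringify(t, null, 2)));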
I would advise against mixing secrets from different environments within the same key vault. Access cannot be restricted to individual secrets, as access is granted and denied at the Key Vault level only. You probably don't want the same persons/applications to be able to access all the different environments; instead, grant access to the production environment to a selected group of users and applications only, and vice versa.
As the Key Vault service by itself doesn't really cost anything, we at least have taken the approach of creating one Key Vault per environment, i.e. dev, test and production. Within that key vault the secrets are "structured" by a prefix, i.e. AppName-Data and AppName2-Data. This gives the added benefit that when moving from dev to test and to production, the references to the secrets don't need to be changed, as they have the same name in all the environments. Just the reference to the Key Vault needs to be changed, and all is set!
