Create Scope in Databricks API 2.0 - INVALID_PARAMETER_VALUE - Azure

I have this problem when I am creating a scope.
How is it possible to create an AAD token and integrate it into the script?

It's possible to create a secret scope backed by Azure Key Vault, but you need an AAD token, not a Databricks personal access token. You can do this via the Databricks CLI or the Databricks Terraform provider; both of them use the REST API under the hood, but, as mentioned at the beginning, you need an AAD token, which you can get via the Azure CLI.
My personal preference is the Databricks Terraform provider, which is very flexible and easier to use than the REST API or the Databricks CLI.
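If you want to call the REST API directly, the approach above can be sketched as follows. This is a minimal sketch, not a definitive implementation: the workspace URL, scope name, and Key Vault identifiers are placeholders, and the AAD token is assumed to come from the Azure CLI (`az account get-access-token --resource 2ff814a6-3304-4ab8-85cb-cd0e6f879c1d`, using the well-known Azure Databricks application ID).

```python
import json
import os
import urllib.request


def build_scope_request(workspace_url, scope_name, keyvault_resource_id, keyvault_dns):
    """Build the URL and JSON payload for creating a Key Vault-backed secret scope."""
    url = f"{workspace_url}/api/2.0/secrets/scopes/create"
    payload = {
        "scope": scope_name,
        "scope_backend_type": "AZURE_KEYVAULT",
        "backend_azure_keyvault": {
            "resource_id": keyvault_resource_id,
            "dns_name": keyvault_dns,
        },
    }
    return url, payload


if __name__ == "__main__":
    # AAD token for the Databricks resource, e.g. obtained beforehand with:
    #   az account get-access-token --resource 2ff814a6-3304-4ab8-85cb-cd0e6f879c1d
    token = os.environ["DATABRICKS_AAD_TOKEN"]
    url, payload = build_scope_request(
        "https://adb-1234567890123456.7.azuredatabricks.net",  # placeholder workspace URL
        "my-kv-scope",
        "/subscriptions/<subscription-id>/resourceGroups/<rg>/providers/Microsoft.KeyVault/vaults/<vault>",
        "https://<vault>.vault.azure.net/",
    )
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"Authorization": f"Bearer {token}", "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(resp.status)
```

The key point is the Authorization header: the same request with a Databricks personal access token instead of the AAD token is what produces the INVALID_PARAMETER_VALUE error.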

Create Scope in Databricks API 2.0 - INVALID_PARAMETER_VALUE
This is a known issue with the Databricks API and that PowerShell module:
Scope with Azure KeyVault must have userAADToken defined
Databricks are changing the API and will not commit to the final state
until Key Vault-backed scopes come out of Preview. I've no timescales
yet. In the meantime, if you need these I would deploy them manually -
the CLI and REST API do not support them yet.
So, AFAIK, we have no way to create an Azure Key Vault-backed scope with the REST API at the moment; we can only create one in the Azure Databricks UI. In other words, if we provide a Key Vault resource ID when we call the REST API or CLI, the request cannot be processed by the backend server.

Related

API Key for Azure Machine Learning Endpoint

I am using Azure ML. I built my models and now I want to connect them to Data Factory to run a process.
I implemented an endpoint, but I can't find the API key for it. Right now I have the REST endpoint, but key-based authentication is not enabled (it's set to false). Do you know how to generate the API key?
Currently the only way to retrieve the token is by using the Azure Machine Learning SDK or the Azure CLI machine learning extension.
Key-based auth is supported for web services deployed to Azure Container Instances and Azure Kubernetes Service, while token-based auth is only available for Azure Kubernetes Service deployments.
You can find more information here

Is there a way to retrieve the shared key from the Azure Blob Storage Account using API calls

I was able to create a new storage account by making a REST API call, using the info (client ID, secret, etc.) of the service principal I created for this purpose.
After creating the new Azure storage account, I would like to continue (create containers, upload blobs, etc.) using this account, but for that I need the access key, which I am not (for now) able to get from the API.
Is there a way to do this, or do I need to go to the Azure portal after creating each storage account and pick up the access key from there?
Mirko
Everything you can do in the Azure portal can be done with a REST API,
and almost everything is also available in PowerShell and the Azure CLI.
The service teams ship REST API changes when they make changes to the service. Those changes sometimes take time to appear in PowerShell, the CLI, and the language SDKs.
See https://learn.microsoft.com/en-us/azure/storage/common/storage-account-keys-manage?tabs=azure-powershell
And I believe the REST API is https://learn.microsoft.com/en-us/rest/api/storagerp/storageaccounts/listkeys
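As a sketch of calling that listKeys endpoint (the subscription, resource group, and account names are placeholders; the ARM token is assumed to come from the same service principal used to create the account; the api-version shown is one published version, check the reference above for the current one):

```python
import json
import os
import urllib.request


def list_keys_url(subscription_id, resource_group, account_name,
                  api_version="2023-01-01"):
    """Build the ARM listKeys URL for a storage account (it is a POST with an empty body)."""
    return (
        "https://management.azure.com"
        f"/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        f"/providers/Microsoft.Storage/storageAccounts/{account_name}"
        f"/listKeys?api-version={api_version}"
    )


if __name__ == "__main__":
    # Management-plane AAD token, e.g. from:
    #   az account get-access-token --resource https://management.azure.com/
    token = os.environ["ARM_TOKEN"]
    req = urllib.request.Request(
        list_keys_url("<subscription-id>", "<resource-group>", "<account-name>"),
        data=b"",  # listKeys takes no request body
        headers={"Authorization": f"Bearer {token}"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        # Response shape: {"keys": [{"keyName": "key1", "value": "...", ...}, ...]}
        keys = json.load(resp)["keys"]
        print(keys[0]["keyName"])
```

Since your service principal already created the account, granting it a role that includes Microsoft.Storage/storageAccounts/listKeys/action on the account lets the same credentials fetch the keys, so no portal round-trip is needed.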

Get Azure Databricks URL using Terraform

In the dev environment, I deploy Azure Databricks using Terraform and can get the URL via workspace_url (https://www.terraform.io/docs/providers/azurerm/r/databricks_workspace.html). But in production I have to deploy the entire infrastructure through Terraform and only Azure Databricks through ARM (this is the company's policy). When I use data.azurerm_databricks_workspace.example, I get an error that azurerm_databricks_workspace is not available as a data source. How can I get workspace_url and use it in Terraform? Thanks.
Looking through the sources of the Databricks Labs Terraform provider, you can find a reference to the workspace properties endpoint, whose properties field contains workspaceUrl. So if you call https://management.azure.com/subscriptions/<subscription-id>/resourceGroups/<resource-group-name>/providers/Microsoft.Databricks/workspaces/<workspace-name> with a management AAD JWT token, you should be able to retrieve the workspace URL.
I'm not sure whether it's exposed to ARM templates, though the Azure CLI does expose it.
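The lookup described above can be sketched like this (a sketch, not a definitive implementation: the resource names are placeholders, the api-version shown is an assumption about which published version to use, and the sample response shape only reflects the properties.workspaceUrl field mentioned in the answer):

```python
import json
import os
import urllib.request


def workspace_resource_url(subscription_id, resource_group, workspace_name,
                           api_version="2018-04-01"):
    """Build the ARM URL for an Azure Databricks workspace resource."""
    return (
        "https://management.azure.com"
        f"/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        f"/providers/Microsoft.Databricks/workspaces/{workspace_name}"
        f"?api-version={api_version}"
    )


def extract_workspace_url(resource):
    """Pull workspaceUrl out of the workspace resource JSON."""
    return resource["properties"]["workspaceUrl"]


if __name__ == "__main__":
    # Management-plane AAD token, e.g. from:
    #   az account get-access-token --resource https://management.azure.com/
    token = os.environ["ARM_TOKEN"]
    req = urllib.request.Request(
        workspace_resource_url("<subscription-id>", "<resource-group>", "<workspace-name>"),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        print(extract_workspace_url(json.load(resp)))
```

In Terraform itself, the same lookup could feed a provider configuration via an external data source or a variable, since the azurerm provider of that era had no azurerm_databricks_workspace data source.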

Access Azure Key Vault from Azure build/release pipelines

We have some unit tests/integration tests running in Azure build/release pipelines. A few of these tests retrieve secrets from Key Vault, and they are failing because the code fetches the secrets using MSI and the Azure App Authentication features. Since the pipelines are not enabled for MSI, the Key Vault calls fail and hence the tests fail too. What alternatives exist so that the pipelines can access Key Vault successfully?
Note: I have already gone through articles suggesting variable groups and the Azure Key Vault task, but they are not helpful in my scenario. Looking for alternatives.
You can try two directions:
Configure a self-hosted agent to run your pipeline in your local environment. The agent should, of course, be configured with your managed identity.
Alternatively, according to step 5 of this blog:
AzureServiceTokenProvider will use the developer's security context to get a token to authenticate to Key Vault. This removes the need to create a service principal, and share it with the development team. It also prevents credentials from being checked in to source code. AzureServiceTokenProvider will use Azure CLI or Active Directory Integrated Authentication to authenticate to Azure AD to get a token. That token will be used to fetch the secret from Azure Key Vault.
You can use the Azure CLI task to run your tests from the command line. Check this similar issue.
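A minimal sketch of that second direction as a pipeline fragment, assuming a service connection named my-azure-connection (a placeholder) and a .NET test project; any test command works in its place. The Azure CLI task logs the service connection's principal in with az login, which is the Azure CLI context AzureServiceTokenProvider can pick up, per the quote above:

```yaml
# azure-pipelines.yml fragment: run the tests inside an Azure CLI task so
# AzureServiceTokenProvider can authenticate via the task's az login context.
steps:
- task: AzureCLI@2
  inputs:
    azureSubscription: 'my-azure-connection'   # placeholder service connection name
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      dotnet test MyProject.Tests.csproj
```

The service connection's principal then needs an access policy (or RBAC role) on the Key Vault for the secret reads to succeed.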

PowerShell Azure Functions Multi-Tenancy query

I have written a very simple Azure Function in PowerShell and have enabled it as multi-tenant, which works fine.
If I wanted to use Connect-MsolService, for example, in the PowerShell script to connect to an Office 365 service, how could I get hold of the account that the caller is asked to provide when accessing the AzureAD-secured API, so that I can connect to the tenant with that account?
Being asked to log on when you hit the API is fine (as it is multi-tenant), but getting that credential to use in the PowerShell script is the problem I am facing.
Any ideas?
Looking through the documentation of Connect-MsolService, it expects a PSCredential. For fetching and storing the account name and password, you can use App Settings; another option is Key Vault.
Related documentation:
App settings: Deploying secrets to Azure
Key Vault: Getting Started with Key Vault
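For the App Settings route, a minimal sketch of building the PSCredential that Connect-MsolService expects. The MSOL_USER and MSOL_PASSWORD setting names are placeholders, not standard names; App Settings are exposed to the Function as environment variables:

```powershell
# Read the account name and password from Function App settings
# (surfaced to the Function as environment variables).
$user = $env:MSOL_USER
$securePass = ConvertTo-SecureString $env:MSOL_PASSWORD -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential ($user, $securePass)

# Connect-MsolService expects a PSCredential
Connect-MsolService -Credential $cred
```

Key Vault is the safer home for the password; the App Settings version is shown only because it is the simpler of the two options listed above.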
