Authenticate With Workspace - azure-machine-learning-service

I have a Pipeline registered in my AML workspace. Now I would like to trigger a pipeline run from an Azure Notebook in the same Workspace.
In order to get a reference object to the workspace in the notebook I need to authenticate, e.g.
ws = Workspace.from_config()
However, InteractiveLoginAuthentication is blocked by my company's domain and MsiAuthentication throws an error as well. ServicePrincipalAuthentication works, but how do I keep the secret safe? What is the preferred way of dealing with secrets in Azure Machine Learning Service notebooks?
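For reference, here is a minimal sketch of the ServicePrincipalAuthentication route with the secret injected through environment variables rather than hard-coded (the variable names below are placeholders of my own, not an AML convention):

# a minimal sketch, assuming azureml-core is installed; the environment
# variable names and service principal details are placeholders
import os
from azureml.core import Workspace
from azureml.core.authentication import ServicePrincipalAuthentication

sp_auth = ServicePrincipalAuthentication(
    tenant_id=os.environ["AML_TENANT_ID"],
    service_principal_id=os.environ["AML_SP_ID"],
    service_principal_password=os.environ["AML_SP_SECRET"],  # never commit this
)

ws = Workspace.from_config(auth=sp_auth)

This keeps the secret out of the notebook itself, but it still has to live somewhere, hence the question.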

Related

Can't create google_storage_bucket via Terraform

I'd like to create the following resource via Terraform:
resource "google_storage_bucket" "tf_state_bucket" {
name = var.bucket-name
location = "EUROPE-WEST3"
storage_class = "STANDARD"
versioning {
enabled = true
}
force_destroy = false
public_access_prevention = "enforced"
}
Unfortunately, during the execution of terraform apply, I got the following error:
googleapi: Error 403: X@gmail.com does not have storage.buckets.create access to the Google Cloud project. Permission 'storage.buckets.create' denied on resource (or it may not exist)., forbidden
Here's the list of things I tried and checked:
- Verified that the Google Cloud Storage (JSON) API is enabled on my project.
- Checked the IAM roles and permissions: X@gmail.com has the Owner and the Storage Admin roles.
- I can create a bucket manually via the Google Console.
- Terraform is generally authorised to create resources; for example, I can create a VM using it.
What else can be done to authenticate Terraform to create Google Storage Buckets?
I think you run the Terraform code in a Shell session from your local machine and use a User identity instead of a Service Account identity.
In this case, to solve your issue from your local machine:
- Create a Service Account for Terraform in the GCP IAM console, with the Storage Admin role.
- Download a Service Account token key from IAM.
- Set the GOOGLE_APPLICATION_CREDENTIALS env var in your Shell session to the Service Account token key file path.
If you run your Terraform code somewhere else, you need to check that Terraform is correctly authenticated to GCP.
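As a quick check that the Service Account credentials are actually being picked up, something like the following sketch can be run (it assumes the google-cloud-storage package is installed; listing buckets is a cheap smoke test of the credentials):

# a minimal verification sketch, assuming google-cloud-storage is installed
# and GOOGLE_APPLICATION_CREDENTIALS points at the downloaded key file
import os
from google.cloud import storage

print("Using key file:", os.environ["GOOGLE_APPLICATION_CREDENTIALS"])

client = storage.Client()  # picks up GOOGLE_APPLICATION_CREDENTIALS itself
for bucket in client.list_buckets():
    print(bucket.name)

If this lists your buckets, Terraform running in the same Shell session should authenticate the same way.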
The use of a token key is not recommended because it's not the most secure way; that's why it is better to launch Terraform from a CI tool like Cloud Build instead of launching it from your local machine.
From Cloud Build there is no need to download and set a token key.

Azure passing secrets without depending on Azure Keyvaults

I am building an Azure ML Pipeline for batch scoring. In one step I need to access a key stored in the workspace's Azure Keyvault.
However, I want to strictly separate the authoring environment (responsible for creating the datasets, building the environment, building and running the pipeline) and the production environment (responsible for transforming data, running the prediction etc.).
Therefore, code in the production environment should be somewhat Azure-agnostic. I want to be able to submit my inference script to Google Cloud Compute Instances if needed.
Thus my question is:
What is the best practice for passing secrets to remote runs without having the remote script retrieve them from the keyvault itself?
Is there a way to have redacted environment variables or command line arguments?
Thanks!
Example of what I would like to happen:
# import all azure dependencies
secret = keyvault.get_secret("my_secret")

pipeline_step = PythonScriptStep(
    script_name="step_script.py",
    arguments=["--input_data", input_data, "--output_data", output_data],
    compute_target=compute,
    params={"secret": secret}  # This will create an env var on the remote?
)

pipeline = Pipeline(workspace, steps=[pipeline_step])
PipelineEndpoint.publish(...)
And within step_script.py:
# No imports from azureml!
import os

secret = os.getenv("AML_PARAMETER_secret")
do_something(secret)
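One possible realization of this, sketched under the assumption that azureml-core is available in the authoring environment, is to carry the value onto the remote run via the SDK's Environment.environment_variables. Note that those values are stored in plain text with the run definition, so this trades redaction for portability (the environment and variable names below are illustrative):

# a rough sketch, assuming azureml-core in the authoring environment;
# Environment.environment_variables values are stored unredacted with
# the run definition, visible to anyone who can read the run details
from azureml.core import Environment
from azureml.core.runconfig import RunConfiguration
from azureml.pipeline.steps import PythonScriptStep

env = Environment(name="batch-scoring-env")      # hypothetical name
env.environment_variables["MY_SECRET"] = secret  # not redacted anywhere

run_config = RunConfiguration()
run_config.environment = env

pipeline_step = PythonScriptStep(
    script_name="step_script.py",
    arguments=["--input_data", input_data, "--output_data", output_data],
    compute_target=compute,
    runconfig=run_config,
)

step_script.py can then read os.getenv("MY_SECRET") with no azureml imports at all.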

Azure DevOps: Service connection is not being recognized

I can't seem to authorize access to my Azure subscription in Azure DevOps to run a build whenever a commit is pushed to master. I keep getting the below error:
Also, when I click Authorize resources, it says the authorization was successful, but the next time I run the pipeline, I get the same exact error. I verified in Project settings -> Service connections that I have an active connection to the subscription.
How can I get around this issue? When I go to Deployment Center in Azure Functions and wire up the connection there, it creates a task-based pipeline, but I want to use yaml.
The above indicates that the azureSubscription you specified in your azure function deployment task does not exist, or you do not have permission to use it.
If the service connection is already correctly set up but you still encounter the above error, you can follow the steps below to troubleshoot the issue.
1. Check your yaml pipeline.
The azure subscription is validated at compile time. If you use variables to reference the azure subscription in your yaml pipeline, you need to make sure the variable can be resolved at compile time.
You can check out this thread.
2. Check the service connection security setting.
Go to Project settings --> Service Connections under Pipelines --> select your azure service connection --> More settings (3 dots) --> Security --> try adding your pipeline to the Pipeline permissions list.
If the azure subscription service connection is not set up yet, you need to create a service connection of the Azure Resource Manager type to connect to your azure subscription. See the steps below:
1. Go to Project settings --> Service Connections under Pipelines --> New service connection --> select Azure Resource Manager --> Next.
2. Then select the Authentication method. If your azure devops organization is connected to Azure AD, you can select Service principal (automatic) as the Authentication method. This will automatically create a service principal in your Azure AD.
3. If you want to create the service principal yourself, you can select Service principal (manual). See the documents below on creating a service principal in Azure:
Use the portal to create an Azure Active Directory application and a service principal that can access resources
Use Azure PowerShell to create an Azure service principal with a certificate
Then enter the related information in the service connection configuration page.
After your azure subscription service connection is created, you can use it in your yaml pipeline task by specifying the service connection name. See the example below:
- task: AzureFunctionApp@1
  displayName: Azure Function App Deploy
  inputs:
    azureSubscription: myAzureSubscription
Note: You need to add the correct role assignment for the above service principal to enable it to deploy to your azure resources.
You must create a new connection from the task itself (you may need to use the advanced options to add an existing service principal):
- Under "Azure subscription", click the name of the subscription you wish to use.
- Click the drop-down next to "Authorize" and open the advanced options.
- Click "use the full version of the service connection dialog."
- Enter all your credentials and hit save.
I spent a while trying to figure out why I got the same problem. I compared my yaml to another yaml I had worked on previously and couldn't spot any problems, and I also verified the service connections.
But as @Levi Lu-MSFT mentions, verifying the yaml led me to find what caused my issue, so I thought I'd share it here even though it's not 100% related:
My variables weren't indented correctly. I was a bit tired and thought DevOps was just goofing with me. So verify that your yaml is properly set up. Sometimes it can be really small things that cause these issues.

GCP Service account key management and usage in Terraform

I am creating a CI/CD pipeline for Terraform so that my GCP resource creation is automated. But Terraform needs a service account to do the job: I created the service account and downloaded its key to my machine, but what is the correct way to store the key so that Terraform picks it up and executes the scripts when the Cloud Build pipeline runs?
provider "google" {
credentials = file(var.cred_file)
project = var.project_name
region = var.region
}
Is it okay to store this file in a Cloud Storage bucket? Or are there better alternatives?
On GCP you have the bucket option to keep sensitive information, and you can use access control lists (ACLs) to define who has access to your buckets and objects. GCP offers several storage options; the best one depends on your needs, so just ensure that the option you pick provides the security tools to keep your files safe. Once you have granted permissions to your Cloud Build service account, you can pass the path to the service account key in code.
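If you do go the bucket route, the fetch at the start of the pipeline could look roughly like the sketch below (the bucket and object names are made up, and it assumes the Cloud Build service account has storage.objects.get on the bucket):

# a rough sketch of fetching the key file from a locked-down bucket,
# assuming google-cloud-storage is installed; names are placeholders
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-terraform-secrets")   # hypothetical bucket
blob = bucket.blob("terraform-sa-key.json")      # hypothetical object
blob.download_to_filename("/tmp/terraform-sa-key.json")
# point var.cred_file at /tmp/terraform-sa-key.json before terraform apply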

Set up deployment to app service using personal access token

I've been given a personal access token (full access) which allows me to connect to a private Azure git repo within an Azure devops account from another subscription. Connecting to that repo locally using git is working fine.
I would like to set this up as a CI/CD deployment source for my app service but have been unable to find out how to do this. I tried Azure CLI:
az webapp deployment source config ... --repo-url https://anything:{pat}@dev.azure.com/Company/Project/_git/Reponame
This fails with a 500 error.
So I tried calling the REST API directly, but that also fails with the 500 error, so it's not an Azure CLI issue.
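For context, the direct call amounts to roughly the following sketch (assuming an ARM bearer token is already in hand; the subscription, resource group, and app names are placeholders, and the property names follow the Web Apps sourcecontrols ARM API as I understand it):

# a hedged sketch of the ARM call behind `az webapp deployment source config`;
# <sub-id>, <rg>, <app-name>, <pat> and <arm-token> are placeholders
import requests

url = ("https://management.azure.com/subscriptions/<sub-id>"
       "/resourceGroups/<rg>/providers/Microsoft.Web/sites/<app-name>"
       "/sourcecontrols/web?api-version=2022-03-01")

body = {
    "properties": {
        "repoUrl": "https://anything:<pat>@dev.azure.com/Company/Project/_git/Reponame",
        "branch": "master",
        "isManualIntegration": True,  # external git repo, manual integration
    }
}

resp = requests.put(url, json=body,
                    headers={"Authorization": "Bearer <arm-token>"})
print(resp.status_code, resp.text)  # this is where the 500 comes back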
Hoping someone can point me in the right direction.
Thanks for the help, much appreciated!
