Programmatically set Cluster, Pool and Jobs Access Control - databricks

Azure Databricks issue.
Is there any way to programmatically set this parameter in the Admin Console: Cluster, Pool and Jobs Access Control? I think this is a spark_conf property, but I can't find any information.
Another question: when a parameter is set, is it possible to export a JSON parameter file showing all these settings?
Thank you

Yes, the Permissions API lets you manage permissions in Azure Databricks.
Note: This feature is in public preview
The Permissions API supports several objects and endpoints:
Token permissions — Manage which users can create or use tokens.
Password permissions — Manage which users can use password login when SSO is enabled.
Cluster permissions — Manage which users can manage, restart, or attach to clusters.
Pool permissions — Manage which users can manage or attach to pools. Some APIs and documentation refer to pools as instance pools.
Job permissions — Manage which users can view, manage, trigger, cancel, or own a job.
Notebook permissions — Manage which users can read, run, edit, or manage a notebook.
Directory permissions — Manage which users can read, run, edit, or manage all notebooks in a directory.
MLflow registered model permissions — Manage which users can read, edit, or manage MLflow registered models.
The Permissions API is not completely documented on the Azure Databricks REST API page; I would suggest following the Databricks documentation for the Permissions API instead.
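As a rough illustration (not an official recipe), here is a minimal sketch of setting and exporting cluster permissions with the Permissions API; the workspace URL, personal access token, cluster ID and user e-mail are placeholders you would substitute. Saving the GET response also covers the second question about exporting the settings as a JSON file.

```
# Minimal sketch, assuming an existing cluster and a personal access token.
# The workspace URL, token, cluster ID and user e-mail below are placeholders.
import json
import requests

HOST = "https://adb-1234567890123456.7.azuredatabricks.net"   # placeholder workspace URL
HEADERS = {"Authorization": "Bearer <personal-access-token>"}  # placeholder PAT
CLUSTER_ID = "<cluster-id>"                                    # placeholder cluster ID

# Grant one user CAN_RESTART on the cluster (PATCH adds to the existing ACL,
# PUT would replace it entirely).
acl = {"access_control_list": [
    {"user_name": "someone@example.com", "permission_level": "CAN_RESTART"}
]}
resp = requests.patch(f"{HOST}/api/2.0/permissions/clusters/{CLUSTER_ID}",
                      headers=HEADERS, json=acl)
resp.raise_for_status()

# Export the cluster's current permissions to a JSON file.
perms = requests.get(f"{HOST}/api/2.0/permissions/clusters/{CLUSTER_ID}",
                     headers=HEADERS).json()
with open("cluster_permissions.json", "w") as f:
    json.dump(perms, f, indent=2)
```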

Related

Is there any way to back up Azure AD B2C?

I can't find an official answer for this. My research on Google says things like...
It's not necessary because Azure AD B2C is geo-replicated, resilient, bla bla bla... and even in the event of a third world war, Azure AD B2C will be up and running.
All right, nice speech, Microsoft. Very good for your sales team, but...
We have clients. Clients are paranoid. They want us to show how we are doing the backup.
Also, what about a clumsy admin who accidentally deletes everyone?
And Azure AD B2C stores much more than user data. You can store custom user properties, App Registrations, Flows and many other things that compose the architecture of your solution. This must be protected as well.
So, since there is no out-of-the-box solution for this... does anyone know something unofficial? Maybe a PowerShell script or an undocumented solution? The solution at Backup and restore for Azure AD B2C is no longer valid.
what about a clumsy admin who accidentally deletes everyone?
You can demonstrate how you have restricted admin access to a production AAD B2C directory. You can demonstrate that you fully orchestrate your directory configuration through CI/CD pipelines with gated deployments through multiple AAD B2C tenants that act as lower environments.
You have 30 days to restore all deleted objects.
Nobody can delete all accounts via the Portal, nor should there be any CI/CD pipeline built to perform such an action.
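To illustrate that 30-day window, here is a minimal sketch (not an official tool) of listing and restoring soft-deleted users through Microsoft Graph. It assumes an app registration with suitable application permissions on the directory; the tenant ID, client ID and secret are placeholders.

```
# Minimal sketch: restore a soft-deleted user within the 30-day window.
# Assumes an app registration with application permissions to read/write
# directory objects; all IDs and the secret are placeholders.
import requests
from msal import ConfidentialClientApplication

app = ConfidentialClientApplication(
    "<app-client-id>",
    authority="https://login.microsoftonline.com/<b2c-tenant-id>",
    client_credential="<app-client-secret>",
)
token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])
headers = {"Authorization": f"Bearer {token['access_token']}"}

# List recently deleted users (retained for 30 days)...
deleted = requests.get(
    "https://graph.microsoft.com/v1.0/directory/deletedItems/microsoft.graph.user",
    headers=headers,
).json()["value"]

# ...and restore one of them by object id.
if deleted:
    requests.post(
        f"https://graph.microsoft.com/v1.0/directory/deletedItems/{deleted[0]['id']}/restore",
        headers=headers,
    ).raise_for_status()
```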
And Azure AD B2C stores much more than user data.
User objects - Dump users via the Graph API (see the sketch at the end of this answer). The ObjectId cannot be restored if an admin permanently deletes the object.
Application Registrations - Config should be in a repo and controlled with CI/CD. If permanently deleted, you should demonstrate how to rebuild an App Registration using the config from your repo, and update the application code to reflect the new ClientId/ClientSecret. The ClientId cannot be restored from a permanently deleted App Registration.
User Flows - Config should be in a repo and controlled with CI/CD
IdP Configurations - Config should be in a repo and controlled with CI/CD
Custom policies - Config should be in a repo and controlled with CI/CD
Generally, every feature you have configured has an MS Graph API endpoint that you can manage via CI/CD, keeping these configs in a repo.
According to the MS information here, they recommend using Azure Backup for Azure AD B2C. I have not tried it yet but hope to soon.
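For the "dump users via the Graph API" point above, a minimal sketch along the same lines (same placeholder app registration as before; the selected attributes and output file name are arbitrary choices):

```
# Minimal sketch: export all users to a JSON file via Microsoft Graph.
# Same placeholder app registration as above; selected attributes are arbitrary.
import json
import requests
from msal import ConfidentialClientApplication

app = ConfidentialClientApplication(
    "<app-client-id>",
    authority="https://login.microsoftonline.com/<b2c-tenant-id>",
    client_credential="<app-client-secret>",
)
token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])
headers = {"Authorization": f"Bearer {token['access_token']}"}

users = []
url = "https://graph.microsoft.com/v1.0/users?$select=id,displayName,identities"
while url:  # follow @odata.nextLink paging until every user has been collected
    page = requests.get(url, headers=headers).json()
    users.extend(page["value"])
    url = page.get("@odata.nextLink")

with open("b2c_users_backup.json", "w") as f:
    json.dump(users, f, indent=2)
```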

Integrating Azure AD credentials to Kubeflow notebook pods

I'm currently setting up a Kubeflow environment in Azure using AKS. Everything is set up and working (users are able to log into the Kubeflow platform using their Azure AD credentials and start notebook pods in their own namespace). I'm assuming these AD credentials are embedded somewhere in the container creation process, and I'm wondering if it's possible to tap into these credentials for other services that are AD-integrated.
Use case:
A user is working in a Jupyter notebook started from the Kubeflow platform. The user wishes to access data in an Azure storage blob. Instead of having to log in to Azure from their notebook session, the container already has their credentials stored.
It sounds reasonable but I'm unsure if it can actually be done in a secure way.
Assuming you're following the instructions in Authentication using OIDC in Azure: no, this isn't possible using the default configuration.
The way OIDC works is by giving back a token with a given audience (who it should be used with) and grants (what it says you should be able to do). The token that's being issued to Kubeflow is valid for the Kubeflow service principal audience only; in other words, you can't then take that same token and use it with Azure APIs. This is by design, and is one of the key factors in OIDC security. Allowing Kubeflow to have the permissions to issue more tokens (typically via the user_impersonation grant) opens a fairly major security issue - now anyone who manages to compromise that application secret can get powerful tokens, instead of limited scope tokens as normally designed.
If the resources they need to access aren't specific to the users in question, aad-pod-identity could be used to grant an access token to the pods the users are running instead of requiring them to log in again.
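As a rough sketch of that last option: assuming aad-pod-identity has bound a managed identity to the notebook pod, and that identity holds a data role such as Storage Blob Data Reader on the storage account, the notebook can read blobs without any interactive login. The storage account, container and blob names below are placeholders.

```
# Minimal sketch, run inside a notebook pod that aad-pod-identity has bound to
# a managed identity with (for example) Storage Blob Data Reader on the account.
# The storage account, container and blob names are placeholders.
from azure.identity import ManagedIdentityCredential
from azure.storage.blob import BlobServiceClient

credential = ManagedIdentityCredential()   # token is issued to the pod's bound identity
service = BlobServiceClient(
    account_url="https://<storageaccount>.blob.core.windows.net",
    credential=credential,
)

# Read a blob without the user ever logging in from the notebook session.
blob = service.get_blob_client(container="data", blob="example.csv")
content = blob.download_blob().readall()
```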

How to script User accounts in HDInsight clusters

I am automating creation of HDInsight Clusters. I can create the clusters. However, the template creates ADMIN accounts. We are using Ambari to create the USER accounts manually but would like to automate this. I think I can get a script included as part of the template.
I need a script to create User accounts in a manner Ambari would. I have no idea where to start.
Creating groups would also be helpful.
Almost all Ambari actions can be scripted using the Ambari REST interface. See:
How to use RestSharp with Ambari Swagger
I chose to implement a RestSharp client using Azure Functions triggered by a cluster-created subscription event. However, these actions can also be implemented with a curl script defined by a Script Action during or after creation. The Ambari REST interface is finicky and requires that certain headers are or are not present. Fiddler can be used to listen to the Ambari web client to determine the correct headers. There is a swagger.json file downloadable from a cluster. It is not a very good file, but it is enough to get started.
I put my client with swagger.json on github:
https://github.com/USStateDept/Azure-HDInsight-Ambari-RestClient
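As a hedged sketch of what that can look like against the Ambari REST interface (creating an Ambari-level user and group; the cluster name, admin credentials and new account details are placeholders, and Ambari rejects write requests that lack the X-Requested-By header):

```
# Minimal sketch of creating an Ambari user and group over the REST API.
# Cluster name, admin credentials and the new account are placeholders.
import requests

AMBARI = "https://<clustername>.azurehdinsight.net"
AUTH = ("admin", "<admin-password>")
HEADERS = {"X-Requested-By": "ambari"}   # Ambari refuses writes without this header

# Create the user...
requests.post(f"{AMBARI}/api/v1/users", auth=AUTH, headers=HEADERS,
              json={"Users/user_name": "analyst1",
                    "Users/password": "<password>",
                    "Users/active": True}).raise_for_status()

# ...create a group...
requests.post(f"{AMBARI}/api/v1/groups", auth=AUTH, headers=HEADERS,
              json={"Groups/group_name": "analysts"}).raise_for_status()

# ...and add the user to it.
requests.post(f"{AMBARI}/api/v1/groups/analysts/members", auth=AUTH, headers=HEADERS,
              json={"MemberInfo/user_name": "analyst1",
                    "MemberInfo/group_name": "analysts"}).raise_for_status()
```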
Unfortunately, you cannot create user accounts in HDInsight clusters using script actions, and Ambari in Azure HDInsight does not support creating local users and signing in with them.
Adding user accounts is only available with the HDInsight Enterprise Security Package.
OR
You can use LDAP users to log in to the Ambari UI in domain-joined HDInsight clusters.
HDInsight clusters with Enterprise Security Package (ESP) can use strong authentication with Azure Active Directory (Azure AD) users, as well as use role-based access control (RBAC) policies. As you add users and groups to Azure AD, you can synchronize the users who need access to your cluster.
Reference: Synchronize Azure Active Directory users to an HDInsight cluster; you may also check out a similar question addressed on the Azure HDInsight MSDN forum.

Azure - restrict access to app service only

I've created a website in Azure and I want to allow users to log in and use the app, but I'm slightly confused by Azure Active Directory access. I want users to only have access to the web app, not to the portal. Users will be from within my organisation and from outside it, so it's vitally important that access is locked down: if a user somehow ends up at the Azure portal, they must not be able to access it. If I set users up in our Active Directory, won't they be able to log in to the Azure portal too? I want to take advantage of authentication as a service and hand over authentication and multi-factor authentication to Azure, but everything I've read so far seems to suggest that if I use Azure Active Directory, users will be able to access the Azure portal too. Is this correct or am I misinterpreting the information? Are there any step-by-step guides available for these sorts of scenarios?
If I use Azure Active Directory, users will be able to access the Azure portal too, is this correct or am I misinterpreting the information?
No, your users will not have access to the Azure Portal (rather, to any Azure Subscription; the Azure Portal is just an application through which a user manages one or more Azure Subscriptions) unless you grant them permission to access it. For your users to have access, you would need to grant them permissions explicitly. In the new portal you do this by assigning roles (e.g. Owner, Contributor, Reader, etc.), and in the old portal you do it by making them co-administrators.
Unless you do this, when they log in to the Azure Portal all they will see is a message stating that no Azure Subscriptions were found.

Running applications with DB roles defined

I have a question on how we currently deploy applications on premises and how this would work in Azure.
So our on premises application is as follows:
We have a web application deployed on our web server (WebAppExample1) that talks to an application (AppServerExample1) on our app server.
AppServerExample1 goes to our database for data in Table1.
In our database the only application that requires permission to Table1 is AppServerExample1, so we create a DB role and grant appropriate permissions. We associate this role with an Active Directory user (AppServerExample1User) that AppServerExample1 runs as.
How can this be done in Azure?
From looking at some samples I don't see anyone defining permissions at this level, which to me should be done (least privilege).
Also, I believe you cannot be an admin in SQL Azure, so does this mean you cannot create DB roles?
Thanks for replies
From everything I've read I believe you can create roles in SQL Azure - e.g. see http://www.structuretoobig.com/post/2010/02/13/SQL-Azure-Logins.aspx
If you try it and find you can't, then perhaps you could achieve this using users with different permissions rather than roles - see http://msdn.microsoft.com/en-us/library/ee336235.aspx#DatabasePerms (a quick sketch of the role approach follows the quoted references below).
The database-level permission model in SQL Azure Database is the same as in an on-premises instance of SQL Server.
For more information, see the following topics in SQL Server Books Online:
Identity and Access Control (Database Engine)
Managing Logins, Users, and Schemas How-to Topics
Lesson 2: Configuring Permissions on Database Objects
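To make the role-based, least-privilege setup concrete, here is a minimal sketch using a contained database user in Azure SQL Database; the server, database, table, user and passwords are placeholders, and older SQL Azure generations required server-level logins instead of contained users.

```
# Minimal sketch: least-privilege access to Table1 via a database role in
# Azure SQL Database, using a contained database user. Server, database,
# names and passwords are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=<server>.database.windows.net;DATABASE=<database>;"
    "UID=<admin-user>;PWD=<admin-password>"
)
conn.autocommit = True
cur = conn.cursor()

# A contained user for the app, a role scoped to Table1, and the membership.
cur.execute("CREATE USER AppServerExample1User WITH PASSWORD = '<strong-password>';")
cur.execute("CREATE ROLE AppServerExample1Role;")
cur.execute("GRANT SELECT, INSERT, UPDATE ON dbo.Table1 TO AppServerExample1Role;")
cur.execute("ALTER ROLE AppServerExample1Role ADD MEMBER AppServerExample1User;")
```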
