How to get a list of users and groups from the Ambari UI programmatically - azure-hdinsight

I have a security requirement to fetch the list of users and groups from the Ambari UI programmatically for all HDInsight clusters in our environment.
Can someone suggest a sample method for how to get this using C#?
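Ambari exposes users and groups through its REST API (/api/v1/users and /api/v1/groups), which HDInsight serves over the cluster gateway at https://CLUSTERNAME.azurehdinsight.net with basic authentication. The question asks for C#; below is a minimal Python sketch of the same two calls, which map directly onto HttpClient in C#. The cluster name and credentials are placeholders:

```python
import base64
import json
import urllib.request

# Placeholders -- substitute your cluster name and Ambari admin credentials.
CLUSTER = "mycluster"
ADMIN_USER = "admin"
ADMIN_PASSWORD = "changeme"

BASE_URL = f"https://{CLUSTER}.azurehdinsight.net/api/v1"

def auth_header(user: str, password: str) -> str:
    """HTTP Basic auth header that the Ambari gateway expects."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return f"Basic {token}"

def parse_users(payload: dict) -> list:
    """Pull user names out of an /api/v1/users response body."""
    return [item["Users"]["user_name"] for item in payload.get("items", [])]

def parse_groups(payload: dict) -> list:
    """Pull group names out of an /api/v1/groups response body."""
    return [item["Groups"]["group_name"] for item in payload.get("items", [])]

def fetch(resource: str) -> dict:
    """GET /api/v1/<resource> ('users' or 'groups') and decode the JSON body."""
    req = urllib.request.Request(f"{BASE_URL}/{resource}")
    req.add_header("Authorization", auth_header(ADMIN_USER, ADMIN_PASSWORD))
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Against a live cluster: parse_users(fetch("users")), parse_groups(fetch("groups"))
```

To cover a whole environment, run the same two GETs once per cluster; in C# the equivalent is an HttpClient with an Authorization header and a JSON parser for the response body.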

Related

Propagating Google service account keys in a large on-prem Hadoop cluster for distCp (HCFS -> GCS)

As the title describes, I am trying to glean some details about how service accounts (multiple service accounts across a large cluster) can be handled securely in a large on-premises Hadoop cluster that runs workflows across many different teams.
What's an effective way to manage propagation of these keys so they're available across data nodes without also making them available to multiple other users? I know there are certain parameters one must pass through core-site.xml, but that applies at the cluster level; how can this be done securely just for my project?
I will have workflows scheduled via Oozie jobs which push a subset of data to GCS (I already know this is possible manually, just using gsutil).
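One common pattern with the Hadoop GCS connector is to keep the service account keyfile out of the cluster-wide core-site.xml and instead supply the credential properties per job. The property names below are the GCS connector's; the key path is a placeholder, and the file itself should be readable only by the project's service user:

```xml
<!-- Setting these in core-site.xml applies them to every user on the
     cluster. For a single project, pass the same properties per job
     instead, e.g.:
     hadoop distcp -Dgoogle.cloud.auth.service.account.enable=true \
       -Dgoogle.cloud.auth.service.account.json.keyfile=/secure/keys/my-project-sa.json \
       hdfs:///data/subset gs://my-bucket/subset -->
<property>
  <name>google.cloud.auth.service.account.enable</name>
  <value>true</value>
</property>
<property>
  <!-- Placeholder path; protect it with restrictive permissions (chmod 600) -->
  <name>google.cloud.auth.service.account.json.keyfile</name>
  <value>/secure/keys/my-project-sa.json</value>
</property>
```

An Oozie workflow can carry the same properties in its action's configuration block, so only that workflow's jobs pick up the credential rather than the whole cluster.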

How do I limit users to just the DBSQL lens of Databricks? I don't want them to have access to SQL endpoints or DE & ML lenses

Here are the current options. I want to give the BI team access only to the SQL lens.
Completely removing the other options is currently not possible, but you can lock them down:
Remove the databricks-sql-access entitlement from the default users group.
Create a new group called 'sql-users-only' and give it only that entitlement (so its members won't see the workspace and can't spin up endpoints).
Resources:
Databricks provides a lot of control through ACLs; see the Access control overview.
You can restrict dashboard viewing options as well; see Dashboard permissions.
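The two entitlement steps can also be scripted: Databricks groups are manageable through the SCIM API (Groups endpoint under /api/2.0/preview/scim/v2/). A hedged sketch that only builds the PATCH payloads; the workspace URL is hypothetical, and the target group IDs would come from your deployment:

```python
import json

# Hypothetical workspace URL -- replace with your own; authenticate with a PAT.
HOST = "https://adb-1234567890123456.7.azuredatabricks.net"
GROUPS_API = f"{HOST}/api/2.0/preview/scim/v2/Groups"

SCIM_PATCH = "urn:ietf:params:scim:api:messages:2.0:PatchOp"

def add_entitlement_payload(entitlement: str) -> dict:
    """SCIM PATCH body that grants an entitlement to a group."""
    return {
        "schemas": [SCIM_PATCH],
        "Operations": [
            {"op": "add", "path": "entitlements",
             "value": [{"value": entitlement}]}
        ],
    }

def remove_entitlement_payload(entitlement: str) -> dict:
    """SCIM PATCH body that revokes an entitlement from a group."""
    return {
        "schemas": [SCIM_PATCH],
        "Operations": [
            {"op": "remove",
             "path": f'entitlements[value eq "{entitlement}"]'}
        ],
    }

# Step 1: PATCH the default users group with this body to revoke SQL access.
revoke = remove_entitlement_payload("databricks-sql-access")
# Step 2: PATCH the new sql-users-only group with this body to grant only SQL.
grant = add_entitlement_payload("databricks-sql-access")
print(json.dumps(grant, indent=2))
```

Each payload is sent as a PATCH to GROUPS_API plus the group's ID; the entitlement name databricks-sql-access is the one referenced in the steps above.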

How can I connect to multiple organizations and projects from Azure DevOps to Power BI?

I connected to Azure DevOps Boards using personal access token to fetch workitems by referring to the link: https://learn.microsoft.com/en-us/azure/devops/report/powerbi/data-connector-connect?view=azure-devops
I was able to connect to only one organization and one project under that at a time.
I have a requirement in which I need to connect to multiple organizations and projects and fetch all work items under that.
Please advise how I can go about accomplishing this.
I need to connect to multiple organizations and projects and fetch all work items under that.
You can try combining OData and Manage Parameters in Power BI to achieve what you want. This feature was released last month; just refer to and follow this blog's description.
The blog provides very detailed steps. In a nutshell, the feature uses parameters to build a filter for the report, which then loads a data model from Azure DevOps; the Azure DevOps OData feed provides that data model. Users can then generate a report by supplying values for its parameters.
Hope this blog helps.
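For reference, the Analytics OData feed lives at a per-organization, per-project URL, so "multiple organizations and projects" boils down to generating one feed URL per pair and letting parameters (or a merge/append step in Power Query) combine the results. A sketch with hypothetical organization and project names:

```python
# Hypothetical organizations and projects -- replace with your own.
ORGS_PROJECTS = {
    "contoso": ["ProjectA", "ProjectB"],
    "fabrikam": ["ProjectC"],
}

ODATA_VERSION = "v3.0-preview"

def work_items_url(org: str, project: str) -> str:
    """Analytics OData WorkItems feed URL for one organization/project pair."""
    return (f"https://analytics.dev.azure.com/{org}/{project}"
            f"/_odata/{ODATA_VERSION}/WorkItems")

def all_feeds(orgs_projects: dict) -> list:
    """One feed URL per (organization, project) pair. In Power BI, the org
    and project segments are exactly what Manage Parameters substitutes."""
    return [work_items_url(org, proj)
            for org, projects in orgs_projects.items()
            for proj in projects]

for url in all_feeds(ORGS_PROJECTS):
    print(url)
```

In Power BI itself you would point OData.Feed at these URLs (or parameterize one query and duplicate it per pair), then append the resulting tables into a single work items table.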

Minimum permission to view live data - Azure Kubernetes

I have enabled Kubernetes RBAC authorization in all my Azure Kubernetes clusters. Now I need to grant permissions for viewing live data in the Containers tab.
How can I do it? What is the minimum permission needed?
Thanks
As far as I understand from my investigation, if you want to do it using Azure Built-in roles, you need the following three roles at the very least:
Reader role assignment scoped to the AKS cluster to be able to discover the cluster
Azure Kubernetes Service Cluster User Role role assignment scoped to the AKS cluster as mentioned in the note atop the page in the docs. This is needed to allow access to Microsoft.ContainerService/managedClusters/listClusterUserCredential/action API call. This API call lists the cluster user credentials.
Log Analytics Contributor role assignment scoped to the Log Analytics workspace associated with the AKS cluster. This is needed to execute an Analytics query for data, i.e., to perform a /workspaces/{workspaceId}/query API call. More here.
This should let one pull up the live data for containers. If you aren't comfortable with this approach, you might also create a Custom Role allowing only those exact actions.
Hope this helps!
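If you go the custom-role route, the definition would combine the actions behind the three built-in roles above. This is only a sketch: the Microsoft.OperationalInsights action strings in particular should be verified against `az role definition list` output before use, and the assignable scope is a placeholder:

```json
{
  "Name": "AKS Live Data Viewer (custom, sketch)",
  "Description": "Approximate minimal actions for the Containers live data tab.",
  "Actions": [
    "Microsoft.ContainerService/managedClusters/read",
    "Microsoft.ContainerService/managedClusters/listClusterUserCredential/action",
    "Microsoft.OperationalInsights/workspaces/read",
    "Microsoft.OperationalInsights/workspaces/search/action",
    "Microsoft.OperationalInsights/workspaces/analytics/query/action"
  ],
  "NotActions": [],
  "AssignableScopes": [
    "/subscriptions/00000000-0000-0000-0000-000000000000"
  ]
}
```

Assign the custom role at the AKS cluster scope and the Log Analytics workspace scope, mirroring how the three built-in roles above are scoped.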

Azure Service Fabric Scale up and out

I'm new to Azure Service Fabric. I have created the smallest possible (3 x A0) cluster for testing my stateless application. Ideally I wanted to use F1 instances, but for some reason they were not available in the cluster creation wizard.
Now I'm trying to understand how I can manage the instance count and size for my existing cluster, but I can't see any related menu options in Resource Manager.
Please advise.
I've decided to convert my comment to an answer, since there are a number of help documents covering this:
https://learn.microsoft.com/en-us/azure/service-fabric/service-fabric-cluster-resource-manager-introduction
https://learn.microsoft.com/en-us/azure/service-fabric/service-fabric-cluster-scale-up-down
https://learn.microsoft.com/en-us/azure/service-fabric/service-fabric-cluster-fabric-settings
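The short version of those docs: each Service Fabric node type is backed by a VM scale set, so changing the instance count (scale out) or VM size (scale up) means updating that scale set rather than the cluster resource itself. A hedged Python sketch that only builds the ARM PATCH request; the subscription, resource group, scale-set name, and api-version are placeholders:

```python
import json

# All identifiers below are placeholders for illustration.
SUBSCRIPTION = "00000000-0000-0000-0000-000000000000"
RESOURCE_GROUP = "my-sf-rg"
VMSS_NAME = "nt1vm"            # the scale set backing the node type
API_VERSION = "2022-11-01"     # check the current Compute api-version

def vmss_update_request(capacity: int, sku_name: str) -> tuple:
    """(url, body) for an ARM PATCH that scales the node type's scale set.

    Raising `capacity` scales out; changing `sku_name` scales up. Note the
    Service Fabric docs describe extra steps for safely changing VM size
    on a production cluster, and warn against dropping below the
    reliability tier's minimum node count.
    """
    url = (f"https://management.azure.com/subscriptions/{SUBSCRIPTION}"
           f"/resourceGroups/{RESOURCE_GROUP}"
           f"/providers/Microsoft.Compute/virtualMachineScaleSets/{VMSS_NAME}"
           f"?api-version={API_VERSION}")
    body = {"sku": {"name": sku_name, "capacity": capacity}}
    return url, json.dumps(body)

url, body = vmss_update_request(capacity=5, sku_name="Standard_D2_v2")
print(url)
print(body)
```

The same update can be made in the portal on the scale set resource, or by redeploying the cluster's ARM template with a new instance count; the REST request above is just the underlying call.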
