Access data across Azure tenants through Immuta - Azure

I have a web UI and Immuta on the same Azure tenant, and an external data source (ADLS) on another Azure tenant. The data is in ADLS, but I can't expose direct access to ADLS; instead, Immuta should access the data through Synapse or some other solution. Does anyone have information or expertise on how this can be achieved?

Related

External tables in Kusto - Cross tenant ADLS storage account as data source

I have an ADLS storage account in tenant X and a multi-tenant App registration that has read access to the ADLS storage account.
The storage account has terabytes of data coming in daily,
so a copy activity into Kusto in tenant Y through Synapse seems like a poor choice.
Then I learned about external tables in Kusto, but my Kusto cluster in tenant Y does not have access to the ADLS account in tenant X.
I want Kusto to use my multi-tenant App registration from tenant X to authenticate itself to ADLS and read data through external tables.
How can this be achieved?
Is there any documentation on this?
If this is not possible, how do people transfer terabytes of data from one tenant to another tenant's Kusto cluster?
You could consider Azure Data Share or Purview Data Share, which support in-place sharing of ADLS data across organizations, subject to terms and conditions created by the owner and accepted by the receiver.
This provides separation of concerns and accurate charge-back, and you end up with a tenant-local abfss connection string.
It's worth looking into the performance characteristics of such a setup.
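If external tables do turn out to be viable, the table is defined with a `.create external table` control command on the tenant-Y cluster, with the credential embedded after a `;` in the storage connection string. The sketch below just assembles that command string; the table name, schema, container URI, and SAS placeholder are all hypothetical, and which credential kinds your cluster accepts should be checked against the Kusto storage connection string docs.

```python
def build_external_table_command(table: str, schema: str,
                                 storage_uri: str, secret: str) -> str:
    """Assemble a Kusto `.create external table` control command.

    The storage connection string carries the credential (e.g. a SAS
    token) after a ';', so the cluster in tenant Y can read the
    tenant-X storage account without a cross-tenant identity.
    """
    connection = f"h@'{storage_uri};{secret}'"
    return (
        f".create external table {table} ({schema}) "
        f"kind=storage dataformat=parquet ({connection})"
    )

cmd = build_external_table_command(
    "RawEvents",                                              # hypothetical table
    "Timestamp: datetime, Payload: string",                   # hypothetical schema
    "https://tenantxaccount.blob.core.windows.net/events",    # hypothetical container
    "<SAS-token>",                                            # placeholder secret
)
print(cmd)
```

The resulting string would then be run as a control command against the database (for example via the `azure-kusto-data` client or the Kusto web UI).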

Restrict Users permission to access data in ADLS

Is it possible to allow only specific users from databricks to access specific data from Azure Data Lake Storage?
I want to allow only User 1 and User 2 to access data1.csv file and allow User 3 and User 4 to access data2.csv file.
Credential passthrough is a Premium feature in Azure Databricks that lets users authenticate to Azure Data Lake Storage using the Azure Active Directory identity they are logged into Azure Databricks with. With this feature, customers can control which users can access which data through Azure Databricks.
This feature needs to be enabled on the cluster; once configured, users can log in and execute read/write commands against Azure Data Lake Storage without needing a service principal. A user can only read or write data according to the roles and ACLs that user has been granted on the Data Lake Store.
Refer - https://learn.microsoft.com/en-us/azure/databricks/security/credential-passthrough/adls-passthrough#--enable-azure-data-lake-storage-credential-passthrough-for-a-standard-cluster
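Under passthrough, what each user may do is decided by the POSIX-style ACLs on the files in ADLS Gen2. A rough, simplified illustration of that evaluation order (named-user entry first, then matching named-group entries, then "other"; real ADLS additionally checks RBAC role assignments, masks, and default ACLs):

```python
def can_access(acl: dict, principal: str, groups: set, perm: str) -> bool:
    """Simplified POSIX-style ACL check.

    A named-user entry, if present, decides alone. Otherwise any
    matching named-group entry that grants the permission allows
    access; if group entries match but none grant it, access is
    denied without falling through to 'other'.
    """
    user_entry = acl.get(("user", principal))
    if user_entry is not None:
        return perm in user_entry
    matched_group = False
    for g in groups:
        entry = acl.get(("group", g))
        if entry is not None:
            matched_group = True
            if perm in entry:
                return True
    if matched_group:
        return False
    return perm in acl.get(("other", ""), "")

# Hypothetical ACL on data1.csv: only user1 and user2 may read.
acl_data1 = {
    ("user", "user1"): "r",
    ("user", "user2"): "r",
    ("other", ""): "",
}
print(can_access(acl_data1, "user1", set(), "r"))  # True
print(can_access(acl_data1, "user3", set(), "r"))  # False
```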

External team partial access to Azure Data Lake

Our team has an Azure Data Lake Gen2. Another team would like to write data into the Data Lake, but they should not be able to view our contents. How can I achieve this?
I think partial access is not possible and we would need to create another Data Lake for the external team to put data into. Am I correct?
You can combine RBAC with folder-level ACLs on the data lake: create a new folder where the external team has write permissions and no permissions on any other folders.
Another option is to create a separate container in the storage account and control access to it via IAM.
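The folder-level option maps to a small set of ADLS Gen2 ACL entries. A sketch of what you would grant for a write-only drop folder (the group object ID and folder name are placeholders; in practice you would apply these entries with Azure Storage Explorer, `az storage fs access`, or the SDK):

```python
def external_drop_acls(group_oid: str, drop_folder: str) -> dict:
    """ACL entries for a write-only drop folder in ADLS Gen2.

    '--x' on the root lets the external group traverse to its folder
    without listing anything; '-wx' on the drop folder lets it write
    files there without reading them back. With no entries anywhere
    else, the rest of the lake stays invisible to that group.
    """
    return {
        "/": f"group:{group_oid}:--x",          # traverse only, no listing
        drop_folder: f"group:{group_oid}:-wx",  # write + traverse in their folder
    }

acls = external_drop_acls("00000000-0000-0000-0000-000000000000", "/incoming")
for path, entry in sorted(acls.items()):
    print(path, entry)
```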

Azure Cross Directory Data Access

I'm currently developing an Azure solution for one of my managed-service clients.
We are developing a Power BI service for their Azure Backup / Azure Recovery.
We are looking to host the whole process in our own Azure environment; however, we cannot get the data from a) their Recovery Services vault logs into b) our Azure environment.
Does anyone have ideas on how to move data from their environment into our environment's storage?
thank you
Power BI-based reporting gets its data from the storage accounts that store Azure Backup data. Once the customer has configured diagnostic settings to send data to a storage account (ask them to create a dedicated storage account for this), they can share the access keys with you so that you can connect to the customer's storage account, pull the required data, and run the Power BI report in your environment.
The doc has all the details; the only change in this case is that the customer stores data in their own storage account and provides you access to it via access key.
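Once the customer shares an access key, connecting from your tenant is just a standard Azure Storage connection string; the key, not an AAD identity, authorizes access, which is why it works across tenants. A sketch (account name and key are placeholders):

```python
def storage_connection_string(account: str, access_key: str) -> str:
    """Build a standard Azure Storage connection string from an
    account name and shared access key. Works cross-tenant because
    authorization is by key, not by AAD identity."""
    return (
        "DefaultEndpointsProtocol=https;"
        f"AccountName={account};"
        f"AccountKey={access_key};"
        "EndpointSuffix=core.windows.net"
    )

conn = storage_connection_string("backupreports", "<access-key>")
print(conn)
```

This string is what you would paste into Power BI's Azure Blob Storage connector (or pass to a storage SDK client) on your side.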

Manage Authorization To folders in Azure Data Lake from Excel

I am developing an Azure data lake and I want to connect Excel to the data lake.
How do you authorize users to see the data from Excel?
I have used two test users and given them different access to the resource group, the services, etc., but they still don't get access. Only I have access.
Is it possible to restrict access so that Excel can only see one specific folder in the data lake?
The normal way to do this is with an app registration, but I cannot see how to connect an app registration to Excel.
Users must be authenticated via ADFS and granted global permissions. You can specify O365 credentials and grant AAD access.
https://blogs.msdn.microsoft.com/freddyk/2018/06/29/aad-authentication/
https://learn.microsoft.com/en-us/azure/analysis-services/analysis-services-manage-users
You can apply access control in Azure Data Lake so that users can only see certain folders. https://learn.microsoft.com/en-us/azure/data-lake-store/data-lake-store-access-control#common-scenarios-related-to-permissions
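One detail from those common permission scenarios that often explains "they just don't get access": to read a single file, a user needs Read on the file itself plus Execute on every folder above it, all the way to the root. A sketch of the entries required (path and principal OID are hypothetical):

```python
from pathlib import PurePosixPath

def acls_for_read(file_path: str, principal: str) -> dict:
    """ACL entries a user needs to read one file in ADLS Gen2:
    'r--' on the file itself and '--x' on each ancestor directory,
    so the user can traverse down to the file without listing or
    reading anything else along the way."""
    path = PurePosixPath(file_path)
    entries = {str(parent): f"user:{principal}:--x" for parent in path.parents}
    entries[str(path)] = f"user:{principal}:r--"
    return entries

entries = acls_for_read("/finance/reports/data.csv", "aaa-oid")
for p, e in sorted(entries.items()):
    print(p, e)
```

Forgetting the Execute bit on even one parent folder is enough to make the file unreachable from Excel, even though the file-level ACL grants Read.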
