Connect Excel to Azure Data Lake Gen2 with OAuth

The Microsoft documentation of Excel's connection options says that Microsoft 365 Apps for enterprise can connect to Azure Blob Storage and Azure Data Lake Storage.
I want to connect to my Data Lake Gen2 from Excel, so I go to the Data tab: Get Data > From Azure > From Azure Blob Storage.
Here is the question: how do I connect to my Gen2 data lake with Azure AD / OAuth / username (user@domain.com) and password? Storage account key access is disabled on this data lake, which rules out both shared key and shared access signature (SAS) authentication.

I can see only a UI connector for ADLS Gen1.
Alternatively, for ADLS Gen2, here is something I tried: you can use a user delegation SAS, which is secured with Azure AD credentials.
Generate a user delegation SAS URL for the blob you want to access (a rough sketch is shown after these steps).
In Excel, select the Web source under Get Data and enter that URL.
If prompted for credentials, select Anonymous, since the SAS already carries the authorization.
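Here is a rough Python sketch of generating such a user delegation SAS with the Azure SDK (azure-identity and azure-storage-blob); the account and container names below are placeholders, and the signed-in user needs a data-plane role such as Storage Blob Data Reader:

```python
from datetime import datetime, timedelta, timezone
from azure.identity import InteractiveBrowserCredential
from azure.storage.blob import BlobServiceClient, BlobSasPermissions, generate_blob_sas

account_name = "<storage-account>"   # placeholder
container = "<container>"            # placeholder
blob_name = "simple.csv"

# Sign in with Azure AD (no account key or shared SAS needed).
service = BlobServiceClient(
    account_url=f"https://{account_name}.blob.core.windows.net",
    credential=InteractiveBrowserCredential(),
)

start = datetime.now(timezone.utc)
expiry = start + timedelta(hours=1)

# The user delegation key is issued against the Azure AD identity.
delegation_key = service.get_user_delegation_key(start, expiry)

sas = generate_blob_sas(
    account_name=account_name,
    container_name=container,
    blob_name=blob_name,
    user_delegation_key=delegation_key,
    permission=BlobSasPermissions(read=True),
    expiry=expiry,
)

# Paste this URL into Excel's From Web dialog and choose Anonymous when prompted.
print(f"https://{account_name}.blob.core.windows.net/{container}/{blob_name}?{sas}")
```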

First I want to thank KarthikBhyresh-MT for the input that inspired me to find the right solution.
The Solution
First I found the URL to the desired file in the data lake in the Azure portal.
I copied the URL and changed the word blob to dfs, so that it points at the Data Lake Storage endpoint.
In Excel (Office 365) I chose Get Data > From Web and entered the altered URL. Excel then shows a sign-in prompt where I can sign in to the data lake with OAuth (organizational account) credentials.
Then I could load the simple.csv file into Excel, work on it, and see the transactions in the data lake logs.
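The same idea can be sketched outside Excel to verify that the Azure AD sign-in and permissions work against the dfs endpoint; a rough Python example with azure-identity and azure-storage-file-datalake (account and container names are placeholders):

```python
from azure.identity import InteractiveBrowserCredential
from azure.storage.filedatalake import DataLakeServiceClient

account_name = "<storage-account>"   # placeholder
container = "<container>"            # the container / file system holding simple.csv

# Same trick as the Excel connection: use the dfs endpoint and authenticate
# with Azure AD instead of an account key or SAS.
service = DataLakeServiceClient(
    account_url=f"https://{account_name}.dfs.core.windows.net",
    credential=InteractiveBrowserCredential(),
)

file_client = service.get_file_system_client(container).get_file_client("simple.csv")
data = file_client.download_file().readall()
print(data.decode("utf-8")[:200])    # print the start of the CSV as a quick check
```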

Related

Cause for 'This endpoint does not support BlobStorageEvents or SoftDelete' in Azure Data Factory

I am developing a data factory that downloads a CSV file from a source and writes it to an Azure Storage account that I have read/write rights on. Everything looks good and it gets validated, but when I (test) run it, I keep getting the error:
This endpoint does not support BlobStorageEvents or SoftDelete. Please disable these account features if you would like to use this endpoint.\", 409, HEAD
I checked; the source and sink are DIFFERENT files in different locations, and I do have a successful connection on both endpoints. What else can I check to fix this?
From the error it seems the wrong connector is being used for the storage type. Could you please double-check whether your account is an Azure Blob Storage account or an Azure Data Lake Storage Gen2 account? In the storage account's Settings, under the Configuration section, you can find these details. Depending on the type of your storage account, use the matching connector for the copy activity in your pipeline: the Azure Blob Storage connector if the account is plain Blob Storage, or the ADLS Gen2 connector if it is a general-purpose v2 account with the hierarchical namespace enabled.
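If you want to check this programmatically rather than in the portal, here is a rough sketch with the azure-mgmt-storage SDK (subscription, resource group, and account names are placeholders); a general-purpose v2 account with is_hns_enabled set to True is an ADLS Gen2 account:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient

subscription_id = "<subscription-id>"   # placeholder
client = StorageManagementClient(DefaultAzureCredential(), subscription_id)

account = client.storage_accounts.get_properties("<resource-group>", "<storage-account>")
print(account.kind)             # e.g. "StorageV2" for a general-purpose v2 account
print(account.is_hns_enabled)   # True => hierarchical namespace, i.e. ADLS Gen2 connector
```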

Is it possible to open Azure Storage Explorer using direct link with SAS token?

When I want to launch Azure Storage Explorer using a direct link, what I do is copy it from the app and paste it into my browser. It looks something like this:
storageexplorer:// ... and there are subId, AccountId etc.
My question is: is it possible to produce a URI like the one above using a SAS token, so that I can open a specific container from my web browser in Azure Storage Explorer?
I'm from the Microsoft for Founders Hub team. With this connection method, you'll use a SAS URI for the required storage account. You'll need a SAS URI whether you want to use a file share, table, queue, or blob container. You can get a SAS URI either from the Azure portal or from Storage Explorer.
To add a SAS connection:
Open Storage Explorer.
Connect to your Azure storage account.
Select the connection type: shared access signature (SAS) URI.
Provide a meaningful name for the connection.
When you're prompted, provide the SAS URI.
Review and verify the connection details, and then select Connect.
When you've added a connection, it appears in the resource tree as a new node. You'll find the connection node in this branch: Local & attached > Storage Accounts > Attached Container > Service.
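If you prefer to script the SAS URI instead of copying it from the portal, here is a rough sketch for a blob container using the azure-storage-blob SDK (account name, key, and container name are placeholders):

```python
from datetime import datetime, timedelta, timezone
from azure.storage.blob import generate_container_sas, ContainerSasPermissions

account_name = "<storage-account>"   # placeholder
account_key = "<account-key>"        # placeholder
container = "<container>"            # placeholder

sas = generate_container_sas(
    account_name=account_name,
    container_name=container,
    account_key=account_key,
    permission=ContainerSasPermissions(read=True, list=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=8),
)

# Paste this URI into Storage Explorer's "Shared access signature (SAS) URI" connection dialog.
sas_uri = f"https://{account_name}.blob.core.windows.net/{container}?{sas}"
print(sas_uri)
```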

How to load a specific directory in a Azure container into powerBI

I want to load data into Power BI from a specific directory in an Azure container in an Azure storage account. I have tried Get Data -> Azure -> Azure Blob Storage, but it gets all the data in that container, and I need only the data from a specific directory in that container.
Any help appreciated.
This capability is not supported in Power BI as of today. Here are some similar ideas on the Power BI Ideas forum that you can vote for:
Allow PowerBI to connect to subfolder in azure blob storage
Allow power bi to connect to a particular blob file from a subfolder in the Azure Blob Store

Unable to connect to Azure data lake store via SSIS

I have been trying to connect my SSIS package (on-premises) to my Data Lake Store. I have installed the Azure Feature Pack, which has worked fine.
But when I create a Data Lake connection in my SSIS package, I need the following.
Image of the SSIS Azure Data Lake Connection Manager
ADLS Host – which is fine, I know how to get that.
Authentication (Azure AD User Identity)
UserName & Password – which I am having issues with.
My question is: how do I define a username and password for my data lake?
They are the credentials of an Azure AD user within the same subscription as your Azure Data Lake. Usually it is the email address and password that you use to log in to the Azure portal.
For more details, you can refer to this documentation.

Connect Pentaho to Azure Blob storage

I was recently thrown into dealing with databases and Microsoft Azure Blob storage.
As I am completely new to this field, I am having some trouble:
I can't figure out how to connect to the Blob storage from Pentaho, and I could not find any good information online on this topic either.
I would be glad for any information on how to set up this connection.
You can update the core-site.xml like so: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.5/bk_cloud-data-access/content/authentication-wasb.html
This will give you access to the Azure Blob Storage account.
Go to the file in Azure Blob storage, generate a SAS token and URL, and copy the URL only. In PDI, select the Hadoop file input step. Double-click the Hadoop file input, select Local for the Environment, insert the Azure URL in the File/Folder field, and that's it. You should see the file in PDI.
I figured it out eventually.
Pentaho provides an HTTP element in which you can, amongst other things, specify a URL.
In Microsoft Azure Blob storage you can generate a SAS token. If you use the URL made from the storage resource URI and the SAS token as the input for the URL field in the HTTP element, Pentaho can access the respective file in the Blob storage (see the example below).
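For example (a rough sketch; the token and URI below are placeholders copied from the portal's Generate SAS blade), the value for the URL field is just the resource URI with the SAS token appended as a query string:

```python
# Placeholders: replace with the values the Azure portal shows when you generate a SAS.
sas_token = "sv=...&se=...&sr=b&sp=r&sig=..."
resource_uri = "https://<storage-account>.blob.core.windows.net/<container>/path/to/file.csv"

# This combined URL goes into the URL field of Pentaho's HTTP element
# (or the File/Folder field of the Hadoop file input step).
url = f"{resource_uri}?{sas_token}"
print(url)
```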
