I have an ASP.NET application deployed to Azure App Service, and it uploads files to Azure Blob Storage. The issue is that my blob container is publicly accessible. I want the images and docs in the blobs to be accessible only while a user is logged in to my application; after logging out, they should no longer be accessible. How can I achieve this with Azure Blob Storage?
In this case, you can configure the web app with Azure AD authentication (Easy Auth); follow this doc.
After doing the steps in the doc above, it will create an AD App in your AAD tenant. Navigate to the AD App in Azure Active Directory in the portal -> API permissions, and add the delegated permission for Azure Storage.
Then navigate to the authsettings of the web app in the resource explorer and add ["resource=https://storage.azure.com"] to the additionalLoginParams; see this blog for details.
Navigate to the storage account in the portal -> Access control (IAM) -> make sure the user account has a role e.g. Storage Blob Data Contributor, if not, add it for the user, follow this doc.
After doing the steps above, use the user account to log in to the web app. You can get the access token from https://webappname.azurewebsites.net/.auth/me, then use the token to call the Storage REST API - Get Blob to access the contents of the storage container.
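That last step can be sketched in Python. This is only a rough outline: the account, container, and blob names and the token value are placeholders, and in the real app the token would come from the /.auth/me response after the Easy Auth login.

```python
# Sketch: call the Get Blob REST operation with an Azure AD bearer token.
# Account/container/blob names and the token are placeholders; in the web
# app you would read the token from /.auth/me after Easy Auth login.
import urllib.request

def build_get_blob_request(account: str, container: str, blob: str,
                           access_token: str) -> urllib.request.Request:
    """Build the authenticated Get Blob request (not yet sent)."""
    url = f"https://{account}.blob.core.windows.net/{container}/{blob}"
    return urllib.request.Request(url, headers={
        "Authorization": f"Bearer {access_token}",
        "x-ms-version": "2020-10-02",  # any recent service version works
    })

req = build_get_blob_request("mystorage", "docs", "report.pdf", "<token>")
# data = urllib.request.urlopen(req).read()  # uncomment with a real token
```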
This seems like a simple thing, but I've been unable to find an answer
I'm using Azure Storage, and I have a folder in the container that holds some sensitive documents. I'd like to be able to "hide" the folder and its contents from the directory listing. Is this possible? The only thing I can find is changing the Access Level of the entire container.
If there's not a way to do this, what's the proper way to hide Azure Storage docs from public access?
There is no feature in Azure Storage to hide the contents of a folder within a container.
There is an alternative approach to securing the files in the container.
Thanks to Niclas for suggesting it.
Assign RBAC roles to your Azure Container, like below.
Azure Portal > Storage accounts > Select your storage account > Container > Select your container > Access Control (IAM)
When I tried with another Azure AD user who doesn't have any RBAC role on the container, that user was unable to access it.
I then assigned Storage Blob Data Contributor to the same user on the storage account.
Once the RBAC role is assigned, the same user can access the container's files.
Reference: Assign an Azure role for access to blob data.
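As a side note on what the failure looks like: when an authenticated user lacks a data-plane role, Storage returns HTTP 403 with an AuthorizationPermissionMismatch error code. A small sketch of recognizing that response; the XML body below is hand-written to mirror the service's error format, not captured from a real request:

```python
# Sketch: recognize the 403 error Azure Storage returns when the signed-in
# user has no RBAC data role on the container. The sample body mirrors the
# service's error format; it is not a captured response.
import xml.etree.ElementTree as ET

SAMPLE_403_BODY = b"""<?xml version="1.0" encoding="utf-8"?>
<Error>
  <Code>AuthorizationPermissionMismatch</Code>
  <Message>This request is not authorized to perform this operation using this permission.</Message>
</Error>"""

def storage_error_code(body: bytes) -> str:
    """Extract the <Code> element from a Storage error response body."""
    return ET.fromstring(body).findtext("Code")

code = storage_error_code(SAMPLE_403_BODY)
# AuthorizationPermissionMismatch means the caller authenticated fine but
# lacks a data-plane role such as Storage Blob Data Contributor.
```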
I am looking for examples to fetch access token for azure storage account access via azure active directory with service principal in python
It seems like https://github.com/AzureAD/azure-activedirectory-library-for-python/blob/dev/sample/client_credentials_sample.py doesn't support service principal access
are there other ways in fetching token via service principal?
• Yes, you can fetch an access token for an Azure storage account via Azure Active Directory using a service principal, i.e., an app registration, by following the steps in this documentation link:
https://learn.microsoft.com/en-us/azure/developer/python/sdk/authentication-azure-hosted-apps?tabs=azure-portal%2Cazure-app-service
As per the above documentation, you will have to host your Python application code in a web app service and create a system-assigned managed identity for it. Once created, an application will appear in Azure AD with the same ‘Object ID’ as shown in the managed identity section of the web app service. Before moving on to this application, assign the required roles to the system-assigned managed identity through the ‘IAM’ tab. So, in your case, you should assign the ‘Storage Account Contributor’ role to the system-assigned managed identity created for the web app service.
• Once the role has been assigned, go to ‘Enterprise applications’ and search for the ‘Object ID’ of the managed identity; you will find an application with the name of the web app service. In it, go to ‘Permissions’ under the ‘Security’ tab and grant the required permissions and admin consent to the application. The permissions shown are those allowed under the scope of ‘Storage Account Contributor’, and you must assign the needed permissions from that set to the app/service principal. Then ensure that you correctly reference this application's environment variables and implement ‘DefaultAzureCredential’ from the ‘azure.identity’ module. For this purpose, refer to the following subsection of the documentation:
https://learn.microsoft.com/en-us/azure/developer/python/sdk/authentication-azure-hosted-apps?tabs=azure-portal%2Cazure-app-service#3---implement-defaultazurecredential-in-your-application
In this way, you can fetch an access token for the Azure storage account via Azure Active Directory using a service principal.
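If you do want a plain service principal (client ID + secret) rather than a managed identity, the token fetch is the standard OAuth2 client-credentials flow, which MSAL (the successor to the ADAL library linked in the question) performs under the hood. A minimal sketch; the tenant ID, client ID, and secret below are placeholders:

```python
# Sketch: fetch a token for Azure Storage with a service principal via the
# OAuth2 client-credentials flow. Tenant/client IDs and the secret are
# placeholders; the request is built but not sent here.
import urllib.parse
import urllib.request

def build_token_request(tenant_id: str, client_id: str, client_secret: str):
    """Return (url, form_body) for the v2.0 client-credentials token call."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        # ".default" requests the app's configured permissions on Storage
        "scope": "https://storage.azure.com/.default",
    }).encode()
    return url, body

url, body = build_token_request("<tenant-id>", "<client-id>", "<secret>")
# import json
# token = json.loads(urllib.request.urlopen(url, body).read())["access_token"]
```

With the azure-identity package, `ClientSecretCredential(tenant_id, client_id, client_secret).get_token("https://storage.azure.com/.default")` performs the same call for you.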
How can you access a Storage Account blob container without using an Account Key?
I can access data in Storage Account blobs in Power BI... but it needs the account Access Key!
Is there some way to access the data using some other authentication approach, i.e. an app registration, service principal, managed identity, whatever ?
You could use a Shared Access Signature (SAS) to connect to Blob Storage in Power BI. This guide will help you.
If you access blobs with Azure AD, that doesn't seem to integrate with Power BI. And there is a .NET code sample about creating a block blob.
Azure AD authenticates the security principal (a user, group, or service principal) running the application. If authentication succeeds, Azure AD returns the access token to the application, and the application can then use the access token to authorize requests to Azure Blob storage or Queue storage.
For more information, see the document describing the options that Azure Storage offers for authorizing access to resources.
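For illustration, the Put Blob REST call that such a block-blob sample boils down to can be sketched as follows; the account, container, and blob names and the token are placeholders:

```python
# Sketch: the Put Blob REST call that creates a block blob. Account,
# container, and blob names and the token are placeholders.
import urllib.request

def build_put_blob_request(account: str, container: str, blob: str,
                           data: bytes, token: str) -> urllib.request.Request:
    """Build the authenticated Put Blob request (not yet sent)."""
    url = f"https://{account}.blob.core.windows.net/{container}/{blob}"
    return urllib.request.Request(url, data=data, method="PUT", headers={
        "Authorization": f"Bearer {token}",
        "x-ms-blob-type": "BlockBlob",  # marks the new blob as a block blob
        "x-ms-version": "2020-10-02",
    })

req = build_put_blob_request("myaccount", "reports", "hello.txt",
                             b"hello", "<token>")
# urllib.request.urlopen(req)  # uncomment with a real token
```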
I'm building a SPA web app and API on MS Azure. The application needs to authenticate users that aren't part of the organization's Azure AD Directory (and shouldn't be). We are using a B2C directory (tied to the same subscription) for this with local users.
We need to store a file in Azure Blob Storage. We set up the container in the organization's AD Tenant. I want the SPA application to retrieve the file directly from Blob Storage. I tried exposing the Blob Storage permissions in the app registrations which are in the B2C Directory, but it won't allow exposing the permission because the Directory doesn't have a subscription (I did register the B2C Tenant with the organization's subscription).
So my question is - do I need to set this up as a multi-tenant situation between the organization's Directory and the B2C Directory? So setup an app registration in the organization's directory, make it multi-tenant, and expose the needed Blob Storage permissions? Or is there a better way to do this?
As far as I know, an Azure AD B2C local user account cannot be used to do Azure AD auth and then access Azure blobs. The email address used to create the account with your 'Sign in / Sign up' user policy is only the "SignInName"; it can only be used to complete Azure AD B2C authentication. To complete Azure AD authentication, we need the "userPrincipalName", but your users cannot obtain it by themselves, and for security reasons Microsoft does not recommend that customers provide the "userPrincipalName" to their users.
So if you want to access Azure blobs in your application, I suggest you create a SAS token for the blob or container you want to access. For more details, please refer to the document.
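To make the SAS suggestion concrete, here is a sketch of what a blob SAS URL carries. The URL below is a made-up example (the signature is a placeholder); a real one would be issued by your API, which runs in the organization's tenant and can hand a short-lived read-only URL to the SPA:

```python
# Sketch: the anatomy of a blob SAS URL. This URL is a made-up example;
# a real one comes from the portal or from a generate-SAS call.
from urllib.parse import urlsplit, parse_qs

sas_url = ("https://myaccount.blob.core.windows.net/docs/report.pdf"
           "?sv=2020-10-02&sp=r&st=2023-01-01T00%3A00%3A00Z"
           "&se=2023-01-02T00%3A00%3A00Z&sig=<signature>")

params = parse_qs(urlsplit(sas_url).query)
permissions = params["sp"][0]  # "r" = read-only
expiry = params["se"][0]       # the URL stops working after this instant
```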
I'm using Azure Storage Explorer to connect to storage accounts that I've created by hand on Azure. However when I go to browse the storage account that was created by Azure when I created a Media Services account, I'm unable to connect to it.
I'm using blob.core.windows.net as the storage endpoint domain, and setting the storage account name and storage account key to be the same as Azure has defined it in the dashboard, but attempts to connect (with or without HTTPS) result in a 502-Bad Gateway HTTP error.
I'd like an easy way to browse all media files I've created without having to write special code. Has anyone been able to get this to work?
All storage accounts, regardless of how they were created, are browsable with Storage Explorer!
For storage accounts created when you create a Media Services account, you have to use the Storage Account Name and Storage Account Key, not the Media Services Account Name and Media Services Account Key! You will not be able to access the Storage service with the Media key, and vice versa.
When you create a Media Services account, one/multiple storage accounts could be attached to a particular media services account. Let's say your account name is "MediaStorage123". I believe you need to pass the following data to storage explorer:
Account name/key: this can be found at the bottom of your storage account page in the Azure portal: press the Manage Keys button and you will see the data.
storage endpoint domain: Not sure why you need this, but if so, you can see the information in Dashboard of your media services account: https://xxx.blob.core.windows.net/.
Hope this helps.
Just for the record, in my case (behind a proxy) I had to install a previous version of the Azure Storage Explorer.