Azure Storage Account: Blob service (SAS) connectivity check FAILED

We created a new Storage Account on Azure.
When we perform the Connectivity Check, it shows that the Blob service (SAS) endpoint is not accessible, with the message "Public access is not permitted on this storage account." The status code is 409.
The Storage Account was upgraded from V1 to General-Purpose V2. Is that causing this issue?
Also, the "Generate SAS and connection string" button on the "Shared access signature" blade is disabled and greyed out.
How do we create and enable this endpoint?
My search so far doesn't point to any way to create/enable this through the Portal.
Is it possible only through the REST API?
The Blob service (SRP) check and the Shared Access Signature check are successful. There is no private endpoint or firewall configured, and access is allowed from "All networks".
Accessing a blob from the client side with the Storage Account Key via an API currently fails with status code 403.
However, we are able to successfully fetch blob details from Microsoft Azure Storage Explorer connected with the Storage Account's connection string.
Additional details:
I can also see that the "Blob service (Azure AD)" endpoint is not accessible, but the "Queue service (Azure AD)" endpoint is.

I faced a similar issue; it seems that by default the "Allowed resource types" option is unchecked. Select at least one option and the "Generate SAS and connection string" button becomes enabled.
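If the Portal button stays greyed out, an equivalent account SAS can also be generated in code. Below is a minimal sketch using the azure-storage-blob Python SDK (the account name and key are placeholders); note that here, too, at least one resource type and one permission must be selected explicitly, which mirrors the "Allowed resource types" requirement above:

```python
from datetime import datetime, timedelta, timezone
from azure.storage.blob import (
    generate_account_sas, ResourceTypes, AccountSasPermissions, BlobServiceClient,
)

ACCOUNT_NAME = "<storage-account-name>"   # placeholder
ACCOUNT_KEY = "<storage-account-key>"     # placeholder

# resource_types plays the role of "Allowed resource types" in the Portal:
# at least one of service/container/object must be enabled.
sas_token = generate_account_sas(
    account_name=ACCOUNT_NAME,
    account_key=ACCOUNT_KEY,
    resource_types=ResourceTypes(service=True, container=True, object=True),
    permission=AccountSasPermissions(read=True, list=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=1),
)

# Quick sanity check: list containers using the SAS instead of the account key.
client = BlobServiceClient(
    account_url=f"https://{ACCOUNT_NAME}.blob.core.windows.net",
    credential=sas_token,
)
for container in client.list_containers():
    print(container.name)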

Related

Restricting access to storage account containing package blob for cloud service (extended support) deployment

I'm nearly done migrating our cloud service (classic) deployments to cloud service (extended support). I'm working now on updating the deployment pipelines. My package blob is located in a storage account. I create a SAS for the blob and use an API call to management.azure.com to create/update the deployment, passing the ARM template as the body of the request.
This works correctly as long as the storage account with the package blob has its network set to "allow access from all networks". I want to restrict this access, so I set access to be allowed only from:
specific IP addresses of our devops servers
our own IP addresses
private vnet/subnets for the cloud service
I also tick the "Allow Azure services on the trusted services list to access this storage account" checkbox.
Yet the API call fails with an error message indicating that access to the blob is not allowed. When I change the storage account network configuration back to "allow access from all networks", everything works correctly.
After a lot of searching, I found only one hit describing the same problem (https://github.com/Azure/azure-powershell/issues/20299), yet no solution was suggested other than allowing access from all networks.
I must be missing some trick - but what is it? How can I restrict access to the storage account?
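For context, the SAS-creation step described in the question looks roughly like the sketch below, assuming the azure-storage-blob Python SDK (all account, container and blob names are illustrative, not from the original post). The resulting URL is what the deployment request ultimately consumes, which is why the failure surfaces on the management.azure.com call even though it is the storage data plane that rejects the fetch:

```python
from datetime import datetime, timedelta, timezone
from azure.storage.blob import generate_blob_sas, BlobSasPermissions

# Illustrative names only.
ACCOUNT = "deploypackages"
CONTAINER = "packages"
BLOB = "cloudservice.cspkg"
ACCOUNT_KEY = "<storage-account-key>"

# Read-only SAS for the package blob; the resulting URL is the one
# referenced by the deployment (e.g. from the ARM template body).
sas = generate_blob_sas(
    account_name=ACCOUNT,
    container_name=CONTAINER,
    blob_name=BLOB,
    account_key=ACCOUNT_KEY,
    permission=BlobSasPermissions(read=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=2),
)
package_url = f"https://{ACCOUNT}.blob.core.windows.net/{CONTAINER}/{BLOB}?{sas}"
print(package_url)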

Azure Export creation failed. SAS token access to user storage is not supported

I am trying to create an Azure billing export via the Azure Portal using a SAS token. I want to export costs from tenant A to a storage container in tenant B.
I generated the SAS token in the storage account following the tutorial, the only change being an extended expiry date. The "Allow storage account key access" setting is enabled on the storage account.
I can connect to the storage account via Storage Explorer using the generated SAS token, but when I try to create the export I get the error:
Export creation failed.SAS token access to user storage is not supported.
I can't find anything about this error in the Azure documentation or anywhere else on the web.
I tried generating the token with the Azure CLI, the storage account's Shared access signature blade, the container's Shared access signature option, and Storage Explorer.
I have not generated a user delegation SAS, because I need long-term access.
The problem was caused by a lack of the permissions needed to perform that action. I don't know which specific role grants them, but Global Admin resolved it.
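For what it's worth, the Storage Explorer check described in the question can be reproduced in code. A small sketch with the azure-storage-blob Python SDK (the container URL and SAS are placeholders): if this listing succeeds, the SAS itself is valid, which is consistent with the failure being an authorization problem on the export side, as noted above.

```python
from azure.storage.blob import ContainerClient

# Placeholder container URL with the generated SAS token appended.
container_url = "https://<account>.blob.core.windows.net/<container>?<sas-token>"

# Lists blobs using only the SAS; success means the token works for data access.
client = ContainerClient.from_container_url(container_url)
for blob in client.list_blobs():
    print(blob.name)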

Azure Storage Account Firewall Permissions for Vulnerability Assessment

I have created a storage account for use in storing the results of an Azure Vulnerability Assessment on an Azure SQL Database.
If the firewall on the storage account is disabled, allowing access from all networks, Azure Vulnerability Scans work as expected.
If the firewall is enabled, the Azure Vulnerability Scan on the SQL Database reports an error, saying the storage account is not valid or does not exist.
Checking the "Allow Azure services on the trusted services list to access this storage account" box in the storage account's Networking properties does not resolve the issue, though it is the recommended step in the documentation here: https://learn.microsoft.com/en-us/azure/azure-sql/database/sql-database-vulnerability-assessment-storage
What other steps could resolve this issue, rather than just disabling the firewall?
You have to add the vnet and subnet being used by the SQL Managed Instance, as mentioned in the document you are following: in the storage account's Networking settings, select the vnet and subnet, enable the service endpoint (Microsoft.Storage) on that subnet if prompted, and click Add.
After the vnet has been added, click Save and the issue should be resolved.
Reference:
Store Vulnerability Assessment scan results in a storage account accessible behind firewalls and VNets - Azure SQL Database | Microsoft Docs
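The same firewall change can also be scripted. A minimal sketch with the azure-mgmt-storage Python SDK, assuming the subnet already has the Microsoft.Storage service endpoint enabled as described above (all resource names and IDs are placeholders):

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import (
    StorageAccountUpdateParameters, NetworkRuleSet, VirtualNetworkRule, DefaultAction,
)

SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "<resource-group>"
STORAGE_ACCOUNT = "<storage-account>"
# Resource ID of the subnet used by the SQL instance (placeholder).
SUBNET_ID = (
    "/subscriptions/<subscription-id>/resourceGroups/<rg>"
    "/providers/Microsoft.Network/virtualNetworks/<vnet>/subnets/<subnet>"
)

client = StorageManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Deny by default, keep the trusted-services bypass, and add the vnet/subnet rule.
client.storage_accounts.update(
    RESOURCE_GROUP,
    STORAGE_ACCOUNT,
    StorageAccountUpdateParameters(
        network_rule_set=NetworkRuleSet(
            default_action=DefaultAction.DENY,
            bypass="AzureServices",
            virtual_network_rules=[
                VirtualNetworkRule(virtual_network_resource_id=SUBNET_ID)
            ],
        )
    ),
)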

How to provide access to storage account in Azure using service principal

I have an account in the Azure portal. When I try to access a specific storage account, it says "Access denied" and I am unable to open the blob container.
But I created the storage account using a service principal that has full access.
Now I want to view the storage account using my Azure user account.
Is there way to provide permissions using service principal?
According to your description, it seems that the storage account in question has a firewall configured or is restricted to virtual networks.
If "Allow access from all networks" is acceptable, you could choose that, or you could add your local machine's public IP to allow access from the internet.

Why can't Azure Storage Explorer connect to storage accounts created by Azure Media Services?

I'm using Azure Storage Explorer to connect to storage accounts that I've created by hand on Azure. However, when I go to browse the storage account that Azure created when I created a Media Services account, I'm unable to connect to it.
I'm using blob.core.windows.net as the storage endpoint domain, and setting the storage account name and storage account key to the same values Azure shows in the dashboard, but attempts to connect (with or without HTTPS) result in a 502 Bad Gateway HTTP error.
I'd like an easy way to browse all media files I've created without having to write special code. Has anyone been able to get this to work?
All storage accounts, regardless of how they were created, are browsable with Storage Explorer!
For storage accounts created when you create a Media Services account, you have to use the Storage Account Name and Storage Account Key, not the Media Services Account Name and Media Services Account Key! You will not be able to access the Storage service with the Media key, and vice versa.
When you create a Media Services account, one or more storage accounts can be attached to it. Let's say your account name is "MediaStorage123". I believe you need to pass the following data to Storage Explorer:
Account name/key: these can be found at the bottom of your storage account page in the Azure portal; press the Manage Keys button and you will see them.
Storage endpoint domain: not sure why you need this, but if so, you can find it in the dashboard of your Media Services account: https://xxx.blob.core.windows.net/.
Hope this helps.
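The same rule applies when connecting in code: use the attached storage account's name and key, never the Media Services key. A minimal sketch with the azure-storage-blob Python SDK (values are placeholders) that lists the asset containers:

```python
from azure.storage.blob import BlobServiceClient

# Use the *storage* account name and key attached to the Media Services
# account, not the Media Services account name/key.
client = BlobServiceClient(
    account_url="https://<storage-account-name>.blob.core.windows.net",
    credential="<storage-account-key>",
)
for container in client.list_containers():
    print(container.name)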
Just for the record, in my case (behind a proxy) I had to install a previous version of Azure Storage Explorer.
