We have VAPT findings requiring us to put a storage account behind a private endpoint.
The storage account is used by Azure CDN as an origin.
After adding the private endpoint, Azure CDN can no longer access the storage account and returns an error as an XML page.
How can Azure CDN access a storage account that uses a private endpoint?
You need to allow access through the storage account firewall. For Microsoft CDN (classic), that would be the range 147.243.0.0/16, as found in the Microsoft documentation.
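For reference, a rough sketch of adding that range to the account's firewall rules with the Python management SDK (azure-mgmt-storage); the subscription, resource group and account names below are placeholders, not values from the question:

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.storage import StorageManagementClient
    from azure.mgmt.storage.models import IPRule, StorageAccountUpdateParameters

    client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")

    # Read the current network rules and append the CDN range to the IP allow-list.
    account = client.storage_accounts.get_properties("<resource-group>", "<storage-account>")
    rules = account.network_rule_set
    rules.ip_rules = (rules.ip_rules or []) + [IPRule(ip_address_or_range="147.243.0.0/16")]

    client.storage_accounts.update(
        "<resource-group>",
        "<storage-account>",
        StorageAccountUpdateParameters(network_rule_set=rules),
    )

The equivalent change can also be made in the portal on the storage account's Networking blade.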
If you've placed your storage account behind a private link, I imagine you probably want to protect its content from unwanted access.
I think #travisez13's solution would allow anything running in Azure to access the storage account directly, assuming they could guess the names.
I think you may want to try this approach instead: https://learn.microsoft.com/en-us/azure/cdn/cdn-sas-storage-support
Related
I'm nearly done migrating our cloud service (classic) deployments to cloud service (extended support), and I'm now working on updating the deployment pipelines. My package blob is located in a storage account. I create a SAS for the blob and use an API call to management.azure.com to create/update the deployment, passing the ARM template as the body of the request.
This works correctly as long as the storage account with the package blob has its network set to "allow access from all networks". I want to restrict this access, so I set allow access from:
specific IP addresses of our devops servers
our own IP addresses
private vnet/subnets for the cloud service
I also tick the "Allow Azure services on the trusted services list to access this storage account" checkbox.
Yet the API call fails with an error message indicating that access to the blob is not allowed. When I change the storage account network configuration back to "allow access from all networks", everything works correctly.
After lots of searching, I found only one hit describing the same problem - https://github.com/Azure/azure-powershell/issues/20299 - yet no solution has been suggested other than allowing access from all networks.
I must be missing some trick - but what is it? How can I restrict access to the storage account?
From the Azure portal side, I would like to programmatically and periodically create a service SAS token. Once a token has been created, it should expire in one week, and a new token also valid for one week should be created, and so on. I was reading this article https://learn.microsoft.com/it-it/azure/storage/blobs/sas-service-create?tabs=dotnet but I am not sure where that code should run. In an Azure VM? I can't give internet access to the VM.
The code from the article can be run from any compute service.
If that is the sole purpose of the compute resource, I would pick Logic Apps so that everything is managed for you; it may have a connector to do it, or you can embed some JavaScript.
Should that not be sufficient, I would use an Azure Function.
You can also use a VM if that is more suitable and restrict/block its internet access.
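To make it concrete, here is a minimal sketch of that kind of SAS-generation code in Python (azure-storage-blob) rather than the .NET from the article; the account, container and key values are placeholders. It could be dropped into a timer-triggered Function or a scheduled job to reissue a one-week token:

    # pip install azure-storage-blob
    from datetime import datetime, timedelta, timezone
    from azure.storage.blob import ContainerSasPermissions, generate_container_sas

    ACCOUNT_NAME = "<storage-account>"      # placeholder
    CONTAINER_NAME = "<container>"          # placeholder
    ACCOUNT_KEY = "<storage-account-key>"   # placeholder

    def create_weekly_sas() -> str:
        """Create a read/list service SAS for the container that expires in one week."""
        return generate_container_sas(
            account_name=ACCOUNT_NAME,
            container_name=CONTAINER_NAME,
            account_key=ACCOUNT_KEY,
            permission=ContainerSasPermissions(read=True, list=True),
            expiry=datetime.now(timezone.utc) + timedelta(days=7),
        )

    print(create_weekly_sas())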
If you need to restrict internet access, you must make sure your blob storage is still reachable. Your options are:
Open the relevant firewall/NSG rules to that storage account
Use service endpoints and service endpoint policies
Project a Private Link endpoint for the storage account into the VNet
I have enabled Virtual Network and Firewall access restrictions for an Azure Storage Account, but I've hit the issue that I do not have access to the Storage Account from Azure Functions (an ASE environment), despite the fact that the ASE public address is added as an exception. Additionally, I have added all of the environment's virtual networks just to make sure.
Is there any way to check from which address the Functions/other services are trying to access the storage account?
Also, I have ticked "Allow trusted Microsoft services to access this storage account". I'm not sure what is included in "trusted Microsoft services".
In the Application Insights Function logs, only a timeout error appears, without additional explanation.
Could you please help me understand how to properly configure the storage account access restrictions?
Have a look at this doc:
https://learn.microsoft.com/en-us/azure/storage/common/storage-network-security#trusted-microsoft-services
From your description, I think you haven't granted an RBAC role to your Azure Function to access the storage account.
Do these steps: enable a managed identity on the Function app, then assign it an RBAC role (for example, Storage Blob Data Reader) on the storage account.
If you need to do more operations, like working with the data, you will need to add more RBAC roles; have a look at this official doc to learn more about the RBAC roles:
https://learn.microsoft.com/en-us/azure/role-based-access-control/built-in-roles#all
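If it helps, here is a minimal sketch of how the Function could then reach the storage account through its managed identity with the Python SDK (azure-identity plus azure-storage-blob); the account and container names are placeholders:

    from azure.identity import DefaultAzureCredential
    from azure.storage.blob import BlobServiceClient

    # DefaultAzureCredential picks up the Function's managed identity at runtime.
    credential = DefaultAzureCredential()
    service = BlobServiceClient(
        account_url="https://<storage-account>.blob.core.windows.net",
        credential=credential,
    )

    # Simple read test: list blobs in one container.
    container = service.get_container_client("<container>")
    for blob in container.list_blobs():
        print(blob.name)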
I hosted a simple website in Azure Storage using the static website feature. The URL of the website is now publicly available (anyone with the URL can access the website). But my intention is to provide access only to the users I choose. Is there a way to restrict public access to a static website hosted in Azure Storage?
Static website hosting makes the files available for anonymous access. If you need to control who can access the files, you can store the files in Azure Blob storage and then generate shared access signatures (SAS) to limit access.
The links in the pages delivered to the client must specify the full URL of the resource. If the resource is protected with a valet key, such as a shared access signature, this signature must be included in the URL.
https://learn.microsoft.com/en-us/azure/architecture/patterns/static-content-hosting
You can try configuring a CDN endpoint to hit a private Blob container through SAS tokens (do not use the Static Website feature, because that endpoint is completely public). Azure CDN supports this scenario natively; in the worst case you can write URL rewrite rules to redirect requests to the Blob endpoint with SAS tokens.
Using Azure CDN with SAS
You can use SAS (shared access signatures).
You can keep the blobs of your static website as private access (only the blob owner can access them with the storage account key).
Then you can have a simple service that authenticates and authorizes your clients (if there are many) and generates SAS tokens for them to access the blobs (web pages). This service can also renew the tokens for them.
If it's a limited number of people, you can generate the SAS tokens and simply share the links with your clients.
You can do this at the granularity of a blob (web page), so you can authorize some users to read some pages while they can't read others, etc.
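As an illustration only, a hypothetical sketch of such a token-issuing helper in Python (azure-storage-blob); the account name, container, key and allow-list are all made up for the example:

    from datetime import datetime, timedelta, timezone
    from azure.storage.blob import BlobSasPermissions, generate_blob_sas

    ACCOUNT_NAME = "<storage-account>"      # placeholder
    CONTAINER_NAME = "$web"                 # static website container
    ACCOUNT_KEY = "<storage-account-key>"   # placeholder

    # Hypothetical allow-list: which users may read which pages (blobs).
    ALLOWED_PAGES = {
        "alice": {"index.html", "reports.html"},
        "bob": {"index.html"},
    }

    def page_url_for(user: str, blob_name: str) -> str:
        """Return a time-limited read-only URL for a page, or raise if not authorized."""
        if blob_name not in ALLOWED_PAGES.get(user, set()):
            raise PermissionError(f"{user} may not read {blob_name}")
        sas = generate_blob_sas(
            account_name=ACCOUNT_NAME,
            container_name=CONTAINER_NAME,
            blob_name=blob_name,
            account_key=ACCOUNT_KEY,
            permission=BlobSasPermissions(read=True),
            expiry=datetime.now(timezone.utc) + timedelta(hours=1),
        )
        return f"https://{ACCOUNT_NAME}.blob.core.windows.net/{CONTAINER_NAME}/{blob_name}?{sas}"

    print(page_url_for("alice", "index.html"))

Note that the URL points at the blob endpoint (which enforces the SAS), not the anonymous static website endpoint.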
I'm using Azure Storage Explorer to connect to storage accounts that I've created by hand on Azure. However, when I go to browse the storage account that Azure created when I created a Media Services account, I'm unable to connect to it.
I'm using blob.core.windows.net as the storage endpoint domain, and setting the storage account name and storage account key to be the same as Azure has defined them in the dashboard, but attempts to connect (with or without HTTPS) result in a 502 Bad Gateway HTTP error.
I'd like an easy way to browse all media files I've created without having to write special code. Has anyone been able to get this to work?
All storage accounts, regardless of how they were created, are browsable with Storage Explorer!
For storage accounts created when you create a Media Services account, you have to use the Storage Account Name and Storage Account Key, not the Media Services Account Name and Media Services Account Key. You will not be able to access the Storage service with the Media key, and vice versa.
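For what it's worth, a quick way to sanity-check that you have the right name/key pair (outside Storage Explorer) is a couple of lines with the Python SDK (azure-storage-blob); the values below are placeholders:

    from azure.storage.blob import BlobServiceClient

    ACCOUNT_NAME = "<storage-account-name>"   # the storage account, not the Media Services account
    ACCOUNT_KEY = "<storage-account-key>"     # the storage key, not the Media Services key

    service = BlobServiceClient(
        account_url=f"https://{ACCOUNT_NAME}.blob.core.windows.net",
        credential=ACCOUNT_KEY,  # the account key string can be passed directly as the credential
    )

    for container in service.list_containers():
        print(container.name)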
When you create a Media Services account, one or more storage accounts can be attached to that Media Services account. Let's say your account name is "MediaStorage123". I believe you need to pass the following data to Storage Explorer:
Account name/key: this can be found at the bottom of your storage account page in the Azure portal: press the Manage Keys button and you will see the data.
Storage endpoint domain: not sure why you need this, but if so, you can see the information in the Dashboard of your media services account: https://xxx.blob.core.windows.net/.
Hope this helps.
Just for the record, in my case (behind a proxy) I had to install a previous version of Azure Storage Explorer.