Restricting access to a storage account containing the package blob for a Cloud Service (Extended Support) deployment - Azure

I'm nearly done migrating our Cloud Service (classic) deployments to Cloud Service (extended support). I'm now working on updating the deployment pipelines. My package blob is located in a storage account. I create a SAS for the blob and use an API call to management.azure.com to create/update the deployment, passing an ARM template as the body of the request.
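For context, a minimal sketch of that flow in Python, assuming the azure-storage-blob, azure-identity, and requests packages; the account, template file, and parameter names are placeholders, not the actual pipeline code:

    import json
    from datetime import datetime, timedelta, timezone
    import requests
    from azure.identity import DefaultAzureCredential
    from azure.storage.blob import BlobSasPermissions, generate_blob_sas

    ACCOUNT = "<storage-account>"
    CONTAINER = "<container>"
    BLOB = "cloudservice.cspkg"              # hypothetical package blob name
    ACCOUNT_KEY = "<storage-account-key>"

    # Short-lived read-only service SAS so the resource provider can fetch the package.
    sas = generate_blob_sas(
        account_name=ACCOUNT,
        container_name=CONTAINER,
        blob_name=BLOB,
        account_key=ACCOUNT_KEY,
        permission=BlobSasPermissions(read=True),
        expiry=datetime.now(timezone.utc) + timedelta(hours=2),
    )
    package_url = f"https://{ACCOUNT}.blob.core.windows.net/{CONTAINER}/{BLOB}?{sas}"

    # Create/update the deployment by PUTting the ARM template to management.azure.com.
    arm_template = json.load(open("cses.template.json"))    # hypothetical template file
    token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token
    deployment_uri = (
        "https://management.azure.com/subscriptions/<sub-id>/resourceGroups/<rg>"
        "/providers/Microsoft.Resources/deployments/cses-deploy?api-version=2021-04-01"
    )
    body = {
        "properties": {
            "mode": "Incremental",
            "template": arm_template,
            "parameters": {"packageSasUri": {"value": package_url}},  # hypothetical parameter
        }
    }
    requests.put(deployment_uri, json=body,
                 headers={"Authorization": f"Bearer {token}"}).raise_for_status()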
This works correctly as long as the storage account with the package blob has its network access set to "allow access from all networks". I want to restrict this access, so I set "allow access from" to:
the specific IP addresses of our DevOps servers
our own IP addresses
the private VNet/subnets used by the cloud service
I also tick the "Allow Azure services on the trusted services list to access this storage account" checkbox.
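A rough sketch of that network configuration applied programmatically, assuming the azure-mgmt-storage SDK; the subscription, resource group, addresses, and subnet ID are placeholders:

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.storage import StorageManagementClient
    from azure.mgmt.storage.models import (
        IPRule, NetworkRuleSet, StorageAccountUpdateParameters, VirtualNetworkRule)

    client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")
    rules = NetworkRuleSet(
        default_action="Deny",                  # "selected networks" only
        bypass="AzureServices",                 # the trusted-services checkbox
        ip_rules=[
            IPRule(ip_address_or_range="203.0.113.10"),      # DevOps server (example IP)
            IPRule(ip_address_or_range="198.51.100.0/24"),   # our own addresses (example range)
        ],
        virtual_network_rules=[VirtualNetworkRule(virtual_network_resource_id=(
            "/subscriptions/<subscription-id>/resourceGroups/<rg>/providers"
            "/Microsoft.Network/virtualNetworks/<vnet>/subnets/<subnet>"))],
    )
    client.storage_accounts.update(
        "<resource-group>", "<storage-account>",
        StorageAccountUpdateParameters(network_rule_set=rules))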
Yet the API call fails with an error message indicating that access to the blob is not allowed. When I change the storage account network configuration back to "allow access from all networks", everything works correctly.
After a lot of searching, I found only one hit describing the same problem - https://github.com/Azure/azure-powershell/issues/20299 - but no solution has been suggested there other than allowing access from all networks.
I must be missing some trick - but what is it? How can I restrict access to the storage account?

Related

"PackageUriForbidden" error when trying to deploy an Azure Cloud Service ES

I am trying to redeploy an Azure Cloud Service (classic) as an "extended support" one, since the former is being deprecated. Following this guide and the prerequisites, I have created a virtual network and a new storage account. I set up a bunch of permissions, and the Connectivity Check for my storage account indicates no problems. However, when I try to create and deploy a new Cloud Service (Extended Support) using my (updated) .cscfg, .csdef and .cspkg files, I get this error:
Error:AuthorizationFailure, message:This request is not authorized to perform this operation. (Code: PackageUriForbidden)
I've tried setting the container and blob access to public for the deployment files, and I have added the Network Contributor and Storage Blob Data Contributor roles for my user account on both the subscription and the storage resources. What am I missing?
I tried deploying a Cloud Service (Extended Support) via the Azure portal and it was deployed successfully.
I uploaded all my cloud service packages to my storage account, used those packages from my storage blobs, and created the Cloud Service (Extended Support) instance.
I had enabled connections from all networks on my storage account, so I did not receive any authorization error.
It looks like your storage account has the firewall and VNet enabled for selected networks, or has an IP address restriction configured.
I then created a service endpoint in my VNet for Microsoft.Storage.
I added this VNet to the selected networks in the storage account's firewall.
I also checked the option to allow Azure services on the trusted services list to access the storage account.
Now, when I try to deploy another cloud service with the same storage account having the firewall and VNet enabled, I get the same error as yours.
Once I allowed my client machine's IP in the storage account, I was able to add the packages without any error while deploying the cloud service.
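The service endpoint step can also be done programmatically; a minimal sketch assuming the azure-mgmt-network SDK, with placeholder resource names:

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.network import NetworkManagementClient
    from azure.mgmt.network.models import ServiceEndpointPropertiesFormat

    client = NetworkManagementClient(DefaultAzureCredential(), "<subscription-id>")

    # Read the existing subnet, append the Microsoft.Storage service endpoint, write it back.
    subnet = client.subnets.get("<resource-group>", "<vnet>", "<subnet>")
    subnet.service_endpoints = (subnet.service_endpoints or []) + [
        ServiceEndpointPropertiesFormat(service="Microsoft.Storage")]
    client.subnets.begin_create_or_update(
        "<resource-group>", "<vnet>", "<subnet>", subnet).result()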

Accessing Azure Storage Accounts with Selected Network Enabled

As per the requirements, I need to enable the firewall with "selected networks" for Azure Storage Accounts. But when I do so, along with adding all required IPs, the Azure Function App and Azure Data Factory stop working.
Currently a VNet is unavailable and cannot be created. Managed Identity is not an option, as the Contributor role is unavailable.
Is there a way to configure the Data Factory and Function Apps after enabling the firewall with selected networks for Azure Key Vault and Azure Storage Accounts?
The steps below should help work around this:
Is there a way to configure the Data Factory and Function Apps after enabling the firewall with selected networks for Azure Key Vault and Azure Storage Accounts?
When network rules such as specific IP addresses, IP ranges, or subnets are configured on a storage account, that storage account can only be accessed by applications that request data over the specified set of networks or through the specified set of Azure resources.
Also, make sure the "Allow trusted services" option is set to ON when enabling the firewall for the storage account; this allows connectivity from trusted Azure services such as Data Factory and Azure Functions.
See this documentation for the list of trusted services allowed to access a key vault in Azure.
You have to create a VNet and attach it to the Azure Function App, which allows it to connect to the storage account.
Currently a VNet is unavailable and cannot be created. Managed Identity is not an option, as the Contributor role is unavailable.
To enable a service endpoint for a subnet or IP addresses attached to the storage account, you can use a custom role that includes the Microsoft.Network/virtualNetworks/subnets/joinViaServiceEndpoint/action permission.
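A minimal sketch of creating such a custom role through the ARM REST API; the role name and scope are placeholders, and the action list contains only the permission mentioned above:

    import uuid
    import requests
    from azure.identity import DefaultAzureCredential

    scope = "/subscriptions/<subscription-id>"
    token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token
    role_definition = {
        "properties": {
            "roleName": "Service Endpoint Joiner",       # hypothetical role name
            "description": "Can join subnets via service endpoints.",
            "type": "CustomRole",
            "permissions": [{
                "actions": ["Microsoft.Network/virtualNetworks/subnets/joinViaServiceEndpoint/action"],
                "notActions": [],
            }],
            "assignableScopes": [scope],
        }
    }
    requests.put(
        f"https://management.azure.com{scope}/providers/Microsoft.Authorization"
        f"/roleDefinitions/{uuid.uuid4()}?api-version=2022-04-01",
        json=role_definition,
        headers={"Authorization": f"Bearer {token}"},
    ).raise_for_status()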
Refer to the Microsoft documentation for more information.

Azure Storage Account Firewall Permissions for Vulnerability Assessment

I have created a storage account for use in storing the results of an Azure Vulnerability Assessment on an Azure SQL Database.
If the firewall on the storage account is disabled, allowing access from all networks, Azure Vulnerability Scans work as expected.
If the firewall is enabled, the Azure Vulnerability Scan on the SQL Database reports an error, saying the storage account is not valid or does not exist.
Checking the "Allow Azure services on the trusted services list to access this storage account" box in the storage account's Networking settings does not resolve this issue, though it is the recommended step in the documentation here: https://learn.microsoft.com/en-us/azure/azure-sql/database/sql-database-vulnerability-assessment-storage
What other steps could resolve this issue, rather than just disabling the firewall?
You have to add the subnet and VNet being used by the SQL Managed Instance, as mentioned in the document you are following.
After enabling the service endpoint on that subnet, click Add; the VNet then appears under the selected networks.
Once this is done, click Save, and the issue should be resolved.
Reference:
Store Vulnerability Assessment scan results in a storage account accessible behind firewalls and VNets - Azure SQL Database | Microsoft Docs

Programmatically create a service SAS token for Storage Account in Azure

I would like to programmatically and periodically create a service SAS token for a storage account. Once a token has been created, it should expire in one week, at which point a new token, also valid for one week, is created, and so on. I was reading this article https://learn.microsoft.com/it-it/azure/storage/blobs/sas-service-create?tabs=dotnet but I am not sure where that code should run. In an Azure VM? I can't give internet access to the VM.
The code from the article can be run from any compute service.
If that is the sole purpose of the compute resource, I would pick Logic Apps to have everything managed for you; it may have a connector to do it or you can embed some JavaScript.
Should that not be sufficient, I would use an Azure Function.
You can also use a VM if that is more suitable and restrict/block its internet access.
If you need to restrict internet access, you must make sure your blob storage is still reachable; your options are:
Open whichever firewall/NSG rules are needed to reach that storage account
Use service endpoints and service endpoint policies
Project a Private Link endpoint into the VNet from the storage account
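For the token itself, here is a minimal sketch of the weekly service SAS in Python with the azure-storage-blob SDK (the linked article uses .NET); the account, key, and blob names are placeholders, and the scheduling would come from whichever compute option above runs it on a timer:

    from datetime import datetime, timedelta, timezone
    from azure.storage.blob import BlobSasPermissions, generate_blob_sas

    def weekly_blob_sas(account_name: str, account_key: str,
                        container_name: str, blob_name: str) -> str:
        """Return a read-only service SAS for one blob, valid for seven days."""
        now = datetime.now(timezone.utc)
        return generate_blob_sas(
            account_name=account_name,
            container_name=container_name,
            blob_name=blob_name,
            account_key=account_key,
            permission=BlobSasPermissions(read=True),
            start=now - timedelta(minutes=5),      # small buffer for clock skew
            expiry=now + timedelta(days=7),
        )

    sas = weekly_blob_sas("<account>", "<account-key>", "<container>", "<blob>")
    print(f"https://<account>.blob.core.windows.net/<container>/<blob>?{sas}")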

Azure Storage Account: Blob service (SAS) Connectivity Check FAILED

We created a new Storage Account on Azure.
When we perform the Connectivity Check, it shows that the Blob service (SAS) endpoint is not accessible, with the message "Public access is not permitted on this storage account." The status code is 409.
The Storage Account was upgraded from V1 to General-Purpose V2. Is that causing this issue?
Also, "Generate SAS and connection string" button in "Shared access signature" is disabled and greyed out.
How do we create and enable this endpoint?
My search so far doesn't point to any solution to create/enable this over the Portal.
Is it possible only through the REST API?
The Blob service (SRP) check and the Shared Access Signature check are successful. There is no private endpoint or firewall configured, and access is allowed from "All networks".
Accessing the blob from the client side with the storage account key via an API currently fails with error code 403.
Also, we are able to fetch the blob details successfully from Microsoft Azure Storage Explorer connected with the storage account's connection string.
Additional details:
I can also see that the "Blob service (Azure AD)" endpoint is not accessible, but the "Queue service (Azure AD)" endpoint is.
I faced a similar issue; it seems that by default the "Allowed resource types" option is unchecked. Select any one of the options and the "Generate SAS and connection string" button becomes enabled.
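The programmatic equivalent shows the same requirement: an account SAS must specify at least one allowed resource type. A minimal sketch with the azure-storage-blob SDK, using placeholder credentials:

    from datetime import datetime, timedelta, timezone
    from azure.storage.blob import (
        AccountSasPermissions, ResourceTypes, generate_account_sas)

    sas = generate_account_sas(
        account_name="<account>",
        account_key="<account-key>",
        # Mirrors the portal's "Allowed resource types" checkboxes; at least one is required.
        resource_types=ResourceTypes(service=True, container=True, object=True),
        permission=AccountSasPermissions(read=True, list=True),
        expiry=datetime.now(timezone.utc) + timedelta(hours=1),
    )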
