Accessing Azure Storage services from a different subscription

We are looking to deploy our Azure cloud services to multiple subscriptions, but we want to be able to access the same storage accounts for storing blobs and tables. Is it possible to access storage accounts across different subscriptions using just the storage account name and key?
Our data connection string takes the standard form (account name plus key).
When we try to use it, it always tries to find the endpoint for the given account name within the current subscription.

If I understood your question correctly...
"able to access the same Storage accounts"
Via the Azure Management Portal: you can only access storage accounts that belong to the current subscription.
Via Visual Studio: you can attach (and manage) a storage account outside your current Visual Studio <-> Azure login by using the account name and key.
Via Code: you can access a storage account (blob, queue, table) from all your apps with a storage connection string (don't put it in code); see the sketch below.
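For the "Via Code" case, the service endpoints are derived from the account name in the connection string, not from the subscription the calling code is deployed in, so the same connection string works from a cloud service in any subscription. A minimal sketch with the classic storage SDK (the account name and key below are placeholders):
private static CloudBlobClient GetBlobClient()
{
    // Placeholder name/key; the storage account can live in a different subscription,
    // because the endpoint is resolved purely from the AccountName.
    var connectionString =
        "DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=<account-key>";

    CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
    return account.CreateCloudBlobClient();
}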
If you want, you can also restrict blob access with CORS settings, something like this:
private static void InitializeCors(CloudBlobClient blobClient)
{
    // Fetch the current blob service properties, add the CORS rules, then save them back.
    ServiceProperties blobServiceProperties = blobClient.GetServiceProperties();
    ConfigureCors(blobServiceProperties);
    blobClient.SetServiceProperties(blobServiceProperties);
}

private static void ConfigureCors(ServiceProperties prop)
{
    // Each allowed origin must be added as a separate entry, not as one comma-separated string.
    var cors = new CorsRule();
    cors.AllowedOrigins.Add("http://www.domain1.net");
    cors.AllowedOrigins.Add("http://www.domain2.it");
    cors.AllowedMethods = CorsHttpMethods.Get;
    cors.MaxAgeInSeconds = 3600;
    prop.Cors.CorsRules.Add(cors);
}
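One way to wire this up (assuming an ASP.NET-style app and the InitializeCors(CloudBlobClient) signature above) is to call InitializeCors once during application startup, so the CORS rules are saved to the blob service before the first cross-origin request arrives.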

Related

Access denied to storage account from Azure Data Factory

My goal is to run an exe file stored in a private Azure Blob container.
The exe is simple: it creates a text file, writes the current datetime in it, and then pushes it to the private Azure Blob container.
This has to be triggered from Azure Data Factory. Here is my environment:
An Azure Data Factory running this simple pipeline:
https://i.stack.imgur.com/txQ9r.png
Private storage account with the following configuration :
https://i.stack.imgur.com/SJrGX.png
A linked service connected to the storage account :
https://i.stack.imgur.com/8xW5l.png
A private managed virtual network approved :
https://i.stack.imgur.com/G2DH3.png
A linked service connected to an Azure Batch :
https://i.stack.imgur.com/Yaq6C.png
A batch account linked to the right storage account
A pool running on this batch account
Two things I need to add for context:
When I set the storage account to public, it works and I find the text file in my blob storage. So the process works well, but there is a security issue somewhere I can't find.
All the resources used (ADF, Blob Storage, Batch account) have a Contributor/Owner role on the blob via a managed identity.
Here is the error I get when I set the storage account to private:
{
    "errorCategory": 0,
    "code": "BlobAccessDenied",
    "message": "Access for one of the specified Azure Blob(s) is denied",
    "details": [
        {
            "Name": "BlobSource",
            "Value": "https://XXXXXXXXXXXXXXXXX/testv2.exe?sv=2018-03-28&sr=b&sig=XXXXXXXXXXXXXXXXXX&sp=r"
        },
        {
            "Name": "FilePath",
            "Value": "D:\\batch\\tasks\\workitems\\XXXXXXXXXXX\\job-1\\XXXXXXXXXXXXXXXXXXXXXXXX\\testv2.exe"
        }
    ]
}
Thank you for your help!
Solution found via Azure community support:
Check the subnet information under Network Configuration in the Azure portal > Batch Account > Pool > Properties, and write it down.
Navigate to the storage account and select Networking. Under Firewalls and virtual networks, set Public network access to 'Enabled from selected virtual networks and IP addresses', and add the Batch pool's subnet to the firewall allowlist.
If the subnet doesn't have the service endpoint enabled, a notification will be displayed when you select it, as follows:
The following networks don't have service endpoints enabled for 'Microsoft.Storage'. Enabling access will take up to 15 minutes to complete. After starting this operation, it is safe to leave and return later if you don't wish to wait.
Therefore, before you add the subnet, check the Batch virtual network to see whether the 'Microsoft.Storage' service endpoint is enabled on it.
After you complete the configurations above, the Batch nodes in the pool can access the storage account successfully.
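As a quick check after making these changes, the same kind of read-only SAS download that Batch performs for its resource files should succeed when run from a node inside the allow-listed subnet. A minimal sketch with the classic SDK (Microsoft.WindowsAzure.Storage.Blob and System.IO; the SAS URI and local path are hypothetical):
private static async Task DownloadViaSasAsync()
{
    // Hypothetical read-only SAS URI, same shape as the BlobSource value in the error above.
    var sasUri = new Uri("https://mystorageaccount.blob.core.windows.net/tools/testv2.exe?sv=2018-03-28&sr=b&sp=r&sig=<signature>");

    // The SAS token is embedded in the URI, so no separate credentials are needed.
    var blob = new CloudBlockBlob(sasUri);

    // This fails with BlobAccessDenied until the pool's subnet is allowed through the storage firewall.
    await blob.DownloadToFileAsync(@"D:\batch\tasks\testv2.exe", FileMode.Create);
}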

For the VM diagnostics agent we create a storage account; if that storage account does not allow public access, do we need to create a private endpoint for both Blob and Table for it?

As per this Microsoft documentation, the VM uses both Blob and Table storage for storing the diagnostic logs.
As described here, the StorageType can be Table, Blob, or TableAndBlob. Let's say we are using TableAndBlob.
The storage account that we are creating for this purpose will only be accessible via private endpoints. So, do I need to create a private endpoint for both blob and table, with the private DNS zones privatelink.blob.core.windows.net and privatelink.table.core.windows.net?
Also, if I choose Table as the StorageType, do I just need the table (privatelink.table.core.windows.net) private endpoint?
I tried to reproduce the same setup in my environment with public access to the storage account disallowed. You can create a private endpoint for both blob and table as shown below:
In your storage account -> Networking (under Security + networking) -> Private endpoint connections -> Create.
Make sure that, under Firewalls and virtual networks, you enable 'Allow Azure services on the trusted services list to access this storage account', as below.
While creating the private endpoint connection, you can select blob or table as the target sub-resource, as below.
A DNS zone such as privatelink.blob.core.windows.net is also created automatically for the private endpoint connection; I created endpoints for both blob and table.
You can check your DNS configuration under Settings once you click on your table or blob private endpoint.
Additionally, you can connect to the storage account through Storage Explorer: right-click -> Connect -> Next.
You can make use of Azure Private Link for storage accounts through a virtual network.
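A quick way to confirm that both private endpoints and their DNS zones are wired up is to resolve the public blob and table host names from a VM inside the virtual network; with the privatelink zones linked, both should resolve to private IP addresses. A small sketch using System.Net.Dns (the account name is a placeholder):
private static async Task CheckPrivateEndpointDnsAsync()
{
    // Placeholder account name; replace with your diagnostics storage account.
    const string account = "mydiagstorage";

    foreach (var host in new[] { account + ".blob.core.windows.net", account + ".table.core.windows.net" })
    {
        // Inside the VNet, the privatelink CNAME should resolve to the private endpoint IP (10.x.x.x).
        IPAddress[] addresses = await Dns.GetHostAddressesAsync(host);
        Console.WriteLine($"{host} -> {string.Join(", ", addresses)}");
    }
}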

Connecting Blob Storage to a Synapse Workspace with Public Network Workspace Access Disabled

I'm trying to connect blob storage from my RG storage account to the Data tab in my synapse workspace, but I get the following error: "The public network interface on this Workspace is not accessible. To connect to this Workspace, use the Private Endpoint from inside your virtual network or enable public network access for this workspace."
Public network access to my workspace must be disabled for company reasons. I made private endpoint connections on my synapse resource to Dev, Sql, and Sql-On-Demand, but I'm not sure where to go from there.
Thanks!
Go to Azure Synapse -> Manage -> Managed private endpoints -> +New and add private endpoints.
Accessing blob storage: if you have already created a linked service, follow the image below. If not, please create one; follow this MS Doc for creating a linked service.
For more information, follow the reference "Fastest way to access Azure blob storage" by Dennes Torres.

Limit Azure Blob Access to WebApp

Situation:
We have a web app on Azure and a blob storage account. Via our web app we write data into the blob, and we currently read that data back out, returning it as responses in the web app.
What we're trying to do:
We're trying to find a way to restrict access to the blob so that only our web app can access it. Setting up an IP address in the firewall settings works fine when we have a static IP (we often test by running the web app locally from our office, and that lets us read/write to the blob just fine). However, when we use the IP address of our web app (as read from the web app's cross-domain page), we do not get the same access and get errors trying to read/write to the blob.
Question:
Is there a way to restrict access to the blob to the web app without having to set up a VPN on Azure (too expensive)? I've seen people talk about using SAS to generate time-limited links to blob content, and that makes sense for only allowing users to access content via our web app (which would then deliver them the link), but that doesn't solve the problem of our web app not being able to write to the blob when it isn't publicly accessible.
Are we just trying to misuse blobs? Or is this a valid way to use them, but one that requires the VPN approach?
Another option would be to use Azure AD authentication combined with a managed identity on your App Service.
At the time of writing this feature is still in preview though.
I wrote an article on how to do this: https://joonasw.net/view/azure-ad-authentication-with-azure-storage-and-managed-service-identity.
The key parts:
Enable Managed Identity
Grant the generated service principal the necessary role on the storage account/blob container
Change your code to use AAD access tokens acquired with the managed identity instead of the access key/SAS token
Acquiring the token using https://www.nuget.org/packages/Microsoft.Azure.Services.AppAuthentication/1.1.0-preview:
private async Task<string> GetAccessTokenAsync()
{
    var tokenProvider = new AzureServiceTokenProvider();
    return await tokenProvider.GetAccessTokenAsync("https://storage.azure.com/");
}
Reading a blob using the token:
private async Task<Stream> GetBlobWithSdk(string accessToken)
{
    var tokenCredential = new TokenCredential(accessToken);
    var storageCredentials = new StorageCredentials(tokenCredential);

    // Define the blob to read
    var blob = new CloudBlockBlob(
        new Uri($"https://{StorageAccountName}.blob.core.windows.net/{ContainerName}/{FileName}"),
        storageCredentials);

    // Open a data stream to the blob
    return await blob.OpenReadAsync();
}
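For completeness, a hypothetical call site combining the two helpers above:
// Acquire a token via the managed identity, then stream the blob down and read it.
string accessToken = await GetAccessTokenAsync();
using (Stream blobStream = await GetBlobWithSdk(accessToken))
using (var reader = new StreamReader(blobStream))
{
    string content = await reader.ReadToEndAsync();
    // ... use the blob content ...
}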
SAS keys are the correct way to secure and grant access to your Blob storage. Contrary to your belief, this will work with a private container. Here's a resource you may find helpful:
http://www.siddharthpandey.net/use-shared-access-signature-to-share-private-blob-in-azure/
Please also review Microsoft's guidelines on securing your Blob storage. This addresses many of the concerns you outline and is a must read for any Azure PaaS developer:
https://learn.microsoft.com/en-us/azure/storage/common/storage-security-guide
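For the user-facing read scenario mentioned in the question, the web app can mint short-lived, read-only SAS URLs itself and hand them to clients. A minimal sketch with the classic SDK (the helper name is ours, not from the linked articles):
private static string GetReadOnlySasUrl(CloudBlockBlob blob)
{
    // Read-only SAS valid for one hour, with a small start-time margin for clock skew.
    var policy = new SharedAccessBlobPolicy
    {
        Permissions = SharedAccessBlobPermissions.Read,
        SharedAccessStartTime = DateTimeOffset.UtcNow.AddMinutes(-5),
        SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddHours(1)
    };

    // Anyone holding this URL can read the blob until the token expires.
    return blob.Uri + blob.GetSharedAccessSignature(policy);
}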

How to use an Azure blob-only storage account with Azure Functions - Trying to create blob snapshot

I'm trying to set up a function to take a snapshot of a blob container every time a change is pushed to it. There is some pretty simple functionality in Azure Functions to do this, but it only works for general purpose storage accounts. I'm trying to do this with a blob only storage account. I'm very new to Azure so I may be approaching this all wrong, but I haven't been able to find much helpful information. Is there any way to do this?
As #joy-wang mentioned, the Azure Functions Runtime requires a general purpose storage account.
A general purpose storage account is required to configure the AzureWebJobsStorage and the AzureWebJobsDashboard settings (local.settings.json or Appsettings Blade in the Azure portal):
{
    "IsEncrypted": false,
    "Values": {
        "AzureWebJobsStorage": "my general purpose storage account connection string",
        "AzureWebJobsDashboard": "my general purpose storage account connection string",
        "MyOtherStorageAccountConnectionstring": "my blob only storage connection string"
    }
}
If you want to create a BlobTrigger function, you can specify another connection string and create a snapshot every time a blob is created/updated:
[FunctionName("Function1")]
public static async Task Run([BlobTrigger("test-container/{name}",
Connection = "MyOtherStorageAccountConnectionstring")]CloudBlockBlob myBlob,
string name, TraceWriter log)
{
log.Info($"C# Blob trigger function Processed blob\n Name:{name}");
await myBlob.CreateSnapshotAsync();
}
In Visual Studio:
I have tried to create a snapshot for a blob-only storage account named joyblobstorage, but it failed. I suppose you would get the same error as in the screenshot.
The error information says: Microsoft.Azure.WebJobs.Host: Storage account 'joyblobstorage' is of unsupported type 'Blob-Only/ZRS'. Supported types are 'General Purpose'.
In the portal:
I tried to create a Function App and use an existing storage account, but it could not find my blob-only storage account. The Azure Functions setup in the portal does not allow you to select a blob-only storage account. Please refer to the screenshot.
Conclusion:
It is not possible to create a snapshot for a blob-only storage account this way. In the official documentation, you can see the storage account requirements.
When creating a function app in App Service, you must create or link to a general-purpose Azure Storage account that supports Blob, Queue, and Table storage.
Also, in the App settings reference, you can see:
AzureWebJobsStorage
The Azure Functions runtime uses this storage account connection string for all functions except for HTTP triggered functions. The storage account must be a general-purpose one that supports blobs, queues, and tables.
AzureWebJobsDashboard
Optional storage account connection string for storing logs and displaying them in the Monitor tab in the portal. The storage account must be a general-purpose one that supports blobs, queues, and tables.
Here is the feedback item where the Azure App Service team explained the requirements on the storage account; you can refer to it.
