View SharedAccessBlobPolicy created programmatically - in Azure portal - azure

I'm creating a container and then a Shared Access Signature for that container in code, like so:
SharedAccessBlobPolicy policy = new SharedAccessBlobPolicy()
{
    Permissions = SharedAccessBlobPermissions.Write,
    SharedAccessExpiryTime = DateTime.UtcNow.AddHours(36)
};
var sas = container.GetSharedAccessSignature(policy, $"{id}-{DateTime.Now}");
That works fine.
However, when I go into the Azure portal I can't see a list of the policies that have been created.
Does anyone know if this is possible and, if so, where/how?

The Azure Portal offers very limited functionality for managing Storage Accounts; as of today, this particular functionality doesn't exist there.
What you could do is use any storage explorer available in the market (including Microsoft's own Storage Explorer - http://storageexplorer.com) and view the access policies there.
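If you'd rather check from code than with an explorer tool, the stored access policies on a container can be read back with the same client library. Here's a minimal sketch using the classic WindowsAzure.Storage SDK (container name and connection string are placeholders):
CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
CloudBlobClient client = account.CreateCloudBlobClient();
CloudBlobContainer container = client.GetContainerReference("mycontainer");

// GetPermissions returns the container's public access level together
// with its named (stored) access policies.
BlobContainerPermissions permissions = container.GetPermissions();
foreach (var entry in permissions.SharedAccessPolicies)
{
    Console.WriteLine($"{entry.Key}: {entry.Value.Permissions}, expires {entry.Value.SharedAccessExpiryTime}");
}
Note that a policy passed directly to GetSharedAccessSignature (as in your code) is ad-hoc: it is encoded in the SAS token itself and is never stored on the container, so there is nothing for the portal or any tool to list unless you save a named policy via SetPermissions.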

Related

Limit Azure Blob Access to WebApp

Situation:
We have a web app on Azure, plus Blob storage; via our web app we write data into the blob, and we currently read that data back out, returning it as responses in the web app.
What we're trying to do:
We're trying to find a way to restrict access to the blob so that only our web app can access it. Whitelisting an IP address in the firewall settings works fine when we have a static IP (we often test by running the web app locally from our office, and that lets us read/write to the blob just fine). However, when we use the IP address of our web app (as read from the web app's cross-domain page), we do not get the same access and get errors trying to read/write to the blob.
Question:
Is there a way to restrict blob access to the web app without having to set up a VPN on Azure (too expensive)? I've seen people talk about using SAS to generate time-limited links to blob content, and that makes sense for only allowing users to access content via our web app (which would then deliver them the link), but that doesn't solve the problem of our web app not being able to write to the blob when the blob isn't publicly accessible.
Are we just trying to misuse blobs? Or is this a valid way to use them, but you have to do so via the VPN approach?
Another option would be to use Azure AD authentication combined with a managed identity on your App Service.
At the time of writing this feature is still in preview though.
I wrote an article on how to do this: https://joonasw.net/view/azure-ad-authentication-with-azure-storage-and-managed-service-identity.
The key parts:
Enable Managed Identity
Grant the generated service principal the necessary role on the storage account/blob container
Change your code to use AAD access tokens acquired with the managed identity instead of access key/SAS token
Acquiring the token using https://www.nuget.org/packages/Microsoft.Azure.Services.AppAuthentication/1.1.0-preview:
private async Task<string> GetAccessTokenAsync()
{
    var tokenProvider = new AzureServiceTokenProvider();
    return await tokenProvider.GetAccessTokenAsync("https://storage.azure.com/");
}
Reading a blob using the token:
private async Task<Stream> GetBlobWithSdk(string accessToken)
{
    var tokenCredential = new TokenCredential(accessToken);
    var storageCredentials = new StorageCredentials(tokenCredential);
    // Define the blob to read
    var blob = new CloudBlockBlob(
        new Uri($"https://{StorageAccountName}.blob.core.windows.net/{ContainerName}/{FileName}"),
        storageCredentials);
    // Open a data stream to the blob
    return await blob.OpenReadAsync();
}
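For completeness, a rough sketch of how the two helpers above might be used together (this assumes the StorageAccountName, ContainerName and FileName values referenced in the snippet are defined elsewhere):
// Acquire an AAD token via the managed identity, then stream the blob down.
string accessToken = await GetAccessTokenAsync();
using (Stream blobStream = await GetBlobWithSdk(accessToken))
using (var reader = new StreamReader(blobStream))
{
    string contents = await reader.ReadToEndAsync();
    // ... use the contents in your response
}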
SAS tokens are the correct way to secure and grant access to your Blob Storage, and contrary to your belief, this works with a private container. Here's a resource you may find helpful:
http://www.siddharthpandey.net/use-shared-access-signature-to-share-private-blob-in-azure/
Please also review Microsoft's guidelines on securing your Blob storage. This addresses many of the concerns you outline and is a must-read for any Azure PaaS developer:
https://learn.microsoft.com/en-us/azure/storage/common/storage-security-guide
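To tie this back to your scenario, a minimal sketch of the usual pattern (container and blob names are placeholders): the web app holds the account key server-side, so it can read/write a private container directly with no firewall rule, while end users only ever receive short-lived SAS links.
// The web app authenticates with the account key, which never leaves
// the server, so it can write to a private container directly.
CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
CloudBlobContainer container = account.CreateCloudBlobClient()
    .GetContainerReference("private-container");
CloudBlockBlob blob = container.GetBlockBlobReference("data.json");
blob.UploadText("{ \"hello\": \"world\" }");

// Users never see the key; they get a time-limited SAS URL instead.
string sasToken = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy
{
    Permissions = SharedAccessBlobPermissions.Read,
    SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(15)
});
string downloadUrl = blob.Uri + sasToken;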

Azure Storage Account Metrics only visible for Classic Storage Account

I've tested creating both a classic storage account (manage.windowsazure.com) and a "new" storage account in the new Azure Portal. I set them up similarly and ran the same code to create and configure a queue, but metrics only show up for the classic storage account in the Portal (I can see both accounts in the new Portal).
I have set up the ServiceProperties like this, and I can see the changes successfully saved when fetching the service properties or looking in the Azure Portal.
CloudStorageAccount storageAccount =
    CloudStorageAccount.parse(storageConnectionString);
CloudQueueClient queueClient = storageAccount.createCloudQueueClient();

MetricsProperties metricsProperties = new MetricsProperties();
metricsProperties.setMetricsLevel(MetricsLevel.SERVICE_AND_API);
metricsProperties.setRetentionIntervalInDays(2);

LoggingProperties loggingProperties = new LoggingProperties();
loggingProperties.setRetentionIntervalInDays(10);
loggingProperties.setLogOperationTypes(EnumSet.of(
    LoggingOperations.READ, LoggingOperations.WRITE, LoggingOperations.DELETE));

ServiceProperties serviceProperties = new ServiceProperties();
serviceProperties.setHourMetrics(metricsProperties);
serviceProperties.setMinuteMetrics(metricsProperties);
serviceProperties.setLogging(loggingProperties);

queueClient.uploadServiceProperties(serviceProperties);
When I use Microsoft Azure Storage Explorer, both accounts have the tables for metrics and logging set up, so both look like this, and the tables contain data.
So from here they look similar. But the metrics graphs and options are only available for the classic storage account in the Azure Portal. For the "new" storage account it only says "No available data".
Is it a bug? Or is a classic storage account configured by default with some properties I need to apply manually to the new storage account to make it behave similarly?
Screenshot from Microsoft Azure Storage Explorer
Based on your code, I used WindowsAzure.Storage (version 7.2.1) to configure storage account metrics on both a classic storage account and a new storage account, as follows:
var blobClient = storageAccount.CreateCloudBlobClient();

MetricsProperties metricsProperties = new MetricsProperties();
metricsProperties.MetricsLevel = MetricsLevel.ServiceAndApi;
metricsProperties.RetentionDays = 2;

LoggingProperties loggingProperties = new LoggingProperties();
loggingProperties.RetentionDays = 10;
loggingProperties.LoggingOperations = LoggingOperations.Read | LoggingOperations.Write | LoggingOperations.Delete;

ServiceProperties serviceProperties = new ServiceProperties();
serviceProperties.HourMetrics = metricsProperties;
serviceProperties.MinuteMetrics = metricsProperties;
serviceProperties.Logging = loggingProperties;

blobClient.SetServiceProperties(serviceProperties);
With the code snippet above, you can configure the minute/hour metrics for your Blob storage.
Since you have confirmed that the related tables contain metric records, you could try logging into the Azure Portal, choosing your storage account, clicking QUEUE SERVICE > Metrics, clicking Edit chart, and changing the Time Range as follows:
Note: The time range is set to "today" by default if there are any metric records. There could be data latency, so try specifying the time range explicitly and see whether you can retrieve your metrics data as expected.
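If the chart still shows nothing, you can verify the raw data exists by querying the metrics table directly. A small sketch with the same library; $MetricsHourPrimaryTransactionsQueue is the standard hourly-metrics table for the queue service (swap in Blob/Table as needed):
// Metrics tables are ordinary Azure tables with a $-prefixed name.
var tableClient = storageAccount.CreateCloudTableClient();
var metricsTable = tableClient.GetTableReference("$MetricsHourPrimaryTransactionsQueue");

// Peek at a few rows; PartitionKey is the hour bucket (yyyyMMddTHHmm).
var query = new TableQuery<DynamicTableEntity>().Take(10);
foreach (var entity in metricsTable.ExecuteQuery(query))
{
    Console.WriteLine($"{entity.PartitionKey} / {entity.RowKey}");
}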

Accessing Azure Storage services from a different subscription

We are looking to deploy our Azure cloud services to multiple subscriptions, but want to be able to access the same storage accounts for storing blobs and tables. I wanted to know if it is possible to access storage accounts across different subscriptions using just the storage account name and key?
Our data connection takes the form
Trying to use the above, it always tries to find the endpoint for the given account name within the current subscription.
If I understood your question...
"able to access the same Storage accounts"
Via the Azure Portal (Management Portal): you can only access the storage accounts in your own subscription.
Via Visual Studio: you can attach a storage account outside your currently logged-in account using the account name and key (and manage it).
Via code: you can access a storage account (blob, queue, table) from any of your apps with a storage connection string, regardless of subscription (don't hard-code it) - see the sketch below.
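To illustrate the code path: a storage connection string identifies the account by name and key only, with no subscription information involved, so the same string works from an app in any subscription. A minimal sketch (account name and key are placeholders):
// The endpoint is derived from the account name via DNS,
// not looked up in your subscription.
string connectionString =
    "DefaultEndpointsProtocol=https;" +
    "AccountName=accountinothersubscription;" +
    "AccountKey=<account-key>";

CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
CloudBlobClient blobClient = account.CreateCloudBlobClient();
// e.g. blobClient.GetContainerReference("mycontainer") ...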
If you want, you can also restrict which web origins may access your blobs from the browser with CORS settings. Something like this:
private static void InitializeCors()
{
    ServiceProperties blobServiceProperties = blobClient.GetServiceProperties();
    // Enable and configure CORS
    ConfigureCors(blobServiceProperties);
    // Apply the settings
    blobClient.SetServiceProperties(blobServiceProperties);
}
private static void ConfigureCors(ServiceProperties prop)
{
    var cors = new CorsRule();
    // Each allowed origin must be added as a separate entry
    cors.AllowedOrigins.Add("www.domain1.net");
    cors.AllowedOrigins.Add("www.domain2.it");
    // A rule also needs the HTTP methods it applies to
    cors.AllowedMethods = CorsHttpMethods.Get;
    prop.Cors.CorsRules.Add(cors);
}

Azure Storage services logs

I am a beginner in Azure and need some help. We are facing a bit of a problem with the Azure Storage services and are unable to proceed.
The issue is this:
http://blogs.msdn.com/b/windowsazurestorage/archive/2014/08/05/microsoft-azure-storage-service-version-removal.aspx
To summarize:
We have to inspect the version used by requests to any/all of our blobs, tables, and queues, in case any of them are using the version set for planned removal. I have enabled logging for the web application on the Azure portal site. I am able to see the three services as follows:
https://.blob.core.windows.net
https://.table.core.windows.net
https://.queue.core.windows.net
Now, from the articles below I gather that the log entries come in a format where a version is included, but the articles have NOT specified where to locate the logs or how to gather them. I have tried different things, including https://.blob.core.windows.net/$logs, but it makes no difference.
The required logs should be in this format (sample):
Here is a sample log entry, with the version used highlighted – in this case the request was an anonymous GetBlob request which implicitly used the 2009-09-19 version:
1.0;2011-08-09T18:52:40.9241789Z;GetBlob;AnonymousSuccess;200;18;10;anonymous;;myaccount;blob;"https:// myaccount.blob.core.windows.net/thumbnails/lake.jpg?timeout=30000";"/myaccount/thumbnails/lake.jpg";a84aa705-8a85-48c5-b064-b43bd22979c3;0;123.100.2.10;2009-09-19;252;0;265;100;0;;;"0x8CE1B6EA95033D5";Friday, 09-Aug-11 18:52:40 GMT;;;;"8/9/2011 6:52:40 PM ba98eb12-700b-4d53-9230-33a3330571fc"
Can you please show me a way to view these logs? Any tool to use?
Since these logs are stored in a blob container called $logs, any storage explorer which supports viewing data from this container can be used to view its contents. To the best of my knowledge, the following tools support it: Azure Storage Explorer, Cerebrata Azure Management Studio, and Cloud Portam (disclosure: I am the developer working on this tool).
However, before you can view the data you need to enable logging on your storage account; only when logging is enabled will this container show up. To enable logging, you can again use Azure Management Studio or Cloud Portam, or you can use the code below (which assumes you have the latest version of the Storage Client Library):
static void SetLoggingProperties()
{
    CloudStorageAccount account = new CloudStorageAccount(
        new StorageCredentials(StorageAccount, StorageAccountKey), true);

    LoggingProperties properties = new LoggingProperties()
    {
        LoggingOperations = LoggingOperations.All,
        RetentionDays = 365,
        Version = "1.0",
    };
    ServiceProperties serviceProperties = new ServiceProperties()
    {
        Cors = null,
        HourMetrics = null,
        MinuteMetrics = null,
        Logging = properties,
    };

    var blobClient = account.CreateCloudBlobClient();
    blobClient.SetServiceProperties(serviceProperties);
    var tableClient = account.CreateCloudTableClient();
    tableClient.SetServiceProperties(serviceProperties);
    var queueClient = account.CreateCloudQueueClient();
    queueClient.SetServiceProperties(serviceProperties);
}
Once logging properties are set, give it some time for logs to show up.
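If you'd rather read the logs programmatically than through a tool, something along these lines should work with the same library. The version used by each request is the 17th field of a version-1.0 log line; the naive split below ignores quoted fields, which is usually good enough for spotting old versions:
var blobClient = account.CreateCloudBlobClient();
var logsContainer = blobClient.GetContainerReference("$logs");

// Log blobs are organized as <service>/YYYY/MM/DD/hhmm/NNNNNN.log
foreach (var item in logsContainer.ListBlobs(useFlatBlobListing: true))
{
    var logBlob = (CloudBlockBlob)item;
    foreach (var line in logBlob.DownloadText().Split('\n'))
    {
        var fields = line.Split(';');
        if (fields.Length > 16)
        {
            // fields[16] is the request version header, e.g. 2009-09-19
            Console.WriteLine($"{fields[2]} used version {fields[16]}");
        }
    }
}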

Azure CDN per Blob SAS

As far as I know, in Azure Storage we can delegate access to our storage to a single person using a SAS on a CONTAINER basis.
I need to delegate access on a per-BLOB basis to prevent hotlinking.
We are using ASP.NET MVC. Sorry for my English :)
Edit: And how can a new Azure user create a CDN?
You can create a SAS on a blob; the approach is similar to the way you create a SAS on a blob container. Since you're using ASP.NET MVC, I'm assuming you would want to use the .NET Storage Client API to create the SAS on a blob. To do so, just call the GetSharedAccessSignature method on the blob object you have created.
For example, the code below gives you a SAS URL where the user has permission to download the blob:
var sas = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy()
{
    Permissions = SharedAccessBlobPermissions.Read,
    SharedAccessStartTime = DateTime.UtcNow.AddMinutes(-5),
    SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(15),
});
return string.Format(CultureInfo.InvariantCulture, "{0}{1}", blob.Uri, sas);
I wrote a blog post some time ago which describes the SAS functionality on blobs and containers in more detail: http://gauravmantri.com/2013/02/13/revisiting-windows-azure-shared-access-signature/
Regarding your question about the CDN, I believe the functionality to create CDN endpoints was taken away from the Windows Azure Portal when the new portal was announced. I guess you will need to wait for the functionality to come back to the portal.
