Securing Azure blob for only one user? - azure

How do I secure a blob for only one user?
There are three options I can think of:
1) Shared access policy with a short expiry.
- The link to the blob is accessible from anywhere for that expiry duration, and for the duration of each subsequent page request.
2) Have a proxy between the user request and blob storage and apply authentication here.
- Though in reality there still is a publicly accessible blob for a short period of time.
3) We don't use blob storage for stuff that needs to be secured.
Am I missing a better option?

Your first suggestion of using a shared access policy with a short expiry is good.
You can also make the blob private, secure an MVC ActionResult, and only pass the blob through the action result (i.e., return File()).
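For instance, here is a minimal sketch of that ActionResult approach, assuming the classic WindowsAzure.Storage SDK and hypothetical container/blob names ("privatefiles", "report.pdf"); your own authorization rules decide who may reach the action:
[Authorize]
public ActionResult DownloadReport()
{
    // The blob stays private; only users who pass MVC authorization ever see its contents.
    CloudStorageAccount account = CloudStorageAccount.Parse(
        ConfigurationManager.AppSettings["StorageConnectionString"]);
    CloudBlobClient client = account.CreateCloudBlobClient();
    CloudBlobContainer container = client.GetContainerReference("privatefiles");
    CloudBlockBlob blob = container.GetBlockBlobReference("report.pdf");

    var stream = new MemoryStream();
    blob.DownloadToStream(stream);
    stream.Position = 0;

    // The user never receives a storage URL; MVC proxies the bytes.
    return File(stream, "application/pdf", "report.pdf");
}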

I think you want to do the following: generate a time-dependent SAS (shared access signature) and use it in the URL.
// Connect to the storage emulator (swap in a real connection string in production).
string storageConnectionString = "UseDevelopmentStorage=true";
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(storageConnectionString);
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();

CloudBlobContainer container = blobClient.GetContainerReference("mycontainer");
container.CreateIfNotExists();

// Define a stored access policy named "mypolicy" that grants read/write access
// for a ten-hour window starting one hour from now.
BlobContainerPermissions blobPermissions = new BlobContainerPermissions();
blobPermissions.SharedAccessPolicies.Add("mypolicy", new SharedAccessBlobPolicy()
{
    SharedAccessStartTime = DateTime.UtcNow.AddHours(1),
    SharedAccessExpiryTime = DateTime.UtcNow.AddHours(11),
    Permissions = SharedAccessBlobPermissions.Write |
                  SharedAccessBlobPermissions.Read
});

// Keep the container private so it cannot be read anonymously.
blobPermissions.PublicAccess = BlobContainerPublicAccessType.Off;
container.SetPermissions(blobPermissions);

// Generate a SAS token that references the stored access policy.
string sasToken = container.GetSharedAccessSignature(new SharedAccessBlobPolicy(), "mypolicy");
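As a quick sketch of how the resulting token might then be used (the blob name below is just a placeholder):
// Append the SAS token (a query string starting with "?") to the blob URI
// to form a time-limited URL that can be handed to the client.
CloudBlockBlob blob = container.GetBlockBlobReference("somefile.txt");
string sasUrl = blob.Uri.AbsoluteUri + sasToken;
// Anyone holding sasUrl can read or write the blob during the policy's validity window.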
I suggest you take a look at this article:
http://msdn.microsoft.com/en-us/library/windowsazure/jj721951.aspx

Related

Download or View file from Azure Blob in Aurelia UI

I have my files stored in Azure. I want a way to download or view the files on the client side, like this:
Azure -> Api -> Client UI (Aurelia)
I have seen a lot of C# examples, but I am not sure how to get the file on the UI side. Can anyone please help?
Thanks!
Edit:
Api Code:
public async Task<string> getUtf8Text()
{
    // storageAccount is assumed to be initialized elsewhere.
    CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
    var containerName = "myContainer";   // note: real container names must be all lowercase
    var blobName = "myBlobName.pdf";
    CloudBlobContainer container = blobClient.GetContainerReference(containerName);
    CloudBlockBlob blockBlob = container.GetBlockBlobReference(blobName);
    string text;
    using (var memoryStream = new MemoryStream())
    {
        // Download the blob and decode it as UTF-8 text.
        await blockBlob.DownloadToStreamAsync(memoryStream);
        text = System.Text.Encoding.UTF8.GetString(memoryStream.ToArray());
        return text;
    }
}
I'm trying to download a file from the UTF-8 byte string. The client-side code is:
var byteCharacters = result.byteArray;
var byteNumbers = new Array(byteCharacters.length);
for (var i = 0; i < byteCharacters.length; i++) {
    byteNumbers[i] = byteCharacters.charCodeAt(i);
}
var byteArray = new Uint8Array(byteNumbers);
var octetStreamMime = "application/octet-stream";
var contentType = octetStreamMime;
var blob = new Blob([byteArray], { type: contentType });
FileSaver.saveAs(blob, result.blobName);
It works sometimes for PDF; the rest of the time it's just blank pages. It hangs forever for MP4. Any idea what's going on here?
Each blob has a unique URL address. You can use this to display the contents of the blob via a client that can process a URL.
The blob URL will be similar to:
https://myaccount.blob.core.windows.net/mycontainer/myblob
See Naming and Referencing Containers, Blobs, and Metadata for more information.
The greater challenge comes in how you authenticate access to the blob for your users. You have a couple of options:
You can make blobs in the container public, and thus available for anonymous access, without authentication. This means that all blobs in that container will be public. See Manage anonymous read access to containers and blobs.
You can use a shared access signature to delegate access to blobs in the container with the permissions you specify and over the time interval that you specify. This gives you a greater degree of control than anonymous access but also requires more design effort. See Shared Access Signatures, Part 1: Understanding the SAS model.
Note that anyone possessing your account key can authenticate and access blobs in your account, so you should not share your account key with anyone. However, as the account owner, you can access your blobs from your application by authenticating with the account key (also known as shared key authentication).
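As a rough sketch of the SAS option applied to your Azure -> Api -> Client flow (assuming the same classic storage SDK, placeholder names, and that storageAccount is initialized as in your snippet), the API can hand the client a short-lived, read-only URL and let the browser download the blob directly, which also sidesteps re-encoding binary content as a UTF-8 string:
// Returns a time-limited, read-only URL for a single blob.
// The Aurelia client can point an <a href> or window.location at this URL,
// so the bytes never have to pass through the API as a string.
public string GetDownloadUrl(string blobName)
{
    CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
    CloudBlobContainer container = blobClient.GetContainerReference("mycontainer");
    CloudBlockBlob blob = container.GetBlockBlobReference(blobName);

    string sasToken = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy
    {
        Permissions = SharedAccessBlobPermissions.Read,
        SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(15)
    });

    return blob.Uri.AbsoluteUri + sasToken;
}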

How to restrict azure blob container creation using SAS

How do I set permissions so that containers cannot be created when generating an account SAS token? Here are my settings:
// Create a new access policy for the account.
SharedAccessAccountPolicy policy = new SharedAccessAccountPolicy()
{
    Permissions = SharedAccessAccountPermissions.Read | SharedAccessAccountPermissions.Write,
    Services = SharedAccessAccountServices.Blob | SharedAccessAccountServices.Table,
    ResourceTypes = SharedAccessAccountResourceTypes.Service | SharedAccessAccountResourceTypes.Container | SharedAccessAccountResourceTypes.Object,
    SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(2),
    Protocols = SharedAccessProtocol.HttpsOrHttp
};
Updated answer:
Given that you have multiple containers, the account SAS is a good option. You'll need one for the admin and one for the user.
Here's an example of how to create the admin SAS:
// Create a new access policy for the account.
SharedAccessAccountPolicy policy = new SharedAccessAccountPolicy()
{
    // SAS for the Blob service only.
    Services = SharedAccessAccountServices.Blob,
    // The admin has read, write, create, list, and delete permissions on all containers.
    // In order to write blobs, the Object resource type must also be specified.
    ResourceTypes = SharedAccessAccountResourceTypes.Container | SharedAccessAccountResourceTypes.Object,
    Permissions = SharedAccessAccountPermissions.Read |
                  SharedAccessAccountPermissions.Write |
                  SharedAccessAccountPermissions.Create |
                  SharedAccessAccountPermissions.List |
                  SharedAccessAccountPermissions.Delete,
    SharedAccessExpiryTime = DateTime.UtcNow.AddHours(24),
    Protocols = SharedAccessProtocol.HttpsOnly
};
And here's an example of how to create the user SAS:
// Create a new access policy for the account.
SharedAccessAccountPolicy policy = new SharedAccessAccountPolicy()
{
    // SAS for the Blob service only.
    Services = SharedAccessAccountServices.Blob,
    // The user has create, read, write, and delete permissions on blobs.
    ResourceTypes = SharedAccessAccountResourceTypes.Object,
    Permissions = SharedAccessAccountPermissions.Read |
                  SharedAccessAccountPermissions.Write |
                  SharedAccessAccountPermissions.Create |
                  SharedAccessAccountPermissions.Delete,
    SharedAccessExpiryTime = DateTime.UtcNow.AddHours(24),
    Protocols = SharedAccessProtocol.HttpsOnly
};
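In either case the policy object is not yet a token; here's a rough sketch of turning it into the account SAS string with CloudStorageAccount.GetSharedAccessSignature (the connection-string setting name is a placeholder):
// Generate the account SAS token from the policy above (works for the admin or the user policy).
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
    CloudConfigurationManager.GetSetting("StorageConnectionString"));
string accountSasToken = storageAccount.GetSharedAccessSignature(policy);
// The token is a query string beginning with "?"; hand it to the admin or user app,
// which can build a StorageCredentials from it without ever seeing the account key.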
Original answer:
You definitely need to use an account SAS for the admin SAS, but you should be able to use a service SAS on the container for the user SAS, unless you have a need for an account SAS that I am not understanding from your question. It's probably better to use the service SAS when you can so that you can use the least complicated permissions. Also, you can use a stored access policy with the service SAS, which we recommend as a best practice so that it's easy to revoke the SAS if it were ever compromised.
With the service SAS, you don't need a permission to restrict container creation, because the service SAS doesn't allow you to create a container in the first place.
Here's code to create the service SAS on the container, including the stored access policy:
// Create the storage account with the connection string.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString"));

// Create the blob client object.
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();

// Get a reference to the container for which the shared access signature will be created.
CloudBlobContainer container = blobClient.GetContainerReference("mycontainer");
container.CreateIfNotExists();

// Create blob container permissions, consisting of a shared access policy
// and a public access setting.
BlobContainerPermissions containerPermissions = container.GetPermissions();

// Clear the container's shared access policies to avoid naming conflicts if you run this method more than once.
//containerPermissions.SharedAccessPolicies.Clear();

// The shared access policy provides
// read/write access to the container for 24 hours.
containerPermissions.SharedAccessPolicies.Add("mypolicy", new SharedAccessBlobPolicy()
{
    // To ensure the SAS is valid immediately, don't set a start time.
    // This way, you can avoid failures caused by small clock differences.
    // Note that the Create permission allows the user to create a new blob, as does Write.
    SharedAccessExpiryTime = DateTime.UtcNow.AddHours(24),
    Permissions = SharedAccessBlobPermissions.Write |
                  SharedAccessBlobPermissions.Read |
                  SharedAccessBlobPermissions.Create |
                  SharedAccessBlobPermissions.Delete
});

// The public access setting explicitly specifies that
// the container is private, so that it can't be accessed anonymously.
containerPermissions.PublicAccess = BlobContainerPublicAccessType.Off;

// Set the permission policy on the container.
container.SetPermissions(containerPermissions);

// Get the shared access signature to share with users.
string sasToken = container.GetSharedAccessSignature(null, "mypolicy");
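Because the SAS above references the stored access policy rather than embedding its own expiry and permissions, revoking it later is just a matter of removing (or editing) the policy on the container; a minimal sketch:
// Revoke every SAS issued against "mypolicy" by deleting the stored access policy.
BlobContainerPermissions permissions = container.GetPermissions();
permissions.SharedAccessPolicies.Remove("mypolicy");
container.SetPermissions(permissions);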
Take a look at the examples shown here: https://azure.microsoft.com/en-us/documentation/articles/storage-dotnet-shared-access-signature-part-1/#examples-create-and-use-shared-access-signatures.
Also see https://msdn.microsoft.com/en-us/library/azure/dn140255.aspx and https://msdn.microsoft.com/en-us/library/azure/mt584140.aspx.
Let us know if you have any other questions.

Azure storage connection string without the account key - public containers

I have a blob container whose ACL is set up to allow full public read access, so anyone can read and list the blobs in that container.
I store the files there so that the WPF client app my clients use can read them, but I don't want to allow them to modify/delete/create files.
Does anyone know what connection string should be used in this scenario?
I hoped to specify the connection string without the account key and/or shared access key, given that the blobs are public, but that didn't work: CloudStorageAccount.Parse throws a FormatException.
As mentioned by the previous answers, the best practice is usually to control the access to your blob container using shared access signatures (SAS) or a stored access policy. These can be used to create an access token (string) you can pass to your client without revealing your account key.
However, it is also possible to specify the level of public read access to the blobs and metadata saved in the container. Public access is the level of read permission automatically given to an anonymous user who is in possession of the public access URL for the container or blob. You cannot use public access to give anonymous users write permission to the container. If you need to give write permission to users who do not possess the account key of your Azure storage account, then you will need to provide those users with a token in the form of a URL that references a shared access signature or a shared access policy.
If public access to the blob container is not currently set to off (private), anonymous users will be able to read all blobs in the container using a public access URL such as the following:
http://grassy.blob.core.windows.net/container1/image2.jpg
When you create the container, you can set the value of the publicAccess property to the appropriate constant of the BlobContainerPublicAccessType enum. The value of the publicAccess property can be one of the following three constants which specify the level of public read access.
• BLOB – The public can read the content and metadata of blobs within this container, but cannot read container metadata or list the blobs within the container.
• CONTAINER – The public can read blob content and metadata and container metadata, and can list the blobs within the container.
• OFF – Specifies no public access. Only the account owner can read resources in this container.
So in this case the public access level might be set to CONTAINER. For example:
public static void main(String[] args) throws InvalidKeyException, URISyntaxException, StorageException
{
    Account creds = new Account();
    final String storageConnectionString = creds.getstorageconnectionstring();
    CloudStorageAccount storageAccount = CloudStorageAccount.parse(storageConnectionString);
    CloudBlobClient blobClient = storageAccount.createCloudBlobClient();
    CloudBlobContainer container = blobClient.getContainerReference("container1");
    container.createIfNotExist();
    BlobContainerPermissions containerPermissions = new BlobContainerPermissions();
    containerPermissions.setPublicAccess(BlobContainerPublicAccessType.CONTAINER);
    container.uploadPermissions(containerPermissions);
    BlobContainerPublicAccessType access1 = containerPermissions.getPublicAccess();
    System.out.println("Public access to " + container.getName() + " is set to: " + access1);
}
If the public access level on container1 has been set to CONTAINER, an anonymous user should be able to list the blobs in container1 knowing only the storage account AccountName ("grassy") and the container name, but without needing to know the AccountKey. For example, an anonymous application might use java code similar to the following:
public static void main(String[] args) throws InvalidKeyException, URISyntaxException, StorageException, FileNotFoundException, IOException
{
    URI baseuri = new URI("http://grassy.blob.core.windows.net");
    CloudBlobClient blobclient = new CloudBlobClient(baseuri);
    CloudBlobContainer container = blobclient.getContainerReference("container1");
    for (ListBlobItem blobItem : container.listBlobs()) {
        System.out.println(blobItem.getUri());
    }
}
However, as discussed, it is a better practice to avoid giving anonymous users access. Instead control access to the container using a SAS or policy and pass on the token to only known users.
CloudStorageAccount is not meant to connect to public blobs as far as I know. You can simply get at the public blobs via their public URL by using something like WebClient or any other tool that can download data over a public http/https endpoint.
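For example, a minimal sketch of reading a public blob with nothing but its URL (the account, container, and blob names are placeholders):
// A public blob is just an HTTP resource; no storage SDK, account key, or SAS is required.
using (var webClient = new System.Net.WebClient())
{
    byte[] data = webClient.DownloadData(
        "https://myaccount.blob.core.windows.net/mycontainer/myblob.jpg");
    // Use the bytes however you like, e.g. File.WriteAllBytes("myblob.jpg", data);
}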
You could use a shared access signature for that purpose. What you could do is create a SAS on the blob container which only allows list and read permissions, and then distribute that SAS URI to your clients. Your code could then create an instance of a CloudBlobContainer object using that SAS URI.
Here's the sample code for listing blobs in a blob container using SAS URI:
static void ListBlobsWithStorageClientLibrary(string blobContainerSasUri)
{
    CloudBlobContainer blobContainer = new CloudBlobContainer(new Uri(blobContainerSasUri));
    var blobs = blobContainer.ListBlobs(null, true);
    foreach (var blob in blobs)
    {
        Console.WriteLine(blob.Uri);
    }
}
Another alternative is to create an instance of the StorageCredentials object using the SAS token: http://msdn.microsoft.com/en-us/library/windowsazure/jj682529.aspx. Then you can create an instance of the CloudStorageAccount object using that StorageCredentials object.
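A short sketch of that alternative, assuming sasToken holds the SAS token string and using placeholder account/container names:
// Build credentials from the SAS token alone; no account key is involved.
var credentials = new Microsoft.WindowsAzure.Storage.Auth.StorageCredentials(sasToken);
var account = new CloudStorageAccount(
    credentials,
    new Uri("https://myaccount.blob.core.windows.net"),  // blob endpoint
    null,   // queue endpoint not needed here
    null);  // table endpoint not needed here
CloudBlobClient blobClient = account.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference("mycontainer");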
I wrote a detailed post on using Shared Access Signatures with blob storage which you can read here: http://gauravmantri.com/2013/02/13/revisiting-windows-azure-shared-access-signature/

How can I get an Azure CloudBlockBlob from a storage URL with a SAS?

I am trying to refactor our MVC code, which has a lot of pages that make use of download URLs pointing at blobs with a SAS. It would be great to be able to pass the URL to the controller and use it to locate the associated blob, e.g. have an action that takes the download URL as its only input parameter. I can also create a link helper that only shows the delete link if the SAS exposes delete, etc.
It would be a great help if I could pass the Url to Azure and get a CloudBlockBlob in return. So I could delete it, update it, get metadata etc.
The only way I can do it presently is by resorting to techniques like
var deleteBlobRequest = BlobRequest.Delete(new Uri(fileUrl), 30, null, DeleteSnapshotsOption.IncludeSnapshots, "");
deleteBlobRequest.GetResponse().Close();
This works but it seems very odd.
I can't figure out the code to get a CloudBlockBlob from the Uri.
Any ideas? I am presently using Azure Storage 1.7
You don't have to do anything special. If you construct a blob with a SAS URI, the storage client library takes care of this for you. For example, take this code:
CloudBlockBlob cloudBlockBlob = new CloudBlockBlob("http://127.0.0.1:10000/devstoreaccount1/temp/sastest.txt?sr=b&st=2013-01-25T04%3A28%3A09Z&se=2013-01-25T05%3A28%3A09Z&sp=rwd&sig=jIWWFwZ6MXaL6FD%2F2%2FpqPl1g4f0ElFrr1fKNg5U%2FAkg%3D");
cloudBlockBlob.Delete();
This would work just fine.
Here is the code to get the permissions of a SAS key (supposing blobUrl is a URL with the SAS key):
// Get permissions for the current SAS key.
var queryString = HttpUtility.ParseQueryString(blobUrl);
var permissionsText = queryString["sp"];
var permissions = SharedAccessBlobPermissions.None;
if (permissionsText.Contains("w"))
    permissions = permissions | SharedAccessBlobPermissions.Write;
if (permissionsText.Contains("r"))
    permissions = permissions | SharedAccessBlobPermissions.Read;
if (permissionsText.Contains("d"))
    permissions = permissions | SharedAccessBlobPermissions.Delete;
if (permissionsText.Contains("l"))
    permissions = permissions | SharedAccessBlobPermissions.List;
And this will get an ICloudBlob based on a URL with a SAS key (again supposing blobUrl is a URL with the SAS key):
// Get the blob reference.
var blobUri = new Uri(blobUrl);
var path = String.Format("{0}{1}{2}{3}", blobUri.Scheme, Uri.SchemeDelimiter, blobUri.Authority, blobUri.AbsolutePath);
var blobClient = new CloudBlobClient(new Uri(path), new Microsoft.WindowsAzure.Storage.Auth.StorageCredentials(blobUri.Query));
ICloudBlob blobReference = blobClient.GetBlobReferenceFromServer(new Uri(path));

How to use SharedAccessSignature to access blobs

I am trying to access a blob stored in a private container in Windows Azure. The container has a Shared Access Signature, but when I try to access the blob I get a StorageClientException: "Server failed to authenticate the request. Make sure the Authorization header is formed correctly including the signature".
The code that created the container and uploaded the blob looks like this:
// Create the container, set a Shared Access Signature, and share it.
// The first thing to do is to create the connection to the storage account.
// This should be in app.config, but as this is a test it will just be implemented here.
// Add a reference to Microsoft.WindowsAzure.StorageClient and set up the objects.
//storageAccount = CloudStorageAccount.DevelopmentStorageAccount;
storageAccount = CloudStorageAccount.Parse(ConfigurationManager.AppSettings["ConnectionString"]);
blobClient = storageAccount.CreateCloudBlobClient();

// Get a reference to the container for the shared access signature.
container = blobClient.GetContainerReference("blobcontainer");
container.CreateIfNotExist();

// Now create the permissions policy to use and a public access setting.
var permissions = container.GetPermissions();
permissions.SharedAccessPolicies.Remove("accesspolicy");
permissions.SharedAccessPolicies.Add("accesspolicy", new SharedAccessPolicy
{
    // This policy is live immediately.
    // If the policy should be delayed then use:
    //SharedAccessStartTime = DateTime.Now.Add(T); where T is some timespan
    SharedAccessExpiryTime =
        DateTime.UtcNow.AddYears(2),
    Permissions =
        SharedAccessPermissions.Read | SharedAccessPermissions.Write
});

// Turn off public access.
permissions.PublicAccess = BlobContainerPublicAccessType.Off;

// Set the permissions on the container.
container.SetPermissions(permissions);

var sas = container.GetSharedAccessSignature(new SharedAccessPolicy(), "accesspolicy");
StorageCredentialsSharedAccessSignature credentials = new StorageCredentialsSharedAccessSignature(sas);

CloudBlobClient client = new CloudBlobClient(storageAccount.BlobEndpoint,
    new StorageCredentialsSharedAccessSignature(sas));
CloudBlob sasblob = client.GetBlobReference("blobcontainer/someblob.txt");
sasblob.UploadText("I want to read this text via a rest call");

// Write the SAS to a file so I can use it later in other apps.
using (var writer = new StreamWriter(@"C:\policy.txt"))
{
    writer.WriteLine(container.GetSharedAccessSignature(new SharedAccessPolicy(), "securedblobpolicy"));
}
The code I have been trying to use to read the blob looks like this:
// The storage credentials shared access signature is copied directly from the text file "c:\policy.txt".
CloudBlobClient client = new CloudBlobClient("https://my.azurestorage.windows.net/", new StorageCredentialsSharedAccessSignature("?sr=c&si=accesspolicy&sig=0PMoXpht2TF1Jr0uYPfUQnLaPMiXrqegmjYzeg69%2FCI%3D"));
CloudBlob blob = client.GetBlobReference("blobcontainer/someblob.txt");
Console.WriteLine(blob.DownloadText());
Console.ReadLine();
I can make the above work by adding the account credentials, but that is exactly what I'm trying to avoid. I do not want something as sensitive as my account credentials just sitting out there, and I have no idea how to get the signature into the client app without including the account credentials.
Any help is greatly appreciated.
Why this?
writer.WriteLine(container.GetSharedAccessSignature(new SharedAccessPolicy(), "securedblobpolicy"));
and not writing the sas string you already created?
It's late and I could easily be missing something but it seems that you might not be saving the same access signature that you're using to write the file in the first place.
Also, perhaps not relevant here, but I believe there is a limit on the number of container-wide policies you can have. Are you uploading multiple files to the same container with this code and creating a new container SAS each time?
In general I think it would be better to request a SAS for an individual blob at the time you need it, with a short expiry time.
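For example, a rough sketch of issuing such a short-lived, read-only SAS for a single blob with the 1.x StorageClient types from your snippet (the blob name is a placeholder):
// Issue a read-only SAS for one blob, valid for ten minutes, at the moment it is needed.
CloudBlob blob = container.GetBlobReference("someblob.txt");
string blobSas = blob.GetSharedAccessSignature(new SharedAccessPolicy
{
    Permissions = SharedAccessPermissions.Read,
    SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(10)
});
string downloadUrl = blob.Uri.AbsoluteUri + blobSas;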
Is "my.azurestorage.windows.net" just a typo? I would expect something there like "https://account.blob.core.windows.net".
Otherwise the code looks pretty similar to the code in http://blog.smarx.com/posts/shared-access-signatures-are-easy-these-days, which works.
