Azure Shared Access Signatures (SAS)

I started using SAS today for the first time and I was intrigued by it, but would I really need it in my current project?
On my Azure website, I want users to be able to upload and download blobs. Every user has their own account.
This is a simple upload function:
//Upload blob
CloudBlobContainer container = CloudStorageServices.GetCloudBlobsContainer();
CloudBlockBlob blockBlob = container.GetBlockBlobReference(file.FileName + guid);
blockBlob.UploadFromStream(file.InputStream);
//Get uri from blob
var blobUrl = blockBlob.Uri;
//Upload table
CloudTable table2 = CloudStorageServices.GetCloudUploadsTable();
UploadEntity uploadtable = new UploadEntity(imagename, entity.RowKey + guid, blobUrl.ToString(), User.Identity.Name, file.FileName + guid);
TableOperation insertOperation = TableOperation.InsertOrReplace(uploadtable);
table2.Execute(insertOperation);
I don't really see the point in using SAS here, but I have a feeling I'm totally wrong.
Why should I use it?
When a user has uploaded a blob, the blobs belonging to that user (uploaded by that user) will be listed, and the user will ONLY be able to select and download their own items via a JavaScript download.
Same here... why would I need SAS?
Thanks!!

If you have a website or web service in front of your blobs then you don't need to use SAS, since you can control the user permissions yourself. I use SAS mostly in cases where I cannot control the user permissions.
For example, when we need to display images directly from their blob URLs, I need to prevent them from being loaded and displayed by other websites. So I generate a SAS for the images that need to be displayed on that page and set the page's HTML content to the SAS URLs, which expire after a few seconds. So if anyone else tried to read the images through these SAS URLs, they would already have expired.
Another example might be upload. Different from your scenario, if you need someone to upload directly to a blob then you might need SAS, since you should not give them your storage access keys.
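As a rough sketch (not the exact code behind my pages), a short-lived read-only SAS could be generated with the same WindowsAzure.Storage SDK your upload code already uses; the blob name below is just a placeholder:
CloudBlobContainer container = CloudStorageServices.GetCloudBlobsContainer();
CloudBlockBlob blob = container.GetBlockBlobReference("some-image.jpg"); // placeholder name
// Read-only SAS that expires shortly after the page is rendered.
string sas = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy
{
    Permissions = SharedAccessBlobPermissions.Read,
    SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(1)
});
string imageUrl = blob.Uri + sas; // embed this in the page instead of the raw blob URL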
Hope this helps a bit.

Related

Images uploaded to Azure blob storage unavailable when browsing by direct URL

I have uploaded a number of images to a Blob container on an Azure storage account of type StorageV2 (general purpose v2).
These were uploaded programmatically. Here's the code I used:
public Task CopyFile(string fileName, string targetPath)
{
    var blobRef = Container.GetBlockBlobReference(targetPath);
    blobRef.Properties.ContentType = GetContentType(fileName);
    return blobRef.UploadFromFileAsync(fileName);
}

public string GetContentType(string fileName)
{
    var provider = new FileExtensionContentTypeProvider();
    if (!provider.TryGetContentType(fileName, out var contentType))
    {
        contentType = "application/octet-stream";
    }
    return contentType;
}
Container is an initialized CloudBlobContainer instance.
When I use the Storage Explorer I can see the uploaded files. If I view the properties of any file it lists a Uri property. However, if I copy the value (a URL) and paste into a browser I see the following error page:
<Error>
<Code>ResourceNotFound</Code>
<Message>
The specified resource does not exist. RequestId:12485818-601e-0017-6f69-56c3df000000 Time:2019-08-19T08:35:13.2123849Z
</Message>
</Error>
But if I double-click the file in Storage Explorer it downloads the image correctly. The URL it uses is the same as the one I copied earlier as far as I could tell, except for some additional query string parameters that look like this: ?sv=2018-03-28&ss=bqtf&srt=sco&sp=rwdlacup&se=2019-08-19T16:49:38Z&sig=%2FJs7VnGKsjplalKXCcl0XosgUkPWJccg0qdvCSZlDSs%3D&_=1566204636804
I assume this must mean my blobs are not publicly available, but I can't find any setting that will make my images available publicly at their known URI. Can anyone point me in the right direction here? Thank you.
Check the access level set on your container.
If it is Private, you will get the error you are experiencing: ResourceNotFound.
As far as I know, if your container's access level is Private and you use the direct URL to access the blob, you will get this error. If you want to access it, you need to generate a SAS token for it.
For more details, please refer to
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-manage-access-to-resources
https://learn.microsoft.com/en-us/azure/storage/common/storage-sas-overview
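As a rough sketch using the same WindowsAzure.Storage SDK as the question's upload code (names are placeholders), you could either open the container to anonymous blob reads or keep it private and hand out a time-limited read SAS:
// Option 1: allow anonymous read access to blobs in this container.
Container.SetPermissions(new BlobContainerPermissions
{
    PublicAccess = BlobContainerPublicAccessType.Blob
});

// Option 2: keep the container private and generate a read-only SAS URL.
var blob = Container.GetBlockBlobReference("images/photo1.jpg"); // placeholder blob name
string sas = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy
{
    Permissions = SharedAccessBlobPermissions.Read,
    SharedAccessExpiryTime = DateTime.UtcNow.AddHours(1)
});
string url = blob.Uri + sas;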

How to read a .csv file located in Azure?

My apologies for asking this basic question. I'm very new to the Azure environment.
I have stored log files in Azure as .csv files.
I want to view a .csv file without downloading it.
Azure already gives me a URL for this file, but I am unable to view it.
This is the link that Azure provides for my .csv file:
https://xxxxxx.xfile.xcore.windows.net/cctvfeedfs/log/testcsv.csv
FYI, I do have a SAS signature; when I combine it with the URL, it downloads the .csv file, for example:
https://xxxxxx.xfile.xcore.windows.net/cctvfeedfs/log/testcsv.csv?sv=2017-11-09&ss=bfqt&srt=sco&sp=rwdlacup&se=2099-10-04T09:06:59Z&st=2018-10-04T01:06:59Z&spr=https&sig=%2Fb%2BrssXtUP5V%2F9%2BSXzpSauyugpG%2BvXOfn9GqLfdf1EOUE%3D
But I don't actually want to download it, I just want to view it.
Is there any way for me to view the content of the .csv without downloading it?
Please help. Thank you in advance!
What can I do to view the content online without downloading it?
If your container is not public, the URL can't be used to view the content of the file directly; otherwise there would be no security at all for your files.
So please refer to the official documents Secure access to an application's data in the cloud and Using shared access signatures (SAS). Then we need to generate a blob URL with a SAS signature for access.
Here is sample Java code to generate a blob URL with a SAS signature.
SharedKeyCredentials credentials = new SharedKeyCredentials(accountName, accountKey);

ServiceSASSignatureValues values = new ServiceSASSignatureValues()
        .withProtocol(SASProtocol.HTTPS_ONLY)             // Users MUST use HTTPS (not HTTP).
        .withExpiryTime(OffsetDateTime.now().plusDays(2)) // 2 days before expiration.
        .withContainerName(containerName)
        .withBlobName(blobName);

BlobSASPermission permission = new BlobSASPermission()
        .withRead(true)
        .withAdd(true)
        .withWrite(true);
values.withPermissions(permission.toString());

SASQueryParameters serviceParams = values.generateSASQueryParameters(credentials);
String sasSign = serviceParams.encode();

String blobUrlWithSAS = String.format(Locale.ROOT, "https://%s.blob.core.windows.net/%s/%s%s",
        accountName, containerName, blobName, sasSign);
You can also append the SAS signature to the string returned by blob.toURL():
String blobUrlWithSAS = blob.toString()+sasSign;
For more about SAS signatures, you can refer to the sample code for the ServiceSASSignatureValues class and the AccountSASSignatureValues class.
You could check the ContentType of your csv file in the Azure Storage Explorer tool.
If you change it to text/plain, the content will be shown directly in the browser.
By the way, you can set the content type when you upload the file (please see this SO case: Uploading blockblob and setting contenttype).
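As a small sketch of that last point (classic WindowsAzure.Storage SDK for the blob case from the linked post; blob and file names are placeholders), the content type can be set when uploading, or fixed afterwards on an existing blob:
CloudBlockBlob csvBlob = container.GetBlockBlobReference("log/testcsv.csv"); // placeholder
csvBlob.Properties.ContentType = "text/plain"; // or "text/csv"
using (var fs = System.IO.File.OpenRead(@"testcsv.csv"))
{
    csvBlob.UploadFromStream(fs); // properties are sent with the upload
}

// For a blob that is already uploaded, update the property in place.
csvBlob.Properties.ContentType = "text/plain";
csvBlob.SetProperties();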

Getting Image from API, then storing it as Azure Storage Blob (image error)

So I'm trying to do this:
Get users from AD using the Graph API in an Azure Function (C# or Node)
For each user, get their photo using the Graph API (in the same Azure Function)
With the photo data, upload it as a blob to Azure Storage
I have steps 1 and 2 working correctly, but I have no idea how to convert that image/jpeg response into a blob in Azure Storage. I've tried a lot and researched a lot, but it's been really difficult.
I've tried to use
blob.Properties.ContentType = "image/jpeg"
blob.UploadText(imgString);
But it doesn't work.
So my code looks something like this:
I get a fresh OAuth token from Azure (good for 3600 seconds)
Get /Users from the AD Graph API
For each user, I use the /user//photo/$value resource, which returns image/jpeg data.
From that data (a string?) I try blob.UploadText, but it fails.
The way I'm getting the image data from the Graph API is with RestSharp, like this:
var client = new RestClient("https://graph.microsoft.com/v1.0/users/" + email + "/photo/$value");
var request = new RestRequest(Method.GET);
request.AddHeader("cache-control", "no-cache");
request.AddHeader("authorization", "Bearer " + token);
request.AddHeader("content-type", "image/jpeg");
return client.Execute(request);
So I return an IRestResponse, which contains something like this:
response.ContentType //to get the content type
response.Content // to get the body (the image)
blob.UploadText(response.Content);
And that's what I'm trying to do, but it doesn't work: the file gets saved OK, but when you open it you can't really see the image. I think the issue might be encoding; I've tried different encoding types with no luck.
Take a look at this next picture. On the right, I got the image from the Graph API using PHP, set the header to image/jpeg, echoed the image data, and it works. On the left is the Azure Function (JavaScript or C#): I get the image, and when I try to do the same (show it in the browser) I get a different binary string and no picture on the page (as if it weren't image data), so it looks like the problem is encoding. I'm saving this binary data to a blob file with UploadText, but it's not working.
Any ideas?
Please use RawBytes as the blob content. It works correctly on my side.
blob.UploadFromByteArray(response.RawBytes, 0, response.RawBytes.Length);
The following is my test demo code:
var connectionString = "storage connection string";
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(connectionString);
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference("container");
container.CreateIfNotExists();
CloudBlockBlob blob = container.GetBlockBlobReference("test.jpeg");
blob.UploadFromByteArray(response.RawBytes, 0, response.RawBytes.Length);
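If the blob should also render in the browser when its URL is opened, the content type from the question can be combined with the byte-array upload; this is a sketch using the same SDK as the demo above, with placeholder names:
CloudBlockBlob blob = container.GetBlockBlobReference("test.jpeg");
blob.Properties.ContentType = response.ContentType; // e.g. "image/jpeg"
blob.UploadFromByteArray(response.RawBytes, 0, response.RawBytes.Length);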

Azure blob, controlling filename on download

Our application enables users to upload files via a web browser. The file name gets mapped to a GUID and the blob name becomes the GUID. When the user clicks the file in our application it should download to their file system (show save as, etc) using the original file name and not the GUID blob name.
I found this post, and similar posts that describe how to set the Content-Disposition on the blob when downloading through a Shared Access Signature.
Friendly filename when public download Azure blob
However, our situation is a little different. We set a single SAS at the Container level (technically this is called a Shared Access Policy I believe -- you can have up to 5 at any given time). When downloading the blob, we simply append the SAS to the end of the uri, and use...
window.location.href = blobUri + containerSAS;
...to download the blob. This downloads the blob, but uses the GUID filename.
How can we take an existing SAS that applies to the Container and have the blob download as the original filename?
Keep in mind this is a slightly different use case from a SAS applied to an individual blob in that...
The SAS is at the Container level (it seems this is the best practice vs. individual SAS's for each blob).
We are downloading from javascript (vs. C# code where you can set the headers).
I have tried to set the Content-Disposition of the blob during the upload process (PUT operation), but it doesn't seem to make a difference when I download the blob. Below, you can see the Content-Disposition header being set for the PUT request (from Fiddler).
Thanks!
This post pointed us in the right direction: change the file name with the ContentDisposition property in Azure.Storage.Blobs.
BlobContainerClient container = OpenContainer(containerName);
BlobClient blob = container.GetBlobClient(sourceFilename);
var builder = new BlobSasBuilder(BlobSasPermissions.Read, DateTimeOffset.Now.AddMinutes(10));
builder.ContentDisposition = $"attachment; filename=\"{destFileName}\"";
var sasUri = blob.GenerateSasUri(builder);
I have a solution. I think it's more of a workaround, but for each file to be downloaded, I make a server call and create a special SAS for the download operation. I can set Content-Disposition with that, and now the GUID named blobs are downloading with their original filenames.
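For completeness, the same per-download trick is also possible with the older WindowsAzure.Storage SDK through the SharedAccessBlobHeaders overload; this is only a sketch, and the blob reference and file name variables are placeholders:
var blob = container.GetBlockBlobReference(guidBlobName); // the GUID-named blob
string sas = blob.GetSharedAccessSignature(
    new SharedAccessBlobPolicy
    {
        Permissions = SharedAccessBlobPermissions.Read,
        SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(10)
    },
    new SharedAccessBlobHeaders
    {
        ContentDisposition = "attachment; filename=\"" + originalFileName + "\""
    });
// Hand blob.Uri + sas back to the client and use it in window.location.href.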

Streaming Videos from Azure ( Blob or Media Services)

I have an MVC 4 website hosted in Azure that needs to upload a video and, on a different page, allow that uploaded video to be streamed back to a client player.
The first option allows the user to upload and encode the video (.mp4)
The second option is I manually upload and encode the video and provide the url to the user.
In either case, the video would be presented to the users on another page.
I'm having a devil of a time trying to get this to work. Any suggestions/working samples?
I used the code below in my C# application and it works properly.
public static string getBlobStreamURL(string fsBloblFilePath, string fsdirectory)
{
    CloudBlobContainer cloudBlobContainer = storageAccount.CreateCloudBlobClient().GetContainerReference(fsdirectory);
    var cloudBlob = cloudBlobContainer.GetBlockBlobReference(fsBloblFilePath);

    // Generate a read-only SAS that is valid for one hour.
    var sharedAccessSignature = cloudBlob.GetSharedAccessSignature(new Microsoft.WindowsAzure.Storage.Blob.SharedAccessBlobPolicy()
    {
        Permissions = Microsoft.WindowsAzure.Storage.Blob.SharedAccessBlobPermissions.Read,
        SharedAccessExpiryTime = DateTime.UtcNow.AddHours(1)
    });

    var streamURL = string.Format("{0}{1}", cloudBlob.Uri, sharedAccessSignature);
    return streamURL;
}
I've written a previous answer on how to serve videos from Azure.
You'll want to use Azure Blob Storage to store the files. This will scale very well, let you take advantage of the Azure CDN for faster delivery, and the outgoing traffic won't count against your website instances.
You can then use any HTML5 or Flash player you want. One important thing: make sure you set the Content-Type when saving the file. You can also change the Azure default service version to support video seeking.
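If it helps, here is a rough sketch of both points with the classic WindowsAzure.Storage SDK (container and file names are assumptions): set the content type when saving the video, and raise the default service version so byte-range requests, and therefore seeking, work in HTML5 players:
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();

// Set the content type when uploading the video.
CloudBlockBlob videoBlob = blobClient.GetContainerReference("videos").GetBlockBlobReference("clip.mp4");
videoBlob.Properties.ContentType = "video/mp4";
using (var fs = System.IO.File.OpenRead("clip.mp4"))
{
    videoBlob.UploadFromStream(fs);
}

// Raise the default service version so range (seek) requests are honored.
ServiceProperties props = blobClient.GetServiceProperties();
props.DefaultServiceVersion = "2013-08-15"; // for example; any recent version works
blobClient.SetServiceProperties(props);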
