Our application enables users to upload files via a web browser. The file name is mapped to a GUID, and the GUID becomes the blob name. When the user clicks the file in our application it should download to their file system (showing the Save As dialog, etc.) using the original file name, not the GUID blob name.
I found this post, and similar posts that describe how to set the Content-Disposition on the blob when downloading through a Shared Access Signature.
Friendly filename when public download Azure blob
However, our situation is a little different. We set a single SAS at the container level (technically this is called a stored access policy, I believe; you can have up to 5 at any given time). When downloading the blob, we simply append the SAS to the end of the URI and use...
window.location.href = blobUri + containerSAS;
...to download the blob. This downloads the blob, but uses the GUID filename.
How can we take an existing SAS that applies to the Container and have the blob download as the original filename?
Keep in mind this is a slightly different use case from a SAS applied to an individual blob in that...
The SAS is at the Container level (it seems this is the best practice vs. individual SAS's for each blob).
We are downloading from javascript (vs. C# code where you can set the headers).
I have tried to set the Content-Disposition of the blob during the upload process (PUT operation), but it doesn't seem to make a difference when I download the blob. Below, you can see the Content-Disposition header being set for the PUT request (from Fiddler).
Thanks!
This post pointed us in the right direction for changing the file name with the ContentDisposition property in Azure.Storage.Blobs:
BlobContainerClient container = OpenContainer(containerName);
BlobClient blob = container.GetBlobClient(sourceFilename);
var builder = new BlobSasBuilder(BlobSasPermissions.Read, DateTimeOffset.Now.AddMinutes(10));
builder.ContentDisposition = $"attachment; filename=\"{destFileName}\"";
var sasUri = blob.GenerateSasUri(builder);
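For reference, the filename in a Content-Disposition value should be quoted, since an unquoted name containing spaces can be truncated by some browsers. A tiny helper to build the value (plain JavaScript; the function name is made up for illustration):

```javascript
// Build a Content-Disposition value that forces a "Save As" download
// with the original filename. Quoting handles names containing spaces;
// embedded double quotes are escaped per the quoted-string rules.
function contentDisposition(fileName) {
  const escaped = fileName.replace(/"/g, '\\"');
  return `attachment; filename="${escaped}"`;
}

console.log(contentDisposition('Quarterly Report.pdf'));
// attachment; filename="Quarterly Report.pdf"
```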
I have a solution. I think it's more of a workaround, but for each file to be downloaded, I make a server call and create a dedicated SAS for the download operation. I can set Content-Disposition with that, and now the GUID-named blobs download with their original filenames.
Related
I have created a storage account with a container named data.
In that container I have a single .zip file.
I'm generating an Account Key SAS token with Read permission directly on the data container.
The Blob SAS URL looks like this:
https://<STORAGE_ACCOUNT>.blob.core.windows.net/data?sp=r&st=2022-06-06T15:23:31Z&se=2022-06-06T23:23:31Z&spr=https&sv=2020-08-04&sr=c&sig=<SIGNATURE>
How am I supposed to download my zip file from that URI?
I'm always running into an Authorization error, whereas I thought having the link was enough, and unfortunately the documentation didn't help me figure out what's wrong.
I would like to download the file from a HTTP call, not using az copy or powershell.
From your description and the URL you provided, I guess the issue is that you didn't reference the name of the zip file in the URL.
so instead of
https://<STORAGE_ACCOUNT>.blob.core.windows.net/data?sp=r&st=2022-06-06T15:23:31Z&se=2022-06-06T23:23:31Z&spr=https&sv=2020-08-04&sr=c&sig=<SIGNATURE>
try
https://<STORAGE_ACCOUNT>.blob.core.windows.net/data/<ZIP_NAME>?sp=r&st=2022-06-06T15:23:31Z&se=2022-06-06T23:23:31Z&spr=https&sv=2020-08-04&sr=c&sig=<SIGNATURE>
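The fix is just string surgery on the URL: insert the blob name into the path, keeping the SAS query string intact. A small sketch in plain JavaScript (the function name and URLs are placeholders):

```javascript
// Given a container-level SAS URL and a blob name, build the blob's
// download URL by inserting the blob name into the path before the
// SAS query string.
function blobUrlFromContainerSas(containerSasUrl, blobName) {
  const url = new URL(containerSasUrl);
  // encodeURIComponent would escape "/" in virtual-directory paths,
  // so encode each path segment separately.
  const encoded = blobName.split('/').map(encodeURIComponent).join('/');
  url.pathname = url.pathname.replace(/\/?$/, '/') + encoded;
  return url.toString();
}
```

Note that the SAS must have been created with blob-read permission at the container scope (sr=c, sp=r, as in the URL above) for the resulting link to work.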
Using logic apps and event grid, I have no problem triggering event and getting blob properties, but how do I feed this into an Azure file server?
I can see the blob object, the URL and so on. I can use a Compose action to get the URL. When I pass the URL into a Copy File action of the Azure File object, it gives me a 404 saying the file doesn't exist (hence the HTTP 404 response code). Do I need to get the blob content into a variable and write that to a file? Do I need to create a SAS URI from the blob path and then use the SAS URI? The latter is what you'd do if you were sending a "click here to get blob" file link to a colleague. But my thought is that the blob exists as an accessible object when the trigger occurs (Event Grid sees a file created in the storage account). The documentation is not helping me.
For this requirement, you need to get the blob content first, then use the "Create file" action to create it on your file server. Please refer to my logic app below:
In my logic app, I get the blob content using the path of the blob. Since you mentioned you have the URL of the blob, I think you can take a substring of the URL to get the path, and then put the path into the "Blob" box of the "Get blob content" action.
I have had similar issues and found the 404 was regarding the formatting of the dynamic path returned from the blob.
I ended up (after a lot of hair pulling) stripping out the path using Compose and a bit of hard coded text in a "Get blob content using path" action.
Essentially (this is for my use case, which was pulling JSON files out of blob storage into a Log Analytics workspace, but parts may be applicable to yours):
Get Subject (from the Eventgrid blob trigger action)
Compose
Inputs - (Subject)
Get Blob content using path
Blob path - /directory/substring(...)
Infer Content Type - NO
(The substring expression I am using here is substring(outputs('Compose'), x), where x is the number of characters before my hardcoded directory in the path.)
Then -
Initialize variable
Name - BlobContentAsText
Type - String
Value - File Content (from Get blob content using path)
Then -
Send Data (Preview)
JSON Request Body - BlobContentAsText
Custom Log Name - Logs_CL
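The same extraction can be sketched outside Logic Apps. For blob events, the Event Grid subject has the form /blobServices/default/containers/<container>/blobs/<path>, so anchoring on the /blobs/ marker avoids the hard-coded character count that the substring trick above relies on. A plain JavaScript sketch (container and file names are made up):

```javascript
// Extract the blob path from an Event Grid blob-event subject, e.g.
// "/blobServices/default/containers/mycontainer/blobs/dir/file.json".
// Anchors on the "/blobs/" marker instead of a hard-coded offset, so it
// still works if the container name changes length.
function blobPathFromSubject(subject) {
  const marker = '/blobs/';
  const i = subject.indexOf(marker);
  if (i === -1) throw new Error('not a blob event subject: ' + subject);
  return subject.slice(i + marker.length);
}
```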
I have uploaded a number of images to a Blob container on an Azure storage account of type StorageV2 (general purpose v2).
These were uploaded programmatically. Here's the code I used:
public Task CopyFile(string fileName, string targetPath)
{
var blobRef = Container.GetBlockBlobReference(targetPath);
blobRef.Properties.ContentType = GetContentType(fileName);
return blobRef.UploadFromFileAsync(fileName);
}
public string GetContentType(string fileName)
{
var provider = new FileExtensionContentTypeProvider();
if (!provider.TryGetContentType(fileName, out var contentType))
{
contentType = "application/octet-stream";
}
return contentType;
}
Container is an initialized CloudBlobContainer instance.
When I use the Storage Explorer I can see the uploaded files. If I view the properties of any file it lists a Uri property. However, if I copy the value (a URL) and paste into a browser I see the following error page:
<Error>
<Code>ResourceNotFound</Code>
<Message>
The specified resource does not exist. RequestId:12485818-601e-0017-6f69-56c3df000000 Time:2019-08-19T08:35:13.2123849Z
</Message>
</Error>
But if I double-click the file in Storage Explorer it downloads the image correctly. The URL it uses is the same as the one I copied earlier as far as I could tell, except for some additional querystrings that look like this: ?sv=2018-03-28&ss=bqtf&srt=sco&sp=rwdlacup&se=2019-08-19T16:49:38Z&sig=%2FJs7VnGKsjplalKXCcl0XosgUkPWJccg0qdvCSZlDSs%3D&_=1566204636804
I assume this must mean my blobs are not publicly available, but I can't find any setting that will make my images available publicly at their known URI. Can anyone point me in the right direction here? Thank you.
Check the access level set on your container.
If it is Private, you will get the error you are experiencing: ResourceNotFound.
As far as I know, if your container's access level is Private and you use the direct URL to access the blob, you will get this error. If you want to access it, you need to generate a SAS token for it.
For more details, please refer to
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-manage-access-to-resources
https://learn.microsoft.com/en-us/azure/storage/common/storage-sas-overview
My apologies for asking this basic question. I'm very new in Azure environment.
I have stored log files in Azure portal as .csv
I want to view this .csv file without downloading it.
Azure already gives a URL link for this file, but I am unable to view it.
This is the link that provides by Azure for my .csv file:
https://xxxxxx.xfile.xcore.windows.net/cctvfeedfs/log/testcsv.csv
FYI, I do have a SAS signature; when I combine this SAS signature with the URL, it downloads the .csv file. Example:
https://xxxxxx.xfile.xcore.windows.net/cctvfeedfs/log/testcsv.csv?sv=2017-11-09&ss=bfqt&srt=sco&sp=rwdlacup&se=2099-10-04T09:06:59Z&st=2018-10-04T01:06:59Z&spr=https&sig=%2Fb%2BrssXtUP5V%2F9%2BSXzpSauyugpG%2BvXOfn9GqLfdf1EOUE%3D
But I don't actually want to download it, just view it.
Is there any possible way I can view the content of the .csv without downloading it?
Please help. Thank you in advance!
What can I do to view the content online without downloading it?
If your container is not public, the content of the file can't be viewed directly via the URL; otherwise there would be no security at all for your files.
So please refer to the official documents Secure access to an application's data in the cloud and Using shared access signatures (SAS). We need to generate a blob URL with a SAS signature for access.
Here is the sample java code to generate a blob url with SAS signature.
SharedKeyCredentials credentials = new SharedKeyCredentials(accountName, accountKey);
ServiceSASSignatureValues values = new ServiceSASSignatureValues()
.withProtocol(SASProtocol.HTTPS_ONLY) // Users MUST use HTTPS (not HTTP).
.withExpiryTime(OffsetDateTime.now().plusDays(2)) // Expires in 2 days.
.withContainerName(containerName)
.withBlobName(blobName);
BlobSASPermission permission = new BlobSASPermission()
.withRead(true)
.withAdd(true)
.withWrite(true);
values.withPermissions(permission.toString());
SASQueryParameters serviceParams = values.generateSASQueryParameters(credentials);
String sasSign = serviceParams.encode();
String blobUrlWithSAS = String.format(Locale.ROOT, "https://%s.blob.core.windows.net/%s/%s%s",
accountName, containerName, blobName, sasSign);
You can also append the SAS signature to the string form of blob.toURL().
String blobUrlWithSAS = blob.toURL().toString() + sasSign;
About SAS Signature, you can refer to these sample codes in ServiceSASSignatureValues Class and AccountSASSignatureValues Class.
You could check the ContentType of your csv file with the Azure Storage Explorer tool.
If you change it to text/plain, the browser can show the content directly.
BTW, you can set the content type when you upload the file. (Please see this SO case: Uploading blockblob and setting contenttype.)
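As a rough illustration of that upload-time content-type logic in plain JavaScript (the extension map here is made up and far from complete; it mirrors the fallback-to-octet-stream pattern from the C# GetContentType snippet earlier in this thread):

```javascript
// Map a file extension to a Content-Type, falling back to
// application/octet-stream for anything unrecognized.
const CONTENT_TYPES = {
  '.csv': 'text/plain',   // text/plain so browsers render it inline
  '.txt': 'text/plain',
  '.json': 'application/json',
  '.png': 'image/png',
  '.jpg': 'image/jpeg',
};

function getContentType(fileName) {
  const dot = fileName.lastIndexOf('.');
  const ext = dot === -1 ? '' : fileName.slice(dot).toLowerCase();
  return CONTENT_TYPES[ext] || 'application/octet-stream';
}
```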
I started using SAS today for the first time and I was intrigued by it, but would I really need it in my current project?
On my azure website, I want users to be able to upload and download blobs. Every user has their own account.
This is a simple upload function:
//Upload blob
CloudBlobContainer container = CloudStorageServices.GetCloudBlobsContainer();
CloudBlockBlob blockBlob = container.GetBlockBlobReference(file.FileName + guid);
blockBlob.UploadFromStream(file.InputStream);
//Get uri from blob
var blobUrl = blockBlob.Uri;
//Upload table
CloudTable table2 = CloudStorageServices.GetCloudUploadsTable();
UploadEntity uploadtable = new UploadEntity(imagename, entity.RowKey + guid, blobUrl.ToString(), User.Identity.Name, file.FileName + guid);
TableOperation insertOperation = TableOperation.InsertOrReplace(uploadtable);
table2.Execute(insertOperation);
I don't really see the point in using SAS here, but I have a feeling I'm totally wrong.
Why should I use it?
When a user has uploaded a blob, the blobs belonging to that user (uploaded by them) will be listed, and the user will ONLY be able to select and download their own items, via a JavaScript download.
Same here: why would I need SAS?
Thanks!!
If you have a website or web service in front of your blobs, then you don't need to use SAS, since you can control the user permissions yourself. I use SAS mostly in cases where I cannot control the user permissions.
For example, when we need to display images directly from the blob URL, I need to prevent them from being loaded and displayed by other websites. So I generate SAS tokens for the images that need to be displayed on that page, put the SAS URLs into the page HTML, and give them an expiry of a few seconds. If anyone else tries to read the images via these SAS URLs later, the URLs will have expired.
Another example might be upload. Unlike your scenario, if you need someone to upload directly to blob storage, then you might need SAS, since you should not give them your storage security keys.
Hope this helps a bit.