Azure Blob Storage Temporary File URL

I have saved PDF files in an Azure Blob Storage container. I want to show these files on my website, but once a file is rendered in the HTML its link should be deactivated, meaning no one can use that link to download the file again. Is this possible with Azure Blob Storage?

You can use a shared access signature (SAS) policy to create a temporary link:
CloudStorageAccount account = CloudStorageAccount.Parse("yourStringConnection");
CloudBlobClient serviceClient = account.CreateCloudBlobClient();
var container = serviceClient.GetContainerReference("yourContainerName");
container.CreateIfNotExistsAsync().Wait();
CloudBlockBlob blob = container.GetBlockBlobReference("test/helloworld.txt");
blob.UploadTextAsync("Hello, World!").Wait();
SharedAccessBlobPolicy policy = new SharedAccessBlobPolicy();
// define the expiration time
policy.SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(1);
// define the permission
policy.Permissions = SharedAccessBlobPermissions.Read;
// create signature
string signature = blob.GetSharedAccessSignature(policy);
// get full temporary uri
Console.WriteLine(blob.Uri + signature);

If I understand correctly, you're looking for single-use links to Azure blobs. Natively this feature is not available in Azure Storage. You would need to write code to implement something like this yourself, keeping track of the number of times a link has been used and refusing to serve the link once the limit is exceeded.
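A minimal sketch of that idea, assuming you serve downloads through your own endpoint rather than handing out blob URLs directly (all names below are illustrative, not an Azure API): a token is issued per page render, can be redeemed at most once, and on redemption your endpoint mints a short-lived SAS like the one shown above and redirects to it.
using System;
using System.Collections.Concurrent;
public static class OneTimeLinks
{
    // token -> blob name; a real implementation would persist this in a
    // database or distributed cache rather than in process memory
    private static readonly ConcurrentDictionary<string, string> Tokens =
        new ConcurrentDictionary<string, string>();
    // Call when rendering the page; embed the returned token in the link
    public static string Issue(string blobName)
    {
        string token = Guid.NewGuid().ToString("N");
        Tokens[token] = blobName;
        return token;
    }
    // Succeeds exactly once per token; later calls fail, which is what
    // makes the link single-use
    public static bool TryRedeem(string token, out string blobName)
    {
        return Tokens.TryRemove(token, out blobName);
    }
}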

Related

Is there a way to access Azure Blob Storage using a SAS (shared access signature) URI in Java/Python code

I have an Azure SAS URI and I need to upload a file to it. Can I do this using Java code? I can only find examples that use the account key rather than the SAS (shared access signature) URI.
You can use the Azure Storage SDK for Java. To upload a file, you just need the class CloudBlockBlob, which should look like this:
// parse the SAS URL as a URI and address the blob directly
try
{
    CloudBlockBlob blob = new CloudBlockBlob(new URI("yourSASUrl"));
    File source = new File(filePath);
    blob.upload(new FileInputStream(source), source.length());
}
catch (Exception e)
{
    e.printStackTrace();
}

How do I read and write to immutable blob storage in azure

I would like to put audit data in immutable blob storage. Is there a specific format it needs to be written in?
Also, once written, how do we query or view this data in case we have to look at it?
Would Log Analytics be able to do this?
You have to create an immutability policy. You can do it using one of the management SDKs or using the portal:
Create a blob container in any general-purpose v2 storage account, then navigate to its properties and create a policy.
I used a legal hold policy; there is also a time-based retention policy.
Official documentation: here
You can read and write it using standard methods, for example (C#):
var storageAccount = CloudStorageAccount.Parse(CloudStorageConnectionString);
var blobClient = storageAccount.CreateCloudBlobClient();
var container = blobClient.GetContainerReference(StorageContainer);
var blob = container.GetBlockBlobReference("test");
blob.UploadText("Content");
Console.WriteLine(blob.DownloadText());
blob.UploadText("Modified Content"); // fails with a (409) conflict error with ErrorCode: BlobImmutableDueToLegalHold

How to set content disposition on individual azure blob requests?

I have an application that hosts videos, and we recently migrated to Azure.
On our old application we gave the ability for users to either play or download the video. However on Azure it seems like I have to pick between which functionality I want, as the content disposition has to be set on the file and not on the request.
So far I have come up with two very poor solutions.
The first solution is streaming the download through my MVC server.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(ConfigurationManager.AppSettings["StorageConnectionString"]);
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference("videos");
string userFileName = service.FirstName + service.LastName + "Video.mp4";
Response.AddHeader("Content-Disposition", "attachment; filename=" + userFileName); // force download
container.GetBlobReference(service.Video.ConvertedFilePath).DownloadToStream(Response.OutputStream);
return new EmptyResult();
This option works okay for smaller videos, but it is very taxing on my server. For larger videos the operation times out.
The second option is hosting every video twice.
This option is obviously bad, as I will have to pay double the storage cost.
"However on Azure it seems like I have to pick between which functionality I want, as the content disposition has to be set on the file and not on the request."
There's a workaround for that. As you may know there's a Content-Disposition property that you can define on a blob. However when you define a value for this property, it will always be applied on that blob. When you want to selectively apply this property on a blob (say on a per request basis), what you do is create a Shared Access Signature (SAS) on that blob and override this request header there. Then you can serve the blob via SAS URL.
Here's the sample code for this:
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(ConfigurationManager.AppSettings["StorageConnectionString"]);
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference("videos");
string userFileName = service.FirstName + service.LastName + "Video.mp4";
CloudBlockBlob blob = container.GetBlockBlobReference(userFileName);
SharedAccessBlobPolicy policy = new SharedAccessBlobPolicy()
{
    Permissions = SharedAccessBlobPermissions.Read,
    SharedAccessExpiryTime = DateTime.UtcNow.AddHours(1)
};
SharedAccessBlobHeaders blobHeaders = new SharedAccessBlobHeaders()
{
    ContentDisposition = "attachment; filename=" + userFileName
};
string sasToken = blob.GetSharedAccessSignature(policy, blobHeaders);
var sasUrl = blob.Uri.AbsoluteUri + sasToken; // this is the URL you will use; it forces the user to download the video
I wrote a blog post about the same long time ago that you may find useful: http://gauravmantri.com/2013/11/28/new-changes-to-windows-azure-storage-a-perfect-thanksgiving-gift/.
As far as I know, Azure Blob Storage doesn't support adding a custom header for a particular container.
I suggest you follow and vote for this feedback to push the Azure development team to support this feature.
As a workaround, you could compress the video file first and then upload the archive to blob storage.
An archive will not be opened inline by the browser, so it is always downloaded.
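A sketch of that workaround, assuming the same WindowsAzure.Storage SDK used elsewhere on this page (connection string, paths, and names are illustrative):
using System.IO.Compression;
using Microsoft.WindowsAzure.Storage;
var account = CloudStorageAccount.Parse("yourConnectionString");
var container = account.CreateCloudBlobClient().GetContainerReference("videos");
// wrap the video in a zip archive; browsers download archives instead of
// attempting to play them inline
using (var zip = ZipFile.Open(@"D:\video.zip", ZipArchiveMode.Create))
{
    zip.CreateEntryFromFile(@"D:\video.mp4", "video.mp4");
}
var blob = container.GetBlockBlobReference("video.zip");
blob.UploadFromFile(@"D:\video.zip");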

How to download files with white space on name on Azure Blob Storage?

I'm trying to download a file from this URL:
https://renatoleite.blob.core.windows.net/mycontainer/documents/Test Document.pdf
The browser is changing the URL to this:
https://renatoleite.blob.core.windows.net/mycontainer/documents/Test%20Document.pdf
My file in the blob storage has the name: Test Document.pdf
So when I click to download, Azure says the file does not exist:
The specified resource does not exist.
Probably because the browser is trying to get the file with "%20" in the name.
How I can solve this?
As far as I know, if you upload a file whose name contains a space using the Azure Storage API, the space in the URL is automatically encoded (replaced with %20).
You could see below example:
I uploaded the Test Document.pdf to the blob storage.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
CloudConfigurationManager.GetSetting("StorageConnectionString"));
// Create the blob client.
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
// Retrieve a reference to a container.
CloudBlobContainer container = blobClient.GetContainerReference("brando");
// Create the container if it doesn't already exist.
container.CreateIfNotExists();
// Retrieve reference to a blob named "Test Document.pdf".
CloudBlockBlob blockBlob = container.GetBlockBlobReference("Test Document.pdf");
// Create or overwrite the blob with contents from a local file.
using (var fileStream = System.IO.File.OpenRead(@"D:\Test Document.pdf"))
{
    blockBlob.UploadFromStream(fileStream);
}
Then I suggest you check the blob's URL from its properties, using Storage Explorer (right-click the blob and open Properties to see its URL) or the Azure portal.
You will find the space replaced with %20; this encoded URL is the correct one to use for downloading.
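If you'd rather not look it up manually, the SDK also exposes the encoded URI directly on the blob reference, e.g. continuing the snippet above (account name illustrative):
Console.WriteLine(blockBlob.Uri.AbsoluteUri);
// prints https://<youraccount>.blob.core.windows.net/brando/Test%20Document.pdf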

Avoid over-writing blobs AZURE on the server

I have a .NET app which uses the WebClient and the SAS token to upload a blob to the container. The default behaviour is that a blob with the same name is replaced/overwritten.
Is there a way to change it on the server, i.e. to prevent an already existing blob from being replaced?
I've seen the Avoid over-writing blobs AZURE but it is about the client side.
My goal is to secure the server from overwritting blobs.
AFAIK the file is uploaded directly to the container without a chance to intercept the request and check e.g. existence of the blob.
Edited
Let me clarify: My client app receives a SAS token to upload a new blob. However, an evil hacker can intercept the token and upload a blob with an existing name. Because of the default behavior, the new blob will replace the existing one (effectively deleting the good one).
I am aware of different approaches to deal with the replacement on the client. However, I need to do it on the server, somehow even against the client (which could be compromised by the hacker).
You can issue the SAS token with "create" permissions, and without "write" permissions. This will allow the user to upload blobs up to 64 MB in size (the maximum allowed Put Blob) as long as they are creating a new blob and not overwriting an existing blob. See the explanation of SAS permissions for more information.
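A sketch of issuing such a token with the SDK used elsewhere on this page, assuming you already have a container reference (blob name and lifetime are illustrative):
SharedAccessBlobPolicy policy = new SharedAccessBlobPolicy
{
    // Create permits writing new blobs; without Write, an attempt to
    // overwrite an existing blob is rejected by the service
    Permissions = SharedAccessBlobPermissions.Create,
    SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(15)
};
CloudBlockBlob blob = container.GetBlockBlobReference("upload-target.dat");
string uploadUrl = blob.Uri.AbsoluteUri + blob.GetSharedAccessSignature(policy);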
There is no configuration on the server side, but you can implement the check in code using the storage client SDK:
// retrieve reference to a previously created container.
var container = blobClient.GetContainerReference(containerName);
// retrieve reference to a blob.
var blobReference = container.GetBlockBlobReference(blobName);
// upload only if the blob does not already exist.
if (!blobReference.Exists())
{
    blobReference.UploadFromFile(filePath);
}
You could do something similar using the REST API:
https://learn.microsoft.com/en-us/rest/api/storageservices/fileservices/blob-service-rest-api
Get Blob Properties will return 404 if the blob does not exist.
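Note that a separate existence check can race with a concurrent upload; the atomic REST-level variant is to send Put Blob with an If-None-Match: * conditional header, sketched here with HttpClient (URL and content are illustrative):
using System;
using System.Net.Http;
using System.Net.Http.Headers;
var blobUrlWithSas = "https://youraccount.blob.core.windows.net/container/file.txt?sv=...";
using (var client = new HttpClient())
{
    var request = new HttpRequestMessage(HttpMethod.Put, blobUrlWithSas);
    request.Headers.Add("x-ms-blob-type", "BlockBlob");
    request.Headers.IfNoneMatch.Add(EntityTagHeaderValue.Any); // "*": create only, never overwrite
    request.Content = new StringContent("Content");
    var response = client.SendAsync(request).Result;
    Console.WriteLine((int)response.StatusCode); // 201 Created, or 409 Conflict if the blob exists
}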
Is there a way to change it on the server, i.e. prevents from replacing the already existing blob?
Azure Storage exposes the Blob service REST API for operations against blobs. To upload or update a blob (file), you invoke the Put Blob REST API, which states the following:
The Put Blob operation creates a new block, page, or append blob, or updates the content of an existing block blob. Updating an existing block blob overwrites any existing metadata on the blob. Partial updates are not supported with Put Blob; the content of the existing blob is overwritten with the content of the new blob.
In order to avoid overwriting existing blobs, you need to explicitly specify conditional headers for your blob operations. The simplest way is to leverage the Azure Storage SDK for .NET (essentially a wrapper over the Azure Storage REST API) and upload your blob (file) as follows:
try
{
    var container = new CloudBlobContainer(new Uri($"https://{storageName}.blob.core.windows.net/{containerName}{containerSasToken}"));
    var blob = container.GetBlockBlobReference("{blobName}");
    // the access condition makes the upload fail if the blob already exists
    blob.UploadFromFile("{filepath}", accessCondition: AccessCondition.GenerateIfNotExistsCondition());
}
catch (StorageException se)
{
    var requestResult = se.RequestInformation;
    if (requestResult != null)
    {
        // 409: The specified blob already exists.
        Console.WriteLine($"HttpStatusCode:{requestResult.HttpStatusCode},HttpStatusMessage:{requestResult.HttpStatusMessage}");
    }
}
Also, you could combine your blob name with the MD5 hash of your file before uploading to Azure Blob Storage.
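A sketch of that naming scheme (file path illustrative):
using System;
using System.IO;
using System.Security.Cryptography;
string filePath = @"D:\report.pdf";
string hash;
using (var md5 = MD5.Create())
using (var stream = File.OpenRead(filePath))
{
    hash = BitConverter.ToString(md5.ComputeHash(stream)).Replace("-", "").ToLowerInvariant();
}
// e.g. "report-9e107d9d372bb6826bd81d3542a419d6.pdf"; identical content maps
// to the same name, so a re-upload collides instead of silently replacing different data
string blobName = Path.GetFileNameWithoutExtension(filePath) + "-" + hash + Path.GetExtension(filePath);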
As far as I know, there is no configuration in the Azure portal or the storage tools to achieve this on the server side. You could try posting your feedback to the Azure Storage team.
