I have an application that hosts videos, and we recently migrated to Azure.
On our old application we gave users the ability to either play or download the video. However, on Azure it seems like I have to pick which functionality I want, as the content disposition has to be set on the file and not on the request.
So far I have come up with two very poor solutions.
The first solution is streaming the download through my MVC server.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(ConfigurationManager.AppSettings["StorageConnectionString"]);
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference("videos");
string userFileName = service.FirstName + service.LastName + "Video.mp4";
Response.AddHeader("Content-Disposition", "attachment; filename=" + userFileName); // force download
container.GetBlobReference(service.Video.ConvertedFilePath).DownloadToStream(Response.OutputStream);
return new EmptyResult();
This option works okay for smaller videos, but it is very taxing on my server. For larger videos the operation times out.
The second option is hosting every video twice.
This option is obviously bad, as I will have to pay double the storage cost.
However, on Azure it seems like I have to pick which functionality I want, as the content disposition has to be set on the file and not on the request.
There's a workaround for that. As you may know, there's a Content-Disposition property that you can define on a blob. However, when you define a value for this property, it is always applied to that blob. When you want to apply this property selectively (say, on a per-request basis), you create a Shared Access Signature (SAS) on that blob and override this response header there. Then you serve the blob via the SAS URL.
Here's the sample code for this:
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(ConfigurationManager.AppSettings["StorageConnectionString"]);
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference("videos");
string userFileName = service.FirstName + service.LastName + "Video.mp4";
// Reference the blob by the path it is actually stored under;
// userFileName is only the name the user will see when downloading.
CloudBlockBlob blob = container.GetBlockBlobReference(service.Video.ConvertedFilePath);
SharedAccessBlobPolicy policy = new SharedAccessBlobPolicy()
{
    Permissions = SharedAccessBlobPermissions.Read,
    SharedAccessExpiryTime = DateTime.UtcNow.AddHours(1)
};
SharedAccessBlobHeaders blobHeaders = new SharedAccessBlobHeaders()
{
    // Override Content-Disposition only for requests made with this SAS.
    ContentDisposition = "attachment; filename=" + userFileName
};
string sasToken = blob.GetSharedAccessSignature(policy, blobHeaders);
// This is the URL you will use. It will force the user to download the video.
var sasUrl = blob.Uri.AbsoluteUri + sasToken;
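If you are generating this inside the same MVC action that previously streamed the blob, a minimal sketch of how to use the SAS URL (assuming a standard ASP.NET MVC controller) is simply to redirect the client to it instead of proxying the bytes:

// Send the browser straight to blob storage; the Content-Disposition override
// in the SAS forces the download without the traffic touching your web server.
return Redirect(sasUrl);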
I wrote a blog post about this a long time ago that you may find useful: http://gauravmantri.com/2013/11/28/new-changes-to-windows-azure-storage-a-perfect-thanksgiving-gift/.
As far as I know, Azure Blob Storage doesn't support adding a custom response header at the container level.
I suggest you follow and vote for this feedback item to encourage the Azure development team to support this feature.
As a workaround, you could compress the video file first and then upload the archive to Azure Blob Storage.
The archive will not be opened (played) by the browser, so clicking the link downloads it instead.
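A rough sketch of that workaround, assuming the video lives on local disk and you use the WindowsAzure.Storage and System.IO.Compression packages (the path, container name, and connection-string key below are placeholders):

using System.Configuration;
using System.IO;
using System.IO.Compression;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

// Zip the video locally, then upload the archive instead of the raw .mp4,
// so clicking the blob URL downloads the file instead of playing it inline.
string videoPath = @"C:\videos\Video.mp4"; // placeholder path
string zipPath = Path.ChangeExtension(videoPath, ".zip");

using (ZipArchive zip = ZipFile.Open(zipPath, ZipArchiveMode.Create))
{
    zip.CreateEntryFromFile(videoPath, Path.GetFileName(videoPath));
}

CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
    ConfigurationManager.AppSettings["StorageConnectionString"]);
CloudBlobContainer container = storageAccount.CreateCloudBlobClient().GetContainerReference("videos");
CloudBlockBlob zipBlob = container.GetBlockBlobReference(Path.GetFileName(zipPath));

using (FileStream fs = File.OpenRead(zipPath))
{
    zipBlob.UploadFromStream(fs);
}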
I have saved PDF files in Azure Blob Storage. I want to show these files on my website, but once a file has been rendered in the HTML, its link should be deactivated, meaning no one can use that link to download the file again. Is this possible in Azure Blob Storage?
You can use a shared access policy with a short expiration time to get close to this:
CloudStorageAccount account = CloudStorageAccount.Parse("yourStringConnection");
CloudBlobClient serviceClient = account.CreateCloudBlobClient();
var container = serviceClient.GetContainerReference("yourContainerName");
container.CreateIfNotExistsAsync().Wait();
CloudBlockBlob blob = container.GetBlockBlobReference("test/helloworld.txt");
blob.UploadTextAsync("Hello, World!").Wait();
SharedAccessBlobPolicy policy = new SharedAccessBlobPolicy();
// define the expiration time
policy.SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(1);
// define the permission
policy.Permissions = SharedAccessBlobPermissions.Read;
// create signature
string signature = blob.GetSharedAccessSignature(policy);
// get full temporary uri
Console.WriteLine(blob.Uri + signature);
If I understand correctly, you're looking for single-use links to Azure blobs. Natively this feature is not available in Azure Storage. You would need to write code to implement something like this, where you keep track of the number of times a link has been used and, once the limit is exceeded, refuse to process that link.
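One way to sketch that idea (the controller, the token dictionary, and the one-minute SAS window are all illustrative assumptions, not an Azure feature; a real application would persist redeemed tokens in a database):

using System;
using System.Collections.Concurrent;
using System.Web.Mvc;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public class DownloadsController : Controller
{
    // Illustrative only: in-memory store of download tokens that have been redeemed.
    private static readonly ConcurrentDictionary<string, bool> usedTokens =
        new ConcurrentDictionary<string, bool>();

    public ActionResult DownloadOnce(string token, string blobName)
    {
        // Reject the link if this token has already been redeemed once.
        if (!usedTokens.TryAdd(token, true))
            return new HttpStatusCodeResult(410); // Gone

        CloudStorageAccount account = CloudStorageAccount.Parse("yourStringConnection");
        CloudBlockBlob blob = account.CreateCloudBlobClient()
            .GetContainerReference("yourContainerName")
            .GetBlockBlobReference(blobName);

        // Hand out a SAS that also expires on its own after one minute.
        SharedAccessBlobPolicy policy = new SharedAccessBlobPolicy
        {
            Permissions = SharedAccessBlobPermissions.Read,
            SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(1)
        };

        return Redirect(blob.Uri + blob.GetSharedAccessSignature(policy));
    }
}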
I have a WordPress on Linux App running on Azure with a MySql database.
I need to be able to upload PDF files to Azure, and then have a link on the web site that users can click to view the PDF.
To be more specific, the document is a monthly invoice that is created on premise and then uploaded to Azure. The user will log in and then see a link that allows them to view the invoice.
What I don't know is how the document should be stored. Should it be stored in the MySql database? Or in some type of storage which can be linked to? Of course, it needs to be secure.
You can use Azure Blob Storage to upload and store your PDF documents. Each document stored has a link that can then be shown on your website. You can also protect these resources and use SAS or shared key authentication mechanisms to access them.
Greg, Blob storage would be your best option within Azure. Here's what it can do:
1. Serving images or documents directly to a browser
2. Storing files for distributed access
3. Streaming video and audio
4. Storing data for backup and restore, disaster recovery, and archiving
5. Storing data for analysis by an on-premises or Azure-hosted service
Any file stored within Azure Blobs can be accessed through a link, e.g.
https://storagesample.blob.core.windows.net/mycontainer/blob1.txt, or through an alias such as http://files.mycompany.com/somecontainer/blobs.txt
Full details can be accessed here: https://learn.microsoft.com/en-us/azure/storage/blobs/storage-dotnet-how-to-use-blobs
You can use Azure Blob Storage to store any file type.
As shown below, get the file name, MIME type and file data for the file you want to upload.
var filePath = @"C:\ConsoleApp1\Readme.pdf";
var fileName = Path.GetFileName(filePath);
string mimeType = MimeMapping.MimeUtility.GetMimeMapping(fileName);
byte[] fileData = File.ReadAllBytes(filePath);
string blobUrl = await objBlobService.UploadFileToBlobAsync(fileName, fileData, mimeType);
Here's the main method to upload files to Azure Blob
private async Task<string> UploadFileToBlobAsync(string strFileName, byte[] fileData, string fileMimeType)
{
    // The access key is the storage connection string from the Azure portal -
    // "DefaultEndpointsProtocol=https;AccountName=XXX;AccountKey=;EndpointSuffix=core.windows.net"
    CloudStorageAccount csa = CloudStorageAccount.Parse(accessKey);
    CloudBlobClient cloudBlobClient = csa.CreateCloudBlobClient();
    string containerName = "my-blob-container"; // Name of your blob container
    CloudBlobContainer cbContainer = cloudBlobClient.GetContainerReference(containerName);
    string fileName = this.GenerateFileName(strFileName);

    if (await cbContainer.CreateIfNotExistsAsync())
    {
        await cbContainer.SetPermissionsAsync(new BlobContainerPermissions { PublicAccess = BlobContainerPublicAccessType.Blob });
    }

    if (fileName != null && fileData != null)
    {
        CloudBlockBlob cbb = cbContainer.GetBlockBlobReference(fileName);
        cbb.Properties.ContentType = fileMimeType;
        await cbb.UploadFromByteArrayAsync(fileData, 0, fileData.Length);
        return cbb.Uri.AbsoluteUri;
    }

    return "";
}
Here's the reference URL. Make sure you install these NuGet packages:
Install-Package WindowsAzure.Storage
Install-Package MimeMapping
I'm trying to download a file from this URL:
https://renatoleite.blob.core.windows.net/mycontainer/documents/Test Document.pdf
The browser is changing the URL to this:
https://renatoleite.blob.core.windows.net/mycontainer/documents/Test%20Document.pdf
My file in the blob storage has the name: Test Document.pdf
So, when I click to download, Azure says the file does not exist:
The specified resource does not exist.
Probably because the browser is trying to get the file with "%20" in the name.
How can I solve this?
As far as I know, if you upload a file whose name contains a space using the Azure Storage API, the blob's URL is automatically encoded (the space is replaced with %20).
You can see the example below:
I uploaded Test Document.pdf to blob storage.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
CloudConfigurationManager.GetSetting("StorageConnectionString"));
// Create the blob client.
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
// Retrieve a reference to a container.
CloudBlobContainer container = blobClient.GetContainerReference("brando");
// Create the container if it doesn't already exist.
container.CreateIfNotExists();
// Retrieve a reference to a blob named "Test Document.pdf".
CloudBlockBlob blockBlob = container.GetBlockBlobReference("Test Document.pdf");
// Create or overwrite the blob with contents from a local file.
using (var fileStream = System.IO.File.OpenRead(@"D:\Test Document.pdf"))
{
    blockBlob.UploadFromStream(fileStream);
}
Then I suggest you use Storage Explorer (right-click the blob and open Properties to see its URL) or the Azure portal to check the URL in the blob's properties.
You will find that the space has been replaced with %20, which matches the URL the browser requests, so downloading via that URL should work.
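If you prefer to check from code rather than the portal, a small, purely illustrative addition to the upload snippet above prints the encoded URI:

// The blob name keeps the literal space, but the URI is percent-encoded,
// so the %20 form the browser requests is the correct address.
Console.WriteLine(blockBlob.Uri.AbsoluteUri);
// e.g. https://youraccount.blob.core.windows.net/brando/Test%20Document.pdf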
I am uploading zip files to Azure Blob Storage which are relatively huge.
Now I need to go to those containers, get the blob reference, and store that zip file as multiple zips on a cloud file share. I am not sure how to proceed with that.
var storageAccount = AzureUtility.CreateStorageAccountFromConnectionString();
var container = AzureUtility.GetAzureCloudBlobContainerReference("fabcd");
CloudFileClient fileClient = storageAccount.CreateCloudFileClient();
CloudFileShare share = fileClient.GetShareReference("sample-share");
share.CreateIfNotExists();
CloudBlockBlob sourceBlob = container.GetBlockBlobReference("Test.zip");
var file = share.GetRootDirectoryReference().GetFileReference("Test.zip").Exists();
if(file)
{
//split and share
}
Any suggestions?
My understanding is that you're trying to download a blob and then divide it and upload it as multiple files.
There are two options here, both of which are exposed through the .NET API. The blob API exposes both DownloadRangeTo* and DownloadTo* methods, and the file API exposes UploadFrom* methods. If you know in advance the divisions you want to make, you can download each range and then upload it to a file. Otherwise you can download the entire blob, divide it client side, and then upload the pieces.
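A minimal sketch of the range-based option, building on the snippet in the question (the 100 MB chunk size and the part naming are assumptions; note that the parts are raw byte slices of the original archive, not standalone zip files):

// Fetch the blob's total size so we know how many ranges to copy.
sourceBlob.FetchAttributes();
long blobLength = sourceBlob.Properties.Length;
long chunkSize = 100 * 1024 * 1024; // 100 MB per part (assumption)
int partNumber = 0;

CloudFileDirectory rootDir = share.GetRootDirectoryReference();

for (long offset = 0; offset < blobLength; offset += chunkSize)
{
    long length = Math.Min(chunkSize, blobLength - offset);

    using (var ms = new MemoryStream())
    {
        // Download just this byte range of the zip from blob storage...
        sourceBlob.DownloadRangeToStream(ms, offset, length);
        ms.Position = 0;

        // ...and upload it as its own file on the cloud file share.
        CloudFile part = rootDir.GetFileReference("Test.part" + partNumber++ + ".zip");
        part.UploadFromStream(ms);
    }
}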
I have a CentOS vm running on Azure. I have several WordPress sites on the vm and I am using Azure CDN. Is there a way to config the CDN to serve compressed content? Also, I'm using W3 Total Cache.
To my knowledge, there is no way to configure the CDN to compress your files before serving them to clients. However, you can certainly compress your files when you upload them to your blob storage, and the CDN will therefore serve files that have already been compressed. That's what I do, anyway. Here's a snippet of C# code that demonstrates how I compress my files before uploading them to my Azure blob storage:
var blobData = new byte[] { /* your data goes here */ };
var stream = new MemoryStream();
// leaveOpen: true so closing the GZipStream doesn't close the MemoryStream we still need to upload.
using (var zipStream = new GZipStream(stream, CompressionMode.Compress, true))
{
    zipStream.Write(blobData, 0, blobData.Length);
}
stream.Position = 0; // rewind before uploading
var storageCredentials = new StorageCredentials("myaccountname", "mykey");
var storageAccount = new CloudStorageAccount(storageCredentials, true);
var blobClient = storageAccount.CreateCloudBlobClient();
var container = blobClient.GetContainerReference("$root");
var blob = container.GetBlockBlobReference("mycompressedfile.txt");
blob.Properties.ContentEncoding = "gzip";
blob.UploadFromStream(stream);
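Note that because the blob is stored with Content-Encoding: gzip, browsers (and the CDN) will normally decompress it transparently. If you ever download it from code instead, a small sketch of manual decompression, reusing the blob object from above, would be:

// Download the compressed blob and gunzip it back to the original bytes.
var compressed = new MemoryStream();
blob.DownloadToStream(compressed);
compressed.Position = 0;

using (var gzip = new GZipStream(compressed, CompressionMode.Decompress))
using (var original = new MemoryStream())
{
    gzip.CopyTo(original);
    byte[] originalBytes = original.ToArray();
}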
Updated Answer:
All three Azure CDN providers (Verizon, Akamai and Microsoft) support compressing resources at the endpoint before returning them to the client, based on the Accept-Encoding request header.
See the documentation for further details:
https://learn.microsoft.com/en-gb/azure/cdn/cdn-improve-performance#compression-rules
The resource must meet certain criteria, and its MIME type must be expressly configured for compression.
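To check whether the endpoint is actually compressing a given resource, a quick test (run inside an async method; the URL below is a placeholder) is to send an Accept-Encoding header and inspect the response:

using System;
using System.Net.Http;

// Request a CDN resource with Accept-Encoding and see whether the
// response comes back with Content-Encoding: gzip.
var client = new HttpClient();
var request = new HttpRequestMessage(HttpMethod.Get,
    "https://your-endpoint.azureedge.net/styles/site.css"); // placeholder URL
request.Headers.TryAddWithoutValidation("Accept-Encoding", "gzip");

HttpResponseMessage response = await client.SendAsync(request);
Console.WriteLine(string.Join(", ", response.Content.Headers.ContentEncoding));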