How to store PDFs on Azure

I have a WordPress on Linux app running on Azure with a MySQL database.
I need to be able to upload PDF files to Azure and then have a link on the website that users can click to view the PDF.
To be more specific, the document is a monthly invoice that is created on premises and then uploaded to Azure. The user will log in and then see a link that allows them to view the invoice.
What I don't know is how the document should be stored. Should it be stored in the MySQL database? Or in some type of storage that can be linked to? Of course, it needs to be secure.

You can use Azure Blob Storage to upload and store your PDF documents. Each stored document has a URL that can then be shown on your website. You can also protect these resources using shared access signature (SAS) or shared key authentication mechanisms.
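For example, here's a minimal sketch of generating a time-limited SAS link for an invoice PDF with the classic WindowsAzure.Storage SDK (the connection string, container, and blob names are illustrative):
CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
CloudBlobClient client = account.CreateCloudBlobClient();
CloudBlobContainer container = client.GetContainerReference("invoices");
CloudBlockBlob blob = container.GetBlockBlobReference("2019-05-invoice.pdf");
// Generate a read-only link that expires after 15 minutes.
string sas = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy
{
    Permissions = SharedAccessBlobPermissions.Read,
    SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(15)
});
string secureLink = blob.Uri + sas; // show this link to the logged-in user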

Greg, Blob storage would be your best option within Azure. Here's what it can do:
1. Serving images or documents directly to a browser
2. Storing files for distributed access
3. Streaming video and audio
4. Storing data for backup and restore, disaster recovery, and archiving
5. Storing data for analysis by an on-premises or Azure-hosted service
Any file stored within Azure Blob Storage can be accessed through a link, e.g. https://storagesample.blob.core.windows.net/mycontainer/blob1.txt, or via an alias such as http://files.mycompany.com/somecontainer/blobs.txt
Full details can be accessed here: https://learn.microsoft.com/en-us/azure/storage/blobs/storage-dotnet-how-to-use-blobs

You can use Azure Blob Storage to store any file type.
As shown below, get the file name, MIME type, and file data for any file.
var filePath = @"C:\ConsoleApp1\Readme.pdf";
var fileName = Path.GetFileName(filePath);
string mimeType = MimeMapping.MimeUtility.GetMimeMapping(fileName);
byte[] fileData = File.ReadAllBytes(filePath); // read the file contents into memory
string blobUrl = await objBlobService.UploadFileToBlobAsync(fileName, fileData, mimeType);
Here's the main method to upload files to Azure Blob Storage:
private async Task<string> UploadFileToBlobAsync(string strFileName, byte[] fileData, string fileMimeType)
{
    // The connection string is available under "Access keys" in the Azure portal:
    // "DefaultEndpointsProtocol=https;AccountName=XXX;AccountKey=...;EndpointSuffix=core.windows.net"
    CloudStorageAccount csa = CloudStorageAccount.Parse(accessKey);
    CloudBlobClient cloudBlobClient = csa.CreateCloudBlobClient();
    string containerName = "my-blob-container"; // name of your blob container
    CloudBlobContainer cbContainer = cloudBlobClient.GetContainerReference(containerName);
    string fileName = this.GenerateFileName(strFileName);
    if (await cbContainer.CreateIfNotExistsAsync())
    {
        await cbContainer.SetPermissionsAsync(new BlobContainerPermissions { PublicAccess = BlobContainerPublicAccessType.Blob });
    }
    if (fileName != null && fileData != null)
    {
        CloudBlockBlob cbb = cbContainer.GetBlockBlobReference(fileName);
        cbb.Properties.ContentType = fileMimeType;
        await cbb.UploadFromByteArrayAsync(fileData, 0, fileData.Length);
        return cbb.Uri.AbsoluteUri;
    }
    return "";
}
Make sure you install these NuGet packages:
Install-Package WindowsAzure.Storage
Install-Package MimeMapping

Related

Is there any difference between Azure Storage in a local environment and online storage?

We have created local Azure storage using the storage emulator. Refer to the links below.
https://learn.microsoft.com/en-us/azure/storage/common/storage-use-emulator
https://medium.com/oneforall-undergrad-software-engineering/setting-up-the-azure-storage-emulator-environment-on-windows-5f20d07d3a04
However, we are unable to read files from the local Azure storage. Refer to the code below.
const string accountName = "devstoreaccount1"; // provide the account name
const string key = "Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw=="; // provide the account key
var storageCredentials = new StorageCredentials(accountName, key);
var cloudStorageAccount = new CloudStorageAccount(storageCredentials, true);
// Connect to blob storage
CloudBlobClient serviceClient = cloudStorageAccount.CreateCloudBlobClient();
// Connect to the blob container
CloudBlobContainer container = serviceClient.GetContainerReference("container-name");
await container.SetPermissionsAsync(new BlobContainerPermissions
{
    PublicAccess = BlobContainerPublicAccessType.Blob
});
// Connect to the blob file
CloudBlockBlob blob = container.GetBlockBlobReference("sample.txt");
await blob.DownloadToFileAsync("sample.txt", System.IO.FileMode.Create);
// Get the blob file as text
string contents = await blob.DownloadTextAsync();
The above code works correctly for reading files in online Azure Storage. Can anyone suggest how to resolve the issue of reading files in local Azure storage?
The document explains the differences between the Storage Emulator and Azure Storage clearly.
If you would like to access the local storage, you could call this API; note that the emulator uses a different URI format. For more details about the URI, see here.
GET http://<local-machine-address>:<port>/<account-name>/<resource-path>
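As a sketch of the simplest fix, you could let the SDK build the emulator endpoints for you by parsing the development-storage connection string, instead of constructing the credentials by hand (this assumes the emulator is running with its default ports):
// The shortcut connection string makes the SDK target
// http://127.0.0.1:10000/devstoreaccount1/... automatically.
CloudStorageAccount account = CloudStorageAccount.Parse("UseDevelopmentStorage=true");
CloudBlobClient client = account.CreateCloudBlobClient();
CloudBlobContainer container = client.GetContainerReference("container-name");
CloudBlockBlob blob = container.GetBlockBlobReference("sample.txt");
string contents = await blob.DownloadTextAsync();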

Azure Blob Storage Temporary File URL

I have saved PDF files in an Azure Blob Storage container, and I want to show these files on my website. But once a file is rendered in the HTML, its link should be deactivated, meaning no one can use that link to download the file again. Is this possible in Azure Blob Storage?
You can use a shared access signature (SAS) policy to achieve this:
CloudStorageAccount account = CloudStorageAccount.Parse("yourStringConnection");
CloudBlobClient serviceClient = account.CreateCloudBlobClient();
var container = serviceClient.GetContainerReference("yourContainerName");
container.CreateIfNotExistsAsync().Wait();
CloudBlockBlob blob = container.GetBlockBlobReference("test/helloworld.txt");
blob.UploadTextAsync("Hello, World!").Wait();
SharedAccessBlobPolicy policy = new SharedAccessBlobPolicy();
// define the expiration time
policy.SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(1);
// define the permission
policy.Permissions = SharedAccessBlobPermissions.Read;
// create the signature
string signature = blob.GetSharedAccessSignature(policy);
// get the full temporary URI
Console.WriteLine(blob.Uri + signature);
If I understand correctly, you're looking for single-use links to Azure blobs. Natively this feature is not available in Azure Storage. You would need to write code to implement something like this yourself, where you keep track of the number of times a link has been used and stop serving it once the limit is exceeded.
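A rough sketch of that idea, assuming an ASP.NET MVC controller and a hypothetical GetBlobForToken lookup you would implement yourself (a real version would persist used tokens in a database rather than in memory):
private static readonly ConcurrentDictionary<string, bool> usedTokens =
    new ConcurrentDictionary<string, bool>();

public ActionResult Download(string token)
{
    // TryAdd succeeds only the first time a given token is seen.
    if (!usedTokens.TryAdd(token, true))
        return new HttpStatusCodeResult(410); // the link was already used

    CloudBlockBlob blob = GetBlobForToken(token); // hypothetical token-to-blob lookup
    SharedAccessBlobPolicy policy = new SharedAccessBlobPolicy
    {
        Permissions = SharedAccessBlobPermissions.Read,
        SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(1)
    };
    // Redirect to a short-lived SAS URL; the URL itself expires after a minute.
    return Redirect(blob.Uri + blob.GetSharedAccessSignature(policy));
}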

How to set content disposition on individual azure blob requests?

I have an application that hosts videos, and we recently migrated to Azure.
On our old application we gave users the ability to either play or download the video. However, on Azure it seems like I have to pick which functionality I want, as the content disposition has to be set on the file and not on the request.
So far I have come up with two very poor solutions.
The first solution is streaming the download through my MVC server.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(ConfigurationManager.AppSettings["StorageConnectionString"]);
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference("videos");
string userFileName = service.FirstName + service.LastName + "Video.mp4";
Response.AddHeader("Content-Disposition", "attachment; filename=" + userFileName); // force download
container.GetBlobReference(service.Video.ConvertedFilePath).DownloadToStream(Response.OutputStream);
return new EmptyResult();
This option works okay for smaller videos, but it is very taxing on my server. For larger videos the operation times out.
The second option is hosting every video twice.
This option is obviously bad, as I will have to pay double the storage cost.
However, on Azure it seems like I have to pick which functionality I want, as the content disposition has to be set on the file and not on the request.
There's a workaround for that. As you may know, there's a Content-Disposition property you can define on a blob; however, once you set a value for this property, it is always applied to that blob. When you want to apply this header selectively (say, on a per-request basis), you create a Shared Access Signature (SAS) on that blob and override this response header there. Then you serve the blob via the SAS URL.
Here's the sample code for this:
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(ConfigurationManager.AppSettings["StorageConnectionString"]);
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference("videos");
string userFileName = service.FirstName + service.LastName + "Video.mp4";
CloudBlockBlob blob = container.GetBlockBlobReference(userFileName);
SharedAccessBlobPolicy policy = new SharedAccessBlobPolicy()
{
    Permissions = SharedAccessBlobPermissions.Read,
    SharedAccessExpiryTime = DateTime.UtcNow.AddHours(1)
};
SharedAccessBlobHeaders blobHeaders = new SharedAccessBlobHeaders()
{
    ContentDisposition = "attachment; filename=" + userFileName
};
string sasToken = blob.GetSharedAccessSignature(policy, blobHeaders);
var sasUrl = blob.Uri.AbsoluteUri + sasToken; // this is the URL you will use; it will force the user to download the video
I wrote a blog post about this a long time ago that you may find useful: http://gauravmantri.com/2013/11/28/new-changes-to-windows-azure-storage-a-perfect-thanksgiving-gift/.
As far as I know, Azure Blob Storage doesn't support adding custom headers for a specific container.
I suggest you follow and vote for this feedback to push the Azure development team to support this feature.
As a workaround, you could compress the video file first and then upload the archive to Azure Blob Storage.
The archive will not be opened by the browser, so it is always downloaded.
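A minimal sketch of that workaround using System.IO.Compression (the file path is illustrative, and container is assumed to be set up as in the earlier snippets):
// Zip the video in memory, then upload the archive instead of the raw file.
using (var zipStream = new MemoryStream())
{
    using (var archive = new ZipArchive(zipStream, ZipArchiveMode.Create, leaveOpen: true))
    {
        archive.CreateEntryFromFile(@"D:\video.mp4", "video.mp4");
    }
    zipStream.Position = 0;
    CloudBlockBlob zipBlob = container.GetBlockBlobReference("video.zip");
    zipBlob.UploadFromStream(zipStream); // browsers download a .zip rather than play it
}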

How to download files with whitespace in the name from Azure Blob Storage?

I'm trying to download a file from this URL:
https://renatoleite.blob.core.windows.net/mycontainer/documents/Test Document.pdf
The browser is changing the URL to this:
https://renatoleite.blob.core.windows.net/mycontainer/documents/Test%20Document.pdf
My file in the blob storage is named: Test Document.pdf
So when I click to download, Azure says the file does not exist:
The specified resource does not exist.
This is probably because the browser is requesting the file with "%20" in the name.
How can I solve this?
As far as I know, when you upload a file whose name contains a space using the Azure Storage API, the blob keeps the space in its name, but the name is URL-encoded (the space is replaced with %20) in the blob's URL.
You can see this in the example below.
I uploaded Test Document.pdf to blob storage.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
    CloudConfigurationManager.GetSetting("StorageConnectionString"));
// Create the blob client.
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
// Retrieve a reference to a container.
CloudBlobContainer container = blobClient.GetContainerReference("brando");
// Create the container if it doesn't already exist.
container.CreateIfNotExists();
// Retrieve a reference to a blob named "Test Document.pdf".
CloudBlockBlob blockBlob = container.GetBlockBlobReference("Test Document.pdf");
// Create or overwrite the blob with contents from a local file.
using (var fileStream = System.IO.File.OpenRead(@"D:\Test Document.pdf"))
{
    blockBlob.UploadFromStream(fileStream);
}
Then I suggest you use Storage Explorer (right-click the blob and open Properties to see its URL) or the Azure portal to check the URL in the blob's properties.
You will find the space replaced with %20 in the URL.
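So when resolving a blob from an incoming link, a minimal sketch is to decode the percent-encoded segment back to the stored name first (the variable names here are illustrative):
// Decode the last URL segment back to the blob name before fetching it.
string encodedName = "Test%20Document.pdf"; // e.g. taken from the request URL
string blobName = Uri.UnescapeDataString(encodedName); // "Test Document.pdf"
CloudBlockBlob pdfBlob = container.GetBlockBlobReference(blobName);
pdfBlob.DownloadToFile(@"D:\Test Document.pdf", System.IO.FileMode.Create);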

Saving an X509 certificate from an Azure Blob and using it in an Azure website

I have an Azure Website and an Azure Blob that I'm using to store a .cer X509 certificate file.
The goal is to get the .cer file from the blob and use it to perform an operation (the code for that is in the Controller for my Azure website and it works).
When I run the code locally (without publishing my site) it works, because I save the file to D:\
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(ConfigurationManager.AppSettings["StorageConnectionString"]);
// Create the blob client.
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
// Retrieve a reference to a previously created container.
CloudBlobContainer container = blobClient.GetContainerReference("myContainer");
// Retrieve a reference to a blob named "testcert.cer".
CloudBlockBlob blockBlob = container.GetBlockBlobReference("testcert.cer");
// Save the blob contents to a file.
using (var fileStream = System.IO.File.OpenWrite("D:/testcert.cer"))
{
    blockBlob.DownloadToStream(fileStream);
}
string certLocation = "D:/testcert.cer";
X509Certificate2 myCert = new X509Certificate2();
myCert.Import(certLocation);
I am unable to figure out how or where I can save the file. If I try to use the Import method with a URL (that of the Azure blob where the certificate is stored), I get an error because Import can't handle URLs.
Any idea what I can use as temporary storage on the Azure website, or in the blob, to create an X509Certificate from it?
Edit: I'm adding more detail about the problem I'm trying to solve.
1. Get a cert file from an Azure blob and write it to an Azure website.
2. Use .Import(string pathToCert) on an X509Certificate object to create the cert, which will be used to make a call in a method I've written in my controller.
I've been able to work around step 1 by manually adding the .cer file to the wwwroot folder of my site via FTP. But now when I use Server.MapPath("~/testcert.cer"); to get the path for my certificate, I get this: D:\home\site\wwwroot\testcert.cer
Obviously, when the Import method uses the string above as a path once the site is deployed to my Azure website, it's not a valid path, and so my cert creation fails.
Any ideas? Thanks!
Saving the certificate locally is generally a no-no for Azure; you've got Blob Storage for that.
Use the Import(byte[]) overload to keep and load the certificate in memory. Here's a quick hand-coded attempt...
// Used to store the certificate data
byte[] certData;
// Save the blob contents to a MemoryStream.
using (var stream = new MemoryStream())
{
    blockBlob.DownloadToStream(stream);
    certData = stream.ToArray();
}
X509Certificate2 myCert = new X509Certificate2();
// Import from the byte array
myCert.Import(certData);
Very simple. And the answer covers any web hoster, not just Azure.
First of all, I would highly recommend that you never hard-code a path to a folder in your web projects! Then what you can do is:
1. Use the Server.MapPath("~/certs") method to obtain the physical path of a certs folder within your website's root folder.
2. Make sure that no one can access this folder from the outside world.
By adding an additional location section to your web.config (sketched below), you block any external access to this folder. Please note that the location element has to be a direct descendant of the root configuration element in your web.config file.
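A sketch of what that location section would typically look like (assuming the certs folder from step 1):
<location path="certs">
  <system.web>
    <authorization>
      <deny users="*" />
    </authorization>
  </system.web>
</location>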
UPDATE with the non-Azure-specific part on how to write a file to the local file system of an ASP.NET web project:
var path = Server.MapPath("~/certs");
using (var fileStream = System.IO.File.OpenWrite(Path.Combine(path, "testcert.cer")))
{
    // here use the fileStream to write
}
And a complete sample of how to use the Blob Storage client to write the content of a blob to a local file:
var path = Server.MapPath("~/certs");
using (var fileStream = System.IO.File.OpenWrite(Path.Combine(path, "testcert.cer")))
{
    blockBlob.DownloadToStream(fileStream);
}
But @SeanCocteau has a good point and a much simpler approach - just use a MemoryStream instead!
You can now upload your certificates via the Portal, add an app setting to your site, and have the certificate show up in your site's Certificate Store.
See this blog post for more details:
http://azure.microsoft.com/blog/2014/10/27/using-certificates-in-azure-websites-applications/
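As a sketch of the pattern that post describes (it assumes you upload the certificate in the portal and set the WEBSITE_LOAD_CERTIFICATES app setting to its thumbprint):
// The uploaded certificate is loaded into the site's CurrentUser\My store.
var store = new X509Store(StoreName.My, StoreLocation.CurrentUser);
store.Open(OpenFlags.ReadOnly);
var matches = store.Certificates.Find(
    X509FindType.FindByThumbprint,
    "YOUR-CERT-THUMBPRINT", // placeholder thumbprint
    validOnly: false);
if (matches.Count > 0)
{
    X509Certificate2 cert = matches[0];
    // use cert here...
}
store.Close();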
