Content Type not being set on Azure File Store in Node JS - node.js

I'm testing the functionality of uploading a file to Azure File Storage with this github sample: Node Getting Started
I modified line 111 to include an option for the contentSettings:
var options = { contentSettings: { contentType: 'application/pdf' } };
fileService.createFileFromLocalFile(shareName, directoryName, fileName, imageToUpload, options, function (error) {
  if (error) {
    callback(error);
  } else {
... and whether I upload a PDF with contentType of 'application/pdf' or an image with 'image/png', the file content type is not set once it's posted to Azure storage.
When I copy the URL to the file in my website, the error comes back saying the content type is incorrect.
What am I doing wrong? How do I set the content types of the uploaded files to make them work in my website?

What version of the azure-storage package are you using? I tried the code you pasted and the content type is set successfully on Azure Storage (latest version).
After uploading successfully, try calling getFileProperties and you can see the properties stored on the Azure Storage server side.
Also, I'm not very clear on the scenario of "copying the URL to the file in my website" and the error you get; could you elaborate?

Related

How to store files in firebase using node.js

I have a small assignment where I will have a URL to a document or a file, such as a Google Drive or Dropbox link.
I have to use this link to store that file or document in Firebase using Node.js. How should I start?
A little heads-up might help. What should I use? Please help, I'm stuck here.
The documentation for using the admin SDK is mostly covered in GCP documentation.
Here's a snippet of code that shows how you could upload an image directly to Cloud Storage if you have a URL for it. Any public link works, whether it's shared from Dropbox or somewhere else on the internet.
Edit 2020-06-01: The option to upload directly from a URL was dropped in v2.0 of the SDK (4 September 2018): https://github.com/googleapis/nodejs-storage/releases/tag/v2.0.0
const fileUrl = 'https://www.dropbox.com/some/file/download/link.jpg';
const opts = {
  destination: 'path/to/file.jpg',
  metadata: {
    contentType: 'image/jpeg'
  }
};
firebase.storage().bucket().upload(fileUrl, opts);
This example is using the default bucket in your application and the opts object provides file upload options for the API call.
destination is the path that your file will be uploaded to in Google Cloud Storage
metadata should describe the file that you're uploading (see more examples here)
contentType is the MIME type of the file you are uploading

Copy css file from one subdirectory to another one in CONTAINER BLOB

Scenario:
I copy a .css file from one subdirectory to another in an Azure Storage container. It is done at the C# code level in my application. This is the CSS style file for my website. Unfortunately, I receive an error in my browser console when the page loads:
Error
Resource interpreted as Stylesheet but transferred with MIME type application/octet-stream:
"SOME_PATH/template/css/styles.css?d=00000000-0000-0000-0000-000000000000".
Knowledge:
I know this happens because my file is sent as application/octet-stream instead of text/css. What can I do to tell Azure to treat this file as text/css?
Edit: My code
string newFileName = fileToCopy.Name;
StorageFile newFile = cmsDirectory.GetStorageFileReference(newFileName);

using (var stream = new MemoryStream())
{
    fileToCopy.DownloadToStream(stream);
    stream.Seek(0, SeekOrigin.Begin);
    newFile.UploadFromStream(stream);
}
where DownloadToStream and UploadFromStream are methods in my class:
CloudBlob.DownloadToStream(target);
and
CloudBlob.UploadFromStream(target);
CloudBlob is CloudBlockBlob type
You can set the content type of a blob via the ContentType property; see:
https://learn.microsoft.com/en-us/dotnet/api/microsoft.windowsazure.storage.blob.blobproperties.contenttype
Download AzCopy - http://aka.ms/azcopy
If you specify /SetContentType without a value, AzCopy sets each blob or file's content type according to its file extension.
Run this command on Windows
AzCopy /Source:C:\myfolder\ /Dest:https://myaccount.blob.core.windows.net/myContainer/ /DestKey:key /Pattern:ab /SetContentType
More details: https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azcopy?toc=%2fazure%2fstorage%2fblobs%2ftoc.json
Use the Microsoft Azure Storage Explorer to modify the content-type string by hand for an already existing file. Right-click the blob file in Explorer, then left-click Properties and scroll down to change the content type.

Azure File storage content-type is always application/octet-stream

I'm currently having an issue with Azure File storage when I build a URL with a shared access signature (SAS) token. The file will download in the browser, but the content-type is always application/octet-stream rather than matching the MIME type of the file. If I put the file in Azure blob storage and build a URL with a SAS token, it sends the correct content-type for my file (image/jpeg).
I've upgraded my storage account from V1 to V2 thinking that was the problem, but it didn't fix it.
Does anyone have a clue what I could try that might get Azure File storage to return the correct content-type using a URL with SAS Token to download the file?
So far these are the only fixes for the content-type that I've found:
Use the Microsoft Azure Storage Explorer to modify the content-type string by hand. You have to right-click the file and then left-click Properties to get the dialog to appear.
Programmatically modify the file using Microsoft's WindowsAzure.Storage Nuget package.
Surface file download via my own web site and not allow direct access.
For me, none of these are acceptable choices. The first two can lead to mistakes down the road if a user uploads a file via the portal or Microsoft Azure Storage Explorer and forgets to change the content type. I also don't want to write Azure Functions or web jobs to monitor and fix this problem.
Since blob storage does NOT have the same problems when uploading via Microsoft Azure Storage Explorer or via the portal, the cost is much lower, AND both work with SAS tokens, we are moving towards blob storage instead. We do lose the ability to mount the drive to our local computers and use something like Beyond Compare to do file comparisons, but that is a disadvantage that we can live with.
If anyone has a better solution than the ones mentioned above that fixes this problem, I will gladly up-vote it. However, I think that Microsoft will have to make changes for this problem to be fixed.
When I upload a jpeg file to a file share through the portal, the content-type is indeed changed to application/octet-stream. But I can't reproduce your download problem.
I didn't specify a content-type in my SAS request URI, yet the file downloads as a jpeg file anyway. I have tested with the SDK (account SAS, stored access policy, and SAS on the file itself) and the REST API; both work even without a content-type.
You can try to specify the content-type using the code below.
SharedAccessFileHeaders header = new SharedAccessFileHeaders()
{
    ContentDisposition = "attachment",
    ContentType = "image/jpeg"
};
string sasToken = file.GetSharedAccessSignature(sharedPolicy, header);
Azure blob storage falls back to the default value of 'application/octet-stream' if nothing is provided. To get the correct MIME types, this is what I did with my Flask app:
@app.route('/', methods=['GET', 'POST'])
def upload_file():
    if request.method == 'POST':
        f = request.files['file']
        mime_type = f.content_type
        print(mime_type)
        print(type(f))
        try:
            blob_service.create_blob_from_stream(container, f.filename, f,
                content_settings=ContentSettings(content_type=mime_type))
        except Exception as e:
            print(str(e))
            pass
mime_type is passed to ContentSettings so that files uploaded to Azure blob storage keep their correct MIME types.
In nodeJS:
blobService.createBlockBlobFromStream(container, blob, stream, streamLength, { contentSettings: { contentType: fileMimeType } }, callback)
where:
fileMimeType is the MIME type of the file being uploaded
callback is your callback implementation
Reference to method used:
https://learn.microsoft.com/en-us/javascript/api/azure-storage/azurestorage.services.blob.blobservice.blobservice?view=azure-node-latest#createblockblobfromstream-string--string--stream-readable--number--createblockblobrequestoptions--errororresult-blobresult--
Check this out - Microsoft SAS Examples
If you don't want to update the content-type of your file in Azure or it's too much of a pain to update the content-type of all your existing files, you can pass the desired content-type w/ the SAS token as well. The rsct param is where you would specify the desired content-type.
e.g. - https://myaccount.file.core.windows.net/pictures/somefile.pdf?sv=2015-02-21&st=2015-07-01T08:49Z&se=2015-07-02T08:49Z&sr=c&sp=r&rscd=file;%20attachment&rsct=application%2Fpdf&sig=YWJjZGVmZw%3d%3d&sig=a39%2BYozJhGp6miujGymjRpN8tsrQfLo9Z3i8IRyIpnQ%3d
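The rsct/rscd values are just percent-encoded query parameters, so the response-header overrides can be appended to a URL programmatically. A sketch of the encoding step only, in Node (the helper name is my own; the signature itself must still come from the service or an SDK, since these parameters are part of the signed string-to-sign):

```javascript
// Hypothetical helper: append response-header override parameters to a URL.
// rsct = response content type, rscd = response content disposition.
// Note: for a real SAS these values must be included when the signature
// is computed; appending them to an already-signed URL will not validate.
function withResponseHeaders(urlString, contentType, disposition) {
  const url = new URL(urlString);
  url.searchParams.set('rsct', contentType);
  if (disposition) {
    url.searchParams.set('rscd', disposition);
  }
  return url.toString();
}

// e.g. withResponseHeaders(sasUrl, 'application/pdf', 'inline')
```

This is only meant to show how the example URL above is assembled; in practice you would pass the content type and disposition to the SAS-generating call, as the C# and Java answers below do.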
This works in Java using the com.microsoft.azure azure-storage library, uploading to a Shared Access Signature resource.
InputStream is = new FileInputStream(file);
CloudBlockBlob cloudBlockBlob = new CloudBlockBlob(new URI(sasUri));
cloudBlockBlob.getProperties().setContentType("application/pdf");
cloudBlockBlob.upload(is, file.length());
is.close();
For anyone looking to upload files with a declared content type, the v12 client has changed how the content type is set. You can use the ShareFileHttpHeaders parameter of file.Create:
ShareFileClient file = directory.GetFileClient(fileName);
using FileStream stream = File.OpenRead(@"C:\Temp\Amanita_muscaria.jpg");
file.Create(stream.Length, new ShareFileHttpHeaders { ContentType = ContentType(fileName) });
file.UploadRange(new HttpRange(0, stream.Length), stream);
where ContentType(fileName) is an evaluation of the file name, e.g.:
if (fileName.EndsWith(".txt")) return "text/plain";
// etc
// here you define your file content type
CloudBlockBlob cloudBlockBlob = container.GetBlockBlobReference(file.FileName);
cloudBlockBlob.Properties.ContentType = file.ContentType; //content type
I know that I'm not answering the question directly, but I do believe the answer is applicable. I had the same problem with a storage account that I needed to serve as a static website. Whenever I uploaded a blob to a container, the default type was "application/octet-stream", and because of this index.html got downloaded instead of being displayed.
To change the file type do the following:
# Get the storage account for its context
$storageAccount = Get-AzStorageAccount -ResourceGroupName <Resource Group Name> -Name <Storage Account Name>

# Get the blobs inside the container of the storage account
$blobs = Get-AzStorageBlob -Context $storageAccount.Context -Container <Container Name>

foreach ($blob in $blobs) {
    $CloudBlockBlob = [Microsoft.Azure.Storage.Blob.CloudBlockBlob] $blob.ICloudBlob
    $CloudBlockBlob.Properties.ContentType = <Desired type as string>
    $CloudBlockBlob.SetProperties()
}
Note: for Azure File storage you might want to change the type to [Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob]
I have not tried this, but ideally you could use ClientOptions to specify a different header. It would look something like this:
ClientOptions options = new ClientOptions();
HttpHeader httpHeaders = new HttpHeader("Content-Type", "application/pdf");
options.setHeaders(Collections.singleton(httpHeaders));

blobClient = new BlobClientBuilder()
    .endpoint(<SAS-URL>)
    .blobName("hello")
    .clientOptions(options)
    .buildClient();
This way we can provide our own mime_type as the content-type:
with open(file.path, "rb") as data:
    #blob_client.upload_blob(data)
    mime_type = mimetypes.MimeTypes().guess_type(file.name)[0]
    blob_client.upload_blob(data, content_type=mime_type)
    print(f'{file.name} uploaded to blob storage')
Based on this answer: Twong's answer
For example, if you are using the .NET (C#) API to proxy/generate a SAS URL from ShareFileClient (ShareFileClient class description):
if (downloadClient.CanGenerateSasUri)
{
    var sasBuilder = new ShareSasBuilder(ShareFileSasPermissions.Read, DateTimeOffset.Now.AddDays(10))
    {
        ContentType = "application/pdf",
        ContentDisposition = "inline"
    };

    return downloadClient.GenerateSasUri(sasBuilder);
}
The example above sets up a token valid for 10 days for a PDF file, which will then open in a new browser tab (notably on Apple iOS) instead of downloading.
The solution in Java is to specify the content type when generating the signed image URL:
blobServiceSasSignatureValues.setContentType("image/jpeg");

Google Cloud Storage - how to receive specific generation of file in Node.JS

I am using #google-cloud/storage for accessing the GCS API. After enabling versioning, I am trying to fetch a specific generation of a file using the following code:
bucket.file(`${name}#${generationId}`).download({
  destination
}).then(() => {
  res.set({
    'Content-Type': 'image/jpeg'
  });
  return res.status(200).sendFile(destination);
})
Where name is the complete object name and generationId contains the generation number of the file.
However, when I execute the above code within Google Cloud Functions, I receive the following error message:
ApiError: No such object: my-bucket/image-mxmg569kdvnby6z85mi#1522107649220516
I am sure that the file and generation exists, as I have checked it with the JSON API explorer. My guess is that #google-cloud/storage does not support file versioning (yet), the documentation did not contain any information. Does anyone have experience with that?
If you set options.generation when instantiating the File object, it should work! https://github.com/googleapis/nodejs-storage/blob/4f4f1b4043c4f70ee99f051499ac62e893abdde0/src/file.js#L82
bucket.file(name, { generation: generationId })
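In other words, the generation is passed as an option rather than embedded in the object name. If the rest of the code still receives names in the `name#generation` form, a small parser can split them first; a sketch (the helper is hypothetical, not part of the client library):

```javascript
// Hypothetical helper: split 'name#generation' into the arguments
// expected by bucket.file(name, { generation }).
function parseVersionedName(versionedName) {
  const idx = versionedName.lastIndexOf('#');
  if (idx === -1) {
    // No generation suffix: fetch the live version.
    return { name: versionedName, options: {} };
  }
  return {
    name: versionedName.slice(0, idx),
    options: { generation: Number(versionedName.slice(idx + 1)) }
  };
}

// Usage (sketch):
// const parsed = parseVersionedName(`${name}#${generationId}`);
// bucket.file(parsed.name, parsed.options).download({ destination });
```

Using lastIndexOf also keeps this safe if the object name itself happens to contain a '#'.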

fineuploader server side renaming the file before the put method

Just starting to test FineUploader, and I wonder:
When FineUploader uploads files directly to a blob container on Azure,
I see the files with a GUID name instead of the original.
Is there any option to set, on the server side, the file name and the full path where the file is saved?
Yes, you can retrieve the name for any file from your server via an ajax call before it is uploaded, and supply it to Fine Uploader Azure by making use of the fact that the blobProperties.name option allows for a promissory return value. For example:
new qq.azure.FineUploader({
  blobProperties: {
    name: function(fileId) {
      return new Promise(function(resolve) {
        // retrieve the file name for this file from your server...
        resolve(filenameFromServer)
      })
    }
  },
  // all other options here...
})
The above option will be called by Fine Uploader Azure once per file, just before the first request is sent. This is true of chunked and non-chunked uploads. The value passed into resolve will be used as the new file name for the associated file.
