How to upload image to Azure Blob Storage with proper content type - node.js

I've read most of the documentation provided by Microsoft on uploading files/images to Blob Storage. It's been two days now and I'm stuck: I can't find an appropriate way to upload an image with the proper content type. The file/image is uploaded, but its content type in Blob Storage ends up as 'application/octet-stream'. I want it to be 'image/png', 'image/jpg', etc. for an image.
There are C# code samples, but they are not useful to me.
I'm trying with Node.js.
SDK library used: @azure/storage-blob
References :
https://azuresdkdocs.blob.core.windows.net/$web/javascript/azure-storage-blob/12.0.1/classes/blockblobclient.html#uploadfile
https://learn.microsoft.com/en-us/javascript/api/@azure/storage-blob/blockblobclient?view=azure-node-latest
Sample Code :
const bc = new BlockBlobClient(
  rhcConfig.STORAGE_CONNECTION_STRING,
  rhcConfig.CONTAINER_NAME,
  `IMAGES/${fileName}`
);
// let result = await bc.uploadFile(_file);
// console.log(result);
const buff = Buffer.from(file, "base64");
const stream = getStream(buff);
const streamLength = buff.length;
await bc.uploadStream(stream, streamLength, 1, { httpHeaderOptions });
httpHeaderOptions :
const httpHeaders = {
  "x-ms-blob-cache-control": "1000",
  "x-ms-blob-content-type": "image/png",
  "x-ms-blob-content-md5": `${md5Hash}`,
  "x-ms-blob-content-encoding": "compress",
  "x-ms-blob-content-language": "en",
  "x-ms-blob-content-disposition": "multipart/form-data",
};
const httpHeaderOptions = { blobHTTPHeaders: httpHeaders };
Thanks to the community!

I suppose your httpHeaderOptions format is not correct; you could refer to this interface description: BlobHTTPHeaders. Below is my test code.
const { BlobServiceClient } = require("@azure/storage-blob");
const fs = require("fs");

const blobServiceClient = BlobServiceClient.fromConnectionString(connectionstr);
const containerClient = blobServiceClient.getContainerClient('test');
const blobclient = containerClient.getBlockBlobClient('test.jpg');
const fileStream = fs.createReadStream('E:\\dog.jpg');
// Note the camelCase blobContentType key, not the x-ms-* wire header names.
const blobOptions = { blobHTTPHeaders: { blobContentType: 'image/jpeg' } };
await blobclient.uploadStream(fileStream, undefined, undefined, blobOptions);
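To confirm the header stuck, you can read the blob's properties back after the upload (a small sketch reusing the blobclient from above; getProperties is part of the same BlobClient API):

// Read the blob's properties; contentType should reflect the uploaded header.
const props = await blobclient.getProperties();
console.log(props.contentType); // should now print 'image/jpeg'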

I had a similar issue and, like the original poster, found the Microsoft documentation and examples rather shockingly bad. I would have thought uploading images is a rather common thing to do, but none of their examples show how to change the content type using the newer storage API. Anyhow, my implementation was similar to George Chen's and is as follows:
BlobClient blobClient = photoContainer.GetBlobClient(fileName);
await blobClient.UploadAsync(f.InputStream, new BlobHttpHeaders { ContentType = "image/jpeg" });
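If a blob was already uploaded with the wrong type, the Node.js client can also patch it in place with setHTTPHeaders instead of re-uploading. A minimal sketch, assuming an authenticated BlobServiceClient and placeholder container/blob names; note that headers you omit from the call are cleared:

// Assumption: blobServiceClient, containerName, and blobName already exist.
const blobClient = blobServiceClient
  .getContainerClient(containerName)
  .getBlockBlobClient(blobName);
// setHTTPHeaders replaces all HTTP headers on the blob, so set every header you need.
await blobClient.setHTTPHeaders({ blobContentType: "image/png" });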

Related

How to convert image.png to binary in NodeJS?

I am trying to consume the Azure Forms Recognizer API, where I have to provide the body in the form of "[Binary PNG data]" as stated here.
The connection seems to be working fine, however I am getting this response:
{"error":{"code":"InvalidImage","innerError":{"requestId":"73c86dc3-51a3-48d8-853b-b6411f54c51e"},"message":"The input data is not a valid image or password protected."}}
I am using a PNG that is in my local directory and I've tried converting it in many different ways, including:
fs.readFile('test.png', function(err, data){
  if (err) throw err;
  // Encode to base64
  let encodedImage = new Buffer(data, 'binary').toString('base64');
  // Decode from base64
  var decodedImage = new Buffer(encodedImage, 'base64').toString('binary');
});
or
let data_string = fs.createReadStream('test.png');
and many others. None of them seem to work and I always get the same response from my post request.
I would appreciate it if anyone could share how to convert this PNG into the correct format. Thank you in advance.
To base 64:
const file = fs.readFileSync('/some/place/image.png')
const base64String = Buffer.from(file).toString('base64')
Then pass the base64String to Azure
If you want just a blob, i.e. a binary file, you can do this:
const file = fs.readFileSync('/some/place/image.png')
const blob = Buffer.from(file)
const processFile = (file: any) => {
  const reader = new FileReader();
  reader.readAsArrayBuffer(file);
  reader.onload = function () {
    const binaryData = Buffer.from(reader.result as string, 'binary');
    console.log(binaryData);
  };
};
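In my experience, the usual cause of InvalidImage is sending a base64 string where the service expects raw bytes. Below is a minimal sketch of the request itself, assuming Node 18+ (built-in fetch) and placeholder endpoint/key values; the exact analyze route depends on your Form Recognizer API version:

const fs = require('fs');

async function analyze() {
  const body = fs.readFileSync('test.png'); // raw bytes, not base64
  const res = await fetch('https://<your-endpoint>/formrecognizer/<analyze-route>', {
    method: 'POST',
    headers: {
      'Content-Type': 'image/png',
      'Ocp-Apim-Subscription-Key': '<your-key>',
    },
    body,
  });
  console.log(res.status, await res.text());
}

analyze().catch(console.error);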

Azure Blob Storage - upload file with progress

I have the following code - quite normal for uploading files into Azure Blob Storage - but when I upload files, instead of getting onProgress executed many times, I only ever have it executed once, with the file.size value. So the file is sent (slowly) to Azure, but progress fires only once, when finished.
const requestOptions = this.mergeWithDefaultOptions(perRequestOptions);
const client = this.getRequestClient(requestOptions);
const containerClient = await client.getContainerClient(this.options.containerName);
const blobClient = await containerClient.getBlockBlobClient(file.name);
const uploadStatus = await blobClient.upload(file.buffer, file.size, {onProgress: progressCallBack});
What I would love to know is whether that outcome is normal for this library (for downloading files from Azure, the same approach works correctly).
According to my test, the upload method is a non-parallel uploading method: it just sends a single Put Blob request to the Azure Storage server. For more details, please refer to here.
So if you want to get onProgress executed many times, I suggest you use the method uploadStream. It uses the Put Block and Put Block List operations to upload. For more details, please refer to here.
For example
try {
  var creds = new StorageSharedKeyCredential(accountName, accountKey);
  var blobServiceClient = new BlobServiceClient(
    `https://${accountName}.blob.core.windows.net`,
    creds
  );
  var containerClient = blobServiceClient.getContainerClient("upload");
  var blob = containerClient.getBlockBlobClient(
    "spark-3.0.1-bin-hadoop3.2.tgz"
  );
  var maxConcurrency = 20; // max uploading concurrency
  var blockSize = 4 * 1024 * 1024; // the block size in the uploaded block blob
  var res = await blob.uploadStream(
    fs.createReadStream("d:/spark-3.0.1-bin-hadoop3.2.tgz", {
      highWaterMark: blockSize,
    }),
    blockSize,
    maxConcurrency,
    { onProgress: (ev) => console.log(ev) }
  );
  console.log(res._response.status);
} catch (error) {
  console.log(error);
}

Azure Blob Storage Compressing files by default?

I am uploading JSONs to Azure Blob storage using the Azure Blob storage API's function:
const response = await blobClient.upload(content, content.length);
There is absolutely no compression logic in the code, nor any encoding headers being added, but the files seem to be around 60% of their original size when they reach the storage. Also, monitoring the PUT requests using Fiddler, it seems that the file is compressed and then uploaded by the API.
My question is, does Azure do compression by default?
EDIT:
I was stringifying and then uploading the JSON objects. They get all the whitespace removed, hence the reduced size.
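A quick sketch showing the effect (the file name is a placeholder), comparing the pretty-printed size on disk with the re-stringified size:

const fs = require('fs');

const raw = fs.readFileSync('data.json', 'utf8'); // pretty-printed JSON on disk
const compact = JSON.stringify(JSON.parse(raw)); // re-serialized without whitespace
console.log(Buffer.byteLength(raw), Buffer.byteLength(compact));
// The compact form is what gets uploaded, hence the smaller blob.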
Based on my test, there is no compression problem. Here is my sample:
const { BlobServiceClient } = require("@azure/storage-blob");
var fs = require('fs');

async function main() {
  const AZURE_STORAGE_CONNECTION_STRING = "Your_Storage_Account_Connection_String";
  const blobServiceClient = BlobServiceClient.fromConnectionString(AZURE_STORAGE_CONNECTION_STRING);
  const containerName = 'demo';
  const blobName = 'test.txt';
  const containerClient = blobServiceClient.getContainerClient(containerName);
  if (!await containerClient.exists()) {
    await containerClient.create();
  }
  const contents = fs.readFileSync('test.txt');
  const blockBlobClient = containerClient.getBlockBlobClient(blobName);
  await blockBlobClient.upload(contents, contents.length);
}

main().then(() => console.log('Done')).catch((ex) => console.log(ex.message));
The test.txt file's size is about 99.9 KB.
And, from the portal, the uploaded file's size is 99.96 KB, which is in line with our expectations.
You should also use the byte length when uploading, as the Storage Blob API expects a number of bytes; the string length can be different:
const content = "Hello 世界!";
console.log(`length: ${content.length}`);
console.log(`byteLength: ${Buffer.byteLength(content)}`);
the output:
length: 9
byteLength: 15
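So when uploading a string, a safer call passes the encoded byte count (a small sketch; blockBlobClient is assumed to be an existing BlockBlobClient):

// Pass the byte count, not the character count.
await blockBlobClient.upload(content, Buffer.byteLength(content));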

uploaded files to Azure are corrupted when using dio

I'm trying to upload a file from my phone to Azure Blob Storage as a BlockBlob with a SAS. I can get the file to upload, but it can't be opened once downloaded: the file gets corrupted somehow. I thought this was a content-type problem, but I have tried several different approaches to changing the content type. Nothing has worked so far.
My code:
FileInfo _fileInfo = await filePicker(); // get the file path and file name
// my getUploadInfo fires a call to my backend to get a SAS.
// I know for a fact that this works because my website uses this SAS to upload files perfectly fine
UploadInfo uploadInfo = await getUploadInfo(_fileInfo.fileName, _fileInfo.filePath);
final bytes = File(_fileInfo.filePath).readAsBytesSync();
try {
  final response = await myDio.put(
    uploadInfo.url,
    data: bytes,
    onSendProgress: (int sent, int total) {
      if (total != -1) {
        print((sent / total * 100).toStringAsFixed(0) + "%");
      }
    },
    options: dioPrefix.Options(headers: {
      'x-ms-blob-type': 'BlockBlob',
      'Content-Type': mime(_fileInfo.filePath),
    })
  );
} catch (e) {
  print(e);
}
This code uploads a file just fine, but I can't open the file since it becomes corrupted. At first, I thought this was a Content-Type problem, so I've tried changing the content type header to application/octet-stream and multipart/form-data as well. That doesn't work.
I've also tried to do
dioPrefix.FormData formData = new dioPrefix.FormData.fromMap({
  'file': await MultipartFile.fromFile(
    _fileInfo.filePath,
    filename: _fileInfo.fileName,
  )
});
...
final response = await myDio.put(
  uploadInfo.url,
  data: formData, // This approach is recommended in the dio documentation
  onSendProgress:
...
but this also corrupts the file. It gets uploaded, but I can't open it.
I have been able to successfully upload a file with the code below, but with this approach I cannot get any kind of response, so I have no idea whether it uploaded successfully or not (also, I can't get the progress of the upload):
try {
  final data = imageFile.readAsBytesSync();
  final response = await http.put( // here, response is empty no matter what I try to print
    url,
    body: data,
    headers: {
      'x-ms-blob-type': 'BlockBlob',
      'Content-Type': mime(filePath),
    });
  ...
Any help would be greatly appreciated. Thanks
I tried to upload a file using dio in Dart to Azure Blob Storage, and then download and print the content of the file, as in the code below.
import 'package:dio/dio.dart';
import 'dart:io';

main() async {
  var accountName = '<account name>';
  var containerName = '<container name>';
  var blobName = '<blob name>';
  var sasTokenContainerLevel = '<container level sas token copied from Azure Storage Explorer, such as `st=2019-12-31T07%3A17%3A31Z&se=2020-01-01T07%3A17%3A31Z&sp=racwdl&sv=2018-03-28&sr=c&sig=xxxxxxxxxxxxxxxxxxxxxxxxxx`';
  var url = 'https://$accountName.blob.core.windows.net/$containerName/$blobName?$sasTokenContainerLevel';
  var data = File(blobName).readAsBytesSync();
  var dio = Dio();
  try {
    final response = await dio.put(
      url,
      data: data,
      onSendProgress: (int sent, int total) {
        if (total != -1) {
          print((sent / total * 100).toStringAsFixed(0) + "%");
        }
      },
      options: Options(headers: {
        'x-ms-blob-type': 'BlockBlob',
        'Content-Type': 'text/plain',
      })
    );
    print(response.data);
  } catch (e) {
    print(e);
  }
  Response response = await dio.get(url);
  print(response.data);
}
Then, I ran it and got the result shown in the figures below.
The content of the uploaded blob was the JSON string encoded from the Uint8List bytes returned by readAsBytesSync.
I researched the description and the source code of dio, and found that dio is only suitable for sending a request body in JSON format, not raw content as the request body.
Fig 1. The default transformer applied for the POST method
Fig 2. https://github.com/flutterchina/dio/blob/master/dio/lib/src/transformer.dart
So the fix is to write a custom transformer class PutTransformerForRawData to replace the default one, overriding the function transformRequest, as in the code below.
import 'dart:typed_data';

class PutTransformerForRawData extends DefaultTransformer {
  @override
  Future<String> transformRequest(RequestOptions options) async {
    if (options.data is Uint8List) {
      return new String.fromCharCodes(options.data);
    } else if (options.data is String) {
      return options.data;
    }
    // Fall back to the default JSON handling for any other payload type.
    return super.transformRequest(options);
  }
}
And replace the default transformer via the code below.
var dio = Dio();
dio.transformer = PutTransformerForRawData();
Then, you can get the data via the code below.
var data = File(blobName).readAsBytesSync();
Or
var data = File(blobName).readAsStringSync();
Note: the custom transformer PutTransformerForRawData is only for uploading. Please remove the download-and-print code Response response = await dio.get(url); print(response.data); - the default transformer seems to check whether the response body is in JSON format, and I got the exception below when the uploaded file was my sample code:
Unhandled exception:
DioError [DioErrorType.DEFAULT]: FormatException: Unexpected character (at character 1)
import 'dart:typed_data';

Microsoft bot framework : How to define the image path which exists in the solution itself

I have stored the image inside the project itself, and now I would like to display the image on the hero card, so I mentioned the relative path. However, the image is not appearing.
List<CardImage> cardImages = new List<CardImage>();
cardImages.Add(new CardImage(url: "~/duck-on-a-rock.jpg", alt:"image1"));
But when I referenced an image from a website and mentioned that path on the page, like below, the image appeared.
List<CardImage> cardImages = new List<CardImage>();
cardImages.Add(new CardImage(url: "http://www.publicdomainpictures.net/pictures/30000/t2/duck-on-a-rock.jpg", alt:"image1"));
Is it not possible to keep the image inside the project folder?
This is a super old question, but Google brought me here nonetheless. The solution I came across was to create a data URL for your local resource. In Node.js:
const imageData = fs.readFileSync(path.join(__dirname, '../resources/logo.png'));
const base64Image = Buffer.from(imageData).toString('base64');
const inlinePng = {
  name: 'logo.png',
  contentType: 'image/png',
  contentUrl: `data:image/png;base64,${ base64Image }`
};
With an svg, you can skip the base64 encode:
const svgData = fs.readFileSync(path.join(__dirname, './resources/logo.svg'));
const inlineSvg = {
  name: 'logo',
  contentType: 'image/svg',
  contentUrl: `data:image/svg+xml;utf8,${ svgData }`,
};
See Microsoft's docs for reference and C# samples.
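To actually send one of these inline attachments from a Node.js bot, something like the following should work (a sketch assuming a botbuilder TurnContext named context and the inlinePng object from above):

// Send the data-URL image as a message attachment.
await context.sendActivity({
  text: 'Here is the logo:',
  attachments: [inlinePng],
});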
Using the DI approach: pass one extra IHostingEnvironment parameter to your dialog constructor.
public DialogBot(ConversationState conversationState, T dialog, ConcurrentDictionary<string, ConversationReference> conversationReferences, IHostingEnvironment env)
{
    _conversationReferences = conversationReferences;
    _conversationState = conversationState;
    _dialog = dialog;
    _env = env;
}
And later you can use _env.WebRootPath to access your local storage (for the Bot Framework Emulator), or use the environment variable WEBSITE_HOSTNAME if you run the bot in Azure.
private async Task SendWelcomeCardAsync(ITurnContext turnContext, CancellationToken cancellationToken)
{
    var card = new HeroCard();
    card.Title = "Welcome to Bot Framework!";
    var host = new Uri(Environment.GetEnvironmentVariable("WEBSITE_HOSTNAME") != null ? ("https://" + Environment.GetEnvironmentVariable("WEBSITE_HOSTNAME")) : _env.WebRootPath + "/");
    var img = new Uri(host, "MyImage.jpg");
    card.Images = new List<CardImage>() { new CardImage(img.AbsoluteUri) };
    var response = MessageFactory.Attachment(card.ToAttachment());
    await turnContext.SendActivityAsync(response, cancellationToken);
}
I simply dropped the MyImage.jpg image into the wwwroot project folder and marked it as Content.
