File chunk upload to azure storage blob, file seems broken - node.js

I'm trying to upload an Excel file to Azure Storage blob in chunks, using stageBlock and commitBlockList from the BlockBlobClient class. The file upload seems to succeed, but when I try to download and open the file, it appears to be broken.
I'm using React and Node.js to do this. The code follows below.
In UI
const chunkSize = (1024 * 1024) * 25; // file chunk size
// here slicing the file and sending it to api method
const fileReader = new FileReader();
const from = currentChunkIndexRef.current * chunkSize;
const to = from + chunkSize;
const blob = file.slice(from, to);
fileReader.onload = ((e: any) => uploadChunksToBlob(e, file, obj));
fileReader.readAsDataURL(blob);
// api method
const uploadChunksToBlob = async (event: any, file: File, obj: any) => {
try {
const totalChunks = Math.ceil(file.size / chunkSize);
const uploadChunkURL = `/upload?currentChunk=${currentChunkIndexRef.current}&totalChunks=${totalChunks}&file=${file.name}&type=${file.type}`;
console.log(event.target.result)
const fileUpload = await fetch(uploadChunkURL, {
method: "POST",
headers: { "Content-Type": "application/octet-stream" },
body: JSON.stringify(event.target.result),
});
const fileUploadJson = await fileUpload.json();
const isLastChunk = (totalChunks - 1) === currentChunkIndexRef.current;
if(!isLastChunk) {
console.log({ Chunk: currentChunkIndexRef.current });
currentChunkIndexRef.current = currentChunkIndexRef.current + 1;
// eslint-disable-next-line @typescript-eslint/no-use-before-define
uploadFileToAzureBlob(file, obj);
} else {
console.log("File Uploaded")
}
//
} catch (error) {
console.log("uploadFileToAzureBlob Catch Error" + error);
}
}
// In Node
const sharedKeyCredential = new StorageSharedKeyCredential(
config.StorageAccountName,
config.StorageAccountAccessKey
);
const pipeline = newPipeline(sharedKeyCredential);
const blobServiceClient = new BlobServiceClient(
`https://${config.StorageAccountName}.blob.core.windows.net`,
pipeline
);
const containerName = getContainerName(req.headers.key, req.headers.clientcode);
const identifier = uuid.v4();
const blobName = getBlobName(identifier, file);
const containerClient = blobServiceClient.getContainerClient(containerName);
const blockBlobClient = containerClient.getBlockBlobClient(blobName);
try {
let bufferObj = Buffer.from(`${file}_${Number(currentChunk)}`, "utf8"); // Create buffer object, specifying utf8 as encoding
let base64String = bufferObj.toString("base64"); // Encode the Buffer as a base64 string
blockIds = [...blockIds, base64String];
const bufferedData = Buffer.from(req.body);
let resultOfUnitArray = new Uint8Array(bufferedData.length);
for (let j = 0; j < bufferedData.length; j++) {
resultOfUnitArray[j] = bufferedData.toString().charCodeAt(j);
} // Converting string to bytes
const stageBlockResponse = await blockBlobClient.stageBlock(base64String, resultOfUnitArray, resultOfUnitArray.length, {
onProgress: (e) => {
console.log("bytes sent: " + e.loadedBytes);
}
});
if ((Number(totalChunks) - 1) === (Number(currentChunk))) {
const commitblockResponse = await blockBlobClient.commitBlockList(blockIds, {blobHTTPHeaders: req.headers});
res.json({ uuid: identifier, message: 'File uploaded to Azure Blob storage.' });
} else {
res.json({ message: `Current Chunks ${currentChunk} is Successfully Uploaded` });
}
} catch (err) {
console.log({ err })
res.json({ message: err.message });
}
I don't know what I'm doing wrong here.
Any help would be appreciated.
Thank you

The problem is that you convert the chunk into a data URL; that's where things break.
It appears you're under the impression that you need to encode a Blob into a string before sending it. You don't; the browser's fetch API is perfectly capable of handling a raw binary payload.
So on the client (browser) side, you don't need to go through FileReader at all. Just send the chunk Blob directly:
const blob = file.slice(from, to);
// ...
fetch(uploadChunkURL, {
method: "POST",
headers: { "Content-Type": "application/octet-stream" },
body: blob,
});
On the server (Node.js) side you'll receive the chunk in raw binary form, so you can simply forward it untouched to Azure Storage. There's no need to decode it from a string and move bytes onto resultOfUnitArray like you currently do. (One assumption worth stating: req.body must actually be the raw Buffer; with Express, for example, that means registering express.raw({ type: 'application/octet-stream' }) for this route.)
const base64String = Buffer.from(`${file}_${Number(currentChunk)}`, "utf8").toString("base64");
const bufferedData = Buffer.from(req.body);
const stageBlockResponse = await blockBlobClient.stageBlock(
base64String,
bufferedData,
bufferedData.length
);

Related

The first argument must be of type string or an instance of Buffer, ArrayBuffer, or Array or an Array-like Object. Received undefined

I'm trying to upload a file (pdf/jpg) using a Lambda function written in NodeJS by triggering the request from Postman, but I'm getting the following error:
2022-02-02T15:09:51.135Z 743939db-7511-4003-8e49-40c95ada47b4 ERROR Invoke Error
{
"errorType": "TypeError",
"errorMessage": "The first argument must be of type string or an instance of Buffer, ArrayBuffer, or Array or an Array-like Object. Received undefined",
"code": "ERR_INVALID_ARG_TYPE",
"stack": [
"TypeError [ERR_INVALID_ARG_TYPE]: The first argument must be of type string or an instance of Buffer, ArrayBuffer, or Array or an Array-like Object. Received undefined",
" at new NodeError (internal/errors.js:322:7)",
" at Function.from (buffer.js:334:9)",
" at Runtime.exports.lambdaHandler [as handler] (/var/task/app.js:68:23)"
]
}
The following is a chunk of event object getting logged on the CloudWatch:-
2022-02-02T20:39:52.136+05:30
Copy
info: Event:: {"body":"{\n \"base64String\": \"/9j/4AAQSkZJRgABAQAAAQABAAD/2wCEAAcHBwcIBwgJCQgMDAsMDBEQDg4QERoSFBIUEhonGB0YGB0YJyMqIiAiKiM+MSsrMT5IPDk8SFdOTldtaG2Pj8ABBwcHBwgHCAkJCAwMCwwMERAODhARGhIUEhQSGicYHRgYHRgnIyoiICIqIz4xKysxPkg8OTxIV05OV21obY+PwP/CABEICHAPAAMBIgACEQEDEQH/xAAcAAEAAgMBAQEAAAAAAAAAAAAABwgBBQYEAwL/2gAIAQEAAAAAsiAA
Lambda (NodeJS code):-
'use-strict'
const AWS = require("aws-sdk");
const logger = require('./logger').logger;
const moment = require('moment');
const fileType = require('file-type');
const { Buffer } = require('buffer');
//const { fileTypeFromFile } = 'file-type';
const ddbTable = process.env.RUNTIME_DDB_TABLE_FREE_USER_DOCUMENT;
const s3TempBucket = process.env.RUNTIME_S3_TEMP_BUCKET;
const s3 = new AWS.S3();
const getFile = (fileMime, buffer, userId) => {
let fileExt = fileMime.ext;
let hash = sha1(new Buffer(new Date().toString()));
let now = moment().format('YYYY-MM-DD HH:mm:ss');
let filePath = hash + '/';
let fileName = unixTime(now) + '.' + fileExt;
let fileFullName = filePath + fileName;
let fileFullPath = s3TempBucket + userId + fileFullName;
const params = {
Body: buffer,
Bucket: s3TempBucket,
Key: fileName
};
let uploadFile = {
size: buffer.toString('ascii').length,
type: fileMime.mime,
name: fileName,
fullPath: fileFullPath
}
return {
'params': params,
'uploadFile': uploadFile
}
}
exports.lambdaHandler = async (event, context) => {
logger.info("Event::", event);
logger.info('Uploading file to bucket::', s3TempBucket);
let body, data;
let statusCode = 200;
const headers = {
'Content-Type': 'application/json',
'Access-Control-Allow-Headers': 'Content-Type',
'Access-Control-Allow-Origin': '*',
'Access-Control-Allow-Methods': '*'
};
let request = JSON.parse(event.body);
let base64String = await request.base64String;
logger.info("base64String::", base64String);
let buffer = Buffer.from(base64String, 'base64');
//let buffer = new Buffer(base64String, 'base64');
let fileMime = fileType(buffer);
logger.info(fileMime);
if (fileMime === null) {
return context.fail('String supplied is not file type');
}
//let file = getFile(fileMime, buffer, user.id);
let file = getFile(fileMime, buffer, 'b06eb6f4-0ff0-5cb5-a41c-e000af66c8e9');
let params = file.params;
try {
//await new Promise((resolve, reject) => {
s3.putObject(params, (err, results) => {
if (err) reject(err);
else {
console.log(results);
body = results;
resolve(results)
}
});
// });
} catch (err) {
logger.info(err);
statusCode = 400;
body = err.message;
return err;
} finally {
body = JSON.stringify(data);
}
return {
statusCode,
body,
headers
};
}
The base64String is coming through as undefined; I'm not sure why, as I can see it clearly in the event object:
let buffer = Buffer.from(base64String, 'base64');
Please assist, thanks
Postman request:-
If you use API Gateway, you don't need to base64-encode the payload, because API Gateway does that automatically.
A sample is provided by AWS:
Select Create function
Select Browse serverless app repository
Find "uploader: Serverless web application for uploading files to S3"
Deploy
uploader's github
This creates an API Gateway and a Node.js Lambda. (You need to provide the S3 bucket.)
The instructions say that to upload a file, you open the InvokeURL in a browser and drag-drop a file onto it. You can do this with Postman too, as follows.
Input POST InvokeURL/api/file/text.pdf.
Set body KEY to File and input text.pdf. Select the pdf file as VALUE.
Send
You can find the code in index.js and extract what you need.
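Back on the original error, one thing worth checking before blaming the payload is how the body reaches the handler. With Lambda proxy integration, event.body arrives as a string and may itself be base64-encoded when API Gateway sets isBase64Encoded. A small sketch (the parseBody helper name is made up, not part of any SDK):

```javascript
// Decode event.body if API Gateway flagged it as base64, then parse JSON.
// Skipping this while isBase64Encoded is true leaves you parsing garbage,
// so fields like base64String come out undefined.
const parseBody = (event) => {
  const raw = event.isBase64Encoded
    ? Buffer.from(event.body, "base64").toString("utf8")
    : event.body;
  return JSON.parse(raw);
};

// Plain-text body:
const evt1 = { body: JSON.stringify({ base64String: "abc" }), isBase64Encoded: false };
console.log(parseBody(evt1).base64String); // abc

// Same body, base64-wrapped by the gateway:
const evt2 = {
  body: Buffer.from(evt1.body, "utf8").toString("base64"),
  isBase64Encoded: true,
};
console.log(parseBody(evt2).base64String); // abc
```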

aws sdk Multipart Upload to s3 with node.js

I am trying to upload large files to an S3 bucket using the Node.js aws-sdk.
In v2, the upload method transparently handles uploading files in multiple parts.
I want to use the new v3 aws-sdk. What is the way to upload large files in the new version? The method PutObjectCommand doesn't seem to be doing it.
I've seen there are methods such as CreateMultipartUpload, but I can't seem to find a full working example using them.
Thanks in advance.
As of 2021, I would suggest using the lib-storage package, which abstracts a lot of the implementation details.
Sample code:
import { Upload } from "@aws-sdk/lib-storage";
import { S3Client, S3 } from "@aws-sdk/client-s3";
const target = { Bucket, Key, Body };
try {
const parallelUploads3 = new Upload({
client: new S3({}) || new S3Client({}), // either client type works here
tags: [...], // optional tags
queueSize: 4, // optional concurrency configuration
partSize: 1024 * 1024 * 5, // optional size of each part (5 MB is the S3 minimum)
leavePartsOnError: false, // optional manually handle dropped parts
params: target,
});
parallelUploads3.on("httpUploadProgress", (progress) => {
console.log(progress);
});
await parallelUploads3.done();
} catch (e) {
console.log(e);
}
Source: https://github.com/aws/aws-sdk-js-v3/blob/main/lib/lib-storage/README.md
Here's what I came up with, to upload a Buffer as a multipart upload, using aws-sdk v3 for nodejs and TypeScript.
Error handling still needs some work (you might want to abort/retry in case of an error), but it should be a good starting point... I have tested this with XML files up to 15MB, and so far so good. No guarantees, though! ;)
import {
CompleteMultipartUploadCommand,
CompleteMultipartUploadCommandInput,
CreateMultipartUploadCommand,
CreateMultipartUploadCommandInput,
S3Client,
UploadPartCommand,
UploadPartCommandInput
} from '@aws-sdk/client-s3'
const client = new S3Client({ region: 'us-west-2' })
export const uploadMultiPartObject = async (file: Buffer, createParams: CreateMultipartUploadCommandInput): Promise<void> => {
try {
const createUploadResponse = await client.send(
new CreateMultipartUploadCommand(createParams)
)
const { Bucket, Key } = createParams
const { UploadId } = createUploadResponse
console.log('Upload initiated. Upload ID: ', UploadId)
// 5MB is the minimum part size
// Last part can be any size (no min.)
// Single part is treated as last part (no min.)
const partSize = (1024 * 1024) * 5 // 5MB
const fileSize = file.length
const numParts = Math.ceil(fileSize / partSize)
const uploadedParts = []
for (let i = 1; i <= numParts; i++) {
const startOfPart = (i - 1) * partSize
// slice end is non-inclusive, so each part gets at most partSize bytes
const endOfPart = Math.min(startOfPart + partSize, fileSize)
const uploadParams: UploadPartCommandInput = {
Body: file.slice(startOfPart, endOfPart),
Bucket,
Key,
UploadId,
PartNumber: i
}
const uploadPartResponse = await client.send(new UploadPartCommand(uploadParams))
console.log(`Part #${i} uploaded. ETag: `, uploadPartResponse.ETag)
// For each part upload, you must record the part number and the ETag value.
// You must include these values in the subsequent request to complete the multipart upload.
// https://docs.aws.amazon.com/AmazonS3/latest/API/API_CompleteMultipartUpload.html
uploadedParts.push({ PartNumber: i, ETag: uploadPartResponse.ETag })
}
const completeParams: CompleteMultipartUploadCommandInput = {
Bucket,
Key,
UploadId,
MultipartUpload: {
Parts: uploadedParts
}
}
console.log('Completing upload...')
const completeData = await client.send(new CompleteMultipartUploadCommand(completeParams))
console.log('Upload complete: ', completeData.Key, '\n---')
} catch(e) {
throw e
}
}
Here is the fully working code with AWS SDK v3
import { Upload } from "@aws-sdk/lib-storage";
import { S3Client } from "@aws-sdk/client-s3";
import { createReadStream } from 'fs';
const inputStream = createReadStream('clamav_db.zip');
const Bucket = process.env.DB_BUCKET
const Key = process.env.FILE_NAME
const Body = inputStream
const target = { Bucket, Key, Body};
try {
const parallelUploads3 = new Upload({
client: new S3Client({
region: process.env.AWS_REGION,
credentials: { accessKeyId: process.env.AWS_ACCESS_KEY, secretAccessKey: process.env.AWS_SECRET_KEY }
}),
queueSize: 4, // optional concurrency configuration
partSize: 5242880, // optional size of each part
leavePartsOnError: false, // optional manually handle dropped parts
params: target,
});
parallelUploads3.on("httpUploadProgress", (progress) => {
console.log(progress);
});
await parallelUploads3.done();
} catch (e) {
console.log(e);
}

how to solve audio encoding error in Media-translation GCP API?

Here's my code.
I have gone through the Google Cloud Platform API documentation and followed the steps in the GCP docs correctly, but I'm still unable to fix the encoding error, which you can see below. I'm trying to translate an audio clip from en-US (English) to hi-IN (Hindi); it would also be helpful if you could suggest alternative approaches.
function main(filename, encoding, sourceLanguage, targetLanguage) {
const fs = require('fs');
const {
SpeechTranslationServiceClient,
} = require('@google-cloud/media-translation');
const client = new SpeechTranslationServiceClient();
async function quickstart() {
const filename = './16kmonoceo.wav';
const encoding = 'LINEAR16';
const sourceLanguage = 'en-US';
const targetLanguage = 'hi-IN';
const config = {
audioConfig: {
audioEncoding: encoding,
sourceLanguageCode: sourceLanguage,
targetLanguageCode: targetLanguage,
},
};
const initialRequest = {
streamingConfig: config,
audioContent: null,
};
const readStream = fs.createReadStream(filename, {
highWaterMark: 4096,
encoding: 'base64',
});
const chunks = [];
readStream
.on('data', chunk => {
const request = {
streamingConfig: config,
audioContent: chunk.toString(),
};
chunks.push(request);
})
.on('close', () => {
// Config-only request should be first in stream of requests
stream.write(initialRequest);
for (let i = 0; i < chunks.length; i++) {
stream.write(chunks[i]);
}
stream.end();
});
const stream = client.streamingTranslateSpeech().on('data', response => {
const {result} = response;
if (result.textTranslationResult.isFinal) {
console.log(
`\nFinal translation: ${result.textTranslationResult.translation}`
);
console.log(`Final recognition result: ${result.recognitionResult}`);
} else {
console.log(
`\nPartial translation: ${result.textTranslationResult.translation}`
);
console.log(`Partial recognition result: ${result.recognitionResult}`);
}
});
}
quickstart();
}
main(...process.argv.slice(2));
Here is my error from the command line:
CHECK ERROR MESSAGE
I'm using windows 10 and IDE VS CODE.
This is a case where careful reading of the error message helps.
Some module gacked on "LINEAR16" as the audioEncoding value, saying there's no encoding with that name.
A quick look at the documentation shows "linear16" (lower case) as the value to use.
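Concretely, that makes the fix a one-word change in the question's config (everything else stays as it was):

```javascript
// audioEncoding uses the lowercase name the docs show.
const config = {
  audioConfig: {
    audioEncoding: 'linear16',
    sourceLanguageCode: 'en-US',
    targetLanguageCode: 'hi-IN',
  },
};

console.log(config.audioConfig.audioEncoding); // linear16
```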

Posting An Image from Webcam to Azure Face Api

I am trying to upload an image that I get from my webcam to the Microsoft Azure Face API. I get the image from canvas.toDataURL('image/png'), which contains the data URI. I change the Content-Type to application/octet-stream, and when I attach the data URI to the POST request, I get a Bad Request (400) Invalid Face Image. If I change the attached data to a Blob, I stop receiving errors; however, I only get back an empty array instead of a JSON object. I would really appreciate any help pointing me in the right direction.
Thanks!
Oh, you're in luck; I've just (successfully!) attempted this two days ago.
Sending base64-encoded JPEGs to the Face API is seriously inefficient: the ratio of encoded output bytes to input bytes is 4:3 (33% overhead). Just send a byte array; it works, and the docs mention it briefly.
And read the canvas as JPEG, not PNG; PNG just wastes bandwidth for webcam footage.
...
var dataUri = canvas.toDataURL('image/' + format);
var data = dataUri.split(',')[1];
var mimeType = dataUri.split(';')[0].slice(5)
var bytes = window.atob(data);
var buf = new ArrayBuffer(bytes.length);
var byteArr = new Uint8Array(buf);
for (var i = 0; i < bytes.length; i++) {
byteArr[i] = bytes.charCodeAt(i);
}
return byteArr;
Now use byteArr as your payload (data:) in $.ajax() for jQuery or iDontUnderStandHowWeGotHereAsAPeople() in any other hipster JS framework people use these days.
The reverse-hipster way of doing it is:
var payload = byteArr;
var xhr = new XMLHttpRequest();
xhr.open('POST', 'https://SERVICE_URL');
xhr.setRequestHeader('Content-Type', 'application/octet-stream');
xhr.send(payload);
To extend Dalvor's answer: this is the AJAX call that works for me:
fetch(data)
.then(res => res.blob())
.then(blobData => {
$.post({
url: "https://westus.api.cognitive.microsoft.com/face/v1.0/detect",
contentType: "application/octet-stream",
headers: {
'Ocp-Apim-Subscription-Key': '<YOUR-KEY-HERE>'
},
processData: false,
data: blobData
})
.done(function(data) {
$("#results").text(JSON.stringify(data));
})
.fail(function(err) {
$("#results").text(JSON.stringify(err));
});
});
Full demo code here: https://jsfiddle.net/miparnisari/b1zzpvye/
To save someone six hours, I've appended the code that worked for me.
I hope this code helps you.
Tools
React
Typescript
React-webcam
Mac OS
Axios
Code
index.tsx
Constants and ref
/**
* Constants
*/
const videoConstraints = {
width: 1280,
height: 720,
facingMode: 'user',
};
/**
* Refs
*/
const webcamRef = React.useRef<Webcam>(null);
Call back function
const capture = React.useCallback(() => {
const base64Str = webcamRef.current!.getScreenshot() || '';
const s = base64Str.split(',');
const blob = b64toBlob(s[1]);
callCognitiveApi(blob);
}, [webcamRef]);
In render
<Webcam audio={false} ref={webcamRef} screenshotFormat="image/jpeg" videoConstraints={videoConstraints} />
<button onClick={capture}>Capture photo</button>
base64toBlob
Thanks to creating-a-blob-from-a-base64-string-in-javascript
export const b64toBlob = (b64DataStr: string, contentType = '', sliceSize = 512) => {
const byteCharacters = atob(b64DataStr);
const byteArrays = [];
for (let offset = 0; offset < byteCharacters.length; offset += sliceSize) {
const slice = byteCharacters.slice(offset, offset + sliceSize);
const byteNumbers = new Array(slice.length);
for (let i = 0; i < slice.length; i++) {
byteNumbers[i] = slice.charCodeAt(i);
}
const byteArray = new Uint8Array(byteNumbers);
byteArrays.push(byteArray);
}
const blob = new Blob(byteArrays, { type: contentType });
return blob;
};
callCognitiveApi
import axios from 'axios';
const subscriptionKey: string = 'This_is_your_subscription_key';
const url: string = 'https://this-is-your-site.cognitiveservices.azure.com/face/v1.0/detect';
export const callCognitiveApi = (data: any) => {
const config = {
headers: { 'content-type': 'application/octet-stream', 'Ocp-Apim-Subscription-Key': subscriptionKey },
};
const response = axios
.post(url, data, config)
.then((res) => {
console.log(res);
})
.catch((error) => {
console.error(error);
});
};
Result
So I finally got the answer: send the image as a Blob object. You first grab the image from the canvas with:
let data = canvas.toDataURL('image/jpeg');
Afterwards, you can reformat it to a blob data object by running:
fetch(data)
.then(res => res.blob())
.then(blobData => {
// attach blobData as the data for the post request
});
You will also need to switch the Content-Type of the POST request to "application/octet-stream".

Azure Functions - NodeJS - Response Body as a Stream

I'd like to return a file from Blob Storage when you hit a given Azure Function endpoint. This file is binary data.
Per the Azure Storage Blob docs, the most relevant call appears to be the following, since it's the only one that doesn't require writing the file to an interim file:
getBlobToStream
However this call gets the Blob and writes it to a stream.
Is there a way with Azure Functions to use a Stream as the value of res.body so that I can get the Blob Contents from storage and immediately write it to the response?
To add some code, trying to get something like this to work:
'use strict';
const azure = require('azure-storage'),
stream = require('stream');
const BLOB_CONTAINER = 'DeContainer';
module.exports = function(context){
var file = context.bindingData.file;
var blobService = azure.createBlobService();
var outputStream = new stream.Writable();
blobService.getBlobToStream(BLOB_CONTAINER, file, outputStream, function(error, serverBlob) {
if(error) {
FileNotFound(context);
} else {
context.res = {
status: 200,
headers: {
},
isRaw: true,
body : outputStream
};
context.done();
}
});
}
function FileNotFound(context){
context.res = {
status: 404,
headers: {
"Content-Type" : "application/json"
},
body : { "Message" : "No esta aqui!."}
};
context.done();
}
Unfortunately we don't have streaming support implemented in NodeJS just yet - it's on the backlog: https://github.com/Azure/azure-webjobs-sdk-script/issues/1361
If you're not tied to NodeJS and are open to using a C# function instead, you can use the storage SDK object directly in your input bindings and stream request output, instead of using the intermediate object approach.
While @Matt Manson's answer is definitely correct based on the way I asked my question, the following code snippet might be more useful for someone who stumbles across this question.
While I can't send the Stream to the response body directly, I can use a custom stream which captures the data into a Uint8Array, and then sends that to the response body.
NOTE: If the file is REALLY big, this will use a lot of memory.
'use strict';
const azure = require('azure-storage'),
stream = require('stream');
const BLOB_CONTAINER = 'deContainer';
module.exports = function(context){
var file = context.bindingData.file;
var blobService = azure.createBlobService();
var outputStream = new stream.Writable();
outputStream.contents = new Uint8Array(0);//Initialize contents.
//Override the write to store the value to our "contents"
outputStream._write = function (chunk, encoding, done) {
var curChunk = new Uint8Array(chunk);
var tmp = new Uint8Array(this.contents.byteLength + curChunk.byteLength);
tmp.set(this.contents, 0);
tmp.set(curChunk, this.contents.byteLength);
this.contents = tmp;
done();
};
blobService.getBlobToStream(BLOB_CONTAINER, file, outputStream, function(error, serverBlob) {
if(error) {
FileNotFound(context);
} else {
context.res = {
status: 200,
headers: {
},
isRaw: true,
body : outputStream.contents
};
context.done();
}
});//*/
}
function FileNotFound(context){
context.res = {
status: 404,
headers: {
"Content-Type" : "application/json"
},
body : { "Message" : "No esta aqui!"}
};
context.done();
}
I tried @Doug's solution from the last comment above, with a few minor mods in my Azure Function, and so far, after trying 20 different ideas, this is the only one that actually delivered the file to the browser! Thank you, @Doug...
const fs = require("fs");
const stream = require("stream");
...
const AzureBlob = require('@[my_private_artifact]/azure-blob-storage');
const azureStorage = new AzureBlob(params.connectionString);
//Override the write to store the value to our "contents" <-- Doug's solution
var outputStream = new stream.Writable();
outputStream.contents = new Uint8Array(0);//Initialize contents.
outputStream._write = function (chunk, encoding, done) {
var curChunk = new Uint8Array(chunk);
var tmp = new Uint8Array(this.contents.byteLength + curChunk.byteLength);
tmp.set(this.contents, 0);
tmp.set(curChunk, this.contents.byteLength);
this.contents = tmp;
done();
};
let azureSpeedResult = await azureStorage.downloadBlobToStream(params.containerName, params.objectId, outputStream);
let headers = {
"Content-Length": azureSpeedResult.size,
"Content-Type": mimeType
};
if (params.action == "download") {
headers["Content-Disposition"] = "attachment; filename=" + params.fileName;
}
context.res = {
status: 200,
headers: headers,
isRaw: true,
body: outputStream.contents
};
context.done();
...

Resources