NestJS - Trying to Upload Base64 Image to Azure Blob Storage

I'm trying to upload a base64 string of an image to Azure Storage in NestJS, but the following function uploads an empty image with empty content.
async uploadBase64Image() {
  const image = 'data:image/jpeg;base64,/9j/4AAQSkZJRgABAQEASABIAAD/2wCEAAMCAgMCAgMDAwMEAwMEBQgFBQQEBQoHBwYIDAoMDAsKCwsNDhIQDQ4RDgsLEBYQERMUFRUVDA8XGB......';
  const newFileName = (Math.random() + 1).toString(36).substring(2);
  const base64Data = image as string;
  const [, imageExtension] = base64Data.split(';')[0].split('/');
  const filename = newFileName + '.' + imageExtension;
  const base64_img = image.replace(/^data:image\/jpeg;base64,/, '');
  const imgBuffer = Readable.from(Buffer.from(base64_img, 'base64'));
  const blobName = `images/slider/${filename}`;
  // const stream = Readable.from(base64_img); // Tried that too
  const containerClient: ContainerClient = (
    this.blobServiceClient as BlobServiceClient
  ).getContainerClient('public');
  const blockBlobClient = containerClient.getBlockBlobClient(blobName);
  // await blockBlobClient.create
  await blockBlobClient.uploadStream(
    imgBuffer,
    this.uploadOptions.bufferSize,
    this.uploadOptions.maxBuffers,
    { blobHTTPHeaders: { blobContentType: 'image/jpeg' } }
  );
}

Can you try using the uploadData method instead of uploadStream? Your code would look something like this:
async uploadBase64Image() {
  const image = 'data:image/jpeg;base64,/9j/4AAQSkZJRgABAQEASABIAAD/2wCEAAMCAgMCAgMDAwMEAwMEBQgFBQQEBQoHBwYIDAoMDAsKCwsNDhIQDQ4RDgsLEBYQERMUFRUVDA8XGB......';
  const newFileName = (Math.random() + 1).toString(36).substring(2);
  const base64Data = image as string;
  const [, imageExtension] = base64Data.split(';')[0].split('/');
  const filename = newFileName + '.' + imageExtension;
  const base64_img = image.replace(/^data:image\/jpeg;base64,/, '');
  // uploadData accepts a Buffer directly, so no Readable wrapper is needed
  const imgBuffer = Buffer.from(base64_img, 'base64');
  const blobName = `images/slider/${filename}`;
  const containerClient: ContainerClient = (
    this.blobServiceClient as BlobServiceClient
  ).getContainerClient('public');
  const blockBlobClient = containerClient.getBlockBlobClient(blobName);
  await blockBlobClient.uploadData(imgBuffer, {
    blobHTTPHeaders: { blobContentType: 'image/jpeg' },
  });
}
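As a small refinement, the content type can be derived from the data URL itself instead of being hardcoded to image/jpeg, so the same method also handles PNGs and other formats. A minimal sketch, assuming the input is always a well-formed 'data:<mime>;base64,<payload>' string (parseDataUrl is a hypothetical helper, not part of the SDK):

// Derive both the content type and the payload from the data URL.
function parseDataUrl(dataUrl: string): { contentType: string; buffer: Buffer } {
  const [header, payload] = dataUrl.split(',');
  const contentType = header.slice('data:'.length, header.indexOf(';')); // e.g. "image/jpeg"
  return { contentType, buffer: Buffer.from(payload, 'base64') };
}

// Usage inside the upload method:
// const { contentType, buffer } = parseDataUrl(image);
// await blockBlobClient.uploadData(buffer, {
//   blobHTTPHeaders: { blobContentType: contentType },
// });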

Related

Set content type to image in JavaScript v10 SDK (Node.js)

When I try to upload images using the Azure JS SDK v10, it shows them as application/octet-stream.
This is what I did initially, following the official Azure documentation:
async function uploadLocalFile(
  aborter,
  containerClient,
  filePath,
  file_guid,
) {
  filePath = path.resolve(filePath); // image file path
  const fileName = file_guid; // contains the file name
  const blobClient = containerClient.getBlobClient(fileName);
  const blockBlobClient = blobClient.getBlockBlobClient();
  return await blockBlobClient.uploadFile(filePath, aborter);
}
This uploaded the images as application/octet-stream.
Then I tried setting the headers to make the content type image/jpeg, but it still ends up as application/octet-stream:
filePath = path.resolve(filePath);
const fileName = file_guid;
const blobClient = containerClient.getBlobClient(fileName);
const blockBlobClient = blobClient.getBlockBlobClient();
const blobOptions = { blobHTTPHeaders: { blobContentType: 'image/jpeg' } };
return await blockBlobClient.uploadFile(filePath, aborter, blobOptions);
Is there any way to set the images' content type to image/jpeg while uploading to Azure Blob Storage?
I tested with your code and set the HTTP header options; you can refer to the BlobHTTPHeaders interface description. Below is my test code.
const { BlobServiceClient, StorageSharedKeyCredential } = require("@azure/storage-blob");
const path = require("path");

// Enter your storage account name and shared key
const account = "xxxx";
const accountKey = "xxxxxxxxxxxxxxxxxxxxxxxxxxx";
const sharedKeyCredential = new StorageSharedKeyCredential(account, accountKey);
const blobServiceClient = new BlobServiceClient(
  `https://${account}.blob.core.windows.net`,
  sharedKeyCredential
);
const containerName = "container";

async function main() {
  const containerClient = blobServiceClient.getContainerClient(containerName);
  const filePath = path.resolve("C:\\xxxxx\\me.jpg");
  const fileName = "bbb";
  const blobClient = containerClient.getBlobClient(fileName);
  const blockBlobClient = blobClient.getBlockBlobClient();
  const blobOptions = { blobHTTPHeaders: { blobContentType: 'image/jpeg' } };
  const uploadBlobResponse = await blockBlobClient.uploadFile(filePath, blobOptions);
  console.log(`Upload block blob successfully`, uploadBlobResponse.requestId);
}
main();
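If some blobs were already uploaded as application/octet-stream, the content type can also be corrected in place without re-uploading, using setHTTPHeaders from the same SDK. A minimal sketch (fixContentType is a hypothetical helper; "bbb" is the blob name from the code above):

// Fix the content type on an existing blob without re-uploading it.
async function fixContentType(containerClient: ContainerClient, blobName: string) {
  const blobClient = containerClient.getBlobClient(blobName);
  await blobClient.setHTTPHeaders({ blobContentType: "image/jpeg" });
}
// await fixContentType(containerClient, "bbb");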

Can't set content type using uploadFile method in @azure/storage-blob (SDK/NPM)

I can't set the Azure content type from Node using the code below. It always stores the content type as application/octet-stream.
const { BlobServiceClient } = require('@azure/storage-blob');
const { AZURE_STORAGE_CONNECTION_STRING } = process.env;
let blobServiceClient;

async function getBlobServiceClient() {
  if (!blobServiceClient) {
    blobServiceClient = await BlobServiceClient.fromConnectionString(AZURE_STORAGE_CONNECTION_STRING);
  }
  return blobServiceClient;
}

async function uploadFile(filePath, containerName) {
  const bsClient = await getBlobServiceClient();
  const containerClient = bsClient.getContainerClient(containerName);
  const blockBlobClient = containerClient.getBlockBlobClient('myImag6.png', { blobHTTPHeaders: { blobContentType: 'image/png' } });
  try {
    const res = await blockBlobClient.uploadFile(filePath);
    console.log(res);
  } catch (error) {
    console.log(error);
  }
}
The following issue seems related to this but I am not sure.
https://github.com/Azure/azure-sdk-for-js/issues/6192
Please give me more info on this and how to solve this issue.
I suppose it's because you don't pass the BlockBlobUploadOptions to the uploadFile method; you only use it in the getBlockBlobClient method. In my test code below, it sets the content type correctly.
const { BlobServiceClient, StorageSharedKeyCredential } = require("@azure/storage-blob");

// Enter your storage account name and shared key
const account = "account name";
const accountKey = "account key";
const sharedKeyCredential = new StorageSharedKeyCredential(account, accountKey);
const blobServiceClient = new BlobServiceClient(
  `https://${account}.blob.core.windows.net`,
  sharedKeyCredential
);
const containerName = "test";

async function main() {
  const containerClient = blobServiceClient.getContainerClient(containerName);
  const blockBlobClient = containerClient.getBlockBlobClient('test.txt');
  const blobOptions = { blobHTTPHeaders: { blobContentType: 'text/plain' } };
  const uploadBlobResponse = await blockBlobClient.uploadFile('E:\\project\\jsstorage\\test.txt', blobOptions);
  console.log(`Upload block blob test.txt successfully`, uploadBlobResponse.requestId);
}
main();
Did you try setting the blobHTTPHeaders and passing them to the uploadFile method?
const blobOptions = { blobHTTPHeaders: { blobContentType: 'image/png' } };
const res = await blockBlobClient.uploadFile(filePath, blobOptions);
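If the upload handles more than one file type, the content type can be derived from the file extension rather than hardcoded. A minimal sketch, assuming a small fixed set of extensions (contentTypeFor and MIME_BY_EXT are hypothetical names, not SDK APIs):

import * as path from 'path';

// Map a file extension to a blobContentType, falling back to octet-stream.
const MIME_BY_EXT: Record<string, string> = {
  '.png': 'image/png',
  '.jpg': 'image/jpeg',
  '.jpeg': 'image/jpeg',
  '.pdf': 'application/pdf',
};

function contentTypeFor(filePath: string): string {
  return MIME_BY_EXT[path.extname(filePath).toLowerCase()] ?? 'application/octet-stream';
}

// const res = await blockBlobClient.uploadFile(filePath, {
//   blobHTTPHeaders: { blobContentType: contentTypeFor(filePath) },
// });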

Concat two PDFs in Firebase Cloud Functions with pdf-lib

I'm trying to merge two PDF files using pdf-lib (I took the example code from the official pdf-lib site). The goal is to trigger the Cloud Function when a new file is uploaded to the bucket. The function then collects the URLs of the files to be merged in the same bucket as the new one. I am able to get the URLs, but I get an error from pdf-lib. Maybe I'm importing it the wrong way, because the example uses ES6 syntax (import) but Node.js needs require. I'm new to backend and Node.js, so any help is highly appreciated.
const functions = require('firebase-functions');
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();
const admin = require('firebase-admin');
admin.initializeApp();
const { PDFDocument } = require('pdf-lib');
const fetch = require('node-fetch');

exports.testCloudFunc = functions.storage.object().onFinalize(async object => {
  const filePath = object.name;
  const { Logging } = require('@google-cloud/logging');
  console.log(`Logged: FILEPATH: ${filePath}`);
  const id = filePath.split('/');
  console.log(`Logged: ID: ${id[0]}/${id[1]}`);
  const bucket = object.bucket;
  console.log(`Logged: BUCKET: ${object.bucket}`);

  async function listFilesByPrefix() {
    const options = {
      prefix: id[0] + '/' + id[1]
    };
    const [files] = await storage.bucket(bucket).getFiles(options);
    const endFiles = files.filter(el => {
      return (
        el.name === id[0] + '/' + id[1] + '/' + 'invoiceReport.pdf' ||
        el.name === id[0] + '/' + id[1] + '/' + 'POD.pdf' ||
        el.name === id[0] + '/' + id[1] + '/' + 'rateConfirmation.pdf'
      );
    });
    endFiles.forEach(el => console.log(el.name));
    const promises = [];
    for (let i = 0; i < endFiles.length; i++) {
      console.log(endFiles[i].name);
      promises.push(
        endFiles[i].getSignedUrl({
          action: 'read',
          expires: '03-17-2025'
        })
      );
    }
    const urlsArray = await Promise.all(promises);
    return urlsArray;
  }

  listFilesByPrefix()
    .then(results => {
      results.forEach(el => console.log(el));
      copyPages(results[0], results[1]);
      return results;
    })
    .catch(console.error);
});

async function copyPages(url1, url2) {
  const firstDonorPdfBytes = await fetch(url1).then(res => res.arrayBuffer());
  const secondDonorPdfBytes = await fetch(url2).then(res => res.arrayBuffer());
  const firstDonorPdfDoc = await PDFDocument.load(firstDonorPdfBytes);
  const secondDonorPdfDoc = await PDFDocument.load(secondDonorPdfBytes);
  const pdfDoc = await PDFDocument.create();
  const [firstDonorPage] = await pdfDoc.copyPages(firstDonorPdfDoc, [0]);
  const [secondDonorPage] = await pdfDoc.copyPages(secondDonorPdfDoc, [742]);
  pdfDoc.addPage(firstDonorPage);
  pdfDoc.insertPage(0, secondDonorPage);
  const pdfBytes = await pdfDoc.save();
}
But in the Firebase console logs I'm getting this:
TypeError: Cannot read property 'node' of undefined
at PDFDocument.<anonymous> (/srv/node_modules/pdf-lib/cjs/api/PDFDocument.js:459:62)
at step (/srv/node_modules/tslib/tslib.js:136:27)
at Object.next (/srv/node_modules/tslib/tslib.js:117:57)
at fulfilled (/srv/node_modules/tslib/tslib.js:107:62)
at <anonymous>
at process._tickDomainCallback (internal/process/next_tick.js:229:7)
I was facing the same problem. Make sure your files are public, or generate a signed URL. An example follows:
const bucketName = 'your-bucket-name';
const bucket = storage.bucket(bucketName);
const options = {
  prefix: 'notas', // folder name
};
const optionsBucket = {
  version: 'v2',
  action: 'read',
  expires: Date.now() + 1000 * 60 * 9, // 9 minutes
};
const [files] = await bucket.getFiles(options);
const mergedPdf = await PDFDocument.create();
for (let nota of files) {
  let fileName = nota.name;
  if (fileName.endsWith('.pdf')) {
    const [url] = await bucket.file(fileName).getSignedUrl(optionsBucket); // generate signed url
    const arrayBuffer = await fetch(url).then(res => res.arrayBuffer());
    const pdf = await PDFDocument.load(arrayBuffer);
    const copiedPages = await mergedPdf.copyPages(pdf, pdf.getPageIndices());
    copiedPages.forEach((page) => {
      mergedPdf.addPage(page);
    });
  }
}
// save the merged result once, after all pages have been copied
const mergedPdfFile = await mergedPdf.save();
const file = bucket.file(`folder/filename.pdf`);
await file.save(mergedPdfFile);
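Since the Cloud Function already runs with access to the bucket, the files can also be read directly with download() instead of going through a signed URL and fetch. A minimal sketch, assuming the same storage/bucket/pdf-lib setup as above:

// Read each PDF straight from the bucket; download() resolves to [Buffer].
const [contents] = await bucket.file(fileName).download();
const pdf = await PDFDocument.load(contents); // pdf-lib accepts a Buffer/Uint8Array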

Azure Function NodeJS: blobService.createAppendBlobFromLocalFile Promise does not resolve

I am new to Azure Functions and am currently just trying to save an image as a blob to my storage. However, my Promise never resolves, even though the function ends successfully. The image exists in my directory.
Here is my code so far:
module.exports = async function (context, req) {
  const path = require('path');
  const storage = require('azure-storage');
  const STORAGE_ACCOUNT_NAME = 'something';
  const ACCOUNT_ACCESS_KEY = 'also something';
  const blobService = storage.createBlobService(STORAGE_ACCOUNT_NAME, ACCOUNT_ACCESS_KEY);
  const filePath = './product-example.jpg';

  function uploadLocalFile(filePath) {
    return new Promise((resolve, reject) => {
      const fullPath = path.resolve(filePath);
      context.log('before call');
      blobService.createAppendBlobFromLocalFile('productimageupload', 'image-upload-post', fullPath, function (err) {
        if (err) {
          context.log('err');
          reject(err);
        } else {
          context.log('resolve');
          resolve({ message: 'resolved successfully' });
        }
      });
    });
  }

  const output = uploadLocalFile(filePath);
  context.log(output);
};
Here is the log output when executing the function:
2019-04-23T11:03:46 Welcome, you are now connected to log-streaming service.
2019-04-23T11:03:59.185 [Information] Executing 'Functions.branding-tool-app' (Reason='This function was programmatically called via the host APIs.', Id=xyz)
2019-04-23T11:04:00.793 [Information] before call
2019-04-23T11:04:00.794 [Information] Promise { <pending> }
2019-04-23T11:04:00.926 [Information] Executed 'Functions.xyz' (Succeeded, Id=xzy)
As you can see, the Promise still has the status 'pending' and is never resolved in the createBlob function. What am I doing wrong here?
Here is the complete function for uploading a local file to a blob:
await uploadLocalFile(aborter, containerURL, localFilePath);
console.log(`Local file "${localFilePath}" is uploaded`);

async function uploadLocalFile(aborter, containerURL, filePath) {
  filePath = path.resolve(filePath);
  const fileName = path.basename(filePath);
  const blockBlobURL = BlockBlobURL.fromContainerURL(containerURL, fileName);
  return await uploadFileToBlockBlob(aborter, filePath, blockBlobURL);
}
For reference, here is the complete quickstart functionality:
const {
  Aborter,
  BlockBlobURL,
  ContainerURL,
  ServiceURL,
  SharedKeyCredential,
  StorageURL,
  uploadStreamToBlockBlob,
  uploadFileToBlockBlob
} = require('@azure/storage-blob');
const fs = require("fs");
const path = require("path");

if (process.env.NODE_ENV !== "production") {
  require("dotenv").config();
}

const STORAGE_ACCOUNT_NAME = process.env.AZURE_STORAGE_ACCOUNT_NAME;
const ACCOUNT_ACCESS_KEY = process.env.AZURE_STORAGE_ACCOUNT_ACCESS_KEY;
const ONE_MEGABYTE = 1024 * 1024;
const FOUR_MEGABYTES = 4 * ONE_MEGABYTE;
const ONE_MINUTE = 60 * 1000;

async function showContainerNames(aborter, serviceURL) {
  let response;
  let marker;
  do {
    response = await serviceURL.listContainersSegment(aborter, marker);
    marker = response.marker;
    for (let container of response.containerItems) {
      console.log(` - ${container.name}`);
    }
  } while (marker);
}

async function uploadLocalFile(aborter, containerURL, filePath) {
  filePath = path.resolve(filePath);
  const fileName = path.basename(filePath);
  const blockBlobURL = BlockBlobURL.fromContainerURL(containerURL, fileName);
  return await uploadFileToBlockBlob(aborter, filePath, blockBlobURL);
}

async function uploadStream(aborter, containerURL, filePath) {
  filePath = path.resolve(filePath);
  const fileName = path.basename(filePath).replace('.md', '-stream.md');
  const blockBlobURL = BlockBlobURL.fromContainerURL(containerURL, fileName);
  const stream = fs.createReadStream(filePath, {
    highWaterMark: FOUR_MEGABYTES,
  });
  const uploadOptions = {
    bufferSize: FOUR_MEGABYTES,
    maxBuffers: 5,
  };
  return await uploadStreamToBlockBlob(
    aborter,
    stream,
    blockBlobURL,
    uploadOptions.bufferSize,
    uploadOptions.maxBuffers
  );
}

async function showBlobNames(aborter, containerURL) {
  let response;
  let marker;
  do {
    response = await containerURL.listBlobFlatSegment(aborter, marker);
    marker = response.marker;
    for (let blob of response.segment.blobItems) {
      console.log(` - ${blob.name}`);
    }
  } while (marker);
}

async function execute() {
  const containerName = "demo";
  const blobName = "quickstart.txt";
  const content = "hello!";
  const localFilePath = "./readme.md";
  const credentials = new SharedKeyCredential(STORAGE_ACCOUNT_NAME, ACCOUNT_ACCESS_KEY);
  const pipeline = StorageURL.newPipeline(credentials);
  const serviceURL = new ServiceURL(`https://${STORAGE_ACCOUNT_NAME}.blob.core.windows.net`, pipeline);
  const containerURL = ContainerURL.fromServiceURL(serviceURL, containerName);
  const blockBlobURL = BlockBlobURL.fromContainerURL(containerURL, blobName);
  const aborter = Aborter.timeout(30 * ONE_MINUTE);
  console.log("Containers:");
  await showContainerNames(aborter, serviceURL);
  await containerURL.create(aborter);
  console.log(`Container: "${containerName}" is created`);
  await blockBlobURL.upload(aborter, content, content.length);
  console.log(`Blob "${blobName}" is uploaded`);
  await uploadLocalFile(aborter, containerURL, localFilePath);
  console.log(`Local file "${localFilePath}" is uploaded`);
  await uploadStream(aborter, containerURL, localFilePath);
  console.log(`Local file "${localFilePath}" is uploaded as a stream`);
  console.log(`Blobs in "${containerName}" container:`);
  await showBlobNames(aborter, containerURL);
  const downloadResponse = await blockBlobURL.download(aborter, 0);
  const downloadedContent = downloadResponse.readableStreamBody.read(content.length).toString();
  console.log(`Downloaded blob content: "${downloadedContent}"`);
  await blockBlobURL.delete(aborter);
  console.log(`Block blob "${blobName}" is deleted`);
  await containerURL.delete(aborter);
  console.log(`Container "${containerName}" is deleted`);
}

execute().then(() => console.log("Done")).catch((e) => console.log(e));
Reference
https://github.com/Azure-Samples/azure-storage-js-v10-quickstart
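Note also what the log output in the question already hints at: uploadLocalFile is called without await, so context.log prints Promise { <pending> } and the Functions host can finish before the upload callback ever fires. A minimal fix to the original handler, keeping the azure-storage call unchanged:

// Await the wrapper so the Functions host waits for the upload to complete.
const output = await uploadLocalFile(filePath);
context.log(output); // { message: 'resolved successfully' } instead of a pending Promise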

Cloud functions. The request signature we calculated does not match the signature you provided

I had a task: after users register in the application (registration is through Facebook), save their Facebook avatar to Firebase Storage, since Facebook links only work for a limited period. I implemented the function below, but I get the following error
The request signature we calculated does not match the signature you provided. Check your Google secret key and signing method
when I try to use a link to an image. Please tell me how this can be fixed.
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const gcs = require('@google-cloud/storage')({ keyFilename: "service-account-credentials.json" });
const uuid = require('uuid');
const imageDownloader = require('../lib/Images/image-manager.js');
const path = require('path');
const timestamp = require('unix-timestamp');

module.exports = functions.https.onRequest((req, res) => {
  const token = req.header('X-Auth-MyApp-Token');
  const imageURL = req.body.imagePath;
  const bucketName = functions.config().googlecloud.defaultbacketname;
  const bucket = gcs.bucket(bucketName);
  var userID = '';
  const shortID = uuid.v1();
  const filename = shortID + '.jpg';
  var profileImagePath = '';
  return admin.auth().verifyIdToken(token).then(decodedToken => {
    userID = decodedToken.uid;
    return imageDownloader.downloadImageToLocalDirectory(imageURL, filename);
  }).then(localImagePath => {
    profileImagePath = path.normalize(path.join('userImages', userID, 'profileImages', filename));
    const uploadProm = bucket.upload(localImagePath, {
      destination: profileImagePath,
      uploadType: "media",
      metadata: {
        contentType: 'image/jpeg'
      }
    });
    return uploadProm;
  }).then(() => {
    console.log('success uploaded');
    const config = {
      action: 'read',
      expires: '03-01-2400',
      contentType: 'image/jpeg'
    };
    const userRefPromise = admin.database().ref()
      .child('users')
      .child(userID)
      .once('value');
    const profileImageFile = bucket.file(profileImagePath);
    return Promise.all([profileImageFile.getSignedUrl(config), userRefPromise]);
  }).then(results => {
    const url = results[0][0];
    const userModel = results[1].val();
    const userCheckID = userModel['id'];
    console.log("get url", url);
    // save to database
    const userImagesRef = admin.database().ref().child('userImages')
      .child(userID)
      .child('userProfileImages')
      .push();
    const timeStamp = timestamp.now();
    console.log('timeStamp', timeStamp);
    const imageModelID = userImagesRef.key;
    const userImagesRefPromise = userImagesRef.update({
      'path': url,
      'id': imageModelID,
      'fileName': filename,
      'timeStamp': timeStamp
    });
    const userRef = admin.database().ref()
      .child('users')
      .child(userID)
      .child('currentProfileImage');
    const userRefPromise = userRef.update({
      'path': url,
      'id': imageModelID,
      'fileName': filename,
      'timeStamp': timeStamp
    });
    return Promise.all([userImagesRefPromise, userRefPromise]);
  }).then(() => {
    const successJSON = {};
    successJSON["message"] = "Success operation";
    return res.status(200).send(successJSON);
  }).catch(error => {
    console.log(error);
    const errorJSON = {};
    errorJSON["error"] = error;
    return res.status(error.code).send(errorJSON);
  });
});
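A likely cause, hedged: including contentType in the getSignedUrl config makes the signature require a matching Content-Type header on the request, and a plain browser GET for an image sends none, which produces exactly this SignatureDoesNotMatch error. A minimal sketch of the adjusted config, assuming the link is only used for plain read access (a suggested change, not confirmed against this exact setup):

// Omit contentType when signing a read-only link; the signature then
// matches a bare GET request with no Content-Type header.
const config = {
  action: 'read',
  expires: '03-01-2400',
};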
