Blank Image on bucket.upload - node.js

I am attempting to upload an image to my Firebase Storage bucket with Node.js. I've used the following code:
const bucketName = 'mybucket.appspot.com';
// The path to your file to upload
const filePath = 'C:/Users/username/OneDrive/Desktop/testfunctions/dogs.png';
// The new ID for your GCS file
const destFileName = 'your-new-file-name.png';

const {Storage} = require('@google-cloud/storage');

// Creates a client
const storage = new Storage();

async function uploadFile() {
  await storage.bucket(bucketName).upload(filePath, {
    destination: destFileName,
  });
  console.log(`${filePath} uploaded to ${bucketName}`);
}

uploadFile().catch(console.error);
However, after the upload completes, the image fails to load in the Firebase console: it either shows an endless loading spinner, or nothing at all but the file name.
Any help is appreciated.

I wrote this code a while ago, and when searching I saw that nobody had solved your problem.
This method saves the file in Storage and, in addition, creates a download link for it in the Firebase Realtime Database:
// Imports
var admin = require("firebase-admin");
var serviceAccount = require("./serviceAccountKey.json");
const { uuid } = require("uuidv4");
const fs = require("fs");

// This is the important part: tell the Admin SDK the database URL and the
// storage bucket, and connect it to the service account
admin.initializeApp({
  credential: admin.credential.cert(serviceAccount),
  databaseURL: "https://yourfirebase.europe-west1.firebasedatabase.app",
  storageBucket: "gs://yourfirebasestorage.com",
});

// Get a database reference
const db = admin.database();
const ref = db.ref("Parent");
const usersRef = ref.child("Child");

uploadImageToFirebase("imageName", ".png");

// Upload image to Firebase
function uploadImageToFirebase(imageName, imageFormat) {
  var myBucketLink = "yourfirebasestoragewithoutgs.appspot.com";
  // CHANGE: to where you want the root to be
  //const usersRef = ref.child("anotherPath");
  var bucket = admin.storage().bucket();
  var filename = "./" + imageName + imageFormat;
  try {
    if (fs.existsSync(filename)) {
      var UUID = uuid();
      async function uploadFile() {
        const metadata = {
          metadata: {
            // This line is very important: it creates a download token
            firebaseStorageDownloadTokens: UUID,
          },
          contentType: "image/png",
          cacheControl: "public, max-age=31536000",
        };
        // Uploads a local file to the bucket
        await bucket.upload(filename, {
          // Support for HTTP requests made with `Accept-Encoding: gzip`
          gzip: true,
          metadata: metadata,
        });
        console.log(`${filename} uploaded.`);
        var HTTP_ImageLink =
          "https://firebasestorage.googleapis.com/v0/b/" +
          myBucketLink +
          "/o/" +
          imageName +
          imageFormat +
          "?alt=media&token=" +
          UUID;
        console.log(HTTP_ImageLink);
        usersRef.child("images").set({
          link: HTTP_ImageLink,
        });
      }
      uploadFile().catch(console.error);
    } else {
      console.log("Can't Upload");
    }
  } catch (err) {
    console.log(err);
  }
}
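One thing to watch with the hand-assembled download link above: the object path must be URL-encoded, or files stored inside folders produce broken links. A small helper makes this explicit (the bucket name, path, and token below are placeholder values):

```javascript
// Build a Firebase Storage download URL; encodeURIComponent turns the
// '/' in nested object paths into %2F, as the download endpoint expects.
function downloadUrl(bucket, objectPath, token) {
  return (
    "https://firebasestorage.googleapis.com/v0/b/" + bucket +
    "/o/" + encodeURIComponent(objectPath) +
    "?alt=media&token=" + token
  );
}

console.log(downloadUrl("myapp.appspot.com", "images/dog.png", "1234"));
// https://firebasestorage.googleapis.com/v0/b/myapp.appspot.com/o/images%2Fdog.png?alt=media&token=1234
```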

Related

How to process image data from post request in node without express/multer?

I'm trying to submit a POST request with an image to an azure functions endpoint. The endpoint is going to upload the image to azure blob storage. Everything is hooked up, but I'm not sure how to process the form encoded image data to be uploaded to the blob storage account. I'd like to avoid adding express and am looking for alternatives to multer. I've tried consulting the documentation, but it also uses express/multer.
Below is what I have so far for the Azure function. It currently uploads to the Azure storage account, but the data is not correct, since the image cannot be displayed when I download or view it.
import { AzureFunction, Context, HttpRequest } from "@azure/functions";
import { BlobServiceClient, BlobHTTPHeaders, BlockBlobUploadOptions } from "@azure/storage-blob";
import { config } from "../cosmos/config";

export const httpTrigger: AzureFunction = async function (context: Context, req: HttpRequest): Promise<void> {
  const storageConnectionString = config.storageConnectionString;
  // Create the BlobServiceClient object which will be used to create a container client
  const blobServiceClient = BlobServiceClient.fromConnectionString(storageConnectionString);
  // Create a unique name for the container
  const containerName = config.storageContainerName;
  // Get a reference to a container
  const containerClient = blobServiceClient.getContainerClient(containerName);
  const blobName = context.bindingData.imageName;
  // Get a block blob client
  const blockBlobClient = containerClient.getBlockBlobClient(blobName);
  if (req.method === "POST") {
    const content = req.body;
    const blobHeaders: BlobHTTPHeaders = {
      blobContentType: "image/jpeg",
    };
    const options: BlockBlobUploadOptions = { blobHTTPHeaders: blobHeaders };
    const uploadBlobResponse = await blockBlobClient.upload(content, content.length, options);
  }
};
Here is my request from Postman, with the token and the path to the image removed:
curl --location --request POST 'http://localhost:7071/api/images/jaimeLannister.jpg' \
--header 'Cookie: BL_SiteLanguageID=1; __RequestVerificationToken=TOKEN' \
--form 'image=@/C:/jaimeLannister.jpg'
Any help would be appreciated!
As evilSnobu said, you could use parse-multipart from npm, which seems to be a more lightweight alternative to formidable.
Refer to the code as below:
module.exports = async function (context, req) {
  context.log('JavaScript HTTP trigger function processed a request.');
  const streamifier = require('streamifier');
  const multipart = require('parse-multipart');
  const azureStorage = require('azure-storage');
  var bodyBuffer = Buffer.from(req.body);
  var boundary = multipart.getBoundary(req.headers['content-type']);
  var parts = multipart.Parse(bodyBuffer, boundary);
  var filedata = parts[0].data; // Image buffer data
  var filename = parts[0].filename;
  var a = azureStorage.createBlobService('xxxx', 'xxxxxxxxxxxx');
  try {
    var b = a.createBlockBlobFromStream('container', filename, streamifier.createReadStream(new Buffer(filedata)), filedata.length, (err, result) => {
      if (err) {
        console.log("Image upload failed", err);
      }
    });
  } catch (error) {
    console.error(error);
  }
};
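For what it's worth, the boundary extraction that multipart.getBoundary performs is just string parsing of the Content-Type header. A rough, simplified illustration (the header value below is made up; real headers may quote the boundary):

```javascript
// Simplified sketch of pulling the multipart boundary out of a
// Content-Type header; parse-multipart does essentially this.
function getBoundary(contentType) {
  const match = /boundary=(?:"([^"]+)"|([^;]+))/i.exec(contentType);
  return match ? (match[1] || match[2]) : null;
}

console.log(getBoundary("multipart/form-data; boundary=----WebKitFormBoundary7MA4YWxk"));
// ----WebKitFormBoundary7MA4YWxk
```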
I sent the image using Postman, and after uploading it to the portal, the blob content type is application/octet-stream; I can download and view it successfully.
For more details, you could refer to this article.
Update:
The azure-storage library and "new Buffer" are both out of date. Refer to the code using @azure/storage-blob instead:
import { AzureFunction, Context, HttpRequest } from "@azure/functions";
import { config } from "../cosmos/config";
import { BlobServiceClient, BlockBlobUploadStreamOptions } from "@azure/storage-blob";
const streamifier = require("streamifier");
const multipart = require("parse-multipart");

export const httpTrigger: AzureFunction = async function (context: Context, req: HttpRequest): Promise<void> {
  const storageConnectionString = config.storageConnectionString;
  // Create the BlobServiceClient object which will be used to create a container client
  const blobServiceClient = BlobServiceClient.fromConnectionString(storageConnectionString);
  // Create a unique name for the container
  const containerName = config.storageContainerName;
  // Get a reference to a container
  const containerClient = blobServiceClient.getContainerClient(containerName);
  // Gets the file name from the url, could be a bug if you want the filename from the file contents
  const blobName = context.bindingData.imageName;
  // Get a block blob client
  const blockBlobClient = containerClient.getBlockBlobClient(blobName);
  if (req.method === "POST") {
    const bodyBuffer = Buffer.from(req.body);
    const boundary = multipart.getBoundary(req.headers['content-type']);
    const parts = multipart.Parse(bodyBuffer, boundary);
    const filedata = parts[0].data;
    const filename = parts[0].filename;
    // Forward the parsed part's content type so the blob is not stored as application/octet-stream
    const options: BlockBlobUploadStreamOptions = {
      blobHTTPHeaders: { blobContentType: parts[0].type },
    };
    try {
      const result = await blockBlobClient.uploadStream(streamifier.createReadStream(Buffer.from(filedata)), filedata.length, undefined, options);
      context.res = { status: 200 };
      return;
    } catch (err) {
      console.log(err);
    }
  }
  context.res = { status: 302, headers: { "location": blockBlobClient.url }, body: null };
};
Thanks to @Joey Cai's and @evilSnobu's answers, I was able to get it to work using parse-multipart and streamifier.

Read excel file uploaded to s3 via node lambda function

I am trying to parse through an excel file that is uploaded to s3 using read-excel-file in a node lambda function that triggers on any s3 put. Here is my code which currently doesn't work. Can somebody tell me where I am going wrong?
const aws = require("aws-sdk");
const s3 = new aws.S3({ apiVersion: "2006-03-01" });
const readXlsxFile = require("read-excel-file/node");

exports.handler = async (event, context) => {
  // Get the object from the event and show its content type
  const bucket = event.Records[0].s3.bucket.name;
  const key = decodeURIComponent(
    event.Records[0].s3.object.key.replace(/\+/g, " ")
  );
  const params = {
    Bucket: bucket,
    Key: key
  };
  try {
    const doc = await s3.getObject(params);
    const parsedDoc = await readXlsxFile(doc);
    console.log(parsedDoc);
  } catch (err) {
    console.log(err);
    const message = `Error getting object ${key} from bucket ${bucket}. Make sure they exist and your bucket is in the same region as this function.`;
    console.log(message);
    throw new Error(message);
  }
};
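For reference, a common stumbling block in snippets like this is that s3.getObject(params) in aws-sdk v2 returns an AWS.Request rather than a promise, so the response Body never materializes. A hedged sketch of the pattern, with a stand-in client so it runs without AWS credentials (fakeS3 and the bucket/key names are made up):

```javascript
// Sketch: await getObject(...).promise() and hand the Body buffer on;
// read-excel-file/node accepts a Buffer as well as a file path.
async function loadObjectBody(s3, params) {
  const obj = await s3.getObject(params).promise();
  return obj.Body; // a Buffer: pass this to readXlsxFile(...)
}

// Stand-in client mimicking the aws-sdk v2 call shape:
const fakeS3 = {
  getObject: () => ({ promise: async () => ({ Body: Buffer.from("sheet bytes") }) }),
};

loadObjectBody(fakeS3, { Bucket: "my-bucket", Key: "report.xlsx" })
  .then((body) => console.log(body.toString())); // prints: sheet bytes
```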
I haven't used lambda functions, but I have done something very similar in firebase functions. I used convert-excel-to-json.
I first downloaded the excel file from firebase storage to the firebase functions machine. Then use this npm module to extract the information.
I don't have time to format the code, but I can leave it here for reference:
// Imports assumed by this snippet
const functions = require("firebase-functions");
const admin = require("firebase-admin");
const path = require("path");
const os = require("os");
const excelToJson = require("convert-excel-to-json");

// Runs when an excel file is uploaded to storage
exports.uploadOrder = functions.storage.object().onFinalize(async (file) => {
  const fileBucket = file.bucket;
  const filePath = file.name || "null";
  const filePathList = filePath?.split("/") || ["null"];
  const fileName = path.basename(filePath);
  if (filePathList[0] !== "excel_orders") {
    return;
  }
  const uid = filePathList[1];
  console.log("User ID: " + uid);
  const bucket = admin.storage().bucket(fileBucket);
  const tempFilePath = path.join(os.tmpdir(), fileName);
  console.log(tempFilePath);
  await bucket.file(filePath).download({ destination: tempFilePath });
  const result = excelToJson({
    sourceFile: tempFilePath,
  });
  var ordersObj: any[] = result.Sheet1;
  ordersObj.shift();
  console.log(ordersObj);
  var orders: any[] = [];
  for (let i = 0; i < ordersObj.length; i++) {
    const order: Order = {
      package_description: ordersObj[i].A,
      package_type: ordersObj[i].B,
      country: ordersObj[i].C,
      address: ordersObj[i].D,
      curstomer_name: ordersObj[i].E,
      customer_phone: ordersObj[i].F,
      collection_ammount: ordersObj[i].G,
      order_date: ordersObj[i].H,
      delivery_date: ordersObj[i].I,
      delivery_time: ordersObj[i].J,
      status: "pending",
      assignedTo: "",
      merchantID: uid,
    };
    orders.push(order);
  }
});
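As a quick illustration of why the loop above reads ordersObj[i].A, .B, and so on: convert-excel-to-json keys each row object by column letter, one array per sheet. A stand-in for its output (the values are made up):

```javascript
// Stand-in for convert-excel-to-json output: one array per sheet,
// each row keyed by its spreadsheet column letter.
const result = {
  Sheet1: [
    { A: "Description", B: "Type" }, // header row
    { A: "Books", B: "Box" },
  ],
};

const rows = result.Sheet1.slice(1); // drop the header, like ordersObj.shift()
console.log(rows[0].A); // Books
```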

How to upload images and files to Azure Blob Node.js

I have a Node.js application with a frontend in Angular.
I need to upload files and images to Azure Blob Storage.
I have created a container and set up the environment according to the MS documentation (https://learn.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-nodejs), the v12 version.
My function works for creating a file and uploading it to the Azure blob, but I could not figure out how to upload a file posted from the client. Below is my code in Node.js TypeScript:
import * as formidable from 'formidable';
import * as fs from 'fs';
const { BlobServiceClient } = require('@azure/storage-blob');
const uuidv1 = require('uuid/v1');
const dotenv = require('dotenv');
dotenv.config();

class BlobController {
  private AZURE_STORAGE_CONNECTION_STRING = process.env.CONSTRINGBlob;

  constructor(router) {
    router.post('/file', this.uploadFile.bind(this));
  }

  //----Get Lookup tables dynamically-----------//
  async uploadFile(req, res) {
    const blobServiceClient = await BlobServiceClient.fromConnectionString(this.AZURE_STORAGE_CONNECTION_STRING);
    // Create a unique name for the container
    //const containerName = 'quickstart' + uuidv1();
    const containerName = blobServiceClient.getContainerClient('mycontainer');
    console.log('\t', containerName.containerName);
    // Get a reference to a container
    const containerClient = await blobServiceClient.getContainerClient(containerName.containerName);
    let form = new formidable.IncomingForm();
    form.parse(req, async function (err, fields, files) {
      const blobName = 'test' + uuidv1() + files.file;
      // Get a block blob client
      const blockBlobClient = containerClient.getBlockBlobClient(blobName);
      console.log('\nUploading to Azure storage as blob:\n\t', blobName);
      // Upload data to the blob
      const data = 'Hello test';
      const uploadBlobResponse = await blockBlobClient.upload(data, data.length);
      console.log("Blob was uploaded successfully. requestId: ", uploadBlobResponse.requestId);
    });
  }
}
module.exports = BlobController;

Could anyone help me with how to upload posted files to Azure Blob Storage using Node.js?
You were almost there :).
Please change your following code:
form.parse(req, async function (err, fields, files) {
  const blobName = 'test' + uuidv1() + files.file;
  // Get a block blob client
  const blockBlobClient = containerClient.getBlockBlobClient(blobName);
  console.log('\nUploading to Azure storage as blob:\n\t', blobName);
  // Upload data to the blob
  const data = 'Hello test';
  const uploadBlobResponse = await blockBlobClient.upload(data, data.length);
  console.log("Blob was uploaded successfully. requestId: ", uploadBlobResponse.requestId);
});
to:
form.parse(req, async function (err, fields, files) {
  const file = files.file;
  const blobName = 'test' + uuidv1() + file.name;
  const contentType = file.type;
  const filePath = file.path; // This is where you get the file path.
  const blockBlobClient = containerClient.getBlockBlobClient(blobName);
  const uploadBlobResponse = await blockBlobClient.uploadFile(filePath);
});
// Here's my form.parse code I used to upload pictures
// (this callback runs inside a new Promise((resolve, reject) => { ... }) wrapper):
form.parse(req, async (err: any, fields: any, files: any) => {
  const file = files.file;
  const filePath = file.path; // This is where you get the file path (this is the file itself)
  const blobName: string = slugify(file.name);
  const blockBlobClient = containerClient.getBlockBlobClient(blobName);
  const uploadBlobResponse = await blockBlobClient.uploadFile(filePath);
  console.log("Blob was uploaded successfully. requestId: ", uploadBlobResponse);
  if (err) return reject(err);
  // write to DB
  // end write to DB
  resolve(fields);
});
For anyone trying to make use of streams, this worked for me:
import type { NextApiRequest, NextApiResponse } from 'next';
import formidable from 'formidable';
import { PassThrough } from 'stream';

export default async function handler(req: NextApiRequest, res: NextApiResponse) {
  if (req.method == 'POST') {
    const stream = new PassThrough();
    const form = new formidable.IncomingForm({
      fileWriteStreamHandler: () => {
        return stream;
      }
    });
    form.parse(req, (err, fields, files) => {
      if (files) {
        if (files['<form-file-input-name>']) {
          const file = files['<form-file-input-name>'] as formidable.File;
          const mimeType = file.mimetype;
          const extension = file.originalFilename ? file.originalFilename.substring(file.originalFilename.lastIndexOf('.')) : '.csv';
          const newFileName = `<form-file-input-name>-${new Date().toISOString()}${extension}`;
          getFilesBlobContainer().getBlockBlobClient(newFileName).uploadStream(stream, undefined, undefined, {
            blobHTTPHeaders: {
              blobContentType: mimeType,
            },
          });
        }
      }
    });
    return res.status(200).end();
  }
}

export const config = {
  api: {
    bodyParser: false, // Disable NextJS body parsing so formidable can do that itself (fails silently otherwise)
  },
};
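The extension fallback buried in that handler is worth pulling out on its own: formidable's originalFilename can be null, so the code falls back to a default. As a standalone sketch (the '.csv' default simply mirrors the answer's assumption):

```javascript
// Derive the uploaded file's extension, falling back when the client
// did not send an original filename.
function extensionOf(originalFilename, fallback = ".csv") {
  return originalFilename
    ? originalFilename.substring(originalFilename.lastIndexOf("."))
    : fallback;
}

console.log(extensionOf("report.final.xlsx")); // .xlsx
console.log(extensionOf(null)); // .csv
```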

Firebase Storage upload file from URL with the admin SDK [duplicate]

I am trying to find a way to upload a PDF file, generated by a PHP/MySQL server, to my Google Storage bucket. The URL is simple: www.my_domain.com/file.pdf. I tried the code below, but I'm having some issues making it work. The error is: path (fs.createWriteStream(destination)) must be a string or Buffer. Thanks in advance for your help!
const http = require('http');
const fs = require('fs');
const {Storage} = require('@google-cloud/storage');

const gcs = new Storage({
  keyFilename: 'my_keyfile.json'
});

const bucket = gcs.bucket('my_bucket.appspot.com');
const destination = bucket.file('file.pdf');
var theURL = 'https://www.my_domain.com/file.pdf';

var download = function() {
  var file = fs.createWriteStream(destination);
  var request = http.get(theURL, function(response) {
    response.pipe(file);
    file.on('finish', function() {
      console.log("File uploaded to Storage")
      file.close();
    });
  });
}
I finally found a solution:
const https = require('https');
const fs = require('fs');
const os = require('os');
const path = require('path');
const {Storage} = require('@google-cloud/storage');

const gcs = new Storage({
  keyFilename: 'my_keyfile.json'
});

const bucket = gcs.bucket('my_bucket.appspot.com');
const destination = path.join(os.tmpdir(), "file.pdf");
var theURL = 'https://www.my_domain.com/file.pdf';

var download = function () {
  // The URL is https, so use the https module (http.get rejects https URLs)
  var request = https.get(theURL, function (response) {
    if (response.statusCode === 200) {
      var file = fs.createWriteStream(destination);
      response.pipe(file);
      file.on('finish', function () {
        console.log('Pipe OK');
        bucket.upload(destination, {
          destination: "file.pdf"
        }, (err, file) => {
          console.log('File OK on Storage');
        });
        file.close();
      });
    }
  });
}
firebase-admin, as of v7.0.0, uses google-cloud/storage v2.3.0, which no longer accepts file URLs in bucket.upload.
I figured I would share my solution as well.
const rq = require('request');

// filePath = File location on google storage bucket
// fileUrl = URL of the remote file
const bucketFile = bucket.file(filePath);
const fileWriteStream = bucketFile.createWriteStream();
let rqPipe = rq(fileUrl).pipe(fileWriteStream);

// And if you want the file to be publicly readable
rqPipe.on('finish', async () => {
  await bucketFile.makePublic();
});

Node.js Firebase Function sending Base64 image to External API

I’m using Firebase Functions with a Storage trigger in Node.js to send uploaded image data to an external API endpoint where photos are uploaded.
I’m currently taking images uploaded to a bucket in my Firebase Storage, converting them to base64 strings, and plugging them into my dictionary for the request.
My current issue is that the dictionary seems to be cut short. Looking at the logs in the Firebase console, the output ends right after the base64 variable.
I’m not sure whether this is a bug in my syntax, in the way I’m using the base64 string, or in Firebase Functions. If anyone knows what might be going on, please let me know.
const request = require('request-promise');
const functions = require('firebase-functions');
const gcs = require('@google-cloud/storage')();
const path = require('path');
const os = require('os');
const fs = require('fs');
const firebase = require('firebase');

exports.identifyUpdate = functions.storage.object().onFinalize((object) => {
  const fileBucket = object.bucket;
  const filePath = object.name;
  const contentType = object.contentType;
  const fileName = path.basename(filePath);

  if (!filePath.substring(0, filePath.indexOf('/')) == 'updates') {
    console.log("Triggered by non-update photo")
    return null;
  }
  console.log("Update photo added")

  // Create Firebase app (for Realtime Database access)
  var config = {
    apiKey: "[apikey]",
    authDomain: "[PROJECT_ID].firebaseapp.com",
    databaseURL: "https://[PROJECT_ID].firebaseio.com",
    storageBucket: "[PROJECT_ID].appspot.com",
  };
  if (!firebase.apps.length) {
    firebase.initializeApp(config);
  }

  // Trace back to Update stored in Realtime Database
  const database = firebase.database().ref()
  const pendingRef = database.child('pendingUpdates')
  console.log(filePath)
  const splitPath = filePath.split(path.sep)
  const patientID = splitPath[1]
  console.log('Patient ID: ' + patientID)
  const updateID = splitPath[2]
  console.log('Update ID: ' + updateID)
  const updateRef = pendingRef.child(patientID).child(updateID)
  console.log('Found Update reference')
  const photoRef = updateRef.child('photoURLs').child(fileName)
  console.log('Photo Reference: ' + photoRef)

  // Download and convert image to base64
  const bucket = gcs.bucket(fileBucket)
  const tempFilePath = path.join(os.tmpdir(), fileName)
  const metadata = {
    contentType: contentType
  };
  var base64;
  return bucket.file(filePath).download({
    destination: tempFilePath
  }).then(() => {
    console.log('Image downloaded locally to', tempFilePath)
  }).then(() => {
    base64 = base64_encode(tempFilePath)
    console.log("Base 64: " + base64)
  }).then(() => {
    // Send image data to Kairos
    var options = {
      method: 'POST',
      uri: 'https://api.kairos.com/recognize',
      body: {
        'image': base64,
        'gallery_name': 'gallerytest1'
      },
      headers: {
        'app_id': '[id]',
        'app_key': '[key]'
      },
      json: true
    }
    return new Promise(() => {
      console.log(options)
      request(options)
        .then(function (repos) {
          console.log('API call succeeded');
          console.log('Kairos response: ' + repos);
          const apiResult = repos['images']['transaction']['subject_id']
          console.log("Transaction " + JSON.stringify(apiResult))
        })
        .catch(function (err) {
          console.log('API call failed')
        })
    });
  })

  // Delete app instance (to prevent concurrency leaks)
  const deleteApp = () => app.delete().catch(() => null);
  deleteApp.call
})

function base64_encode(file) {
  // read binary data
  var bitmap = fs.readFileSync(file);
  // convert binary data to base64 encoded string (new Buffer(...) is deprecated)
  return Buffer.from(bitmap).toString('base64');
}