Node.js: upload base64 image to Azure Blob Storage using .createBlockBlobFromLocalFile()

I want to upload a user's profile picture, sent from a web app and a mobile app in Base64 form.
On the POST request, the client needs to send a JSON body that looks something like this:
{
  "name": "profile-pic-123.jpg",
  "file": "data:image/jpeg;base64,/9j/4AAQSkZJRgABAQAAAQABAAD/2wCEAAkGBxQTEhIUEhIUFBUV…K9rk8hCAEkjFMUYiEAI+nHIpsQh0AkisDYRTOiCAbWVtgCtI6IlkHh7LDTQXLH0EIQBj//2Q==" // the base64 image
}
Now on the server side, using Node and Express, I used an npm module called azure-storage, which offers a nice way of uploading files to Azure Blob Storage through its web service.
But there's something I cannot understand here. Here's part of the code from my controller. I successfully created all the necessary connections, keys, and whatnot to get a working blobService:
controllers.upload = function(req, res, next) {
  // ...
  // generated a SAS token up here
  // etc.
  // ...
  var uploadOptions = {
    container: 'mycontainer',
    blob: req.body.name, // I'm not sure about this
    path: req.body.file  // I'm not sure about this either
  };
  sharedBlobService.createBlockBlobFromLocalFile(uploadOptions.container, uploadOptions.blob, uploadOptions.path, function(error, result, response) {
    if (error) {
      res.send(error);
    }
    console.log("result", result);
    console.log("response", response);
  });
}
I'm getting this error:
{
  "errno": 34,
  "code": "ENOENT",
  "path": "iVBORw0KGgoAAAANSUhEUgAAABgAAAAYCAIAAAB..."
}

If you use the JavaScript SDK v12, you can use this sample code. It's just that simple. I have this implemented in a function, and it works great whenever an HTTP event triggers it.
index.js
const file = await require('./file')();
const uploadOptions = {
  container: 'mycontainer',
  blob: req.body.name,
  text: req.body.file
};
const fileUploader = await file(uploadOptions.text, uploadOptions.blob, uploadOptions.container);
You can keep your logic in a separate module and call it from the index.js above:
file.js
const { BlobServiceClient } = require("@azure/storage-blob");
const blobServiceClient = BlobServiceClient.fromConnectionString(process.env.AZURE_STORAGE_CONNECTION_STRING);
const Promise = require('bluebird');
module.exports = Promise.method(async function() {
  return async function(data, fileName, container) {
    const containerClient = blobServiceClient.getContainerClient(container);
    const blockBlobClient = containerClient.getBlockBlobClient(fileName);
    // Strip the data URI prefix and decode the base64 payload
    const matches = data.match(/^data:([A-Za-z-+\/]+);base64,(.+)$/);
    const buffer = Buffer.from(matches[2], 'base64');
    return await blockBlobClient.upload(buffer, buffer.byteLength);
  };
});
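If you also need the blob served with the right MIME type, upload accepts an options argument; a minimal sketch, reusing the matches capture from file.js above (option names are from the v12 @azure/storage-blob API):

// Pass the MIME type captured from the data URI so Azure serves the blob
// as an image instead of application/octet-stream.
return await blockBlobClient.upload(buffer, buffer.byteLength, {
  blobHTTPHeaders: { blobContentType: matches[1] }
});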

In this case, you should not use createBlockBlobFromLocalFile. Instead, you should use createBlockBlobFromText, because you are not uploading a local file, but content in the request body.
Here is the code:
var uploadOptions = {
  container: 'mycontainer',
  blob: req.body.name,
  text: req.body.file
};
sharedBlobService.createBlockBlobFromText(
  uploadOptions.container,
  uploadOptions.blob,
  uploadOptions.text,
  {
    contentType: 'image/jpeg',
    contentEncoding: 'base64'
  },
  function(error, result, response) {
    if (error) {
      res.send(error);
    }
    console.log("result", result);
    console.log("response", response);
  });
The blob parameter is just the file name, which is "profile-pic-123.jpg" in this case, and path is the local path to a file. Since you are not storing the file locally on the server side, path is meaningless in this case.
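Note that with contentEncoding: 'base64' the stored blob body is still the base64 text. If you would rather store the decoded bytes, here is a sketch of one alternative (assuming the data:image/...;base64, prefix from the question; createBlockBlobFromText also accepts a Buffer):

// Strip the data URI prefix, decode to raw bytes, then upload the Buffer.
var base64Data = req.body.file.replace(/^data:image\/\w+;base64,/, '');
var buffer = Buffer.from(base64Data, 'base64');
sharedBlobService.createBlockBlobFromText(
  uploadOptions.container,
  uploadOptions.blob,
  buffer,
  { contentType: 'image/jpeg' },
  function(error, result, response) {
    if (error) { return res.send(error); }
    res.json(result);
  });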
If you need more information about Storage, see the azure-storage SDK reference and the Azure Storage documentation.

Related

Sending blob image from Angular to ExpressJS

I'm trying to send a blob image, but I'm getting Error: Unexpected end of form when using multer with the Serverless Framework.
From console.log, the image data is a Blob.
My understanding is that I have to append it to FormData before sending it in the body, but I haven't been able to get the backend to accept the file without crashing:
uploadImage(imageData: File) {
  console.log('IMAGE DATA', imageData);
  let formData = new FormData();
  formData.append('file', imageData, 'file.png');
  let headers = new HttpHeaders();
  headers.append('Content-Type', 'multipart/form-data');
  headers.append('Accept', 'application/json');
  let options = { headers: headers };
  const api = environment.slsLocal + '/add-image';
  const req = new HttpRequest('PUT', api, formData, options);
  return this.http.request(req);
}
backend
const multerMemoryStorage = multer.memoryStorage();
const multerUploadInMemory = multer({
  storage: multerMemoryStorage
});
router.put(
  '/add-image',
  multerUploadInMemory.single('file'),
  async (req, res: Response) => {
    try {
      if (!req.file || !req.file.buffer) {
        throw new Error('File or buffer not found');
      }
      console.log(`Upload Successful!`);
      res.send({
        message: 'file uploaded'
      });
    } catch (e) {
      console.error(`ERROR: ${e.message}`);
      res.status(500).send({
        message: e.message
      });
    }
    console.log(`Upload Successful!`);
    return res.status(200).json({ test: 'success' });
  }
);
app.ts
import cors from 'cors';
import express from 'express';
import routers from './routes';
const app = express();
import bodyParser from 'body-parser';
app.use(cors({ maxAge: 43200 }));
app.use(
  express.json({
    verify: (req: any, res: express.Response, buf: Buffer) => {
      req.rawBody = buf;
    }
  })
);
app.use('/appRoutes', routers.appRouter);
app.use(
  bodyParser.urlencoded({
    extended: true // also tried extended: false
  })
);
export default app;
From my understanding, with the Serverless Framework I have to install
npm i serverless-apigw-binary
and add
apigwBinary:
  types: # list of mime-types
    - 'image/png'
to the custom section of the serverless template YAML file.
The end goal is not to save to storage like S3, but to send the image to Discord.
What am I missing? I appreciate any help!
I recently encountered something similar in a React Native app. I was trying to send a local file to an API but it wasn't working; it turns out you need to convert the blob into a base64 string before sending it. In my app, I took a local file path, converted it into a blob, ran it through a blobToBase64 function, and then called the API with that string. That ended up working for me.
I have this code snippet to help you, but it's TSX, so I don't know if it'll work for Angular.
function blobToBase64(blob: Blob) {
  return new Promise((resolve, reject) => {
    const reader = new FileReader();
    reader.onerror = reject;
    reader.onload = () => {
      resolve(reader.result as string);
    };
    reader.readAsDataURL(blob);
  });
}
Hope this helps!
You can convert your Blob to a File using
new File([blob], "filename")
and then you should be able to pass that file to your existing uploadImage method.
Looks like you are passing a Blob instead of a File, based on your console.log(). So you should convert the Blob to a File before calling the server. You can change your frontend code like this:
uploadImage(imageData: File) {
  // Convert Blob to File
  const file = new File([imageData], "file_name", { type: imageData.type });
  let formData = new FormData();
  formData.append('file', file, 'file.png');
  const api = environment.slsLocal + '/add-image';
  return this.http.put(api, formData);
}
Note: For more info about converting Blob to File, you can check this StackOverflow question.
The thing that got it working for me was this article.
There might be something different about using Express through the Serverless Framework, so things like multer and express-fileupload might not work. Or it could be because it's an AWS Lambda function. I don't know this for sure; I just know I never got them working. That article was the only thing that worked for Serverless Framework + Express.
I also had to install version 0.0.3 of busboy, i.e. npm i busboy@0.0.3. The newer version didn't work; it kept saying Busboy is not a constructor.
Since I'm sending the file to Discord and not S3 like the article does, I had to tweak the parser.event part of the article's handler.ts:
export const uploadImageRoute = async (
  event: any,
  context: Context
): Promise<ProxyResult> => {
  const parsedEvent: any = await parser(event);
  await sendImageToDiscord(parsedEvent.body.file);
  const response = {
    statusCode: 200,
    body: JSON.stringify('file sent successfully')
  };
  return response;
};
The file comes in as a Buffer, which I was able to send as a file like this:
const fs = require('fs-extra');
const cwd = process.cwd();
const { Webhook } = require('discord-webhook-node');
const webhook = new Webhook('<discord-webhook-url>');
export async function sendImageToDiscord(arrayBuffer) {
  var buffer = Buffer.from(arrayBuffer, 'base64');
  const newFileName = 'nodejs.png';
  await fs.writeFile(`./${newFileName}`, buffer, 'utf-8').then(() => {
    webhook.sendFile(`${cwd}/${newFileName}`);
  });
}
I hope this helps someone!

Send Blob file from an HTML form to an Express server so it can be uploaded to the cloud

So I'm trying to make the HTML form:
<form action="blahblah" enctype="multipart/form-data" whatever>
That's not the problem. I need to make that form send the blob to Express:
app.post('/upload/avatars', async (req, res) => {
  const body = req.body;
  console.log(req.file);
  console.log(body);
  res.send(body);
});
So I can access the blob, create a read stream, pipe it to the cloud and, bam, upload the file without saving anything on the Express server itself.
Is that possible?
If yes, please tell me how.
If no, please tell me about alternatives.
On the client we do a basic multipart form upload. This example is set up for a single image, but you could call uploadFile in sequence for each image.
//client.ts
const uploadFile = (file: File | Blob) => {
  const formData = new FormData();
  formData.append("image", file);
  return fetch("/upload", {
    method: "post",
    body: formData,
  });
};
const handleUpload = (event: any) => {
  return event.target.files.length ? uploadFile(event.target.files[0]) : null;
};
On the server we can use multer to read the file without persisting it to disk.
//server.js
const express = require("express");
const app = express();
const multer = require("multer");
const upload = multer();
app.post(
  "/upload",
  upload.fields([{ name: "image", maxCount: 1 }]),
  (req, res, next) => {
    console.log("/upload", req.files);
    if (req.files.image.length) {
      const image = req.files.image[0]; // { buffer, originalname, size, ... }
      // Pipe the image.buffer where you want.
      res.send({ success: true, name: image.originalname });
    } else {
      res.send({ success: false, message: "No files sent." });
    }
  }
);
For larger uploads I recommend socket.io, but this method works for reasonably sized images.
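If you want to pipe that in-memory buffer onward without touching disk, a minimal sketch (destination is a placeholder for whatever writable stream your cloud SDK gives you):

const { Readable } = require("stream");
// Wrap the multer memory buffer in a readable stream so it can be piped;
// Readable.from emits a Buffer as a single chunk.
const imageStream = Readable.from(image.buffer);
imageStream.pipe(destination); // destination: hypothetical cloud writable stream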
It is possible, but when you have a lot of traffic it would overwhelm your Express server (in case you are uploading videos or big files); if it's for uploading small images (profile image, etc.), you're fine. Either way, you can use the multer npm package.
I'd recommend client-side uploading to e.g. an S3 bucket, which returns a link, and then using that link.

S3 getSignedUrl v2 equivalent in AWS JavaScript SDK v3

I just started using aws-sdk in my app to upload files to S3, and I'm debating whether to use aws-sdk v2 or v3.
V2 is the whole package, which is super bloated considering I only need the S3 service, not the myriad of other options. However, the documentation is very cryptic, and I'm having a really hard time getting the equivalent getSignedUrl function to work in v3.
In v2, I have this code to sign the URL and it works fine. I am using Express on the server:
import aws from 'aws-sdk';
const signS3URL = (req, res, next) => {
  const s3 = new aws.S3({ region: 'us-east-2' });
  const { fileName, fileType } = req.query;
  const s3Params = {
    Bucket: process.env.S3_BUCKET,
    Key: fileName,
    ContentType: fileType,
    Expires: 60,
  };
  s3.getSignedUrl('putObject', s3Params, (err, data) => {
    if (err) {
      next(err);
    }
    res.json(data);
  });
}
Now I've been reading documentation and examples trying to get the v3 equivalent to work, but I can't find any working example of how to use it. Here is how I have set it up so far:
import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3';
import { getSignedUrl } from '@aws-sdk/s3-request-presigner';
export const signS3URL = async (req, res, next) => {
  console.log('Sign')
  const { fileName, fileType } = req.query;
  const s3Params = {
    Bucket: process.env.S3_BUCKET,
    Key: fileName,
    ContentType: fileType,
    Expires: 60,
    // ACL: 'public-read'
  };
  const s3 = new S3Client()
  s3.config.region = 'us-east-2'
  const command = new PutObjectCommand(s3Params)
  console.log(command)
  await getSignedUrl(s3, command).then(signature => {
    console.log(signature)
    res.json(signature)
  }).catch(e => next(e))
}
There are some errors in this code, and the first I can identify is how I create the command variable using the PutObjectCommand function provided by the SDK. The documentation does not make clear to me what I need to pass as the "input": https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/clients/client-s3/interfaces/putobjectcommandinput.html
Does anyone with experience using aws-sdk v3 know how to do this?
Also a side question: where can I find the API reference for v2? All I find is the SDK docs that say "v3 now available", and I can't seem to find the v2 reference.
Thanks for your time.
The following code would give you a signedUrl in a JSON body with the key as signedUrl.
const signS3URL = async (req, res, next) => {
  const { fileName, fileType } = req.query;
  const s3Params = {
    Bucket: process.env.S3_BUCKET,
    Key: fileName,
    ContentType: fileType,
    // ACL: 'bucket-owner-full-control'
  };
  const s3 = new S3Client({ region: 'us-east-2' })
  const command = new PutObjectCommand(s3Params);
  try {
    const signedUrl = await getSignedUrl(s3, command, { expiresIn: 60 });
    console.log(signedUrl);
    res.json({ signedUrl })
  } catch (err) {
    console.error(err);
    next(err);
  }
}
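For completeness, a sketch of how a client might consume that signed URL (assuming the same fileType that was signed; because ContentType was part of the signed params, the PUT must send a matching Content-Type header):

// Upload the file using the presigned URL returned by the endpoint above.
const res = await fetch(signedUrl, {
  method: 'PUT',
  headers: { 'Content-Type': fileType }, // must match the signed ContentType
  body: file,
});
if (!res.ok) throw new Error(`Upload failed: ${res.status}`);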
Keep the ACL as bucket-owner-full-control if you want the AWS account owning the Bucket to access the files.
You can get to the API Reference for both JS SDK versions from the AWS documentation.
In reference to the AWS docs and @GSSwain's answer (I cannot comment, I'm new), this link shows multiple getSignedUrl examples.
Below is an upload example copied from the AWS docs:
// Import the required AWS SDK clients and commands for Node.js
import {
  CreateBucketCommand,
  DeleteObjectCommand,
  PutObjectCommand,
  DeleteBucketCommand
} from "@aws-sdk/client-s3";
import { s3Client } from "./libs/s3Client.js"; // Helper function that creates an Amazon S3 service client module.
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";
import fetch from "node-fetch";
// Set parameters
// Create a random name for the Amazon Simple Storage Service (Amazon S3) bucket and key
export const bucketParams = {
  Bucket: `test-bucket-${Math.ceil(Math.random() * 10 ** 10)}`,
  Key: `test-object-${Math.ceil(Math.random() * 10 ** 10)}`,
  Body: "BODY"
};
export const run = async () => {
  try {
    // Create an S3 bucket.
    console.log(`Creating bucket ${bucketParams.Bucket}`);
    await s3Client.send(new CreateBucketCommand({ Bucket: bucketParams.Bucket }));
    console.log(`Waiting for "${bucketParams.Bucket}" bucket creation...`);
  } catch (err) {
    console.log("Error creating bucket", err);
  }
  try {
    // Create a command to put the object in the S3 bucket.
    const command = new PutObjectCommand(bucketParams);
    // Create the presigned URL.
    const signedUrl = await getSignedUrl(s3Client, command, {
      expiresIn: 3600,
    });
    console.log(
      `\nPutting "${bucketParams.Key}" using signedUrl with body "${bucketParams.Body}" in v3`
    );
    console.log(signedUrl);
    const response = await fetch(signedUrl, { method: 'PUT', body: bucketParams.Body });
    console.log(
      `\nResponse returned by signed URL: ${await response.text()}\n`
    );
  } catch (err) {
    console.log("Error creating presigned URL", err);
  }
  try {
    // Delete the object.
    console.log(`\nDeleting object "${bucketParams.Key}" from bucket`);
    await s3Client.send(
      new DeleteObjectCommand({ Bucket: bucketParams.Bucket, Key: bucketParams.Key })
    );
  } catch (err) {
    console.log("Error deleting object", err);
  }
  try {
    // Delete the S3 bucket.
    console.log(`\nDeleting bucket ${bucketParams.Bucket}`);
    await s3Client.send(
      new DeleteBucketCommand({ Bucket: bucketParams.Bucket })
    );
  } catch (err) {
    console.log("Error deleting bucket", err);
  }
};
run();
run();

Upload image to Cloudinary via Lambda using Node.js

I'm working on a serverless web app (Node.js) and have come across a problem which has been blocking me for 2 days.
Scenario:
- User comes to the site and uploads an image directly to our S3 bucket.
- An event is fired and added to the AWS SQS queue, ready for Lambda to pick up.
- Lambda picks up the image key.
- Lambda then connects to Cloudinary (via the SDK).
- Lambda then tries to upload via the SDK call cloudinary.v2.uploader.upload('s3://bucket-name/file-name', (error, result) => { console.log(error, result) });
Problem:
Nothing gets uploaded to Cloudinary, no errors appear in CloudWatch, and console.log from the callback is empty.
The S3 bucket is whitelisted, and I've created the .wellknown/cloudinary/cloud_name file in my S3 bucket.
index.js
import ImageTransform from "./imageTransform";
exports.imageFromQueue = async event => {
  event.Records.forEach(async record => {
    const body = JSON.parse(record.body).Records[0];
    const { eventName, s3 } = body;
    if (eventName === "ObjectCreated:Post") {
      const { key } = s3.object;
      console.log("here", key); // --> this gets printed out
      const data = await ImageTransform(key);
    }
  });
};
imageTransform.js
const cloudinary = require("cloudinary").v2;
cloudinary.config({
  cloud_name: process.env.CLOUDINARY_CLOUD_NAME,
  api_key: process.env.CLOUDINARY_API_KEY,
  api_secret: process.env.CLOUDINARY_API_SECRET
});
const ImageTransform = async imagePath => {
  try {
    const data = await cloudinary.uploader.upload(
      `s3://${process.env.IMAGE_BUCKET}/${imagePath}`
    );
    console.log(data);
    return data;
  } catch (error) {
    console.log("cloudinary - major error", error);
    return { majorError: error };
  }
};
export default ImageTransform; // export added so the default import in index.js resolves
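One thing worth flagging in the handler above: Array.prototype.forEach does not await async callbacks, so the Lambda can return before the Cloudinary upload finishes, which would match the no-errors, no-logs symptom. A minimal sketch of the same handler with the promises collected (an assumption about the cause, not a confirmed fix):

exports.imageFromQueue = async event => {
  // map + Promise.all keeps the Lambda alive until every upload settles.
  await Promise.all(event.Records.map(async record => {
    const body = JSON.parse(record.body).Records[0];
    const { eventName, s3 } = body;
    if (eventName === "ObjectCreated:Post") {
      const data = await ImageTransform(s3.object.key);
      console.log("uploaded", data);
    }
  }));
};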

How to save form data that includes an image using Node.js, posted from Angular 2

I am creating an application using the MEAN stack, in which I am using Angular 2 for the client side. I created a form that contains some input fields and an image. To submit the form, I am using FormData to send the data to the Node server. Now I am unable to show, access, or save the data on the Node server. Please can somebody help me, as I am new to the MEAN stack.
data object:
const newProduct = {
  category: this.pcategory,
  name: this.name,
  description: this.description,
  price: this.price,
  quantity: this.quantity,
  image: this.fileName
}
Here is the code for sending the data (imagedata contains the file's data):
addProduct(newProduct, imagedata: File) {
  let formData: FormData = new FormData();
  formData.append('body', JSON.stringify(newProduct));
  formData.append('file', imagedata, newProduct.image); // appending the imagedata parameter under the newProduct.image file name
  let headers = new Headers();
  headers.append("enctype", "multipart/form-data");
  headers.append("Accept", "application/json");
  let options = new RequestOptions({ headers: headers });
  return this.http.post('http://localhost:3000/product/add', formData, options).map((response: Response) => response.json());
}
Here is the code for receiving and saving the data:
function (req, res) {
  var storage = multer.diskStorage({ // multer's disk storage settings
    destination: function (req, file, callback) {
      callback(null, './uploads');
    }
  });
  var upload = multer({ // multer settings
    storage: storage
  }).any();
  var model = new Model(req.body);
  model.save(function (err) {
    if (err) {
      return res.status(400).send({
        message: errorHandler.getErrorMessage(err)
      });
    } else {
      res.status(201).json(model);
    }
  });
  upload(req, res, function (err) {
    if (err) {
      // An error occurred when uploading
      console.log(err);
      return res.status(422).send("an error occurred")
    }
  });
}
In Angular 2 you cannot upload an image with this approach; consider using the Angular 2 module ng2-file-uploader. You can see a demo app in Angular File Uploads with an Express Backend.
One solution could be to convert your image to a base64 string and pass that string in your model, then convert the base64 string back to an image on the server.
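A minimal sketch of the server half of that idea (an Express route; the image field name and the uploads directory are assumptions, not from the question):

const fs = require('fs');
const path = require('path');

// Expects a JSON body like { ..., image: 'data:image/png;base64,...' }.
// Raise the JSON body limit for large images, e.g. express.json({ limit: '10mb' }).
app.post('/product/add', (req, res) => {
  // Strip the data URI prefix, decode, and write the image to disk.
  const base64Data = req.body.image.replace(/^data:image\/\w+;base64,/, '');
  const buffer = Buffer.from(base64Data, 'base64');
  const filePath = path.join(__dirname, 'uploads', `${Date.now()}.png`);
  fs.writeFile(filePath, buffer, err => {
    if (err) return res.status(500).send(err.message);
    res.status(201).json({ saved: filePath });
  });
});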
