Is there an easy way to do a simple file upload using Node/PostgreSQL for any type of file?

I want to store files in an existing PostgreSQL database by uploading them via an Express server.
The file arrives at the POST endpoint looking like this:
{ name: 'New Data.xlsx',
  data: <Buffer 50 4c 03 04 14 01 06 00 08 00 00 24 21 00 1f 0a 93 21 cf 02 00 00 4f 1f 00 00 13 00 08 02 5b 43 6f 6e 74 65 6e 74 ... >,
  size: 6880975,
  encoding: '7bit',
  tempFilePath: '',
  truncated: false,
  mimetype: 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet',
  md5: '535c8576e1c94d169ea5a637487ee3b4',
  mv: [Function: mv] }
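For context, an object with mv, tempFilePath, and md5 fields like the one above matches what the express-fileupload middleware produces; a minimal sketch of that setup, assuming that's the middleware in use (the question doesn't show it):

const express = require('express');
const fileUpload = require('express-fileupload');

const app = express();
// Populates req.files with { name, data, mimetype, md5, mv, ... } objects
app.use(fileUpload());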
This is a fairly large Excel document. Word documents, PDFs, plain CSVs, etc. also need to be uploadable.
I've tried both the node-postgres and Sequelize libraries in a similar way:
app.post('/upload', async (req, res) => {
  var { upload } = req.files;
  var data = upload.data;
  console.log(data);
  // The buffer is interpolated directly into the SQL text here
  const x = await pool.query(`insert into attachment (data, file_name, category) values (${data}, '${upload.name}', 'test')`);
  // Deconstruct x to get response values
  res.send("OK");
});
Some files, such as .txt and plain CSV files, do upload successfully, but for Excel or Word files I receive errors such as:
error: invalid message format
I've done something like this before with MongoDB, but I can't switch databases. Another idea I had was to simply store the files on the production server in an 'uploads' folder, but I'm not sure that's good practice.
Any advice?

Solved using query parameters. Passing the buffer as a bind parameter lets the driver send it as a proper binary value instead of splicing raw bytes into the SQL text.
app.post('/upload', async (req, res) => {
  var { upload } = req.files;
  var data = upload.data;
  // Binding the file name as well avoids SQL injection through it
  const x = await pool.query('insert into attachment (data, file_name, category) values ($1, $2, $3)', [data, upload.name, 'test']);
  res.send("OK");
});
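For completeness, reading the attachment back out works the same way with a parameterized query; a minimal sketch, assuming the attachment table above with data stored as bytea (the route shape is illustrative):

app.get('/download/:name', async (req, res) => {
  // Keep the lookup parameterized as well
  const result = await pool.query('select data, file_name from attachment where file_name = $1', [req.params.name]);
  if (result.rows.length === 0) return res.sendStatus(404);
  const row = result.rows[0];
  // node-postgres returns bytea columns as Node Buffers
  res.set('Content-Disposition', `attachment; filename="${row.file_name}"`);
  res.send(row.data);
});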

Related

Get the Blob Using Node Multer Buffer And Convert Blob To Base 64

I'm trying to get the Uint8Contents as a Blob to convert to base64 and store as a PostgreSQL bytea, starting from the ArrayBuffer/Buffer provided by the multer middleware for Express.
Most of the answers refer to saving the file to the file system first, but how would you use multer's memory storage? (I've used it this way.)
import { Router, Request, Response } from 'express'
import multer from 'multer'

const storage = multer.memoryStorage()
const upload = multer({ storage: storage })
const api = Router()

api.post('/customer/:customer_id/photo', upload.single('photo'),
  async (req: Request, res: Response) => {
    const customerId = req.params.customer_id
    const photoBuffer = req?.file?.buffer as Buffer
    const arrayBuffer = photoBuffer.buffer.slice(
      photoBuffer.byteOffset,
      photoBuffer.byteOffset + photoBuffer.byteLength
    )
    const uInt8Contents = photoBuffer.readUInt8(photoBuffer.byteOffset)
    console.log("uInt8Contents", uInt8Contents)
    // const arrayBuffer = Uint8Array.from(photoBuffer).buffer
    // const photoBlob = Buffer.from(arrayBuffer).Blob([arrayBuffer])
    console.log("bufferPhoto", arrayBuffer)
    // TODO: need code for converting the ArrayBuffer/Buffer into the correct image Blob
    const base64Photo = Buffer.from(arrayBuffer).toString('base64')
    // Store base64 photo in PgSQL bytea
    // ...
  }
)
I just couldn't figure out how to get the correct Blob to convert to base64 and store in PgSQL as bytea.
So, the question is: on the second-to-last line, how would I convert the file into a Blob?
I get this output, but it does not seem to be the Uint8Contents of the blob, because the image does not display at all.
ArrayBuffer {
  [Uint8Contents]: <ff d8 ff e0 00 10 4a 46 49 46 00 01 01 01 00 48 00 48 00 00 ff e2 02 a0 49 43 43 5f 50 52 4f 46 49 4c 45 00 01 01 00 00 02 90 6c 63 6d 73 04 30 00 00 6d 6e 74 72 52 47 42 20 58 59 5a 20 07 dd 00 0a 00 08 00 17 00 2b 00 36 61 63 73 70 41 50 50 4c 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 ... 2527 more bytes>,
  byteLength: 2627
}
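As an aside, a Node Buffer can be converted to base64 directly, without the ArrayBuffer detour; a minimal sketch, keeping the names from the code above:

// multer's memoryStorage already gives a Buffer over the raw file bytes
const base64Photo = photoBuffer.toString('base64')
// ...and back again when reading it out of the database
const roundTripped = Buffer.from(base64Photo, 'base64')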
I found this issue:
https://github.com/dherault/serverless-offline/issues/464
It was basically a serverless-offline issue: when the server runs in serverless-offline and an image is uploaded, the binary payload gets distorted, so the stored bytes no longer match the original file.
In short, the characters are being changed by serverless-offline. The link, however, suggests that if you deploy the code in a non-serverless-offline environment, it will work.
So, this code will still work when deployed:
import { Router, Request, Response } from 'express'
import multer from 'multer'

const storage = multer.memoryStorage()
const uploadPhoto = multer({ storage: storage }).single('upload')
const api = Router()

api.post('/user/:user_id/photo', uploadPhoto, async (req: Request, res: Response) => {
  const userId = req.params.user_id
  const photoBuffer = req?.file?.buffer as Buffer
  // 'binary' (latin1) keeps the byte-to-character mapping lossless for the round trip
  const binaryPhoto = Buffer.from(photoBuffer).toString('binary')
  const base64Photo = Buffer.from(binaryPhoto, 'binary').toString('base64')
  console.log(base64Photo) // Save this base64Photo as bytea
})
Otherwise, configure serverless.yml to support binary content using the aws-serverless-express-binary plugin.
Edit
We deployed it in a (non-serverless-offline) environment and it seems to work correctly.
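Note that PostgreSQL's bytea can store the raw buffer directly; base64 is only needed if the column is text. A minimal sketch with node-postgres, assuming a pool instance and a photos table with a bytea column named data (names are illustrative):

// Parameterized insert: node-postgres sends a Buffer to a bytea column as-is
await pool.query(
  'insert into photos (user_id, data) values ($1, $2)',
  [userId, photoBuffer]
)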

node js uploading image with multer and save to firebase storage

My EJS front-end code is as follows:
<form action='/powerLink' method='post' enctype='multipart/form-data'>
  <input type='file' name='file'>
  <input type='submit' value='fileupload'>
</form>
and my JS code that receives the file is as follows:
var storage = firebase.storage().ref("test");
app.post("/powerLink", multer.single('file'), function (req, res) {
  let file = req.file;
  if (file) {
    console.log(file);
    storage.put(file);
  }
});
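(For multer.single('file') to be usable like this, multer has to be an instance rather than the bare module; a minimal setup sketch, assumed rather than shown in the question:)

var Multer = require('multer');
// With no dest/storage option, multer keeps uploads in memory as req.file.buffer
var multer = Multer();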
When I console.log(file), it prints a value like this:
{ fieldname: 'file',
  originalname: 'appiicon.png',
  encoding: '7bit',
  mimetype: 'image/png',
  buffer: <Buffer 89 50 4e 47 0d 0a 1a 0a 00 00 00 0d 49 48 44 52 00 00 00 e1 00 00 00 e1 08 06 00 00 00 3e b3 d2 7a 00 00 00 19 74 45 58 74 53 6f 66 74 77 61 72 65 00 ... >,
  size: 15966 }
I thought this would write to my storage, creating the folder "test" and saving the image inside it, but nothing happened.
I can't figure out why the image file isn't being uploaded to Firebase storage.
I've done this before using the Firebase Admin SDK with something like this (TypeScript):
async function uploadFile(file: Express.Multer.File, directory: string, fileName: string): Promise<string> {
  const bucket = firebaseAdmin.storage().bucket();
  const fullPath = `${directory}/${fileName}`;
  const bucketFile = bucket.file(fullPath);
  await bucketFile.save(file.buffer, {
    contentType: file.mimetype,
    gzip: true
  });
  const [url] = await bucketFile.getSignedUrl({
    action: "read",
    expires: "01-01-2050"
  });
  return url;
}
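Called from the question's route, that might look like this (a sketch; the directory name and response shape are illustrative):

app.post('/powerLink', multer.single('file'), async function (req, res) {
  // uploadFile is the helper above; 'test' mirrors the folder used in the question
  const url = await uploadFile(req.file, 'test', req.file.originalname);
  res.send(url);
});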
I had a related issue. Passing in the file object returned the error: TypeError: Cannot read property 'byteLength' of undefined
Rather than passing in the file object, you should pass in the buffer property like this:
var storage = firebase.storage().ref("test");
app.post("/powerLink", multer.single('file'), function (req, res) {
  var file = req.file;
  if (file) {
    console.log(file);
    let metadata = { contentType: file.mimetype, name: file.originalname };
    // put() accepts raw bytes (Buffer/Uint8Array), not the whole multer file object
    storage.put(file.buffer, metadata);
  }
});
After this, I got an XMLHttpRequest error, so I installed the xhr2 module: https://github.com/pwnall/node-xhr2
npm install xhr2
Then, if you need to access the storage from multiple files, you can add this line in your index/main file:
global.XMLHttpRequest = require("xhr2");

Using a Stream with Sharp in Node.js

I have a Node.js app. In this app, I'm loading an image file from Azure Storage using its API; specifically, I'm using the createReadStream function, which provides a stream to read from the image. Right now, my code looks like this:
let sharp = require('sharp');

let chunks = [];
let fileStream = this.fileService.createReadStream('[MY_SHARE]', '[MY_DIRECTORY]', '[MY_PICTURE_NAME]', (err, fileResult, res) => {
  console.log('Sharpening image...');
  console.log(chunks);
  sharp(chunks).resize(100, 100).toFile('resized.png', function (err1, info) {
    if (err1) {
      console.log('oops');
    }
  });
});
fileStream.on('data', function (chunk) {
  chunks.push(chunk);
});
In this block, notice that I'm trying to use the Sharp node module. Before the line sharp(chunks)..., I'm printing out image chunks to the console. When printed in the console, I see the following:
[ <Buffer 87 52 4e 47 0d 0a 1a 0a 01 02 04 0d 49 48 44 52 00 00 06 18 00 00 06 0f 08 06 00 00 00 75 c2 f0 a2 00 00 00 04 67 42 4d 41 00 00 b2 8f 0a fc 61 05 00 ... > ]
However, when I call Sharp, I get an error that says:
Error: Unsupported input object
According to the docs, the Sharp Constructor allows for either a String or a Buffer. Looking at what's printed to the console as shown above, it looks like I'm passing a Buffer to Sharp. Am I misunderstanding something? If so, what?
Please change the following line of code:
sharp(chunks).resize(100, 100).toFile('resized.png', function (err1, info) {
  if (err1) {
    console.log('oops');
  }
});
to
sharp(Buffer.concat(chunks)).resize(100, 100).toFile('resized.png', function (err1, info) {
  if (err1) {
    console.log('oops');
    console.log(err1);
  }
});
and that should fix the problem. If I am not mistaken, chunks is actually an array of Buffers, whereas the sharp library expects a single Buffer, so we need to combine all the array elements into one buffer with Buffer.concat.
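Alternatively, since sharp instances are duplex streams, you could skip the manual buffering entirely and pipe the file stream straight through the resizer; a minimal sketch (not from the original answer, and assuming createReadStream works without the callback):

let sharp = require('sharp');

// sharp() with no input returns a transform stream you can pipe into
let transformer = sharp().resize(100, 100);
let fileStream = this.fileService.createReadStream('[MY_SHARE]', '[MY_DIRECTORY]', '[MY_PICTURE_NAME]');
fileStream.pipe(transformer).toFile('resized.png', function (err, info) {
  if (err) {
    console.log(err);
  }
});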

Using gridfs to store uploaded file with its metadata in Node / Express

I know there are a few threads about this, but I couldn't find my answer exactly. Using POST, I have managed to get this file object to the server side:
{ fileToUpload:
   { name: 'resume.pdf',
     data: <Buffer 25 50 44 46 2d 31 2e 33 0a 25 c4 e5 f2 e5 eb a7 f3 a0 d0 c4 c6 0a 34 20 30 20 6f 62 6a 0a 3c 3c 20 2f 4c 65 6e 67 74 68 20 35 20 30 20 52 20 2f 46 69 ... >,
     encoding: '7bit',
     mimetype: 'application/pdf',
     mv: [Function] } }
How do I save this along with its metadata using Mongoose and GridFS? In most threads I've looked at so far, gridfs-stream was used with a temporary file path, which I don't have. Could someone help me save this file by streaming the data along with its metadata, and give an example of how I would retrieve it and send it back to the client side?
I must've been tired: I was using express-fileupload as middleware but never used it to actually save the file, which is done with the mv function on the object. The code below saves the file locally and then streams it into Mongo using gridfs-stream:
var file = req.files.fileToUpload;
file.mv('./uploads/' + file.name, function (err) {
  if (err) {
    res.status(500).send(err);
  }
  else {
    res.send('File uploaded!');
    var gfs = Grid(conn.db);
    // streaming to gridfs
    // filename to store in mongodb; gridfs-stream also accepts
    // content_type and a metadata object here
    var writestream = gfs.createWriteStream({
      filename: file.name,
      content_type: file.mimetype,
      metadata: { encoding: file.encoding }
    });
    fs.createReadStream('./uploads/' + file.name).pipe(writestream);
    writestream.on('close', function (file) {
      // do something with `file`
      console.log(file.filename + ' Written To DB');
    });
  }
});
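For the retrieval half of the question, a minimal sketch using gridfs-stream's read stream piped straight to the response (the route shape and Content-Type handling are illustrative):

app.get('/download/:filename', function (req, res) {
  var gfs = Grid(conn.db);
  // Look the file up by the filename it was stored under
  var readstream = gfs.createReadStream({ filename: req.params.filename });
  res.set('Content-Type', 'application/pdf'); // or read it back from the stored content_type
  readstream.pipe(res);
});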

NodeJS TypeError argument should be a Buffer only on Heroku

I am trying to upload an image to store on MongoDB through Mongoose.
I am using multiparty to get the uploaded file.
The code works 100% perfectly on my local machine, but when I deploy it on Heroku, it gives the error:
TypeError: argument should be a Buffer
Here is my code:
exports.create = function (req, res) {
  'use strict';
  var form = new multiparty.Form();
  form.parse(req, function (err, fields, files) {
    var file = files.file[0],
      contentType = file.headers['content-type'],
      body = {};
    _.forEach(fields, function (n, key) {
      var parsedField = Qs.parse(n)['0'];
      try {
        parsedField = JSON.parse(parsedField);
      } catch (err) {}
      body[key] = parsedField;
    });
    console.log(file.path);
    console.log(fs.readFileSync(file.path));
    var news = new News(body);
    news.thumbnail = {
      data: new Buffer(fs.readFileSync(file.path)),
      contentType: contentType
    };
    news.save(function (err) {
      if (err) {
        return handleError(res, err);
      }
      return res.status(201);
    });
  });
};
These are the console logs from the above code on Heroku:
Sep 26 17:37:23 csgowin app/web.1: /tmp/OlvQLn87yfr7O8MURXFoMyYv.gif
Sep 26 17:37:23 csgowin app/web.1: <Buffer 47 49 46 38 39 61 10 00 10 00 80 00 00 ff ff ff cc cc cc 21 f9 04 00 00 00 00 00 2c 00 00 00 00 10 00 10 00 00 02 1f 8c 6f a0 ab 88 cc dc 81 4b 26 0a ... >
These are the console logs on my local machine:
C:\Users\DOLAN~1.ADS\AppData\Local\Temp\TsfwadjjTbJ8iT-OZ3Y1_z3L.gif
<Buffer 47 49 46 38 39 61 5c 00 16 00 d5 36 00 bc d8 e4 fe fe ff ae cf df dc ea f1 fd fe fe db e9 f1 ad ce de 46 5a 71 2b 38 50 90 b8 cc 4a 5f 76 9a c3 d7 8f ... >
Does Heroku need any settings or configurations or something?
It sounds like the object passed is not a buffer when
data: new Buffer(fs.readFileSync(file.path)) is executed. It's probably a difference in how your local environment handles file writes, or in how multiparty handles streams.
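One quick way to narrow this down is to check what readFileSync actually returns on Heroku before wrapping it (a debugging sketch, not from the original answer):

var contents = fs.readFileSync(file.path);
// Buffer.isBuffer tells you whether the read actually produced a Buffer
console.log('is buffer:', Buffer.isBuffer(contents), 'length:', contents && contents.length);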
This code works flawlessly for me:
news.thumbnail = {
  media: fs.createReadStream(fileLocation),
  contentType: contentType
};
But you also have to make sure the file has finished saving from the upload before you use it in the createReadStream call above. Things are inconsistent in Node: sometimes this happens synchronously and sometimes not. I've used Busboy to handle the file upload, since it handles streams and fires a handler when the file stream is complete. Sorry, based on the above I can't tell you exactly where your issue is, so I've included two solutions for you to try :)
Busboy: https://www.npmjs.com/package/busboy
Ive used this after the file has been uploaded to the temp directory in busboy:
//Handles file upload and stores to a more permanent location.:
//This handles streams.
// request is given by express.
var busboy = new Busboy({ headers: request.headers });
var writeStream;
busboy.on('file', function(fieldname, file, filename, encoding, mimetype) {
writeStream = file.pipe(fs.createWriteStream(saveTo));
})
.on('finish', function() {
writeStream = file.pipe(fs.createWriteStream(saveTo));
writeStream.on('close', function(){
//use the file
});
});
