Using gridfs to store uploaded file with its metadata in Node / Express - node.js

I know there are a few threads about this, but I couldn't find my exact answer. Using a POST request I have managed to get this file object to the server side:
{ fileToUpload:
   { name: 'resume.pdf',
     data: <Buffer 25 50 44 46 2d 31 2e 33 0a 25 c4 e5 f2 e5 eb a7 f3 a0 d0 c4 c6 0a 34 20 30 20 6f 62 6a 0a 3c 3c 20 2f 4c 65 6e 67 74 68 20 35 20 30 20 52 20 2f 46 69 ... >,
     encoding: '7bit',
     mimetype: 'application/pdf',
     mv: [Function] } }
How do I save this, along with its metadata, using mongoose and gridfs? In most threads I've looked at so far, gridfs-stream was used given a temporary path to the file, which I don't have. Could someone show me how to save this file by streaming the data along with its metadata, and give an example of how I would retrieve it and send it back to the client side?

I must've been tired: I was using express-fileupload as middleware but not using it to save the file, which is done with the mv function on the object. The code below saves the file locally and then streams it into MongoDB using gridfs-stream:
var file = req.files.fileToUpload;
file.mv('./uploads/' + file.name, function (err) {
    if (err) {
        res.status(500).send(err);
    }
    else {
        res.send('File uploaded!');
        // streaming to gridfs
        var gfs = Grid(conn.db);
        // filename to store in mongodb
        var writestream = gfs.createWriteStream({
            filename: file.name
        });
        fs.createReadStream('./uploads/' + file.name).pipe(writestream);
        writestream.on('close', function (file) {
            // do something with `file`
            console.log(file.filename + ' Written To DB');
        });
    }
});
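For completeness, here is a minimal sketch of doing this without the temp file, and of reading the file back out, since the question also asked about retrieval. It assumes the same express-fileupload, mongoose and gridfs-stream setup as above; the route paths and the extra metadata fields (e.g. uploadedBy) are just placeholders, not anything prescribed by the libraries.

var express = require('express');
var mongoose = require('mongoose');
var Grid = require('gridfs-stream');
var fileUpload = require('express-fileupload');

var app = express();
app.use(fileUpload());

// assumes mongoose.connect(...) has been called elsewhere
var conn = mongoose.connection;
Grid.mongo = mongoose.mongo;

// Write the uploaded buffer straight into GridFS, no temp file needed.
app.post('/upload', function (req, res) {
    var file = req.files.fileToUpload;
    var gfs = Grid(conn.db);
    var writestream = gfs.createWriteStream({
        filename: file.name,
        content_type: file.mimetype,
        metadata: { encoding: file.encoding, uploadedBy: 'someUserId' } // any extra metadata you want
    });
    writestream.on('close', function (storedFile) {
        res.send(storedFile.filename + ' written to GridFS');
    });
    writestream.on('error', function (err) {
        res.status(500).send(err);
    });
    writestream.end(file.data); // file.data is the Buffer from express-fileupload
});

// Read it back and stream it to the client.
app.get('/download/:filename', function (req, res) {
    var gfs = Grid(conn.db);
    gfs.files.findOne({ filename: req.params.filename }, function (err, storedFile) {
        if (err || !storedFile) { return res.status(404).send('Not found'); }
        res.set('Content-Type', storedFile.contentType);
        gfs.createReadStream({ _id: storedFile._id }).pipe(res);
    });
});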

Related

Is there an easy way to do a simple file upload using Node/PostgreSQL for any type of file?

I want to store files in an existing PostgreSQL database by uploading them through an Express server.
The file comes into the POST end point like this:
{ name: 'New Data.xlsx',
  data: <Buffer 50 4c 03 04 14 01 06 00 08 00 00 24 21 00 1f 0a 93 21 cf 02 00 00 4f 1f 00 00 13 00 08 02 5b 43 6f 6e 74 65 6e 74 ... >,
  size: 6880975,
  encoding: '7bit',
  tempFilePath: '',
  truncated: false,
  mimetype: 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet',
  md5: '535c8576e1c94d169ea5a637487ee3b4',
  mv: [Function: mv] }
This is a fairly large Excel document. Word docs, PDFs, simple CSVs, etc. also need to be uploadable.
I've tried both node-postgres and sequelize libraries in a similar way:
app.post('/upload', async (req, res) => {
    var { upload } = req.files;
    var data = upload.data;
    console.log(data);
    const x = await pool.query(`insert into attachment (data, file_name, category) values (${data}, '${upload.name}', 'test')`);
    // Deconstruct x to get response values
    res.send("OK");
});
Some files, like txt and plain CSV files, do upload, but I receive errors such as
error: invalid message format
for Excel or Word files.
I've done something like this before with MongoDB but I can't switch databases. Another idea I had was to simply store the files on the production server in an 'uploads' folder, but I'm not sure that's good practice.
Any advice?
Solved it by using query parameters:
app.post('/upload', async (req, res) => {
    var { upload } = req.files;
    var data = upload.data;
    // Pass the buffer and the file name as query parameters instead of
    // interpolating them into the SQL string.
    const x = await pool.query(
        'insert into attachment (data, file_name, category) values ($1, $2, $3)',
        [data, upload.name, 'test']
    );
    res.send("OK");
});
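And a hedged sketch of reading the file back out, assuming the attachment table has an id primary key in addition to the data (bytea) and file_name columns used in the insert above; the route path is just an example:

app.get('/download/:id', async (req, res) => {
    const result = await pool.query(
        'select data, file_name from attachment where id = $1',
        [req.params.id]
    );
    if (result.rows.length === 0) {
        return res.status(404).send('Not found');
    }
    const row = result.rows[0];
    res.setHeader('Content-Disposition', `attachment; filename="${row.file_name}"`);
    res.send(row.data); // node-postgres returns bytea columns as a Buffer
});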

node js uploading image with multer and save to firebase storage

My EJS front-end code is below:
<form action='/powerLink' method='post' enctype='multipart/form-data'>
  <input type='file' name='file'>
  <input type='submit' value='fileupload'>
</form>
and my JS code that receives the file is below:
var storage = firebase.storage().ref("test");
app.post("/powerLink", multer.single('file'), function (req, res) {
    let file = req.file;
    if (file) {
        console.log(file);
        storage.put(file);
    }
});
When I console.log(file), it has a value like below:
{ fieldname: 'file',
  originalname: 'appiicon.png',
  encoding: '7bit',
  mimetype: 'image/png',
  buffer: <Buffer 89 50 4e 47 0d 0a 1a 0a 00 00 00 0d 49 48 44 52 00 00 00 e1 00 00 00 e1 08 06 00 00 00 3e b3 d2 7a 00 00 00 19 74 45 58 74 53 6f 66 74 77 61 72 65 00 ... >,
  size: 15966 }
I thought it would save straight to my storage, creating the folder "test" and putting the image in that folder, but nothing happened.
I can't figure out why the image file isn't being uploaded to Firebase storage.
I've done this before using the Firebase Admin SDK with something like this (TypeScript):
async function uploadFile(file: Express.Multer.File, directory: string, fileName: string): Promise<string> {
    const bucket = firebaseAdmin.storage().bucket();
    const fullPath = `${directory}/${fileName}`;
    const bucketFile = bucket.file(fullPath);
    await bucketFile.save(file.buffer, {
        contentType: file.mimetype,
        gzip: true
    });
    const [url] = await bucketFile.getSignedUrl({
        action: "read",
        expires: "01-01-2050"
    });
    return url;
}
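A rough usage sketch, in plain JavaScript for brevity: wire it into the route with multer's memory storage so req.file.buffer is populated. It assumes firebaseAdmin has already been initialized elsewhere; the route path and error handling are just illustrative.

const upload = multer({ storage: multer.memoryStorage() });

app.post('/powerLink', upload.single('file'), async (req, res) => {
    try {
        // 'test' is a made-up directory name; pass whatever folder you want
        const url = await uploadFile(req.file, 'test', req.file.originalname);
        res.json({ url });
    } catch (err) {
        res.status(500).send(err.message);
    }
});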
I had a related issue. Passing in the file object returned the error: TypeError: Cannot read property 'byteLength' of undefined
Rather than passing in the file object, you should pass in the buffer property like this:
var storage = firebase.storage().ref("test");
app.post("/powerLink", multer.single('file'), function (req, res) {
    var file = req.file;
    if (file) {
        console.log(file);
        let metadata = { contentType: file.mimetype, name: file.originalname };
        storage.put(file.buffer, metadata);
    }
});
After this, I got an XMLHttpRequest error, so I installed the xhr2 module (https://github.com/pwnall/node-xhr2):
npm install xhr2
Then, if you need to access the storage in multiple files, you can add this line to your index/main file:
global.XMLHttpRequest = require("xhr2");

NodeJS TypeError argument should be a Buffer only on Heroku

I am trying to upload an image to store on MongoDB through Mongoose.
I am using multiparty to get the uploaded file.
The code works 100% perfectly on my local machine, but when I deploy it on Heroku, it gives the error:
TypeError: argument should be a Buffer
Here is my code:
exports.create = function (req, res) {
    'use strict';
    var form = new multiparty.Form();
    form.parse(req, function (err, fields, files) {
        var file = files.file[0],
            contentType = file.headers['content-type'],
            body = {};
        _.forEach(fields, function (n, key) {
            var parsedField = Qs.parse(n)['0'];
            try {
                parsedField = JSON.parse(parsedField);
            } catch (err) {}
            body[key] = parsedField;
        });
        console.log(file.path);
        console.log(fs.readFileSync(file.path));
        var news = new News(body);
        news.thumbnail = {
            data: new Buffer(fs.readFileSync(file.path)),
            contentType: contentType
        };
        news.save(function (err) {
            if (err) {
                return handleError(res, err);
            }
            return res.status(201);
        });
    });
};
These are the console logs from the code above on HEROKU:
Sep 26 17:37:23 csgowin app/web.1: /tmp/OlvQLn87yfr7O8MURXFoMyYv.gif
Sep 26 17:37:23 csgowin app/web.1: <Buffer 47 49 46 38 39 61 10 00 10 00 80 00 00 ff ff ff cc cc cc 21 f9 04 00 00 00 00 00 2c 00 00 00 00 10 00 10 00 00 02 1f 8c 6f a0 ab 88 cc dc 81 4b 26 0a ... >
And these are the console logs on my LOCAL MACHINE:
C:\Users\DOLAN~1.ADS\AppData\Local\Temp\TsfwadjjTbJ8iT-OZ3Y1_z3L.gif
<Buffer 47 49 46 38 39 61 5c 00 16 00 d5 36 00 bc d8 e4 fe fe ff ae cf df dc ea f1 fd fe fe db e9 f1 ad ce de 46 5a 71 2b 38 50 90 b8 cc 4a 5f 76 9a c3 d7 8f ... >
Does Heroku need any settings or configurations or something?
It sounds like the object passed is not a buffer when
data: new Buffer(fs.readFileSync(file.path))
is executed. It's probably a difference in how your local environment handles file writes, or it could be how multiparty handles streams.
This code works flawlessly for me:
news.thumbnail = {
    media: fs.createReadStream(fileLocation),
    contentType: contentType
};
But you also have to make sure your file has been saved from the upload before you can use it in the createReadStream call above. Things are inconsistent with Node: sometimes this happens synchronously and sometimes not. I've used Busboy to handle the file upload, since it handles streams and fires a handler when the file stream is complete. Sorry, based on the above I can't tell you exactly where your issue is, so I've included two solutions for you to try :)
Busboy: https://www.npmjs.com/package/busboy
I've used this after the file has been uploaded to the temp directory in Busboy:
// Handles file upload and stores it to a more permanent location.
// This handles streams.
// `request` is given by express.
var busboy = new Busboy({ headers: request.headers });
var writeStream;
busboy.on('file', function (fieldname, file, filename, encoding, mimetype) {
    writeStream = fs.createWriteStream(saveTo);
    file.pipe(writeStream);
    writeStream.on('close', function () {
        // use the file at `saveTo`
    });
});
busboy.on('finish', function () {
    // all fields and files have been parsed
});
request.pipe(busboy);

NodeJS memcache get Buffer output

I'm trying to load data from memcached (EDIT: using the nodejs memcached package) but I keep getting back something like :
{ available_ads: <Buffer 5b 7b 22 69 64 22 3a 37 31 34 31 35 2c 22 74 69 74 6c 65 22 3a 22 44 6f 6c 6c 61 72 53 68 61 76 65 43 6c 75 62 2e 63 6f 6d 22 2c 22 73 75 62 74 69 74 6c ...>, cas: '2' }
I don't know much about memcached so maybe it's because of an invalid character. Here's the output from telnet:
[{"id":71415,"title":"DollarShaveClub.com","subtitle":"","body":"Our Blades Are F***ing Great","click_url":"http:\/\/b.v11media.com\/click?k=eeacbb44ecf52873acd8cfbe63ffdbea&uid=111&ip_list=%5B%2298.143.242.51%22%2C%2210.122.186.15%22%5D&o=71415&src=json","image_url":"http:\/\/img.youtube.com\/vi\/ZUG9qYTJMsI\/0.jpg","cpa":"1","engagement_type":1,"image_width":480,"image_height":360,"currency":"","gender":null,"max_age":null,"min_age":null,"is_vc_ok":true,"width":500,"height":311,"length":94,"is_autoplay":false,"is_responsive":true,"platforms":["web","mobile"],"supported_events":["play","complete"]},{"id":70799,"title":"Watch Settlers Online video!","subtitle":"","body":"Watch the entire video.","click_url":"http:\/\/b.v11media.com\/click?k=eeacbb44ecf52873acd8cfbe63ffdbea&uid=111&ip_list=%5B%2298.143.242.51%22%2C%2210.122.186.15%22%5D&o=70799&src=json","image_url":"http:\/\/img.youtube.com\/vi\/1sZKP0QnIIY\/0.jpg","cpa":"1","engagement_type":1,"image_width":480,"image_height":360,"currency":"","gender":null,"max_age":29,"min_age":21,"is_vc_ok":true,"width":520,"height":325,"length":50,"is_autoplay":false,"is_responsive":true,"platforms":["web","mobile"],"supported_events":["play","complete"]},{"id":70797,"title":"Watch this Samsung Memory - Meet Loading Ball Larry video!","subtitle":"","body":"Please watch the entire video.","click_url":"http:\/\/b.v11media.com\/click?k=eeacbb44ecf52873acd8cfbe63ffdbea&uid=111&ip_list=%5B%2298.143.242.51%22%2C%2210.122.186.15%22%5D&o=70797&src=json","image_url":"http:\/\/s3.amazonaws.com\/V11\/images\/982727944edee812a1a8c874a9abf65a.png","cpa":"1","engagement_type":1,"image_width":125,"image_height":79,"currency":"","gender":null,"max_age":34,"min_age":18,"is_vc_ok":true,"width":500,"height":310,"length":103,"is_autoplay":false,"is_responsive":true,"platforms":["web","mobile"],"supported_events":["play","complete"]},{"id":72006,"title":"Nike: No Cup is Safe","subtitle":"","body":"Please watch the entire video.","click_url":"http:\/\/b.v11media.com\/click?k=eeacbb44ecf52873acd8cfbe63ffdbea&uid=111&ip_list=%5B%2298.143.242.51%22%2C%2210.122.186.15%22%5D&o=72006&src=json","image_url":"http:\/\/img.youtube.com\/vi\/1jRoHGq9EoY\/0.jpg","cpa":"1","engagement_type":1,"image_width":480,"image_height":360,"currency":"","gender":null,"max_age":null,"min_age":null,"is_vc_ok":true,"width":300,"height":250,"length":63,"is_autoplay":false,"is_responsive":false,"platforms":["web"],"supported_events":["play","complete"]},{"id":72049,"title":"Behind the Cameras - by Rolex","subtitle":"","body":"Please watch the entire video.","click_url":"http:\/\/b.v11media.com\/click?k=eeacbb44ecf52873acd8cfbe63ffdbea&uid=111&ip_list=%5B%2298.143.242.51%22%2C%2210.122.186.15%22%5D&o=72049&src=json","image_url":"http:\/\/img.youtube.com\/vi\/k9o1lAq4zTw\/0.jpg","cpa":"1","engagement_type":1,"image_width":480,"image_height":360,"currency":"","gender":null,"max_age":null,"min_age":null,"is_vc_ok":true,"width":520,"height":325,"length":39,"is_autoplay":false,"is_responsive":false,"platforms":["web","mobile"],"supported_events":["play","complete"]},{"id":72055,"title":"Huggies - Meet the Squirmers!","subtitle":"","body":"Please watch the entire 
video.","click_url":"http:\/\/b.v11media.com\/click?k=eeacbb44ecf52873acd8cfbe63ffdbea&uid=111&ip_list=%5B%2298.143.242.51%22%2C%2210.122.186.15%22%5D&o=72055&src=json","image_url":"http:\/\/img.youtube.com\/vi\/tZnzHq3_xaQ\/0.jpg","cpa":"1","engagement_type":1,"image_width":480,"image_height":360,"currency":"","gender":null,"max_age":null,"min_age":null,"is_vc_ok":true,"width":300,"height":250,"length":33,"is_autoplay":false,"is_responsive":false,"platforms":["web","mobile"],"supported_events":["play","complete"]},{"id":70794,"title":"Open Days 2012","subtitle":"","body":"Please watch the entire video.","click_url":"http:\/\/b.v11media.com\/click?k=eeacbb44ecf52873acd8cfbe63ffdbea&uid=111&ip_list=%5B%2298.143.242.51%22%2C%2210.122.186.15%22%5D&o=70794&src=json","image_url":"http:\/\/img.youtube.com\/vi\/t4TnaVXQCa4\/0.jpg","cpa":"1","engagement_type":1,"image_width":480,"image_height":360,"currency":"","gender":"f","max_age":null,"min_age":null,"is_vc_ok":true,"width":520,"height":325,"length":58,"is_autoplay":false,"is_responsive":true,"platforms":["web","mobile"],"supported_events":["play","complete"]},{"id":70795,"title":"Samsung Memory - Loading Ball Larry","subtitle":"","body":"Please watch the entire video.","click_url":"http:\/\/b.v11media.com\/click?k=eeacbb44ecf52873acd8cfbe63ffdbea&uid=111&ip_list=%5B%2298.143.242.51%22%2C%2210.122.186.15%22%5D&o=70795&src=json","image_url":"http:\/\/img.youtube.com\/vi\/a5S668LyM5c\/0.jpg","cpa":"1","engagement_type":1,"image_width":480,"image_height":360,"currency":"","gender":null,"max_age":100,"min_age":30,"is_vc_ok":true,"width":520,"height":325,"length":103,"is_autoplay":false,"is_responsive":true,"platforms":["web","mobile"],"supported_events":["play","complete"]},{"id":71522,"title":"Huggies Snug & Dry","subtitle":"","body":"Watch the dads test out Huggies at the mall","click_url":"http:\/\/b.v11media.com\/click?k=eeacbb44ecf52873acd8cfbe63ffdbea&uid=111&ip_list=%5B%2298.143.242.51%22%2C%2210.122.186.15%22%5D&o=71522&src=json","image_url":"http:\/\/img.youtube.com\/vi\/Kthn6DkQVL4\/0.jpg","cpa":"1","engagement_type":1,"image_width":480,"image_height":360,"currency":"","gender":null,"max_age":null,"min_age":18,"is_vc_ok":true,"width":520,"height":325,"length":33,"is_autoplay":false,"is_responsive":true,"platforms":["web","mobile"],"supported_events":["play","complete"]}]
Can anyone tell me why the nodejs library would output a buffer tag?
EDIT: here's my code
var Memcached = require('memcached');
var memcached = new Memcached('localhost:11211');
memcached.gets('available_ads', function (err, data) {
    if (err) { console.log(err); }
    console.log(data);
    res.json({ 'click_url': data });
});
try using:
data.toString();
and then you'll probably need to use JSON.parse:
JSON.parse(data.toString());
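Note that with gets() the buffer sits under the key name in the result object (alongside cas), as shown in your console output, so applied to your code it would look something like this. The click_url line is just an illustration mirroring your res.json call:

memcached.gets('available_ads', function (err, data) {
    if (err) { return console.log(err); }
    // pull the buffer out of the result object, then decode and parse it
    var ads = JSON.parse(data.available_ads.toString());
    res.json({ 'click_url': ads[0].click_url });
});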
When you store/set the value, you'll need to store it as a string.
Example:
mc.set('keyName', JSON.stringify(thisObj), { expires: 0 }, function (err, val) {
    if (err) {
        callback(err);
    } else {
        callback(null, thisObj);
    }
});
Then when you read it back, just parse it as normal JSON.
Example:
mc.get('keyName', async function (err, data) {
    callback(null, JSON.parse(data));
});

How do I interact with this <File> object in a node.js stream?

I'm using gulp to build a stream of glob-matched files and move them all, in their nested structure, to a new location. To do this, I first wanted to build a simple 'through' stream to see what I get passed if I pipe to it from gulp.src().
Here is my test gulpfile.js:
var gulp = require("gulp");
var through = require("through");
var fs = require("fs");

function write(file) {
    console.log(file);
    console.log(file.toString());
}

gulp.task("move", function () {
    return gulp.src("./**")
        .pipe(through(write));
});
If I run the gulp 'move' task on the command line, I get output like the following:
<File "some/path">
[object Object]
<File "some/path/file.js" <Buffer 2f 2a 0a 0a 4f 72 67 69 6e 61 6c 20 53 74 79 6c 65 20 66 72 6f 6d 20 65 74 68 61 6e 73 63 68 6f 6f 6e 6f 76 65 72 2e 63 6f 6d 2f 73 6f 6c 61 72 69 7a 65 ...>>
[object Object]
What are those objects? How can I interact with them?
Those are vinyl objects. They are the core data type passed through gulp streams. They contain information about the file (such as path info and contents as a buffer or stream). You can see the data better using gulp-debug.
If you want to move a bunch of files, while preserving their relative path, you can do one of the following, no need to dig into the code yourself:
gulp.src('/a/single/src/path/**/*.foo').pipe(gulp.dest('/a/single/dest/path'));
Or, if you have a bunch of different globs:
gulp.src(['/a/src/path/foo/**/*.foo', '/a/src/path/bar/**/*.bar'], {base: '/a/src/path/'})
.pipe(gulp.dest('/a/dest/path/'));
Mostly you'll be using gulp plugins to manipulate the files, then passing the result to gulp.dest(), rather than manipulating them yourself.
If you need to manipulate the files, there are a few plugins that can help:
gulp-tap allows you to peek into the stream, and optionally modify the file or buffer (see the sketch after this list).
vinyl-map lets you easily modify the contents of files
gulp-filter can help you filter the stream if globbing doesn't work.
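For example, here is a small sketch of gulp-tap (assumed to be installed), logging each vinyl file's relative path and rewriting buffer contents in place; the './dest' path and the replace call are just placeholders:

var gulp = require("gulp");
var tap = require("gulp-tap");

gulp.task("move", function () {
    return gulp.src("./**")
        .pipe(tap(function (file) {
            console.log(file.relative);
            if (file.isBuffer()) {
                // modify the contents before they reach gulp.dest()
                file.contents = Buffer.from(file.contents.toString().replace(/foo/g, "bar"));
            }
        }))
        .pipe(gulp.dest("./dest"));
});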
You can view the file properties using this js:
var propValue;
for (var propName in file) {
    propValue = file[propName];
    console.log('name:' + propName, ', value:<<<', propValue, '>>>');
}
Sample Output
name:history , value:"C:\Temp\test.txt"
name:cwd , value:"C:\Temp"
name:base , value:"C:\Temp"
name:_contents , value: full file contents
name:isBuffer , value:"function () {
name:isStream , value:"function () {
name:isNull , value:"function () {
name:isDirectory , value:"function () {
name:clone , value:"function (opt) {
name:pipe , value:"function (stream, opt) {
name:inspect , value:"function () {
name:stat , value:<<< { dev: 0,
mode: 33206,
nlink: 1,
uid: 0,
gid: 0,
rdev: 0,
ino: 0,
size: 874,
atime: Sat Sep 19 2015 14:34:51 GMT+1000 (AUS Eastern Standard Time),
mtime: Sat Sep 19 2015 14:34:51 GMT+1000 (AUS Eastern Standard Time),
ctime: Sat Sep 12 2015 14:59:40 GMT+1000 (AUS Eastern Standard Time) } >>>
Usage:
console.log('file name:', file.relative);
console.log('file current working directory:', file.cwd);
console.log('file isDirectory:', file.isDirectory());
For those who also stumbled upon this and don't want to use gulp, here is how I did it:
Assuming files is an array of vinyl objects -
const path = require('path');
const fs = require('fs');

const outputPath = ... // some directory path

files.forEach(file => {
    const filePath = path.join(outputPath, file.relative);
    // if it's a directory, create it if not already present
    if (file.isDirectory()) {
        if (!fs.existsSync(filePath)) {
            fs.mkdirSync(filePath, { recursive: true });
        }
    } else {
        // if it's a file, save its contents
        fs.writeFileSync(filePath, file.contents);
    }
});
