How to upload multiple files inside MongoDB using Node.js

Hello friends, I am new to Node.js and MongoDB. Is it possible to upload multiple files inside a single MongoDB document along with other information?

You can make use of BSON (Binary JSON) to store files in MongoDB collections. However, MongoDB documents have a 16 MB size limit. If you are planning to store files bigger than that, consider GridFS.
You can write files to MongoDB like so in Node.js:

var mongodb = require('mongodb');
var fs = require('fs');
var Binary = mongodb.Binary;

// Read the file that you want to store
var file_data = fs.readFileSync(file_path);

var db_doc = {};
db_doc.file_data = new Binary(file_data);

// db is obtained from MongoClient.connect
var my_collection = db.collection('files');
my_collection.insert(db_doc, function (err, result) {
  // more code..
});
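For the multiple-files part of the question, you can put several Binary fields, or an array of them, in one document alongside the other information. A minimal sketch in the same callback style; the file paths, field names, and 'applicants' collection are assumptions, not part of the original answer:

var mongodb = require('mongodb');
var fs = require('fs');
var Binary = mongodb.Binary;

mongodb.MongoClient.connect('mongodb://127.0.0.1:27017', function (err, client) {
  if (err) return console.log(err);
  var db = client.db('db');

  // Hypothetical file paths and collection name, for illustration only
  var doc = {
    name: 'John Doe',                                 // other information
    files: ['./a.pdf', './b.pdf'].map(function (p) {  // multiple files in one doc
      return { name: p, data: new Binary(fs.readFileSync(p)) };
    })
  };

  // The combined document must stay under the 16 MB limit
  db.collection('applicants').insertOne(doc, function (err, result) {
    if (err) console.log(err);
    client.close();
  });
});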

Related

Storing and downloading PDF files from MongoDB using Node.js

I am using Node.js, Mongoose, and EJS.
I want the user to be able to upload his CV (less than 16 MB) as a PDF file, and then the admin to be able to download the PDF or view the CV on the website.
I did the upload part, where I stored the CV like this in the database (which might be the wrong way):
cv: Object {
  fileName: "Test Results All in testing-homework.pdf",
  filePath: "cvs\1650985306448-.pdf",
  fileType: "application/pdf"
}
The PDF file itself is stored in the "/cvs" folder after upload.
I'm seeking a way to download/view the PDF file from the database.
I suggest you use GridFS from MongoDB.
Refer: https://www.mongodb.com/docs/drivers/node/current/fundamentals/gridfs/
To Download from DB

bucket.openDownloadStreamByName('your_file_name')
  .pipe(fs.createWriteStream('./desired_name_of_output_file'));
Some of the Basic Upload and Download Operations using GridFS in MongoDB

const mongoUrl = 'mongodb://127.0.0.1:27017/db';
const mongodb = require('mongodb');
const fs = require('node:fs');

const client = new mongodb.MongoClient(mongoUrl);
const db = client.db('db');
Create Bucket with user-defined name

const bucket = new mongodb.GridFSBucket(db, { bucketName: 'name_of_bucket' });
Upload to DB

fs.createReadStream('./filename.pdf')
  .pipe(bucket.openUploadStream('filename', {
    chunkSizeBytes: 1048576,
    metadata: { field: 'filename', value: 'myValue' }
  }));
Find

const cursor = bucket.find({});
cursor.forEach(doc => console.log(doc));

Delete

// A cursor is exhausted after iteration, so build a fresh one for the delete pass
bucket.find({}).forEach(doc => bucket.delete(doc._id));
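To let the admin view or download the stored PDF over HTTP (the other half of the question), you can pipe the GridFS download stream straight into the response. A minimal Express sketch; the 'cvs' bucket name and the route are assumptions, not part of the original answer:

const express = require('express');
const mongodb = require('mongodb');

const app = express();
const client = new mongodb.MongoClient('mongodb://127.0.0.1:27017/db');
const bucket = new mongodb.GridFSBucket(client.db('db'), { bucketName: 'cvs' });

app.get('/cv/:filename', (req, res) => {
  // 'application/pdf' lets the browser render the file inline;
  // send Content-Disposition: attachment to force a download instead
  res.set('Content-Type', 'application/pdf');
  bucket.openDownloadStreamByName(req.params.filename)
    .on('error', () => res.status(404).end())
    .pipe(res);
});

app.listen(3000);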

Streaming a large CSV file into a MongoDB database using Mongoose

Searching for an efficient and quick way to stream a large CSV file (10 million lines) into a MongoDB database using Mongoose.
The problems that arise are dealing with streaming instead of importing, which could be solved with fs.createReadStream (although I am still learning how to use it), and how to insert that large amount of data into MongoDB using Mongoose, because overloading Mongoose/Mongo with insert requests could lead to errors.
You simply need the 'stream-to-mongo-db' and 'csvtojson' npm libraries.
Here is the example code I use to dump millions of records from big CSV files. It just works!
const fs = require('fs');
const csv = require('csvtojson');
const streamToMongoDB = require('stream-to-mongo-db').streamToMongoDB;

const csvFile = './bigFiles.csv';
const dbURL = 'mongodb://localhost/tweets';
const collection = 'tweets';

fs.createReadStream(csvFile)
  .pipe(csv())
  .pipe(streamToMongoDB({ dbURL: dbURL, collection: collection }));
There is an insertMany() method in Mongoose, but it only lets you insert a limited batch of documents at once (around 10,000), so my solution is to loop asynchronously using that method and insert batch by batch until the stream finishes.
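A minimal sketch of that batching approach, assuming csvtojson for the parsing and a hypothetical schema-less Tweet model (the model, connection URL, and batch size are assumptions):

const fs = require('fs');
const csv = require('csvtojson');
const mongoose = require('mongoose');

// Hypothetical model; replace the schema with your CSV columns
const Tweet = mongoose.model('Tweet', new mongoose.Schema({}, { strict: false }));

async function importCsv(path, batchSize = 10000) {
  await mongoose.connect('mongodb://localhost/tweets');
  let batch = [];
  // csvtojson calls subscribe() once per parsed row and waits for the
  // returned promise, which throttles parsing while a batch is inserted
  await csv().fromStream(fs.createReadStream(path)).subscribe(async (row) => {
    batch.push(row);
    if (batch.length >= batchSize) {
      await Tweet.insertMany(batch, { ordered: false });
      batch = [];
    }
  });
  if (batch.length > 0) await Tweet.insertMany(batch, { ordered: false });
  await mongoose.disconnect();
}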

MongoDB GridFS uploads zero-length file when using Node and request

I'm trying to save a file from a URL to a Mongo GridFS file. I'm using the request npm module, and the official mongo driver.
This is my code:
function retrieveAttachments(NormalizedMessage) {
  NormalizedMessage.attachments.forEach(function (currentAttachment) {
    var bucket = new GridFSBucket(db);
    var httpRequestStream = require('request');
    httpRequestStream.get(currentAttachment.url)
      .on('error', function (err) {
        console.log(err);
      })
      .pipe(bucket.openUploadStream('dsa'));
  });
}
"db" is obtained from MongoClient.connect, and GridFSBucket = require('mongodb').GridFSBucket;
The problem I have is that the file is inserted in mongo (I can see it in the fs.files collection), but it's length is zero.
If i also add a .on('finish') event to the upload stream, it gets called.
If I replace bucket.openUploadStream('dsa') with fs.createWriteStream('someFile'), the HTTP request is saved to someFile in the local filesystem.
If I replace httpRequestStream.get(currentAttachment.url) with fs.createReadStream("test.jpg"), the file test.jpg is successfully uploaded to GridFS.

Import data from a CSV file into a MongoDB collection with Meteor

I would like to get data from a CSV file into a MongoDB collection using Meteor.js, and I would be grateful for any help.
You can use papa-parse and read the CSV file using the Node file system like this:

var fs = Npm.require('fs');

// Assume that the csv file is in the yourApp/public/data folder
var data = fs.readFileSync(process.env.PWD + '/public/data/yourCSV.csv', 'utf8');
var usersData = Papa.parse(data, { header: true });

usersData will be in JSON format, and you can store it in MongoDB as you want.
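A minimal sketch of storing the parsed rows, assuming a Meteor collection named Users (the collection name is an assumption):

// Hypothetical collection, for illustration
var Users = new Mongo.Collection('users');

// Papa.parse with {header: true} returns {data, errors, meta};
// each entry in data is one row keyed by the CSV header names
usersData.data.forEach(function (row) {
  Users.insert(row);
});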
Alternatively, csv-parse can be used to parse CSV files, and loading the MongoDB collection can then be done via the upsert method of a Meteor collection.
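A hedged sketch of that csv-parse plus upsert approach, assuming a collection named Items and a CSV with a unique id column (the collection name, file path, and key column are assumptions; the 'csv-parse/sync' entry point is the csv-parse v5 API):

var fs = Npm.require('fs');
var parse = Npm.require('csv-parse/sync').parse;

var Items = new Mongo.Collection('items');

var records = parse(fs.readFileSync(process.env.PWD + '/public/data/items.csv', 'utf8'), {
  columns: true   // use the header row as object keys
});

records.forEach(function (row) {
  // Upsert keyed on a unique CSV column so re-imports don't duplicate rows
  Items.upsert({ externalId: row.id }, { $set: row });
});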

How to get the size of a single MongoDB document in bytes using Node.js?

I'm fetching a document using db.coll.findOne. I want to get the size of the document in bytes using Node.js with only the mongo native driver. Is it possible?
It is possible using BSON (it's a dependency of the mongodb driver):

var bson = require('bson');

db.coll.findOne(query).then(function (doc) {
  // then() receives only the resolved document, not an (err, doc) pair
  var size = bson.calculateObjectSize(doc);
  console.log(size + ' bytes');
});

Note: older bson versions exposed this as new bson.BSONPure.BSON().calculateObjectSize(doc); newer versions export calculateObjectSize directly.
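If your server is MongoDB 4.4 or newer, you can also have the server compute the size with the $bsonSize aggregation operator instead of serializing the document client-side; a minimal sketch, assuming the same collection and query:

db.collection('coll').aggregate([
  { $match: query },
  { $limit: 1 },
  // $bsonSize (MongoDB 4.4+) returns the size of a document in bytes
  { $project: { _id: 0, size: { $bsonSize: '$$ROOT' } } }
]).next().then(function (result) {
  console.log(result.size + ' bytes');
});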
