Import data from a CSV file into a MongoDB collection with Meteor

I would like to get data from a CSV file into a MongoDB collection using Meteor.js and would be grateful for any help.

You can use Papa Parse and read the CSV file with the Node file system like this:
var fs = Npm.require('fs');
// Assume that the csv file is in yourApp/public/data folder
var data = fs.readFileSync(process.env.PWD + '/public/data/yourCSV.csv', 'utf8');
var usersData = Papa.parse(data, {header: true});
usersData will hold the parsed result; usersData.data is an array of row objects that you can store in MongoDB however you want.

Alternatively, csv-parse can be used to parse CSV files. Loading the MongoDB collection can then be done via the upsert method of a Meteor collection.
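For example, a minimal sketch combining the two answers, assuming a Meteor collection named Assets and a CSV whose header row contains a unique assetId column (both names are placeholders):

var fs = Npm.require('fs');
// Parse the CSV shipped in yourApp/public/data into an array of row objects
var data = fs.readFileSync(process.env.PWD + '/public/data/yourCSV.csv', 'utf8');
var rows = Papa.parse(data, {header: true}).data;
rows.forEach(function (row) {
  // Upsert keyed on assetId so re-running the import does not create duplicates
  Assets.upsert({assetId: row.assetId}, {$set: row});
});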

Related

Read all JSON files contained in a dynamically updated folder

I've got multiple json files contained within a directory that will dynamically be updated by users. The users can add categories which will create new json files in that directory, and they can also remove categories which would delete json files in that directory. I'm looking for a method to read all json files contained in that folder directory, and push all the json files into a single object array. I imagine asynchronously would be desirable too.
I'm very new to using fs. I've managed to read single json files by directory using
const fs = require('fs');
let data = fs.readFileSync('./sw_lbi/categories/category1.json');
let categories = JSON.parse(data);
console.log(categories);
But of course this only solves the synchronous issue of using require().
As I'll have no idea what json files will be contained in the directory because the users will also name them, I'll need a way to read all the json files by simply calling the folder directory which contains them.
I'm imagining something like this (which obviously is foolish)
const fs = require('fs');
let data = fs.readFileSync('./sw_lbi/categories');
let categories = JSON.parse(data);
console.log(categories);
What would be the best approach to achieve this?
Thanks in advance.
First of all you need to scan the directory for files, then filter them to keep only the JSON files, and finally read every file and do what you need to do with it:
const fs = require('fs');
const path = require('path');

// Keep only the .json files in the directory
const jsonsInDir = fs.readdirSync('./sw_lbi/categories').filter(file => path.extname(file) === '.json');

// Parse each file and push the result into a single array
const categories = [];
jsonsInDir.forEach(file => {
  const fileData = fs.readFileSync(path.join('./sw_lbi/categories', file));
  categories.push(JSON.parse(fileData.toString()));
});
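If you prefer the asynchronous approach mentioned in the question, here is a similar sketch using fs.promises (the directory path is carried over from the question as an assumption):

const fs = require('fs').promises;
const path = require('path');

async function readAllJson(dir) {
  const files = await fs.readdir(dir);
  const jsonFiles = files.filter(file => path.extname(file) === '.json');
  // Read every JSON file in parallel and return a single array of parsed objects
  const contents = await Promise.all(
    jsonFiles.map(file => fs.readFile(path.join(dir, file), 'utf8'))
  );
  return contents.map(text => JSON.parse(text));
}

readAllJson('./sw_lbi/categories').then(categories => console.log(categories));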

How to upload multiple files into MongoDB using Node.js

Hello friends, I am new to Node.js and MongoDB. Is it possible to upload multiple files inside a single MongoDB document along with other information?
You can make use of BSON (Binary JSON) to store files in MongoDB collections. However, a single BSON document has a 16 MB size limit. If you are planning to store files bigger than that, consider GridFS.
You can write files to MongoDB like so in Node.js:
var fs = require('fs');
var Binary = require('mongodb').Binary;

// Read the file that you want to store
var file_data = fs.readFileSync(file_path);

var db_doc = {};
db_doc.file_data = new Binary(file_data);

var my_collection = db.collection('files');
my_collection.insert(db_doc, function(err, result){
  // more code..
});
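To store several files in one document, as the question asks, one option is an array of embedded binaries plus whatever other fields you need. A rough sketch, assuming db is an already-connected Db instance and the file paths are placeholders (keep the total document size under the 16 MB BSON limit):

const fs = require('fs');
const Binary = require('mongodb').Binary;

function buildDoc(filePaths, otherInfo) {
  return Object.assign({}, otherInfo, {
    // One embedded sub-document per file: its name plus the raw bytes as BSON binary
    files: filePaths.map(function (p) {
      return { name: p, data: new Binary(fs.readFileSync(p)) };
    })
  });
}

// Usage sketch (placeholder file names and fields):
// db.collection('files').insertOne(buildDoc(['a.pdf', 'b.png'], { owner: 'demo' }), function (err, result) {
//   // handle err / result here
// });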

Streaming a large CSV file into a mongoDB database using mongoose

Searching for an efficient and quick way to stream a large CSV file (10 million lines) into a MongoDB database using Mongoose.
The problems that arise are dealing with streaming instead of importing, which could be solved with fs.createReadStream (although I'm still learning how to use it), and how to insert that large amount of data into MongoDB using Mongoose, because overloading Mongoose/Mongo with insert requests could lead to errors.
You simply need the 'stream-to-mongo-db' and 'csvtojson' npm libraries.
Here is the example code I use to dump millions of records from big CSV files. It just works!
const fs = require('fs');
const csv = require('csvtojson');
const streamToMongoDB = require('stream-to-mongo-db').streamToMongoDB;
const csvFile = './bigFiles.csv';
const dbURL = 'mongodb://localhost/tweets';
const collection = 'tweets';
fs.createReadStream(csvFile)
  .pipe(csv())
  .pipe(streamToMongoDB({
    dbURL: dbURL, collection: collection
  }));
There is also an insertMany() method in Mongoose, but it only lets you insert 10,000 docs per call, so my solution is to loop asynchronously using that method, inserting batch after batch until the stream finishes.
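A rough sketch of that batching idea, assuming a schemaless Mongoose model named Tweet plus the same ./bigFiles.csv and connection string as above (all of these names are placeholders):

const csv = require('csvtojson');
const mongoose = require('mongoose');

async function importCsv() {
  await mongoose.connect('mongodb://localhost/tweets');
  const Tweet = mongoose.model('Tweet', new mongoose.Schema({}, { strict: false }));

  let batch = [];
  const flush = async () => {
    if (batch.length) {
      await Tweet.insertMany(batch, { ordered: false });
      batch = [];
    }
  };

  await new Promise((resolve, reject) => {
    csv()
      .fromFile('./bigFiles.csv')
      .subscribe(async (row) => {
        // Returning a promise here makes csvtojson wait before parsing more rows
        batch.push(row);
        if (batch.length >= 10000) await flush();
      }, reject, resolve);
  });

  await flush();               // insert whatever is left over
  await mongoose.disconnect();
}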

Parse.com Node.js fails to load JSON file

I'm trying to read a JSON file using Node.js/Express and deploying it to Parse Cloud, but I keep getting:
Failed to load filters.json with: Could not find file filters.json
here is my code:
var fs = require('fs');
var obj = JSON.parse(fs.readFileSync('cloud/filters.json', 'utf8'));
or this
var filterJson = require('cloud/filters.json');
thanks
Looks like parse.com doesn't allow .json files. You can save your file as .js and load it as a plain text file (this doesn't work with require()).
var fs = require('fs');
var parsedObject = JSON.parse(fs.readFileSync('cloud/path/json_file.js'));
It looks ugly, but works for me. :)
Try adding ./:
fs.readFileSync('./cloud/filters.json', 'utf8')

Populate MongoDB from CSV using NodeJS

I am trying to populate my MongoDB using data from a CSV file. There are currently no databases or collections in my MongoDB, and I would like to create them with an update function that upserts the objects parsed from the CSV file.
I am using ya-csv to parse my csv file and the mongodb driver for node.
My code looks like this:
var csv = require('ya-csv');
var fs = require('fs');
var MongoClient = require('mongodb').MongoClient;
var Server = require('mongodb').Server;
var mongoclient = new MongoClient(new Server('localhost', 27017, {'native_parser' : true}));
var reader = csv.createCsvFileReader('YT5.csv', {columnsFromHeader:true,'separator': ','});
reader.addListener('data', function(data){
  var nameHolder = data.name;
  // I have no issue getting the needed variables from my csv file
  mongoclient.db(nameHolder).collection('assets').update(
    {assetId: data.assetId, name: data.name},
    {upsert: true},
    function(err, updated){ if(err){ console.log(err); } }
  );
});

reader.addListener('end', function(data){
  console.log("done");
});
I have not created the databases or collections for this, but can it do this for me with this update? I get an error:
[Error: Connection was destroyed by application]
When I run this, the databases get created but they're empty.
Any help would be greatly appreciated.
Unless there is a specific need to use Node.js, say to not only reorder the fields read from the CSV file but also make some complex modification to them, use mongoimport.
If all you need to do is skip and reorder some fields, work the fields with simple awk, skipping and changing the order as needed:
cat /tmp/csv | awk -F',' 'BEGIN{OFS=","} {print $1,$2,$4,$3,$10}' | mongoimport --type csv --db test --collection csv_import
Additionally, if there is a need to change the collection or db name based on the CSV values (field 10 in this example is used as the db name and field 11 as the collection):
cat /tmp/csv | awk -F',' 'BEGIN{OFS=","} {print $1,$2,$4,$3,$10 | "mongoimport --type csv --db "$10" --collection "$11 }'
Or you could split the file into the per db/collection chunks first.
If you convert the CSV rows into a JavaScript array (each row is an object), you can use https://github.com/bitliner/MongoDbPopulator
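If you would rather stay with the MongoDB driver from the question, here is a minimal sketch of that row-by-row upsert, assuming the modern promise-based MongoClient API, a local server, and the same assetId/name fields from the question (all placeholder names). MongoDB creates the database and collection implicitly on the first write:

const csv = require('ya-csv');
const { MongoClient } = require('mongodb');

async function importCsv() {
  const client = await MongoClient.connect('mongodb://localhost:27017');
  const reader = csv.createCsvFileReader('YT5.csv', { columnsFromHeader: true, separator: ',' });

  const pending = [];
  reader.addListener('data', (row) => {
    // Upsert keyed on assetId; db and collection are created on the first write
    pending.push(
      client.db(row.name).collection('assets')
        .updateOne({ assetId: row.assetId }, { $set: row }, { upsert: true })
    );
  });

  reader.addListener('end', async () => {
    await Promise.all(pending);   // wait for every upsert before closing the connection
    await client.close();
    console.log('done');
  });
}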
