Node JS mongoose insert data at the beginning

I'm experimenting with mongodb using mongoose in NodeJS.
I have a simple web where a user can create a post.
On creation this post is saved to mongodb.
On the web page, I have a scroll event listener which checks whether the user is at the bottom of the page. If they are, it fetches more posts from the backend.
I want these to be retrieved from the db from newest to oldest, but the model.save() method of mongoose always inserts a new document at the end of the collection.
So when a new post is created, the backend currently does this:
const post = new Post({
    text: req.body.text,
    author: {
        userid: user._id,
        name: user.username,
        picPath: user.picPath
    },
    images: images
});
post.save(function (err) {
    let status = true;
    let message = "Post saved.";
    if (err) { status = false; message = err; console.log(err) }
    return res.send({ status: status, msg: message });
})
This way, a new post is pushed to the end of the collection, not unshifted to the front.
When the client wants new posts, the backend does this:
app.get('/dynamicPostLoad/:offset/:limit', async (req, res) => {
    let offset = req.params.offset;
    let limit = req.params.limit;
    let response = {
        status: true,
        posts: [],
        message: "Fetched"
    };
    await Post.find({}).skip(offset).limit(limit).then(products => {
        response.posts = products;
    }).catch((err) => {
        response.status = false;
        response.message = err;
    });
    return res.send(response);
});
So mongoose will fetch from oldest to newest, since every new post is inserted at the end of the collection.
That way, the user sees the oldest post first and, as they scroll, gets progressively newer posts, which is the opposite of what I want.
I was thinking of three ways: either the Post.find({}) method should crawl the documents from the end of the collection, or the Post.save() method should unshift the document instead of pushing it, or I could find all the posts in the collection and reverse them (the last one would be painfully slow).
EDIT: Every post contains a creation date, so it could be sorted.
How can I achieve this?

I solved it with sort. (I still don't understand why I can't insert a document at the beginning of a collection; as far as I can tell, MongoDB collections simply have no user-controllable insertion order, and documents come back in natural/storage order unless you sort explicitly.)
Here is my solution:
app.get('/dynamicPostLoad/:offset/:limit', async (req, res) => {
    let offset = req.params.offset;
    let limit = req.params.limit;
    let response = {
        status: true,
        posts: [],
        message: "Fetched"
    };
    // Sort every find by created date (newest first) before skipping and limiting.
    await Post.find({}).sort({ created: -1 }).skip(offset).limit(limit).then(products => {
        response.posts = products;
    }).catch((err) => {
        response.status = false;
        response.message = err;
    });
    return res.send(response);
});
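A follow-up note (not from the original post): if this endpoint is hit often, a descending index on created keeps the sort from scanning the whole collection. A minimal sketch, assuming the schema variable is named postSchema (it is not shown in the original):

// Hypothetical schema variable; the index matches sort({ created: -1 }) above
postSchema.index({ created: -1 });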

Related

Can't push object into the array after running mongodb query

I was trying to set up the logic for adding some items into an array; my Express server receives the item IDs from a client. My program receives the id of the product, fetches the product details with a MongoDB findOne query, and then pushes that item, with some customized details, into an array. But it's not working: whenever I try to push any element after the MongoDB query, nothing happens, and I don't know why. Please help me, and sorry for my bad English!
It's an ExpressJS server using MongoDB
Items received (they actually arrive from the client as JSON):
const items = [
    {
        productId: "61e01e7e24612b56c33b06c3",
        quantity: "4"
    },
    {
        productId: "61e01e9024612b56c33b06c6",
        quantity: "10"
    }
]
The actual code, where the problem is:
let itemsData = [];
items.forEach(async (item) => {
    const itemData = await findProduct({ _id: item.productId });
    // Check if product found
    if (!itemData) return res.status(400).json({ message: "Product is Invalid" });
    // If found, add it to the array
    itemsData.push({
        productId: itemData._id,
        name: itemData.name,
        price: itemData.price,
        quantity: item.quantity,
        unit: "Nos",
        totalPrice: parseInt(itemData.price) * parseInt(item.quantity)
    });
});
The code above doesn't push that object into the itemsData array.
findProduct Function
// Service to find Product
async findProduct(filter) {
    return await ProductModel.findOne(filter);
}
If I try a plain itemsData.push("hello") before the MongoDB query it works, but if I put it after the findProduct query it doesn't! I don't know what is wrong with it! Somebody help me!
I just want to push those items with their details into the itemsData array, which is not happening. When I console.log(itemsData) it just returns []. What should I do?
Try using for...of instead of forEach (don't forget to mark the enclosing function async):
let itemsData = [];
for (const item of items) {
    const itemData = await findProduct({ _id: item.productId });
    // Check if product found
    if (!itemData) return res.status(400).json({ message: "Product is Invalid" });
    // If found, add it to the array
    itemsData.push({
        productId: itemData._id,
        name: itemData.name,
        price: itemData.price,
        quantity: item.quantity,
        unit: "Nos",
        totalPrice: parseInt(itemData.price) * parseInt(item.quantity)
    });
}
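Note that for...of awaits the lookups one at a time; that is the simplest fix, at the cost of running the queries sequentially rather than in parallel (see the map-based answer below).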
It's because the forEach function does not wait for async callbacks: it kicks off every callback and returns immediately, so your code reads itemsData before any of the pushes have happened.
You could use map instead.
This should work:
let itemsData = [];
const promises = items.map(async (item) => {
    const itemData = await findProduct({ _id: item.productId });
    // Check if product found
    if (!itemData) return res.status(400).json({ message: "Product is Invalid" });
    return {
        productId: itemData._id,
        name: itemData.name,
        price: itemData.price,
        quantity: item.quantity,
        unit: "Nos",
        totalPrice: parseInt(itemData.price) * parseInt(item.quantity)
    };
});
itemsData = await Promise.all(promises);
When you use map with an async callback, you get back an array of promises, so you can use Promise.all to wait for all the values to resolve.
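One caveat worth adding (my note, not from the original answer): Promise.all rejects as soon as any lookup fails, so it is worth wrapping the call:

try {
    itemsData = await Promise.all(promises);
} catch (err) {
    // one findProduct call threw; no partial results are available here
    return res.status(500).json({ message: err.message });
}

If partial results are acceptable, Promise.allSettled collects successes and failures instead.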

Only process 500 lines/row at a time createReadStream

I have to read a really large CSV file, so I searched Google and found createReadStream. I am using a program that reads the CSV file data and inserts it into MongoDB.
The process I am following:
Read the data using createReadStream (I think it reads the file line by line).
Store the data in an array.
Insert the data into MongoDB using insertMany.
Now the problem is that the whole file first gets stored in an array, and only then do I insert it into the database.
I think the better approach would be to store only the first 500 lines/rows in an array, insert them into the DB, and then follow the same steps for the next 500 records.
Is it possible to achieve this? And is it the right way to do it?
My program:
const test = async () => {
    const stream = fs.createReadStream(workerData)
        .pipe(parse())
        .on('data', async function(csvrow) {
            try {
                stream.pause()
                if (!authorName.includes(csvrow.author)) {
                    const author = new Author({ author: csvrow.author })
                    authorId = author._id
                    authorName.push(author.author)
                    authorData.push(author)
                }
                if (!companyName.includes(csvrow.company_name)) {
                    const company = new Company({ companyName: csvrow.company_name })
                    companyID = company._id
                    companyName.push(company.companyName)
                    companyData.push(company)
                }
                users = new User({
                    name: csvrow.firstname,
                    dob: csvrow.dob,
                    address: csvrow.address,
                    phone: csvrow.phone,
                    state: csvrow.state,
                    zip: csvrow.zip,
                    email: csvrow.email,
                    gender: csvrow.gender,
                    userType: csvrow.userType
                })
                userData.push(users)
                book = new Book({
                    book_number: csvrow.book_number,
                    book_name: csvrow.book_name,
                    book_desc: csvrow.book_desc,
                    user_id: users._id,
                    author_id: authorId
                })
                bookData.push(book)
                relationalData.push({
                    username: users.name,
                    author_id: authorId,
                    book_id: book._id,
                    company_id: companyID
                })
            } finally {
                stream.resume()
            }
        })
        .on('end', async function() {
            try {
                Author.insertMany(authorData)
                User.insertMany(userData)
                Book.insertMany(bookData)
                Company.insertMany(companyData)
                await Relational.insertMany(relationalData)
                parentPort.postMessage("true")
            } catch (e) {
                console.log(e)
                parentPort.postMessage("false")
            }
        })
}
test()
This program is working fine and inserting the data into the DB, but I am looking for something like this:
const stream = fs.createReadStream(workerData)
    .pipe(parse())
    .on('data', async function(csvrow, maxLineToRead: 500) {
        // whole code/logic of inserting data into the DB
    })
(maxLineToRead is my imaginary term.)
Basically, I want to process 500 rows at a time, insert them into the DB, and repeat the process till the end.
You can create a higher scoped array variable where you accumulate rows of data as they arrive on the data event. When you get to 500 rows, fire off your database operation to insert them. If not yet at 500 rows, then just add the next one to the array and wait for more data events to come.
Then, in the end event insert any remaining rows still in the higher scoped array.
In this way, you will insert 500 at a time and then however many are left at the end. This has an advantage over inserting them all at the end: you spread the database load out over the time you spend parsing.
Here's an attempt to implement that type of processing. There are some unknowns (documented with comments) because the description of exactly what you're trying to accomplish in some circumstances is incomplete:
const test = () => {
    return new Promise((resolve, reject) => {
        const accumulatedRows = [];

        async function processRows(rows) {
            // initialize data arrays that we will insert
            const authorData = [],
                companyData = [],
                userData = [],
                bookData = [],
                relationalData = [];
            // This code still has a problem that I don't have enough context
            // to know how to solve:
            // if authorName already contains csvrow.author, then the variable
            // authorId is never initialized, but is used later in the code.
            // The same issue occurs for companyID.
            for (let csvrow of rows) {
                let authorId, companyID;
                if (!authorName.includes(csvrow.author)) {
                    const author = new Author({ author: csvrow.author })
                    authorId = author._id
                    authorName.push(author.author)
                    authorData.push(author)
                }
                if (!companyName.includes(csvrow.company_name)) {
                    const company = new Company({ companyName: csvrow.company_name })
                    companyID = company._id
                    companyName.push(company.companyName)
                    companyData.push(company)
                }
                let users = new User({
                    name: csvrow.firstname,
                    dob: csvrow.dob,
                    address: csvrow.address,
                    phone: csvrow.phone,
                    state: csvrow.state,
                    zip: csvrow.zip,
                    email: csvrow.email,
                    gender: csvrow.gender,
                    userType: csvrow.userType
                });
                userData.push(users)
                let book = new Book({
                    book_number: csvrow.book_number,
                    book_name: csvrow.book_name,
                    book_desc: csvrow.book_desc,
                    user_id: users._id,
                    author_id: authorId
                });
                bookData.push(book)
                relationalData.push({
                    username: users.name,
                    author_id: authorId,
                    book_id: book._id,
                    company_id: companyID
                });
            }
            // all local arrays of data are populated now for this batch,
            // so add this data to the database
            await Author.insertMany(authorData);
            await User.insertMany(userData);
            await Book.insertMany(bookData);
            await Company.insertMany(companyData);
            await Relational.insertMany(relationalData);
        }

        const batchSize = 500; // flush every 500 rows, per the question
        const stream = fs.createReadStream(workerData)
            .pipe(parse())
            .on('data', async function(csvrow) {
                try {
                    accumulatedRows.push(csvrow);
                    if (accumulatedRows.length >= batchSize) {
                        stream.pause();
                        await processRows(accumulatedRows);
                        // clear out the rows we just processed
                        accumulatedRows.length = 0;
                        stream.resume();
                    }
                } catch (e) {
                    // calling destroy(e) will prevent leaking a stream
                    // and will trigger the error event to be called with that error
                    stream.destroy(e);
                }
            }).on('end', async function() {
                try {
                    await processRows(accumulatedRows);
                    resolve();
                } catch (e) {
                    reject(e);
                }
            }).on('error', (e) => {
                reject(e);
            });
    });
}
test().then(() => {
    parentPort.postMessage("true");
}).catch(err => {
    console.log(err);
    parentPort.postMessage("false");
});
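One hedged addition to the sketch above: insertMany accepts an ordered option, and with { ordered: false } a single bad row in a batch does not abort the remaining inserts, which is often what you want in a bulk load:

// Keep inserting the rest of a batch even if one document fails validation
await User.insertMany(userData, { ordered: false });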

How to implement pagination for mongodb in node.js using official mongodb client?

I want to implement pagination for MongoDB in a Node.js environment using the official mongodb package. I tried to find out how on the internet, but all the results are mongoose-based links. I don't want to use mongoose.
How can I implement pagination using the official client API documented at
http://mongodb.github.io/node-mongodb-native/3.1/api/
The offset-based approach has a big flaw: if the results list changes between calls to the API, the indices shift and cause an item to be either returned twice or skipped and never returned.
This problem is demonstrated at
https://www.sitepoint.com/paginating-real-time-data-cursor-based-pagination/
The time-based pagination approach would be a little better, because results are no longer skipped: if you query the first page and an item is then deleted, it won't shift the results in your second page and all is fine. However, this approach has a major flaw of its own: what if more than one item was created at the same time?
Best would be to use cursor-based pagination, which can be implemented using any field in the collection that is unique, orderable, and immutable.
_id satisfies all three conditions: unique, orderable, and immutable. Based on this field we can sort and return a page of results, with the _id of the last document as the cursor for the subsequent request.
curl https://api.mixmax.com/items?limit=2
const items = db.items.find({}).sort({
    _id: -1
}).limit(2);
const next = items[items.length - 1]._id;
res.json({ items, next });
When the user wants to get the second page, they pass the cursor (as next) in the URL:
curl https://api.mixmax.com/items?limit=2&next=590e9abd4abbf1165862d342
const items = db.items.find({
    _id: { $lt: req.query.next }
}).sort({
    _id: -1
}).limit(2);
const next = items[items.length - 1]._id;
res.json({ items, next });
If we want to return results in a different order, such as the date the item launched, we add sort=launchDate to the query string.
curl https://api.mixmax.com/items?limit=2&sort=launchDate
const items = db.items.find({}).sort({
    launchDate: -1
}).limit(2);
const next = items[items.length - 1].launchDate;
res.json({ items, next });
For the subsequent page request:
curl https://api.mixmax.com/items?limit=2&sort=launchDate&next=2017-09-11T00%3A44%3A54.036Z
const items = db.items.find({
    launchDate: { $lt: req.query.next }
}).sort({
    launchDate: -1 // the sort must match the field we paginate on
}).limit(2);
const next = items[items.length - 1].launchDate;
res.json({ items, next });
What if we launched a bunch of items at the same date and time? Then our launchDate field is no longer unique and doesn't satisfy the unique, orderable, immutable condition, so we can't use it as a cursor field on its own. But we can use two fields to generate the cursor: since the _id field in MongoDB always satisfies all three conditions, using it alongside our launchDate field gives a combination that satisfies the requirements and can together be used as a cursor.
curl https://api.mixmax.com/items?limit=2&sort=launchDate
const items = db.items.find({}).sort({
    launchDate: -1,
    _id: -1 // secondary sort in case there are duplicate launchDate values
}).limit(2);
const lastItem = items[items.length - 1];
// The cursor is a concatenation of the two cursor fields, since both are
// needed to satisfy the requirements of being a cursor field
const next = `${lastItem.launchDate}_${lastItem._id}`;
res.json({ items, next });
For the subsequent page request:
curl https://api.mixmax.com/items?limit=2&sort=launchDate&next=2017-09-11T00%3A44%3A54.036Z_590e9abd4abbf1165862d342
const [nextLaunchDate, nextId] = req.query.next.split('_');
const items = db.items.find({
    $or: [{
        launchDate: { $lt: nextLaunchDate }
    }, {
        // If the launchDate is an exact match, we need a tiebreaker,
        // so we use the _id field from the cursor.
        launchDate: nextLaunchDate,
        _id: { $lt: nextId }
    }]
}).sort({
    launchDate: -1, // the sort must match both cursor fields
    _id: -1
}).limit(2);
const lastItem = items[items.length - 1];
// The cursor is a concatenation of the two cursor fields, since both are
// needed to satisfy the requirements of being a cursor field
const next = `${lastItem.launchDate}_${lastItem._id}`;
res.json({ items, next });
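A caveat the snippet above glosses over: req.query.next arrives as a string, and MongoDB will not match a plain string against a BSON Date or ObjectId with $lt. With the official driver the cursor parts likely need converting before querying; a hedged sketch (ObjectId comes from the mongodb package):

const { ObjectId } = require('mongodb');
const [nextLaunchDate, nextId] = req.query.next.split('_');
const query = {
    $or: [
        { launchDate: { $lt: new Date(nextLaunchDate) } },
        // tiebreaker on _id when launchDate matches exactly
        { launchDate: new Date(nextLaunchDate), _id: { $lt: new ObjectId(nextId) } }
    ]
};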
Reference: https://engineering.mixmax.com/blog/api-paging-built-the-right-way/
Using the recommended pagination approach with limit() and skip():
const MongoClient = require('mongodb').MongoClient;

MongoClient.connect('mongodb://localhost:27017').then((client) => {
    const db = client.db(mongo.db);
    // find() returns a cursor, so call toArray() to get a promise of the documents
    db.collection('my-collection').find({}, { limit: 10, skip: 0 }).toArray().then((documents) => {
        // First 10 documents
        console.log(documents);
    });
    db.collection('my-collection').find({}, { limit: 10, skip: 10 }).toArray().then((documents) => {
        // Documents 11 to 20
        console.log(documents);
    });
});
Here's a pagination function:
function studentsPerPage(pageNumber, nPerPage) {
    return db.collection('students').find({}, {
        limit: nPerPage,
        skip: pageNumber > 0 ? (pageNumber - 1) * nPerPage : 0
    });
}
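A possible usage sketch (my addition; it assumes db is an already-connected Db instance, and the returned cursor still needs toArray() to materialize the results):

// Hypothetical call: fetch page 3 with 20 students per page
studentsPerPage(3, 20).toArray().then(students => console.log(students));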
Here is an API handler I use, built on MongoDB and Node.js:
module.exports.fetchLoans = function(req, res, next) {
    var perPage = 5;
    var page = req.body.page || 1;
    loans
        .find({ userId: req.user._id })
        .select("-emi")
        .skip(perPage * page - perPage)
        .limit(perPage)
        .sort({ timestamp: -1 })
        .exec(function(err, loan) {
            if (loan != null) {
                loans
                    .find({ userId: req.user._id })
                    .count()
                    .exec(function(err, count) {
                        if (count != null) {
                            res.json({
                                success: true,
                                loans: loan,
                                currentpage: page,
                                totalpages: Math.ceil(count / perPage)
                            });
                        } else {
                            console.log("Milestone Error: ", err);
                            res.json({ success: false, error: "Internal Server Error. Please try again." });
                        }
                    });
            } else {
                console.log("Milestone Error: ", err);
                res.json({ success: false, error: "Internal Server Error. Please try again." });
            }
        });
};
In this code, you have to provide the page number on every hit.
You can use the skip and limit options to implement pagination:
module.exports = (data) => {
    let page = parseInt(data.page);
    let limit = parseInt(data.limit);
    let skip = 0;
    if (page > 1) {
        // skip everything before the requested page
        skip = (page - 1) * limit;
    }
    let mongoClient = require('mongodb').MongoClient;
    mongoClient.connect('mongodb://localhost:27017').then((client) => {
        let db = client.db('your-db');
        db.collection('your-collection').find({}, { limit: limit, skip: skip }).toArray().then((documents) => {
            console.log(documents);
        });
    });
};
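A design note spanning the skip/limit answers above: skip works by walking over and discarding documents server-side, so offset pagination gets slower as the page number grows. The cursor-based approach from the first answer avoids that cost for deep pages.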

CosmosDB + MongoAPI, updating document workaround?

I've been trying to simply update a CosmosDB document via the MongoDB API in my Node application. I've been testing it inside and out: no errors, but the value does not update no matter what.
I know updating array elements is not supported, which is fine, but this is a top-level key-value pair. Changes simply don't happen, with no error whatsoever.
I've been following the Mean.js project, which uses CosmosDB + Mongoose + Node + Angular, looking at its API for updating a hero and trying some of that code, but it still doesn't update.
I've been reading the documentation trying to figure out the default way of handling CRUD operations within CosmosDB and which parts of the Mongo API it supports, but so far no luck.
For test purposes, I'm using this code:
async function updateUser(id) {
    try {
        let user = await User.findById(id);
        console.log(id);
        console.log(user);
        if (!user) return;
        user.id = id;
        user.firstName = 'ASDASDASASDASDASDASDASDA';
        const result = await user.save();
        console.log(result);
    } catch (err) {
        console.log("There was an error updating user", err);
    }
}
So, I've been playing around some more and managed to update a hero using this code:
updateHero('10')

async function updateHero(id) {
    const originalHero = {
        uid: id,
        name: 'Hero2',
        saying: 'nothing'
    };
    Hero.findOne({ uid: id }, (error, hero) => {
        hero.name = originalHero.name;
        hero.saying = originalHero.saying;
        hero.save(error => {
            console.log('Hero updated successfully!');
            return (hero);
        });
    });
}
Now I'm just not sure why this worked when the earlier code didn't. The main difference is that I'm using a 'uid' instead of the actual ID assigned by CosmosDB.
I tested the sample code you provided and both snippets updated the document successfully.
Snippet One:
updateUser('5b46eb0ee1a2f12ea0af307f')

async function updateUser(id) {
    try {
        let user = await Family.findById(id);
        console.log(id);
        console.log(user);
        if (!user) return;
        user.id = id;
        user.name = 'ASDASDASASDASDASDASDASDA';
        const result = await user.save();
        console.log(result);
    } catch (err) {
        console.log("There was an error updating user", err);
    }
}
Snippet Two:
updateFamily('5b46eb0ee1a2f12ea0af307f')

async function updateFamily(id) {
    const updateFamily = {
        _id: id,
        name: 'ABCD',
    };
    Family.findOne({ _id: id }, (error, family) => {
        family.name = updateFamily.name;
        family.save(error => {
            console.log(JSON.stringify(family));
            console.log('Family updated successfully!');
            return (family);
        });
    });
}
In addition, you could use db.collection.update() to update the document:
db.families.update(
    { _id: '5b46eb0ee1a2f12ea0af307f' },
    { $set: { name: 'AAAA' } }
)
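For what it's worth (my addition, not from the original answer): with current drivers and shells the same one-field update is usually written with updateOne:

db.families.updateOne(
    { _id: '5b46eb0ee1a2f12ea0af307f' },
    { $set: { name: 'AAAA' } }
)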
For more details, please refer to the doc: https://docs.mongodb.com/manual/reference/method/db.collection.update/
Hope it helps you.

How to update/insert another document in Cloud Firestore on receiving a create event for a collection using functions

Let us assume that we have two collections, say "users" and "usersList".
Upon creating a new user document in the users collection with the following object:
{username: "Suren", age: 31}
the function should read the above data and update the other collection, i.e. "usersList", with the username alone, like below:
{username: "Suren"}
Let me know if this is possible.
The code I have tried is:
exports.userCreated = functions.firestore.document('users/{userId}')
    .onCreate((event) => {
        const post = event.data.data();
        return event.data.ref.set(post, { merge: true });
    })
I have done it using the code below:
exports.userCreated = functions.firestore.document('users/{userId}')
    .onCreate((event) => {
        const firestore = admin.firestore()
        const user = event.data.data()
        return firestore.collection('usersList').doc('yourDocID').update({
            username: user.username, // copy the username from the new user document
        }).then(() => {
            // Document updated successfully.
            console.log("Doc updated successfully");
        });
    })
If all you want to do is strip the age property from the document, you can do it like this:
exports.userCreated = functions.firestore.document('users/{userId}').onCreate((event) => {
    const post = event.data.data();
    delete post.age;
    return event.data.ref.set(post);
})
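A hedged side note: on the Firebase Functions SDK 1.0 and later, Firestore onCreate handlers receive a DocumentSnapshot and a context instead of an event, so the same trigger would look like this:

exports.userCreated = functions.firestore.document('users/{userId}')
    .onCreate((snap, context) => {
        const post = snap.data();
        delete post.age;
        return snap.ref.set(post);
    });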
