I am building a MERN stack social media application where a user has a profile with posts, and each post is either a photo or a video.
Photos are stored in the posts collection; videos, however, are stored in a separate collection named media.
When the user wants to view their posts, I have a function that fetches all data from both collections, sorts it by creation date, and returns it to the frontend. This worked fine until a user built up a large number of photos/videos; now MongoDB rejects the request because it takes too much RAM.
I want to implement lazy loading so I'm not requesting all this data the user doesn't even need. I know how to do this with a single collection, but I'm not sure how to go about it across two collections.
I know I would limit each collection to 2 objects at a time and add a skip to request the next two, but I don't know how to keep track of which needs to come next: a photo or a video?
My current code:
//First function, called by route
const listPostAndMediaByUser = (req, res) => {
  sortMediaAndPosts(req)
    .then(function(postsAndMedia) {
      return res.json(postsAndMedia);
    })
    .catch(function(error) {
      console.log("Error getting posts", error);
      return res.status(400).json({
        error: errorHandler.getErrorMessage(error)
      });
    });
};
//Sorting function
//This function runs getPosts and getMedia,
//then sorts the combined results by creation date (newest first).
//A plain async function avoids the Promise-constructor antipattern
//and the resolve-after-reject issue in the original version.
const sortMediaAndPosts = async function(req) {
  const posts = await getPosts(req);
  const media = await getMedia(req);
  return [...posts, ...media].sort(
    (a, b) => new Date(b.created) - new Date(a.created)
  );
};
//Get posts function
//Mongoose returns a promise from exec() when no callback is given,
//so there is no need to wrap the query in a new Promise.
const getPosts = function(req) {
  return Post.find({ postedBy: req.profile._id })
    .limit(2)
    .select('-photo')
    .populate('postedBy', '_id name')
    .populate('comments.postedBy', '_id name')
    .sort('-created')
    .exec();
};
//Get Media function
const getMedia = function(req) {
  return Media.find({ postedBy: req.profile._id })
    .limit(2)
    .populate('postedBy', '_id name')
    .populate('comments.postedBy', '_id name')
    .sort('-created')
    .exec();
};
Any input would be greatly appreciated.
Thanks
Your schemas for Post and Media look very similar. You should consider merging them into a single schema, e.g. with a type field distinguishing photos from videos. That would resolve your problem, because a single collection paginates naturally with one find().skip().limit() query.
If you don't want to change your schema, look into the MongoDB aggregation pipeline, which lets you combine data from multiple collections: $lookup for joins, or $unionWith (MongoDB 4.4+) to treat the two collections as one stream that you can sort, skip, and limit server-side.
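For the pagination itself, here is a minimal sketch of the $unionWith approach, assuming MongoDB 4.4+, that the videos live in a collection literally named media, and that both collections share postedBy and created fields. getFeedPage, page, and pageSize are illustrative names, not part of your existing code:
//Hypothetical paginated feed across both collections via $unionWith
const getFeedPage = (userId, page, pageSize = 4) =>
  Post.aggregate([
    //Note: aggregate() does not cast strings, so userId must be an ObjectId
    { $match: { postedBy: userId } },
    //Pull in this user's media documents (requires MongoDB 4.4+)
    { $unionWith: {
        coll: 'media',
        pipeline: [{ $match: { postedBy: userId } }]
    } },
    { $sort: { created: -1 } },   //newest first across both collections
    { $skip: page * pageSize },   //server-side pagination
    { $limit: pageSize }
  ]);
Because the sort, skip, and limit all happen server-side over the merged stream, the question of whether a photo or a video comes next is answered by the database, with no bookkeeping on the client.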
I am new to MongoDB. I am trying to query different collections, and when I fetch data from the category collection (the equivalent of SELECT * FROM categories) it throws MongoError: pool destroyed.
As I understand it, some find({}) call is creating a pool that is then being destroyed.
The code I am using inside the model is below:
const MongoClient = require('mongodb').MongoClient;
const dbConfig = require('../configurations/database.config.js');
export const getAllCategoriesApi = (req, res, next) => {
return new Promise((resolve, reject ) => {
let finalCategory = []
const client = new MongoClient(dbConfig.url, { useNewUrlParser: true });
client.connect(err => {
const collection = client.db(dbConfig.db).collection("categories");
debugger
if (err) throw err;
let query = { CAT_PARENT: { $eq: '0' } };
collection.find(query).toArray(function(err, data) {
if(err) return next(err);
finalCategory.push(data);
resolve(finalCategory);
// db.close();
});
client.close();
});
});
}
What I am finding is that when I use
let query = { CAT_PARENT: { $eq: '0' } };
collection.find(query).toArray(function(err, data) {})
find(query) returns data, but with an empty filter {} or with $gte/$gt it throws the pool error.
The code I have written in the controller is below:
import { getAllCategoriesListApi } from '../models/fetchAllCategory';
const redis = require("redis");
const client = redis.createClient(process.env.REDIS_PORT);
export const getAllCategoriesListData = (req, res, next, query) => {
// Try fetching the result from Redis first in case we have it cached
return client.get(`allstorescategory:${query}`, (err, result) => {
// If that key exist in Redis store
if (false) {
res.send(result)
} else {
// Key does not exist in Redis store
getAllCategoriesListApi(req, res, next).then( function ( data ) {
const responseJSON = data;
// Save the Wikipedia API response in Redis store
client.setex(`allstorescategory:${query}`, 3600, JSON.stringify({ source: 'Redis Cache', responseJSON }));
res.send(responseJSON)
}).catch(function (err) {
console.log(err)
})
}
});
}
Can anyone tell me what mistake I am making here, and how I can fix the pool issue?
Thanking you in advance.
toArray is asynchronous: it invokes the callback only once the results have been read from the network.
That means the client.close() call executes before the results have been read, which is likely what yields your error.
The close call needs to happen after you have finished iterating the results.
Separately from this, you should probably not be creating the client instance in the request handler like this. Client instances are expensive to create (they must talk to all of the servers in the deployment before they can actually perform queries) and should generally be created once per running process rather than per request.
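A minimal sketch of both fixes together, assuming the same dbConfig module from the question; the connectToDb helper is illustrative, not an existing API:
const MongoClient = require('mongodb').MongoClient;
const dbConfig = require('../configurations/database.config.js');

// Create the client once per process and reuse it across requests.
let clientPromise;
const connectToDb = () => {
  if (!clientPromise) {
    clientPromise = new MongoClient(dbConfig.url, { useNewUrlParser: true }).connect();
  }
  return clientPromise;
};

export const getAllCategoriesApi = async () => {
  const client = await connectToDb();
  const collection = client.db(dbConfig.db).collection('categories');
  // toArray() resolves only after every document has been read, and the
  // shared client is never closed while the process is serving requests,
  // so nothing races a close() call.
  return collection.find({ CAT_PARENT: { $eq: '0' } }).toArray();
};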
I'm currently implementing the admin dashboard of an online shopping app. I want to implement a method that deletes a user and temporarily stores the deleted user's data in another collection.
(Copy user data -> save it in another collection -> delete the original data)
For example, my user data currently lives in a collection called users; after deleting a user, that particular user's data must be available in another collection, say deleted_users. Is there an easy way to do that? Thanks!
You will have to modify some of the code, but this is the basic logic:
Use aggregation to copy the collection over.
Refer here for the aggregate function using the Mongo client.
So the function looks like this:
public aggregation(collectionName: string, pipelines: Object[]): Promise<Array<any>>
{
    return new Promise((resolve, reject) =>
    {
        let cursor: mongodb.AggregationCursor<any> = null;
        //Here you will use getCollection method on your own to fetch the collection
        this.getCollection(collectionName)
            .then((collection: mongodb.Collection) =>
            {
                cursor = collection.aggregate(pipelines);
                return cursor.toArray();
            })
            .then((result: Array<any>) =>
            {
                return resolve(result);
            })
            .catch((error: any) =>
            {
                return reject(error);
            });
    });
}
public dropCollection(collectionName: string): Promise<any>
{
return new Promise((resolve, reject) =>
{
this.getCollection(collectionName)
.then((collection: mongodb.Collection) =>
{
collection.drop((err: Error, result: any) =>
{
if (err)
{
return reject(DataDropError);
}
return resolve(result);
});
})
.catch(reject);
});
}
public async backupAndDrop()
{
    const pipeline = [ { $match: {} }, { $out: "DeletedCollection" } ];
    try
    {
        await this.aggregation("originalCollection", pipeline);
        await this.dropCollection("originalCollection");
    }
    catch (e)
    {
        throw e;
    }
}
Also try running this in your mongo shell (note that $out replaces the contents of the target collection each time it runs):
db.originalCollection.aggregate([ { $match: {} }, { $out: "Backup" } ])
Why don't you add a flag like isDeleted which is false by default and then make it true when the user is deleted?
You can do something like this...
Client.connect(connection_string, function(err, db) {
    if (err) {
        console.log(err);
    }
    else {
        db.collection(CollectionA).find().forEach(function(d) {
            db.collection(CollectionB).insertOne(d);
        });
    }
});
Try it out and see if it works.
This can help too:
How to properly reuse connection to Mongodb across NodeJs application and modules
You can first find the record to be deleted, insert that data into the new collection, and then delete the original record.
db.collection(CollectionA).findOne({ _id: userIdToDelete }, function(err, res) {
    db.collection(CollectionB).insertOne(res, function() {
        db.collection(CollectionA).deleteOne({ _id: userIdToDelete });
    });
});
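For completeness, the same find-copy-delete flow as a minimal async/await sketch, assuming a connected db handle from the native Node driver; softDeleteUser and the collection names are illustrative:
async function softDeleteUser(db, userId) {
  const user = await db.collection('users').findOne({ _id: userId });
  if (!user) throw new Error('user not found');
  // Copy first and delete second, so a crash in between leaves the original intact
  await db.collection('deleted_users').insertOne(user);
  await db.collection('users').deleteOne({ _id: userId });
  return user;
}
The two writes are not atomic on their own; on a replica set with MongoDB 4.0+ you could wrap them in a session transaction if that matters for your dashboard.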
I'm following the Cloud Datastore sample from the Google documentation, as well as the tasks sample on GitHub. I'm trying to write a single function that marks a task as done by looking it up by its description.
function markDoneByDesc(queryString) {
const query = datastore
.createQuery('Task')
.filter('description', '=', queryString);
var taskKeyId;
datastore
.runQuery(query)
.then(results => {
const tasks = results[0];
console.log('Task found:', tasks[0]);
// I realize there might be multiple tasks with the same desc,
// but I want to update just one for now
taskKeyId = tasks[0][datastore.KEY].id;
console.log('Saving the task Key ID', taskKeyId);
return taskKeyId;
})
.then((taskKeyId) => {
console.log('Calling markDone with task Key ID', taskKeyId);
markDone(taskKeyId); // From the original function in the sample
console.log('Updated task');
})
.catch(err => {
console.error('ERROR:', err);
});
}
Right now, the update doesn't happen :(
I found the solution, thanks to @callmehiphop's help!
It turns out I need to convert the taskKeyId returned by the Datastore query into an integer before passing it to the markDone() function; otherwise it is passed as a string and the lookup by that key ID fails.
Here's what the correct code should look like (note the parseInt() in the first return statement):
function markDoneByDesc(queryString) {
const query = datastore
.createQuery('Task')
.filter('description', '=', queryString);
var taskKeyId;
datastore
.runQuery(query)
.then(results => {
const tasks = results[0];
console.log('Task found:', tasks[0]);
// I realize there might be multiple tasks with the same desc,
// but I want to update just one for now
taskKeyId = tasks[0][datastore.KEY].id;
console.log('Saving the task Key ID', taskKeyId);
return parseInt(taskKeyId,10);
})
.then((taskKeyId) => {
console.log('Calling markDone with task Key ID', taskKeyId);
markDone(taskKeyId); // From the original function in the sample
console.log('Updated task');
})
.catch(err => {
console.error('ERROR:', err);
});
}
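One caveat worth adding (this is an aside, not part of the original answer): Datastore IDs can exceed Number.MAX_SAFE_INTEGER, and parseInt() silently loses precision on such values. The client library exposes a datastore.int() helper that keeps the ID exact, so the first return statement could instead be:
return datastore.int(taskKeyId); // preserves 64-bit IDs that parseInt() would mangle
This assumes markDone() builds its key with datastore.key(['Task', id]), as the tasks sample does.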
I have a function that lists all users, each with a role referencing a group. I have another function that takes a user role refId and returns the group name. While trying to return the name I get a promise in the pending state.
function getAll() {
var deferred = Q.defer();
db.users.find().toArray(function(err, users) {
if (err) deferred.reject(err.name + ': ' + err.message);
// return users (without hashed passwords)
users = _.map(users, function(user) {
//console.log(user);
return _.omit(user, ['hash']);
});
users = _.map(users, function(user){
refId = user['role'][0]['oid']['_id'];
//console.log(typeof refId);
user = _.omit(user, ['role']);
user.role = userRole.userRole(refId).then(function(err,rid){
if(err){
deferred.reject(err.name+':'+err.message);
}
deferred.resolve();
console.log(deferred.resolve(rid));
return deferred.promise;
console.log(deferred.promise);
});
return user;
//console.log(user);
})
// getRefId(users)
//console.log(users);
deferred.resolve(users);
});
function userRole(rid){
return new Promise((resolve, reject) => {
db.groups.findOne({"_id":rid}, function(err, doc){
if(err){
reject(err.name + ':' + err.message);
}
if(doc){
resolve({"name": doc.name});
//console.log(doc.name);
}
})
})
}
If you want to use Promises with Mongoose or the native Mongo driver for Node, which you seem to be doing here, you don't have to use new Promise() everywhere. You can use the returned promises directly.
See the docs:
The official MongoDB Node.js driver provides both callback based as well as Promised based interaction with MongoDB allowing applications to take full advantage of the new features in ES6.
https://mongodb.github.io/node-mongodb-native/
Mongoose async operations, like .save() and queries, return Promises/A+ conformant promises. This means that you can do things like MyModel.findOne({}).then() and yield MyModel.findOne({}).exec() (if you're using co).
http://mongoosejs.com/docs/promises.html
Instead of this:
function userRole(rid){
return new Promise((resolve, reject) => {
db.groups.findOne({"_id":rid}, function(err, doc){
if(err){
reject(err.name + ':' + err.message);
}
if(doc){
resolve({"name": doc.name});
//console.log(doc.name);
}
})
})
}
you should be able to use:
function userRole(rid){
return db.groups.findOne({ _id: rid });
}
or:
const userRole = rid => db.groups.findOne({ _id: rid });
Now, in your version there is one problem - the promise will never get resolved if there is no error but the returned doc is falsy, which can happen. But there is no point in creating your own promises if the methods that you call return promises in the first place.
Even if you want custom error messages, you can still use:
function userRole(rid){
return db.groups.findOne({ _id: rid })
.catch(function (err) {
return Promise.reject(err.name + ':' + err.message);
});
}
or this with more modern syntax:
const userRole = rid => db.groups.findOne({ _id: rid })
.catch(err => Promise.reject(`${err.name}:${err.message}`));
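Applying that to the getAll function from the question: the remaining fix is to wait for all the role lookups before resolving, instead of assigning a pending promise to each user. A minimal sketch, assuming the simplified userRole above, the native driver's promise API, and the same lodash helpers as in the question:
function getAll() {
  return db.users.find().toArray()  // the native driver returns a promise when no callback is given
    .then(users => Promise.all(
      users.map(user => {
        const refId = user.role[0].oid._id;
        const cleaned = _.omit(user, ['hash', 'role']);  // strip hash and raw role, as before
        // wait for the group lookup instead of storing a pending promise
        return userRole(refId).then(group => {
          cleaned.role = group ? group.name : null;
          return cleaned;
        });
      })
    ));
}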
I would like to know if it's possible to run a series of SQL statements and have them all committed in a single transaction.
The scenario I am looking at is an array holding a series of values that I wish to insert into a table, not individually but as a unit.
I was looking at the following page, which provides a framework for transactions in node using pg. The individual queries are nested within one another, so I am unsure how this would work with an array containing a variable number of elements.
https://github.com/brianc/node-postgres/wiki/Transactions
var pg = require('pg');
var rollback = function(client, done) {
client.query('ROLLBACK', function(err) {
//if there was a problem rolling back the query
//something is seriously messed up. Return the error
//to the done function to close & remove this client from
//the pool. If you leave a client in the pool with an unaborted
//transaction weird, hard to diagnose problems might happen.
return done(err);
});
};
pg.connect(function(err, client, done) {
if(err) throw err;
client.query('BEGIN', function(err) {
if(err) return rollback(client, done);
//as long as we do not call the `done` callback we can do
//whatever we want...the client is ours until we call `done`
//on the flip side, if you do call `done` before either COMMIT or ROLLBACK
//what you are doing is returning a client back to the pool while it
//is in the middle of a transaction.
//Returning a client while its in the middle of a transaction
//will lead to weird & hard to diagnose errors.
process.nextTick(function() {
var text = 'UPDATE account SET money = money + $1 WHERE id = $2';
client.query(text, [100, 1], function(err) {
if(err) return rollback(client, done);
client.query(text, [-100, 2], function(err) {
if(err) return rollback(client, done);
client.query('COMMIT', done);
});
});
});
});
});
My array logic is:
banking.forEach(function(batch) {
    client.query(text, [batch.amount, batch.id], function(err, result) {
        if (err) return rollback(client, done);
    });
});
pg-promise offers very flexible support for transactions. See Transactions.
It also supports partial nested transactions, a.k.a. savepoints.
The library manages transactions automatically, which is what you should use these days, because too many things can go wrong if you try to organize a transaction manually, as in your example.
See a related question: Optional INSERT statement in a transaction
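As an illustration (not the library author's code), a minimal sketch of the array case with pg-promise, assuming a db object created from your own connection settings and the same banking array and UPDATE statement as above:
const pgp = require('pg-promise')();
const db = pgp('postgres://user:pass@localhost:5432/mydb'); // illustrative connection string

// One transaction: either every UPDATE commits, or the whole batch rolls back.
function applyBatch(banking) {
  return db.tx(t => {
    const queries = banking.map(b =>
      t.none('UPDATE account SET money = money + $1 WHERE id = $2', [b.amount, b.id])
    );
    return t.batch(queries); // settles once all queries in the transaction settle
  });
}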
Here's a simple TypeScript solution that avoids pg-promise:
import { PoolClient } from "pg"
import { pool } from "../database"
const tx = async (callback: (client: PoolClient) => Promise<void>) => {
  const client = await pool.connect();
  try {
    await client.query('BEGIN')
    try {
      await callback(client)
      await client.query('COMMIT')
    } catch (e) {
      await client.query('ROLLBACK')
      throw e // rethrow so callers can tell the transaction failed
    }
  } finally {
    client.release()
  }
}
export { tx }
Usage:
...
let result;
await tx(async client => {
const { rows } = await client.query<{ cnt: string }>('SELECT COUNT(*) AS cnt FROM users WHERE username = $1', [username]);
result = parseInt(rows[0].cnt) > 0;
});
return result;