Which approach to choose for messenger architecture? - node.js

We are writing an instant messenger for private use in our application. The expected load for the current version is hundreds of users, maybe a thousand or two.
As the database I use MongoDB. Messages are written to a messages collection (linked via chats_id to user_chats, and via uid to an external table in the users database). The arrival of a message on the client triggers a flurry of events in response (mark as read, and so on). It is clear that I need a queue. What is the best way to build one? Are there methods to prioritize items in a queue?
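To make it concrete, this is roughly what I imagine, sketched with a Redis-backed job queue like Bull, which has a priority option (the queue name and handlers here are hypothetical):
const Queue = require("bull");

// One queue for message-related events, backed by Redis.
const eventsQueue = new Queue("messenger-events", "redis://127.0.0.1:6379");

// Deliveries matter more than read receipts, so give them a higher
// priority (in Bull, 1 is the highest priority).
const sendMessage = (payload) =>
  eventsQueue.add("deliver", payload, { priority: 1 });
const markRead = (payload) =>
  eventsQueue.add("markRead", payload, { priority: 10 });

// A single worker drains the queue in priority order.
eventsQueue.process("deliver", async (job) => {
  /* write to messages, notify sockets */
});
eventsQueue.process("markRead", async (job) => {
  /* update last_read in user_chats */
});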
How many users should a spherical Node process in a vacuum withstand, and how much memory should it use? I'm using pm2 for process management and node-ipc for broadcast.
Where should I turn, and whom should I ask, about optimizing the database queries? Right now there is a quickly written construction, the loadUserChatsStatistic function shown below, which requests all chats and, for each chat, selects the last message, the number of unread messages, and the users' last-seen records. In MySQL I would do this with one big query. What is the better way to do it in Mongo? My Mongo skills are still middling.
As for the architecture itself: an action arrives from the client and goes to a handler, which understands what to do with it and who needs to be notified about this action. Usually the handler notifies the corresponding manager, and the manager writes to the database and, if necessary, notifies the other processes.
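In code the flow looks roughly like this (just a sketch, all names hypothetical):
// Each action type maps to a handler; the handler delegates to a manager,
// which persists the change and fans it out to the other processes.
const handlers = {
  "message:send": async (action, ctx) => {
    const message = await messagesManager.save(action.payload); // write to Mongo
    ctx.broadcast("message:new", message); // notify other pm2 processes via node-ipc
  },
  "message:read": async (action, ctx) => {
    await chatsManager.markRead(action.userId, action.chatId);
  },
};

const onClientAction = (action, ctx) => {
  const handler = handlers[action.type];
  if (!handler) throw new Error(`Unknown action: ${action.type}`);
  return handler(action, ctx);
};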
const loadUserChatsStatistic = async (chatIds, userId) => {
  const startExecTime = new Date();
  const userChats = await db()
    .collection("user_chats")
    .aggregate([
      { $match: { uid: userId, chats_id: { $in: chatIds.map(ObjectID) } } },
      {
        $lookup: {
          from: "chats",
          localField: "chats_id",
          foreignField: "_id",
          as: "chat"
        }
      },
      { $unwind: "$chat" },
      {
        $lookup: {
          from: "user_last_seen",
          localField: "chats_id",
          foreignField: "chats_id",
          as: "last_seens"
        }
      }
    ])
    .toArray();
  const chats = await Promise.all(
    userChats.map(async userChat => {
      const usersCount = await db()
        .collection("user_chats")
        .find({ chats_id: userChat.chats_id })
        .count();
      let unreadCountQuery = {};
      if (userChat.last_read) {
        unreadCountQuery = { _id: { $gt: userChat.last_read } };
      } else {
        unreadCountQuery = { _id: { $gt: userChat._id } };
      }
      const unreadCount = await db()
        .collection("messages")
        .find({ ...unreadCountQuery, chats_id: userChat.chats_id })
        .count();
      const message =
        (await db()
          .collection("messages")
          .findOne({ chats_id: userChat.chats_id }, { sort: { _id: -1 } })) ||
        {};
      return { ...userChat, usersCount, unreadCount, message };
    })
  );
  console.log(
    "Execution time for loadUserChatsStatistic",
    new Date() - startExecTime
  );
  // Note: a sort comparator must return a number, not a boolean;
  // chats without messages go last, the rest sort newest first.
  return chats.sort((a, b) => {
    if (!a.message._id) return 1;
    if (!b.message._id) return -1;
    return (
      ObjectID(b.message._id).getTimestamp() -
      ObjectID(a.message._id).getTimestamp()
    );
  });
};
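For comparison, I wonder whether the per-chat queries could be collapsed into a single round trip with pipeline-style $lookup, something like this (a sketch assuming MongoDB 3.6+ and the same collection and field names, untested against the real schema):
const userChats = await db()
  .collection("user_chats")
  .aggregate([
    { $match: { uid: userId, chats_id: { $in: chatIds.map(ObjectID) } } },
    // Member count: how many user_chats rows share this chat.
    {
      $lookup: {
        from: "user_chats",
        localField: "chats_id",
        foreignField: "chats_id",
        as: "members"
      }
    },
    { $addFields: { usersCount: { $size: "$members" } } },
    // Unread count: messages newer than last_read (or than this doc's _id).
    {
      $lookup: {
        from: "messages",
        let: {
          chatId: "$chats_id",
          lastRead: { $ifNull: ["$last_read", "$_id"] }
        },
        pipeline: [
          {
            $match: {
              $expr: {
                $and: [
                  { $eq: ["$chats_id", "$$chatId"] },
                  { $gt: ["$_id", "$$lastRead"] }
                ]
              }
            }
          },
          { $count: "n" }
        ],
        as: "unread"
      }
    },
    { $addFields: { unreadCount: { $ifNull: [{ $arrayElemAt: ["$unread.n", 0] }, 0] } } },
    // Last message: newest message per chat.
    {
      $lookup: {
        from: "messages",
        let: { chatId: "$chats_id" },
        pipeline: [
          { $match: { $expr: { $eq: ["$chats_id", "$$chatId"] } } },
          { $sort: { _id: -1 } },
          { $limit: 1 }
        ],
        as: "lastMessage"
      }
    },
    { $addFields: { message: { $ifNull: [{ $arrayElemAt: ["$lastMessage", 0] }, {}] } } },
    { $sort: { "message._id": -1 } },
    { $project: { members: 0, unread: 0, lastMessage: 0 } }
  ])
  .toArray();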

Related

Do I really have to pass the session as a param when doing Mongo Node transaction

When doing a transaction like
await (await this.client).withSession(async (session) => {
  try {
    session.startTransaction();
    await collection1.updateMany({ id }, { $set: { done: true } }, { session });
    await collection2.updateMany(
      { someId, test: { $exists: true } },
      { $set: { test: [] } },
      { session }
    );
    await session.commitTransaction();
  } catch (err) {
    await session.abortTransaction();
    throw new Error(`Failed`);
  }
});
Why do I have to pass { session } as a param for the two updates?
The documentation doesn't seem to explain why. Shouldn't everything between starting and stopping the session use that session, including the awaited collection1 call?
Thank you
This is totally normal in all the DBMSs I have worked with.
As far as I know, the reason is that you don't always want to add every database operation to the started transaction, because doing so might lead to reduced throughput and blocking.
So you only want to add operations that change the state of a document and that are essential. E.g. most of the time you don't want to add simple read operations to the transaction and block the document with them.
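For example, a sketch using the Node driver's withTransaction helper (the collection names here are made up): only the state-changing writes get { session }, while an incidental read simply omits it and runs outside the transaction.
await client.withSession(async (session) => {
  await session.withTransaction(async () => {
    // These writes must commit or abort together, so both get { session }.
    await orders.updateOne({ _id: orderId }, { $set: { done: true } }, { session });
    await audit.insertOne({ orderId, at: new Date() }, { session });

    // A read that doesn't need transactional guarantees can omit { session }
    // and will not participate in (or block on) the transaction.
    const config = await settings.findOne({ _id: "checkout" });
  });
});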

How to handle concurrent requests with Socket.IO while joining a room

I'm trying to create/join a room with a limit of 2 users per room. Everything works fine, but when users make concurrent requests to create/join a room, all 3 users end up in the same room. I have tried:
simple MongoDB insert/update
using a transaction
now storing in memory
Here is the code:
const rooms = {};

module.exports = function ({ io, socket }) {
  socket.on("/createJoinPublicRequest", async ({ user }) => {
    let title = `${user._id}.${Date.now()}`;
    const foundRoom = Object.keys(rooms).find(
      (room) => rooms[room].roomLimit > rooms[room].users.length
    );
    if (!foundRoom) {
      // create a room if none was found
      rooms[title] = {
        title,
        users: [{ _id: user._id }],
        roomLimit: 2,
        roomType: "public",
        status: "open",
      };
      socket.join(title);
    } else {
      // join an existing room
      const room = rooms[foundRoom];
      room.users.push({ _id: user._id });
      title = room.title; // record the room actually joined
      if (room.users.length == room.roomLimit) {
        room.status = "full";
        if (await storeToMongoDB(room)) {
          delete rooms[foundRoom];
        }
      }
      socket.join(room.title);
    }
    await User.findByIdAndUpdate(user, {
      joinedRoom: title,
    });
  });
};
You should use a mutex lock to prevent adding more users to a room than allowed.
For a single instance: https://www.npmjs.com/package/async-mutex
For multiple instances: a Redis-based mutex lock. A sketch of the single-instance case follows.
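A minimal sketch with async-mutex, reusing the rooms object from the question (the create/join body itself is abbreviated):
const { Mutex } = require("async-mutex");

const roomsMutex = new Mutex();

socket.on("/createJoinPublicRequest", async ({ user }) => {
  // Only one callback at a time can search and mutate `rooms`,
  // so two concurrent joins can no longer pick the same open room.
  await roomsMutex.runExclusive(async () => {
    const foundRoom = Object.keys(rooms).find(
      (room) => rooms[room].roomLimit > rooms[room].users.length
    );
    // ...create or join exactly as before, inside the lock...
  });
});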

Handling concurrent requests that find and update the same resource in Node.js & MongoDB

I have a function in Node that runs after clicking the checkout button. It checks the availability of the items in the cart, and if an item is available it deducts it from the inventory.
I'm currently testing with two users clicking the checkout button at the same time. Both users have exactly the same content in their cart (10 apples each), which comes to a total of 20 apples, but there are only 10 apples in inventory.
If there is not enough stock it should return an error to the user, but both orders go through.
NOTE: This works correctly if there is a 1 second delay between the clicks.
What can I do to prevent this?
// Check if items are in inventory
const availability = await checkInventory(store, cart, seller);
if (!availability.success) {
  return res.status(400).json({
    success: false,
    type: 'unavailable',
    errors: availability.errors,
  });
}

// Deduct inventory
const inventory = await deductInventory(store, seller, cart);
if (!inventory) {
  return next(new ErrorResponse('Server Error', 500));
}
checkInventory
exports.checkInventory = asyncHandler(async (store, cart, seller) => {
  let isAvailable = true;
  const unavailableProducts = [];
  const inventory = await Inventory.find({
    $and: [
      {
        store: store,
        user: seller,
      },
    ],
  });
  const products = inventory[0].products;
  cart.forEach((item) => {
    const product = products.find(
      (product) => product._id.toString() === item.productId
    );
    if (!item.hasvariation) {
      if (product.stock < item.qty) {
        isAvailable = false;
        unavailableProducts.push(
          `${item.title} is not available, only ${product.stock} left available`
        );
      }
    }
    if (item.hasvariation) {
      const variation = product.variations.find(
        (variation) => variation._id.toString() === item.variationId
      );
      const option = variation.options.find(
        (option) => option._id.toString() === item.optionId
      );
      if (option.stock < item.qty) {
        isAvailable = false;
        unavailableProducts.push(
          `${item.title} is not available, only ${option.stock} left available`
        );
      }
    }
  });
  return {
    success: isAvailable,
    errors: unavailableProducts,
  };
});
deductInventory
exports.deductInventory = asyncHandler(async (store, seller, cart) => {
  const inventory = await Inventory.findOne({
    $and: [
      {
        store: store,
        user: seller,
      },
    ],
  });
  const products = inventory.products;
  cart.forEach((item) => {
    const product = products.find(
      (product) => product._id.toString() === item.productId
    );
    if (!item.hasvariation) {
      product.stock = product.stock - item.qty;
    }
    if (item.hasvariation) {
      const variation = product.variations.find(
        (variation) => variation._id.toString() === item.variationId
      );
      const option = variation.options.find(
        (option) => option._id.toString() === item.optionId
      );
      option.stock = option.stock - item.qty;
    }
  });
  const saveInventory = await Inventory.findOneAndUpdate(
    {
      $and: [
        {
          store: store,
          user: seller,
        },
      ],
    },
    {
      $set: { products: products },
    },
    { new: true, runValidators: true }
  );
  if (!saveInventory) {
    return {
      success: false,
      errors: ['Server Error'],
    };
  }
  return {
    success: true,
  };
});
The problem is that the two checkout calls run at (almost) the same time and your routine is not atomic. Both calls read a copy of the inventory data into memory, so both see products.stock = 10. Based on that local info, each call checks availability, calculates the new amount itself (stock - qty), and uses an update query to set it as a fixed value, so both calls set products.stock to 0. That is what produces your concurrency issue.
What you should do is let mongodb handle the concurrency for you.
There are several ways to handle concurrency but you could for example use the $inc to decrease the stock amount directly in mongo. That way the stock amount in the db can never be wrong.
result = await collection.updateOne({ stock: { $gte: 10 } }, { $inc: { stock: -10 } })
Because the filter requires enough stock before the $inc is applied, the stock amount can never drop below 0, and you can now check the result of the update call to see whether it modified any documents. If it did not (result.nModified == 0, or result.modifiedCount == 0 in newer drivers), you know the inventory was too low and you can report that back to the user.
https://docs.mongodb.com/manual/reference/operator/update/inc/
https://docs.mongodb.com/manual/reference/method/db.collection.update/#std-label-writeresults-update
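Applied to the cart above, a minimal sketch of that idea (assuming a hypothetical Product model with one document per product, and a driver recent enough to expose modifiedCount):
// Atomically reserve each cart item. Stock can never go negative,
// because the filter demands enough stock before $inc is applied.
async function reserveCart(cart) {
  const reserved = [];
  for (const item of cart) {
    const result = await Product.updateOne(
      { _id: item.productId, stock: { $gte: item.qty } },
      { $inc: { stock: -item.qty } }
    );
    if (result.modifiedCount === 0) {
      // Not enough stock: roll back the items we already reserved.
      await Promise.all(
        reserved.map((r) =>
          Product.updateOne({ _id: r.productId }, { $inc: { stock: r.qty } })
        )
      );
      return { success: false, unavailableProduct: item.productId };
    }
    reserved.push(item);
  }
  return { success: true };
}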

Can MongoDB do query pipelining with no loop?

I'm new to Node.js and MongoDB.
I want to get a user's profile together with that user's follow list. With an RDB this would be a simple equi-join, but I don't have much experience with MongoDB, so I don't know how to do it.
Sample data below.
// list of users
[
  {
    _id: "oid_1",
    nickname: "user_01",
    link: "url/user_01"
  },
  {
    _id: "oid_2",
    nickname: "user_02",
    link: "url/user_02"
  },
  {
    _id: "oid_3",
    nickname: "user_03",
    link: "url/user_03"
  }
  ...
]
user_01's followList
[
  {
    followOid: "foid_1",
    userOid: "user_01"
  },
  {
    followOid: "foid_2",
    userOid: "user_02"
  },
]
My solution is to get the follow list, then loop over it with follows.findOne(), like below:
const dataSet = [];
Follow.getFollowerList(userId) // use a promise for the pipeline
  .exec()
  .then(async (result) => { // without async-await there is no data output...
    for (let data of result) {
      let temp = await Users.getUserInfo( // one query per item; I don't think this is efficient
        data.userId,
        { nickname: 1, link: 1 }
      );
      dataSet.push(temp);
    }
    return dataSet;
  })
  .then((data) => {
    res.status(200).json(data);
  })
  .catch( ... );
I don't think this is the best solution. If you are good at MongoDB, please save my life :)
Thanks
One option would be to use aggregation.
const userId = 'Fill with UserId';
const pipeline = [
  {
    '$match': {
      '_id': userId
    }
  }, {
    '$lookup': {
      'from': 'followListCollectionName',
      'localField': '_id',
      'foreignField': 'userOid',
      'as': 'followList'
    }
  }
];
const result = await UserModel.aggregate(pipeline);
In result you will then find an array containing the one user with the given id (and more, if several share it), and in result[0].followList you will find the follow objects as an array.
The second option is to use virtuals:
https://mongoosejs.com/docs/tutorials/virtuals.html
This needs some changes to your collection's schema; a sketch follows.
Good luck
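A minimal sketch of the virtual-populate route, assuming hypothetical UserModel/Follow model names and the userOid field from the sample data:
// In the user schema definition: a virtual that joins users._id
// to the follow list's userOid without storing anything extra.
userSchema.virtual("followList", {
  ref: "Follow",          // the model backing the follow list collection
  localField: "_id",
  foreignField: "userOid"
});

// Make sure virtuals show up when converting to JSON.
userSchema.set("toJSON", { virtuals: true });

// Usage: one call, no manual loop.
const user = await UserModel.findById(userId).populate("followList");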

MongoDB - find one and add a new property

Background: I'm developing an app that shows analytics for inventory management.
It gets an office Excel file uploaded, and as the file uploads the app converts it to an array of JSON objects. Then it compares each JSON object with the objects in the DB, changes its quantity according to the XLS file, and adds a timestamp to the stamps array, which contains the changes in quantity.
For example:
{"_id":"5c3f531baf4fe3182cf4f1f2",
"sku":123456,
"product_name":"Example",
"product_cost":10,
"product_price":60,
"product_quantity":100,
"Warehouse":4,
"stamps":[]
}
After the XLS upload, let's say we sold 10 units, it should look like this:
{"_id":"5c3f531baf4fe3182cf4f1f2",
"sku":123456,
"product_name":"Example",
"product_cost":10,
"product_price":60,
"product_quantity":90,
"Warehouse":4,
"stamps":[{"1548147562": -10}]
}
Right now I can't find the right commands for MongoDB to do it. I'm developing in Node.js and Angular; I would love to read some ideas.
for (let i = 0; i < products.length; i++) {
  ProductsDatabase.findOneAndUpdate(
    { "_id": products[i]["id"] },
    // CHANGE QUANTITY AND ADD A STAMP
    ...
  );
}
You would need two operations here. The first is to get an array of documents from the db that match the ones in the JSON array. From that list you compare the product_quantity keys and, if there is a change, create a new array of objects with the product id and the change in quantity.
The second operation is an update which uses this new array with the change in quantity for each matching product.
Armed with this new array of updated product properties, it would be ideal to use a bulk update, since looping through the list and sending each update request to the server separately can be computationally costly.
Consider using the bulkWrite method, which is on the model. It accepts an array of write operations and executes each of them; a typical update operation for your use case would have the following structure:
{ updateOne :
  {
    "filter": <document>,
    "update": <document>,
    "upsert": <boolean>,
    "collation": <document>,
    "arrayFilters": [ <filterdocument1>, ... ]
  }
}
So your operations would follow this pattern:
(async () => {
  let bulkOperations = [];
  const ids = products.map(({ id }) => id);
  const matchedProducts = await ProductDatabase.find({
    '_id': { '$in': ids }
  }).lean().exec();
  for (let product of products) {
    const [matchedProduct, ...rest] = matchedProducts.filter(
      p => p._id.toString() === product.id
    );
    const { _id, product_quantity } = matchedProduct;
    const changeInQuantity = product.product_quantity - product_quantity;
    if (changeInQuantity !== 0) {
      const stamps = { [(new Date()).getTime()]: changeInQuantity };
      bulkOperations.push({
        'updateOne': {
          'filter': { _id },
          'update': {
            '$inc': { 'product_quantity': changeInQuantity },
            '$push': { stamps }
          }
        }
      });
    }
  }
  const bulkResult = await ProductDatabase.bulkWrite(bulkOperations);
  console.log(bulkResult);
})();
You can use mongoose's findOneAndUpdate to update the existing value of a document.
"use strict";
const ids = products.map(x => x._id);
let operations = products.map(xlProductData => {
return ProductsDatabase.find({
_id: {
$in: ids
}
}).then(products => {
return products.map(productData => {
return ProductsDatabase.findOneAndUpdate({
_id: xlProductData.id // or product._id
}, {
sku: xlProductData.sku,
product_name: xlProductData.product_name,
product_cost: xlProductData.product_cost,
product_price: xlProductData.product_price,
Warehouse: xlProductData.Warehouse,
product_quantity: productData.product_quantity - xlProductData.product_quantity,
$push: {
stamps: {
[new Date().getTime()]: -1 * xlProductData.product_quantity
}
},
updated_at: new Date()
}, {
upsert: false,
returnNewDocument: true
});
});
});
});
Promise.all(operations).then(() => {
  console.log('All good');
}).catch(err => {
  console.log('err ', err);
});
