Query all channels with unread messages by a specific user? - getstream-io

I would like to retrieve all the channels that have unread messages for a specific user. From the docs, I was only able to find the count of unread messages and the count of channels with unread messages for the current user.

If you want to get all the channels where currentUser is a member and sort them by unread_count descending:
const result = await client.queryChannels(
  { members: { $in: [currentUser] } },
  { unread_count: -1 },
);
It is also possible to sort by has_unread (in this case the number of unread messages doesn't matter; any channel with unread messages weighs the same for sorting):
const result = await client.queryChannels(
  { members: { $in: [currentUser] } },
  { has_unread: -1, last_message_at: -1 },
);
Please take a look at our tests for more info.
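If you only need the channels that actually have unread messages, a simple follow-up is to filter the sorted result on the client (a sketch; countUnread() is assumed here to return the current user's unread count for a channel):
const channels = await client.queryChannels(
  { members: { $in: [currentUser] } },
  { has_unread: -1, last_message_at: -1 },
);
// keep only channels with at least one unread message for the current user
const unreadChannels = channels.filter((channel) => channel.countUnread() > 0);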

Related

MongoDB nodejs updateOne always returning modifiedCount: 0

I'm having an issue where I'm attempting to update my document and the change is not being reflected. I suspect MongoDB is finding that my value is somehow the same, even though I've changed it.
const User = require('./assets/models/User.js');
var message = user.messages;
// should be an empty array for right now, can be something like
// ['erin': [{ from: 'erin', to: 'erin', content: 'test' }]]
// in the future
if (!message[otherPerson]) message[otherPerson] = [];
await message[otherPerson].push(msg);
// where msg is a msg object
// pushes message into new index
// updates messages with new data
const test = await User.updateOne({ usertag: person }, {
  $set: { messages: message }
});
console.log(await test);
I've tried multiple formats of updating, such as
User.updateOne({ usertag: person }, {
  messages
});
where the messages variable is called message in the earlier example, or
User.updateOne({ usertag: person }, {
  $set: { messages }
});
and nothing seems to work.
I will also mention that this is some rather old code that used to work pretty well. Has something changed in how MongoDB handles updates, or am I doing something wrong?
If you want to add a new value to the messages array, you should use $push:
const test = await User.updateOne({ usertag: person }, {
  $push: { messages: msg }
});
If you want to edit a specific message, you should filter by its id and reference the specific array element with the positional $ operator (I'm assuming that _id is your identifier for message elements; note the dotted paths must be quoted to be valid JavaScript keys):
const test = await User.updateOne({ usertag: person, 'messages._id': msg._id }, {
  $set: {
    'messages.$.from': msg.from,
    'messages.$.to': msg.to,
    'messages.$.content': msg.content,
  }
});
Also, you should not await the test result, since you already resolved the Promise by awaiting the updateOne:
console.log(test);
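On MongoDB 3.6+ an alternative is arrayFilters, which targets the array element without repeating the condition in the query filter (a sketch reusing the field names above):
const test = await User.updateOne(
  { usertag: person },
  // update whichever array element the filter below matches
  { $set: { 'messages.$[elem].content': msg.content } },
  { arrayFilters: [{ 'elem._id': msg._id }] }
);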

MongoDB collection find method doesn't work when IDs are not in order

I am trying to get a document from a db collection depending on user IDs.
const chat = await Chat.findOne({ users });
Now this will work: "users": ["630200a45d22133dbe5bec44", "630200975d22133dbe5bec41"]
but this will not work: "users": ["630200975d22133dbe5bec41", "630200a45d22133dbe5bec44"]
Same IDs, just not in the same order.
You are looking for an exact match, so order matters. It seems what you want is $all, which checks that all the elements present in the input array exist in the document's array.
Additionally, you'll want to add a size check if you want to limit it to an exact match; otherwise documents like {"users": ["630200a45d22133dbe5bec44", "630200975d22133dbe5bec41", "otherid"]} will also be matched.
Overall like so:
const chat = await Chat.findOne({
  users: {
    $all: [
      "630200975d22133dbe5bec41",
      "630200a45d22133dbe5bec44"
    ]
  },
  "users.2": {
    $exists: false
  }
})
Or dynamically based on input size:
const input = [
  "630200975d22133dbe5bec41",
  "630200a45d22133dbe5bec44"
];
const sizeKey = `users.${input.length}`;
const chat = await Chat.findOne({
  users: {
    $all: input
  },
  [sizeKey]: {
    $exists: false
  }
})
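As an aside, the same exact-length constraint can be expressed with $size instead of the "users.N exists" trick (a sketch using the same input array):
const chat = await Chat.findOne({
  users: {
    $all: input,       // every input ID must be present
    $size: input.length // and the array must have exactly that many elements
  }
})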

Race condition between two requests in MongoDB

This is an eCommerce site and I'm using MongoDB as the database. Users can place orders, and each order can have multiple products. Product is a separate collection that contains the quantityLeft of each product. The situation is that when two concurrent requests try to buy the same product, the ordered items in the orders collection exceed the available quantity in the product collection.
Product Table
{
  _id: '56e33c56ddec541556a61763',
  name: 'Chocolate',
  quantityLeft: 1
}
In the product collection only 1 chocolate is left. If one request comes at a time, it works fine: the request comes in, checks the order quantity, and handles it if there's enough product available.
But when 2 requests come at exactly the same time, the issue occurs: both requests query the database to get the product, check quantityLeft, find that only 1 chocolate is available, pass the check that enough quantity is still present in inventory, and place the order. So 2 orders are placed in total, while the quantity we have is only 1.
Order Table
{
  _id: '60e33c56ddec541556a61595',
  items: [{
    _id: '56e33c56ddec541556a61763',
    quantity: 1
  }]
}
I tried to put both the query to get the product details and the order placement in the same transaction, but it doesn't work. Something like this:
const session = await mongoose.startSession({ defaultTransactionOptions: { readConcern: { level: 'local' }, writeConcern: { w: 1 } } })
await session.withTransaction(async () => {
  const promiseArray = order.items.map((item) => Product.findOne({ _id: item._id }, null, { session }))
  const products = await Promise.all(promiseArray)
  const productById = {}
  products.forEach((product) => {
    productById[product._id] = product
  })
  order.items.forEach((item) => {
    if (productById[item._id].quantityLeft < item.quantity) {
      throw new Error('Not enough quantity')
    }
  })
  await Order.create([order], { session })
}, { readConcern: { level: 'local' }, writeConcern: { w: 1 } });
I'm using Node.js (14.16) and MongoDB; the database npm package is mongoose (5.9).
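For reference, a common way to avoid this race without relying on transaction isolation is to make the stock check and the decrement one atomic update, so at most one of the concurrent requests can succeed (a sketch against the schema above, using Mongoose's findOneAndUpdate):
// Atomically decrement stock only if enough is left; with 1 chocolate in stock,
// exactly one of two concurrent requests matches and the other receives null.
const product = await Product.findOneAndUpdate(
  { _id: item._id, quantityLeft: { $gte: item.quantity } },
  { $inc: { quantityLeft: -item.quantity } },
  { new: true }
);
if (!product) {
  throw new Error('Not enough quantity');
}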

Which approach to choose for messenger architecture?

We are writing an instant messenger for private use in our application. The planned load of the current version is hundreds of users, maybe a thousand or two.
As the database I use Mongo. Messages are written to a messages collection (linked via chats_id to user_chats, and via uid to an external table in the users database). The arrival of a message on the client creates a flurry of events in response (mark read). It is clear that I need a queue. What is the best way to set one up? Are there methods to prioritize queues?
How many users should an idealized ("spherical, in a vacuum") Node process withstand, and how much memory should it use? I'm using pm2 for process management and node-ipc for broadcast.
Where should I turn, and whom should I ask, about optimizing the database requests? Right now there is a quickly written construction of the kind illustrated below. This example requests all chats; for each chat it selects the last message, the number of unread messages, and the users' last-seen records. In MySQL I would do it with one big request. What is the better way to do this in Mongo? My Mongo level is still intermediate.
As for the architecture itself: an action arrives from the client and goes to a handler that understands what to do with it and who needs to be notified about the action. Usually the handler notifies the corresponding manager, which writes to the database and, if necessary, notifies the other processes.
const loadUserChatsStatistic = async (chatIds, userId) => {
  const startExecTime = new Date();
  const userChats = await db()
    .collection("user_chats")
    .aggregate([
      { $match: { uid: userId, chats_id: { $in: chatIds.map(ObjectID) } } },
      {
        $lookup: {
          from: "chats",
          localField: "chats_id",
          foreignField: "_id",
          as: "chat"
        }
      },
      { $unwind: "$chat" },
      {
        $lookup: {
          from: "user_last_seen",
          localField: "chats_id",
          foreignField: "chats_id",
          as: "last_seens"
        }
      }
    ])
    .toArray();
  const chats = await Promise.all(
    userChats.map(async userChat => {
      const usersCount = await db()
        .collection("user_chats")
        .find({ chats_id: userChat.chats_id })
        .count();
      let unreadCountQuery = {};
      if (userChat.last_read) {
        unreadCountQuery = { _id: { $gt: userChat.last_read } };
      } else {
        unreadCountQuery = { _id: { $gt: userChat._id } };
      }
      const unreadCount = await db()
        .collection("messages")
        .find({ ...unreadCountQuery, chats_id: userChat.chats_id })
        .count();
      const message =
        (await db()
          .collection("messages")
          .findOne({ chats_id: userChat.chats_id }, { sort: { _id: -1 } })) ||
        {};
      return { ...userChat, usersCount, unreadCount, message };
    })
  );
  console.log(
    "Execution time for loadUserChatsStatistic",
    new Date() - startExecTime
  );
  // sort chats with the newest last message first; note the comparator must
  // return a number, not a boolean
  return chats.sort((a, b) => {
    if (!a.message._id) {
      return 1;
    }
    if (!b.message._id) {
      return -1;
    }
    return (
      ObjectID(b.message._id).getTimestamp() -
      ObjectID(a.message._id).getTimestamp()
    );
  });
};
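On the "one big request" question: one direction (a sketch, assuming MongoDB 3.6+ for $lookup with a pipeline, and reusing the collection and field names from the code above) is to fold the three per-chat follow-up queries into the same aggregation, so one round trip replaces the N+1 queries:
const userChats = await db()
  .collection("user_chats")
  .aggregate([
    { $match: { uid: userId, chats_id: { $in: chatIds.map(ObjectID) } } },
    // members count per chat
    {
      $lookup: {
        from: "user_chats",
        let: { chatId: "$chats_id" },
        pipeline: [
          { $match: { $expr: { $eq: ["$chats_id", "$$chatId"] } } },
          { $count: "count" }
        ],
        as: "usersCount"
      }
    },
    // last message per chat
    {
      $lookup: {
        from: "messages",
        let: { chatId: "$chats_id" },
        pipeline: [
          { $match: { $expr: { $eq: ["$chats_id", "$$chatId"] } } },
          { $sort: { _id: -1 } },
          { $limit: 1 }
        ],
        as: "message"
      }
    },
    // unread count per chat, relative to last_read (or the membership _id)
    {
      $lookup: {
        from: "messages",
        let: { chatId: "$chats_id", lastRead: { $ifNull: ["$last_read", "$_id"] } },
        pipeline: [
          { $match: { $expr: { $and: [
            { $eq: ["$chats_id", "$$chatId"] },
            { $gt: ["$_id", "$$lastRead"] }
          ] } } },
          { $count: "count" }
        ],
        as: "unreadCount"
      }
    }
  ])
  .toArray();
Each of usersCount, message, and unreadCount then comes back as a small array to unpack afterwards (e.g. usersCount[0] && usersCount[0].count).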

Mongoose subquery and append results to main query

I have been struggling with this question for months now and still have no solution.
Basically I have two MongoDB collections.
One is called Users and another is called Items.
One user can have multiple Items.
The User structure is simple:
Users = [{
  _id: 1,
  name: "Sam",
  email: "sam@gmail.com",
  group: "Rangers"
},
{
  _id: 2,
  name: "Michael",
  email: "michael@gmail.com",
  group: "Muse"
},
{
  _id: 3,
  name: "John",
  email: "john@gmail.com",
  group: "Merchant"
},
.....
]
The Items structures are as follows and each item is assigned to a user.
Items = [
  {
    _id: 1,
    user_id: 1,
    item_name: "Flying Sword",
    timestamp: ...
  },
  {
    _id: 3,
    user_id: 1,
    item_name: "Invisible Cloak",
    timestamp: ...
  },
  {
    _id: 4,
    user_id: 2,
    item_name: "Iron Shield"
  },
  {
    _id: 5,
    user_id: 7,
    item_name: "Splashing Gun",
    timestamp: ...
  },
  ...
]
I want to run a Mongoose query that queries the users as the primary objects.
Upon the user results returning, I want to query all the Items objects for the filtered users and append them as subdocuments to each user object previously queried.
For example, I want to query:
Users.find({group: "Muse"}, function(err, users){
  // I DON'T KNOW WHAT TO WRITE INSIDE
})
Basically the results should be:
[
  {
    _id: 4,
    name: "Jack",
    email: "jack@gmail.com",
    group: "Muse",
    items: [
      {
        _id: 8,
        name: "Magic Wand",
        user_id: 4,
        timestamp: ...
      },
      {
        _id: 12,
        name: "Blue Potion",
        user_id: 4,
        timestamp: ...
      },
      {
        _id: 18,
        name: "Teleportation Scroll",
        user_id: 4,
        timestamp: ...
      }
    ]
  }
  ..... more users of similar structure
]
Each user should return a maximum of three items, sorted by timestamp.
Thanks in advance; I have tried so many times and failed.
This is a multi-step question, so let's list out the steps:
Get a list of user documents that match a particular group.
Get a list of item documents that are assigned to each matched user from step 1.
Assign the appropriate item documents to a new property on the corresponding user document.
This can be tackled a few ways. A first pass might be to retrieve all the user documents and then iterate over them in memory, retrieving the list of item documents for each user and appending that list to the user document. If your lists are smallish this shouldn't be too much of an issue, but as scale comes into play and this becomes a larger list, it could become a memory hog.
NOTE: all of the following code is untested so it might have typos or the like.
Users.find({group: "Muse"}, function (err, users) {
  var userIDs;
  if (err) {
    // do error handling
    return;
  }
  userIDs = users.map(function (user) { return user._id; });
  Items.find({user_id: {$in: userIDs}}, function (err, items) {
    if (err) {
      // do error handling
      return;
    }
    users.forEach(function (user) {
      user.items = items.filter(function (item) {
        // compare as strings: ObjectIDs are objects, so === would always be false
        return String(item.user_id) === String(user._id);
      });
    });
    // do something with modified users object
  });
});
While this will solve the problem, there are plenty of improvements that can be made to make it a bit more performant as well as "cleaner".
For instance, let's use promises, since this involves async operations anyway. Assume Mongoose is configured to use the native Promise object or a then/catch-compliant library:
Users.find({group: "Muse"}).exec().then(function (users) {
  var userIDs = users.map(function (user) {
    return user._id;
  });
  // returns a promise
  return Promise.all([
    // include users for the next `then`
    // avoids having to store it outside the scope of the handlers
    users,
    Items.find({
      user_id: {
        $in: userIDs
      }
    }).exec()
  ]);
}).then(function (results) {
  var users = results[0];
  var items = results[1];
  users.forEach(function (user) {
    user.items = items.filter(function (item) {
      // compare as strings, as in the callback version above
      return String(item.user_id) === String(user._id);
    });
  });
  return users;
}).catch(function (err) {
  // do something with errors from either find
});
This makes it subjectively a bit more readable, but doesn't really help much, since we are still doing a lot of manipulation in memory. Again, this might not be a concern if the document collections are smallish. However, if it is, there is a tradeoff that can be made by breaking up the request for items into one per user, thus only working on chunks of the item list at a time.
We will also use Bluebird's map to limit the number of concurrent requests for items.
var bluebird = require('bluebird');

Users.find({group: "Muse"}).exec().then(function (users) {
  return bluebird.map(users, function (user) {
    return Items.find({user_id: user._id}).exec().then(function (items) {
      user.items = items;
      return user;
    });
  }, {concurrency: 5});
}).then(function (users) {
  // do something with users
}).catch(function (err) {
  // do something with errors from either find
});
This limits the amount of in memory manipulation for items but still leaves us iterating over users in memory. That can be tackled as well by using mongoose streams but I will leave that up to you to explore on your own (there are also other questions already on SO on how to use streams).
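None of the snippets above enforce the "maximum of three items sorted by timestamp" requirement; on MongoDB 3.6+ that can be pushed to the server with a $lookup sub-pipeline (a sketch; the items collection name and field names are assumed from the question):
Users.aggregate([
  { $match: { group: "Muse" } },
  {
    $lookup: {
      from: "items", // assumed physical collection name behind the Items model
      let: { userId: "$_id" },
      pipeline: [
        { $match: { $expr: { $eq: ["$user_id", "$$userId"] } } },
        { $sort: { timestamp: -1 } },
        { $limit: 3 }
      ],
      as: "items"
    }
  }
]).exec().then(function (users) {
  // each user document now carries its three newest items
}).catch(function (err) {
  // do something with the error
});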
