I have a Node.js API server deployed in a Kubernetes cluster.
Users can send in bids on auction items.
To prevent one bid from overwriting another, there needs to be some synchronization.
I am considering the following options for handling an incoming bid:
start a transaction that reads the current bid, compares it to the incoming bid, and updates the record
create an aggregation that does the same as above
I don't know which way to go. I also understand that you need to lock the document with either an IX or an X lock.
For an RDBMS you would open a transaction that locks the record and releases it after the update, but I don't know how this works in MongoDB.
Product.findById(productId)
  .then(productmatch => {
    if (productmatch.currentPrice > price) throw Error('no go')
    const bid = new Bid({
      price,
      date: Date.now(),
      user: user._id
    })
    return Product.findByIdAndUpdate(productId, {
      $push: {
        bids: bid
      },
      currentPrice: price,
      currentUser: user,
      currentBid: bid
    }, {
      new: true
    })
      .then(product => {
        if (!product) throw Error(`no go`)
        return bid._id.toString()
      })
  })
After a little more research I came up with the solution below. I don't know if it's 100% reliable, but I believe this method effectively locks the document: since a single findOneAndUpdate is atomic, no other write can land between the query and the update.
var query = {
    _id: productId,
    closed: false,
    currentPrice: {
      $lt: price
    }
  },
  update = {
    $push: {
      bids: bid
    },
    currentPrice: price,
    currentUser: user,
    currentBid: bid
  },
  options = {
    new: true
  };

return Product.findOneAndUpdate(query, update, options)
  .then(product => {
    if (!product) throw Error(`no product found with id ${productId}`)
    return bid._id.toString()
  })
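A quick way to convince yourself this is safe (a hypothetical check of my own, not part of the original post): fire two conflicting updates concurrently. Because single-document updates are atomic and the filter requires currentPrice: { $lt: price }, at most one of them can match; the other resolves to null.

// Hypothetical sanity check: two equal concurrent bids on the same product.
// The conditional filter lets at most one of the atomic updates match.
const [a, b] = await Promise.all([
  Product.findOneAndUpdate(
    { _id: productId, closed: false, currentPrice: { $lt: 100 } },
    { currentPrice: 100 },
    { new: true }
  ),
  Product.findOneAndUpdate(
    { _id: productId, closed: false, currentPrice: { $lt: 100 } },
    { currentPrice: 100 },
    { new: true }
  ),
]);
// Exactly one of `a` / `b` is the updated document; the other is null.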
I have a fairly complex data structure and logic where I'm using many findOneAndUpdates for atomicity, and the whole process can only succeed if there are no errors. If there are errors, all changes need to be rolled back.
My application is not specifically this, but it demonstrates the problem. Let's say it's an e-commerce system and two people are looking at the same items: the same sneakers and t-shirt, for example, but only one of each is available (let's say there's no basket, so we only learn about these orders when they come in).
The rule is that an order can only succeed if, after subtracting the ordered amounts from the available inventories, every remaining amount is 0 or greater. So the order is either completely fulfilled or not fulfilled at all.
This is my first time playing with transactions, and I can't quite wrap my head around them (the more I read about them, the more unanswered questions I have, and the more confused I become).
I was playing with the thought of what happens if the following occurs, in this order:
first transaction starts
second transaction starts
first transaction updates items
second transaction updates items
first transaction commits
second transaction commits
This is example code of how it would look:
type OrderItem = {
  itemId: string;
  amount: number; // was `string`, but the amount is used arithmetically below
};

type Order = OrderItem[];

const makeOrder = async (order: Order) => {
  const session = await startSession();
  session.startTransaction();
  let orderSuccessful = true;
  await Promise.all(
    order.map(async ({ itemId, amount }) => {
      const updatedItem = await ItemModel.findByIdAndUpdate(
        itemId,
        { $inc: { amount: -amount } },
        { new: true, session },
      );
      if (updatedItem.amount < 0) orderSuccessful = false;
    }),
  );
  if (!orderSuccessful) {
    await session.abortTransaction();
    await session.endSession();
    throw 'Items missing from order.';
  }
  await session.commitTransaction();
  await session.endSession();
};
Two orders come in, both of this form: [ { itemId: 'sneakers', amount: 1 }, { itemId: 'tShirt', amount: 1 } ], and that is exactly how much inventory we have in the database.
So basically there would be two sessions in parallel, and the changes would only be reflected once the transactions are committed.
My worry was that because neither transaction is committed yet and the sessions "don't know about each other", both transactions would see, at the time of the findOneAndUpdate, that one of each item is still available. I would then basically "lose" the benefit of the atomic update: even though there's no gap between read and update, that is only true within a single session.
I did some playing around and realised that this is not the case.
console.time('until timeout to update outside session');
new Promise((resolve) => {
  setTimeout(async () => {
    console.timeEnd('until timeout to update outside session'); // 2nd
    console.time('updating the same order outside session');
    const updatedOutsideOrder = await OrderModel.findByIdAndUpdate(
      order._id,
      {
        $inc: { value: -1 },
      },
      { new: true },
    ).exec();
    console.timeEnd('updating the same order outside session'); // 5th
    console.log('updatedOutsideOrder', updatedOutsideOrder); // 6th
    resolve(true);
  }, 1000);
});

const session = await startSession();
session.startTransaction();

const updatedInsideOrder = await OrderModel.findByIdAndUpdate(
  order._id,
  {
    $inc: { value: -1 },
  },
  { new: true, session },
).exec();
console.log('updatedInsideOrder', updatedInsideOrder); // 1st

await new Promise((resolve) => {
  setTimeout(() => {
    resolve(true);
    console.log('timeout to make sure the update outside finishes before commit'); // 3rd
  }, 5000);
});

await session.commitTransaction();
await session.endSession();

console.log(
  'orderAfter transaction',
  await OrderModel.findById(order._id).exec(),
); // 4th
I was surprised to notice that the update outside the session actually waits while the transaction is in progress. I guess the document is "locked".
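For what it's worth, this matches the documented behaviour: a write inside a transaction takes a document-level lock, a non-transactional write on the same document blocks until the transaction commits or aborts, and a write from a second transaction fails fast with a WriteConflict (a TransientTransactionError) instead of blocking. This happens on the server, not in mongoose. The usual way to deal with the transient errors is session.withTransaction, which retries the callback for you; a minimal sketch, reusing ItemModel, itemId and amount from the earlier example:

const session = await startSession();
try {
  // withTransaction retries the callback on TransientTransactionError,
  // so a writer that loses a conflict re-runs against the committed state.
  await session.withTransaction(async () => {
    const updatedItem = await ItemModel.findByIdAndUpdate(
      itemId,
      { $inc: { amount: -amount } },
      { new: true, session },
    );
    // Throwing inside the callback aborts the whole transaction.
    if (!updatedItem || updatedItem.amount < 0) {
      throw new Error('Items missing from order.');
    }
  });
} finally {
  await session.endSession();
}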
This has raised a lot of questions.
What if multiple instances of the API are deployed, and mongoose doesn't know about the sessions in the other instances?
If the answer to the previous question is that it isn't mongoose, but the database signalling to mongoose that the document is currently locked, how will that work once the database needs to be available across the whole world?
The most important question: what will this look like when there are thousands of orders per second? If a whole transaction takes more than a millisecond, the delay between requests will keep growing over time.
I've been going round and round on this problem for months and cannot find a solution, so any help would be appreciated.
My knowledge of Node.js and MongoDB is non-existent (I just need to do a small code update for a friend) and I'm stuck.
I need to update a single document inside a collection via a unique id, but can't seem to do it.
Here's the model (I've trimmed it down and cut out all unnecessary data). I'm trying to update the notes field inside a transaction.
In short, each entry in the given table (an Agent) has a collection of multiple Transactions & Documents. I need to update a specific Transaction via the auto-generated unique _id.
import { Schema, model } from 'mongoose';

interface Transaction {
  first_name: string;
  last_name: string;
  type: string;
  notes: string;
}

interface Agent {
  org_id: number;
  transactions: Array<Transaction>;
  documents: Array<string>;
}

const transactionSchema = new Schema<Transaction>({
  first_name: { type: String },
  last_name: { type: String },
  type: { type: String },
  notes: String,
});

const transactionsSchema = new Schema<Agent>({
  org_id: { type: Number },
  transactions: [transactionSchema],
  documents: [documentTypesSchema], // defined elsewhere; trimmed from this excerpt
});

const AgentTransaction = model<Agent>(
  'agent_transaction_table',
  transactionsSchema
);

export default AgentTransaction;
Here's what I tried, which didn't work (obviously); again, I've trimmed out all unnecessary data. Just to clarify: the endpoint itself works, but the DB update does not.
import AgentTransaction from '../models/transaction'; // the above model

transaction.put('/notes', async (req, res) => {
  const { org_id, transaction_id, notes } = req.body;
  try {
    const notesResult = await AgentTransaction.updateOne({
      'transactions._id': transaction_id,
    }, {
      $set: {
        'notes': notes
      },
    });
    res
      .status(200)
      .json({ message: 'Updated', success: true, notesResult });
  } catch (error) {
    res.status(400).send(error);
  }
});
So I figured it out. Maybe it'll help someone else as well.
const notesResult = await AgentTransaction.updateOne({
  'transactions._id': { $in: [transaction_id] },
}, {
  $set: {
    'transactions.$.notes': notes
  },
});
The main issue was that the update needed to target the array field + the positional operator ($) + the subdocument field ('transactions.$.notes'), not just the field name on its own.
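An equivalent, slightly more explicit alternative (a sketch of my own, assuming Mongoose 5.2+ / MongoDB 3.6+; it is not what the original fix used) is arrayFilters, where a named placeholder selects the matching array element:

// Same update via arrayFilters: `t` names the matching transactions element.
const notesResult = await AgentTransaction.updateOne(
  { 'transactions._id': transaction_id },
  { $set: { 'transactions.$[t].notes': notes } },
  { arrayFilters: [{ 't._id': transaction_id }] }
);

Unlike the positional $, which only touches the first match, $[t] updates every element that satisfies the filter.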
This is an e-commerce site and I'm using MongoDB as the database. Users can place orders, and each order can have multiple products. Product is a separate collection that contains the quantityLeft of each product. When two concurrent requests try to buy the same product, the ordered items in the orders collection can exceed the available quantity in the product collection.
Product Table
{
  _id: '56e33c56ddec541556a61763',
  name: 'Chocolate',
  quantityLeft: 1
}
In the product collection only 1 chocolate is left. If one request comes at a time it works fine: the request arrives, we check the ordered quantity, and handle it if there's enough product available.
But when 2 requests come in at exactly the same time, the issue occurs: both requests query the database for the product, check quantityLeft, find that 1 chocolate is available, pass the check that enough quantity is still in inventory, and place the order. So 2 orders are placed while the quantity we have is only 1.
Order Table
{
  _id: '60e33c56ddec541556a61595',
  items: [{
    _id: '56e33c56ddec541556a61763',
    quantity: 1
  }]
}
I tried to put both the query that gets the Product details and the one that places the order in the same transaction, but it doesn't work. Something like this:
const session = await mongoose.startSession({ defaultTransactionOptions: { readConcern: { level: 'local' }, writeConcern: { w: 1 } } })

await session.withTransaction(async () => {
  // fetch each ordered product inside the session so the reads are part of the transaction
  const promiseArray = order.items.map((item) => Product.findOne({ _id: item._id }).session(session))
  const products = await Promise.all(promiseArray)
  const productById = {}
  products.forEach((product) => {
    productById[product._id] = product
  })
  order.items.forEach((item) => {
    if (productById[item._id].quantityLeft < item.quantity) {
      throw new Error('Not enough quantity')
    }
  })
  // Model.create() only accepts options when given an array of docs
  await Order.create([order], { session })
}, { readConcern: { level: 'local' }, writeConcern: { w: 1 } });
I'm using Node.js (14.16) and MongoDB as the database; the npm package is mongoose (5.9).
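A common way to close this gap without a transaction (a sketch of my own, not from the original post, using the Product model and one item from order.items) is to make the stock check and the decrement a single atomic update: only match the product if enough stock is left, and decrement in the same operation.

// Check-and-decrement as one atomic update per ordered item.
const updated = await Product.findOneAndUpdate(
  { _id: item._id, quantityLeft: { $gte: item.quantity } }, // match only if enough stock
  { $inc: { quantityLeft: -item.quantity } },
  { new: true }
)
if (!updated) {
  // Out of stock: undo any decrements already applied for this order,
  // or run the per-item updates inside a transaction and abort here.
  throw new Error('Not enough quantity')
}

With two concurrent orders for the last chocolate, only one update can match the quantityLeft filter, so the second order fails cleanly instead of overselling.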
Background: I'm developing an app that shows analytics for inventory management.
It gets an office EXCEL file uploaded, and as the file uploads the app converts it to an array of JSONs. Then it compares each JSON object with the objects in the DB, changes its quantity according to the XLS file, and adds a timestamp to the stamps array, which contains the changes in quantity.
For example:
{"_id":"5c3f531baf4fe3182cf4f1f2",
"sku":123456,
"product_name":"Example",
"product_cost":10,
"product_price":60,
"product_quantity":100,
"Warehouse":4,
"stamps":[]
}
After the XLS upload, let's say we sold 10 units, it should look like this:
{"_id":"5c3f531baf4fe3182cf4f1f2",
"sku":123456,
"product_name":"Example",
"product_cost":10,
"product_price":60,
"product_quantity":90,
"Warehouse":4,
"stamps":[{"1548147562": -10}]
}
Right now I can't find the right MongoDB commands to do it. I'm developing in Node.js and Angular; I would love to read some ideas.
for (let i = 0; i < products.length; i++) {
  ProductsDatabase.findOneAndUpdate(
    { "_id": products[i]['id'] },
    // CHANGE QUANTITY AND ADD A STAMP
    ...
  )
}
You would need two operations here. The first is to get an array of documents from the db that match the ones in the JSON array. From that list you compare the 'product_quantity' keys and, if there is a change, build a new array of objects with the product id and the change in quantity.
The second operation is an update which uses this new array with the change in quantity for each matching product.
Armed with this new array of updated product properties, it would be ideal to use a bulk update, as looping through the list and sending each update request to the server can be computationally costly.
Consider using the bulkWrite method, which is on the model. This accepts an array of write operations and executes each of them; a typical update operation for your use case would have the following structure:
{
  updateOne: {
    "filter": <document>,
    "update": <document>,
    "upsert": <boolean>,
    "collation": <document>,
    "arrayFilters": [ <filterdocument1>, ... ]
  }
}
So your operations would follow this pattern:
(async () => {
  let bulkOperations = []
  const ids = products.map(({ id }) => id)
  const matchedProducts = await ProductsDatabase.find({
    '_id': { '$in': ids }
  }).lean().exec()

  for (const product of products) {
    // _id is an ObjectId, so compare its string form with the incoming id
    const [matchedProduct, ...rest] = matchedProducts.filter(p => p._id.toString() === product.id)
    const { _id, product_quantity } = matchedProduct
    const changeInQuantity = product.product_quantity - product_quantity
    if (changeInQuantity !== 0) {
      const stamps = { [(new Date()).getTime()]: changeInQuantity }
      bulkOperations.push({
        'updateOne': {
          'filter': { _id },
          'update': {
            '$inc': { 'product_quantity': changeInQuantity },
            '$push': { stamps }
          }
        }
      })
    }
  }

  const bulkResult = await ProductsDatabase.bulkWrite(bulkOperations)
  console.log(bulkResult)
})()
You can use mongoose's findOneAndUpdate to update the existing value of a document.
"use strict";
const ids = products.map(x => x._id);
let operations = products.map(xlProductData => {
return ProductsDatabase.find({
_id: {
$in: ids
}
}).then(products => {
return products.map(productData => {
return ProductsDatabase.findOneAndUpdate({
_id: xlProductData.id // or product._id
}, {
sku: xlProductData.sku,
product_name: xlProductData.product_name,
product_cost: xlProductData.product_cost,
product_price: xlProductData.product_price,
Warehouse: xlProductData.Warehouse,
product_quantity: productData.product_quantity - xlProductData.product_quantity,
$push: {
stamps: {
[new Date().getTime()]: -1 * xlProductData.product_quantity
}
},
updated_at: new Date()
}, {
upsert: false,
returnNewDocument: true
});
});
});
});
Promise.all(operations).then(() => {
console.log('All good');
}).catch(err => {
console.log('err ', err);
});
I am using sub-documents in my MEAN project, to handle orders and items per order.
These are my (simplified) schemas:
var itemPerOrderSchema = new mongoose.Schema({
  itemId: String,
  count: Number
});

var OrderSchema = new mongoose.Schema({
  customerId: String,
  date: String,
  items: [ itemPerOrderSchema ]
});
To insert items in itemPerOrderSchema array I currently do:
var orderId = '123';
var item = { itemId: 'xyz', itemsCount: 7 };

Order.findOne({ id: orderId }, function(err, order) {
  order.items.push(item);
  order.save();
});
The problem is that I obviously want one item per itemId, and this way I obtain many sub-documents per item...
One solution could be to loop through all order.items, but this is not optimal, of course (order.items could be many...).
The same problem could arise when querying order.items...
The question is: how do I insert items into the itemPerOrderSchema array without having to loop through all the items already on the order?
If you can use an object instead of an array for items, maybe you can change your schema a bit to allow a single-query update.
Something like this:
{
  customerId: 123,
  items: {
    xyz: 14,
    ds2: 7
  }
}
So, each itemId is a key in an object, not an element of the array.
let OrderSchema = new mongoose.Schema({
  customerId: String,
  date: String,
  items: mongoose.Schema.Types.Mixed
});
Then updating your order is super simple. Let's say you want to add 3 of item 'xyz' for customer 123.
db.orders.update({
  customerId: 123
}, {
  $inc: {
    'items.xyz': 3
  }
}, {
  upsert: true
});
Passing upsert here creates the order even if the customer doesn't have an entry yet.
The downsides of this:
if you use the aggregation framework, it is either impossible to iterate over your items or, if you have a limited, known set of itemIds, very verbose. You could solve that with mapReduce, which can be a little slower depending on how many documents you have, so YMMV.
you do not have a clean items array on the client. You could fix that with the client extracting the info (a simple let items = Object.keys(order.items).map(key => ({ [key]: order.items[key] }));), or with a mongoose virtual field or schema.path() (see the sketch below), but this is probably another question, already answered.
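For completeness, a minimal sketch of such a virtual (my own illustration, not from the original answer): it derives an array view from the items object whenever the document is read.

// Hypothetical virtual: expose the items object as an array of
// { itemId, count } pairs for the client.
OrderSchema.virtual('itemList').get(function () {
  return Object.keys(this.items || {}).map(itemId => ({
    itemId,
    count: this.items[itemId]
  }));
});

Remember to enable virtuals during serialization (e.g. OrderSchema.set('toJSON', { virtuals: true })) if the client consumes JSON.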
First of all, you probably need to add orderId to your itemPerOrderSchema, because the combination of orderId and itemId is what makes a record unique.
Assuming that orderId is added to itemPerOrderSchema, I would suggest the following implementation:
function addItemToOrder(orderId, newItem, callback) {
  Order.findOne({ id: orderId }, function(err, order) {
    if (err) {
      return callback(err);
    }
    ItemPerOrder.findOne({ orderId: orderId, itemId: newItem.itemId }, function(err, existingItem) {
      if (err) {
        return callback(err);
      }
      if (!existingItem) {
        // there is no such item for this order yet, adding a new one
        order.items.push(newItem);
        order.save(function(err) {
          return callback(err);
        });
        return; // don't fall through to the update below
      }
      // there is already an item with this itemId for this order, updating itemsCount
      ItemPerOrder.update(
        { _id: existingItem._id },
        { $inc: { itemsCount: newItem.itemsCount } }, function(err) {
          return callback(err);
        }
      );
    });
  });
}
addItemToOrder('123', { itemId: '1', itemsCount: 7 }, function(err) {
  if (err) {
    console.log("Error", err);
    return;
  }
  console.log("Item successfully added to order");
});
Hope this helps.
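Another common pattern keeps everything in the Order document and avoids the extra lookup entirely (a sketch of my own, assuming a recent Mongoose where the updateOne result exposes matchedCount): first try an atomic $inc on the matching array element, and only $push a new subdocument when nothing matched.

// Step 1: if the item already exists on the order, bump its count atomically.
// Step 2: otherwise append it as a new subdocument.
async function addItemToOrder(orderId, item) {
  const res = await Order.updateOne(
    { _id: orderId, 'items.itemId': item.itemId },
    { $inc: { 'items.$.count': item.count } }
  );
  if (res.matchedCount === 0) {
    await Order.updateOne(
      { _id: orderId },
      { $push: { items: item } }
    );
  }
}

Two concurrent first-time inserts of the same itemId can still both fall through to the $push, so this suits low-contention writes; the object-keyed items shape from the earlier answer avoids even that, since a single $inc upsert covers both cases.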