Node.js mssql module transaction not working with async await

config = {
  user: process.env.PROD_USER,
  password: process.env.PROD_PASSWORD,
  server: process.env.PROD_SERVER,
  database: process.env.PROD_DATABASE,
  options: {
    abortTransactionOnError: true, // <-- SET XACT_ABORT ON
  },
}
const pool = await sql.connect(config);
const transaction = new sql.Transaction();
await transaction.begin();
const result = await this.get_sp_data(
  data[0],
  sp.InsertTransactionMaster,
  res
);
const masterId = result.recordset[0].MasterId || 0;
if (masterId) {
  asyncForeach(req.body.TransactionDetail, async (value) => {
    const detailData = {
      MasterId: masterId,
      SubServiceId: value.SubServiceId,
      Rate: value.Rate,
      Quantity: value.Quantity,
      Amount: value.Amount,
      CreatedBy: req.user.id || 0,
      MemberId: value.MemberId,
      SubMemberIddd: value.SubMemberId || null, // deliberately misspelled key to force an error
    };
    await this.get_sp_data(detailData, sp.InsertTransactionDetail, res);
  });
}
await transaction.commit();
console.dir('Transaction committed.');
My custom code between begin and commit executes a stored procedure that inserts the master data; the master id returned by that insert is then used in a loop to insert multiple detail rows.
I have deliberately placed erroneous code in the detail insert, but the transaction is not rolling back: as a result the master data is still inserted, and only the detail insert reports an error.
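For context: in the mssql module a transaction only covers requests that are explicitly created from it. Statements issued through the pool (as a helper like get_sp_data presumably does) run outside the transaction, so a failure in them cannot trigger a rollback; note also that the asyncForeach call above is not awaited, so commit can run before the detail inserts finish. A minimal sketch of the intended pattern (the procedure names come from the question; the parameter bindings are assumptions):

const pool = await sql.connect(config);
const transaction = new sql.Transaction(pool); // bind the transaction to the pool
try {
  await transaction.begin();
  // Requests must be constructed from the transaction, not from the pool
  const masterRequest = new sql.Request(transaction);
  const result = await masterRequest.execute('InsertTransactionMaster');
  const masterId = result.recordset[0].MasterId || 0;
  for (const value of req.body.TransactionDetail) {
    const detailRequest = new sql.Request(transaction);
    detailRequest.input('MasterId', sql.Int, masterId); // parameter names are assumptions
    await detailRequest.execute('InsertTransactionDetail');
  }
  await transaction.commit();
} catch (err) {
  await transaction.rollback(); // undoes the master insert as well
  throw err;
}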

Related

Mongo DB Update data

I want to decrease the previous quantity by 1. How can I do this in Node.js with MongoDB?
Here is my code:
app.put('/quantityUpdate', async (req, res) => {
  const id = req?.body?.id;
  const dec = req?.body?.dec;
  const filter = { _id: ObjectId(id) }
  // this option instructs the method to create a document if no documents match the filter
  const options = { upsert: true };
  const updateDoc = {
    $set: {
      quantity: // I'm stuck in this place
    },
  };
  const result = await products.updateOne(filter, updateDoc, options);
  return res.send(result);
})
Instead of $set, use $inc. It increments a field by the specified value.
To decrease the value by 1, change your code to:
const updateDoc = { $inc: { quantity: -1 } }
For more details, check out the documentation.
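In the context of the handler, the update could then look like the following sketch (the quantity guard in the filter is an added assumption so the value cannot drop below zero; upsert is dropped because upserting with $inc would create a new document with quantity: -1 when nothing matches):

const filter = { _id: ObjectId(id), quantity: { $gt: 0 } }; // guard is an assumption
const updateDoc = { $inc: { quantity: -1 } };
const result = await products.updateOne(filter, updateDoc);
return res.send(result);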

How to add an object to an array of objects in Nodejs?

I'm creating a backend for my React web application and I'm trying to subscribe a user to a match. The match is an object that has an array called "players", and when I click the join button the username and profilePicture of the user are dispatched to my backend. The first user's info is sent perfectly, but when a second user subscribes, the first user's info is replaced by the second's.
This is my function that pushes the data:
const playerJoined = async (req, res) => {
  const torneoId = req.params.id;
  const uid = req.uid;
  const profilePicture = req.profilePicture;
  const username = req.username;
  console.log(req.params);
  try {
    const torneo = await Torneo.findById(torneoId);
    if (!torneo) {
      return res.status(404).json({
        ok: false,
        msg: "Torneo no existe por ese ID",
      });
    }
    const newPlayer = {
      profilePicture: profilePicture,
      username: username,
    };
    const nuevoTorneo = {
      ...req.body,
      players: newPlayer,
    };
    const torneoActualizado = await Torneo.findByIdAndUpdate(
      torneoId,
      nuevoTorneo,
      {
        new: true,
      }
    );
    res.json({
      ok: true,
      torneo: torneoActualizado,
    });
  } catch (error) {
    console.log(error);
    res.status(500).json({
      ok: false,
      msg: "Hable con el administrador",
    });
  }
};
My frontend is working well, because when I add more users the array of objects shows all the players, like this:
players: (2) [{…}, {…}]
But my MongoDB document only shows the last user's info, as I mentioned before.
I really appreciate any help.
You seem to be replacing the players property instead of pushing into it.
const nuevoTorneo = {
  ...req.body,
  players: newPlayer,
};
When you grab the torneo by id, you should have access to that players property already, so spread that array into your nuevoTorneo as well:
const nuevoTorneo = {
  ...req.body,
  players: [...torneo.players, newPlayer],
};
This happens because you always put your newPlayer into the "players" field of nuevoTorneo and update the same document with it. I assume you are using Mongoose; you can simply modify the torneo document after your query and do something like this:

const torneo = await Torneo.findById(torneoId);
const newPlayer = {
  profilePicture: profilePicture,
  username: username,
};
torneo.players.push(newPlayer);
await torneo.save();

Or simply modify your code as:

const nuevoTorneo = {
  ...req.body,
  players: [...torneo.players, newPlayer],
};
I recommend the first method, let me know if you have any questions.
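As a further sketch (assuming Mongoose), you could also let MongoDB append the player atomically with the $push update operator, which avoids reading the document first:

const torneoActualizado = await Torneo.findByIdAndUpdate(
  torneoId,
  { $push: { players: newPlayer } }, // appends to the array instead of replacing it
  { new: true }
);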

Handling concurrent request that finds and update the same resource in Node Js & Mongo DB?

I have a function in Node that runs after clicking the checkout button. It checks the availability of the items in the cart, and if an item is available it deducts it from the inventory.
I'm currently testing with two users clicking the checkout button at the same time. Both users have the exact same content in their cart (10 apples each), which gives a total of 20 apples, but there are only 10 apples in inventory.
If an item is not available, it should return an error to the user, but both orders are going through.
NOTE: This works if there is a 1 second delay between the clicks.
What can I do to prevent this?
// Check if items in inventory
const availability = await checkInventory(store, cart, seller);
if (!availability.success) {
  return res.status(400).json({
    success: false,
    type: 'unavailable',
    errors: availability.errors,
  });
}

// Deduct Inventory
const inventory = await deductInventory(store, seller, cart);
if (!inventory) {
  return next(new ErrorResponse('Server Error', 500));
}
checkInventory
exports.checkInventory = asyncHandler(async (store, cart, seller) => {
  let isAvailable = true;
  const unavailableProducts = [];
  const inventory = await Inventory.find({
    $and: [
      {
        store: store,
        user: seller,
      },
    ],
  });
  const products = inventory[0].products;
  cart.forEach((item) => {
    const product = products.find(
      (product) => product._id.toString() === item.productId
    );
    if (!item.hasvariation) {
      if (product.stock < item.qty) {
        isAvailable = false;
        unavailableProducts.push(
          `${item.title} is not available, only ${product.stock} left available`
        );
      }
    }
    if (item.hasvariation) {
      const variation = product.variations.find(
        (variation) => variation._id.toString() === item.variationId
      );
      const option = variation.options.find(
        (option) => option._id.toString() === item.optionId
      );
      if (option.stock < item.qty) {
        isAvailable = false;
        unavailableProducts.push(
          `${item.title} is not available, only ${product.stock} left available`
        );
      }
    }
  });
  return {
    success: isAvailable,
    errors: unavailableProducts,
  };
});
deductInventory
exports.deductInventory = asyncHandler(async (store, seller, cart) => {
  const inventory = await Inventory.findOne({
    $and: [
      {
        store: store,
        user: seller,
      },
    ],
  });
  const products = inventory.products;
  cart.forEach((item) => {
    const product = products.find(
      (product) => product._id.toString() === item.productId
    );
    if (!item.hasvariation) {
      product.stock = product.stock - item.qty;
    }
    if (item.hasvariation) {
      const variation = product.variations.find(
        (variation) => variation._id.toString() === item.variationId
      );
      const option = variation.options.find(
        (option) => option._id.toString() === item.optionId
      );
      option.stock = option.stock - item.qty;
    }
  });
  const saveInventory = await Inventory.findOneAndUpdate(
    {
      $and: [
        {
          store: store,
          user: seller,
        },
      ],
    },
    {
      $set: { products: products },
    },
    { new: true, runValidator: true }
  );
  if (!saveInventory) {
    return {
      success: false,
      errors: ['Server Error'],
    };
  }
  return {
    success: true,
  };
});
The problem is that the two checkout calls run at (almost) the same time and your routine is not thread-safe. Both calls read a copy of the inventory data into memory, so both see products.stock = 10. Based on that local info you check and compute the new amount in your function (stock - qty) and use an update query to set it as a fixed value, so both calls update products.stock to 0, resulting in your concurrency issues.
What you should do is let MongoDB handle the concurrency for you.
There are several ways to handle concurrency, but you could, for example, use $inc to decrease the stock amount directly in Mongo. That way the stock amount in the db can never be wrong.

result = await update({ stock: { $gte: 10 } }, { $inc: { stock: -10 } })

Because the filter was added to the query, the stock amount can never drop below 0, and you can now check the result of the update call to see whether it modified any documents. If it did not (result.nModified == 0), you know the inventory was too low and you can report that back to the user.
https://docs.mongodb.com/manual/reference/operator/update/inc/
https://docs.mongodb.com/manual/reference/method/db.collection.update/#std-label-writeresults-update
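A minimal sketch of that approach applied to the embedded products array from the question (the variable names and the use of $elemMatch with the positional $ operator are assumptions based on the schema shown; modifiedCount is the field returned by recent driver versions, nModified by older ones):

// Match the seller's inventory only if this product still has enough stock,
// and decrement atomically in the same operation
const result = await Inventory.updateOne(
  {
    store: store,
    user: seller,
    products: { $elemMatch: { _id: item.productId, stock: { $gte: item.qty } } },
  },
  { $inc: { 'products.$.stock': -item.qty } }
);
if (result.modifiedCount === 0) {
  // not enough stock (or product not found): report back to the user
}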

Only process 500 lines/rows at a time with createReadStream

I have to read a really large CSV file, so I searched Google and got to know about createReadStream. I am using a program that reads the CSV file data and inserts it into MongoDB.
The process I am following:
Process the data using createReadStream (I think it reads the file line by line).
Store the data into an array.
Insert the data into MongoDB using insertMany.
Now the problem is that the whole file is first stored in an array and only then inserted into the database.
But I think the better approach would be to store only the first 500 lines/rows in an array, insert them into the DB, and then follow the same steps for the next 500 records.
Is it possible to achieve this? And is it the right way to do it?
My program:
const test = async () => {
  const stream = fs.createReadStream(workerData)
    .pipe(parse())
    .on('data', async function(csvrow) {
      try {
        stream.pause()
        if (!authorName.includes(csvrow.author)) {
          const author = new Author({ author: csvrow.author })
          authorId = author._id
          authorName.push(author.author)
          authorData.push(author)
        }
        if (!companyName.includes(csvrow.company_name)) {
          const company = new Company({ companyName: csvrow.company_name })
          companyID = company._id
          companyName.push(company.companyName)
          companyData.push(company)
        }
        users = new User({
          name: csvrow.firstname,
          dob: csvrow.dob,
          address: csvrow.address,
          phone: csvrow.phone,
          state: csvrow.state,
          zip: csvrow.zip,
          email: csvrow.email,
          gender: csvrow.gender,
          userType: csvrow.userType
        })
        userData.push(users)
        book = new Book({
          book_number: csvrow.book_number,
          book_name: csvrow.book_name,
          book_desc: csvrow.book_desc,
          user_id: users._id,
          author_id: authorId
        })
        bookData.push(book)
        relationalData.push({
          username: users.name,
          author_id: authorId,
          book_id: book._id,
          company_id: companyID
        })
      } finally {
        stream.resume()
      }
    })
    .on('end', async function() {
      try {
        Author.insertMany(authorData)
        User.insertMany(userData)
        Book.insertMany(bookData)
        Company.insertMany(companyData)
        await Relational.insertMany(relationalData)
        parentPort.postMessage("true")
      } catch (e) {
        console.log(e)
        parentPort.postMessage("false")
      }
    })
}
test()
This program works fine and inserts the data into the DB, but I am looking for something like this:
const stream = fs.createReadStream(workerData)
  .pipe(parse())
  .on('data', async function(csvrow, maxLineToRead: 500) {
    // whole code/logic of insert data into DB
  })
Here maxLineToRead is my imaginary term.
Basically, I want to process 500 rows at a time, insert them into the DB, and repeat that process until the end.
You can create a higher-scoped array variable where you accumulate rows of data as they arrive on the data event. When you get to 500 rows, fire off your database operation to insert them; if not yet at 500 rows, just add the next one to the array and wait for more data events.
Then, in the end event, insert any remaining rows still in the higher-scoped array.
This way you insert 500 at a time, and then however many are left at the end. Compared to inserting them all at the end, this has the advantage of spreading the database load over the time you are parsing.
Here's an attempt to implement that type of processing. There are some unknowns (documented with comments) due to the incomplete description of exactly what you're trying to accomplish in some circumstances:
const test = () => {
  return new Promise((resolve, reject) => {
    const accumulatedRows = [];
    async function processRows(rows) {
      // initialize data arrays that we will insert
      const authorData = [],
        companyData = [],
        userData = [],
        bookData = [],
        relationalData = [];
      // this code still has a problem that I don't have enough context
      // to know how to solve:
      // if authorName contains csvrow.author, then the variable
      // authorId is not initialized, but is used later in the code.
      // This is a problem that needs to be fixed.
      // The same issue occurs for companyID.
      for (let csvrow of rows) {
        let authorId, companyID;
        if (!authorName.includes(csvrow.author)) {
          const author = new Author({ author: csvrow.author })
          authorId = author._id
          authorName.push(author.author)
          authorData.push(author)
        }
        if (!companyName.includes(csvrow.company_name)) {
          const company = new Company({ companyName: csvrow.company_name })
          companyID = company._id
          companyName.push(company.companyName)
          companyData.push(company)
        }
        let users = new User({
          name: csvrow.firstname,
          dob: csvrow.dob,
          address: csvrow.address,
          phone: csvrow.phone,
          state: csvrow.state,
          zip: csvrow.zip,
          email: csvrow.email,
          gender: csvrow.gender,
          userType: csvrow.userType
        });
        userData.push(users)
        let book = new Book({
          book_number: csvrow.book_number,
          book_name: csvrow.book_name,
          book_desc: csvrow.book_desc,
          user_id: users._id,
          author_id: authorId
        });
        bookData.push(book)
        relationalData.push({
          username: users.name,
          author_id: authorId,
          book_id: book._id,
          company_id: companyID
        });
      }
      // all local arrays of data are populated now for this batch,
      // so add this data to the database
      await Author.insertMany(authorData);
      await User.insertMany(userData);
      await Book.insertMany(bookData);
      await Company.insertMany(companyData);
      await Relational.insertMany(relationalData);
    }
    const batchSize = 500;
    const stream = fs.createReadStream(workerData)
      .pipe(parse())
      .on('data', async function(csvrow) {
        try {
          accumulatedRows.push(csvrow);
          if (accumulatedRows.length >= batchSize) {
            stream.pause();
            await processRows(accumulatedRows);
            // clear out the rows we just processed
            accumulatedRows.length = 0;
            stream.resume();
          }
        } catch (e) {
          // calling destroy(e) will prevent leaking a stream
          // and will trigger the error event to be called with that error
          stream.destroy(e);
        }
      }).on('end', async function() {
        try {
          await processRows(accumulatedRows);
          resolve();
        } catch (e) {
          reject(e);
        }
      }).on('error', (e) => {
        reject(e);
      });
  });
}
test().then(() => {
  parentPort.postMessage("true");
}).catch(err => {
  console.log(err);
  parentPort.postMessage("false");
});

How to add two records in a row?

When I want to add two records in sequence, only one record is added; on the second it throws an error because it cannot create a row with such data:
"NOTES_ID is required","key: NOTES_ID, value: undefined, is not a
number"
How do I create entries for two related tables sequentially, first for the main table and then for the one that has the foreign key on it?
module.exports.create = async function (req, res) {
  const stateMatrix = await StateMatrix.select().exec()
  const noteObj = {
    DATE: req.body.DATE,
    TITLE: req.body.TITLE,
    CONTENT: req.body.CONTENT
  };
  const noteStateObj = {
    STATE_DATE: new Date().toLocaleDateString("en-US"),
    STATES_ID: stateMatrix[0]._props.STATES_ID_CURR,
    NOTES_ID: req.body.NOTE_ID,
    USERS_ID: req.decoded.user_id
  };
  try {
    await Notes.create(noteObj);
    await NoteStates.create(noteStateObj);
    res.status(201).json(noteObj, noteStateObj);
  } catch (e) {
    errorHandler(res, e);
  }
};
Probably NoteStates is related to Notes through the NOTES_ID field, which cannot be empty (I guess it's a foreign key). That means you should set it from the newly created note before saving noteStateObj:

// Something like this
const newNote = await Notes.create(noteObj);
noteStateObj.NOTES_ID = newNote.ID;
await NoteStates.create(noteStateObj);
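Placed in the handler's try block, that could look like the following sketch (the property holding the generated id, newNote.ID here, depends on how the Notes model is defined):

try {
  const newNote = await Notes.create(noteObj);   // insert the parent row first
  noteStateObj.NOTES_ID = newNote.ID;            // wire up the foreign key
  const newNoteState = await NoteStates.create(noteStateObj);
  res.status(201).json({ note: newNote, noteState: newNoteState });
} catch (e) {
  errorHandler(res, e);
}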
