Make one column dependent on association Sequelize - node.js

I have a table called HOUSE, and it has a column named STATUS. I also have a table called TASK, which also has a STATUS column.
Each house has many tasks. If at least one task has a status of inProgress, the house status should be inProgress; if all of the tasks are done, the house is done.
I want this status column of the house to depend on the statuses of all its tasks.
When I call /getHouses, here's what I do to add a property called status to each house object, because currently I have no STATUS column in the HOUSE table:
exports.getMyHouses = (req, res) => {
  const page = myUtil.parser.tryParseInt(req.query.page, 0)
  const limit = myUtil.parser.tryParseInt(req.query.limit, 10)

  db.House.findAndCountAll({
    where: { userId: req.user.id },
    include: [
      {
        model: db.Task,
        as: "task",
        include: [
          {
            model: db.Photo,
            as: "photos"
          }
        ]
      },
      {
        model: db.Address,
        as: "address"
      }
    ],
    offset: limit * page,
    limit: limit,
    order: [["id", "ASC"]],
  })
    .then(data => {
      let newData = JSON.parse(JSON.stringify(data))
      const houses = newData.rows

      // Derive each house's status from its tasks: default to "done",
      // flip to "inProgress" as soon as one in-progress task is found
      for (let house of houses) {
        house.status = "done"
        const tasks = house.task
        for (let task of tasks) {
          if (task.status == "inProgress") {
            house.status = "inProgress"
            break
          }
        }
      }

      res.json(myUtil.response.paging(newData, page, limit))
    })
    .catch(err => {
      console.log("Error get houses: " + err.message)
      res.status(500).send({
        message: "An error has occurred while retrieving data."
      })
    })
}
EDIT: I just realized that perhaps I can update the house's status column each time there's an update to a task's status. I'd never thought about this before.
But I would still love it if anyone could confirm that this is a good strategy, or tell me if there's a better one.

The option you have is viable as long as filtering by the house's status isn't something you require. This would essentially be called a virtual field (since it isn't stored directly in the database). If you do need to filter by this field, you'd then need to query for all the tasks that are inProgress and collect the unique house IDs.
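For example, a minimal sketch of that lookup (assuming the db.Task model from the question and a hypothetical houseId foreign key on TASK, which the question doesn't show):
const inProgressTasks = await db.Task.findAll({
  attributes: ["houseId"],
  where: { status: "inProgress" },
  group: ["houseId"]
})
// Houses whose id appears in this list are "inProgress"; every other house is "done"
const inProgressHouseIds = inProgressTasks.map(t => t.houseId)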
You could update the house's status column on task updates too, but you could run into some race conditions if, for example, multiple requests were updating tasks belonging to the same house. Make sure to run a transaction here if you do. Querying/filtering for houses with inProgress tasks would then be much faster, since you can query the column directly. However, updates would be slower, since you'd need to run a task update, a count query on tasks, and an update query on the house.
Both have their pros and cons; it mainly depends on your application design's requirements.
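If you go with updating the column on task changes, here is a minimal sketch of that write path inside a transaction (again assuming a hypothetical houseId foreign key; the row lock is one way to serialize concurrent updates to tasks of the same house):
async function setTaskStatus(taskId, status) {
  return db.sequelize.transaction(async (t) => {
    const task = await db.Task.findByPk(taskId, { transaction: t, lock: t.LOCK.UPDATE })
    task.status = status
    await task.save({ transaction: t })

    // Count the remaining in-progress tasks for the same house...
    const inProgress = await db.Task.count({
      where: { houseId: task.houseId, status: "inProgress" },
      transaction: t
    })

    // ...and derive the house's status from that count
    await db.House.update(
      { status: inProgress > 0 ? "inProgress" : "done" },
      { where: { id: task.houseId }, transaction: t }
    )
  })
}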

Related

How do I copy entries from one collection to another using mongoose?

I'm trying to create a little task management site for a work project. The overall goal here is that the tasks stay the same each month (their status can be updated and whatnot), and they need to be duplicated at the start of each new month so they can be displayed and sorted in a table.
I already figured out how to schedule the task, and I have the table I need set up. A little explanation before the code: the way I'm planning on doing this is having two different task collections. One, which I've called "assignments", will have the tasks that need to be duplicated (with their description, status and other necessary data), and another collection, which I called "tasks", will have the exact same data but with an additional "date" field. This is where the table will get its data from; the date is just for sorting purposes.
This is what I have so far -
Index.js: gets all the assignments from the database, and sends the object over to the duplicate function.
router.get('/test', async function(req, res, next) {
  let allTasks = await dbModule.getAllAssignments();
  let result = await dbModule.duplicateTasks(allTasks);
  res.json(result);
});
dbmodule.js:
getAllAssignments: () => {
  allAssignments = Assignment.find({});
  return allAssignments;
},
duplicateTasks: (allTasksToAdd) => {
  try {
    for (let i = 0; i < allTasksToAdd.length; i++) {
      let newTask = new Task({
        customername: allTasksToAdd.customername,
        provname: allTasksToAdd.provname,
        description: allTasksToAdd.description,
        status: allTasksToAdd.status,
        date: "07-2020"
      })
      newTask.save();
    }
    return "Done"
  } catch (error) {
    return "Error"
  }
}
The issue arises when I try to actually duplicate the tasks. For testing purposes I've entered the date manually this time, but that's all that ends up being inserted - just the date; the rest of the data is skipped. I've heard of db.collection.copyTo(), but I'm not sure if it'll allow me to insert the field I need, or if it's supported in mongoose. I know there's absolutely an easier way to do this, but I can't quite figure it out. I'd love some input and suggestions if anyone has any.
Thanks.
The problem is that allTasksToAdd.customername (and the other fields you're trying to access) will be undefined. You need to access the fields under the current index:
let newTask = new Task({
  customername: allTasksToAdd[i].customername,
  provname: allTasksToAdd[i].provname,
  description: allTasksToAdd[i].description,
  status: allTasksToAdd[i].status,
  date: "07-2020"
})
Note that you can simplify this by using a for...of loop instead:
for (const task of allTasksToAdd) {
  const newTask = new Task({
    customername: task.customername,
    provname: task.provname,
    description: task.description,
    status: task.status,
    date: "07-2020"
  });
  newTask.save();
}
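One caveat: save() is asynchronous, so the loop above fires off the writes without waiting for them, and duplicateTasks can return "Done" before anything has actually been persisted. A sketch of a variant that awaits every save (assuming duplicateTasks is made async):
duplicateTasks: async (allTasksToAdd) => {
  try {
    // Start one save per task and wait for all of them to finish
    await Promise.all(
      allTasksToAdd.map(({ customername, provname, description, status }) =>
        new Task({ customername, provname, description, status, date: "07-2020" }).save()
      )
    );
    return "Done";
  } catch (error) {
    return "Error";
  }
}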

Structuring a query response with PostgreSQL

I am trying to construct a query that returns data from multiple tables and builds it into a single array of objects to return to the client. I have two tables, incidents and sources. Each source has an incident_id that corresponds to an incident in the first table.
Since there can be more than one source, I want to query for the incidents, then on each incident add a key of src whose value is the array of associated sources. The desired final structure is this:
{
  "incident_id": 1,
  "id": "wa-olympia-1",
  "city": "Olympia",
  "state": "Washington",
  "lat": 47.0417,
  "long": -122.896,
  "title": "Police respond to broken windows with excessive force",
  "desc": "Footage shows a few individuals break off from a protest to smash City Hall windows. Protesters shout at vandals to stop.\n\nPolice then arrive. They arrest multiple individuals near the City Hall windows, including one individual who appeared to approach the vandals in an effort to defuse the situation.\n\nPolice fire tear gas and riot rounds at protesters during the arrests. Protesters become agitated.\n\nAfter police walk arrestee away, protesters continue to shout at police. Police respond with a second bout of tear gas and riot rounds.\n\nA racial slur can be heard shouted, although it is unsure who is shouting.",
  "date": "2020-05-31T05:00:00.000Z",
  "src": ["http://google.com"]
}
Here is the route as it stands:
router.get('/showallincidents', (req, res) => {
  Incidents.getAllIncidents()
    .then((response) => {
      const incidents = response.map((incident) => {
        const sources = Incidents.createSourcesArray(incident.incident_id);
        return {
          ...incident,
          src: sources,
        };
      });
      res.json(incidents);
    })
    .catch((err) => {
      res.status(500).json({ message: 'Request Error' });
    });
});
Here are the models I currently have:
async function getAllIncidents() {
  return await db('incidents');
}

async function createSourcesArray(incident_id) {
  const sources = await db('sources')
    .select('*')
    .where('sources.incident_id', incident_id);
  return sources;
}
When this endpoint is hit I get a "too many connections" error. Please advise.
I found a solution. I decided to query the two tables independently, then loop through the first result array and, within that loop, go through the second array checking for the foreign key they share. When I found a match, I pushed that result onto an array on the original object, so I ended up with a new array of the first table's objects carrying the associated data from the second. The models are unchanged; here is the updated route.
router.get('/showallincidents', async (req, res) => {
  try {
    const incidents = await Incidents.getAllIncidents();
    const sources = await Incidents.getAllSources();
    const responseArray = [];

    // Reconstructs each incident object with its sources to send to the front end
    incidents.forEach((incident) => {
      incident['src'] = [];
      sources.forEach((source) => {
        if (source.incident_id === incident.incident_id) {
          incident.src.push(source);
        }
      });
      responseArray.push(incident);
    });

    res.json(responseArray);
  } catch (e) {
    res.status(500).json({
      message: 'Request Error'
    });
  }
});
Are the two tables in the same database? If so, it is much more efficient to do the primary/foreign key match with an SQL join. What you have implemented is a "nested loop join", which might not be the optimal way to match, depending on the value distribution of the primary key. You can search for SQL join algorithms to see examples and pros/cons.
If the tables are in different databases, then a client-side join is indeed likely your only option, though again, if you know something about the underlying distribution, it might be better to do a hash join.
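For instance, here is a sketch of pushing the join into PostgreSQL with json_agg, assuming the knex-style db instance from the question and that incident_id is the primary key of incidents:
async function getAllIncidentsWithSources() {
  return db('incidents')
    .leftJoin('sources', 'sources.incident_id', 'incidents.incident_id')
    .groupBy('incidents.incident_id')
    .select(
      'incidents.*',
      // Aggregate the joined source rows into a JSON array per incident;
      // the FILTER clause turns "no sources" into an empty array instead of [null]
      db.raw("coalesce(json_agg(sources.*) filter (where sources.incident_id is not null), '[]') as src")
    );
}
This returns the desired structure in a single round trip, so the per-incident queries (and the connection exhaustion they caused) disappear.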

Nodejs How to handle multiple request that may generate conflicts between them?

So, I'm building an e-commerce API with Koa.js and Koa router, and I'm now handling order payment requests. The thing is, before the actual payment, the status of my order model changes to 'Waiting for payment', and it is at this point that my API changes the stock of my products.
What I thought is that, to make my response faster, the product update could be done asynchronously, since the request may need to process a lot of product updates; the client can receive a response and focus on paying while the products are updating. But what if two people make a request with the same product, and the updates conflict because the sum of the two quantities exceeds the actual product stock? (The shopping cart checks not to surpass the product stock limits, but once the order is placed it can't be changed.)
Would it be better to just make the client wait and run the process with await, or is there a way to handle this type of problem better?
My code looks somewhat like this:
router.post('api.orders.payment', '/:id/pay', async (ctx) => {
  const order = await ctx.orm.order.findByPk(ctx.params.id);
  order.status = 'Waiting for payment';
  await order.save({ fields: ['status', 'total'] });

  // Update all the products (fired without await, so the response
  // is sent while the stock updates are still running)
  updateProducts(ctx, order);

  ctx.body = ctx.jsonSerializer('order', {
    attributes: ['status', 'total'],
    topLevelLinks: {
      self: `${ctx.origin}${ctx.router.url('api.orders.show', { id: order.id })}`,
    },
  }).serialize(order);
});
And the updateProduct function looks like this:
async function updateProducts(ctx, order) {
  const shoppingCart = await ctx.orm.orderProduct.findAll({ where: { orderId: order.id } });
  let product = null;
  shoppingCart.forEach(async (element) => {
    product = await ctx.orm.product.findByPk(element.productId);
    product.stock -= element.quantity;
    await product.save({ fields: ['stock'] });
  });
}
I'm using Sequelize, Koa, Koa-router and jsonserializer. My product model has attributes "id" and "stock", my order model has attributes "id" and "total", and OrderProduct, which represents my cart model, has attributes "orderId", "productId" and "quantity".
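One way to reduce the risk described above is to let the database do the arithmetic atomically inside a transaction, instead of the read-modify-write in updateProducts. A minimal sketch (assuming the same models and that ctx.orm exposes the sequelize instance as ctx.orm.sequelize; Sequelize's decrement() issues a single "stock = stock - quantity" UPDATE rather than reading the value first):
async function updateProducts(ctx, order) {
  await ctx.orm.sequelize.transaction(async (t) => {
    const shoppingCart = await ctx.orm.orderProduct.findAll({
      where: { orderId: order.id },
      transaction: t,
    });
    // A for...of loop so every await completes before the transaction
    // commits (forEach does not wait for async callbacks)
    for (const element of shoppingCart) {
      await ctx.orm.product.decrement('stock', {
        by: element.quantity,
        where: { id: element.productId },
        transaction: t,
      });
    }
  });
}
Two concurrent payments then can't both subtract from the same stale stock value; whether you await this from the route or keep it fire-and-forget is the latency trade-off described in the question.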

Multiple findOneAndUpdate operations are being skipped

I have a forEach loop where I query a document, do some simple math calculations, then update a document in the collection and move on to the next iteration.
The problem is that, randomly, some of the UPDATE operations will not update the document. I don't know why this is happening. Is it because of a lock?
I have tried logging things just before the update operation. The data is all correct, but when it comes to the update, it will randomly not run at all. Out of 10 iterations, let's say 8 will work correctly.
const name = "foo_game";

players.forEach(({ id, team, username }) => {
  let updatedStats = {};

  Users.findOne({ id }).then(existingPlayer => {
    if (!existingPlayer) return;

    const { stats } = existingPlayer;
    const existingStats = stats[pug.name];
    if (!existingStats) return;

    const presentWins = existingStats.won || 0;
    const presentLosses = existingStats.lost || 0;

    updatedStats = {
      ...existingStats,
      won:
        team === winningTeam
          ? presentWins + 1
          : changeWinner
          ? presentWins - 1
          : presentWins,
      lost:
        team !== winningTeam
          ? presentLosses + 1
          : changeWinner
          ? presentLosses - 1
          : presentLosses,
    };

    // THE CALCULATIONS ARE ALL CORRECT TILL THIS POINT
    // THE UPDATE WILL RANDOMLY NOT WORK
    Users.findOneAndUpdate(
      { id, server_id: serverId },
      {
        $set: {
          username,
          stats: { ...stats, [name]: updatedStats },
        },
      },
      {
        upsert: true,
      }
    ).exec();
  });
});
Basically what you are missing here is that the asynchronous operations of both the findOne() and the findOneAndUpdate() are not guaranteed to complete before your forEach() is completed. Using forEach() is not a great choice for a loop with async operations in it, but the other main point here is that it's completely unnecessary, since MongoDB has a much better way of doing this in one request to the server.
In short, instead of "looping" you actually want to provide an array of instructions to bulkWrite():
let server_id = serverId; // Alias one of your variables or just change its name

Users.bulkWrite(
  players.map(({ id, team, username }) => ({
    "updateOne": {
      "filter": { id, server_id },
      "update": {
        "$set": { username },
        "$inc": {
          [`stats.${name}.won`]:
            team === winningTeam ? 1 : changeWinner ? -1 : 0,
          [`stats.${name}.lost`]:
            team !== winningTeam ? 1 : changeWinner ? -1 : 0
        }
      },
      "upsert": true
    }
  }))
)
.then(() => /* whatever continuation here */ )
.catch(e => console.error(e))
So rather than looping, that Array.map() produces one "updateOne" statement within the bulk operation for each array member and sends them to the server in a single request. The other change, of course, is that you simply do not need the findOne() to read existing values. All you really need is the $inc operator to either increase or decrease the current value. Note here that if nothing is currently recorded at the specified path, it would create those fields with whatever value of 1/-1/0 was determined by the logic and handed to $inc.
Note this is how you should be doing things in general, since aside from avoiding unnecessary loops of async calls, the main thing here is to actually use the atomic operators like $inc that MongoDB has. Reading data state from the database in order to make changes is an anti-pattern and best avoided.

How to avoid two concurrent API requests breaking the logic behind document validation?

I have an API where, in order to insert a new item, it needs to be validated. The validation is basically a type validator (string, number, Date, etc.) plus a query that checks whether the "user" already has an "item" on the same date; if it does, the validation is unsuccessful.
Pseudocode goes like this:
const Item = require("./models/item");

async function post(newDoc) {
  let errors = await checkForDocErrors(newDoc)
  if (errors) {
    throw errors;
  }
  let itemCreated = await Item.create(newDoc);
  return itemCreated;
}
My problem is if I do two concurrent requests like this:
const request = require("superagent");

// Inserts a new Item
request.post('http://127.0.0.1:5000/api/item')
  .send({
    "id_user": "6c67ea36-5bfd-48ec-af62-cede984dff9d",
    "start_date": "2019-04-02",
    "name": "Water Bottle"
  })

/*
  Inserts a new Item, which it shouldn't do, resulting in two items
  having the same date.
*/
request.post('http://127.0.0.1:5000/api/item')
  .send({
    "id_user": "6c67ea36-5bfd-48ec-af62-cede984dff9d",
    "start_date": "2019-04-02",
    "name": "Toothpick"
  })
Both will be successful, which shouldn't happen, since a "user" cannot have two "items" on the same date.
If I execute the second one after the first has finished, everything works as expected.
request.post('http://127.0.0.1:5000/api/item') // Inserts a new Item
  .send({
    "id_user": "6c67ea36-5bfd-48ec-af62-cede984dff9d",
    "start_date": "2019-04-02",
    "name": "Water Bottle"
  })
  .then((res) => {
    // It is not successful since there is already an item with that date,
    // as expected
    request.post('http://127.0.0.1:5000/api/item')
      .send({
        "id_user": "6c67ea36-5bfd-48ec-af62-cede984dff9d",
        "start_date": "2019-04-02",
        "name": "Toothpick"
      })
  })
To avoid this I send one request with an array of documents, but I want to prevent this issue, or at least make it less likely to happen.
SOLUTION
I created a redis server. Used the package redis-lock and wrapped around the POST route.
var client = require("redis").createClient()
var lock = require("redis-lock")(client);
var itemController = require('./controllers/item');

router.post('/', function(req, res) {
  let userId = "";
  if (typeof req.body === 'object' && typeof req.body.id_user === 'string') {
    userId = req.body.id_user;
  }

  // Serialize POSTs per user: the lock key combines the route and the user id
  lock('POST ' + req.path + userId, async function(done) {
    try {
      let result = await itemController.post(req.body)
      res.json(result);
    } catch (e) {
      res.status(500).send("Server Error");
    }
    done()
  })
})
Thank you.
Explanation:
That is a race condition.
two or more threads can access shared data and they try to change it at the same time
What is a race condition?
Solution:
There are many ways to prevent conflicting data in this case; a lock is one option.
You can lock at the application level or at the database level, but I suggest you read this thread before choosing either of them:
Optimistic vs. Pessimistic locking
Quick solution: a pessimistic lock, e.g. https://www.npmjs.com/package/redis-lock
You should create a composite index or a composite primary key that includes the id_user and the start_date fields. This will ensure that no two documents for the same user with the same date can be created, and the database will throw an error if you try to do it.
Composite index with mongoose
You could also use transactions. To do so, you would execute the find and the create methods inside a transaction, to ensure that no concurrent queries on the same document will be executed.
Mongoose transactions tutorial
More info
I would go with a unique composite index, which in your specific case should be something like
mySchema.index({ id_user: 1, start_date: 1 }, { unique: true });
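With a unique index in place, the second concurrent insert fails at the database, so the route can translate that error instead of relying only on the pre-check. A minimal sketch against the post() pseudocode from the question (11000 is MongoDB's duplicate-key error code):
async function post(newDoc) {
  let errors = await checkForDocErrors(newDoc)
  if (errors) {
    throw errors;
  }
  try {
    return await Item.create(newDoc);
  } catch (e) {
    // The unique index rejected a concurrent duplicate that the
    // validation query could not see yet
    if (e.code === 11000) {
      throw new Error("User already has an item on that date");
    }
    throw e;
  }
}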
