The case:
I'm creating an API with Sails.js, which uses Waterline as its ORM. The API returns, let's say, photographs, and many users are able to vote for a picture. The pictures will be ordered by number of votes.
Procedure:
When a user votes for a picture, I have to read the current number of votes (a SELECT via Picture.findById()) and then increment that number by one (an UPDATE via Picture.update()).
Problem:
Transactions/locking in Sails.js: these two queries should be executed without another query modifying the picture data between the SELECT and the UPDATE of our vote system.
How should we perform locking/transactions in Sails.js (a Node.js framework)?
Thanks!
Sails does support transactions.
Here is an example of a transaction in Sails.js:
try {
  await sails.getDatastore().transaction(async (db) => {
    // Every query in the transaction must run on its connection
    await Model
      .create({ foo: 'bar' })
      .usingConnection(db);

    if (somethingWentWrong) { // placeholder for your own check
      // Throwing inside the callback rolls the transaction back
      throw new Error('error happened - transaction rollback');
    }

    await Model
      .update({ id: 1 })
      .set({ votes: 1 })
      .usingConnection(db);
  });
} catch (err) {
  return res.serverError(err);
}
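Applied to the voting scenario in the question, a minimal sketch might look like this (the Picture model and its votes attribute are assumed names, not taken from your code):

await sails.getDatastore().transaction(async (db) => {
  // Read the current vote count on the transaction's connection...
  const picture = await Picture
    .findOne({ id: req.param('id') })
    .usingConnection(db);

  // ...and write the incremented count on the same connection.
  await Picture
    .updateOne({ id: picture.id })
    .set({ votes: picture.votes + 1 })
    .usingConnection(db);
});

Note that whether the read actually blocks concurrent writers depends on your database's isolation level; for a simple counter, an atomic single-statement increment (a native UPDATE ... SET votes = votes + 1 via sails.sendNativeQuery()) avoids the read-then-write window entirely.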
I'm using Objection.js as my ORM for a simple rainfall application. I need to dynamically update an entry in one table when a lower-level table's entry has been updated. To do this I need the whole entry I am updating, so I can use that data to correctly update the dynamically updated entry.
I'm using the $afterUpdate hook for the lower-level table entry. The issue I am having is that when I log this within the $afterUpdate hook function, it only contains the properties for the parts of the entry I want to update. How can I get the entire entry? I'm sure I could get the record by running an additional query to the DB, but I was hoping there would be a way to avoid this. Any help would be appreciated.
I think, as of right now, you can only get the whole model with an extra query.
If you are doing the update with an instance query ($query), you can get the old properties from opt.old in the hook.
Query:
const user = await User.query().findById(userId);

await user.$query()
  .patch({ name: 'Tom Jane' });

Hook:
$afterUpdate(opt, queryContext) {
  console.log(opt.old);
}
Patch
If you don't need to do this in a hook, you might want to use the patch function chained with first().returning('*') to get the whole model in a single query; in PostgreSQL it's more efficient than patchAndFetchById, as stated in the documentation:
Because PostgreSQL (and some others) support returning('*') chaining, you can actually insert a row, or update / patch / delete (an) existing row(s), and receive the affected row(s) as Model instances in a single query, thus improving efficiency. See the examples for more clarity.
const jennifer = await Person
  .query()
  .patch({ firstName: 'Jenn', lastName: 'Lawrence' })
  .where('id', 1234)
  .returning('*')
  .first();
References:
http://vincit.github.io/objection.js/#postgresql-quot-returning-quot-tricks
https://github.com/Vincit/objection.js/issues/185
https://github.com/Vincit/objection.js/issues/695
I have a Node application with MySQL as the DB. I'm on my way to making an endpoint that will update multiple rows with different data for each. Also, I am using Sequelize as the ORM.
Now I know I can update a row like this:
model.update(data).then(() => { res.end('Row Updated'); });
Now my question is where I should call the update method for the second model: in the callback passed to then(), or right after the first update call?
I mean, which of the following would be the best practice?
model1.update(data1).then(() => { console.log('Row 1 Updated'); });
model2.update(data2).then(() => { console.log('Row 2 Updated'); });

OR

model1.update(data1).then(() => {
  model2.update(data2).then(() => { console.log('All the rows have been updated'); });
});
Neither of the above is correct, because the promises aren't chained properly, which impedes proper control flow and error handling.
If the queries are independent of each other, they can be performed in parallel:
Promise.all([
  model1.update(data1),
  model2.update(data2)
])
  .then(() => {
    console.log('All the rows have been updated');
  });
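If the second update depends on the first, return the inner promise so the chain stays flat; a minimal sketch, using the same model and data names as above:

model1.update(data1)
  .then(() => {
    // Returning the promise keeps the chain flat instead of nesting
    return model2.update(data2);
  })
  .then(() => {
    console.log('All the rows have been updated');
  })
  .catch((err) => {
    // One handler catches a failure from either update
    console.error(err);
  });

Either way, finish the chain with a catch so a failed update doesn't go unnoticed.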
How would I go about creating multiple documents with different schemas in one REST API request in Node/Mongoose/Express?
Say, for example, I need to create a user and a site in a single request to /createUser.
I could of course create a user and then, in the returned promise, create the next record, but what if that second record doesn't pass validation? Then I've created a user without the second record.
User.create(userData)
  .then(user => {
    Site.create(siteData)
      .then(site => {
        // Do something
      })
      .catch(err => {
        console.log(err);
        // If this fails, I'm left with a user created without a site.
      });
  })
  .catch(err => {
    console.log(err);
  });
Is there a good practice to follow when creating multiple documents like this? Should I run manual validation instead before each .create() runs? Any guidance/advice would be very much appreciated!
You have the transaction problem here: you are writing into two different models but want the whole operation to be atomic, and if either write fails you need to roll back. Up until MongoDB 4.0, transactions were not supported, and the workaround for this sort of issue was a two-phase commit. As of MongoDB 4.0, we have multi-document transactions to handle such problems.
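With Mongoose (5.2 or later) against a MongoDB 4.0 replica set (transactions require one), a minimal sketch might look like this, reusing the User, Site, userData, and siteData names from your example:

const session = await mongoose.startSession();
session.startTransaction();
try {
  // The array form of create() is required when passing options
  const [user] = await User.create([userData], { session });
  const [site] = await Site.create([siteData], { session });
  await session.commitTransaction();
} catch (err) {
  // A validation failure on either model undoes both documents
  await session.abortTransaction();
  throw err;
} finally {
  session.endSession();
}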
Hope it helped.
I recently started using Node.js + Express.js (generated with Pug) + pg-promise for handling the DB.
My first target is to obtain data from Postgres (already set up) and display it nicely using render and Pug. Let's say it is a user list from a Users table.
From this RESTful tutorial I learned how to get data and return it as JSON - it worked.
Based on Mozilla's tutorial, I separated my code:
routes/users.js: where for '/' I call the user_controller.user_list method (using router.get)
controllers/userController.js: where I export user_list, in which I would like to ask the model for data and call render once I have results
queries.js: which is kind of my model? But I'm not sure. It has an API: a connection to the db with promises, and one function for every query I am going to use in the controllers. I believe I should have one model file per table (or any logical entity), but then where do I store the pgp connection?
This file is based on the first tutorial I mentioned:
// queries.js (connectionString is set properly for my Postgres)
var pgp = require('pg-promise')(options);
var db = pgp(connectionString);

function getUsers(req, res, next) {
  db.any('SELECT (user_id, username) FROM public.users ORDER BY user_id ASC LIMIT 1000')
    .then(function (data) {
      res.json({ data: data });
    })
    .catch(function (err) {
      return next(err);
    });
}

module.exports = {
  getUsers: getUsers
};
Here my problem starts, as most tutorials use Mongoose, which is very model-db-schema-friendly, whereas what I have is a plain 'SELECT ...' string I pass to pg-promise's any() function.
Therefore I have no model class like User.
In userController.js I don't know how to call getUsers() and handle its data. Returning a JS object from getUsers() would be nice.
Also: where should I call render? In the controller, or only in
db.any(...).then(function (data) { <--here--> })
Before this, I also tried to embed the whole Postgres handling in the controller, but then db.any() gave me this array to handle:
[{ row: '(1,John)' }, { row: '(2,Amy)' }, { row: '(50,Peter)' }]
I didn't know how to go on from there, and I had probably lost my API functionality as well ;-)
I am browsing through multiple tutorials on how to handle MVC, but they usually cover MongoDB and satisfy readers with res.send(), not render().
I am not sure that I understand exactly what your question is about, but since I do not have enough reputation to comment, I'll do my best to help with your questions. :)
First, regarding the queries.js file: it is IMO not exactly a model, but rather a DAO (Data Access Object) file. The DAO sits between your Model (which is actually your database) and your Controller layers. There usually is one DAO file per object (User, Pet, whatever you want) in your data model.
When the data model is rather complex, it can be useful to use an Object-Relational Mapping (ORM) tool such as Mongoose to map your database and execute complex processes on your objects. In such a case, you might need a specific file per object to describe your model and store your queries. But since you don't need an ORM, your DAO can interact directly with your database. That is why you do not have a User.js file.
Regarding the way the db object should be used, I think you should refer directly to the pg-promise documentation on the matter:
IMPORTANT: For any given connection, you should only create a single
Database object in a separate module, to be shared in your application
(see the code example below). If instead you keep creating the
Database object dynamically, your application will suffer from loss in
performance, and will be getting a warning in a development
environment (when NODE_ENV = development)
As a matter of fact, a db object in pg-promise sort of represents the database itself and is actually designed for the simultaneous use of several databases, which does not seem to be your case for the moment.
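A minimal sketch of that single-shared-object pattern (the file split and names are just a suggestion):

// db.js - create pgp and the Database object exactly once
const pgp = require('pg-promise')(/* initialization options */);
const connectionString = process.env.DATABASE_URL; // assumed source of the connection string
const db = pgp(connectionString);

module.exports = { db, pgp };

// queries.js - reuse the shared db object
const { db } = require('./db');

function getUsers() {
  // Return the promise, so the controller decides what to do with the rows
  return db.any('SELECT user_id, username FROM public.users ORDER BY user_id ASC LIMIT 1000');
}

module.exports = { getUsers };

Note that the parentheses were dropped from the SELECT list: SELECT (user_id, username) builds a single composite row value, which is exactly why db.any() handed you [{ row: '(1,John)' }, ...] instead of separate columns.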
Finally, when it comes to the render function, I believe it should be called in the controller, as your DAO is not supposed to know how the data it has gathered is going to be used.
Modularity is always a time-saving choice in the long term.
Furthermore, note that you might later need a Business layer between your DAO and your controller, in order to preprocess and postprocess the data you are going to persist or display. In such a case, if you need, for instance, to ask for data from your database, you will have to render the data after it has been processed by the Business layer. If the render were done in the DAO layer, that would not be possible.
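With the DAO returning a promise of rows, as sketched above, the controller might look like this (the view name and locals are assumptions):

// controllers/userController.js
const queries = require('../queries');

exports.user_list = function (req, res, next) {
  queries.getUsers()
    .then(function (users) {
      // The controller, not the DAO, decides how the rows are presented
      res.render('user_list', { users: users });
    })
    .catch(next);
};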
In the link I provided earlier to pg-promise's db object connection, you will also find documentation on the any() method. You might already have looked it up.
It specifically states that it returns
A promise object that represents the query result:
When no rows are returned, it resolves with an empty array.
When 1 or more rows are returned, it resolves with the array of rows.
so your returned data is a JS array whose elements are already plain JS objects; there is no conversion to do. If you ever need a JSON string instead, use JSON.stringify(yourArray) before rendering in your controller.
But I suspect Pug is able to use your data directly.
Also, if you cannot get any data out of your DAO, maybe you should check that your result array is not empty, as such a case is tolerated by the any() method. If you expect your query to always return something, you might want to consider using the many() or one() methods instead.
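For instance, a single-row lookup could look like this (the query and the userId parameter are illustrative):

// one() rejects when the query returns zero rows (or more than one),
// so missing data surfaces as an error instead of an empty array
db.one('SELECT user_id, username FROM public.users WHERE user_id = $1', [userId])
  .then(function (user) {
    console.log(user.username);
  })
  .catch(function (err) {
    console.error(err);
  });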
I hope this helps you.
I'm using an AJAX request from the front end to load more comments on a post from the back end, which uses Node.js and Mongoose. I won't bore you with the front-end code or the route code, but here's the query code:
Post.findById(req.params.postId).populate({
  path: type, // type will either contain "comments" or "answers"
  populate: {
    path: 'author',
    model: 'User'
  },
  options: {
    sort: sortBy, // sortBy contains either "-date" or "-votes"
    skip: parseInt(req.params.numberLoaded), // how many are already shown
    limit: 25 // I only load this many new comments at a time
  }
}).exec(function(err, foundPost){
  console.log("query executed"); // code takes too long to get to this line
  if (err){
    res.send("database error, please try again later");
  } else {
    res.send(foundPost[type]);
  }
});
As mentioned in the title, everything works fine; my problem is just that this is too slow, the request taking about 1.5-2.5 seconds. Surely Mongoose has a way of doing this that takes less time to load. I poked around the Mongoose docs and Stack Overflow, but didn't really find anything useful.
The skip-and-limit approach is slow by nature in MongoDB, because it normally needs to retrieve all matching documents, sort them, and only then return the desired segment of the results.
What you need to do to make it faster is to define indexes on your collections.
According to MongoDB's official documentation:
Indexes support the efficient execution of queries in MongoDB. Without indexes, MongoDB must perform a collection scan, i.e. scan every document in a collection, to select those documents that match the query statement. If an appropriate index exists for a query, MongoDB can use the index to limit the number of documents it must inspect.
-- https://docs.mongodb.com/manual/indexes/
Indexes increase collection size, but they improve query efficiency a lot.
Indexes are commonly defined on fields which are frequently used in queries. In your case, you may want to define indexes on the date and/or votes fields.
Read the Mongoose documentation to find out how to define indexes in your schemas:
http://mongoosejs.com/docs/guide.html#indexes
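For illustration, a minimal sketch (the schema and field names are assumed from the question, not from your actual code):

const mongoose = require('mongoose');

const commentSchema = new mongoose.Schema({
  author: { type: mongoose.Schema.Types.ObjectId, ref: 'User' },
  date: Date,
  votes: Number
});

// Descending indexes to match the "-date" and "-votes" sort options
commentSchema.index({ date: -1 });
commentSchema.index({ votes: -1 });

module.exports = mongoose.model('Comment', commentSchema);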