I can't see the difference between the PUT and PATCH methods - node.js

I want to like a quote, or unlike it if I have already liked it. So first I find the quote, then I check whether I have already liked it; if not, I like it, otherwise I unlike it.
I have a route like the one below:
router.put('/:quoteId', isAuth, quotesController.likeQuote);
And the likeQuote method looks like this:
module.exports.likeQuote = (req, res, next) => {
  const quoteId = req.params.quoteId;
  const userId = req.userId;
  Quote.findById(quoteId)
    .then((quote) => {
      if (quote.likes.indexOf(userId) == -1) {
        quote.likes.push(userId);
      } else {
        quote.likes.pull(userId);
      }
      return quote.save();
    })
    .then((updatedQuote) => {
      res.status(201).json({ message: 'You liked the post!' });
    })
    .catch((err) => {
      err.statusCode = 500;
      next(err);
    });
};
But my question is, I just want to know how PUT and PATCH work. I think we should send all the fields with PUT but not with PATCH, yet in my case I don't send any fields at all and both work just fine. How does this happen?

The actual HTTP methods (PUT, PATCH, ...) do not impose any limitations; the logic you choose to write is what defines the behavior. Now, you're asking about "best practices", and whenever you ask about that you will get many different answers. I'll explain my view.
PUT: the essence of PUT is to replace the existing object completely. That's why people are telling you to send the entire object: when you use PUT, a complete swap is what's expected.
PATCH: the essence of PATCH is to update the existing resource, which in your case is what you're looking for. Here you just send the fields required for the update.
Now, is it wrong to write PUT as an update rather than a complete swap? I would argue it is not. As long as you keep the logic consistent throughout your app, you can build your own "best practices" that suit your needs.
Now, you did tag this question as Mongo related, so I would like to introduce you to the concept of pipelined updates (for MongoDB 4.2+), where you can execute your logic in a single update.
Mongo Playground
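For illustration, a pipelined toggle-like update might look roughly like this; this is a sketch only, assuming the Quote Mongoose model from the question and a MongoDB 4.2+ server (it is not the exact playground code):
return Quote.updateOne({ _id: quoteId }, [
  {
    $set: {
      likes: {
        $cond: [
          { $in: [userId, '$likes'] },
          // already liked: remove this user's id
          { $setDifference: ['$likes', [userId]] },
          // not liked yet: append this user's id
          { $concatArrays: ['$likes', [userId]] },
        ],
      },
    },
  },
]).exec();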

I just want to know how PUT and PATCH work.
An important distinction to understand is that we don't have a standard for how PUT and PATCH work; that's an implementation detail, and it is deliberately hidden behind the "uniform interface".
What we do have is standardized semantics, an agreement about what PUT and PATCH mean.
(This is further complicated by the fact that many people are not familiar with the standard, so misinterpretations of the meaning are common.)
If the implementation of the request handler doesn't match the semantics of the request, that's OK... but if something goes expensively wrong as a result, it's the fault of the implementation.
PUT and PATCH are both method-tokens that indicate that we are trying to modify the resource identified by the target-uri. In particular, we use those method-tokens when we are trying to make the server's representation of the resource match the representation on the client.
For example, imagine editing a web page. We GET /home.html, change the TITLE element in our copy, and we want to save our changes to the server. How do we do that in HTTP?
One answer is that we send a copy of home.html (with our changes) back to the server, so that the server can save it. That's PUT.
Another answer is that we diff our copy and the server's copy, and send the server a patch document that describes the changes the server should make to its copy. That's PATCH.
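To make the contrast concrete, here is a hypothetical client-side sketch of both styles against that imaginary /home.html resource (the payloads and the editedHtml variable are purely illustrative):
// PUT: send the complete, edited representation back to the server.
fetch('/home.html', {
  method: 'PUT',
  headers: { 'Content-Type': 'text/html' },
  body: editedHtml, // the whole document, including the changed TITLE
});

// PATCH: send only a description of the changes, in a patch format the server
// understands (here a JSON Patch-style diff, purely for illustration).
fetch('/home.html', {
  method: 'PATCH',
  headers: { 'Content-Type': 'application/json-patch+json' },
  body: JSON.stringify([
    { op: 'replace', path: '/title', value: 'My new title' },
  ]),
});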
router.put('/:quoteId', isAuth, quotesController.likeQuote);
What this invocation is doing is configuring the framework, so that requests with the PUT method token and a target-uri that matches "/:quoteId" are delegated to the likeQuote method.
And at this level, the framework assumes that you know what you are doing - there's no attempt to verify that "likeQuote" implements PUT semantics. To ensure that the implementation and the request semantics match, you are going to need to do some work (inspect the code, test, etc).
In my case I don't even send any fields and both work just fine.
Right - because the framework assumes that you know what you are doing, and your current implementation doesn't try to access or interpret the body of the HTTP request.
Note: that's a big hint that the request handler is not actually implementing PUT/PATCH semantics (how could the server possibly make its copy of the quote look like the client's if it doesn't look at the information the client provided?).
It would be okay to use POST: assuming that your implementation does what you intend, you should not be using methods with remote authoring semantics, because remote authoring is not what you are doing. This same implementation attached to a POST route would be fine.
As it is, things are broken: you have a mismatch between the request semantics and the handler implementation. Under controlled conditions you are likely to get away with it, and it's entirely possible that you will only ever invoke this code under controlled conditions.
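As a sketch of that suggestion, the existing handler could simply be mounted on a POST route instead (the path shown here is only an example, not something from the question):
// Same handler, exposed with POST semantics instead of PUT.
router.post('/:quoteId/like', isAuth, quotesController.likeQuote);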

Related

Global authentication/authorization in Rocket based on a header

I know I can use a request guard. However, if I have a REST API with hundreds of handlers, not only would it be annoying to add an extra function parameter to all of them, but it also scares me that it would be easy to miss adding such a parameter here or there and thereby create a security hole. That's why I'd like to know if there is a way to do such validation globally.
The documentation on Fairings mentions they can be used for global security policies:
As a general rule of thumb, only globally applicable actions should be implemented via fairings. For instance, you should not use a fairing to implement authentication or authorization (preferring to use a request guard instead) unless the authentication or authorization applies to the entire application. On the other hand, you should use a fairing to record timing and/or usage statistics or to implement global security policies.
But at the same time the docs on the on_request() callback say this:
A request callback can modify the request at will and Data::peek() into the incoming data. It may not, however, abort or respond directly to the request; these issues are better handled via request guards or via response callbacks.
So how am I supposed to return an error to the user in the case of an invalid token for example?
OK, I think I found a way...
First we create a "dummy" handler like this:
#[put("/errHnd", format = "json")]
fn err_handler() -> ApiResult {
    // Here simply return an error
}
Then we attach a fairing like this:
rocket::custom(cfg)
    .attach(AdHoc::on_request("OnReq", |req, _| {
        // Here we validate the token and if it's not OK,
        // forward the request to our "dummy" handler:
        let u = Origin::parse("/errHnd").unwrap();
        req.set_uri(u);
        req.set_method(Method::Put);
    }))
    .mount("/", routes![err_handler, ...
I'm not sure that's the best way to do it, but I tested it and it seems to work. I'm open to other suggestions.
P.S. It may also be worth mentioning that if we wanted an exception, so as to skip the validation in the fairing based on, say, the URL, we could simply add something like this to it:
if req.uri().path() == "/let-me-in-please" {
    return;
}

NestJS: Controller function with @UploadedFile or String as a parameter

I am using NestJS (version 6.5, with Express platform) and I need to handle a request with a property that can either be a File or a String.
Here is the code I currently have, but I don't find a clean way to implement this.
MyAwesomeController
@Post()
@UseInterceptors(FileInterceptor('source'))
async handle(@UploadedFile() source, @Body() myDto: MyDto): Promise<any> {
  // do things...
}
Am I missing something obvious or am I supposed to write my own interceptor to handle this case?
Design-wise, is this bad?
Based on the fact you're designing a REST API:
It depends on what use case(s) you want to achieve: is your client-side flow designed to be performed in two steps or not?
Can the string and file params both be passed in the same call, or is there only ever one of the two per call? (For example, if you want to update a file and its name, or some other non-Multer-related attributes.)
When you pass a string as a parameter to your endpoint, is a file resource created / updated / deleted? Or not at all?
Depending on the answers and the flow you have in mind, you should either split the two cases into two independent endpoints, or it may make sense to handle both parameters at the same time.
If only one of the params can be passed at a time, I'd say go for two independent endpoints; you'll benefit in both maintenance and code readability.
If both params can be passed at the same time and they relate to the same resource, then it could make sense to handle both of them at once.
Hope this helps, don't hesitate to comment ;)

Would the values inside request be mixed up in callback?

I am new to Node.js, and I have been reading questions and answers related to this issue, but I'm still not sure I fully understand the concept in my case.
Suggested Code
router.post('/test123', function(req, res) {
  someAsyncFunction1(parameter1, function(result1) {
    someAsyncFunction2(parameter2, function(result2) {
      someAsyncFunction3(parameter3, function(result3) {
        var theVariable1 = req.body.something1;
        var theVariable2 = req.body.something2;
      });
    });
  });
});
Question
I assume there will be multiple requests (10+, 100+, or whatever) to one particular route (for example, AJAX requests to /test123, as shown above) at the same time, each carrying its own variables (something1 and something2). According to this, it should be impossible for one user's theVariable1 and theVariable2 to be mixed up with (i.e., overwritten by) another user's req.body.something1 and req.body.something2. I am wondering if this is still true when there are multiple nested callbacks (three like the above, or ten, just in case).
I am also considering using res.locals to save some data from the callbacks (instead of using theVariable1 and theVariable2). Is that a good idea, given that the data must not be overwritten by other simultaneous requests from clients?
Each request a Node.js/Express server receives gets a new req object generated for it.
So in the line router.post('/test123', function(req, res), the req object being passed in as an argument is unique to that HTTP request.
You don't have to worry about multiple functions or callbacks. In a traditional application, if I have two objects, cat and dog, that I can pass to a listen function, I get back meow and bark even though there's only one listen function. That's roughly how you can view an Express app: even though you have all these get and post handlers, every user's request is passed to them as its own unique object.
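To address the res.locals part of the question: res.locals is also created fresh for every request/response pair, so a sketch like the following (reusing the hypothetical someAsyncFunction helpers from the question) cannot leak data between users:
router.post('/test123', function(req, res) {
  someAsyncFunction1(parameter1, function(result1) {
    // res.locals belongs to this request/response pair only
    res.locals.result1 = result1;
    someAsyncFunction2(parameter2, function(result2) {
      res.locals.result2 = result2;
      res.json({
        first: res.locals.result1,
        second: res.locals.result2,
        something1: req.body.something1
      });
    });
  });
});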

Best practices form processing with Express

I'm writing a website which implements a user-management system, and I wonder which best practices regarding form processing I have to consider.
Performance, security, SEO and user experience are especially important to me. While working on it I came across a couple of questions, and I couldn't find a complete Node/Express code snippet that covers all of the questions below.
Use case: someone updates the birthday on their profile. Right now I do a POST request to the same URL to process the form on that page, and the POST request responds with a 302 redirect back to the same URL.
General questions about form processing:
Should I do a POST request + 302 redirect for form processing or rather something else like an AJAX request?
How should I handle invalid FORM requests (for example invalid login, or email address is already in use during signup)?
Express specific questions about form processing:
I assume before inserting anything into my DB I need to sanitize and validate all form fields on the server side. How would you do that?
I read some things about CSRF but I have never implemented CSRF protection. I'd be happy to see that in the code snippet too.
Do I need to take care of any other possible vulnerabilities when processing forms with Express?
Example HTML/Pug:
form#profile(method='POST', action='/settings/profile')
  input#profile-real-name.validate(type='text', name='profileRealName', value=profile.name)
  label(for='profile-real-name') Name
  textarea#profile-bio.materialize-textarea(placeholder='Tell a little about yourself', name='profileBio')
    | #{profile.bio}
  label(for='profile-bio') About
  input#profile-url.validate(type='url', name='profileUrl', value=profile.url)
  label(for='profile-url') URL
  input#profile-location.validate(type='text', name='profileLocation', value=profile.location)
  label(for='profile-location') Location
  .form-action-buttons.right-align
    a.btn.grey(href='' onclick='resetForm()') Reset
    button.btn.waves-effect.waves-light(type='submit')
Example Route Handlers:
router.get('/settings/profile', isLoggedIn, profile)
router.post('/settings/profile', isLoggedIn, updateProfile)

function profile(req, res) {
  res.render('user/profile', { title: 'Profile', profile: req.user.profile })
}

function updateProfile(req, res) {
  var userId = req.user._id
  var form = req.body
  var profile = {
    name: form.profileRealName,
    bio: form.profileBio,
    url: form.profileUrl,
    location: form.profileLocation
  }
  // Insert into DB
}
Note: A complete code snippet which takes care of all form processing best practices adapted to the given example is highly appreciated. I'm fine with using any publicly available express middleware.
Should I do a POST request + 302 redirect for form processing or rather something else like an AJAX request?
No. Best practice for a good user experience since around 2004 (basically since Gmail launched) has been form submission via AJAX rather than web-1.0 full-page-load form POSTs. In particular, error handling via AJAX is less likely to leave your user at a dead-end browser error page and then run into problems with the back button. The AJAX call in this case should send an HTTP PATCH request to be most semantically correct, but POST or PUT will also get the job done.
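A minimal client-side sketch of that idea, assuming the /settings/profile route from the question (the JSON body shape and the error handling are illustrative only):
// Submit the birthday change via AJAX instead of a full-page form POST.
fetch('/settings/profile', {
  method: 'PATCH',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ birthday: '1990-04-01' })
}).then(function (response) {
  if (!response.ok) {
    // surface field errors in the form without leaving the page
    return response.json().then(function (errors) { showFormErrors(errors) });
  }
  // update the UI in place, no page reload
});
Here showFormErrors is a placeholder for whatever your client uses to render validation messages.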
How should I handle invalid FORM requests (for example invalid login, or email address is already in use during signup)?
Invalid user input should result in an HTTP 400 Bad Request response, with details about the specific error(s) in a JSON response body (the format varies per application, but either a general message or field-by-field errors are common themes).
For "email already in use" I use the HTTP 409 Conflict status code as a more specific flavor of the general bad-request response.
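A hypothetical Express sketch of those two responses (isValidEmail and emailTaken are placeholder checks, not real library calls):
if (!isValidEmail(req.body.email)) {
  // 400: the input itself is malformed
  return res.status(400).json({ errors: { email: 'Please provide a valid email address' } })
}
if (emailTaken) {
  // 409: the input is well formed but conflicts with existing state
  return res.status(409).json({ errors: { email: 'This email address is already in use' } })
}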
I assume before inserting anything into my DB I need to sanitize and validate all form fields on the server side. How would you do that?
Absolutely. There are many tools. I generally define a schema for a valid request in JSON Schema and validate it with a library from npm such as is-my-json-valid or ajv. In particular, I recommend being as strict as possible: reject incorrect types (or coerce them if you must), remove unexpected properties, use small but reasonable string-length limits and strict regular-expression patterns for strings where you can, and of course make sure your DB library properly prevents injection attacks.
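As one possible sketch of that approach with ajv (the schema below simply mirrors the profile form fields from the question; the exact limits are illustrative):
var Ajv = require('ajv')
var ajv = new Ajv({ allErrors: true, removeAdditional: true })

var validateProfile = ajv.compile({
  type: 'object',
  additionalProperties: false, // unexpected properties are stripped rather than accepted
  required: ['profileRealName'],
  properties: {
    profileRealName: { type: 'string', minLength: 1, maxLength: 100 },
    profileBio: { type: 'string', maxLength: 1000 },
    profileUrl: { type: 'string', maxLength: 500 },
    profileLocation: { type: 'string', maxLength: 100 }
  }
})

function updateProfile(req, res) {
  if (!validateProfile(req.body)) {
    return res.status(400).json({ errors: validateProfile.errors })
  }
  // body is now validated and stripped of unknown fields; insert into DB...
}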
I read some things about CSRF but I have never implemented a CSRF protection.
The OWASP Node Goat Project CSRF exercise is a good place to start: work with a vulnerable app, understand and exploit the vulnerability, then implement the fix (in this case a straightforward integration of the express.csrf() middleware).
Do I need to take care of any other possible vulnerabilities when processing forms with Express?
Yes, in general application developers must understand security issues and actively code securely. There's a lot of material out there on this, but particular care must be taken when user input ends up in database queries, spawned subprocesses, or HTML written back out to the page. Solid query libraries and template engines will handle most of the work here; you just need to be aware of the mechanics and of the potential places malicious user input could sneak in (like image filenames, etc.).
I am certainly no Express expert but I think I can answer at least #1:
You should follow the Post/Redirect/Get web development pattern in order to prevent duplicate form submissions. I've heard that a 303 redirect is the proper HTTP status code for redirecting form submissions.
I process forms in the POST route, and once I'm done I trigger a 302 redirect.
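Applied to the question's updateProfile handler, a Post/Redirect/Get flow using a 303 could look roughly like this (sketch only):
function updateProfile(req, res) {
  // ...validate and save the profile...
  // 303 tells the browser to follow up with a GET, so refreshing the
  // resulting page won't resubmit the form
  res.redirect(303, '/settings/profile')
}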
As for #3, I recommend looking into express-validator, which is introduced well here: https://developer.mozilla.org/en-US/docs/Learn/Server-side/Express_Nodejs/forms . It's a middleware which lets you validate and sanitize like this:
req.checkBody('name', 'Invalid name').isAlpha();
req.checkBody('age', 'Invalid age').notEmpty().isInt();
req.sanitizeBody('name').escape();
I wasn't able to comment, hence this answer even though it's not a complete one. Just thought it might help you.
If user experience is something you're thinking about, a page redirection is a strong no. Providing a smooth flow for the people visiting your website is important to prevent drop-offs, and since forms are already not a pleasure to fill in, easing their usage is essential. You don't want to reload a page that may already have taken some time to load just to display an error message. Once the form is valid and you have created the user's cookie, a redirect is fine, though you could also handle that on the client app to avoid it; but that's out of scope.
As stated by Levent, you should check out express-validator, which is the more established solution for this kind of purpose.
req.check('profileRealName', 'Bad name provided').notEmpty().isAlpha()
req.check('profileLocation', 'Invalid location').optional().isAlpha();
req.getValidationResult().then(function (result) {
  if (!result.isEmpty()) {
    var errors = result.array()
    // [
    //   { param: "profileRealName", msg: "Bad name provided", value: ".." },
    //   { param: "profileLocation", msg: "Invalid location", value: ".." }
    // ]
    return res.status(400).send(errors)
  }
  // everything is fine! insert into the DB and respond..
})
From what it looks like, I assume you are using MongoDB. Given that, I would recommend using an ODM such as Mongoose. It lets you define models for your schemas and put restrictions directly on them, so the model handles this kind of redundant validation for you.
For example, a model for your user could be
var mongoose = require('mongoose')

var userSchema = new mongoose.Schema({
  name: { type: String, required: [true, 'Name required'] },
  bio: { type: String, match: /[a-z]/ },
  age: { type: Number, min: 18 }, // I don't know the kind of site you run ;)
})

var User = mongoose.model('User', userSchema)
Using this schema in your route would look like
var user = new User({
  name: form.profileRealName,
  bio: form.profileBio,
  url: form.profileUrl,
  location: form.profileLocation
})

user.save(function (err) {
  // and you could grab the error here if it exists, and react accordingly
});
As you can see it provides a pretty nice API, which you should read about in their docs if you want to know more.
Regarding CSRF, you should install csurf, which has pretty good instructions and usage examples in its readme.
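For reference, a minimal csurf wiring for the profile form could look roughly like this (a sketch based on the pattern in the csurf readme; it assumes cookie-parser or session middleware is already set up, and reuses the route handlers from the question):
var csrf = require('csurf')
var csrfProtection = csrf({ cookie: true }) // cookie mode requires cookie-parser

router.get('/settings/profile', isLoggedIn, csrfProtection, function (req, res) {
  // expose the token to the Pug template so the form can include it
  res.render('user/profile', {
    title: 'Profile',
    profile: req.user.profile,
    csrfToken: req.csrfToken()
  })
})

router.post('/settings/profile', isLoggedIn, csrfProtection, updateProfile)
The form then needs to send the token back, e.g. with a hidden field such as input(type='hidden', name='_csrf', value=csrfToken) in the Pug template.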
After that you're pretty much good to go. There's not much more I can think of, apart from making sure you stay up to date with your critical dependencies in case a 0-day occurs, such as the one that happened in 2015 with JWTs; but that's still fairly rare.

sails.js Use session param in model

This is an extension of this question.
In my models, every one requires a companyId to be set on creation, and every query needs to be filtered by the same companyId held in the session.
With sails.js, I have read and understand that the session is not available in the model unless I inject it from the controller; however, this would require me to write something very, very repetitive in all of my controllers/actions. Unfortunate.
I like sails.js and want to make the switch, but can anyone describe a better way to me? I'm hoping I have just missed something.
So, if I understand you correctly, you want to avoid lots of code like this in your controllers:
SomeModel.create({companyId: req.session.companyId, ...})
SomeModel.find({companyId: req.session.companyId, ...})
Fair enough. Maybe you're concerned that companyId will be renamed in the future, or will need further processing. The simplest solution, if you're using custom controller actions, would be to give your models class methods that accept the request as an argument (see the sketch after the example calls below):
SomeModel.doCreate(req, ...);
SomeModel.doFind(req, ...);
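A minimal sketch of what such class methods could look like on a Sails model (the model name SomeModel, the callback style, and the Waterline create/find usage are illustrative assumptions, not code from the answer):
// api/models/SomeModel.js (hypothetical)
module.exports = {
  attributes: {
    companyId: 'string'
    // ...other attributes...
  },

  // Class methods: they take the request so they can read the session themselves.
  doCreate: function (req, values, cb) {
    values = values || {};
    values.companyId = req.session.companyId;
    return SomeModel.create(values).exec(cb);
  },

  doFind: function (req, criteria, cb) {
    criteria = criteria || {};
    criteria.companyId = req.session.companyId;
    return SomeModel.find(criteria).exec(cb);
  }
};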
On the other hand, if you're on v0.10.x and can use blueprints for some CRUD actions, you will benefit from the ability to override the blueprints with your own code, so that all of your creates and finds automatically use the companyId from the session.
If you're coming from a non-Node background, this might all induce some head-scratching. "Why can't you just make the session available everywhere?" you might ask. "LIKE THEY DO IN PHP!"
The reason is that PHP is stateless--every request that comes in gets essentially a fresh copy of the app, with nothing in memory being shared between requests. This means that any global variables will be valid for the life of a single request only. That wonderful $_SESSION hash is yours and yours alone, and once the request is processed, it disappears.
Contrast this with Node apps, which essentially run in a single process. Any global variables you set would be shared between every request that comes in, and since requests are handled asynchronously, there's no guarantee that one request will finish before another starts. So a scenario like this could easily occur:
Request A comes in.
Sails acquires the session for Request A and stores it in the global $_SESSION object.
Request A calls SomeModel.find(), which calls out to a database asynchronously
While the database does its magic, Request A surrenders its control of the Node thread
Request B comes in.
Sails acquires the session for Request B and stores it in the global $_SESSION object.
Request B surrenders its control of the thread to do some other asynchronous call.
Request A comes back with the result of its database call, and reads something from the $_SESSION object.
You can see the issue here--Request A now has the wrong session data. This is the reason why the session object lives inside the request object, and why it needs to be passed around to any code that wants to use it. Trying too hard to circumvent this will inevitably lead to trouble.
The best option I can think of is to take advantage of JS and make some globally accessible functions.
But it's gonna have a code smell :(
I prefer to make a policy that adds the companyId to the request body params, like this:
// Needs to be Logged
module.exports = function(req, res, next) {
  sails.log.verbose('[Policy.insertCompanyId() called] ' + __filename);

  if (req.session) {
    req.body.user = req.session.companyId;
    // or something like AuthService.getCompanyId(req.session);
    return next();
  }

  var err = 'Missing companyId';
  // log ...
  return res.redirect(307, '/');
};
