When using ctx.cookies.set(), Koa adds a Set-Cookie header to the response. However, ctx.cookies.get() still returns the cookie (if any) from the original request. This seems counter-intuitive if I'm overwriting it. Is there no way to have the getter reflect the new value immediately, within the same request?
const { v4: uuidv4 } = require('uuid');

const h = uuidv4();
console.log('set new cookie', h);
ctx.cookies.set('uuid', h, { httpOnly: false, overwrite: true });
// This outputs undefined, or the value that came with the request,
// not the newly assigned value:
console.log('cookie is', ctx.cookies.get('uuid'));
I'm fairly new to Node.js. What I'm looking for is something like PHP's $_SESSION[], in which values are updated and available immediately as well as being written to the response cookie. I understand I could stash the new uuid in ctx.state, but it seems cleaner to be able to call ctx.cookies.get() anywhere further down the middleware chain and get back what I just set.
Having a set() that doesn't change the result of its get() seems like a code smell to me. Am I missing something?
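The closest workaround I've come up with is wrapping the getter and setter myself in a middleware, which feels like overkill (just a sketch of the idea, not anything Koa provides):

app.use(async (ctx, next) => {
  const written = {}; // values set during this request
  const rawSet = ctx.cookies.set.bind(ctx.cookies);
  const rawGet = ctx.cookies.get.bind(ctx.cookies);
  ctx.cookies.set = (name, value, opts) => {
    written[name] = value;            // remember the new value for later get() calls
    return rawSet(name, value, opts); // still emit the Set-Cookie header
  };
  ctx.cookies.get = (name, opts) => (name in written ? written[name] : rawGet(name, opts));
  await next();
});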
Introduction
So I am building a website with Node.js, Express, express-session, and Sequelize. Once a user logs in, an instance of the Sequelize model User is created. In my route for user log-in (/auth/login), I have:
var user = (await User.findAll({
  where: {
    username: username
  }
}))[0];
and a few lines down I assign that user to the session:
req.session.user = user;
And then I can persist any changes by simply calling the save method of req.session.user:
await req.session.user.save();
And indeed, if I add this line next:
console.log(Object.getPrototypeOf(req.session.user));
the output is [object SequelizeInstance:User]. So far so good.
Here is the problem
In another route (/users/myaccount/edit-bio) I am able to access the values of req.session.user. That is, the output of
console.log(req.session.user.username);
is seanletendre, as expected. But now when I call
await req.session.user.save();
all I get is the error message:
UnhandledPromiseRejectionWarning: TypeError: req.session.user.save is not a function
"That is weird," I thought, "isn't this the same object?" To investigate, I add the line:
console.log(Object.getPrototypeOf(req.session.user));
just as I did in the log-in route. And what is the output? It is: [object Object]. So it seems that somehow the prototype of req.session.user gets forgotten. I don't understand how this can be.
Is it possible to re-assign a prototype to a plain object?
Suspect A
Based on the comments to my question, I suspect that the prototype is lost when the session manager serializes req.session. It seems that, contrary to what I thought before, req.session does not point to the exact same session object across different requests. Each time a request ends, req.session is serialized and stored. Then, upon receiving a new request with a cookie designating it as part of the same session, the session object is fetched from the session store.
This is how my session middleware is set up:
var session = require('express-session');
//
// yada, yada, yada
//
app.use(session({
  secret: process.env.SESSION_SECRET,
  resave: false,
  saveUninitialized: false,
  cookie: { secure: true }
}));
So what surprises me is that, even though I am using the default store, MemoryStore, my sessions are still serialized.
My question now becomes: how can I prevent object serialization upon session store when using MemoryStore?
In express-session, the save() method is exposed on the session object attached to the request (docs), e.g.:
req.session.save(callback)
Your code req.session.user.save() is wrong; the correct call is req.session.save(). The diff:
req.session.user.save();
-----------^^^^^
req.session.save()
The save() method doesn't return a Promise; you must pass a callback to wait for the result of the save:
req.session.user = user;
req.session.save(function(err) {
  if (err) {
    // session not saved
  } else {
    // session saved
  }
});
You can wrap it in a Promise (and await it) like this:
const saveSession = (req) => {
  return new Promise((resolve, reject) => {
    req.session.save(function(err) {
      if (err) {
        reject(err);
      } else {
        resolve(true);
      }
    });
  });
};

req.session.user = user;
await saveSession(req);
The save() method is called automatically at the end of the HTTP response if the session data has been altered, so you typically don't need to call it yourself.
UPDATE
I call req.session.user.save() because I want to save the Sequelize
Model instance into my database.
express-session uses JSON.stringify() to serialize the session into a string and store it in the session store (more info). JSON.stringify() doesn't understand functions; only plain properties are stored. For this reason, your save() method is lost.
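You can see the effect without any framework at all; a JSON round trip keeps the data but strips methods and the prototype:

class User {
  constructor(username) { this.username = username; }
  save() { /* persist to the database */ }
}

const original = new User('seanletendre');
// This is effectively what the session store does between requests:
const roundTripped = JSON.parse(JSON.stringify(original));

console.log(typeof original.save);                                      // 'function'
console.log(typeof roundTripped.save);                                  // 'undefined'
console.log(Object.getPrototypeOf(roundTripped) === Object.prototype);  // true: just a plain object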
Is it possible to re-assign a prototype to a plain object?
Technically you could re-assign the prototype (e.g. with Object.setPrototypeOf) and the constructor. However, a model instance is quite a complex object. Try logging it to the debugging console and you'll see it has lots of sub-objects of its own kind, plus references to other, more distant objects living in a "more global" space (nothing like the plain object you get back after deserializing the JSON). You'd have to re-instantiate them all, which is very hard to do.
My question now becomes: how can I prevent object serialization upon session store when using MemoryStore?
I am not experienced with express-session, but from what I read, I think it basically works by stringifying and parsing JSON. So you can't prevent serialization while using express-session.
What you could do:
1. Just store user.id in req.session.user_id, and call User.findOne({ where: { id } }) when handling a new request in that session. You then get a "real" Sequelize model instance, and you can assign that instance to req.user so it's available throughout the request (see the sketch after this list).
2. If you also hoped to avoid reloading the user instance on every request, then you are looking for a "caching" solution, which is different from "sessions". You'd then use a caching system like memcache, redis, or others. However, there too, you probably won't get back a "real" model instance from the caching server.
3. If your application runs on only one server, you could have a global.users variable where you store all user instances by their id, for example global.users[12345] = user. As with 1., you'd retrieve it (or load it when not found) based on session.user_id.
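Here's roughly what option 1 could look like as an Express middleware; this is only a sketch, and the req.user property name is just a convention, not something express-session provides:

// Rehydrate a real Sequelize instance on every request in the session.
app.use(async function loadUser(req, res, next) {
  try {
    if (req.session && req.session.user_id) {
      req.user = await User.findOne({ where: { id: req.session.user_id } });
    }
    next();
  } catch (err) {
    next(err);
  }
});

// In the login route, store only the id:
// req.session.user_id = user.id;
// In other routes, req.user is a real model instance again:
// await req.user.save();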
Some more thoughts: in your place, I'd also invest some time in figuring out whether Sequelize lets you populate a new User() instance with data from JSON and trick it into believing that it is not a new record.
Also keep in mind that, depending on your application, the user's data in the database can change while the copy in your session or in global.users misses those changes.
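For that "trick Sequelize" idea, the build() options do allow marking an instance as already persisted; a rough sketch only (double-check the behavior for your Sequelize version, and note that save() only writes fields it considers changed):

// Rebuild a model instance from the plain session data.
// isNewRecord: false tells Sequelize the row already exists, so save() issues an UPDATE, not an INSERT.
const rehydrated = User.build(req.session.user, { isNewRecord: false });
rehydrated.changed('bio', true); // mark edited fields as changed if needed (field name illustrative)
await rehydrated.save();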
I am a beginner who has been banging my head for days on this problem; I'm really, really stuck.
Basically I just want to make a POST request using Node and Express. The object will be created dynamically, but this is my hard-coded example. myObj contains an array because I want to do one database insert for each item later on the server side.
let myObj = {
  id: 50,
  damage_type: ["missing", "broken", "light"]
};

// Serialize myObj to a JSON string to be sent
let myjsonObj = JSON.stringify(myObj);
console.log(myjsonObj);
// {"id":50,"damage_type":["missing","broken","light"]}

postDamage(myjsonObj);

function postDamage(damage) {
  $.post({
    type: 'POST',
    url: '/damage',
    data: damage
  }).done(function (damage) {
    // Do things
  });
}
router.post('/damage', (req, res) => {
  let data = req.body;
  console.log(data);
  // This is what I get in the node terminal, which is nonsense I cannot work with:
  // { '{"id":50,"damage_type":["missing","broken","light"]}': '' }
});

I expect it to look like {"id":50,"damage_type":["missing","broken","light"]}, so I can loop through damage_type creating new objects with this structure:

createSQLfunction({ id: 50, damage_type: "missing" })
If I don't stringify myObj, the node terminal prints
{ poi: '50', 'damage_type[]': [ 'missing', 'broken', 'light' ] }
Where does the extra [] come from?!
What am I doing wrong that prevents me from sending an array inside an object to the server side?
From the jQuery website:
data
Type: PlainObject or String or Array
Data to be sent to the server. It is converted to a query string, if not already a string. It's appended to the url for GET-requests. See processData option to prevent this automatic processing. Object must be Key/Value pairs. If value is an Array, jQuery serializes multiple values with same key based on the value of the traditional setting (described below).
The traditional setting appears to be whether it url-encodes as key[]=val1&key[]=val2 or just key=val1&key=val2. You can give it a try, YMMV.
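If you do want to experiment with that, it is a single extra setting; this sketch assumes a jQuery version recent enough to accept a settings object for $.post:

$.post({
  url: '/damage',
  data: myObj,          // plain object, let jQuery url-encode it
  traditional: true     // damage_type=missing&damage_type=broken... instead of damage_type[]=...
});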
Or you could make your life a lot easier and just serialize the JSON yourself, instead of messing with jQuery's URL encoding.
Edit: In answer to your question about best practices: back before JavaScript form submissions became popular, the two standard ways of submitting a form were application/x-www-form-urlencoded or multipart/form-data. The latter was mostly used if you had file(s) you were submitting with a form.
However with the advent of JavaScript XHR (ajax) form submissions, it has become much more common/popular to use JSON instead of either of these formats. So there is absolutely nothing at all wrong with doing something like data: JSON.stringify(object) when you submit your data, and then just instruct your server to read the JSON.
In fact it's probably both easier and faster. And it is a very popular method, so no worries about going against modern best practices.
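A sketch of that JSON approach for this case; it assumes Express 4.16+ for express.json() (use bodyParser.json() on older versions) and reuses the createSQLfunction name from the question:

// Client: send real JSON and say so in the Content-Type header
$.ajax({
  type: 'POST',
  url: '/damage',
  contentType: 'application/json',
  data: JSON.stringify(myObj)
}).done(function (response) {
  // Do things
});

// Server: enable the JSON body parser, then req.body is a normal object
app.use(express.json());

router.post('/damage', (req, res) => {
  // req.body is { id: 50, damage_type: ['missing', 'broken', 'light'] }
  req.body.damage_type.forEach(type => {
    createSQLfunction({ id: req.body.id, damage_type: type });
  });
  res.sendStatus(200);
});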
I am new to Node.js, and I have been reading questions and answers related to this issue, but I'm still not sure I fully understand the concept in my case.
Suggested Code
router.post('/test123', function(req, res) {
  someAsyncFunction1(parameter1, function(result1) {
    someAsyncFunction2(parameter2, function(result2) {
      someAsyncFunction3(parameter3, function(result3) {
        var theVariable1 = req.body.something1;
        var theVariable2 = req.body.something2;
      });
    });
  });
});
Question
I assume there will be multiple (10+, 100+, or whatever) requests to one particular route (for example, AJAX requests to /test123, as shown above) at the same time, each with its own variables (something1 and something2). According to this, it should be impossible for one user's theVariable1 and theVariable2 to be mixed up with (i.e., overwritten by) another user's req.body.something1 and req.body.something2. I am wondering if this is true when there are multiple nested callbacks (three like the above, or ten, just in case).
I am also considering using res.locals to save some data from the callbacks (instead of using theVariable1 and theVariable2), but is it a good idea to do so, given that the data must not be overwritten by multiple simultaneous requests from clients?
Each request a Node.js/Express server receives generates a new req object.
So in the line router.post('/test123', function(req, res), the req object that's being passed in as an argument is unique to that HTTP connection.
You don't have to worry about multiple functions or callbacks. In a traditional application, if I have two objects, cat and dog, that I can pass to a listen function, I get back meow and bark respectively, even though there's only one listen function. That's sort of how you can view an Express app: even though you have all these get and post handlers, every user's request is passed to them as its own distinct object.
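To make that analogy concrete, here's a toy illustration (the listen function is made up); each call closes over its own argument, just as each handler invocation closes over its own req and res:

function listen(animal, callback) {
  // simulate async work finishing in an unpredictable order
  setTimeout(() => callback(animal.sound), Math.random() * 100);
}

listen({ sound: 'meow' }, sound => console.log(sound)); // always logs 'meow'
listen({ sound: 'bark' }, sound => console.log(sound)); // always logs 'bark'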
This is an extension of this question.
In my models, every one requires a companyId to be set on creation, and every query needs to filter models by the same companyId held in the session.
With Sails.js, I have read and understand that the session is not available in the model unless I inject it from the controller; however, this would require me to add something very, very repetitive to all my controllers/actions. Unfortunate.
I like Sails.js and want to make the switch, but can anyone describe a better way? I'm hoping I have just missed something.
So, if I understand you correctly, you want to avoid lots of code like this in your controllers:
SomeModel.create({companyId: req.session.companyId, ...})
SomeModel.find({companyId: req.session.companyId, ...})
Fair enough. Maybe you're concerned that companyId will be renamed in the future, or needs to be further processed. The simplest solution, if you're using custom controller actions, would be to add class methods to your models that accept the request as an argument:
SomeModel.doCreate(req, ...);
SomeModel.doFind(req, ...);
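A rough sketch of what such a class method might look like in a Sails model file (the doCreate/doFind names and the merge-into-criteria approach are just illustrative):

// api/models/SomeModel.js
module.exports = {
  attributes: {
    // ...
  },

  // Accept the request so the model can read companyId from the session itself.
  doCreate: function(req, values) {
    return SomeModel.create(Object.assign({}, values, { companyId: req.session.companyId }));
  },

  doFind: function(req, criteria) {
    return SomeModel.find(Object.assign({}, criteria, { companyId: req.session.companyId }));
  }
};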
On the other hand, if you're on v0.10.x and you can use blueprints for some CRUD actions, you will benefit from the ability to override the blueprints with your own code, so that all of your creates and finds automatically use the companyId from the session.
If you're coming from a non-Node background, this might all induce some head-scratching. "Why can't you just make the session available everywhere?" you might ask. "LIKE THEY DO IN PHP!"
The reason is that PHP is stateless--every request that comes in gets essentially a fresh copy of the app, with nothing in memory being shared between requests. This means that any global variables will be valid for the life of a single request only. That wonderful $_SESSION hash is yours and yours alone, and once the request is processed, it disappears.
Contrast this with Node apps, which essentially run in a single process. Any global variables you set would be shared between every request that comes in, and since requests are handled asynchronously, there's no guarantee that one request will finish before another starts. So a scenario like this could easily occur:
1. Request A comes in.
2. Sails acquires the session for Request A and stores it in the global $_SESSION object.
3. Request A calls SomeModel.find(), which calls out to a database asynchronously.
4. While the database does its magic, Request A surrenders its control of the Node thread.
5. Request B comes in.
6. Sails acquires the session for Request B and stores it in the global $_SESSION object.
7. Request B surrenders its control of the thread to do some other asynchronous call.
8. Request A comes back with the result of its database call, and reads something from the $_SESSION object.
You can see the issue here--Request A now has the wrong session data. This is the reason why the session object lives inside the request object, and why it needs to be passed around to any code that wants to use it. Trying too hard to circumvent this will inevitably lead to trouble.
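A contrived snippet showing the same failure mode (doSomeSlowDatabaseWork and userName are purely illustrative):

// DON'T do this: one global shared by all in-flight requests
let GLOBAL_SESSION = null;

app.get('/whoami', async (req, res) => {
  GLOBAL_SESSION = req.session;        // request A parks its session globally
  await doSomeSlowDatabaseWork();      // request B can run now and overwrite GLOBAL_SESSION
  res.send(GLOBAL_SESSION.userName);   // request A may answer with request B's user
});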
The best option I can think of is to take advantage of JS and make some globally accessible functions.
But it's gonna have a code smell :(
I prefer to make a policy that adds the companyId to the body params, like this:
// Needs to be Logged
module.exports = function(req, res, next) {
  sails.log.verbose('[Policy.insertCompanyId() called] ' + __filename);
  if (req.session) {
    req.body.user = req.session.companyId;
    // or something like AuthService.getCompanyId(req.session);
    return next();
  }
  var err = 'Missing companyId';
  // log ...
  return res.redirect(307, '/');
};
In Express you call var app = module.exports = express.createServer();, which creates a new HTTPServer object. I'd like to get access to the current req object from this app (HTTPServer) object. Is there a way to do this?
The req object is only created when the underlying HTTPServer actually gets a request, and only lasts for as long as the request is processed. So it's not really meaningful to talk about it outside the context of a callback.
During a callback, you can simply copy the appropriate data from the session object somewhere else and use that copy in your websockets code. But you can't count on the request object, or even the session object, remaining after you've finished processing the request.
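For example, a minimal sketch of that "copy it while you still have it" idea (the sessionCopies map and the route are purely illustrative, not an Express or websocket API):

// Keep a per-user copy of just the session data the websocket code needs.
const sessionCopies = new Map();

app.get('/prepare-socket', function(req, res) {
  // Copy the values now, while req and req.session are still alive.
  sessionCopies.set(req.session.id, { userId: req.session.userId });
  res.json({ token: req.session.id });
});

// Later, the websocket handler can call sessionCopies.get(token)
// without ever touching the (long-gone) req object.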
Showing a small code example would be helpful; it sounds like you've got an "XY problem" (you want to accomplish a goal X and you've decided that technique Y is the right way to do it, when in fact technique Z might work better).