I'm writing an app that builds an API by scraping an external domain. In order to scrape the domain, my server must be authorized (with a session cookie).
I'm using the request module with a cookie jar to maintain the cookies across requests.
I want to set up some Node router middleware so that, if/when the session expires, I can re-run my authentication method. Think something like this:
export function validate(req, res, next) {
  const cookie = cookieJar.getCookie('**target domain**');
  const COOKIE_IS_EXPIRED = // ???

  if (COOKIE_IS_EXPIRED) {
    authenticate().then(next);
  } else {
    next();
  }
}
When I log the contents of cookieJar.getCookies(), my result is something like the following:
[ Cookie="lw_opac_1=1483019929431275955; Expires=Wed, 29 Mar 2017 13:58:49 GMT; Max-Age=7776000; Path=/opac; hostOnly=true; aAge=0ms; cAge=6345ms",
Cookie="lw_opac=ta4tscsejs6c94ngikt7hlbcn0; Path=/; hostOnly=true; aAge=0ms; cAge=6347ms" ]
How can I detect when both cookies are close to expiring (or have expired), and then re-run my auth function?
Thanks!
You can use the cookie module from npm to parse the cookies.
So, for example when you have a string like you posted:
var c = "lw_opac_1=1483019929431275955; Expires=Wed, 29 Mar 2017 13:58:49 GMT; Max-Age=7776000; Path=/opac; hostOnly=true; aAge=0ms; cAge=6345ms";
then running:
console.log(cookie.parse(c));
will result in printing:
{ lw_opac_1: '1483019929431275955',
Expires: 'Wed, 29 Mar 2017 13:58:49 GMT',
'Max-Age': '7776000',
Path: '/opac',
hostOnly: 'true',
aAge: '0ms',
cAge: '6345ms' }
You can take the Max-Age and Expires fields, parse them with moment, and compare them to the current time.
With moment you can compare the given date with today and get the number of hours or days of the difference. For example this:
moment().isAfter(someDate);
will tell you if that date has already passed (so if it's the expiration date then it will tell you if the cookie has expired).
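If you want to see the whole check end to end without pulling in any modules, here is a minimal sketch using only the built-in Date. The hand-rolled parseCookie below is a simplified stand-in for cookie.parse and assumes plain "key=value; key2=value2" input:

```javascript
// Parse a cookie string into key/value pairs (simplified stand-in for
// the cookie module's parse(); splits on ';' and takes the first '=').
function parseCookie(str) {
  var out = {};
  str.split(';').forEach(function (pair) {
    var idx = pair.indexOf('=');
    if (idx === -1) return;
    out[pair.slice(0, idx).trim()] = pair.slice(idx + 1).trim();
  });
  return out;
}

// Equivalent of moment().isAfter(expires): true once Expires has passed.
function isExpired(parsed, now) {
  now = now || new Date();
  return new Date(parsed.Expires) <= now;
}

var c = 'lw_opac_1=1483019929431275955; Expires=Wed, 29 Mar 2017 13:58:49 GMT; Max-Age=7776000; Path=/opac';
var parsed = parseCookie(c);

console.log(parsed.Expires);    // 'Wed, 29 Mar 2017 13:58:49 GMT'
console.log(isExpired(parsed)); // true -- that date is in the past
```

Inside your validate middleware, isExpired would give you the COOKIE_IS_EXPIRED flag from the question.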
See:
https://www.npmjs.com/package/cookie
https://www.npmjs.com/package/moment
@rsp's answer is correct. But if you are going to maintain an access security token, you should consider a CSRF token, which automatically maintains the client-side cookie and generates a token for each new session. For more detail see http://www.senchalabs.org/connect/csrf.html
I am using the Azure Node-based API to set up an implicit flow of OAuth v2. Upon typing the URL into the browser, although the page doesn't return anything, the browser URL is updated to contain access_token and other parameters, following the redirect. I am looking to extract these by using a curl command instead and execute it in Node.js on the server side. I have been trying to send the below curl request:
curl -i -g -H "Content-Type: application/json" -d "{"client_id":"[client_id]","response_type":"id_token+token","scope":"open_id api://[client_id]/access_as_user","response_mode":"fragment","state":"12345","nonce":"678910","redirect_uri":"http://localhost:3000/account/"}" 'https://login.microsoftonline.com/microsoft.onmicrosoft.com/oauth2/v2.0/authorize'
The error I am getting is:
HTTP/1.1 400 Bad Request
Cache-Control: private
Content-Type: application/json; charset=utf-8
Server: Microsoft-IIS/8.5
Strict-Transport-Security: max-age=31536000; includeSubDomains
X-Content-Type-Options: nosniff
x-ms-request-id: b64ffac8-500a-48c9-ab61-7e64d74f0600
Set-Cookie: x-ms-gateway-slice=005; path=/; secure; HttpOnly
Set-Cookie: stsservicecookie=ests; path=/; secure; HttpOnly
X-Powered-By: ASP.NET
Date: Wed, 10 Jan 2018 09:02:16 GMT
Content-Length: 381
{"error":"invalid_request","error_description":"AADSTS90004: Malformed JSON\r\nTrace ID: b64ffac8-500a-48c9-ab61-7e64d74f0600\r\nCorrelation ID: 64d444f1-1dd6-4ba6-b75f-876778515239\r\nTimestamp: 2018-01-10 09:02:19Z","error_codes":[90004],"timestamp":"2018-01-10 09:02:19Z","trace_id":"b64ffac8-500a-48c9-ab61-7e64d74f0600","correlation_id":"64d444f1-1dd6-4ba6-b75f-876778515239"}
At this point, I created a Node app and used adal-node to retrieve the access token on the endpoint.
var express = require('express');
var AuthenticationContext = require('adal-node').AuthenticationContext;

var app = express();
var authorityUrl = 'https://login.windows.net/' + sampleParameters.tenant + '/oauth2/authorize?response_type=code&client_id=<client_id>&response_type=<response_type>&scope=api://<client_id>/access_as_user&response_mode=<response_mode>&state=<state>&nonce=<nonce>&redirect_uri=http://localhost:3000/account';

app.get('/account', function(req, res) {
  var authenticationContext = new AuthenticationContext(authorityUrl);
  authenticationContext.acquireToken(authorityUrl, function(err, response) {
    var message = '';
    if (err) {
      message = 'error: ' + err.message + '\n';
    }
    message += 'response: ' + JSON.stringify(response);
    if (err) {
      res.send(message);
      return;
    }
  });
});

app.listen(3000);
The error I see in the browser when I log in (note that all valid parameters have been entered above):
Error: acquireToken requires a function callback parameter.
Could anyone help with resolving the issue - trying to extract the access token from the output?
Reference link: https://learn.microsoft.com/en-us/azure/active-directory/develop/active-directory-v2-protocols-implicit
Basically, the implicit flow is used for JavaScript single-page applications (SPAs), while adal-node targets the Azure AD v1.0 endpoint, not the v2.0 endpoint. See Azure AD v2.0 authentication libraries.
For your scenario (a Node.js Web API), I would recommend using the OAuth 2.0 authorization code flow to protect your Web APIs. Here is a step-by-step guide to securing a Web API with Node.js.
I'm trying to expire a cookie with native Node.js, specifically in the Chrome browser. However, the expiration date that I set doesn't cause the cookie to go away.
As of right now, here's my code:
var cookie = 'Expires=' + new Date();
response.setHeader('Set-Cookie', cookie);
I ended up getting cookies with the expiration date like so even after subsequent requests:
cookie: Expires=Wed Mar 22 2017 02:14:52 GMT-0400 (EDT)
You can set the cookie's expiry and httpOnly flag using the code below (this uses Express's res.cookie):
res.cookie(myCookie, myValue, { expires: new Date(Date.now() + 10000), httpOnly: true });
https://expressjs.com/en/api.html#res.cookie
Try this; it may work for you:
response.setHeader('Set-Cookie', 'sesh=wakadoo; expires=' + new Date(Date.now() + 86409000).toUTCString());
Replace sesh=wakadoo with your variable.
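If the goal is to delete a cookie from native Node, a sketch like the following builds a Set-Cookie value with an already-past expiry date (the cookie name sesh is just a placeholder; note that the name=value pair comes first, and expires is an attribute of that named cookie, not the cookie name itself as in the original snippet):

```javascript
// Build a Set-Cookie header value that tells the browser to drop a cookie:
// empty value plus an expiry in the past, rendered with toUTCString() so it
// matches the date format browsers expect.
function expireCookieHeader(name) {
  return name + '=; expires=' + new Date(0).toUTCString() + '; path=/';
}

console.log(expireCookieHeader('sesh'));
// sesh=; expires=Thu, 01 Jan 1970 00:00:00 GMT; path=/
```

You would pass this string to response.setHeader('Set-Cookie', ...) exactly as in the question.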
I am working on a Node.js server side validation of json web tokens received from cross origin ajax clients. Presumably the tokens are generated by Google OpenID Connect which states the following:
To use Google's OpenID Connect services, you should hard-code the Discovery-document URI into your application. Your application fetches the document, then retrieves endpoint URIs from it as needed.
You may be able to avoid an HTTP round-trip by caching the values from the Discovery document. Standard HTTP caching headers are used and should be respected.
source: https://developers.google.com/identity/protocols/OpenIDConnect#discovery
I wrote the following function that uses request.js to get the keys and moment.js to add some timestamp properties to a keyCache dictionary where I store the cached keys. This function is called when the server starts.
function cacheWellKnownKeys(uri) {
  var openid = 'https://accounts.google.com/.well-known/openid-configuration';

  // get the well known config from google
  request(openid, function(err, res, body) {
    var config = JSON.parse(body);
    var jwks_uri = config.jwks_uri;
    var timestamp = moment();

    // get the public json web keys
    request(jwks_uri, function(err, res, body) {
      keyCache.keys = JSON.parse(body).keys;
      keyCache.lastUpdate = timestamp;
      // clone() so that add() does not also mutate lastUpdate
      keyCache.timeToLive = timestamp.clone().add(12, 'hours');
    });
  });
}
Having successfully cached the keys, my concern now is regarding how to effectively maintain the cache over time.
Since Google changes its public keys only infrequently (on the order of once per day), you can cache them and, in the vast majority of cases, perform local validation.
source: https://developers.google.com/identity/protocols/OpenIDConnect#validatinganidtoken
Since Google changes their public keys roughly once a day, my idea with the timestamp and timeToLive properties of keyCache is to do one of two things:
Set a timeout every 12 hours to update the cache
Deal with the case where Google changes their public keys in between my 12 hour update cycle. The first failed token validation on my end triggers a refresh of the key cache followed by one last attempt to validate the token.
This seems like a viable working algorithm until I consider an onslaught of invalid token requests that result in repeated round trips to the well known config and public keys while trying to update the cache.
Maybe there's a better way that will result in less network overhead. This one line from the first quote above may have something to do with developing a more efficient solution but I'm not sure what to do about it: Standard HTTP caching headers are used and should be respected.
I guess my question is really just this...
Should I be leveraging the HTTP caching headers from Google's discovery document to develop a more efficient caching solution? How would that work?
The discovery document has property jwks_uri which is the web address of another document with public keys. This other document is the one Google is referring to when they say...
Standard HTTP caching headers are used and should be respected.
An HTTP HEAD request to this address https://www.googleapis.com/oauth2/v3/certs reveals the following header:
HTTP/1.1 200 OK
Expires: Wed, 25 Jan 2017 02:39:32 GMT
Date: Tue, 24 Jan 2017 21:08:42 GMT
Vary: Origin, X-Origin
Content-Type: application/json; charset=UTF-8
X-Content-Type-Options: nosniff
x-frame-options: SAMEORIGIN
x-xss-protection: 1; mode=block
Content-Length: 1472
Server: GSE
Cache-Control: public, max-age=19850, must-revalidate, no-transform
Age: 10770
Alt-Svc: quic=":443"; ma=2592000; v="35,34"
X-Firefox-Spdy: h2
You can programmatically access these header fields from the response object generated by request.js and parse the max-age value out of them, something like this:
var cacheControl = res.headers['cache-control'];
var values = cacheControl.split(',');
var maxAge = parseInt(values[1].split('=')[1]);
The maxAge value is measured in seconds. The idea then is to set a timeout based on the maxAge (times 1000 for millisecond conversion) and recursively refresh the cache upon every timeout completion. This solves the problem of refreshing the cache on every invalid authorization attempt, and you can drop the timestamp stuff you're doing with moment.js
I propose the following function for handling the caching of these well known keys.
var keyCache = {};

/**
 * Caches Google's well known public keys
 */
function cacheWellKnownKeys() {
  var wellKnown = 'https://accounts.google.com/.well-known/openid-configuration';

  // get the well known config from google
  request(wellKnown, function(err, res, body) {
    var config = JSON.parse(body);
    var address = config.jwks_uri;

    // get the public json web keys
    request(address, function(err, res, body) {
      keyCache.keys = JSON.parse(body).keys;

      // example cache-control header:
      // public, max-age=24497, must-revalidate, no-transform
      var cacheControl = res.headers['cache-control'];
      var values = cacheControl.split(',');
      var maxAge = parseInt(values[1].split('=')[1]);

      // update the key cache when the max age expires
      setTimeout(cacheWellKnownKeys, maxAge * 1000);
    });
  });
}
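One caveat: the split-based parsing above assumes max-age is always the second comma-separated directive in the Cache-Control header. A slightly more defensive sketch pulls it out with a regex and falls back to a default when the directive is absent (the 300-second fallback here is an arbitrary assumption, not something Google documents):

```javascript
// Extract max-age (in seconds) from a Cache-Control header, wherever the
// directive appears; returns fallbackSeconds if it is missing.
function maxAgeSeconds(cacheControl, fallbackSeconds) {
  var match = /max-age=(\d+)/i.exec(cacheControl || '');
  return match ? parseInt(match[1], 10) : fallbackSeconds;
}

console.log(maxAgeSeconds('public, max-age=19850, must-revalidate', 300)); // 19850
console.log(maxAgeSeconds(undefined, 300));                                // 300
```

Dropping this in place of the values[1].split('=') line keeps the refresh loop working even if Google reorders the directives.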
When I make requests to a server with Postman (an API client), Chrome automatically creates a cookie. However, when I make the request from my Node.js server, the cookie is not created, even though the request is successful.
//Headers
var options = {
method: 'GET'
};
options.headers = {};
options.headers.Authorization = auth;
options.url = urlm;
console.log(options);
request(options, function(error, response, body) {
  res.status(200).send(response.headers);
});
The response header is
{"date":"Tue, 23 Feb 2016 20:06:57 GMT","server":"Jetty(9.2.1.v20140609)","x-csrf-header":"X-CSRF-TOKEN","expires":"Thu, 01 Jan 1970 00:00:00 GMT","x-csrf-token":"xxxxxxxxxxx","cache-control":"no-store","content-type":"audio/mpeg","set-cookie":["JSESSIONID=uiqwnksadbohqjkq675d;Path=/;HttpOnly"],"connection":"close","transfer-encoding":"chunked"}
Pass { jar: true } in your request options.
From the documentation:
jar - If true, remember cookies for future use (or define your custom cookie jar; see examples section)
I'd like to send a response code of 401 if the requesting user is not authenticated, but I'd also like to redirect when the request was an HTML request. I've been finding that Express 4 doesn't allow this:
res.status(401).redirect('/login')
Does anyone know of a way to handle this? This might not be a limitation of Express, since I'm asking to essentially pass two headers, but I don't see why that should be the case. I should be able to pass a "not authenticated" response and redirect the user all in one go.
There are some subtle differences between the methods for sending back a new Location header.
With redirect:
app.get('/foobar', function (req, res) {
  res.redirect(401, '/foo');
});
// Responds with
HTTP/1.1 401 Unauthorized
X-Powered-By: Express
Location: /foo
Vary: Accept
Content-Type: text/plain; charset=utf-8
Content-Length: 33
Date: Tue, 07 Apr 2015 01:25:17 GMT
Connection: keep-alive
Unauthorized. Redirecting to /foo
With status and location:
app.get('/foobar', function (req, res) {
  res.status(401).location('/foo').end();
});
// Responds with
HTTP/1.1 401 Unauthorized
X-Powered-By: Express
Location: /foo
Date: Tue, 07 Apr 2015 01:30:45 GMT
Connection: keep-alive
Transfer-Encoding: chunked
With the original (incorrect) approach using redirect:
app.get('/foobar', function (req, res) {
  res.status(401).redirect('/foo');
});
// Responds with
HTTP/1.1 302 Moved Temporarily
X-Powered-By: Express
Location: /foo
Vary: Accept
Content-Type: text/plain; charset=utf-8
Content-Length: 38
Date: Tue, 07 Apr 2015 01:26:38 GMT
Connection: keep-alive
Moved Temporarily. Redirecting to /foo
So it looks like redirect will abandon any previous status codes and send the default value (unless specified inside the method call). This makes sense due to the use of middleware within Express. If you had some global middleware doing pre-checks on all requests (like checking for the correct accepts headers, etc.) they wouldn't know to redirect a request. However the authentication middleware would and thus it would know to override any previous settings to set them correctly.
UPDATE: As stated in the comments below, even though Express can send a 4XX status code with a Location header, that does not mean it is an acceptable response that a client will understand according to the specs. In fact, most clients will ignore the Location header unless the status code is a 3XX value.
You can certainly send a Location: /login header along with your 401 page; however, this is ill-advised, and most browsers will not follow it, per RFC 2616.
One way to overcome this is to serve <meta http-equiv="refresh" content="0; url=/login"> alongside your 401 page:
res.set('Content-Type', 'text/html');
res.status(401).send('<!DOCTYPE html><html><head><meta http-equiv="refresh" content="0; url=/login"></head></html>');
I ran into the same issue and decided to use the session to handle this kind of job.
I didn't want to have an intermediate view.
With the code below, I can redirect to the homepage, which will be rendered with a 401 Unauthorized code.
app.get('patternForbiddenRoute', (req, res, next) => {
  // previousCode
  if (notForbidden === true) {
    return res.render("a_view");
  }
  req.session.httpCode = 401;
  res.redirect('patternHomeRoute');
});

app.get('patternHomeRoute', (req, res, next) => {
  res.render("my_home_view", {}, (error, html) => {
    // ... handle error code ...
    const httpCode = req.session.httpCode;
    if (httpCode !== undefined) {
      delete req.session.httpCode;
      res.status(httpCode);
    }
    res.send(html);
  });
});