Why does JWT Cookie get removed on page refresh - node.js

Why does the JWT cookie disappear on page refresh? I'm using Fetch to make a request, and it initially sets the JWT cookie, which I can verify in the Chrome Inspector. On page refresh, however, the cookie disappears, and no specific errors are being logged.
Here's the frontend request:
$(form).submit(function (e) {
  e.preventDefault();

  var form_email = $('#email').val();
  var form_password = $('#password').val();
  var formData = {
    email: form_email,
    password: form_password
  };

  const url = $(form).attr('action');
  const options = {
    method: 'POST',
    body: JSON.stringify(formData),
    headers: {
      'Content-Type': 'application/json'
    },
    credentials: 'include'
  };

  fetch(url, options)
    .then(function (response) {
      console.log(response);
    });
});
On the server side, I have already enabled CORS, set credentials to true, and set the origin:
const express = require("express");
const connectDB = require("./config/db");
const cors = require('cors');
const app = express();
connectDB();
app.use(cors({credentials: true, origin: 'https://zkarimi.com'}));
app.use(express.json({ extended: false }));
Here's how I handle the JWT signing and cookie response:
jwt.sign(
  payload,
  config.get("jwtSecret"),
  { expiresIn: 36000 },
  (err, token) => {
    if (err) throw err;
    res.cookie('jwt', token, { httpOnly: true, secure: true, sameSite: 'None', maxAge: 3600000 });
    res.json({ user_email: email });
  }
);
Lastly, here's the response from the initial request:
HTTP/1.1 200 OK
Server: Cowboy
Connection: keep-alive
X-Powered-By: Express
Access-Control-Allow-Origin: https://zkarimi.com
Vary: Origin
Access-Control-Allow-Credentials: true
Set-Cookie: jwt=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJ1c2VyIjp7ImlkIjoiNWZjODA1Zjk2MDRlNjEwMDE3NzQwZWUxIn0sImlhdCI6MTYwNzAyMjg5MSwiZXhwIjoxNjA3MDU4ODkxfQ.oXuA6DfdZbO2XW5HtLR44pVgGUWsXMyaV-l0iVeF7to; Max-Age=3600; Path=/; Expires=Thu, 03 Dec 2020 20:14:51 GMT; HttpOnly; Secure; SameSite=None
Content-Type: application/json; charset=utf-8
Content-Length: 30
Etag: W/"1e-vu/QJUeZrcotU6SR/PRLtu/c6Jw"
Date: Thu, 03 Dec 2020 19:14:51 GMT
Via: 1.1 vegur
After this request ^, refreshing the page removes the cookie from Application > Storage > Cookies, and subsequent authentication requests indicate there's no JWT cookie.
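For reference, a minimal sketch of what the server-side check on those subsequent requests could look like (this middleware and the name auth are my illustration, not code from the post; it assumes cookie-parser is mounted so req.cookies is populated, and reuses the jwtSecret config key from the signing code above):
const jwt = require("jsonwebtoken");
const config = require("config");

// app.use(require("cookie-parser")()); // needed so req.cookies exists

function auth(req, res, next) {
  // If the browser dropped or never re-sent the jwt cookie, this is undefined
  const token = req.cookies && req.cookies.jwt;
  if (!token) {
    return res.status(401).json({ msg: "No JWT cookie, authorization denied" });
  }
  try {
    const decoded = jwt.verify(token, config.get("jwtSecret"));
    req.user = decoded.user;
    next();
  } catch (err) {
    res.status(401).json({ msg: "Invalid token" });
  }
}
If the cookie really is gone from the browser after a refresh, a check like this fails with the 401 path no matter what the server does, which is why the question focuses on why the cookie isn't retained.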

Related

Client gets cookies but doesn't store them in the production environment

In the dev environment, whenever the client gets cookies, it stores them until the expiration date. But when I deploy the server on Heroku and the client on Vercel, it doesn't work. I tried to set
app.set("trust proxy", 1);
or change sameSite from "lax" to "none", but it still doesn't work.
On the server, I create a function to set the cookie:
export const sendRefreshToken = (res: Response, userId: string) => {
  res.cookie(
    process.env.REFRESH_TOKEN_COOKIE_NAME!,
    createToken("refreshToken", userId),
    {
      httpOnly: true,
      secure: true,
      sameSite: __prod__ ? "lax" : "none",
      path: "/refresh_token",
      expires: new Date(Date.now() + 86400 * 1000 * 180), // 180 days
    }
  );
};
On the client, credentials are enabled for the Apollo Client:
const httpLink = new HttpLink({
  uri:
    process.env.NODE_ENV === "production"
      ? "link-web"
      : "http://localhost:4000/graphql",
  credentials: "include",
});

const authMiddleware = new ApolloLink((operation, forward) => {
  const token = JwtManager.getToken();
  operation.setContext(({ headers = {} }) => ({
    headers: {
      ...headers,
      authorization: token ? `Bearer ${token}` : "",
    },
  }));
  return forward(operation);
});

export const client = new ApolloClient({
  cache: new InMemoryCache(),
  link: concat(authMiddleware, httpLink),
});
I can see the cookie on the client whenever the server returns it:
Access-Control-Allow-Credentials: true
Access-Control-Allow-Origin: https://test-deploy-zeta-two.vercel.app
Connection: keep-alive
Server: Cowboy
Set-Cookie: myCookie=myValue; Path=/refresh_token; Expires=Tue, 25 Oct 2022 05:52:49 GMT; HttpOnly; Secure; SameSite=None
X-Powered-By: Express
...
But when I reload the page, this cookie disappears. How can I fix it?
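For what it's worth, here is a minimal sketch of the cookie options for genuinely cross-site delivery (an illustration only, not a confirmed fix for this post; it reuses the cookie name, token helper, path, and expiry from the snippet above). A browser will only store and re-send a cookie across sites when it is marked both SameSite=None and Secure:
// Sketch: cross-site variant of the res.cookie(...) call above
res.cookie(process.env.REFRESH_TOKEN_COOKIE_NAME, createToken("refreshToken", userId), {
  httpOnly: true,
  secure: true,       // required whenever SameSite is "none"
  sameSite: "none",   // cross-site delivery: API on Heroku, client on Vercel
  path: "/refresh_token",
  expires: new Date(Date.now() + 86400 * 1000 * 180), // 180 days
});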

Express session doesn't persist if the client is on a different domain

tl;dr:
A Node (Express) server is hosted on Heroku, and the UI is hosted on Netlify. When the UI makes a REST API call to the server, the session doesn't persist (but it does persist if I run both locally: the server on localhost:5000 and the UI on localhost:3000, with the UI proxying requests via package.json).
Code snippets
session.ts
export const sessionConfig = {
  secret: process.env.SESSION_KEY,
  store: new RedisStore({ client: redisClient }),
  resave: true,
  saveUninitialized: true,
  cookie: {
    secure: process.env.NODE_ENV === 'production',
    sameSite: process.env.NODE_ENV === "production" ? 'none' : 'lax',
  },
};
server.ts
const app = express();
app.use(express.json());
app.use(cookieParser());
app.set('trust proxy', 1);
app.use(session(sessionConfig)); // This sessionConfig comes from the file above
app.use(cors({
  credentials: true,
  origin: process.env.CLIENT_URL,
}));
I googled something like "express session not persisting on cross-domain requests" and saw threads like this and this. It seems that app.set('trust proxy', 1) should make sure that session data persists for cross-domain requests. Apparently, in my case, something is still missing.
Does anyone see what I'm doing wrong? Any advice will be appreciated!
PS:
I'm using sessions for captcha tests, which looks like...
captch.ts
CaptchaRouter.get('/api/captcha', async (req: Request, res: Response) => {
  const captcha = CaptchaService.createCaptcha();
  req.session.captchaText = captcha.text;
  res.send(captcha.data);
});

CaptchaRouter.post('/api/captcha', async (req: Request, res: Response) => {
  if (req.session.captchaText !== req.body.captchaText) {
    throw new BadRequestError('Wrong code was provided');
  }
  // The client sent the correct captcha
});
Another PS:
Here's what the response headers look like:
Access-Control-Allow-Credentials: true
Access-Control-Allow-Origin: https://example.netlify.app
Connection: keep-alive
Content-Length: 46
Content-Type: application/json; charset=utf-8
Date: Sun, 09 Jan 2022 00:00:00 GMT
Etag: W/"2e-cds5jiaerjikllkslaxmalmird"
Server: Cowboy
Set-Cookie: connect.sid=s%3ramdon-string-here; Path=/; Expires=Sun, 09 Jan 2022 00:00:00 GMT; HttpOnly; Secure; SameSite=None
Vary: Origin
Via: 1.1 vegur
X-Powered-By: Express
The cause was that the client (hosted on Netlify) wasn't proxying API requests.
The solution was to:
1. Add a _redirects file under the client's public folder:
/api/* https://server.herokuapp.com/api/:splat 200
/* /index.html 200
2. Make sure that API requests from the client begin with the root-relative path:
return axios({ method: 'POST', url: '/api/example', headers: defaultHeaders });
For future reference, here's my session config
const sessionConfig = {
  secret: process.env.SESSION_KEY || 'This fallback string is necessary for Typescript',
  store: new RedisStore({ client: redisClient }),
  resave: false,
  saveUninitialized: true,
  cookie: {
    secure: process.env.NODE_ENV === 'production', // Prod is supposed to use https
    sameSite: process.env.NODE_ENV === "production" ? 'none' : 'lax', // must be 'none' to enable cross-site delivery
    httpOnly: true,
    maxAge: 1000 * 60
  } as { secure: boolean },
};
...and here's server.ts
const app = express();
const port = process.env.PORT || 5000;
app.use(express.json());
app.set('trust proxy', 1);
app.use(session(sessionConfig));
app.use(cors({
  credentials: true,
  origin: process.env.CLIENT_URL,
}));
(As @Matt Davis pointed out, cookieParser was unnecessary.)
PS
I haven't tried setting cookie.domain in the session config. If it were set to the client URL (provided by Netlify), would the session cookie have persisted?
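For illustration, a sketch of what that untested cookie.domain variant might look like (the 'example.netlify.app' value is a placeholder, and this reuses RedisStore/redisClient from the snippets above). Note that browsers discard a Domain attribute that doesn't cover the host that sent the Set-Cookie header, so a cookie issued by the Heroku API couldn't claim the Netlify client's domain:
// Hypothetical only -- not tested in the post above
const sessionConfig = {
  secret: process.env.SESSION_KEY,
  store: new RedisStore({ client: redisClient }),
  resave: false,
  saveUninitialized: true,
  cookie: {
    secure: true,
    sameSite: 'none',
    httpOnly: true,
    maxAge: 1000 * 60,
    // 'example.netlify.app' is a placeholder; a Domain value that doesn't match
    // the responding server (the Heroku app) is rejected by the browser
    domain: 'example.netlify.app',
  },
};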

Node.js module CSURF question -- how do ANTI-CSRF tokens get computed?

I'm using the node.js module CSURF, which is configured to use cookies via cookie-parser.
For demo purposes, I'm just echoing the ANTI-CSRF token to the screen on a /form GET request. Here's the request and response via VS Code Rest Client plugin:
GET http://localhost:9000/form HTTP/1.1
User-Agent: vscode-restclient
accept-encoding: gzip, deflate
cookie: sid=s%3AYdAxaIHCvv38D6vd3VOi085SOzqkuZpN.eloHBwtgNm4yXQia3FtgR6puNj48kNZVbxlWtBZhSk0; _csrf=xdfFevA7j1qcGRo5BvB7JDQ2
HTTP/1.1 200 OK
Access-Control-Allow-Origin: *
Content-Security-Policy: default-src 'self';base-uri 'self';block-all-mixed-content;font-src 'self' https: data:;frame-ancestors 'self';img-src 'self' data:;object-src 'none';script-src 'self';script-src-attr 'none';style-src 'self' https: 'unsafe-inline';upgrade-insecure-requests
X-DNS-Prefetch-Control: off
Expect-CT: max-age=0
X-Frame-Options: SAMEORIGIN
Strict-Transport-Security: max-age=15552000; includeSubDomains
X-Download-Options: noopen
X-Content-Type-Options: nosniff
X-Permitted-Cross-Domain-Policies: none
Referrer-Policy: no-referrer
X-XSS-Protection: 0
Content-Type: application/json; charset=utf-8
Content-Length: 52
ETag: W/"34-4PDt3TpquKFR5AlQtYw1wqZJRD4"
Date: Wed, 03 Nov 2021 02:47:01 GMT
Connection: close
{
  "csrfToken": "HhEOYbdx-lhbaEmFT_Udx-CyyZFvuXG2u3lI"
}
You can see the _csrf value in the cookie -- xdfFevA7j1qcGRo5BvB7JDQ2
Interestingly, this doesn't match the token output to screen -- HhEOYbdx-lhbaEmFT_Udx-CyyZFvuXG2u3lI
So I presume it's a cryptographic match, or a salt was added to the _csrf value to generate unique ANTI-CSRF tokens every time.
...which is fine, b/c CSURF works when I issue a POST request using HhEOYbdx-lhbaEmFT_Udx-CyyZFvuXG2u3lI.
The question/confusion comes into play when I issue a new GET request to the /form endpoint. The _csrf value (xdfFevA7j1qcGRo5BvB7JDQ2) doesn't change, only the ANTI-CSRF token that was output to the screen.
So it appears the ANTI-CSRF token changes on every request, but the cookie value doesn't. Is this correct behavior? It doesn't seem right, b/c then I'd always be able to reuse any previously issued ANTI-CSRF token to pass the check.
Here's the full code from the CSURF page (https://www.npmjs.com/package/csurf):
var cookieParser = require('cookie-parser')
var csrf = require('csurf')
var bodyParser = require('body-parser')
var express = require('express')
// setup route middlewares
var csrfProtection = csrf({ cookie: true })
var parseForm = bodyParser.urlencoded({ extended: false })
// create express app
var app = express()
// parse cookies
// we need this because "cookie" is true in csrfProtection
app.use(cookieParser())
app.get('/form', csrfProtection, function (req, res) {
  // changed original code to display token to screen instead of render it within a form; this is for dev purposes only
  res.json({ csrfToken: req.csrfToken() })
})

app.post('/process', parseForm, csrfProtection, function (req, res) {
  res.send('data is being processed')
})
Per OWASP (see this URL: https://security.stackexchange.com/questions/209993/csrf-token-unique-per-user-session-why), ANTI-CSRF token pairs should be changed on new sessions. So once I logged out (deleted my cookie), a new ANTI-CSRF token pair was created.
I wonder if I could change this token pair on every request.
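To make the secret-vs-token relationship concrete, here is a simplified sketch of the general scheme csurf-style libraries use (an illustration of the idea, not csurf's exact implementation; the function names createCsrfToken/verifyCsrfToken are mine): the _csrf cookie holds a long-lived secret, and each csrfToken() call produces a fresh random salt plus a hash of that salt and the secret, so many different tokens all verify against the same secret.
const crypto = require('crypto');

// Create a token: new random salt per call, hashed together with the secret
function createCsrfToken(secret) {
  const salt = crypto.randomBytes(8).toString('hex');
  const hash = crypto.createHash('sha256').update(salt + '-' + secret).digest('hex');
  return salt + '-' + hash;
}

// Verify a token: re-derive the hash from the submitted salt and the cookie's secret
function verifyCsrfToken(secret, token) {
  const salt = String(token).split('-')[0];
  const expected = crypto.createHash('sha256').update(salt + '-' + secret).digest('hex');
  return token === salt + '-' + expected;
}

const secret = 'xdfFevA7j1qcGRo5BvB7JDQ2'; // the _csrf cookie value from above
const t1 = createCsrfToken(secret);
const t2 = createCsrfToken(secret);
console.log(t1 !== t2);                                                // a different token every request...
console.log(verifyCsrfToken(secret, t1), verifyCsrfToken(secret, t2)); // ...but both still verify
This matches what the post observes: the cookie stays constant while the rendered token changes on every GET, and any token minted for that secret keeps passing the check until the secret itself is rotated (for example on logout or a new session).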

Unable to set cookie from node server -> app

I'm trying to set an httpOnly cookie from my node.js API (localhost:3001) to work with my React client app (localhost:3000), but everything I've tried so far results in no cookie being set in my browser. Some key factors about my setup:
Backend is node, running fastify, fastify-cookie & cors
// CORS
server.use(
  require('cors')({
    origin: ['https://localhost:3000'],
    optionsSuccessStatus: 200,
    credentials: true
  })
)

// Cookies
server.register(require('fastify-cookie'), {
  secret: process.env.JWT_SECRET
})

// Sending the cookie
reply
  .setCookie('token', token, {
    domain: 'localhost',
    path: '/',
    secure: true,
    sameSite: 'lax',
    httpOnly: true
  })
  .send({ user })
The client is running on https localhost in Chrome, making API calls using fetch.
const fetchUsers = async () => {
  const req = await fetch(`${process.env.USERS_API_BASE}/users`, { credentials: 'include' })
  const res = await req.json()
  console.log(res)
}
Result
No cookie is ever set in my chrome application inspector, but it is sent to the browser from the server and looks correct.
set-cookie: token=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJ1aWQiOjEsImVtYWlsIjoiaGVsbG9Ac2hhbi5kaWdpdGFsIiwiaWF0IjoxNjIwNDI1ODI0LCJleHAiOjE2MjA0Mjk0MjR9.S8eOQMtSBY85wlenuxjIGYNuk3Ec5cKQ87pAhmCvQ9w.nfRxGzq3IMFimC%2FSJeUH9Xl7bH%2FyXVprwK1NBYfur4k; Domain=localhost; Path=/; HttpOnly; Secure; SameSite=Lax
request.cookies on the server always returns a blank object {}. Any suggestions?
What you are facing is a CORS error, or at least it is categorized as one: the server seems to think you're making a cross-domain request.
If you log the response headers, this is typically what you would see:
HTTP/1.1 200 OK
Date: Sun, 20 May 2018 20:43:05 GMT
Server: Apache
Set-Cookie: name=value; expires=Sun, 20-May-2018 21:43:05 GMT; Max-Age=3600; path=/; domain=.localHost
Cache-Control: no-cache, private
Access-Control-Allow-Origin: http://localHost:8080
Vary: Origin
X-RateLimit-Limit: 60
X-RateLimit-Remaining: 59
Content-Length: 2
Keep-Alive: timeout=10, max=100
Connection: Keep-Alive
Content-Type: text/html; charset=UTF-8
but when you make the request, you send it like this:
const res = await axios({ method: 'POST', url: 'http://127.0.0.1:3000/api/v1/users/login', data: { email, password } });
Do you see the problem? http://127.0.0.1:3000 is not the same host as the http://localHost:8080 origin allowed in the headers above, and that mismatch is the source of your problem.
In short: check that the Access-Control-Allow-Origin value on the response and the domain your request actually targets match; if the domain names don't match, the cookie won't be set in the browser.
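As a concrete sketch of keeping both sides consistent (illustration only; the login function, port 3001, and the /api/v1/users/login path are assembled from snippets in this thread, not a verified setup): the origin in the CORS config has to match the page's actual origin, the cookie's Domain has to match the host the request actually targets, and the request has to opt into credentials.
// Server: allow the exact origin the page is served from, with credentials
server.use(
  require('cors')({
    origin: ['https://localhost:3000'], // must match the browser's Origin header exactly
    credentials: true
  })
)

// Client (a page on https://localhost:3000): use 'localhost' consistently, not
// 127.0.0.1, so the Domain=localhost cookie is accepted, and send credentials
const login = async (email, password) => {
  const res = await fetch('https://localhost:3001/api/v1/users/login', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    credentials: 'include', // lets the browser store and re-send the Set-Cookie value
    body: JSON.stringify({ email, password })
  })
  return res.json()
}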
Here is a GitHub Issue Link for this same problem

Node.js: how to disable chunked transfer-encoding?

I'm missing a Content-Length header on the response from a Node server through which I'm piping a .zip file from another location. I've injected a Content-Length header via the code below, but it seems the transfer-encoding: chunked is still overriding it somehow.
Response Headers
HTTP/1.1 200 OK
access-control-allow-origin: *
connection: close
content-type: application/zip
date: Mon, 14 Jul 2014 03:47:00 GMT
etag: "\"eb939974703e14ee9f578642972ed984\""
last-modified: Sat, 12 Jul 2014 02:15:52 GMT
server: Apache-Coyote/1.1
set-cookie: rememberMe=deleteMe; Path=/; Max-Age=0; Expires=Sun, 13-Jul-2014 03:47:00 GMT
transfer-encoding: chunked
X-Powered-By: Express
Code
var request = require('request');
var express = require('express');
var async = require('async');

var app = express();

app.get('/:bundle_id?', function(req, res) {
  var bundle_id = req.params.bundle_id;
  bundle_id = bundle_id.replace(/\.zip$/, '');
  var url = "https://url....../bundles/" + bundle_id;

  async.waterfall([
    function(callback) {
      request.get(url, function(req, res, data) {
        callback(null, JSON.parse(data).entities[0]['file-metadata']['content-length']);
      });
    }
  ], function(err, contentLength) {
    request.get({
      url: url,
      headers: {
        "Accept": "application/zip"
      }
    }).pipe(res);

    res.oldWriteHead = res.writeHead;
    res.writeHead = function(statusCode, reasonPhrase, headers) {
      res.header('Content-Length', contentLength);
      res.oldWriteHead(statusCode, reasonPhrase, headers);
    }
  });
});

app.listen(9000);
Turns out this was actually a rather simple fix: setting the transfer-encoding header to an empty string in the response solved the problem:
...
res.oldWriteHead = res.writeHead;
res.writeHead = function(statusCode, reasonPhrase, headers) {
  res.header('Content-Length', contentLength);
  res.header('transfer-encoding', ''); // <-- add this line
  res.oldWriteHead(statusCode, reasonPhrase, headers);
}
...
The reason this works: after doing some digging, it appears the Transfer-Encoding header replaces Content-Length (since the two can't coexist), and it just so happens that the clients I was testing with were choosing chunked transfer encoding over a content length.
If you define a Content-Length, Transfer-Encoding will no longer be set to "chunked".
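For completeness, a minimal standalone sketch of that rule (my illustration, not the original proxy code; the port 9001 is arbitrary): Node's HTTP server uses chunked transfer encoding whenever it doesn't know the body length, and sends a plain Content-Length with no chunking when that header is set before the body is written.
const http = require('http');

http.createServer(function (req, res) {
  const body = Buffer.from('hello world');
  res.setHeader('Content-Type', 'text/plain');
  // With this header set, the response carries "Content-Length: 11" and no
  // Transfer-Encoding; comment it out and curl -v shows "Transfer-Encoding: chunked"
  res.setHeader('Content-Length', body.length);
  res.write(body.subarray(0, 5));
  res.end(body.subarray(5));
}).listen(9001);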
