Restricting CORS origin with Node/Express not working - node.js

I'm trying to restrict the origin of CORS requests to one specific domain per route using the express.js CORS package like so:
const express = require('express');
const cors = require('cors');
const port = process.env.PORT || 3000;
let app = express();
app.get('/', cors({origin: 'http://example.com'}), (req, res, next) => {
res.sendStatus(200);
});
app.post('/', cors({origin: 'http://whatever.com'}), (req, res, next) => {
res.sendStatus(200);
});
app.listen(port, () => {
console.log(`Started on port ${port}`);
});
This doesn't seem to have any effect, however, as I'm able to GET and POST from any domain. I then tried instead to restrict all routes to a single origin using the following, but got the same result:
app.use(cors({origin: 'http://example.com'}));
I'm experiencing this both in my dev environment on localhost and my production environment on Heroku. Any idea what I'm missing?

If your server is sending an Access-Control-Allow-Origin: http://example.com response header, then you actually already have it configured correctly.
It’s expected behavior for the server to return a 200 response no matter what origin you make the request from, even for requests from an origin other than the configured http://example.com.
The CORS settings don’t cause the server to block requests from any clients.
Instead, if the server responds with Access-Control-Allow-Origin: http://example.com to a client request from JavaScript code in a web app that’s not running at http://example.com, then the browser blocks that JavaScript code from being able to access the response.
https://developer.mozilla.org/en-US/docs/Web/HTTP/Access_control_CORS gives more details.
Basically the way it works is that from the server side, no behavior changes other than the difference in what response headers it sends. So the server will receive the request just as it otherwise would, and will send the response just as it otherwise would.
And then the browser will receive the response just as it otherwise would. You will be able to see the response in your browser devtools and examine it there. But that does not mean the browser will expose the response to your client-side JavaScript code.
Instead, the browser checks the value of the Access-Control-Allow-Origin response header against your actual origin. Only if it matches exactly, or if the value is * (allow any origin), does the browser allow your client-side JavaScript code to access the response.
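To make that concrete, here is a minimal sketch (the port and route are just placeholders) showing where the enforcement actually happens: the cors middleware only adds the Access-Control-Allow-Origin header, the handler still runs and still returns 200 for every caller, and it is the browser that refuses to hand the response to scripts from other origins.

const express = require('express');
const cors = require('cors');

const app = express();

// Adds "Access-Control-Allow-Origin: http://example.com" to the response.
// It does not reject requests from other origins; the handler below still
// runs and still returns 200 for everyone.
app.get('/', cors({ origin: 'http://example.com' }), (req, res) => {
  res.sendStatus(200);
});

app.listen(3000);

// In a page served from some other origin, this request still reaches the
// server, but the browser refuses to expose the response to that page's
// JavaScript:
//
//   fetch('http://localhost:3000/')
//     .then(res => console.log('allowed:', res.status))
//     .catch(err => console.log('blocked by the browser:', err));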

Related

Node.js Express Websocket clients coming from http origin?

I have a webapp that communicates with a Node.js Express server using websockets.
When verifying the websocket connection, I check the ORIGIN header of the request (and a few other parameters, to ensure they are legitimate).
The expected request is either "https://www.mywebsite.com" or "https://mywebsite.com"
If the ORIGIN header is not expected, we will kick the user.
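(A rough sketch of that check, assuming the ws package and its verifyClient option, with the hostnames above as placeholders:)

const WebSocket = require('ws');

const allowedOrigins = ['https://www.mywebsite.com', 'https://mywebsite.com'];

const wss = new WebSocket.Server({
  server,                                 // the HTTP server Express is listening on
  verifyClient: (info, done) => {
    // Kick the connection if the Origin header is not one of the expected values.
    done(allowedOrigins.includes(info.origin));
  },
});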
Then I noticed that some people were being kicked even though their socket connection looked alright, but the ORIGIN was "http://mywebsite.com". We quickly checked and realised the website could also be visited over plain HTTP. We added a piece of redirect code like this:
const server = express()
.enable('trust proxy')
.use((req, res, next) => {
req.secure ? next() : res.redirect('https://' + req.headers.host + req.url)
})
And now, theoretically, whoever visits the http version of the website should be redirected to https.
But even with this redirection in place, we still notice people being kicked because their origin is http instead of https. Why is this so? Is there any chance that some users can never use https?
This is the correct way to redirect to https on Heroku:
Under the hood, Heroku router (over)writes the X-Forwarded-Proto and the X-Forwarded-Port request headers. The app must check X-Forwarded-Proto and respond with a redirect response when it is not https but http.
Taken from: https://help.heroku.com/J2R1S4T8/can-heroku-force-an-application-to-use-ssl-tls
This is some sample code you can use:
app.use((req, res, next) => {
if (req.header('x-forwarded-proto') !== 'https') {
res.redirect(`https://${req.header('host')}${req.url}`)
} else {
next()
}
})
The reason your code doesn't work is that Heroku does SSL termination for you and serves the certificates. This means the connection between the Heroku router and your Node.js server is plain HTTP, so req.secure returns false:
https://devcenter.heroku.com/articles/http-routing#routing
Correction: because you set trust proxy, req.protocol will be set to https and req.secure will return true, so your code will work.
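If you prefer to keep the req.secure check, a minimal sketch (assuming the app sits behind Heroku's router) would be:

const express = require('express');

const app = express();

// Trust Heroku's router so req.protocol / req.secure are derived from
// the X-Forwarded-Proto header it sets.
app.enable('trust proxy');

app.use((req, res, next) => {
  if (req.secure) {
    next();                               // already HTTPS
  } else {
    res.redirect('https://' + req.headers.host + req.url);
  }
});

app.listen(process.env.PORT || 3000);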

How to prevent ERR_BLOCKED_BY_RESPONSE.NotSameOriginAfterDefaultedToSameOriginByCoep?

I am attempting to access my movie API, which returns data including an image of a movie poster, through a React application. The image is requested from an external website. Each time I make a request to my /movies endpoint, the image is blocked and I get the following message in the console:
net::ERR_BLOCKED_BY_RESPONSE.NotSameOriginAfterDefaultedToSameOriginByCoep 200
When looking at the request in the Network tab, I get the following message saying to enable a Cross-Origin Resource Policy
Because your site has the Cross-Origin Embedder Policy (COEP) enabled, each resource must specify a suitable Cross-Origin Resource Policy (CORP). This behavior prevents a document from loading cross-origin resources which don’t explicitly grant permission to be loaded.
To solve this, add the following to the resource’s response header:
Cross-Origin-Resource-Policy: same-site if the resource and your site are served from the same site.
Cross-Origin-Resource-Policy: cross-origin if the resource is served from another location than your website. ⚠️If you set this header, any website can embed this resource.
I am using the CORS npm module which had previously been used to solve my issue with an Access-Control-Allow-Origin error. I added some additional middleware to try and add the header as instructed. This is the app.js server with that code
App.js
'use strict';
import express, { json, urlencoded } from 'express';
import morgan from 'morgan';
import mongoose from 'mongoose';
import passport from 'passport';
import cors from 'cors';
import dotenv from 'dotenv';
import auth from './routes/auth.js';
import routes from './routes/routes.js';
dotenv.config();
const app = express();
mongoose
.connect(process.env.CONNECTION_URL, {
useNewUrlParser: true,
useUnifiedTopology: true,
})
.then(res => console.log('DB Connected!'))
.catch(err => console.log(err, err.message));
app.use(cors())
app.use((req, res, next) => {
res.header("Cross-Origin-Resource-Policy", "cross-origin")
next()
})
app.use(passport.initialize());
app.use(json());
app.use(urlencoded({ extended: true }));
app.use(express.static(`public`));
app.use(morgan('common'));
auth(app);
import './authentication/passport.js';
routes(app)
// Error handler: Express recognises it by the (err, req, res, next) signature
app.use((err, req, res, next) => {
console.error(err.stack);
res.status(500).send('Something broke!');
});
const port = process.env.PORT || 3000;
app.listen(port, '0.0.0.0', () => console.log(`Listening on Port ${port}`));
After doing this, the console throws the same error and the Cross-Origin Resource Policy still is not set. Is there something wrong with my approach or the way that I have my file structured?
You have COEP enabled in the client:
Cross-Origin-Embedder-Policy: require-corp
This is a great security feature that means:
COEP: Everything (data, images, etc.) on this website is either mine, or I fetch it from other websites using CORS. (There is a third way, where data is authorized by cookies, HTTP auth, etc., which is not part of this discussion, so don't bother with it here.)
So, you have two options. The first one is to disable COEP, but I assume that you don't want to do that. So, the other option is to use CORS for everything external. For example, when you fetch something, use:
fetch('https://externalwebsite.com/image.jpg',{mode:'cors'})
or, to embed an external image in the HTML, use crossorigin
<img crossorigin="anonymous" src="https://externalwebsite.com/image.jpg">
Note that the crossorigin attribute in <img> means CORS. If it is missing, it means "no-cors", which is the default. Be aware though: when you use JavaScript's fetch, the default is {mode:'cors'}, i.e. the opposite!
Now, if you try to do that (use CORS, as you should), the browser will throw another error:
Access [...] has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource.
That means... exactly that! That the external server has to send the header:
Access-Control-Allow-Origin: *
That setting means that every website can use the server's resources (API in your case), as long as it does not use/send/receive cookies in the request (because... security). The way to implement this in your express server is to set:
res.header('Access-Control-Allow-Origin', '*');
Every server that intends to serve things to other websites, must have this ACAO header. (You can place your other website instead of "*" if you want only that website to access your API.)
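With the cors middleware, both variants look roughly like this (the front-end URL below is just a placeholder):

const express = require('express');
const cors = require('cors');

const app = express();

// Allow every origin (sends Access-Control-Allow-Origin: *):
app.use(cors());

// ...or allow only one specific website to use the API (placeholder URL):
// app.use(cors({ origin: 'https://my-frontend.example.com' }));

app.listen(3000);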
Note/Summary:
If the external server has this ACAO header, you can fetch things using CORS/crossorigin. If it does not have an ACAO header, you can fetch things with no-cors / without crossorigin. But with COEP enabled on your website, you can only fetch with CORS/crossorigin, so the external server has to send an ACAO header.
Now,
As for the Cross-Origin-Resource-Policy header that your server sends, keep in mind that (https://developer.mozilla.org/en-US/docs/Web/HTTP/Cross-Origin_Resource_Policy_(CORP)):
The policy is only effective for no-cors requests
During a cross-origin resource policy check, if the header is set, the browser will deny no-cors requests issued from a different origin/site.
This means that, since you make only CORS requests to that server, this header doesn't do anything (in your case). So the server can set it to "same-site"/"same-origin" for security reasons that are beyond this topic.

How to fix 'Content Security Policy: The page’s settings blocked the loading of a resource at http://localhost:8080/favicon.ico (“default-src”).'

I was able to connect to my server and use a GET request to display some text. However, when I restarted my server after taking a break, I was hit with this error:
Content Security Policy: The page’s settings blocked the loading of a resource at http://localhost:8080/favicon.ico (“default-src”).
If anyone can please point me in the right direction, that would be great.
If it helps, I am using the latest version of Firefox.
Edit: I changed the port. I can now see my GET request; however, I'm still getting the error.
const express = require('express');
const cors = require('cors');
const app = express();
app.use(cors());
const port = 3001;
// was 8080
app.listen(port, () => {
console.log(`Server is up and listening on port ${port}`);
})
app.get('/', (req, res) => {
res.send({
express: 'Your express backend is connected to react'
})
})
There is a bug in Firefox when the JSON viewer is enabled, starting at least with version 86 and continuing on through at least version 89, wherein the Content-Type: application/json header enacts a much stricter CSP than other content types. To solve this, either:
Change your content type header (in Express, send a string instead of a JavaScript object)
or, disable devtools.jsonview.enabled in about:config
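For the first option, a minimal sketch (reusing the route from the question) is to send a pre-serialized string, so Express picks a text/html content type instead of application/json:

app.get('/', (req, res) => {
  // res.send() with an object would produce Content-Type: application/json
  // and trigger the Firefox JSON viewer; a plain string is sent as text/html.
  res.send(JSON.stringify({
    express: 'Your express backend is connected to react'
  }));
});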
Maybe it has something to do with running HTTP on an HTTPS port. Try using another port or converting it to HTTPS.

Adding cors to MEAN Singlepoint Application, for Swagger-ui

I'm running a Node.js server which serves the UI from the built Angular /dist folder.
In the UI I include swagger-ui, which loads a swagger.json; the *.json describes a REST interface, and within swagger-ui you should be able to test REST interfaces.
https://swagger.io/swagger-ui/
Project structure
In server.js I added a fixed route to the /dist path where the index.html is stored. There are also Express routes to the REST interfaces my server offers, used to load the swagger.json documentation files.
server.js
// Get dependencies
const dotEnv = require('dotenv').config();
const express = require('express');
const path = require('path');
const http = require('http');
const bodyParser = require('body-parser');
var mongoose = require('mongoose');
const cors = require('cors');
// Get our API routes
const api = require('./server/routes/api');
const swaggerAPI = require('./server/routes/swaggerAPI');
const app = express();
var connectionUrl = typeof process.env.CONNECTION_URL !== 'undefined' ? process.env.CONNECTION_URL : 'mongodb://db:27017/docdb';
console.log("Connection URL: " + connectionUrl);
mongoose.connect(connectionUrl);
// Parsers for POST data
app.use(bodyParser.json());
app.use(bodyParser.urlencoded({ extended: false }));
// Point static path to dist
app.use(express.static(path.join(__dirname, 'dist')));
// Set our api routes
app.use('/api', api);
app.use('/swagger-api', swaggerAPI);
app.use('/client', express.static('client'));
app.use(cors());
app.use(function (req, res, next) {
res.header("Access-Control-Allow-Origin", "*");
res.header("Access-Control-Allow-Headers", "Cache-Control, Pragma, Origin, Authorization, Content-Type, X-Requested-With");
res.header("Access-Control-Allow-Methods", "GET, PUT, POST");
if ('OPTIONS' === req.method) {
res.status(204).send();
}
else {
next();
}
});
// Catch all other routes and return the index file
app.get('*', (req, res) => {
res.sendFile(path.join(__dirname, 'dist/index.html'));
});
/**
* Get port from environment and store in Express.
*/
const port = process.env.PORT || '3000';
app.set('port', port);
/**
* Create HTTP server.
*/
const server = http.createServer(app);
/**
* Listen on provided port, on all network interfaces.
*/
app.listen(port, () => console.log(`API running on localhost:${port}`));
Everything works fine: I can load the swagger.json with my REST interfaces, persist them in MongoDB, and show them in the Angular UI.
But when I want to test REST interfaces from the swagger-ui, I get an error in the console:
Failed to load http://localhost:8888/****/Y7CTQW5PTSEG1MMPN: No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'http://localhost:3000' is therefore not allowed access. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.
But when I debug the request in Chrome, I can see the loaded data in the Network tab.
Why is the console showing the CORS error while the data is actually loaded, and why is it not shown in the UI?
When Site A tries to fetch content from Site B, Site B can send an Access-Control-Allow-Origin response header to tell the browser that the content of this page is accessible to certain origins. (An origin is a domain, plus a scheme and port number.) By default, Site B's pages are not accessible to any other origin; using the Access-Control-Allow-Origin header opens a door for cross-origin access by specific requesting origins.
For each resource/page that Site B wants to make accessible to Site A, Site B should serve its pages with the response header:
Access-Control-Allow-Origin: http://siteA.com
Modern browsers will not block cross-domain requests outright. If Site A requests a page from Site B, the browser will actually fetch the requested page on the network level and check if the response headers list Site A as a permitted requester domain. If Site B has not indicated that Site A is allowed to access this page, the browser will trigger the XMLHttpRequest's error event and deny the response data to the requesting JavaScript code.
Supposing that Site A wants to send a PUT request for /somePage, with a non-simple Content-Type value of application/json, the browser would first send a preflight request:
OPTIONS /somePage HTTP/1.1
Origin: http://siteA.com
Access-Control-Request-Method: PUT
Access-Control-Request-Headers: Content-Type
Note that Access-Control-Request-Method and Access-Control-Request-Headers are added by the browser automatically; you do not need to add them. This OPTIONS preflight gets the successful response headers:
Access-Control-Allow-Origin: http://siteA.com
Access-Control-Allow-Methods: GET, POST, PUT
Access-Control-Allow-Headers: Content-Type
When sending the actual request (after preflight is done), the behavior is identical to how a simple request is handled. In other words, a non-simple request whose preflight is successful is treated the same as a simple request (i.e., the server must still send Access-Control-Allow-Origin again for the actual response).
The browser sends the actual request:
PUT /somePage HTTP/1.1
Origin: http://siteA.com
Content-Type: application/json
{ "myRequestContent": "JSON is so great" }
And the server sends back an Access-Control-Allow-Origin, just as it would for a simple request:
Access-Control-Allow-Origin: http://siteA.com
See Understanding XMLHttpRequest over CORS for a little more information about non-simple requests.
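On the Express side, the whole exchange above can be handled by the cors package; the following is just a rough sketch under those assumptions (the origin, method list, and route mirror the example, nothing more):

const express = require('express');
const cors = require('cors');

const app = express();

const corsOptions = {
  origin: 'http://siteA.com',            // the single origin allowed to call this API
  methods: ['GET', 'POST', 'PUT'],
  allowedHeaders: ['Content-Type'],
};

// Answers the OPTIONS preflight automatically and adds the
// Access-Control-Allow-* headers to the actual PUT response as well.
app.use(cors(corsOptions));

app.put('/somePage', express.json(), (req, res) => {
  res.json({ myResponseContent: 'ok' });
});

app.listen(8888);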
Please check these links as well to solve and fix your problem:
Cors content tutorial
Using cors
fixing cors

Proxy HTTPS with HTTP in Node.js express

I wrote an Express app as an HTTP proxy, to intercept and analyse some of the network traffic. The parts of the traffic my app is interested in are all HTTP; however, I still want my app to proxy HTTPS so users can use it without extra settings.
My Express app is created with an HTTP server. When testing, I changed the proxy settings in Chrome with SwitchyOmega to proxy HTTPS connections over HTTP. HTTP works well, but my Express app couldn't get these proxy requests for HTTPS.
So I wrote a simple TCP proxy to check on them, and found that they look like this:
CONNECT HOSTNAME:443 HTTP/1.1
Host: HOSTNAME
Proxy-Connection: keep-alive
User-Agent: MY_AGENT
ENCRYPTED HTTPS
I believe these requests are HTTP, so why isn't Express receiving them?
If I change the browser proxy settings to ignore HTTPS, the app works well. But I do want to know if there is any workaround I can use to proxy all protocols over HTTP on a single port.
THX.
UPDATE- code from my express app
app.use('*', function (req, res, next) {
// print every request the app receives, then hand it on to later middleware
console.log('received:', req.url)
next()
})
app.use(bodyParser.text({type: '*/*'}))
app.use(cookieParser())
app.use(logger('dev'))
app.use(express.static(path.join(__dirname, 'public')))
// serve web pages for my app; only requests targeting my server
// (right IP and port) are handled here, proxy requests get handled after this.
app.use('/', internalRoute)
// analyse the part I want
app.use('/END_POINT_I_WANT', myRoute)
// handle proxy requests
app.use('*', function (req, res, next) {
// proxy the request here
})
The problem is that my first middleware, which displays all the requests the app receives, never sees the HTTPS proxy requests wrapped in HTTP described above. And of course the middleware I use as a proxy can't catch them either.
UPDATE - tried node-http-proxy, no luck
var httpProxy = require('http-proxy')
, http = require('http')
, fs = require('fs')
var options = {target: 'http://127.0.0.1:8099'}
, proxy = httpProxy.createServer(options)
http.createServer(function (req, res) {
console.log(req.url)
proxy.web(req, res)
}).listen(5050)
With the above code, and the browser set to proxy all protocols over HTTP, it behaves the same as my Express app: HTTPS proxy requests get ERR_EMPTY_RESPONSE, and nothing shows on the console.
With the below options, it seems that I have to change the proxy protocol to HTTPS, which I'd rather not use, at least for now. And I get ERR_PROXY_CERTIFICATE_INVALID for my self-signed certs...
var options = { secure: true
, target: 'http://127.0.0.1:8099'
, ssl: { key: fs.readFileSync('cert/key.pem', 'utf8')
, cert: fs.readFileSync('cert/server.crt', 'utf8')
}
}
UPDATE - pinpointed the problem: the missing 'connect' event listener
Through some searching, I found this post helpful.
It pointed out that the HTTP server doesn't have a listener for the connect event. I tried the code in the post, and it works. But as the last comment of that post mentioned, my app serves as a proxy in order to inspect the data, and it then proxies the request to another proxy in order to get over the Great Firewall.
The process is like : BROWSER -> MY_APP -> ANOTHER_PROXY -> TARGET.
Without ANOTHER_PROXY, which is an HTTP proxy, it works well for both HTTP and HTTPS. However, I failed to chain them all up. The ANOTHER_PROXY I use supports HTTPS over HTTP.
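(For reference, the approach from that post looks roughly like the following sketch: an explicit listener for the CONNECT event on the HTTP server that tunnels the encrypted bytes straight to the target. It does not include the chaining to ANOTHER_PROXY, which is the part that is still unsolved.)

const http = require('http');
const net = require('net');

const server = http.createServer(app);   // app is the Express app from above

// Browsers send "CONNECT host:443" when asked to proxy HTTPS over an HTTP
// proxy. Express never sees these, so they must be handled on the server.
server.on('connect', (req, clientSocket, head) => {
  const [host, port] = req.url.split(':');
  const targetSocket = net.connect(Number(port) || 443, host, () => {
    clientSocket.write('HTTP/1.1 200 Connection Established\r\n\r\n');
    targetSocket.write(head);
    // From here on it is an opaque, encrypted byte stream in both directions.
    targetSocket.pipe(clientSocket);
    clientSocket.pipe(targetSocket);
  });
  targetSocket.on('error', () => clientSocket.end());
});

server.listen(5050);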
It's hard to see what might be wrong, since you haven't posted any code.
However, if you just want to create a simple proxy that supports HTTP and HTTPS, I think you should consider using a module like node-http-proxy.
Their readme has example code for the most common scenarios, and it sounds like it will support your needs fine.
