Sending COOP and COEP headers in a React/Node app - node.js

I have a deployed app that is built on Express (Node) and uses a built React frontend for app rendering.
Currently, I'm facing SharedArrayBuffer issues and need to pass these headers to make my code work:
To opt in to a cross-origin isolated state, you need to send the following HTTP headers on the main document:
Cross-Origin-Embedder-Policy: require-corp
Cross-Origin-Opener-Policy: same-origin
Where and how would I go about doing this? New to full stack development haha.

I'm currently working on a project where I also encountered the SharedArrayBuffer issue when integrating ffmpeg-wasm.
The first snippet, from "cross origin resource policy issue when playing files from s3 on deployed app", solved my problem. It goes in your server.js (note that the order matters: set the headers before express.static):
app.use((req, res, next) => {
  res.header("Cross-Origin-Embedder-Policy", "require-corp");
  res.header("Cross-Origin-Opener-Policy", "same-origin");
  next();
});
app.use(express.static(...));
Alternatively, pass a setHeaders callback to express.static (note that the options object is the second argument of express.static, not of app.use):
const customHeaders = (res, path, stat) => {
  res.append("Cross-Origin-Embedder-Policy", "require-corp");
  res.append("Cross-Origin-Opener-Policy", "same-origin");
};
app.use(express.static(..., {
  setHeaders: customHeaders
}));
Note that there are some issues with this approach:
SharedArrayBuffer is not supported in Safari and some other browsers, see https://caniuse.com/sharedarraybuffer (a feature-detection sketch follows after this list)
All cross-origin loads from the browser will fail, for example external scripts (e.g. Analytics) or image loads from other domains
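Because support varies, it can help to feature-detect on the client before loading ffmpeg-wasm. A minimal sketch (window.crossOriginIsolated is a standard browser property; the fallback behaviour is just an illustration):
// Only try to initialise ffmpeg-wasm when the page is cross-origin isolated
// and SharedArrayBuffer is actually available in this browser.
if (typeof SharedArrayBuffer === 'undefined' || !window.crossOriginIsolated) {
  console.warn('SharedArrayBuffer unavailable; falling back to a non-wasm code path.');
} else {
  // safe to load and run ffmpeg-wasm here
}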

Related

How to prevent ERR_BLOCKED_BY_RESPONSE.NotSameOriginAfterDefaultedToSameOriginByCoep?

I am attempting to access my movie API, which returns data including an image of a movie poster, through a React application. The image is requested from an external website. Each time I make a request to my /movies endpoint, the image is blocked and I get the following message in the console:
net::ERR_BLOCKED_BY_RESPONSE.NotSameOriginAfterDefaultedToSameOriginByCoep 200
When looking at the request in the Network tab, I see the following message saying to set a Cross-Origin Resource Policy:
Because your site has the Cross-Origin Embedder Policy (COEP) enabled, each resource must specify a suitable Cross-Origin Resource Policy (CORP). This behavior prevents a document from loading cross-origin resources which don’t explicitly grant permission to be loaded.
To solve this, add the following to the resource’s response header:
Cross-Origin-Resource-Policy: same-site if the resource and your site are served from the same site.
Cross-Origin-Resource-Policy: cross-origin if the resource is served from another location than your website. ⚠️If you set this header, any website can embed this resource.
I am using the CORS npm module which had previously been used to solve my issue with an Access-Control-Allow-Origin error. I added some additional middleware to try and add the header as instructed. This is the app.js server with that code
App.js
'use strict';
import express, { json, urlencoded } from 'express';
import morgan from 'morgan';
import mongoose from 'mongoose';
import passport from 'passport';
import cors from 'cors';
import dotenv from 'dotenv';
import auth from './routes/auth.js';
import routes from './routes/routes.js';
import './authentication/passport.js';
dotenv.config();

const app = express();

mongoose
  .connect(process.env.CONNECTION_URL, {
    useNewUrlParser: true,
    useUnifiedTopology: true,
  })
  .then(res => console.log('DB Connected!'))
  .catch(err => console.log(err, err.message));

app.use(cors());
app.use((req, res, next) => {
  res.header("Cross-Origin-Resource-Policy", "cross-origin");
  next();
});
app.use(passport.initialize());
app.use(json());
app.use(urlencoded({ extended: true }));
app.use(express.static('public'));
app.use(morgan('common'));

auth(app);
routes(app);

// Error handlers in Express must use the (err, req, res, next) signature,
// otherwise they are treated as regular middleware and never receive errors.
app.use((err, req, res, next) => {
  console.error(err.stack);
  res.status(500).send('Something broke!');
});

const port = process.env.PORT || 3000;
app.listen(port, '0.0.0.0', () => console.log(`Listening on Port ${port}`));
After doing this, the console throws the same error and the Cross-Origin Resource Policy still is not set. Is there something wrong with my approach or the way that I have my file structured?
You have COEP enabled in the client:
Cross-Origin-Embedder-Policy: require-corp
This is a great security feature that means:
COEP: everything (data, images, etc.) on this website is either mine, or I fetch it from other websites using CORS. (There is a third way, resources authorized by cookies, HTTP auth, etc., which is outside this discussion, so don't worry about it here.)
So, you have two options. The first one is to disable COEP, but I assume that you don't want to do that. So, the other option is to use CORS for everything external. For example, when you fetch something, use:
fetch('https://externalwebsite.com/image.jpg',{mode:'cors'})
or, to embed an external image in the HTML, use crossorigin
<img crossorigin="anonymous" src="https://externalwebsite.com/image.jpg">
Note that the crossorigin attribute on <img> means CORS; if it is missing, the request is made with "no-cors", which is the default for embeds. Be aware though: with JavaScript's fetch, the default is {mode: 'cors'}, i.e. the opposite!
Now, if you try to do that (use CORS, as you should), the browser will throw another error:
Access [...] has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource.
That means... exactly that! That the external server has to send the header:
Access-Control-Allow-Origin: *
That setting means that every website can use the server's resources (API in your case), as long as it does not use/send/receive cookies in the request (because... security). The way to implement this in your express server is to set:
res.header('Access-Control-Allow-Origin', '*');
Every server that intends to serve things to other websites must send this ACAO header. (You can put your other website's origin instead of "*" if you want only that website to access your API.)
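In Express this can be a tiny middleware on the external server; a minimal sketch (the cors package you already use does essentially the same thing when called as cors()):
// Allow any origin to read responses from this API (no cookies/credentials involved)
app.use((req, res, next) => {
  res.header('Access-Control-Allow-Origin', '*');
  next();
});
To allow only one site instead of every origin, cors({ origin: 'https://my-frontend.example.com' }) works too; the URL here is just a placeholder.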
Note/Summary:
If the external server sends this ACAO header, you can fetch things using CORS/crossorigin. If it does not send the ACAO header, you can fetch things with no-cors / without crossorigin. But with COEP enabled on your website, you can only fetch with CORS/crossorigin, so the external server has to send an ACAO header.
Now, as for the Cross-Origin-Resource-Policy header that your server sets, keep in mind that (https://developer.mozilla.org/en-US/docs/Web/HTTP/Cross-Origin_Resource_Policy_(CORP)):
The policy is only effective for no-cors requests
During a cross-origin resource policy check, if the header is set, the browser will deny no-cors requests issued from a different origin/site.
This means that, since you make only CORS requests to that server, this header doesn't do anything (in your case). So the server can set it to "same-site"/"same-origin" for security reasons that are beyond this topic.
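As an aside, if you do want files that your own Express server serves (e.g. from public/) to be embeddable by other COEP-enabled sites, a sketch is to set CORP only on the static route rather than globally (setHeaders is a documented express.static option):
app.use(express.static('public', {
  setHeaders: (res) => {
    // Let other origins embed these files even when they run with COEP: require-corp
    res.set('Cross-Origin-Resource-Policy', 'cross-origin');
  }
}));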

"Cross Origin Request Blocked" No solutions seem to work

Background
I'm building a MERN full stack application as a personal project. I am running the frontend client on localhost:3000 and the server on localhost:5000.
Problem
All of my API routes work as expected except for one GET request, router.get('/get-friends', ...), which queries MongoDB to return a list of collection documents. Calling that GET request in Postman returns the expected output, and a simple GET request that just returns a response works fine in my browser.
When making the get-friends request in my browser, I get the following log:
Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at http://localhost:5000/api/users/get-friends/. (Reason: CORS request did not succeed)
What I've Already Tried
Enabling cors in my Express server
Enabling cors preflight
Adding a proxy to the server from the client's package.json
Switching from Axios to vanilla JS's fetch() method
Turning off cors in my browser
I suspect the issue occurs when I make the request to the database from Express. I am really not sure how to solve this issue.
Here is the route in question:
router.get('/get-friends', (req, res) => {
  var species_ = req.body.species;
  var gender_ = req.body.gender;
  var neutered_ = req.body.neutered;
  // query db
  Friend.find({species: species_}, {gender: gender_}, {neutered: neutered_}).then((friends_) => {
    if (!friends_) {
      return res.status(404).send('query error, nothing returned');
    }
    return res.send(friends_);
  }).catch((e) => {
    res.status(400).send(4);
  });
});
Here is the project repo and the relevant files are:
https://github.com/edgarvi/foster-friends/server.js (Express server)
https://github.com/EdgarVi/foster-friends/blob/master/routes/api/users.js (Routes for the express server)
https://github.com/EdgarVi/foster-friends/blob/master/client/src/components/layout/SearchFriends.js (React component which calls the server)
I would gladly appreciate any help!
I have highlighted possible problems.
Reason: CORS request did not succeed
The HTTP request which makes use of CORS failed because the HTTP connection failed at either the network or protocol level. The error is not directly related to CORS, but is a fundamental network error of some kind.
> In many cases, it is caused by a browser plugin (e.g. an ad blocker or privacy protector) blocking the request.
Other possible causes include:
Trying to access an https resource that has an invalid certificate will cause this error.
Trying to access an http resource from a page with an https origin will also cause this error.
As of Firefox 68, https pages are not permitted to access http://localhost, although this may be changed by Bug 1488740.
> The server did not respond to the actual request (even if it responded to the Preflight request). One scenario might be an HTTP service being developed that panicked without returning any data.
https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS/Errors/CORSDidNotSucceed
Thank you all for the help and the suggestions. After struggling through this for multiple days, I finally encountered a solution.
In my react client, I made the API call:
axios.get('http://localhost:5000/api/users/get-friends', {
  params: {
    species: this.state.species,
    gender: this.state.gender,
    neutered: neutered_
  }
});
and then I changed the Mongoose query to look like:
router.get('/get-friends', (req, res) => {
  var species_ = req.query.species;
  var gender_ = req.query.gender;
  var neutered_ = req.query.neutered;
  // query db
  Friend.find({species: species_}, {gender: gender_}, {neutered: neutered_}).then((_friends) => {
    return res.send(_friends);
  });
});
I'm not exactly sure why these changes made my code finally work but once again, thank you all for the help and suggestions!
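The likely explanation: a GET request carries no body, so req.body.species was undefined, while axios puts params into the URL query string, which Express exposes as req.query. One more hedged note: Mongoose's find() takes (filter, projection, options), so passing three separate objects uses only the first one as the filter; combining them into a single filter object is probably what was intended, as in this sketch:
// Combine all conditions into a single filter object (a sketch, not the poster's exact code)
Friend.find({ species: species_, gender: gender_, neutered: neutered_ })
  .then(friends => res.send(friends))
  .catch(err => res.status(400).send(err));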

Node.js / Express.js + Angular router - server overwriting client view with response object when using direct link

I am building a Node.js app with Express, hosting an Angular SPA in the public folder.
The app runs and the hosting works fine when I use the Angular router for navigation around the website, but when I try to access a link directly, for example http://192.168.1.4:3000/posts, the entire body of the page is just the JSON response object, without the app.
This is the Node.js code handling the GET request:
postRouter.route('/')
  .options(cors.corsWithOptions, (req, res) => {
    res.sendStatus(200);
  })
  .get(cors.cors, (req, res, next) => {
    posts.find({})
      .then((post) => {
        res.status(200);
        res.setHeader('Content-Type', 'application/json');
        res.send(post);
      }, (err) => next(err))
      .catch((err) => next(err));
  });
This is my Angular service sending out the GET request:
getPosts(): Observable<Post[]> {
  return this.http.get(baseURL + 'posts')
    .catch(error => { return this.processHttpService.handleError(error); });
}
Post Component .ts file
ngOnInit() {
  this.postService.getPosts()
    .subscribe(posts => { this.posts = posts; console.log(this.posts); },
      errmess => this.errMess = <any>errmess);
}
Again, when I use my Angular 5 client app hosted in the public folder and built with ng build --prod, the JSON object is retrieved from the MongoDB database and displayed correctly on my website, along with the rest of the app: the header, the body, and the footer.
It might also be worth noting that the console.log in ngOnInit() does not appear in the browser console when using the direct link.
Any advice/fix is greatly appreciated
You have a clash of routes between Angular and your Express application. Angular is served up on one route (I'm guessing the / route) and then it sort of "hijacks" the user's navigation: it never actually changes web pages or makes a web request to that resource, it just changes the URL in the navigation bar.
You've then got endpoints on a web server listening on those same paths. This means the moment you visit the /posts page directly, you're not asking Angular to do anything. In fact, Angular isn't even loaded, because it only gets loaded on the / route; instead you're going straight to your API.
There are ways around this. To start with, many people put their API somewhere separate, either on a subdomain or mounted under /api (such as /api/posts). Then your Angular app can be served up on the / route. There are other techniques you can use to then allow a user to go to /posts and still get your Angular app loaded.
You can use a few approaches for this, such as the hash location strategy, or you can serve up your Angular application from any route on the application (* in Express) and let the loaded Angular app take over. This second approach is the most common; it usually results in hosting your API on a subdomain and serving your Angular app on the * route of the normal domain name. For example: api.myapp.com will serve only JSON responses, but any route on myapp.com will serve the Angular app, such as myapp.com/posts. A minimal sketch of this follows below.
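A minimal sketch of the catch-all approach in Express (the paths and router name are assumptions; adjust them to your build output):
const express = require('express');
const path = require('path');
const app = express();

// API routes first, under their own prefix
app.use('/api/posts', postRouter);

// Static assets from the Angular build output
app.use(express.static(path.join(__dirname, 'public')));

// Any other route serves the SPA shell so the Angular router can take over client-side
app.get('*', (req, res) => {
  res.sendFile(path.join(__dirname, 'public', 'index.html'));
});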

Redirecting client with NodeJS and Restify

I'm building a REST backend for an SPA with NodeJS, Restify and PassportJS for authentication. Everything's working except the last step, which is redirecting the client from the backend's /login/facebook/callback to the home page of the application.
I've searched online and found lots of answers for ExpressJS but nothing useful for Node-Restify yet. I've managed to pick up a few snippets of code and this is what I'm attempting at the moment:
app.get('/api/v1/login/facebook/cb', passport.authenticate('facebook', { scope: 'email' }), function(req, res) {
  req.session.user = req.user._id;
  res.header('Location', '/#/home');
  res.send();
});
The response is sent but the location header is not included and the client is presented with a white screen. How do I do a proper redirect using the Node-Restify API?
Restify's Response interface now has a redirect method.
As of this writing, there's a test showing how to use it here.
The contents of that test are:
server.get('/1', function (req, res, next) {
  res.redirect('https://www.foo.com', next);
});
Many folks who use Restify are more familiar with ExpressJS. It's important to understand that (again, as of this writing) one of the three main public API differences affecting porting of Express plugins is that the res.redirect method in Restify requires you to pass next (or an InternalError is thrown). I've personally ported several modules from Express to Restify and the main API differences at first are (in Restify):
server.use is only for path & HTTP-method-agnostic middleware
res.redirect requires that you pass next
Some members of the Request interface are methods rather than values, such as req.path: req.path is an alias of req.getPath in Restify
I am NOT saying that under-the-hood they are similar, but that the above three things are the main obstacles to porting over Express plugins. Under-the-hood, Restify has many advantages over Express in my experience using it in both large enterprise applications and personal projects.
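Applied to the login callback from the question, a minimal sketch (assuming the rest of the Passport setup stays unchanged) could be:
app.get('/api/v1/login/facebook/cb',
  passport.authenticate('facebook', { scope: 'email' }),
  function (req, res, next) {
    req.session.user = req.user._id;
    // Restify's res.redirect needs next, otherwise an InternalError is thrown
    res.redirect('/#/home', next);
  });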
You need to use redirection status code 302 along with the Location header you already set:
res.send(302); or res.send(302, 'your response');

Express request is called twice

To learn Node.js I'm creating a small app that gets some RSS feeds stored in MongoDB, processes them and creates a single feed (ordered by date) from them.
It parses a list of ~50 RSS feeds with ~1000 blog items, so parsing everything takes quite a while; I added req.connection.setTimeout(60*1000); to get a long enough timeout to fetch and parse all the feeds.
Everything runs fine, but the request is called twice. (I checked with Wireshark; I don't think it's about the favicon here.)
I really don't get it.
You can test yourself here : http://mighty-springs-9162.herokuapp.com/feed/mde/20 (it should create a rss feed with the last 20 articles about "mde").
The code is here: https://github.com/xseignard/rss-unify
And if we focus on the interesting bits:
I have a route defined like this: app.get('/feed/:name/:size?', topics.getFeed);
And topics.getFeed looks like this:
function getFeed(req, res) {
  // 1 minute timeout to get enough time for the request to be processed
  req.connection.setTimeout(60*1000);
  var name = req.params.name;
  var callback = function(err, topic) {
    // if the topic has been found
    if (topic) {
      // aggregate the corresponding feeds
      rssAggregator.aggregate(topic, function(err, rssFeed) {
        if (err) {
          res.status(500).send({error: 'Error while creating feed'});
        }
        else {
          res.send(rssFeed);
        }
      },
      req);
    }
    else {
      res.status(404).send({error: 'Topic not found'});
    }
  };
  // look for the topic in the db
  findTopicByName(name, callback);
}
So nothing fancy, but still, this getFeed function is called twice.
What's wrong there? Any idea?
This annoyed me for a long time. It's most likely the Firebug extension which is sending a duplicate of each GET request in the background. Try turning off Firebug to make sure that's not the issue.
I faced the same issue while using the Google Cloud Functions Framework (which uses Express to handle requests) on my local machine. Each fetch request (from the browser console or within a web page) resulted in two requests to the server. The issue was related to CORS (because I was using different ports): Chrome made an OPTIONS call before the actual call. Since the OPTIONS method was not needed in my code, I used an if-statement to return an empty response.
if (req.method == "OPTIONS") {
  res.set('Access-Control-Allow-Origin', '*');
  res.set('Access-Control-Allow-Headers', 'Content-Type');
  res.status(204).send('');
}
Spent nearly 3hrs banging my head. Thanks to user105279's answer for hinting this.
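If you prefer not to hand-roll the preflight response, the cors package can also be used inside a single handler; a sketch (corsHandler and myFunction are just illustrative names):
const cors = require('cors');
const corsHandler = cors({ origin: true });

exports.myFunction = (req, res) => {
  // cors() answers the OPTIONS preflight itself and only invokes the callback
  // for the actual request.
  corsHandler(req, res, () => {
    res.send('Hello from the function');
  });
};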
If you have a favicon on your site, remove it and try again. If that resolves your problem, fix the way your favicon URL is served.
I'm doing more or less the same thing now, and noticed the same thing.
I'm testing my server by entering the api address in chrome like this:
http://127.0.0.1:1337/links/1
my Node.js server is then responding with a json object depending on the id.
I set up a console.log in the GET handler and noticed that when I change the id in Chrome's address bar, a request is sent even before I hit enter, and the server then receives another request once I actually hit enter. This happens with and without the Chrome dev console open.
IE 11 doesn't seem to work in the same way but I don't have Firefox installed right now.
Hope that helps someone even if this was a kind of old thread :)
/J
I managed to fix this with listen.setTimeout and axios.defaults.timeout = 36000000.
Node.js:
var timeout = require('connect-timeout'); // express v4
// in cors, set the OPTIONS success status to 200 and preflightContinue to false
app.use(cors({ preflightContinue: false, optionsSuccessStatus: 200 }));
// put this middleware after the other middlewares
app.use(timeout(36000000)); // 36,000,000 ms = 10 hours
app.use((req, res, next) => {
  if (!req.timedout) next();
});
var listen = app.listen(3333, () => console.log('running'));
listen.setTimeout(36000000); // 36,000,000 ms = 10 hours
React:
import axios from 'axios';
axios.defaults.timeout = 36000000; // 36,000,000 ms = 10 hours
After 2 days of trying.
You might have to increase the timeout even more. I haven't looked at the Express source, but it sounds like it retries on timeout.
Make sure you call res.send(); the axios call expects a response from the server and otherwise sends the request again after 120 seconds.
I had the same issue doing this with Express 4. I believe it has to do with how it resolves request params. The solution is to ensure your params are resolved, for example by checking them in an if block:
app.get('/:conversation', (req, res) => {
  let url = req.params.conversation;
  // Only handle the request when the param has resolved
  if (url) {
    res.redirect(301, 'http://' + url + '.com');
  }
});
In my case, my Axios POST requests were received twice by Express: the first without a body, the second with the correct payload. The same request sent from Postman was only received once. It turned out that Express was running on a different port, so my requests were cross-origin. This caused Chrome to send a preflight OPTIONS request to the same URL (the POST URL), and my app.all routing in Express processed that one too.
app.all('/api/:cmd', require('./api.js'));
Separating POST from OPTIONS solved the issue:
app.post('/api/:cmd', require('./api.js'));
app.options('/', (req, res) => res.send());
I ran into the same problem. I tried adding a bare return, which didn't work, but it worked once I used return res.redirect('/path');
I had the same problem. Then I opened the Chrome dev tools and found out that favicon.ico was being requested from my Express.js application. I needed to fix the way I registered the middleware.
[Screenshot of Chrome dev tools]
I also had double requests. In my case it was the forwarding from the http to the https protocol. You can check whether that's the case by inspecting
req.headers['x-forwarded-proto']
It will be either 'http' or 'https'.
I could fix my issue simply by adjusting the order in which my middlewares run.
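A minimal sketch of that kind of ordering fix (assumes the app sits behind a proxy that sets x-forwarded-proto, e.g. Heroku):
// Redirect http -> https exactly once, before any route handlers run,
// so the same request is not processed a second time after the redirect.
app.enable('trust proxy');
app.use((req, res, next) => {
  if (req.headers['x-forwarded-proto'] === 'http') {
    return res.redirect(301, 'https://' + req.headers.host + req.originalUrl);
  }
  next();
});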
