I am trying to get an app that uses socket.io v.3.1.1 to work on production.
It works well on development using webpack devServer for the client on 3000 and nodemon for the server on 4000.
But when I put it on the production server the client complains with:
Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at http://localhost:5003/socket.io/?EIO=4&transport=polling&t=NUmy2Us.
Server
import express from 'express'
import { createServer } from 'http'
import { Server } from 'socket.io'

const app = express()
const prod = process.env.NODE_ENV === 'production'
const port = process.env.PORT || (prod ? 5003 : 4000)

const httpServer = createServer(app)
const io = new Server(httpServer, {
  cors: {
    origin: '*',
    methods: ['GET', 'POST']
  }
})

const connections = []

io.on('connection', (socket) => {
  connections.push(socket)
  console.log(`Socket id ${socket.id} connected`)
  socket.on('disconnect', () => {
    connections.splice(connections.indexOf(socket), 1)
  })
})

httpServer.listen(port, () => console.log(`App listening on port ${port}.`))
....
Client
...
import { io } from 'socket.io-client'
const port = process.env.NODE_ENV === 'development' ? '4000' : '5003'
const socket = io(`http://localhost:${port}`)
This setup works in development, but when I deploy it to production on port 5003 it throws the CORS error.
In the nginx server block I have:
location /thisapp/ {
  auth_basic $authentication;
  auth_basic_user_file /var/www/.htpasswd;
  try_files $uri $uri/ =404;
}

# And other proxies for express routing
location /api/process {
  proxy_pass http://localhost:5003/api/process;
}

location /api/process/download {
  proxy_pass http://localhost:5003/api/process/download;
}
I know the app is listening on 5003 on the server.
Pm2 log App
App listening on port 5003.
When I look at the WebSockets tab of the browser's network panel, dev and production show different results (screenshots omitted).
The production server runs on HTTPS with Let's Encrypt, but that has never been an issue for the other apps I run. I wonder if Socket.IO needs me to do something about it, though.
I tried multiple combinations of different approaches, but I always end up with the same CORS error.
I just ran into this last week - though not with Socket.io - so hopefully I can help.
Before the answer, a link to point you towards some reading on what's going on: https://developer.mozilla.org/en-US/docs/Web/Security/Same-origin_policy#how_to_allow_cross-origin_access
If your NGINX has access to use more_set_headers then try adding this inside your location block:
more_set_headers 'Access-Control-Allow-Origin: *';
If that works, you can next try paring that back further to:
more_set_headers 'Access-Control-Allow-Origin: http://localhost:5003';
If you don't have access to more_set_headers you can use add_header instead, but it's confusingly named. It doesn't only add headers; it will also remove any headers applied by blocks further up the hierarchy, like in your server block. more_set_headers will not remove those headers and is truly additive.
The syntax differs a bit. If you're forced to use add_header, try:
add_header Access-Control-Allow-Origin *;
Or more restrictive:
add_header Access-Control-Allow-Origin http://localhost:5003;
Finally, if you need to support multiple origins you can do it like this to have NGINX automatically return a header that is compatible with the origin making the request.
Outside your server block:
map $http_origin $allow_origin {
  ~^(http://localhost:5003|http://example.com)$ $http_origin;
}
Inside your location block:
add_header Access-Control-Allow-Origin $allow_origin;
I'm not 100% sure about the syntax for using map together with more_set_headers but this should get you 95% of the way there.
In the end after a lot of back and forth it turned out not to have anything to do with the headers.
I think my problem was twofold.
On dev I am using webpack devServer for the front end on port 3000 and nodemon for the backend on 4000, so I was not using Nginx or PM2, and it worked just fine.
On production, however, I had no Nginx location block for socket.io, and PM2 was running in cluster mode with two instances (with more than one instance, Socket.IO's HTTP-polling handshake needs sticky sessions, so cluster mode alone breaks it). The moment I changed PM2 to a single instance and added the Nginx location block for socket.io, it started to work with this (a sample PM2 config is sketched at the end of this answer):
Server
import express from 'express'
import { createServer } from 'http'
import { Server } from 'socket.io'

const app = express()
const prod = process.env.NODE_ENV === 'production'
const port = process.env.PORT || (prod ? 5003 : 4000)

const httpServer = createServer(app)
const io = new Server(httpServer)
// Or, to make it also work on dev:
// const io = new Server(httpServer, { cors: true })

const connections = []

io.on('connection', (socket) => {
  connections.push(socket)
  console.log(`Socket id ${socket.id} connected`)
  socket.on('disconnect', () => {
    connections.splice(connections.indexOf(socket), 1)
  })
})

httpServer.listen(port, () => console.log(`App listening on port ${port}.`))
Nginx
location /socket.io/ {
  proxy_pass http://localhost:5003/socket.io/;
  proxy_http_version 1.1;
  proxy_set_header Upgrade $http_upgrade;
  proxy_set_header Connection "upgrade";
  proxy_set_header Host $host;
}
Client
import { io } from 'socket.io-client'

const socket = io()

// Or, to make it also work on dev:
// const dev = process.env.NODE_ENV === 'development'
// const socket = dev ? io('http://localhost:4000') : io()
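For reference, pinning PM2 to a single fork-mode instance can be done with an ecosystem file. This is only a sketch; the app name, script path and env values below are assumptions, not my exact setup:

// ecosystem.config.js -- sketch; adjust name, script and env to your app
module.exports = {
  apps: [
    {
      name: 'thisapp',
      script: './server.js',
      instances: 1,       // single instance; Socket.IO polling breaks across non-sticky cluster workers
      exec_mode: 'fork',
      env_production: {
        NODE_ENV: 'production',
        PORT: 5003
      }
    }
  ]
}

It would be started with pm2 start ecosystem.config.js --env production.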
Related
I have a frontend React app that uses axios to get data from a separate Node server. The frontend uses a .app domain with an SSL certificate, but the backend is plain HTTP on an IP address and port (http://localhost:3001).
import React from 'react';
import ReactDOM from 'react-dom/client';
import './index.css';
import App from './App';

const root = ReactDOM.createRoot(document.getElementById('root'));
root.render(
  <React.StrictMode>
    <App />
  </React.StrictMode>
);
...
axios({
  method: 'GET',
  url: 'http://localhost:3001/',
  params: {page: pageNumber, limit: limit},
  cancelToken: new axios.CancelToken(c => cancel = c)
}).then( res => {
  setSermons( prevSermons => {
    return [...new Set([...prevSermons, ...res.data.map( sermon => sermon )])]
  })
  setHasMore(res.data.length > 0)
  setLoading(false)
}).catch( e => {
  if (axios.isCancel(e)) return
  setError(true)
})
return () => cancel()
}, [query, pageNumber, limit] )
... and here is my backend node/express server:
const express = require('express')
const cors = require('cors')
const knex = require('knex')
require('dotenv').config()

const db = knex({client: 'pg', connection: <...connection stuff...>})

const app = express()
app.use(express.urlencoded({ extended: false }))
app.use(express.json())
app.use(cors())

app.get('/', (req, res) => {
  let page = req.query.page || 0
  let limit = req.query.limit || 50
  db.select('*')
    .from('sermons')
    .limit(limit, {skipBinding: true})
    .offset(limit*page)
    .then( (data) => {
      res.json(data)
    })
    .catch( (err) => {
      console.log(err)
    })
})

const port = process.env.APP_PORT
app.listen(port, '0.0.0.0', () => console.log(`Server running on port ${port}, http://localhost:${port}`));
I can open both the frontend and backend parts of the site on my browser. The backend is accessible via http://157.xxx.xxx.xxx:3001 IP and the frontend maps to my domain. But the frontend can't retrieve data from the backend.
All of this is running behind an nginx reverse proxy. I did not find any firewalls installed on the server. Yesterday it was working but overnight the connection refused error started. I know that previously, I left the localhost out of the nginx setup entirely.
It seems like CORS is not working, even though the node server is importing/using it. What more can I look at to debug this?
try adding "proxy": "http://localhost:3001" to your package.json
it will proxy your backend and resolve CORS issues. You can read more about it in this blog post:
https://medium.com/bb-tutorials-and-thoughts/react-how-to-proxy-to-backend-server-5588a9e0347
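Note that the proxy field only affects the Create React App dev server, not a production build. If you need more control, CRA also supports a src/setupProxy.js; a minimal sketch, assuming CRA, the http-proxy-middleware package, and an '/api' prefix (the prefix is an assumption, not something from the question):

// src/setupProxy.js -- sketch; requires `npm install http-proxy-middleware`
const { createProxyMiddleware } = require('http-proxy-middleware');

module.exports = function (app) {
  // forward /api requests from the dev server to the Express backend
  app.use(
    '/api',
    createProxyMiddleware({
      target: 'http://localhost:3001',
      changeOrigin: true,
    })
  );
};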
It does sound like a CORS problem.
Here is a quick video explaining how CORS works:
https://www.youtube.com/watch?v=4KHiSt0oLJ0&ab_channel=Fireship
Try adding a proxy in the package.json, or set a header like this:
res.setHeader('Access-Control-Allow-Origin', '*');
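If you go the header route, here is a minimal sketch of where that setHeader call would live in an Express app (the port and route are placeholders, not taken from the question):

const express = require('express');
const app = express();

// set the CORS header on every response before the route handlers run
app.use((req, res, next) => {
  res.setHeader('Access-Control-Allow-Origin', '*');
  next();
});

app.get('/', (req, res) => res.json({ ok: true }));

app.listen(3001, () => console.log('API listening on 3001'));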
It turns out I couldn't get the backend server to work at http://localhost:3000, but it did work once I exposed it as a real external API over HTTPS. This is the final configuration that worked:
The nginx.conf entry for JUST the backend server. Previously it was absent from nginx.conf because I was trying to keep it a localhost-only endpoint.
server {
  listen 443 ssl; # using port 3000 won't work on some wifi, so incoming is 443 (ssl)
  # server_name 157.xxx.xxx.xxx; # gets blocked because of mixed http / https content
  server_name <app.mydomain.com>;
  ssl_certificate /etc/letsencrypt/live/<app.mydomain.com>/fullchain.pem;
  ssl_certificate_key /etc/letsencrypt/live/<app.mydomain.com>/privkey.pem;

  location / {
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header Host $host;
    proxy_pass http://127.0.0.1:3001; # <--- react-backend server listens on this port
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";
  }
}
How the frontend React app uses axios to talk to the backend. The backend endpoint is now HTTPS, with a separate, valid SSL certificate for the <api.mydomain.com> subdomain:
axios({
  method: 'GET',
  mode: 'cors',
  url: 'https://api.<mydomain.com>',
  params: {page: pageNumber, limit: limit}, // { q: query, page: pageNumber },
Changes to my node/express backend server endpoint:
const express = require('express')
const knex = require('knex')
require('dotenv').config()
const cors = require('cors')

const app = express()

var corsOptions = {
  origin: '*',
  optionsSuccessStatus: 200
}
app.use(cors(corsOptions))

const db = knex({...config stuff})

app.get('/', (req, res) => {
  db.select(... --> ...res.data)
  res.setHeader('Access-Control-Allow-Origin', '*');
})

const port = process.env.APP_PORT
app.listen(port, '0.0.0.0', () => console.log(`Server running on port ${port}, http://localhost:${port}`));
I think many of these changes were not needed, but this is a working state, over https.
Next I'll see if I can block access to this backend to everyone except the frontend domain.
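A rough sketch of that next step, using the same cors package with the frontend domain as a placeholder ('https://app.mydomain.com' is not the real domain). Note that CORS only restricts what browsers will read; it does not block direct requests to the API:

const express = require('express')
const cors = require('cors')

const app = express()

// only allow the frontend's origin ('https://app.mydomain.com' is a placeholder)
app.use(cors({
  origin: 'https://app.mydomain.com',
  optionsSuccessStatus: 200
}))

app.get('/', (req, res) => res.json({ ok: true }))

app.listen(process.env.APP_PORT || 3001)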
I have used Socket.IO for live updates inside a Node.js application. When I integrated it on localhost it worked fine and I received all the events.
But as soon as I deploy it to the production server, the socket does not connect and no events are fired. I have deployed the Node.js application on an Ubuntu server, I use nginx as a reverse proxy, and SSL is set up for our domain.
This is my app.js
const express = require('express');
const cors = require('cors');

const app = express();
app.use(cors());

// rootRouter is defined in a separate file (path assumed)
const rootRouter = require('./routes');
app.use('/api', rootRouter);

// socket setup
var http = require('http').createServer(app);
var io = require('socket.io')(http);

io.on('connection', (socket) => {
  console.log('we have new connection');

  socket.on('initiateConnection', async data => {
    const { userId } = data;
    socket.userId = userId;
    socket.join(userId);
    io.sockets.in(userId).emit('InitiatedConnection', userId);
    console.log('User connected with id - ', data.userId);
  });

  socket.on("disconnect", async (reason) => {
    if (socket.userId) {
      const userId = socket.userId;
      socket.leave(socket.userId);
      console.log("user disconnected with id - " + userId);
    }
  });

  socket.on('newOrderCreated', async data => {
    console.log('newOrderCreated called', data)
    io.sockets.in(data.userId).emit('OrderReceived', data);
  })
});

const port = process.env.PORT || 3000;
http.listen(port, function () {
  console.log(`listening on *:${port}`);
});

module.exports = app;
For testing purposes I tried to call the socket from one of the router files; that is, I initialised a client from my orders controller. The code is as follows:
/controllers/ordersController.js
const io = require('socket.io-client');

//const SOCKET_ENDPOINT = 'http://localhost:3000/' // endpoint for localhost
const SOCKET_ENDPOINT = 'https://example.com/' // endpoint for live server

const socket = io(`${SOCKET_ENDPOINT}`);

socket.on('OrderReceived', (data) => {
  console.log('New Request order received', data);
})

module.exports = {
  findNearbyOutlet: async(req, res) => {
    console.log('socket', socket)
    let userId = req?.loggedInUserId ? req?.loggedInUserId : 0;

    // emit this event to initiate connection with server
    socket.emit('initiateConnection', {
      userId
    });

    // emit this event to broadcast message about new order
    socket.emit('newOrderCreated', {
      orderId: 1,
      userId: userId,
      orderDate: '2022-05-02',
      userName: 'Some user',
      address: 'Atos society, C/1102, Jaipur, India-411110',
      totalAmount: 100
    })
  }
}
For your information, we have integrated SSL with our domain. The following is the nginx configuration for our site:
server {
  listen 443 ssl;
  ssl_certificate /var/www/ssl/example.com/bundle.crt;
  ssl_certificate_key /var/www/ssl/example.com/HSSL-6256604eeb4e6.key;
  server_name example.com www.example.com;

  location / {
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header Host $host;
    proxy_pass http://localhost:3000;
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection 'upgrade';
    proxy_cache_bypass $http_upgrade;
    client_max_body_size 100M;
  }
}

server {
  listen 80;
  server_name example.com www.example.com;
  return 301 https://$server_name$request_uri;
}
So this is what I have done for deployment. Everything else works fine; the only thing that doesn't is the Socket.IO events. Even the socket connection itself is not established. Everything works on localhost but not on the live server.
I have tried a lot of things but have no clue why the socket events are not working. Please help me with this socket setup on the live server and let me know what I am missing.
Thanks in advance.
I'm setting up an Nginx reverse proxy to a NodeJS app that includes Socket.IO, on a server that hosts additional NodeJS apps.
The NodeJS app is running via PM2 on port 3001. Here is the Nginx configuration:
server {
  listen 80;
  server_name iptv-staging.northpoint.org;

  location / {
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header Host $http_host;
    proxy_pass http://localhost:3001;
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";
  }
}
When running the app directly via the server's IP address (http://xxx.xxx.xxx.xxx:3001/) everything works without issue: ping/pong requests from Socket.IO are around 50ms (the default pingTimeout is 5000ms). When accessing the app via its DNS name (http://iptv-staging.northpoint.org) the client reports a ping timeout and disconnects. It reconnects on the first try, then disconnects again on the next ping/pong request.
From what I can tell, the problem has to be related to the Nginx reverse proxy and how WebSockets are being routed through it. It seems the server's reply to a ping request is not making it back to the client, but I can't determine why. Any help is much appreciated.
I had exactly the same issue. A guy at work provided the answer and, in honesty, I didn't fully understand it, so no credit to me, but I hope I can tease the answer out of my code.
Firstly, you need to tell nginx to pass along the incoming URI as seen by the client by adding the X-Real-URI header:
proxy_set_header X-Real-URI $request_uri;
The difference between this path and the path eventually seen in your server tells you the base path of the URI.
Then, I created an API endpoint on the server that returns URI info. This is the key to telling the client what address to use to connect.
apiRoutes.get('/options', (req, res) => {
  Log.info('Received request for app options.');

  // This isn't applicable to you, just showing where options declared.
  let options = JSON.parse(JSON.stringify(config.get('healthcheck.options')));

  // Decide what port to use. It might be a port for a dev instance or from Env
  if ((process.env.PORT !== options.port) && (process.env.PORT > 0)) {
    options.port = process.env.PORT;
  }

  // This is the important bit...
  var internalUri = req.originalUrl;
  var externalUri = req.get('X-Real-URI') || internalUri;
  options.basePath = externalUri.substr(0, externalUri.length - internalUri.length + 1);

  res.status(200).send(JSON.stringify(options));
});
My client is a React app; you'll have to think about how you need to implement it, but here's how I did it.
Here's my 'helper' function for calling the Options service...
export async function getOptions() {
  const res = await axios.get('/api/options');
  return res.data;
}
In my React page, I then call this when the page loads...
componentDidMount() {
  getOptions()
    .then((res) => {
      this.setState({ port: res.port });

      // Call a client-side function to build the URL (included below).
      const socketUrl = getSocketUrl(res.port);
      console.log(`Attempting to connect to sockets at ${socketUrl}.`);

      // This is where it all comes together...
      const socket = io(socketUrl, { path: `${res.basePath}socket.io` });

      socket.on('connect', function () { console.log(`Connected to ${socketUrl}`); });
      socket.on('message', (result) => {
        console.log(result);
      });
      socket.on('data', (result) => {
        console.log(`Receiving next batch of results.`);
        const filteredResults = JSON.parse(result);
        // Do something with the results here...
      });
    });
  .....
Lastly, the getSocketUrl function...
function getSocketUrl(port) {
  console.log(`${window.location.hostname}`);
  if (window.location.hostname.toString().includes('local')) {
    return `localhost:${port}`;
  }
  return `${window.location.hostname}`;
}
I am having a problem today that has something to do with routing. I have two main codebases: one is the frontend and the other is the backend.
The frontend is written using Vue.js so it's a SPA. This webapp is kind of complex and involves a lot of routing and backend AJAX API calls.
// All imports
import ...

loadMap(Highcharts);
loadDrilldown(Highcharts);
boost(Highcharts);

Vue.config.productionTip = false
Vue.use(VueCookie);
Vue.use(ElementUI, {locale});
Vue.use(VueRouter);
Vue.use(VueHighcharts, { Highcharts });
Vue.use(HighMaps);

// This is a global component declaration
Vue.component('app-oven', Devices);
Vue.component('app-sidebar', SideBar);
Vue.component('app-header', Header);
Vue.component('app-footer', Footer);
Vue.component('app-query', Query);
Vue.component('app-deviceproperties', DeviceProperties);
Vue.component('app-device', Device)
Vue.component('app-queryselection', QuerySelection)
Vue.component('app-index', Index)
Vue.component('app-index', Error) // note: same name as the line above; probably meant to be 'app-error'
Vue.component('app-realtime', RealTime);
Vue.component('app-login', Login)
Vue.component('app-preferences', Preferences)

const routes = [
  { path: '/index', component: Index },
  { path: '/', component: Login },
  { path: '/device/:deviceId', component: Device },
  { path: '/preferences', component: Preferences },
  { path: '*', component: Error }
];

const router = new VueRouter({
  routes: routes,
  mode: "history" // Gets rid of the # before the path
})

new Vue({
  el: '#app',
  router: router,
  components: { App },
  template: '<App/>'
})
The backend is written using Express on Node.js and it answers to specific AJAX calls from the Frontend.
// All imports
import ...

function prepareApp() {
  let app = new Express();

  app.use(cors({
    origin: "*",
    allowedHeaders: "Content-type",
    methods: "GET,POST,PUT,DELETE,OPTIONS"
  }));

  app.use(function(req, res, next) {
    res.header("Access-Control-Allow-Origin", "*");
    res.header("Access-Control-Allow-Headers", "Origin, X-Requested-With, Content-Type, Accept");
    next();
  });

  app.use(helmet());
  app.use(bodyParser.json());
  app.use(bodyParser.urlencoded({extended: false}));

  // Get all parameters
  app.get('/params', params.parameters);
  // Get all devices ever seen on the databases
  app.get('/devices', params.devices);
  app.get('/organizeData', organizer.updateAll);

  // WebApp used services to access various things
  app.post('/customQuery', stats.query);
  app.post('/statistics', stats.statistics)
  app.post('/getUserInfo', stats.getUserInfo)
  app.post('/setUserInfo', stats.setUserInfo)
  app.post('/genericQuery', stats.genericQuery)
  app.post('/NOSQLQuery', stats.NOSQLQuery)

  // Users check and insertion
  app.get('/insertUser', stats.insertUser)
  app.post('/verifyUser', stats.verifyUser)

  app.get('/', errors.hello); // Returns a normal "hello" page
  app.get('*', errors.error404); // Catch 404 and forward to error handler
  app.use(errors.error); // Other errors handler

  return app;
}

let app = prepareApp();

// App listener on localhost:8080
app.listen(8080, () => {
  console.log("App listening on http://localhost:8080");
});
I only used this setup during development, so I had both running at the same time on localhost, each on its own port. Now I would like to start the production cycle, but I have no idea where to start.
Most importantly, I am deploying both applications onto a virtual machine running on an external server. It already has a DNS entry and a static IP address, so that part is covered.
The problem arises when I try to run both programs at the same time on this production machine, since its only open ports are 80 and 443. I think this is pretty normal in a production environment, but I don't know how to adapt my applications so that they can still talk to each other, and retrieve data from the database, while using a single port.
I hope I explained the problem well. Looking forward to a nice (and maybe long) answer.
I'd recommend running the backend on port 3000 internally, having nginx listen on 80 and 443, proxying URLs starting with '/api' to port 3000, and serving the frontend directly, since it's just a bunch of static files.
This would be your nginx configuration. Just make sure the backend server has some API prefix like '/api'. Build your Vue.js app with 'npm run build' and copy the folder to /opt/frontend.
upstream backend {
  server 127.0.0.1:3000;
}

server {
  listen 80 default_server;
  listen [::]:80 default_server;

  location /api/ {
    proxy_pass http://backend;
    proxy_redirect off;
    proxy_set_header Host $host;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Host $server_name;
  }

  location / {
    root /opt/frontend/dist;
    try_files $uri $uri/ /index.html;
  }
}
Alternatively, you could use the backend to host the frontend. However, a webserver like nginx is more efficient at serving static files than your backend api server.
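If you do go the "backend hosts the frontend" route, here is a minimal sketch of what that could look like with Express (the 'dist' path and port are assumptions; point them at whatever 'npm run build' produces and whatever port nginx proxies to):

const path = require('path');
const express = require('express');

const app = express();

// API routes keep their own prefix, e.g. app.get('/api/params', ...)

// serve the static files produced by the Vue build
app.use(express.static(path.join(__dirname, 'dist')));

// history-mode fallback: any other GET returns index.html so the Vue router can handle it
app.get('*', (req, res) => {
  res.sendFile(path.join(__dirname, 'dist', 'index.html'));
});

app.listen(8080, () => console.log('App listening on http://localhost:8080'));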
If you don't have a way to open more ports, you can build your frontend in production mode and then copy its index.html and dist folder to the same folder where your Node.js app is.
Then you create an Express app that listens on port 80 and sends the HTML file.
var express = require('express');
var app = express();
var path = require('path');

var dir = '//vm//path//here';

// note: this only serves index.html; the built assets would also need app.use(express.static(dir))
app.get('/', function(req, res) {
  res.sendFile(path.join(dir + '/index.html'));
});

app.listen(80);
In my case, my backend server doesn't run in cluster mode (e.g. with 3001, 3002, ... alongside port 80).
My case: a Rails server running under Passenger (mydomain.com, port 80), and I need to serve my Vue.js project on the same domain and the same port.
So the only solution is to serve Vue at a specific URL. This is my solution:
1. Change your nginx config:
http {
  # our backend app is passenger (rails server, running on port 80)
  passenger_root /usr/local/rvm/gems/ruby-2.2.10/gems/passenger-6.0.0;
  passenger_ruby /usr/local/rvm/gems/ruby-2.2.10/wrappers/ruby;

  include mime.types;
  default_type application/octet-stream;

  server {
    listen 80;
    passenger_enabled on;

    # we are using this folder as the root of our backend app.
    root /mnt/web/php/public;
    charset utf-8;

    location ~ ^/(images|javascripts|stylesheets|upload|assets|video)/ {
      root /mnt/www/php/public;
      expires 30d;
      add_header Cache-Control public;
      add_header ETag "";
    }

    # vuejs related content
    location /vue.html {
      root /mnt/web/vuejs/h5/dist;
    }

    location /static {
      root /mnt/web/vuejs/h5/dist;
    }
  }
}
2. In your Vue project's dist folder:
  $ mv index.html vue.html
3. All the request URLs in your Vue.js project should be changed according to the nginx config.
My Node.js application is running on localhost:8080
These are the server configuration files:
var express = require('express');
var http = require('http');
var app = express();
var WS = require('ws');
var config = require('./config/main');
var Client = require('./client');

// HTTP
var server = http.createServer(app);
app.use(express.static(__dirname + '/../client/build/'));
server.listen(config.httpPort, function() {
  console.info('HTTP listening on *:' + config.httpPort);
});

// WebSocket
var ws = new WS.Server({server: server});
console.info('Websocket server created');
ws.on('connection', function(ws) {
  var client = new Client(ws);
  console.log('New connection:', client.id);
});
and
module.exports = {
  /* HTTP PORT */
  httpPort: process.env.PORT || 8080,
  /* WebSocket PORT */
  wsPort: process.env.PORT || 8081
};
So I am trying to run it behind an Nginx reverse proxy:
location /myapp {
  proxy_pass http://localhost:8080/;
  proxy_set_header Upgrade $http_upgrade;
  proxy_set_header Connection "upgrade";
}
Now, when I go to localhost/myapp the page appears normally, with all the static files loaded, but it seems there is no WebSocket connection. PS: I am using Nginx v1.11.7.
Is there something wrong in the configuration, or did I miss something? Thank you.
Changing the URL that the client uses for the WS connection solved this.
So on my client side I changed this line:
  new WebSocket(location.origin.replace(/^http/, "ws"));
to this line:
  new WebSocket("ws://localhost/myapp");
Now it's working fine!
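If hard-coding localhost is a concern, a small follow-up sketch that derives the URL from the page location (so http becomes ws and https becomes wss) while keeping the /myapp prefix that nginx proxies:

// build the WS URL from the current origin instead of hard-coding the host
var wsUrl = location.origin.replace(/^http/, 'ws') + '/myapp';
var socket = new WebSocket(wsUrl);

socket.onopen = function () {
  console.log('WebSocket connected to ' + wsUrl);
};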