Nodejs + Socket.io + Nginx Reverse Proxy Not working

I'm setting up an Nginx reverse proxy to a NodeJS app that includes Socket.IO, on a server that hosts additional NodeJS apps.
The NodeJS app is running via PM2 on port 3001. Here is the Nginx configuration:
server {
    listen 80;
    server_name iptv-staging.northpoint.org;

    location / {
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header Host $http_host;
        proxy_pass http://localhost:3001;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
When running the app directly via the server's IP address (http://xxx.xxx.xxx.xxx:3001/) everything runs without issue; ping/pong requests from Socket.IO take around 50ms (the default pingTimeout is 5000ms). When accessing the app via its DNS name (http://iptv-staging.northpoint.org) the client reports a ping timeout and disconnects. It reconnects on its first try, then disconnects again on the first ping/pong request.
From what I can tell, the problem has to be related to the Nginx reverse proxy and how websockets are being routed through it. It seems that the server's reply to a ping request is not making it to the client, but I can't determine why. Any help is much appreciated.

I had exactly the same issue. A guy at work provided the answer and, in honesty, I didn't fully understand it, so no credit to me, but I hope I can tease the answer out of my code.
Firstly, you need to tell nginx to send the incoming URI as seen by the client by adding the X-Real-URI header:
proxy_set_header X-Real-URI $request_uri;
The difference between this path and the path eventually seen by your server tells you the base path of the URI.
Then I created an API endpoint on the server that returns URI info. This is the key to telling the client what address to use to connect.
apiRoutes.get('/options', (req, res) => {
    Log.info('Received request for app options.');
    // This isn't applicable to you, just showing where options are declared.
    let options = JSON.parse(JSON.stringify(config.get('healthcheck.options')));
    // Decide what port to use. It might be a port for a dev instance or from the environment.
    if ((process.env.PORT !== options.port) && (process.env.PORT > 0)) {
        options.port = process.env.PORT;
    }
    // This is the important bit...
    var internalUri = req.originalUrl;
    var externalUri = req.get('X-Real-URI') || internalUri;
    options.basePath = externalUri.substr(0, externalUri.length - internalUri.length + 1);
    res.status(200).send(JSON.stringify(options));
});
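To make the basePath arithmetic concrete, here is a small standalone sketch of the same computation (the URIs are made-up examples and the helper name is mine):

```javascript
// Hypothetical helper mirroring the basePath computation above:
// everything the proxy prepended to the URI, kept with a trailing slash.
function basePathFrom(externalUri, internalUri) {
    return externalUri.substr(0, externalUri.length - internalUri.length + 1);
}

// e.g. nginx serves the app under /myapp/ and forwards /api/options upstream
console.log(basePathFrom('/myapp/api/options', '/api/options')); // '/myapp/'
// with no prefix, the base path collapses to '/'
console.log(basePathFrom('/api/options', '/api/options'));       // '/'
```

The client can then prepend that base path to the Socket.IO path, as the endpoint below does.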
My client is a React app; you'll have to think about how you need to implement this, but here's how I did it.
Here's my 'helper' function for calling the Options service...
export async function getOptions() {
    const res = await axios.get('/api/options');
    return res.data;
}
In my React page, I then call this when the page loads...
componentDidMount() {
    getOptions()
        .then((res) => {
            this.setState({ port: res.port });
            // Call a client-side function to build the URL. Included later.
            const socketUrl = getSocketUrl(res.port);
            console.log(`Attempting to connect to sockets at ${socketUrl}.`);
            // This is where it all comes together...
            const socket = io(socketUrl, { path: `${res.basePath}socket.io` });
            socket.on('connect', function () { console.log(`Connected to ${socketUrl}`); });
            socket.on('message', (result) => {
                console.log(result);
            });
            socket.on('data', (result) => {
                console.log(`Receiving next batch of results.`);
                const filteredResults = JSON.parse(result);
                // Do something with the results here...
            });
        });
    .....
Lastly, the getSocketUrl function...
function getSocketUrl(port) {
    console.log(`${window.location.hostname}`);
    if (window.location.hostname.toString().includes('local')) {
        return `localhost:${port}`;
    }
    return `${window.location.hostname}`;
}

Related

Socket.io events are not triggering on live ubuntu server

I have used socket.io for live updates inside a nodejs application. When I integrated it on localhost it worked fine and I received all the events.
But as soon as I deploy it to the production server, my socket is not connecting and no events are being called. I have deployed the node js application on an ubuntu server and use nginx as a reverse proxy. We have also integrated SSL into our domain.
This is my app.js
const express = require('express');
const cors = require('cors');
const app = express();
app.use(cors());
app.use('/api', rootRouter); // rootRouter is defined elsewhere
// socket setup
var http = require('http').createServer(app);
var io = require('socket.io')(http);
io.on('connection', (socket) => {
    console.log('we have new connection');
    socket.on('initiateConnection', async data => {
        const { userId } = data;
        socket.userId = userId;
        socket.join(userId);
        io.sockets.in(userId).emit('InitiatedConnection', userId);
        console.log('User connected with id - ', data.userId);
    });
    socket.on("disconnect", async (reason) => {
        if (socket.userId) {
            const userId = socket.userId;
            socket.leave(socket.userId);
            console.log("user disconnected with id - " + userId);
        }
    });
    socket.on('newOrderCreated', async data => {
        console.log('newOrderCreated called', data);
        io.sockets.in(data.userId).emit('OrderReceived', data);
    });
});
const port = process.env.PORT || 3000;
http.listen(port, function () {
    console.log(`listening on *:${port}`);
});
module.exports = app;
And for testing purposes I was trying to call the socket from one of my router files; that is, I initialised a client from my orders controller. The code for that is as follows:
/controllers/ordersController.js
const io = require('socket.io-client');
//const SOCKET_ENDPOINT = 'http://localhost:3000/' // endpoint for localhost
const SOCKET_ENDPOINT = 'https://example.com/' // endpoint for live server
const socket = io(`${SOCKET_ENDPOINT}`);
socket.on('OrderReceived', (data) => {
    console.log('New Request order received', data);
});
module.exports = {
    findNearbyOutlet: async (req, res) => {
        console.log('socket', socket);
        let userId = req?.loggedInUserId ? req?.loggedInUserId : 0;
        // emit this event to initiate connection with server
        socket.emit('initiateConnection', {
            userId
        });
        // emit this event to broadcast message about new order
        socket.emit('newOrderCreated', {
            orderId: 1,
            userId: userId,
            orderDate: '2022-05-02',
            userName: 'Some user',
            address: 'Atos society, C/1102, Jaipur, India-411110',
            totalAmount: 100
        });
    }
};
For your information, we have integrated SSL with our domain. Following is the nginx configuration for our site:
server {
    listen 443 ssl;
    ssl_certificate /var/www/ssl/example.com/bundle.crt;
    ssl_certificate_key /var/www/ssl/example.com/HSSL-6256604eeb4e6.key;
    server_name example.com www.example.com;

    location / {
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header Host $host;
        proxy_pass http://localhost:3000;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_cache_bypass $http_upgrade;
        client_max_body_size 100M;
    }
}
server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://$server_name$request_uri;
}
So this is what I have done for deployment. Everything else is working fine; only the socket.io events are not triggering. Even the socket connection itself is not working. Everything works fine on localhost but not on the live server.
I have tried a lot of things but have no clue why the socket events are not working. Please help me with this socket setup on the live server and let me know what I am missing.
Thanks in advance.

How to get websocket over SSL in Express app?

After making my websocket work locally I deployed it and got security errors because it was not served over SSL. I am hosting a NodeJS Express app with Nginx and Certbot.
server.js
const express = require('express');
const app = express();
// unrelated imports
app.use(bodyParser.json());
app.use(cors());
app.use(express.static('public'));
app.use(bodyParser.json({ limit: '50mb' }));
// unrelated endpoints
const server = app.listen(port);
require('./services/socket')(server);
socket.js
let WebSocket = require('ws');
module.exports = function (server) {
    // 'ws' requires a port, an HTTP server, or the noServer option;
    // here the WebSocket server is attached to the existing HTTP server.
    let ws_server = new WebSocket.Server({ server });
    ws_server.on('connection', (connection) => {
        connection.on('message', (message) => {
            // Broadcast each message to every other open client.
            ws_server.clients.forEach((client) => {
                if (client.readyState === WebSocket.OPEN && client != connection) {
                    client.send(message);
                }
            });
        });
    });
};
Now the app is HTTPS but the websocket is not; how can I get the socket to also use SSL/certs?
In such a setup nginx needs to handle the SSL and be properly set up to work with web sockets. Here's an article from the nginx blog: NGINX as a WebSocket Proxy. The essential bits are:
location /wsapp/ {
    proxy_pass http://wsbackend;
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "Upgrade";
    proxy_set_header Host $host;
}
You might also want to set up timeouts for the connection.
And Express needs to trust the reverse proxy. Here's an article from them: Express behind proxies. If you are deploying in a secure environment you can just set it to app.set('trust proxy', () => true).
For reference here are the nginx timeouts I had to set up for my application:
proxy_connect_timeout 14d;
proxy_read_timeout 14d;
proxy_send_timeout 14d;
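To illustrate roughly what the trust proxy setting changes (a simplified sketch, not Express's actual implementation): once the proxy is trusted, the client address is derived from the left-most entry of the X-Forwarded-For header instead of the TCP peer address.

```javascript
// Simplified sketch of X-Forwarded-For handling. Express's real logic
// also walks the hop list and validates each entry against the
// trusted-proxy setting; this only shows the basic idea.
function clientIp(socketAddr, xForwardedFor, trustProxy) {
    if (!trustProxy || !xForwardedFor) return socketAddr;
    // Header format is "client, proxy1, proxy2": left-most is the client.
    return xForwardedFor.split(',')[0].trim();
}

console.log(clientIp('127.0.0.1', '203.0.113.7, 10.0.0.1', true));  // '203.0.113.7'
console.log(clientIp('127.0.0.1', '203.0.113.7, 10.0.0.1', false)); // '127.0.0.1'
```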

Socket.io and nginx CORS on production

I am trying to get an app that uses socket.io v3.1.1 to work on production.
It works well in development, using the webpack devServer for the client on port 3000 and nodemon for the server on port 4000.
But when I put it on the production server the client complains with:
Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at http://localhost:5003/socket.io/?EIO=4&transport=polling&t=NUmy2Us.
Server
import express from 'express'
import { createServer } from 'http'
import { Server } from 'socket.io'

const app = express()
const prod = process.env.NODE_ENV === 'production'
const port = process.env.PORT || (prod ? 5003 : 4000)
const httpServer = createServer(app)
const io = new Server(httpServer, {
    cors: {
        origin: '*',
        methods: ['GET', 'POST']
    }
})
const connections = []
io.on('connection', (socket) => {
    connections.push(socket)
    console.log(`Socket id ${socket.id} connected`)
    socket.on('disconnect', () => {
        connections.splice(connections.indexOf(socket), 1)
    })
})
httpServer.listen(port, () => console.log(`App listening on port ${port}.`))
....
Client
...
import { io } from 'socket.io-client'
const port = process.env.NODE_ENV === 'development' ? '4000' : '5003'
const socket = io(`http://localhost:${port}`)
This setup does work in development, but when I put it on production on port 5003 it throws the CORS error.
In the nginx server blocks I've got:
location /thisapp/ {
    auth_basic $authentication;
    auth_basic_user_file /var/www/.htpasswd;
    try_files $uri $uri/ =404;
}
# And other proxies for express routing
location /api/process {
    proxy_pass http://localhost:5003/api/process;
}
location /api/process/download {
    proxy_pass http://localhost:5003/api/process/download;
}
I know the app is listening on 5003 on the server.
The PM2 log shows:
App listening on port 5003.
When I look at the websockets tab in the network panel, I see different results on Dev and on Production (screenshots omitted).
The production server runs on https with Let's Encrypt, but this has never been an issue for other apps I have run; I wonder if socket.io needs me to do something about it though.
I tried multiple combinations of different approaches, but I always get the same CORS error.
I just ran into this last week - though not with Socket.io - so hopefully I can help.
Before the answer, a link to point you towards some reading on what's going on: https://developer.mozilla.org/en-US/docs/Web/Security/Same-origin_policy#how_to_allow_cross-origin_access
If your NGINX has access to use more_set_headers then try adding this inside your location block:
more_set_headers 'Access-Control-Allow-Origin: *';
If that works, you can next try paring that back further to:
more_set_headers 'Access-Control-Allow-Origin: http://localhost:5003';
If you don't have access to more_set_headers you can use add_header instead, but it's confusingly named: it doesn't only add headers; it will also remove any headers applied by blocks further up the hierarchy, like in your server block. more_set_headers will not remove those headers and is truly additive.
The syntax differs a bit. If you're forced to use add_header, try:
add_header Access-Control-Allow-Origin *;
Or more restrictive:
add_header Access-Control-Allow-Origin http://localhost:5003;
Finally, if you need to support multiple origins you can do it like this to have NGINX automatically return a header that is compatible with the origin making the request.
Outside your server block:
map $http_origin $allow_origin {
    ~^(http://localhost:5003|http://example.com)$ $http_origin;
}
Inside your location block:
add_header Access-Control-Allow-Origin $allow_origin;
I'm not 100% sure about the syntax for using map together with more_set_headers but this should get you 95% of the way there.
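The map's behaviour can be sanity-checked with a few lines of plain JavaScript that mimic it (the origin list is just the example above; the function name is mine):

```javascript
// Mimics the nginx map: echo the Origin header back only when it is
// whitelisted, otherwise return '' so no Access-Control-Allow-Origin
// header is sent at all.
const allowedOrigin = /^(http:\/\/localhost:5003|http:\/\/example\.com)$/;

function allowOrigin(origin) {
    return allowedOrigin.test(origin) ? origin : '';
}

console.log(allowOrigin('http://example.com'));  // 'http://example.com'
console.log(allowOrigin('http://evil.example')); // ''
```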
In the end, after a lot of back and forth, it turned out not to have anything to do with the headers.
I think my problem was twofold.
In Dev I am using the webpack devServer for the front end on port 3000 and nodemon for the backend on 4000, so I was not using Nginx or PM2, and it worked just fine.
On production, however, I did not have any Nginx block for socket.io, and PM2 was running in cluster mode with two instances. The moment I changed PM2 to a single instance and added the Nginx location block for socket.io, it started to work with this:
Server
import express from 'express'
import { createServer } from 'http'
import { Server } from 'socket.io'

const app = express()
const prod = process.env.NODE_ENV === 'production'
const port = process.env.PORT || (prod ? 5003 : 4000)
const httpServer = createServer(app)
const io = new Server(httpServer)
// Or, to make it also work on Dev:
// const io = new Server(httpServer, { cors: true })
const connections = []
io.on('connection', (socket) => {
    connections.push(socket)
    console.log(`Socket id ${socket.id} connected`)
    socket.on('disconnect', () => {
        connections.splice(connections.indexOf(socket), 1)
    })
})
httpServer.listen(port, () => console.log(`App listening on port ${port}.`))
Nginx
location /socket.io/ {
    proxy_pass http://localhost:5003/socket.io/;
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";
    proxy_set_header Host $host;
}
Client
import { io } from 'socket.io-client'
const socket = io()
// Or, to make it also work on Dev:
// const dev = process.env.NODE_ENV === 'development'
// const socket = dev ? io('http://localhost:4000') : io()

How to convert an HTTP stream to HTTPS?

My website runs via HTTPS at a public hoster and connects to a node server that is running on a Raspberry PI.
There is also a piece of hardware (let's call it a decoder) in the same network as the PI which sends a data stream via TCP. The purpose of the PI is to read that stream and send it via WebSocket to the browser, so the goal is to output that stream on my website.
Now I'm running into a mixed content problem and have no idea how to solve it.
What I have done so far is to install an nginx webserver on the PI and install a Letsencrypt certificate. Both are running fine (tested via a normal https:// call in a web browser).
The WebSocket connection without SSL works fine as well and I get the data, but with SSL it won't work. I guess the problem is that the decoder is not able to handle SSL.
So how can I "send", "convert", "tunnel" or "proxy" the non-SSL data stream to an HTTPS server?
Update
@Jake Holzinger: you are absolutely right, the given information was not enough. Sorry! I'll try to clarify:
nginx is without any further modification; it is the configuration that comes with the installation
the website (Angular) does a let connection = new WebSocket('wss://domain:port');
The node server looks as follows:
const net = require('net');
const fs = require('fs');
const https = require('https');
const date = require('date-and-time');
const config = require('./server-config.json');
const httpProxy = require('http-proxy');

// SSL SERVER
try {
    const privateKey = fs.readFileSync('/etc/letsencrypt/live/' + config.DNSROUTER + '/privkey.pem', 'utf8');
    const certificate = fs.readFileSync('/etc/letsencrypt/live/' + config.DNSROUTER + '/cert.pem', 'utf8');
    const ca = fs.readFileSync('/etc/letsencrypt/live/' + config.DNSROUTER + '/chain.pem', 'utf8');
    const options = {
        key: privateKey,
        cert: certificate,
        ca: ca
    };
    let proxy = httpProxy.createServer({
        target: 'ws://localhost:9030',
        ssl: {
            key: privateKey,
            cert: certificate
        }
    }).listen(9031);
}
catch (e) {
    console.log("LETSENCRYPT certificates not found! No SSL!");
    console.log(e);
}
/**
* server
*/
let CLIENTS = [];
let WebSocketServer = require('ws').Server;

// start WS via HTTP
const wss1 = new WebSocketServer({ port: 9030 });
wss1.on('connection', function (ws) {
    CLIENTS.push(ws);
    console.log('connection via HTTP');
    ws.on('close', function () {
        console.log('close HTTP!');
        CLIENTS = CLIENTS.filter(item => item != ws);
    });
});
/**
 * client
 */
let connect = function () {
    console.log(now(), 'Starting server...');
    const socket = net.createConnection({ port: config.PORT, host: config.HOST }, () => {
        console.log('Connected to server!');
    });
    socket.on('connect', () => {
        console.log(now(), 'Connected to server!');
        socket.on('data', data => {
            sendAll(data);
        });
    });
    socket.on('error', data => {
        console.log(now(), "Connection refused:", data.errno, data.code, "(IP:", data.address + ":" + data.port + ")");
        setTimeout(() => {
            socket.removeAllListeners();
            console.log(now(), "Reconnecting...");
            connect();
        }, 1000);
    });
    socket.on('end', () => {
        console.log(now(), 'Disconnected from server');
        console.log(now(), "Reconnecting...");
        socket.removeAllListeners();
        connect();
    });
};
connect();
I hope this gives a better impression. Thx for your help!
I have now solved this issue in a different way.
Instead of implementing a proxy server in node, I created the reverse proxy at the nginx webserver level to proxy all HTTPS -> HTTP calls to the PI. The code below is working fine for me now.
sudo nano /etc/nginx/sites-available/default
and change the content like this:
server {
    listen 9031 ssl;
    ssl_certificate /etc/letsencrypt/live/DOMAIN_DNS/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/DOMAIN_DNS/privkey.pem;

    location / {
        proxy_pass http://127.0.0.1:9030;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_read_timeout 86400;
    }
}

400 Bad Request while creating websocket between client and nodejs with Nginx as reverse proxy

I am trying to create a websocket using Nginx as a reverse proxy and nodejs at the back end, using the ws library in nodejs. When I test it with the wscat tool everything works fine, but as soon as I make the request from a browser I continuously receive a 400: Bad Request response from Nginx. I cannot find anything on the internet about this error with nginx and nodejs together (might be my bad). I did exactly as mentioned in the tutorial.
Following is my node configuration:
console.log("Server started");
var Msg = '';
var WebSocketServer = require('ws').Server
  , wss = new WebSocketServer({ port: 8010 });
wss.on('connection', function (ws) {
    ws.on('message', function (message) {
        console.log('Received from client: %s', message);
        ws.send('Server received from client: ' + message);
    });
});
Nginx conf:
http {
    map $http_upgrade $connection_upgrade {
        default upgrade;
        '' close;
    }
    upstream websocket {
        server 127.0.0.1:8010;
    }
    server {
        listen 8020;
        location / {
            proxy_pass http://websocket;
            proxy_http_version 1.1;
            proxy_set_header Upgrade $http_upgrade;
            proxy_set_header Connection $connection_upgrade;
        }
    }
}
Then I tried to work with socket.io from this tutorial. But I am confused about how to deliver the index.html file containing
<script src="/socket.io/socket.io.js"></script>
<script>
    var socket = io(); // your initialization code here.
</script>
to the browser, as I want only specific requests to open a web socket. I don't even understand how the browser is going to find socket.io.js, as I don't see where it is mentioned.
Please help.