I'm trying to create a Socket.IO server that has the following goals:
Accessible on the local network of virtual machines over HTTP (http://<server-local-ip>)
Accessible by users through the browser over the HTTPS protocol, also serving the socket.io.js bundle over HTTPS (https://socket-server.example.com)
Uses all available CPUs of the virtual machine (the server will run on a single VM) - (possible with PM2)
Restarts automatically in case of failure (possible with PM2)
For that I created a script based on the Socket.IO help article on using PM2, and on a question showing how to serve HTTP and HTTPS together.
/**
 * pm2 start basic.js -i 0
 */
const http = require("http");
const https = require("https");
const { Server } = require("socket.io");
const { createAdapter } = require("@socket.io/cluster-adapter");
const { setupWorker } = require("@socket.io/sticky");
const { readFileSync } = require("fs");

const httpServer = http.createServer();
const httpsServer = https.createServer({
  key: readFileSync("./localhost-key.pem"),
  cert: readFileSync("./localhost.pem")
});

const io = new Server(httpServer, {
  cors: {
    origin: "*",
    methods: ["GET", "POST"]
  }
});

io.adapter(createAdapter());
setupWorker(io);

io.on("connection", (socket) => {
  console.log(`connect ${socket.id}`);
});

// forward raw HTTPS requests and upgrades to the same Engine.IO instance
httpsServer.on("request", (req, res) => {
  io.engine.handleRequest(req, res);
});
httpsServer.on("upgrade", (req, socket, head) => {
  io.engine.handleUpgrade(req, socket, head);
});

httpServer.listen(8080);
httpsServer.listen(4430);
Using HTTP and HTTPS together always throws an error.
Over HTTPS I can't load the socket.io.js bundle. But since this service will be accessed from the browser, it has to be available to users over HTTPS.
Direct access via HTTPs displays:
{
  code: 0,
  message: "Transport unknown"
}
This happens with just the first part of the script, before even trying to run it with PM2.
When adding the PM2 part to the script, other errors appear:
I have to remove the line httpServer.listen(3000); for HTTP to work
When I connect over HTTPS the client never finds the session, so it keeps trying to reconnect endlessly
socket.io.js over HTTPS remains unreachable
Even loading socket.io.js over HTTP and connecting with <script src="http://localhost:8080/socket.io/socket.io.js"></script> <script>const socket = io('https://localhost:3001');</script>, nothing works
However, if I run all of this over HTTP only, without requiring HTTPS, it works perfectly.
What am I doing wrong that keeps HTTP and HTTPS from working together?
Will I have to expose the server over HTTP only and create an NGINX proxy that terminates HTTPS and calls the HTTP server?
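As a point of comparison, a pattern sometimes suggested for this setup is to attach the same Socket.IO Server instance to both Node servers with io.attach(), rather than forwarding raw request/upgrade events to io.engine by hand. This is a sketch under that assumption, not a verified fix for the "Transport unknown" error:

```javascript
const http = require("http");
const https = require("https");
const { readFileSync } = require("fs");
const { Server } = require("socket.io");

const httpServer = http.createServer();
const httpsServer = https.createServer({
  key: readFileSync("./localhost-key.pem"),
  cert: readFileSync("./localhost.pem")
});

// Create the Socket.IO server on the HTTP server first...
const io = new Server(httpServer, {
  cors: { origin: "*", methods: ["GET", "POST"] }
});

// ...then let Socket.IO install its own request/upgrade handlers on the
// HTTPS server too, so both ports serve /socket.io/socket.io.js and share
// one Engine.IO instance.
io.attach(httpsServer);

httpServer.listen(8080);
httpsServer.listen(4430);
```

With this wiring, the handleRequest/handleUpgrade forwarding from the original script is unnecessary, since attach() registers equivalent handlers itself.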
http-proxy module doesn't work with create-react-app, but works with serve -s build
This is my proxy-server code.
What it does: it joins two API servers on different ports and the frontend onto a single port 80. Opening localhost:80/* serves the React frontend (port 3000); /api returns data from port 4000 and /secondapi from port 1000.
Both backend API servers open completely fine through it.
When I serve the frontend with the serve module, the proxy also works fine and returns my frontend.
But if I start the frontend on the same port 3000 using "npm start", my proxy server returns connect ECONNREFUSED ::1:3000
const httpProxy = require('http-proxy');
const http = require('http');
const { maintenanceHtml } = require('./maintenanceHtml');

const proxy = httpProxy.createServer();

const guiUrl = 'http://localhost:3000';   // react frontend app
const apiUrl = 'http://localhost:4000';   // 1st api server
const apiPrefix = '/api';
const fnApiUrl = 'http://localhost:1000'; // 2nd api server
const fnApiPrefix = '/secondapi';
const port = 80;

// register the error handler once, outside the request handler, so a new
// listener is not added on every request
proxy.on('error', (error, req, res) => {
  console.log(error.message);
  res.end(maintenanceHtml);
});

http.createServer((req, res) => {
  let target = guiUrl;
  if (req.url.startsWith(apiPrefix)) {
    req.url = req.url.replace(apiPrefix, '/');
    target = apiUrl;
  }
  if (req.url.startsWith(fnApiPrefix)) {
    req.url = req.url.replace(fnApiPrefix, '/');
    target = fnApiUrl;
  }
  proxy.web(req, res, { target });
}).listen(port, () => {
  console.log(`Proxy server has started on port ${port}`);
});
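As an aside, the prefix rewrite in this proxy has a subtle flaw unrelated to the ECONNREFUSED error: replace(apiPrefix, '/') produces a double slash ('/api/users' becomes '//users'). Slicing off the prefix avoids that; the helper name below is ours, not from the original code:

```javascript
// '/api/users'.replace('/api', '/') yields '//users'; stripping the prefix
// and falling back to '/' for an exact match is safer.
function stripPrefix(url, prefix) {
  return url.slice(prefix.length) || '/';
}

console.log('/api/users'.replace('/api', '/')); // → '//users'
console.log(stripPrefix('/api/users', '/api')); // → '/users'
console.log(stripPrefix('/api', '/api'));       // → '/'
```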
I think there is a React dev-server setting that I'm not able to find.
Here is a small example that you can run on your own PC:
https://github.com/b2b-Alexander/react-js-problem
Found the solution on GitHub: https://github.com/vitejs/vite/discussions/7620
I had installed the new Node.js v18.12.1, while my main machine has v16.14.0, so I rolled back the Node.js version for my project.
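This matches a known Node.js behavior change: from Node 17 on, localhost may resolve to the IPv6 address ::1 first, while the CRA dev server listens only on IPv4, hence connect ECONNREFUSED ::1:3000. Besides downgrading Node, pinning the proxy targets to the IPv4 loopback is a commonly suggested workaround (the helper name here is ours):

```javascript
// Hypothetical helper: rewrite proxy targets to the IPv4 loopback address so
// they keep working when Node resolves `localhost` to `::1` (Node >= 17).
function toIPv4Loopback(url) {
  return url.replace('localhost', '127.0.0.1');
}

const guiUrl = toIPv4Loopback('http://localhost:3000');
console.log(guiUrl); // → 'http://127.0.0.1:3000'
```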
Hello everybody, and thanks in advance for your answers.
I have a website served by Node.js, listening on port 300 for HTTP and 443 for HTTPS:
const fs = require('fs');
const https = require('https');
const http = require('http');
const app = require('../app');
const env = require(`../environment/${process.env.NODE_ENV}`);

const httpServer = http.createServer((req, res) => {
  res.writeHead(301, { Location: `https://${req.headers.host.split(':')[0] + ':' + env.portHttps}${req.url}` });
  res.end();
}).listen(env.portHttp);

const options = {
  key: fs.readFileSync(env.key),
  cert: fs.readFileSync(env.cert),
};

const httpsServer = https.createServer(options, app).listen(env.portHttps);
This code is from a tutorial and I guess I don't understand it well, because I was expecting to reach my site at localhost:300 or localhost:443; instead, every request in Google Chrome redirects to https://localhost/ and I don't get why.
So it works fine, but I'd like to understand why the redirection happens - otherwise, why call .listen(port) at all?
PS: I have an Angular app launched with a proxy:
{
  "/": {
    "target": "https://localhost",
    "changeOrigin": true,
    "secure": false
  }
}
I know the purpose of this proxy; I only wonder why the redirection happens, and the tutorial I followed doesn't explain it.
You are creating an http server on port 80 and sending back a 301 HTTP Redirect to all requests.
These redirects work by returning status code 301 and an HTTP header Location with where you'd like the browser to go to. So when you visit http://localhost/abc you receive 301, Location: https://localhost/abc and your browser takes you there.
In the dev tools, if you go to the network tab and enable "preserve log" you should see both the redirect and the actual application page load.
const httpServer = http.createServer((req, res) => {
  // 301 is the http status code
  // the Location header contains where to go to
  res.writeHead(301, { Location: `https://${req.headers.host.split(':')[0] + ':' + env.portHttps}${req.url}` });
  res.end();
}).listen(env.portHttp);
At the bottom of the file you set up the https server which listens on port 443 and serves your application.
Secure (TLS) transfers are done over port 443, the standard port for HTTPS traffic, so when the browser connects on port 443 it shows the secure lock. The 301 is simply the status code of the redirect response sent to every plain-HTTP request.
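To see exactly what the browser receives, the Location computation from the handler can be pulled into a pure helper (the function name is ours, not from the tutorial):

```javascript
// Same string construction as in the handler above: keep the hostname from
// the Host header, drop any incoming port, and append the HTTPS port.
function buildRedirectLocation(host, portHttps, url) {
  return `https://${host.split(':')[0] + ':' + portHttps}${url}`;
}

console.log(buildRedirectLocation('localhost:300', 443, '/abc'));
// → 'https://localhost:443/abc'
```

Note that when the HTTPS port is the default 443, the browser hides it from the address bar, which is why you see just https://localhost/ rather than https://localhost:443/.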
I need to enable compression for websocket communication (using socketio). I did this by setting the perMessageDeflate to true which worked fine in my development environment and on a staging server (verified with Wireshark). However, on our production server the websocket connection fails and socketio (v4.4.1) falls back to polling. Both use nginx as a reverse proxy with the same configuration. Chromium console shows
WebSocket connection to 'wss://***/ws-test/socket.io/?EIO=4&transport=websocket&sid=9P4EelJhF0CcxvwNAAAE' failed: Error during WebSocket handshake: Unexpected response code: 400
I created a minimal sample that shows the same behaviour. Nodejs app on the server:
const express = require('express');
const app = express();
const http = require('http');
const server = http.createServer(app);
const { Server } = require("socket.io");
const io = new Server(server, {
  perMessageDeflate: true
});

app.get('/', (req, res) => {
  res.sendFile(__dirname + '/index.html');
});

server.listen(3001, () => {
  console.log('listening on *:3001');
});
Client:
...
  <script src="socket.io/socket.io.js"></script>
  <script>
    var socketio_path = window.location.pathname + 'socket.io';
    var socket = io({ path: socketio_path });
  </script>
</body>
The error 400 seems to come from the Node.js server, not from nginx. I enabled all kinds of logging but couldn't find any related messages. Software versions are also reasonably close and up to date: nginx 1.18.0 on both (staging and production), Node.js 14.19.0/14.18.1 (staging/prod). Do you have any ideas that could help make this work on the production server?
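As a debugging aid for this kind of fallback, the standard Socket.IO client API can report which transport ended up being used; a sketch building on the client snippet above:

```javascript
// Client-side: log the transport in use, and any later upgrade.
var socket = io({ path: socketio_path });

socket.on("connect", () => {
  // "polling" when the websocket handshake failed, "websocket" otherwise
  console.log("transport:", socket.io.engine.transport.name);
  socket.io.engine.on("upgrade", (transport) => {
    console.log("upgraded to:", transport.name);
  });
});
```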
It turned out, the issue is the Azure Application Gateway used on the production server. When it's bypassed, the websocket connection (with perMessageDeflate enabled) works fine.
Switching to a newer version of the Azure Application Gateway (v2 instead of v1) solved the issue in the end.
So I have created an application that uses a websocket in node.
In my server.js I use:
import http from "http";
import WebSocket from "websocket";
[...]
var httpServer = http.createServer(this.handleRequest);
httpServer.listen(port, function () {
  console.log("Listening on port " + port);
});
var server = new WebSocket.server({ httpServer: httpServer });
[...]
This works and creates a socket server on the same URL.
Now I'm trying to get this onto a Windows server with IIS.
I start the application with "node server.js" and it runs on port 5003.
In IIS I use a rewrite rule to forward all incoming requests to the Node server. Works perfectly.
Now the problem: when I install a certificate with Let's Encrypt (win-acme), the website is secure, but it won't connect to the websocket, because the websocket is not secure.
According to some findings on the internet, I need to use the Node https module:
import https from "https";
import WebSocket from "websocket";
import fs from "fs";
[...]
const options = {
  key: fs.readFileSync("my-site-key.pem"),
  cert: fs.readFileSync("chain.pem")
};
var httpsServer = https.createServer(options, this.handleRequest);
httpsServer.listen(port, function () {
  console.log("Listening on port " + port);
});
// the websocket package expects the option name httpServer even for an https server
var server = new WebSocket.server({ httpServer: httpsServer });
[...]
The problem is: how do I get valid certificate files? I can't self-sign them, because then another error will pop up, I guess, and I can't find the files Let's Encrypt created.
So... how do I create a secure websocket?
I've set up a simple HTTPS server to handle the following two situations:
Requests to https://localhost:5000/ that have a matching file in my directory are served via connect.static(__dirname). This works great for things like my index.html and my CSS files, and works exactly as I need.
Requests to https://localhost:5000/api should be proxied to https://subdomain.mydomain.com:443/api.
The proxy is properly transferring everything over HTTPS and the SSL handshake part seems to be working exactly as I would expect. The problem is that my API uses the subdomain to determine what database to connect to and what data to return. So, my API sees the request
https://localhost:5000/api/something
instead of
https://subdomain.mydomain.com/api/something
and is throwing an error telling me I have to supply the subdomain.
How can I tell the node proxy to forward (or use) the domain/subdomain when doing the proxy?
Here is my code:
var fs = require('fs');
var connect = require('connect'),
    https = require('https'),
    httpProxy = require('http-proxy'),
    options = {
      key: fs.readFileSync('key.pem'),
      cert: fs.readFileSync('cert.pem')
    },
    endpoint = {
      host: 'subdomain.mydomain.com',
      port: 443,
      prefix: '/api',
      target: { https: true }
    };

var proxy = new httpProxy.RoutingProxy();

var app = connect()
  .use(connect.logger('dev'))
  .use(function (req, res, next) {
    if (req.url.indexOf(endpoint.prefix) === 0) {
      proxy.proxyRequest(req, res, endpoint);
    } else {
      next();
    }
  })
  .use(connect.static(__dirname));

https.createServer(options, app).listen(5000);
console.log('Listening on port 5000');
Just in case someone bumps into this old question: you should use http-proxy's changeOrigin option, which rewrites the Host header of the proxied request to match the target, so the backend sees the subdomain instead of localhost.
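With the modern http-proxy API this would look like the commented sketch below (an assumption, not code from the original answer); the runnable helper underneath shows conceptually what changeOrigin does to the outgoing headers:

```javascript
// Sketch with the current http-proxy API (hypothetical adaptation):
//
//   const httpProxy = require('http-proxy');
//   const proxy = httpProxy.createProxyServer({
//     target: 'https://subdomain.mydomain.com',
//     changeOrigin: true
//   });
//   // then inside the connect middleware: proxy.web(req, res);

// Conceptually, changeOrigin rewrites the Host header of the proxied
// request to the target's host, so the backend sees the real subdomain.
function withChangeOrigin(reqHeaders, targetHost) {
  return { ...reqHeaders, host: targetHost };
}

console.log(withChangeOrigin({ host: 'localhost:5000' }, 'subdomain.mydomain.com'));
// → { host: 'subdomain.mydomain.com' }
```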