nginx+nodejs+socket.io ERR_CONNECTION_TIMED_OUT - node.js

I have tried almost every solution I could find in forums and blogs, but with no luck, which is why I'm asking for help now.
Here's the situation: I'm currently on Ubuntu and have been running 2 socket apps on it, both working perfectly, but when I tried to add one more socket the problem arose (ERR_CONNECTION_TIMED_OUT).
Here is my NGINX setup for the third socket:
upstream stream {
    server localhost:3210;
}

server {
    location /socket.io {
        proxy_pass http://stream;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
This is the exact same nginx setup I used for my first 2 apps, which is why I'm having a hard time debugging it. The same goes for the nodejs server:
http.listen(3210, function(){
    console.log('Listening on Port 3210');
});
and on the front-end:
var socket = io.connect('http://testapp.com:3210');

This seems incorrect:
var socket = io.connect('http://testapp.com:3210');
Port 3210 is what Express is listening on, and given that you're proxying through Nginx, I'd expect the client to connect to Nginx, not Express:
var socket = io.connect('http://testapp.com');
(provided that Nginx is running on port 80)
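For reference, the server side behind this kind of proxy usually looks something like the sketch below; this is an assumption for illustration, since the question only shows the listen call, not the Express/socket.io wiring.
// Minimal sketch (assumed, not the original app): Express + socket.io bound to
// port 3210, with nginx on port 80 proxying /socket.io to it.
var express = require('express');
var app = express();
var http = require('http').Server(app);
var io = require('socket.io')(http);

io.on('connection', function (socket) {
    console.log('client connected through the nginx proxy');
});

http.listen(3210, function () {
    console.log('Listening on Port 3210');
});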

Related

502 Bad Gateway when connecting to Nodejs app running express through Nginx

I'm having issues connecting to my node app that is running on port 8081.
My setup is as follows (everything runs on a Raspberry Pi):
NGINX
events {
    worker_connections 1024;
}

http {
    server {
        root /data/web;

        location / {
        }

        location /pub {
            proxy_pass http://localhost:8081;
            proxy_http_version 1.1;
            proxy_set_header Upgrade $http_upgrade;
            proxy_set_header Connection 'upgrade';
            proxy_set_header Host $host;
            proxy_cache_bypass $http_upgrade;
        }
    }
}
I'm serving static files with the first location (which seems to be working fine), and I would like the second location to reroute to my node app, which is running on port 8081.
My node app looks like this:
app.get('/', function(req, res){
    res.send("Hello World!");
});

var server = app.listen(8081, '192.168.0.178');
And I'm testing my connection using a simple wget from another pc in the LAN:
wget http://192.168.0.178/pub
The full error I get is this:
http://192.168.0.178/pub
Connecting to 192.168.0.178:80... connected.
HTTP request sent, awaiting response... 502 Bad Gateway
2018-01-14 15:42:27 ERROR 502: Bad Gateway.
SOLUTION
The accepted answer did indeed describe the problem I was having.
Another thing I added was a rewrite in my /pub location, because '/pub' needs to be cut off from the URL going to the Node app. So the final nginx conf looks like this:
http {
    access_log /data/access_log.log;
    error_log /data/error_log.log debug;

    upstream backend {
        server localhost:8081;
    }

    server {
        root /data/web;

        location / {
        }

        location /pub {
            proxy_pass http://localhost:8081;
            rewrite /pub(.*) /$1; break;
        }
    }
}
The problem seems related to the network interface you are exposing the nodejs app on. You have set up the app to listen on port 8081 on the interface with IP 192.168.0.178, but nginx is proxying through the loopback interface, given the directive
proxy_pass http://localhost:8081;
You can solve this issue by exposing the nodejs app on the loopback interface:
var server = app.listen(8081, 'localhost');
The node app will then no longer be reachable directly on port 8081 from any machine other than the one the app is running on.
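To make the fix concrete, here is a minimal sketch of the Pi app after that change; the shape of the app is assumed, since the question only shows the route and the listen call.
// Minimal sketch (assumed): Express bound to the loopback interface so only
// nginx on the same host can reach it.
var express = require('express');
var app = express();

// With the rewrite from the final config above, /pub is stripped before
// proxying, so this '/' route answers wget http://192.168.0.178/pub.
app.get('/', function (req, res) {
    res.send('Hello World!');
});

var server = app.listen(8081, 'localhost', function () {
    console.log('Express listening on http://localhost:8081');
});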

Nodejs express application on nginx linux server

I have created my first nodejs application with nodejs express.
The server is
var express = require("express");
global.app = express();
require("src/app-modules/common/yamaha/yamaha.controller.api");

app.listen(8080, function() {
    console.log("Listening on http://127.0.0.1:8080");
});
On my Windows development machine it is working.
Now I'm trying to publish this app on a Synology NAS.
I access this application at the URL //192.168.1.151/YamahaCtrl
and I get a 404 error.
Update 2: I discovered what the problem really is: how to configure the nginx server on the NAS.
Routes are defined in yamaha.controller.api like this one:
app.get("/api/yamaha/getBasicInfo", function (req, res) {
    //do something
});
I found a nodejs 8 beta package to install on the NAS, and now I have nodejs version 8. But the error still remains.
So the nodejs app server is listening on localhost:8080, but the URL I access is http://192.168.1.151/YamahaCtrl, on port 80.
That means the problem is how to configure the virtual host on the NAS, and which port the node server should use.
Update: The problem is that I need to configure a port redirect on the nginx server installed on the Synology. I found this article:
https://www.digitalocean.com/community/tutorials/how-to-set-up-a-node-js-application-for-production-on-ubuntu-14-04
and I created the configuration file /etc/nginx/sites-available/default:
server {
    listen 80;
    server_name _;

    location / {
        proxy_pass http://127.0.0.1:8080;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
    }
}
And it's still not working: the HTML files are accessible, but the API is only available on port 8080.
I figured out what the problem was: the nodejs server is serving the static files too, but my config for static files was wrong. In development it was working, and that's why I didn't know it was wrong.
So there's no need to configure anything on the NAS :). Starting the nodejs application is enough.
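The post doesn't show the broken static-file config, but a typical fix looks like the sketch below; the 'public' directory name and the path handling are illustrative assumptions, not taken from the original app.
// Illustrative sketch: serve static files from an absolute path so the app
// behaves the same on the Windows dev machine and on the NAS.
var path = require('path');
var express = require('express');
var app = express();

app.use(express.static(path.join(__dirname, 'public')));

app.listen(8080, function () {
    console.log("Listening on http://127.0.0.1:8080");
});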

express-ws doesn't handle call when deployed to VPS server

I have a back-end written using node.js + express with the express-ws dependency.
Locally everything works like it should. Previously it was deployed to Red Hat OpenShift, also without any problems. Yesterday I bought a VPS, configured it and deployed there. Everything works except websockets.
I have nginx with SSL enabled, and its config contains the following lines related to the server:
server {
    listen ipaddresshere:80 default;
    server_name _;

    location / {
        proxy_pass http://ipaddresshere:8080;
    }

    location /ws {
        proxy_pass http://ipaddresshere:8080;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
I have other config files, but they were generated by VestaCP and https://certbot.eff.org/.
What I know is that requests to the /ws route are reaching the node.js app (I am logging them). But they don't go to this handler:
app.ws('/ws', SocketsHandler.registerWs);
In the end they match my last handler and return 404:
app.get('*', ErrorHandler.notFound);
The question: what could cause the WS library not to work in the VPS environment, while I don't see any error in the console...?
P.S. Locally I run the app without SSL and nginx.
wsServer.on('connection', function (socket) {...})
I found that my config was overridden by some other file, so instead of Connection "upgrade", the server was receiving Connection "close".
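For anyone debugging a similar setup, a sketch like the one below makes it easy to see which Connection header actually reaches the app; the logging middleware is purely illustrative and not part of the original post.
// Illustrative debugging sketch: log the headers that reach the app on /ws,
// to spot a proxy that turns Connection: upgrade into Connection: close.
var express = require('express');
var app = express();
require('express-ws')(app);

app.use('/ws', function (req, res, next) {
    console.log('Connection header:', req.headers.connection);
    console.log('Upgrade header:', req.headers.upgrade);
    next();
});

app.ws('/ws', function (ws, req) {
    ws.on('message', function (msg) {
        console.log('ws message:', msg);
    });
});

app.get('*', function (req, res) {
    res.status(404).send('Not found');
});

app.listen(8080);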

Using node, socket.io, ember.js and nginx

I have Nginx serving my ember build as follows:
server {
    listen 80;
    root /home/ubuntu/my-app/dist;
    index index.html;

    location /api {
        proxy_pass http://127.0.0.1:3000;
    }

    location / {
        try_files $uri $uri/ /index.html;
    }
}
I want to add a chat using socket.io, but I already have a REST API on port 3000.
I'm wondering what the best way to architect this is.
I thought I could add another location as follows:
location /socket.io {
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";
    proxy_http_version 1.1;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header Host $host;
    proxy_pass http://socket_nodes;
}
But it's not working. If someone could point me in the right direction, that'd be really great. Thanks in advance.
If you're using node as your web server, then socket.io shares the same port and IP as the web server and your REST API.
A socket.io connection starts out as a regular http request to the /socket.io path. The socket.io library hooks into your web server to handle that specific http request. After a couple back and forth, the two ends agree to "upgrade" the protocol from http to webSocket and then the conversation continues as the webSocket protocol but still on the same IP and port as your webServer operates on.
All this can work fine with nginx as a proxy if you configure nginx as specified in the configuration link I gave you earlier, so that it proxies all the right things, and if socket.io is hooked into your nodejs server properly.
There's really no architectural changes to make as the web requests and socket.io connections both operate through the same web server without you having to do anything. The socket.io connection just makes an http request to the /socket.io path with some special HTTP headers set. The socket.io server code just hooks into your web server to handle that specific request and take it from there. The rest of your REST API calls are just handled by the same mechanism you already have. So, as long as you don't try to define an API call for /socket.io, the two will happily stay out of each other's way, just like the handlers for two different routes on your web server stay out of each other's way. You can see a lot more about how incoming socket.io calls work in this answer.
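To make that concrete, here is a minimal sketch of the arrangement described above; the route names are illustrative, not taken from the question. The REST API and socket.io share one http server and one port.
// Illustrative sketch: REST routes and socket.io sharing one server on port 3000.
var express = require('express');
var app = express();
var http = require('http').Server(app);
var io = require('socket.io')(http);

// A regular REST route, proxied by nginx's /api location.
app.get('/api/messages', function (req, res) {
    res.json([]);
});

// socket.io hooks into the same http server and answers /socket.io itself.
io.on('connection', function (socket) {
    console.log('socket.io client connected');
});

http.listen(3000);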
So I finally got this to work and thought I'd share my findings.
Nginx:
For my api proxy, I can actually share the same port as my node API; I just needed to add the protocol version and the upgrade headers.
location /api {
    proxy_pass http://127.0.0.1:3000;
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";
}
server.js
var express = require('express');
var app = express();
var http = require('http').Server(app);
var io = require('socket.io')(http);

app.get('/', function(req, res) {
    res.sendFile(__dirname + '/index.html');
});

io.on('connection', function(socket) {
    console.log("[NOTIFICATION] - New connection");
    io.sockets.emit("message", { message: "New connection" });
    socket.on("send", function(data) {
        io.sockets.emit("message", { message: data.message });
    });
});

http.listen(3000);
Ember:
https://github.com/keydunov/emberjs-socketio-chat
is a pretty good example of using socket.io.
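A hedged browser-side counterpart (not shown in the original post) connects to the page's own origin so the handshake goes through nginx instead of straight to port 3000; the event names match the server.js above, and whatever path socket.io uses must be covered by a proxied location with the upgrade headers.
// Illustrative client sketch: connect to the same origin (port 80) and let
// nginx proxy the socket.io traffic to the node app on port 3000.
var socket = io();

socket.on('message', function (data) {
    console.log('message:', data.message);
});

socket.emit('send', { message: 'hello from the browser' });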

Mixed content error when proxying websocket through nginx with SSL

I am working on a node.js application using express to serve content and socket.io for websocket communication. The setup has been working fine, but now I want to be able to access the websocket via SSL, too. I thought using nginx (which we already used for other stuff) as a proxy was a good idea, and configured it like this:
upstream nodejs {
    server 127.0.0.1:8080;
}

server {
    listen 443 ssl;
    ssl_certificate /etc/nginx/ssl/server.crt;
    ssl_certificate_key /etc/nginx/ssl/server.key;
    server_name _;

    location / {
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header Host $http_host;
        proxy_set_header X-NginX-Proxy true;
        proxy_pass http://nodejs;
        proxy_redirect off;
    }
}
The node.js server is set up like this:
var express = require('express'),
    http = require('http'),
    app = express();

// some app configuration that I don't think matters

var server = http.createServer(app).listen(8080);
var io = require('socket.io').listen(server);

io.configure(function() {
    io.set('match original protocol', true);
    io.set('log level', 0);
    io.set('store', redisStore); // creation of redisStore not shown
});
Both nginx and node.js run inside a Vagrant box which forwards port 443 (which nginx listens on) to port 4443 on the host system.
With this setup, navigating to https://localhost:4443 (using Firefox 23) gives me access to the files served by Express, but when socket.io tries to connect to the socket, it throws the following error:
Blocked loading mixed active content "http://localhost:4443/socket.io/1/?t=1376058430540"
This outcome is sadly obvious, as it tries to load the JS file via HTTP from inside an HTTPS page, which Firefox does not allow. The question is why it does so in the first place.
Socket.io tries to determine which protocol is used to access the web page, and uses the same protocol in the construction of the above URL. In this case, it thinks it is being accessed over HTTP, which may be the result of being proxied. However, as I understand, setting match original protocol to true in the socket.io config is supposed to help in situations like this, but it does not in my case.
I have found numerous questions and answers here about websocket proxying, but none that deal with this particular issue. So I'm pretty much at wit's end, and would really appreciate some advice.
Change match original protocol to match origin protocol:
io.configure(function() {
    //io.set('match original protocol', true);
    io.set('match origin protocol', true);
    ...
});
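On the client side, a sketch like the following (not from the original question) also keeps the handshake on the page's protocol, which helps with socket.io 0.9 behind a TLS-terminating proxy:
// Illustrative client sketch: build the connection URL from the page's own
// protocol and host so the socket.io handshake stays on https:// behind nginx.
var socket = io.connect(window.location.protocol + '//' + window.location.host);

socket.on('connect', function () {
    console.log('connected via', window.location.protocol);
});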
