express-ws doesn't handle the call when deployed to a VPS server - node.js

I have a back end written in node.js + express with the express-ws dependency.
Locally everything works as it should. Previously it was deployed to Red Hat OpenShift, also without any problems. Yesterday I bought a VPS, configured it and deployed there. Everything works except WebSockets.
I have nginx with SSL enabled, with the following lines in the config related to the server:
server {
    listen ipaddresshere:80 default;
    server_name _;

    location / {
        proxy_pass http://ipaddresshere:8080;
    }

    location /ws {
        proxy_pass http://ipaddresshere:8080;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
I have other config files, but they were generated by VestaCP and https://certbot.eff.org/
What I know is that the request to the /ws route reaches the node.js app (I am logging it), but it doesn't go to this handler:
app.ws('/ws', SocketsHandler.registerWs);
Instead it falls through to my last handler and returns 404:
app.get('*', ErrorHandler.notFound);
The question: what could cause the WS library not to work in the VPS environment, without showing any error in the console?
P.S. Locally I run the app without SSL and nginx, and the connection handler fires:
wsServer.on('connection', function (socket) {...})

I found that my config was being overridden by some other file, so instead of Connection "upgrade" the server was receiving Connection "close".
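A quick way to spot this from the node side is to inspect the relevant headers before express-ws sees the request. A small sketch (the helper name is hypothetical, not part of express-ws):

```javascript
// Given an incoming request's headers, report whether it still looks like
// a WebSocket upgrade by the time it reaches node. If a proxy rewrote
// Connection to "close", express-ws will never treat it as a WS request.
function isWebSocketUpgrade(headers) {
  const connection = (headers['connection'] || '').toLowerCase();
  const upgrade = (headers['upgrade'] || '').toLowerCase();
  return connection.split(',').map((s) => s.trim()).includes('upgrade')
      && upgrade === 'websocket';
}
```

Dropping something like `app.use((req, res, next) => { console.log(isWebSocketUpgrade(req.headers), req.headers); next(); })` in front of the routes makes a rewritten Connection header visible immediately.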


Nginx reverse proxy and node server: where do you see the console.log output from the upstream node server?

I come from the Windows Server world. This is my first time using Nginx and so far it's impressive, but I have an issue and that's why I'm here ;)
I have Ubuntu 20.04 and Nginx, and I deployed an expressjs app (node), following the tutorials out there to set up Nginx as a reverse proxy. It works and I'm able to see and run/interact with my expressjs app routes (home, signin, signup...etc).
My only issue is I can't see the results of the console.log calls I placed in my routes (my node server). Example GET route:
app.get('/', function (req, res) {
    console.log("db is connected, user is logged in and we're ready to go....wait...another issue with Nginx I need to research and I can't find in the docs");
    res.send('hello world');
});
How do I view the output of that route if my node server is behind the Nginx server? Is there a way to make Nginx display those logs to me (the admin)?
There are countless tutorials out there about how to add nginx as a proxy, but not one of them talks about this important issue. If you have a link that I can read, or if you're experienced with Nginx reverse-proxy log operations, please explain in detail or show me in code how you view the console.log output from the expressjs routes.
So, my question is: how do I see the console.log output from the example above? Is it an Nginx parameter? Here is my server setup from /etc/nginx/sites-available/default (I deleted irrelevant parts):
server {
    listen 80 default_server;
    listen [::]:80 default_server;

    # Add index.php to the list if you are using PHP
    index index.html index.htm index.nginx-debian.html;
    server_name _;

    location / {
        proxy_pass http://localhost:8080;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
    }
}
Is there a directive in the location/server context that I need in order to pass logs from the node/expressjs server to the nginx shell, or anywhere I can see what's going on with the node/express routes? Is it somewhere in the nginx logs that I need to be aware of?
For anyone starting with this stack, running pm2 logs from inside your app directory will print your app's logs to your console:
cd my_app_directory && pm2 logs

How to set up a node server (with nginx sat on top) to work with mongodb

I have a node express application which communicates with mongodb and serves back the response in JSON format after doing some processing. The application works as expected when run on a local machine.
This is how my connect code looks:
await MongoClient.connect(uri, async function (err, client) {
    ...
});
However, I have deployed the application to an aws ec2 instance following this tutorial, where I added nginx as a layer on top of my node application. Now I get a 504 Gateway Time-out on any route that tries to connect to mongodb.
The server block in my nginx configuration:
server {
    listen 80 default_server;
    listen [::]:80 default_server;
    server_name localhost;
    root /usr/share/nginx/html;

    location / {
        proxy_pass http://127.0.0.1:3000;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
    }
}
I understand that mongodb does not use HTTP, which is what nginx uses for communication. So I have tried to follow this tutorial but have had no luck.
Can anybody point me in the right direction?
Turns out I had completely forgotten to whitelist my server's IP address when I deployed the app to an ec2 instance, which is why everything worked as expected locally (my local IP address was whitelisted).
This had nothing to do with NGINX. My mistake.
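Even though the root cause was the IP whitelist and not NGINX, the 504 itself came from nginx timing out while the route hung forever on an unreachable database. A fail-fast wrapper makes that kind of hang show up in your own logs instead (a sketch; the helper name and the timeout value are arbitrary):

```javascript
// Race a promise against a timer so a hung operation (e.g. a DB connect to
// an unreachable host) rejects quickly with a descriptive error instead of
// letting nginx return a bare 504.
function withTimeout(promise, ms, label = 'operation') {
  return Promise.race([
    promise,
    new Promise((_, reject) =>
      setTimeout(() => reject(new Error(`${label} timed out after ${ms}ms`)), ms)
    ),
  ]);
}
```

For example, `await withTimeout(MongoClient.connect(uri), 5000, 'mongo connect')` would have surfaced a clear timeout error within seconds; the mongodb driver also has a `serverSelectionTimeoutMS` option that serves the same purpose.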

"TypeError:Networkerror when attempting to fetch resource" when my remote client tries to communicate with my remote server

I have uploaded a React client to DigitalOcean with an SSL certificate to enable HTTPS. I also have uploaded my Express server to Amazon's AWS. The reason for the different host providers is that I wasn't able to upload my client to AWS so I made the switch to DigitalOcean.
The server works great and I get normal responses from it when I use the client from my machine. However, the exact same code doesn't work from DigitalOcean's Nginx server. I get:
TypeError: NetworkError when attempting to fetch resource
but no response error code. The GraphQL/fetch requests aren't visible on the server, so either they aren't being sent correctly or the server cannot accept them correctly.
I played around with "proxy" in the client's package.json and the HOST/PORT/HTTPS attributes as seen here, but I realized these have no effect in production.
I have no idea how to fix this. My only guess is that the client uses HTTPS while the server doesn't, but I haven't found info on whether that's a problem.
This is my client's Nginx server configuration:
server {
    listen 80 default_server;
    server_name example.com www.example.com;

    location / {
        proxy_pass http://localhost:3000;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
    }
}
If your client lives on a domain different from your API server, then you need to make sure you have CORS headers enabled on the API server; otherwise the browser will refuse to load the content.
See here for more information regarding CORS headers.
Also try turning off credentials when running the client on DigitalOcean; storing a cookie on your client may not have been possible.
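For reference, the CORS behaviour described above boils down to the server echoing back an allowed Origin. A minimal sketch of that header logic (the helper is hypothetical; in Express you would normally just use the `cors` middleware):

```javascript
// Build the CORS response headers for a request, echoing the Origin back
// only when it is on the whitelist. Browsers block cross-origin fetch()
// responses that lack Access-Control-Allow-Origin.
function corsHeaders(origin, allowedOrigins) {
  if (!allowedOrigins.includes(origin)) return {};
  return {
    'Access-Control-Allow-Origin': origin,
    'Vary': 'Origin',
    'Access-Control-Allow-Methods': 'GET, POST, OPTIONS',
    'Access-Control-Allow-Headers': 'Content-Type, Authorization',
  };
}
```

With Express the equivalent is roughly `app.use(require('cors')({ origin: 'https://example.com' }))` on the API server.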

Nodejs express application on nginx linux server

I have created my first nodejs application with nodejs express.
The server is:
var express = require("express");
global.app = express();
require("src/app-modules/common/yamaha/yamaha.controller.api");

app.listen(8080, function () {
    console.log("Listening on http://127.0.0.1:8080");
});
On my Windows development machine it is working.
Now I am trying to publish this app on a Synology NAS.
I access the application at the url //192.168.1.151/YamahaCtrl and I get a 404 error.
Update 2: I discovered what the problem really is: how to configure the nginx server on the Synology NAS.
Routes are defined in yamaha.controller.api like this one:
app.get("/api/yamaha/getBasicInfo", function (req, res) {
    // do something
});
I found a nodejs 8 beta package to install on the NAS, and now I have nodejs version 8, but the error still remains.
So the nodejs app server is listening on localhost:8080, but the url access is http://192.168.1.151/YamahaCtrl, on port 80.
That means the problem is how to configure the virtual host on the NAS, and which port the node server should use.
Update: the problem is that I needed to configure a proxy to the node port on the nginx server installed on the Synology. I found this article:
https://www.digitalocean.com/community/tutorials/how-to-set-up-a-node-js-application-for-production-on-ubuntu-14-04
and I create configuration file /etc/nginx/sites-available/default:
server {
    listen 80;
    server_name _;

    location / {
        proxy_pass http://127.0.0.1:8080;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
    }
}
And it is still not working: the html files are accessible, but the api is available on port 8080 only.
I figured out what the problem was: the nodejs server serves static files too, but my config for static files was wrong. In development it worked, which is why I didn't know it was wrong.
So there is no need to configure anything on the NAS :). Starting the nodejs application is enough.

Using node, socket.io, ember.js and nginx

I have Nginx serving my Ember build as follows:
server {
    listen 80;
    root /home/ubuntu/my-app/dist;
    index index.html;

    location /api {
        proxy_pass http://127.0.0.1:3000;
    }

    location / {
        try_files $uri $uri/ /index.html;
    }
}
I want to add a chat using socket.io, but I already have a REST api on port 3000.
I'm wondering what the best way to architect this is.
I thought I could add another location as follows:
location /socket.io {
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";
    proxy_http_version 1.1;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header Host $host;
    proxy_pass http://socket_nodes;
}
But it's not working. If someone could point me in the right direction, that'd be really great.
Thanks in advance.
If you're using node as your web server, then socket.io shares the same port and IP as the web server and your REST API.
A socket.io connection starts out as a regular http request to the /socket.io path. The socket.io library hooks into your web server to handle that specific http request. After a couple of back-and-forth exchanges, the two ends agree to "upgrade" the protocol from http to webSocket, and then the conversation continues as the webSocket protocol, but still on the same IP and port that your webServer operates on.
All this can work fine with nginx as a proxy if you configure nginx as specified in the configuration link I gave you earlier so that it proxies all the right things and if socket.io is configured properly with your nodejs server to hook into it properly.
There's really no architectural changes to make as the web requests and socket.io connections both operate through the same web server without you having to do anything. The socket.io connection just makes an http request to the /socket.io path with some special HTTP headers set. The socket.io server code just hooks into your web server to handle that specific request and take it from there. The rest of your REST API calls are just handled by the same mechanism you already have. So, as long as you don't try to define an API call for /socket.io, the two will happily stay out of each other's way, just like the handlers for two different routes on your web server stay out of each other's way. You can see a lot more about how incoming socket.io calls work in this answer.
So I finally got this to work and thought I'd share my findings.
Nginx:
For my api proxy, I can actually share the same port as my node API. I just needed to add the HTTP version and the upgrade headers.
location /api {
    proxy_pass http://127.0.0.1:3000;
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";
}
server.js
var app = require('express')();
var http = require('http').Server(app);
var io = require('socket.io')(http);

app.get('/', function (req, res) {
    res.sendFile(__dirname + '/index.html');
});

io.on('connection', function (socket) {
    console.log("[NOTIFICATION] - New connection");
    io.sockets.emit("message", { message: "New connection" });
    socket.on("send", function (data) {
        io.sockets.emit("message", { message: data.message });
    });
});

http.listen(3000);
Ember:
https://github.com/keydunov/emberjs-socketio-chat
is a pretty good example of socket.io usage.
