I am working on a node.js application using express to serve content and socket.io for websocket communication. The setup has been working fine, but now I want to be able to access the websocket via SSL, too. I thought using nginx (which we already used for other stuff) as a proxy was a good idea, and configured it like this:
upstream nodejs {
    server 127.0.0.1:8080;
}

server {
    listen 443 ssl;

    ssl_certificate /etc/nginx/ssl/server.crt;
    ssl_certificate_key /etc/nginx/ssl/server.key;

    server_name _;

    location / {
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header Host $http_host;
        proxy_set_header X-NginX-Proxy true;

        proxy_pass http://nodejs;
        proxy_redirect off;
    }
}
The node.js server is set up like this:
var express = require('express'),
    http = require('http'),
    app = express();

// some app configuration that I don't think matters

var server = http.createServer(app).listen(8080);
var io = require('socket.io').listen(server);

io.configure(function() {
    io.set('match original protocol', true);
    io.set('log level', 0);
    io.set('store', redisStore); // creation of redisStore not shown
});
Both nginx and node.js run inside a Vagrant box which forwards port 443 (which nginx listens on) to port 4443 on the host system.
With this setup, navigating to https://localhost:4443 (using Firefox 23) gives me access to the files served by Express, but when socket.io tries to connect to the socket, it throws the following error:
Blocked loading mixed active content "http://localhost:4443/socket.io/1/?t=1376058430540"
This outcome is sadly obvious, as it tries to load the JS file via HTTP from inside an HTTPS page, which Firefox does not allow. The question is why it does so in the first place.
Socket.io tries to determine which protocol is used to access the web page, and uses the same protocol when constructing the above URL. In this case, it thinks it is being accessed over HTTP, which may be the result of being proxied. However, as I understand it, setting match original protocol to true in the socket.io config is supposed to help in situations like this, but it does not in my case.
I have found numerous questions and answers here about websocket proxying, but none that deal with this particular issue. So I'm pretty much at wit's end, and would really appreciate some advice.
Change match original protocol to match origin protocol:
io.configure(function() {
    //io.set('match original protocol', true);
    io.set('match origin protocol', true);
    ...
});
Related
I've only been learning frontend development so far, and just recently went over the basics of Node.js. I know that I would connect to a certain port number when developing in Node.js alone. However, I'm confused about how to connect a Vue application (built with the Vue CLI) to the backend, since npm run serve will automatically serve on port 8080 by default.
My ultimate goal is to connect MongoDB to my application. The current error I'm getting is Error: Can't resolve 'dns'.
TLDR: Could someone please explain in newbie terms how I can connect a Vue application with MongoDB?
In my opinion, you have two ways of solving this:
First, in vue.config.js there is a field called devServer, through which you can tweak the configuration of the dev server that starts up when you run npm run serve. Specifically, pay attention to the proxy field, with which you can ask the dev server to route certain requests to your node backend.
Second, depending on your setup, you could use a different host altogether to handle backend calls. For example, as you mentioned, the dev server runs on 8080 by default. You could set up your node backend to run on, say, 8081, and all backend requests that you make in your VueJS app would then explicitly use the host <host>:8081. When you decide to move your code into production and get SSL certificates, you can have a reverse-proxy server like Nginx redirect all requests from, say, api.example.com to port 8081.
As for connections to MongoDB, IMO, here's a question you should be asking yourself:
Is it safe to provide clients direct access to the database?
If the answer is yes, then by all means, ensure the MongoDB server starts with its HTTP interface enabled, set up some access restrictions, update the proxy and/or nginx, and you're good to go.
If the answer is no, then you're going to have to write lightweight API endpoints in your NodeJS app. For example, instead of allowing users to talk to the database directly to get their list of privileges, you make a request to your NodeJS app via GET /api/privileges, and your NodeJS app in turn communicates with your database to get this data and return it to the client.
Another added benefit of having the backend talk to your database rather than the client is that your database instance's details are never exposed to malicious clients.
Here's a sample vue.config.js setup that I have on one of my websites:
const proxyPath = 'https://api.example.com'

module.exports = {
  devServer: {
    port: 8115, // Change the port from 8080
    public: 'dev.example.com',
    proxy: {
      '/api/': {
        target: proxyPath
      },
      '/auth/': {
        target: proxyPath
      },
      '/socket.io': {
        target: proxyPath,
        ws: true
      },
      '^/websocket': {
        target: proxyPath,
        ws: true
      }
    }
  }
}
Here's the nginx config for the same dev server. I quickly pulled what I could from our production config and obscured certain fields for safety. Consider this as pseudo-code (pseudo-config?).
server {
    listen 443 ssl;
    server_name dev.example.com;

    root "/home/www/workspace/app-dev";

    set $APP_PORT "8115";

    location / {
        # Don't allow robots to access the dev server
        if ($http_user_agent ~* "baiduspider|twitterbot|facebookexternalhit|rogerbot|linkedinbot|embedly|quora link preview|showyoubot|outbrain|pinterest|slackbot|vkShare|W3C_Validator|Googlebot") {
            return 404;
        }

        # Redirect all requests to the vue dev server at localhost:$APP_PORT
        proxy_pass $scheme://127.0.0.1:$APP_PORT$request_uri;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection $http_connection;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
server {
    listen 443 ssl;
    server_name api.example.com;

    set $APP_PORT "8240";

    location / {
        # Don't allow robots to access the dev server
        if ($http_user_agent ~* "baiduspider|twitterbot|facebookexternalhit|rogerbot|linkedinbot|embedly|quora link preview|showyoubot|outbrain|pinterest|slackbot|vkShare|W3C_Validator|Googlebot") {
            return 404;
        }

        # Redirect all requests to the NodeJS backend at localhost:$APP_PORT
        proxy_pass $scheme://127.0.0.1:$APP_PORT$request_uri;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection $http_connection;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
I've been able to find guides pertaining to various combinations of nginx, node, ssl, and websockets, but never all together and never on a single server.
Ultimately, this is what I'm after and I'm not sure if it's even possible:
- single server (Ubuntu 14.04)
- forced HTTPS (browsing to http://site forwards to https://)
- node app is hosted on localhost:3000
- node app uses web sockets
- it's a single-page React app with no routing at all, so I don't need routes. I repeat, I'm only hosting one page with no navigation whatsoever.
With the below config, I have everything working except websockets - the client throws an error which doesn't happen if I browse straight to the node server and don't use nginx (browse to http://my.domain:3000):
bundle.js:26 WebSocket connection to 'wss://<my domain>/socket.io/?EIO=3&transport=websocket&sid=x1uQtRzF3gYYEvfIAAAi' failed: Error during WebSocket handshake: Unexpected response code: 400
server {
    listen 80;
    return 301 https://my.domain$request_uri;
}

server {
    listen 443 ssl;
    listen [::]:443;

    ssl_certificate /path/cert.crt;
    ssl_certificate_key /path/key.key;
    ssl_session_cache shared:SSL:10m;

    server_name blaze.chat;

    location / {
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
        proxy_pass http://localhost:3000;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Host $host;
        proxy_redirect http://localhost:3000 https://my.domain;
    }
}
Right, got it working... I found a lot of articles all showing similar things but missing these key lines:

    proxy_set_header Connection "upgrade";
    proxy_read_timeout 86400;

In my case, websockets won't work without those lines, contrary to many other posts where similar questions were asked. Not sure what the difference was. Ironically, it is listed as a requirement in the nginx WebSocket proxying doco... Should have started there, my bad.

http://nginx.org/en/docs/http/websocket.html

Side note: I am just using this on the root path of /, which works fine.
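For reference, this is the location block from the question with those two lines folded in (nothing else changed):

```nginx
location / {
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Proto $scheme;
    proxy_pass http://localhost:3000;
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";  # the missing key line
    proxy_read_timeout 86400;               # keep idle websockets from being cut off
    proxy_set_header Host $host;
    proxy_redirect http://localhost:3000 https://my.domain;
}
```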
I have a chat application using socket.io. I put the code on 2 servers (ip1, ip2) and use nginx for load balancing.
This is the nginx config:
upstream socket_nodes {
    ip_hash;
    server ip1:1654;
    server ip2:1653;
}

server {
    listen 1653;
    server_name livechatsoftware.com.vn;

    location / {
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header X-NginX-Proxy true;
        proxy_http_version 1.1;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header Host $host;
        proxy_pass http://socket_nodes;
        proxy_redirect off;
    }
}
Everything works well if clients connect to the same server, but if client 1 connects to server ip1 and client 2 connects to server ip2, client 1 and client 2 cannot interact (e.g. cannot emit, send messages, ...).
Thanks in advance.
I am using nginx to load balance my node processes (express.js + mongodb + socket.io + passport.js). Although they are all on the same server, socket.io events still don't get shared between them, so I've faced the exact same issue. I used mong.socket.io (https://github.com/tmfkmoney/mong.socket.io) to store socket.io events in a Mongo store, so that each socket.io process reads the events from a centralized store.
You also need a central store from which socket.io events can be accessed by any of the servers. I've never used that kind of setup, but check out the package below:
https://www.npmjs.com/package/socket.io-redis
I'd install Redis on one of my servers and point my socket.io processes to it like this:

    var redis = require('socket.io-redis');
    io.adapter(redis({ host: 'localhost', port: 6379 })); // use the IP address and port of the remote server

You can also use other adapter options, as mentioned in the link I've shared.
I have nginx serving my Ember build as follows:
server {
    listen 80;

    root /home/ubuntu/my-app/dist;
    index index.html;

    location /api {
        proxy_pass http://127.0.0.1:3000;
    }

    location / {
        try_files $uri $uri/ /index.html;
    }
}
I want to add a chat using socket.io, but I already have a REST API on port 3000.
I'm wondering about the best way to architect this.
I thought I could add another location as follows:
location /socket.io {
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";
    proxy_http_version 1.1;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header Host $host;
    proxy_pass http://socket_nodes;
}
But it's not working. If someone could point me in the right direction, that'd be really great.
Thanks in advance.
If you're using node as your web server, then socket.io shares the same port and IP as the web server and your REST API.
A socket.io connection starts out as a regular HTTP request to the /socket.io path. The socket.io library hooks into your web server to handle that specific HTTP request. After a couple of back-and-forth exchanges, the two ends agree to "upgrade" the protocol from HTTP to webSocket, and the conversation then continues as the webSocket protocol, but still on the same IP and port your web server operates on.
All this can work fine with nginx as a proxy if you configure nginx as specified in the configuration link I gave you earlier, so that it proxies all the right things, and if socket.io is configured to hook into your nodejs server properly.
There's really no architectural changes to make as the web requests and socket.io connections both operate through the same web server without you having to do anything. The socket.io connection just makes an http request to the /socket.io path with some special HTTP headers set. The socket.io server code just hooks into your web server to handle that specific request and take it from there. The rest of your REST API calls are just handled by the same mechanism you already have. So, as long as you don't try to define an API call for /socket.io, the two will happily stay out of each other's way, just like the handlers for two different routes on your web server stay out of each other's way. You can see a lot more about how incoming socket.io calls work in this answer.
So I finally got this to work and thought I'd share my findings.
Nginx:
For my API proxy, I can actually share the same port as my node API. I just needed to add the HTTP version and upgrade headers.
location /api {
    proxy_pass http://127.0.0.1:3000;
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";
}
server.js:

var app = require('express')();
var http = require('http').Server(app);
var io = require('socket.io')(http);

app.get('/', function(req, res) {
    res.sendFile(__dirname + '/index.html');
});

io.on('connection', function(socket) {
    console.log("[NOTIFICATION] - New connection");
    io.sockets.emit("message", { message: "New connection" });

    socket.on("send", function(data) {
        io.sockets.emit("message", { message: data.message });
    });
});

http.listen(3000);
Ember:
https://github.com/keydunov/emberjs-socketio-chat
is a pretty good example of Ember with socket.io.
So I have a nodejs app running on port 8081:
http://mysite.com:8081/
I want to access it simply by going to http://mysite.com/, so I set up a virtual host with expressjs:
app.use(express.vhost('yugentext.com', app));
That seems too easy, and it doesn't work. Am I confused about how expressjs vhosts work?
If you want to do this via Express, the problem comes from your DNS setup, not from the Express code.
Add an A entry to your domain like this:

    127.0.0.1 localhost *.mysite.com *.www.mysite.com

You should wait for DNS propagation (from seconds to hours).
If Apache or another web server is running any vhost on port 80, there will be conflicts.
And the other way:
Node.js and Express are far from the performance offered by Apache and nginx for vhost/proxy stuff. Nginx > Apache (it fits better with Node.js).
Create a proxy from mysite.com to mysite.com:8080. This way, Node.js and Express handle the UI, methods, HTTP server etc., and nginx or Apache handles the proxy, vhosts, and serving your static assets very fast.
Check the config here: Trouble with Nginx and Multiple Meteor/Nodejs Apps
I think you're doing app.listen(8081). You should be doing app.listen(80). I have no experience with express vhosts, but you don't need them for this simple use case.
upstream node-apps {
    server host_ip_1:3000;
    server host_ip_2:3000;
}

server {
    listen 80;
    server_name localhost;

    location / {
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header Host $http_host;
        proxy_set_header X-NginX-Proxy true;
        proxy_pass http://node-apps/;
        proxy_redirect off;
    }
}
This is my nginx config; it proxy-passes to multiple servers. Good luck :p