Need for http.createServer(app) in node.js / express

Using Node and Express, the code below works just fine.
var app = express();
app.listen(app.get('port'), function() {
});
I assume that a server is created implicitly in the above construct.
When adding socket.io, I've seen the following being done.
var express = require('express');
var http = require('http');
var app = express();
var server = http.createServer(app);
var io = require('socket.io').listen(server);
server.listen(app.get('port'), function() {
});
What is the need for explicitly adding http.createServer(app)? Won't creating an additional server mess things up? Or, to put it another way, is it OK to call http.createServer(app) more than once?

In either case, only one server is created. When using socket.io, you share the same HTTP server between socket.io and Express: both libraries attach event listeners to that server and get a chance to respond to the same events. They cooperate nicely because socket.io only handles the requests relevant to it, and Express handles all the non-WebSocket requests. And just FYI, you cannot create more than one server on the same port: only one process can listen on a TCP port at a time in the OS, so the second one would fail with an error when attempting to bind the already-in-use port.
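For illustration, a minimal sketch of that shared-server pattern (the port and routes here are just placeholders):
var express = require('express');
var http = require('http');
var socketio = require('socket.io');

var app = express();                   // the Express request handler
var server = http.createServer(app);   // the single HTTP server they both share
var io = socketio(server);             // socket.io attaches its listeners to that same server

app.get('/', function (req, res) {
  res.send('handled by express');      // plain HTTP requests go to Express
});

io.on('connection', function (socket) {
  // socket.io handshakes and WebSocket traffic end up here
});

server.listen(3000);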

Related

NodeJs - express and socket.io same port integration

I am creating a server using Node.js and Express, but I found out that if I want to make a service live, I need to use Socket.io. In this server, there are some services that don't need to be live, and these are implemented using Express routes. These are tested and working correctly. Now I have to make some services live, so I think I should also implement socket.io in my server configuration. This is my code without socket.io, working perfectly:
const express = require('express');
const morgan = require('morgan');
const cors = require('cors');
const mongoose = require('mongoose');
require('dotenv').config();
const app = express();
const port = process.env.port || 5050;
app.use(morgan('dev'));
app.use(cors());
app.use(express.json());
app.use(express.urlencoded({ extended:true }));
const uri = process.env.ATLAS_URI;
mongoose.connect(uri, {useNewUrlParser:true});
const connection = mongoose.connection;
connection.once('open', () => console.log('Connected'));
const routes = require(#every routes);
app.use(routes);//all created routes
app.listen(port, () => {console.log(`Server listening on port ${port}`)});
Now, I should import socket.io correctly. When that is done, I think I can figure out how to implement my services. So, I tried to add the line const io = require('socket.io').listend(app) as I saw in another Stack Overflow question, but the terminal shows this error:
const io = require('socket.io').listend(app)
^
TypeError: require(...).listend is not a function
So, I don't know how to integrate these two. I don't know whether it is worth using the same port, or if I should use another port for the socket, but I think the same port would be good. If someone knows how to implement socket.io in my current code, or a way to keep both functionalities, please help me. Thank you so much.
Change this:
app.listen(port, () => {console.log(`Server listening on port ${port}`)});
to this:
const httpServer = app.listen(port, () => {console.log(`Server listening on port ${port}`)});
Then, add this:
const { Server } = require("socket.io");
const io = new Server(httpServer);
This will let both your Express server and your socket.io server share the same http server. Since all incoming socket.io connections are identifiable with some custom headers, the socket.io code (actually the underlying webSocket transport does this) can grab those and handle them independently from your regular http requests.
See plenty of examples in the socket.io doc which has improved immensely from its early days.
Sometimes, it's easy to get confused looking at different examples because there are a dozen different ways to create your http server that you use with Express. The general idea here is that whichever way you use, just make sure you assign the http server instance that you created to a variable that you can then use with socket.io server initialization (as shown above). In your Express code, app is not the server. That's the Express app object which is also an http request handler. It's not the server. The server is something you get from http.createServer() or something that app.listen() returns to you (after calling http.createServer() internally).
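To make that last point concrete, here's a small sketch (the port is just a placeholder) showing that app.listen() hands back the underlying http.Server, which is what socket.io needs:
const express = require('express');
const { Server } = require('socket.io');

const app = express();                  // request handler, not a server
const port = process.env.PORT || 5050;

// app.listen() calls http.createServer(app) internally and returns that server
const httpServer = app.listen(port, () => console.log(`Server listening on port ${port}`));

// socket.io piggybacks on the same http server
const io = new Server(httpServer);

io.on('connection', (socket) => {
  // live services would register their socket handlers here
  socket.on('message', (msg) => socket.emit('message', msg));
});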

Why is 'connection' never triggered?

On another terminal:
$ curl localhost:3001
However, on the Node.js server side, I never see "sdfsdf" from
console.log("sdfsdf");
Questions:
1. Can some expert explain why?
2. How do I fix it so the 'connection' callback is triggered?
Thank you.
var express = require('express'),
http = require('http')
var app = express();
var server = http.createServer(app);
//var server = http.Server(app);
//server.listen(app.get('port'), function () {
server.listen(3001, function () {
//logger.info('openHAB-cloud: express server listening on port ' + app.get('port'));
console.log("3001");
});
app.get('/', function(req, res){
//res.sendfile('index.html');
res.send("xxx");
});
var io = require('socket.io').listen(server);
io.on('connection', function(socket) {
console.log("sdfsdf");
});
To connect to a socket.io server, you must use a socket.io client - you cannot just use a regular curl or http request.
A socket.io client must be specifically designed to connect to a socket.io server. That means it uses the socket.io message format on top of webSocket and it follows the proper convention that socket.io and webSocket use for connecting.
Here are some client-side examples: https://socket.io/docs/client-api/
The connection can be made either from browser Javascript with the appropriate socket.io library included or using any socket.io client-side library from some other Javascript environment.
To see a bit of how webSocket connections (which socket.io uses) work, you may want to read this: How does WebSockets server architecture work?. And then, socket.io adds its own message layer on top of webSockets.
const io = require('socket.io-client');
const socket = io('http://localhost:3001');
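For example, a minimal Node.js client sketch against the server above (port 3001 as in the question); once it connects, the server's io.on('connection', ...) handler fires and prints its log line:
const io = require('socket.io-client');
const socket = io('http://localhost:3001');

socket.on('connect', () => {
  // by the time 'connect' fires here, the server side has already run
  // its 'connection' callback (the console.log in the question)
  console.log('connected with id', socket.id);
});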

Socket.io-based app running through node proxy server disconnecting all sockets whenever one disconnects

I made a basic chat app using node.js, express and socket.io. It's not too different from the tutorial chat app for socket.io, it simply emits events between connected clients. When I ran it on port 3001 on my server, it worked fine.
Then I made a proxy server app using node-http-proxy which listens on port 80 and redirects traffic based on the requested url to various independent node apps I have running on different ports. Pretty straightforward. But something is breaking. Whenever anyone disconnects, every single socket dis- and re-connects. This is bad for my chat app, which has connection-based events. The client consoles all show:
WebSocket connection to 'ws://[some socket info]' failed: Connection closed before receiving a handshake response
Here's what I think are the important parts of my code.
proxy-server.js
var http = require('http');
var httpProxy = require('http-proxy');
//create proxy template object with websockets enabled
var proxy = httpProxy.createProxyServer({ws: true});
//check the header on request and return the appropriate port to proxy to
function sites (req) {
//webapps get their own dedicated port
if (req == 'mychatwebsite.com') {return 'http://localhost:3001';}
else if (req == 'someothersite.com') {return 'http://localhost:3002';}
//static sites are handled by a vhost server on port 3000
else {return 'http://localhost:3000';}
}
//create node server on port 80 and proxy to ports accordingly
http.createServer(function (req, res) {
proxy.web(req, res, { target: sites(req.headers.host) });
}).listen(80);
chat-app.js
/*
...other modules
*/
var express = require("express");
var app = exports.app = express(); //I probably don't need "exports.app" anymore
var http = require("http").Server(app);
var io = require("socket.io")(http);
io.on("connection", function (socket) {
/*
...fun socket.on and io.emit stuff
*/
socket.on("disconnect", function () {
//say bye
});
});
http.listen(3001, function () {
console.log("listening on port 3001");
});
Now from what I've read on socket.io's site, I might need to use something to carry the socket traffic through my proxy server. I thought that node-http-proxy did that for me with the {ws: true} option as it states in their docs, but apparently it doesn't work like I thought it would. socket.io mentions three different things:
sticky session based on node's built in cluster module
socket.io-redis, which allows separate socket.io instances to talk to each other
socket.io-emitter, which allows socket.io to talk to non-socket.io processes
I have exactly no idea what any of this means or does. I am accidentally coding way above my skill level here, and I have no idea which of these tools will solve my problem (if any) or even what the cause of my problem really is.
Obligatory apology: I'm new to node.js, so please forgive me.
Also obligatory: I know other apps like nginx can solve a lot of my issues, but my goal is to learn and understand how to use this set of tools before I go picking up new ones. And, the less apps I use, the better.
I think your intuition about needing to "carry the socket traffic through" the proxy server is right on. To establish a websocket, the client makes an HTTP request with a special Upgrade header, signalling the server to switch protocols (RFC 6455). In node, http.Server instances emit an upgrade event when this happens and if the event is not handled, the connection is immediately closed.
You need to listen for the upgrade event on your http server and handle it:
var httpProxy = require('http-proxy')
var proxy = httpProxy.createProxyServer({ws: true})
var server = require('http').createServer(/* snip */).listen(80)

// handle upgrade events by proxying websockets
// something like this
server.on('upgrade', function (req, socket, head) {
  proxy.ws(req, socket, head, {target: sites(req.headers.host)})
})
See the node docs on the upgrade event and the node-http-proxy docs for more.
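Putting the question's proxy-server.js together with this upgrade handler, the whole proxy could look roughly like this sketch (hostnames and ports as in the question):
var http = require('http');
var httpProxy = require('http-proxy');

var proxy = httpProxy.createProxyServer({ws: true});

// pick a backend port based on the requested host
function sites (host) {
  if (host == 'mychatwebsite.com') { return 'http://localhost:3001'; }
  else if (host == 'someothersite.com') { return 'http://localhost:3002'; }
  else { return 'http://localhost:3000'; }
}

// ordinary HTTP requests
var server = http.createServer(function (req, res) {
  proxy.web(req, res, { target: sites(req.headers.host) });
}).listen(80);

// WebSocket upgrade requests (including socket.io handshakes)
server.on('upgrade', function (req, socket, head) {
  proxy.ws(req, socket, head, { target: sites(req.headers.host) });
});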

Socket.io and multiple dynos on Heroku Node.js app: WebSocket is closed before the connection is established

I'm building an App deployed to Heroku which uses Websockets.
The websockets connection is working properly when I use only 1 dyno, but when I scale to >1, I get the following errors
POST
http://****.herokuapp.com/socket.io/?EIO=2&transport=polling&t=1412600135378-1&sid=zQzJJ8oPo5p3yiwIAAAC
400 (Bad Request) socket.io-1.0.4.js:2
WebSocket connection to
'ws://****.herokuapp.com/socket.io/?EIO=2&transport=websocket&sid=zQzJJ8oPo5p3yiwIAAAC'
failed: WebSocket is closed before the connection is established.
socket.io-1.0.4.js:2
I am using the Redis adaptor to enable multiple web processes
var io = socket.listen(server);
var redisAdapter = require('socket.io-redis');
var redis = require('redis');
var pub = redis.createClient(18049, '[URI]', {auth_pass:"[PASS]"});
var sub = redis.createClient(18049, '[URI]', {detect_buffers: true, auth_pass:"[PASS]"} );
io.adapter( redisAdapter({pubClient: pub, subClient: sub}) );
This is working on localhost (which I run using foreman, as Heroku does, launching 2 web processes, same as on Heroku).
Before I implemented the Redis adaptor I got a WebSocket handshake error, so the adaptor has had some effect. Also, it works occasionally now, I assume when the sockets happen to hit the same web dyno.
I have also tried to enable sticky sessions, but then it never works.
var sticky = require('sticky-session');
sticky(1, server).listen(port, function (err) {
if (err) {
console.error(err);
return process.exit(1);
}
console.log('Worker listening on %s', port);
});
I'm the Node.js Platform Owner at Heroku.
WebSockets work on Heroku out of the box across multiple dynos; however, socket.io (and other realtime libs) fall back to transports like xhr polling, and those fallbacks break without session affinity.
To scale up socket.io apps, first follow all the instructions from socket.io:
http://socket.io/docs/using-multiple-nodes/
Then, enable session affinity on your app (this is a free feature):
https://devcenter.heroku.com/articles/session-affinity
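For reference, a minimal sketch of the Redis adapter setup that the multiple-nodes doc describes (the Redis host and port here are placeholder assumptions; on Heroku you would point this at your Redis add-on):
var io = require('socket.io')(3000);
var redisAdapter = require('socket.io-redis');

// every dyno/process connects to the same Redis instance so that
// events emitted in one process reach sockets connected to another
io.adapter(redisAdapter({ host: 'localhost', port: 6379 }));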
I spent a while trying to make socket.io work in a multi-server architecture, first on Heroku and then on OpenShift, as many suggest.
The only way I found to make it work on both PaaS providers is disabling xhr-polling and setting transports: ['websocket'] on both client and server.
On OpenShift, you must explicitly set the port of the server to 8000 for ws (or 8443 for wss) in the socket.io client initialization, using the *.rhcloud.com server, as explained in this post: http://tamas.io/deploying-a-node-jssocket-io-app-to-openshift/.
The polling strategy doesn't work on Heroku because it does not support sticky sessions (https://github.com/Automattic/engine.io/issues/261), and on OpenShift it fails because of this issue: https://github.com/Automattic/engine.io/issues/279, which will hopefully be fixed soon.
So the only solution I have found so far is disabling polling and using the websocket transport only.
In order to do that, with socket.io > 1.0
server-side:
var app = express();
var server = require('http').createServer(app);
var socketio = require('socket.io')(server, {
path: '/socket.io-client'
});
socketio.set('transports', ['websocket']);
client-side:
var ioSocket = io('<your-openshift-app>.rhcloud.com:8000' || '<your-heroku-app>.herokuapp.com', {
path: '/socket.io-client',
transports: ['websocket']
})
Hope this will help.
It could be you need to be running RedisStore:
var session = require('express-session');
var RedisStore = require('connect-redis')(session);
app.use(session({
store: new RedisStore(options),
secret: 'keyboard cat'
}));
per an earlier question here: Multiple dynos on Heroku + socket.io broadcasts
I know this isn't a normal answer, but I've tried to get WebSockets working on Heroku for more than a week. After many long conversations with customer support I finally tried out OpenShift. Heroku WebSockets are in beta, but OpenShift WebSockets are stable. I got my code working on OpenShift in under an hour.
http://www.openshift.com
I am not affiliated with OpenShift in any way. I'm just a satisfied (non-paying) customer.
I was having huge problems with this. There were a number of issues failing simultaneously, making it a huge nightmare. Make sure you do the following to scale socket.io on Heroku:
if you're using clusters, make sure you implement socketio-sticky-session or something similar
the client's connect URL should not be https://example.com/socket.io/?EIO=3&transport=polling but rather https://example.com/ (notably, I'm using https because Heroku supports it)
enable CORS in socket.io
specify websocket-only connections
For you and others it could be any one of these.
If you're having trouble setting up sticky-session clusters, here's my working code:
var http = require('http');
var cluster = require('cluster');
var numCPUs = require('os').cpus().length;
var sticky = require('socketio-sticky-session');
var redis = require('socket.io-redis');
var io;

if (cluster.isMaster) {
  console.log('Inside Master');
  // create the worker processes
  for (var i = 0; i < numCPUs; i++) {
    cluster.fork();
  }
}
else {
  // The worker code to be run is written inside
  // the sticky().
}

sticky(function () {
  // This code runs inside the workers.
  // The sticky-session balances connections between workers based on their ip,
  // so all the requests from the same client go to the same worker.
  // If multiple browser windows are opened on the same client, all are
  // redirected to the same worker.
  io = require('socket.io')({ transports: ['websocket'], origins: '*:*' });
  var server = http.createServer(function (req, res) {
    res.end('socket.io');
  });
  io.listen(server);
  // The Redis server can also be used to store the socket state
  //io.adapter(redis({host:'localhost', port:6379}));
  console.log('Worker: ' + cluster.worker.id);
  // when multiple workers are spawned, the client
  // cannot connect to the cloudlet.
  StartConnect(); //this function connects my mongodb, then calls a function with io.on('connection', ..... socket.on('message'...... in relation to the io variable above
  return server;
}).listen(process.env.PORT || 4567, function () {
  console.log('Socket.io server is up ');
});
More information: personally, it would work flawlessly from a session not using websockets (I'm using socket.io for a Unity game; it worked flawlessly from the editor only!). When connecting through the browser, whether Chrome or Firefox, it would show these handshaking errors, along with errors 503 and 400.

http server and web sockets from separate servers

It's pretty easy to configure an HTTP server (using Express) and a socket server (socket.io) attached to it:
var app = require('express')();
var http = require('http').Server(app);
var io = require('socket.io')(http);
How can I run the HTTP server and the socket server in two different Node.js instances?
My idea is to improve performance this way, freeing the HTTP Node instance from the responsibility of also sending notifications back to the clients.
In a regular Socket.IO + Express app, Socket.IO intercepts requests starting with /socket.io/.
You can set up Nginx (or any other web server that supports proxying) to listen on port 80 and make it proxy to the Socket.IO process if the request path starts with /socket.io/, and to the Express process otherwise.
Edit: To set up Socket.IO in a separate process, you can use the following code:
var io = require('socket.io')();
io.on('connection', function(socket){
//here you can emit and listen messages
});
io.listen(3000);
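For completeness, a rough sketch of the separate Express process that Nginx would route all non-/socket.io/ requests to (the port numbers here are only examples):
var express = require('express');
var app = express();

app.get('/', function (req, res) {
  res.send('served by the express process');
});

// nginx proxies /socket.io/ to the socket.io process (port 3000 above)
// and everything else to this process
app.listen(3001);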
