Why is 'connection' never triggered? - node.js

On another terminal, I run:
$ curl localhost:3001
However, on the Node.js server side, I never see "sdfsdf" printed for
console.log("sdfsdf");
Questions
1. Can some expert explain why?
2. How can I fix it so that the 'connection' callback is triggered?
Thank you.
var express = require('express'),
    http = require('http');

var app = express();
var server = http.createServer(app);
//var server = http.Server(app);

//server.listen(app.get('port'), function () {
server.listen(3001, function () {
    //logger.info('openHAB-cloud: express server listening on port ' + app.get('port'));
    console.log("3001");
});

app.get('/', function (req, res) {
    //res.sendfile('index.html');
    res.send("xxx");
});

var io = require('socket.io').listen(server);
io.on('connection', function (socket) {
    console.log("sdfsdf");
});

To connect to a socket.io server, you must use a socket.io client - you cannot just use a regular curl or http request.
A socket.io client must be specifically designed to connect to a socket.io server. That means it uses the socket.io message format on top of webSocket and it follows the proper convention that socket.io and webSocket use for connecting.
Here are some client-side examples: https://socket.io/docs/client-api/
The connection can be made either from browser JavaScript with the appropriate socket.io client library included, or by using a socket.io client-side library from some other JavaScript environment.
To see a bit of how webSocket connections (which socket.io uses) work, you may want to read this: How does WebSockets server architecture work?. And then, socket.io adds its own message layer on top of webSockets.

const io = require('socket.io-client');
const socket = io('http://localhost:3001');
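If you also listen for the client-side connect event (a small sketch, assuming the two lines above have already run against the server from the question on port 3001), you can confirm the handshake from the client; at the same moment the server's 'connection' handler fires and logs "sdfsdf":
// fires once the socket.io handshake with the server succeeds
socket.on('connect', function () {
    console.log('connected with id', socket.id);
});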

Related

NodeJs - express and socket.io same port integration

I am creating a server using Node.js and Express, but I found out that if I want to make a service live, I need to use Socket.io. In this server there are some services that don't need to be live, and these are implemented using Express routes. These are tested and working correctly. Now I have to make some services live, so I think I should also implement socket.io in my server configuration. This is my code without socket.io, working perfectly:
const express = require('express');
const morgan = require('morgan');
const cors = require('cors');
const mongoose = require('mongoose');
require('dotenv').config();
const app = express();
const port = process.env.port || 5050;
app.use(morgan('dev'));
app.use(cors());
app.use(express.json());
app.use(express.urlencoded({ extended:true }));
const uri = process.env.ATLAS_URI;
mongoose.connect(uri, {useNewUrlParser:true});
const connection = mongoose.connection;
connection.once('open', () => console.log('Connected'));
const routes = require(#every routes);
app.use(routes);//all created routes
app.listen(port, () => {console.log(`Server listening on port ${port}`)});
Now, I should import socket.io correctly. Once that is done, I think I can figure out how to implement my services. So, I tried to add the line const io = require('socket.io').listend(app) as I saw in another Stack Overflow question, but the terminal shows this error:
const io = require('socket.io').listend(app)
^
TypeError: require(...).listend is not a function
So, I don't know how to integrate these two. I don't know whether it is worth using the same port, or whether I should use another port for the socket, but I think the same port would be good. If someone knows how to implement socket.io in my current code, or a way to keep both functionalities, please help me. Thank you so much.
Change this:
app.listen(port, () => {console.log(`Server listening on port ${port}`)});
to this:
const httpServer = app.listen(port, () => {console.log(`Server listening on port ${port}`)});
Then, add this:
const { Server } = require("socket.io");
const io = new Server(httpServer);
This will let both your Express server and your socket.io server share the same http server. Since all incoming socket.io connections are identifiable with some custom headers, the socket.io code (actually the underlying webSocket transport does this) can grab those and handle them independently from your regular http requests.
See the many examples in the socket.io doc, which has improved immensely since its early days.
Sometimes it's easy to get confused looking at different examples because there are a dozen different ways to create the http server that you use with Express. The general idea is that whichever way you use, just make sure you assign the http server instance you created to a variable that you can then use with the socket.io server initialization (as shown above). In your Express code, app is not the server; it's the Express app object, which is also an http request handler. The server is something you get from http.createServer(), or something that app.listen() returns to you (after calling http.createServer() internally).
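Either way of getting the http server works. Here's a minimal sketch of the equivalent setup that creates the server explicitly with http.createServer() instead of taking the return value of app.listen() (the port number and handlers are just for illustration):
const express = require('express');
const http = require('http');
const { Server } = require('socket.io');

const app = express();
const httpServer = http.createServer(app);  // the actual server instance
const io = new Server(httpServer);          // socket.io shares that same server

app.get('/', (req, res) => res.send('ok'));
io.on('connection', (socket) => console.log('socket connected:', socket.id));

httpServer.listen(5050, () => console.log('Server listening on port 5050'));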

Socket.io-based app running through node proxy server disconnecting all sockets whenever one disconnects

I made a basic chat app using node.js, express and socket.io. It's not too different from the tutorial chat app for socket.io, it simply emits events between connected clients. When I ran it on port 3001 on my server, it worked fine.
Then I made a proxy server app using node-http-proxy which listens on port 80 and redirects traffic based on the requested url to various independent node apps I have running on different ports. Pretty straightforward. But something is breaking. Whenever anyone disconnects, every single socket dis- and re-connects. This is bad for my chat app, which has connection-based events. The client consoles all show:
WebSocket connection to 'ws://[some socket info]' failed: Connection closed before receiving a handshake response
Here's what I think are the important parts of my code.
proxy-server.js
var http = require('http');
var httpProxy = require('http-proxy');

//create proxy template object with websockets enabled
var proxy = httpProxy.createProxyServer({ws: true});

//check the header on request and return the appropriate port to proxy to
function sites (req) {
    //webapps get their own dedicated port
    if (req == 'mychatwebsite.com') {return 'http://localhost:3001';}
    else if (req == 'someothersite.com') {return 'http://localhost:3002';}
    //static sites are handled by a vhost server on port 3000
    else {return 'http://localhost:3000';}
}

//create node server on port 80 and proxy to ports accordingly
http.createServer(function (req, res) {
    proxy.web(req, res, { target: sites(req.headers.host) });
}).listen(80);
chat-app.js
/*
...other modules
*/
var express = require("express");
var app = exports.app = express(); //I probably don't need "exports.app" anymore
var http = require("http").Server(app);
var io = require("socket.io")(http);

io.on("connection", function (socket) {
    /*
    ...fun socket.on and io.emit stuff
    */
    socket.on("disconnect", function () {
        //say bye
    });
});

http.listen(3001, function () {
    console.log("listening on port 3001");
});
Now from what I've read on socket.io's site, I might need to use something to carry the socket traffic through my proxy server. I thought that node-http-proxy did that for me with the {ws: true} option as it states in their docs, but apparently it doesn't work like I thought it would. socket.io mentions three different things:
sticky session based on node's built in cluster module
socket.io-redis, which allows separate socket.io instances to talk to each other
socket.io-emitter, which allows socket.io to talk to non-socket.io processes
I have exactly no idea what any of this means or does. I am accidentally coding way above my skill level here, and I have no idea which of these tools will solve my problem (if any) or even what the cause of my problem really is.
Obligatory apology: I'm new to node.js, so please forgive me.
Also obligatory: I know other apps like nginx can solve a lot of my issues, but my goal is to learn and understand how to use this set of tools before I go picking up new ones. And the fewer apps I use, the better.
I think your intuition about needing to "carry the socket traffic through" the proxy server is right on. To establish a websocket, the client makes an HTTP request with a special Upgrade header, signalling the server to switch protocols (RFC 6455). In node, http.Server instances emit an upgrade event when this happens and if the event is not handled, the connection is immediately closed.
You need to listen for the upgrade event on your http server and handle it:
var proxy = httpProxy.createProxyServer({ws: true});
var server = http.createServer(/* snip */).listen(80);

// handle upgrade events by proxying websockets
// something like this
server.on('upgrade', function (req, socket, head) {
    proxy.ws(req, socket, head, { target: sites(req.headers.host) });
});
See the node docs on the upgrade event and the node-http-proxy docs for more.
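Put together with the proxy code from the question, the whole proxy-server.js would look roughly like this (a sketch based on the snippets above; sites() is the same host-to-port mapping the question already defines):
var http = require('http');
var httpProxy = require('http-proxy');

//create proxy template object with websockets enabled
var proxy = httpProxy.createProxyServer({ws: true});

//check the header on request and return the appropriate port to proxy to
function sites (req) {
    if (req == 'mychatwebsite.com') {return 'http://localhost:3001';}
    else if (req == 'someothersite.com') {return 'http://localhost:3002';}
    else {return 'http://localhost:3000';}
}

//proxy normal HTTP requests and keep a reference to the server
var server = http.createServer(function (req, res) {
    proxy.web(req, res, { target: sites(req.headers.host) });
}).listen(80);

//proxy webSocket upgrade requests so socket.io traffic reaches the chat app
server.on('upgrade', function (req, socket, head) {
    proxy.ws(req, socket, head, { target: sites(req.headers.host) });
});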

support multiple protocols on single server

I have a working Express HTTP server as well as a working websocket server. I want to add the websockets application to my regular website, which is run by the HTTP server, but I'm not sure I'm understanding the documentation. Can I have a server that accepts multiple protocols, and how would I handle the routing in a situation like that? The npmjs documentation for socket.io says:
In conjunction with Express
Starting with 3.0, express applications have become request handler functions that you pass to http or http Server instances. You need to pass the Server to socket.io, and not the express application function.
var app = require('express')();
var server = require('http').createServer(app);
var io = require('socket.io')(server);
io.on('connection', function(){ /* … */ });
server.listen(3000);
Can I handle HTTP requests through app.HTTPverbHere() and websocket requests through io.on?
The socket.io documentation shows you the exact steps needed to make socket.io work with nodejs express on the same server.
So, YES, you can do this.
In fact, every webSocket connection starts with an HTTP request (which is then upgraded to the webSocket protocol) so you must have a web server running on the server that handles webSockets anyway.
socket.io simply hooks into one route on the express web server that is used to initiate all socket.io webSocket connections and handles things from there.
Here's one example taken directly from the socket.io doc:
var app = require('express').createServer();
var io = require('socket.io')(app);

app.listen(80);

app.get('/', function (req, res) {
    res.sendfile(__dirname + '/index.html');
});

io.on('connection', function (socket) {
    socket.emit('news', { hello: 'world' });
    socket.on('my other event', function (data) {
        console.log(data);
    });
});
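Note that this particular example uses the old express.createServer() API from Express 2.x, which no longer exists in current Express versions. With modern Express, the same setup follows the pattern quoted from the npm docs above; a sketch (it assumes index.html sits next to the server file):
var app = require('express')();
var server = require('http').createServer(app);
var io = require('socket.io')(server);

app.get('/', function (req, res) {
    res.sendFile(__dirname + '/index.html');
});

io.on('connection', function (socket) {
    socket.emit('news', { hello: 'world' });
    socket.on('my other event', function (data) {
        console.log(data);
    });
});

server.listen(80);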

Need for http.createServer(app) in node.js / express

Using node and express, the below works just fine.
var app = express();
app.listen(app.get('port'), function() {
});
I assume that a server is created implicitly in the above construct.
When adding socket.io, I've seen the following being done.
var app = express();
var server = http.createServer(app);
var io = require('socket.io').listen(server);
app.listen(app.get('port'), function() {
});
What is the need for explicitly adding http.createServer(app)? Won't the creation of an additional server mess things up? Or, to put it another way, is it OK to create many more http.createServer(app) servers?
In either case, only one server is created. When using socket.io, you share the same http server between socket.io and express. Both libraries attach event listeners to the same server and get a chance to respond to the same events. They cooperate nicely because socket.io only handles the requests relevant to it and express handles all the non-webSocket requests. And just FYI, you cannot create more than one server on the same port: only one process can listen on a TCP port at a time in the OS, so the second one would fail with an error when attempting to bind an in-use port.
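In other words, what matters is that the server instance you hand to socket.io is the one that actually listens. A minimal sketch of the variant that uses the server returned by app.listen() (port number just for illustration):
var express = require('express');
var app = express();

// app.listen() creates the http server internally and returns it
var server = app.listen(3000, function () {
    console.log('listening on port 3000');
});

// attach socket.io to that same server instance
var io = require('socket.io').listen(server);
io.on('connection', function (socket) {
    // handle the socket here
});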

http server and web sockets from separate servers

It's pretty easy to configure a http server (using express) and a socket server (socket.io) assigned to it:
var app = require('express')();
var http = require('http').Server(app);
var io = require('socket.io')(http);
How can I run http server and socket server in two different node.js instances?
My idea is to improve performance this way, by releasing the http node instance from the responsibility of also sending notifications back to the clients.
In a regular Socket.IO + Express app, Socket.IO intercepts requests starting with /socket.io/.
You may set up Nginx (or any other webserver that supports proxying) listening on port 80, and make it proxy to the Socket.IO process if the request path starts with /socket.io/, and to the Express process otherwise.
Edit: To set up Socket.IO in separate process you may use the following code:
var io = require('socket.io')();
io.on('connection', function (socket) {
    //here you can emit and listen to messages
});
io.listen(3000);
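A client would then connect directly to that port (or through the proxy path described above). A minimal Node-side sketch, assuming the standalone Socket.IO process above is running on port 3000:
var socket = require('socket.io-client')('http://localhost:3000');
socket.on('connect', function () {
    //connected to the standalone Socket.IO process
});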
