I have a fairly massive RequireJS-based app that runs unbundled locally. A few hundred JS files get loaded asynchronously. This is pretty quick locally and generally not a big deal, but after maybe 10-20 page refreshes Connect starts hanging for some reason. I got a half-decent message once when I opened a different page and Chrome indicated "waiting for available socket."
I'm guessing that at some point a request ends up hanging and its connection never closes, and eventually enough of these stuck connections leave Node + Connect unable to accept any more requests. Has anyone experienced this, and what is the solution? Is there a way to time out or reject requests from the server side?
Here is my Connect server script:
var connect = require('connect');
var http = require('http');

var app = connect()
    .use(connect['static'](__dirname))
    .use(function (req, res) {
        'use strict';
        res.setHeader('Access-Control-Allow-Origin', '*');
        // used to stub out ajax requests
        if (req.url.indexOf('ajax/') !== -1) {
            res.writeHead(200, { 'Content-Type': 'application/json' });
            res.end(JSON.stringify({}));
        }
    });

var server = http.createServer(app);
server.listen(3000, function () {
    'use strict';
    console.log('server is listening on port 3000');
});
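On the last question, one server-side option worth sketching (a rough sketch only, assuming the same Connect + http.Server setup as above): http.Server exposes setTimeout, which destroys sockets that sit idle too long instead of letting them pile up.
var connect = require('connect');
var http = require('http');

var app = connect().use(connect['static'](__dirname));
var server = http.createServer(app);

// With no 'timeout' listener attached, Node destroys the socket once it has
// been idle this long (the default is 2 minutes).
server.setTimeout(30 * 1000);

server.listen(3000);
Whether that addresses the underlying hang or just papers over it is a separate question.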
Another update:
The problem occurs when running on localhost as well. Since I figured out that the problem comes from the proxy server, here's its code:
var serverBouncer = bouncy(function(req, res, bounce) {
    var path = req.url;
    var url = req.headers.host;
    if (typeof url !== "string")
        return;
    var urlArray = url.split('.');
    var bouncePort = port;
    if (!isNaN(urlArray[0]))
        bouncePort = parseInt(urlArray[0]);
    else if (String(urlArray[0]).toLowerCase() === "www" && !isNaN(urlArray[1]))
        bouncePort = parseInt(urlArray[1]);
    bounce(bouncePort);
});
serverBouncer.listen(80);
Update:
I found where the problem came from!!! But I still need to find the solution... There seem to be issues with using newer versions of Socket.IO (>= 1.0) behind a proxy server (bouncy, in my case).
I recently updated Socket.IO from v0.9.16 to v1.4.5, as well as adding Express to the mix. However, now I cannot open multiple tabs (the number seems to vary) in Chrome and Firefox without experiencing strange issues (Edge is the only one that works well). A tab either hangs, or partially loads HTML and other resources before it hangs.
After waiting, I often get the error:
Failed to load resource: the server responded with a status of 400 (Bad Request)
When I close one of the tabs that's been hanging, it unblocks the other tabs that were also hanging.
The issues were not present before going through with the changes listed above.
I've been doing research for 2 full days and just now decided to post this, as I know it's very vague and I'm probably not providing enough information. As much as I'd like to, it would take a very long time to remember and list everything I tried during that time.
Using Windows 10 with Chrome v51.0.2704.103, Firefox v43.0.1. The server (CentOS) is using Node v6.2.2 with mainly the following modules:
express#4.14.0
npm#3.9.5
socket.io#1.4.5
Here's some relevant server code:
var port = 8502;

var socketio = require('socket.io');
var express = require("express");
var http = require('http');

var app = express();
var server = http.createServer(app);
var io = socketio.listen(server);
server.listen(port);

app.get('/', function(req, res, next) {
    // Returning index.html
});

io.on("connection", function(socket) {
    // Some events...
});
Here's a bit of the client code:
var client = io.connect();
client.on('connect', function() {
    // Some events
});
You're binding before the server is listening; try something like this:
var express = require('express');
var io = require('socket.io')();

var PORT = 8502; // the port from the question
var app = express();
var server = app.listen(PORT, function () {
    console.log('Example app listening on port ' + PORT + '!');
});
io.listen(server); // attach Socket.IO once the server exists
I managed to replace the bouncy module with nginx. See my other question for the solution.
nginx : redirect to port according to domain prefix (dynamically)
Please excuse any noobiness, I'm learning. :)
I have Socket.IO set up so that I can use io.sockets.emit inside of my routes, and I have that working. There are a few problems.
(SOLVED? SEE EDIT 3) To use it, I cannot start with the word socket; I have to start with io or I get "ReferenceError: socket is not defined." I'd like to be able to use socket.broadcast.emit to emit the event to all clients except the current user. Right now I'm having to do a check on the client side to not execute the event if it's the current user, and it's becoming a real headache as I have to emit more events as my project progresses.
(SOLVED, SEE EDIT 1 & 2) I have to run the application with node app.js and restart the server manually every time I make a server-side change. When I run nodemon, I get "Port 3000 is already in use." I feel that this must be related to the following...
(SOLVED, SEE EDIT 2) When pushing to Heroku, I have the port from the code below changed from 3000 to 80 in bin/www and app.js, but it does not work (I can see a 404 error for sockets in the console). If this and #2 are caused by dealing with http/ports in both places, how do I properly set this up and why does node app.js work?
I only need to run Socket.IO on the route shown below (battlefield). Am I already doing this with require('./routes/battlefield')(io)?
bin/www
var app = require('../app');
var http = require('http');

// normalizePort comes from the rest of the generated bin/www (not shown)
var port = normalizePort(process.env.PORT || '3000');
app.set('port', port);

var server = http.createServer(app);
server.listen(port);
app.js
var express = require('express');

var app = express();
var http = require('http').Server(app);
http.listen(3000);
var io = require('socket.io')(http);
app.set('socketio', io);
var battlefield = require('./routes/battlefield')(io);
battlefield.js
var express = require('express');
var router = express.Router();

var returnRouter = function(io) {
    router.get('/', function(req, res, next) {
        // other stuff
        io.sockets.emit('message', 'This works');
        socket.broadcast.emit('message', 'Socket is undefined');
    });
    return router;
};

module.exports = returnRouter;
I tried wrapping my routes in io.on('connection', function (socket) { to be able to use socket, and instead of 'Socket is undefined,' the event does not occur.
var returnRouter = function(io) {
    io.on('connection', function (socket) {
        router.get('/', function(req, res, next) {
            // other stuff
            socket.emit('message', 'This is never emitted');
        });
    });
    return router;
};
I apologize for such a lengthy question. THANK YOU for any help! 💜
EDIT1: Writing out this question helped me understand the problem better. I commented out server.listen(port); in my bin/www and nodemon now works. However, the app crashes on Heroku. My Procfile is web: node ./bin/www... does that need to be changed?
EDIT2: After figuring out Edit1 and a bit of Googling, I found that I can't have server.listen(); (bin/www) and http.listen(3000); (app.js).
In bin/www, I removed server.listen();.
In app.js, for clarity's sake I changed var http = ... to var server = ... and had it listen on process.env.PORT || '3000', taken from bin/www. I also removed app.set('socketio', io); because it looks like that was doing nothing... I wonder why it was in there.
app.js
var app = require('express')();
var server = require('http').Server(app);
var io = require('socket.io')(server);
var port = process.env.PORT || '3000';
server.listen(port);
This also makes Heroku work because of process.env.PORT, hurray! I'm guessing node app.js worked because I was initializing the app in app.js; I guess bin/www is not executed when you do that?
I still need help with #1 (using socket.broadcast.emit) 😇.
EDIT 3: Well, it took me literally the entire day, but I believe I have it figured out, with one quirk. Of course I couldn't use socket; it is a parameter given on connection. I also need to access socket across different routes and found this SO question. This is what I ended up doing in battlefield.js:
var returnRouter = function(io) {
    var socket;
    router.get('/', authenticatedUser, function(req, res, next) {
        io.on('connection', function(client) {
            socket = client;
        });
        // other stuff
        res.render('battlefield', {/* data */});
        setTimeout(function() { socket.emit('test', 'It works!'); }, 500);
    });
    router.post('/', function(req, res, next) {
        // socket can be accessed
    });
    return router;
};

module.exports = returnRouter;
(Note: I took out a lot of my own code, so I don't know if this is copy-and-paste ready, and you should probably check that socket is not null.)
Without setTimeout, socket is undefined on GET '/'. To my understanding, the page must render first... Strange that 200 ms sometimes doesn't work for me and 500 ms does. I can leave it at 500 ms, but this is for a game, so time is pretty important.
My questions now are:
Can this be improved? Is there a way I can do this without setTimeout? Am I 'connecting' clients properly with this code, and am I (question #4 up there^) using Socket.IO efficiently?
P.S. If no one answers ^ these questions, I'll edit this, answer the question, and accept my answer as best answer.
Using sockets inside your Node routing is not that useful.
Whenever you navigate to a different namespace (e.g. www.example.com --> www.example.com/some-name-space), your front-end variables are deleted and you need to resend them. This works great if you pass an object along with the GET request for that namespace, but it doesn't need sockets.
It's done like this in your router file:
var canAlsoBePassed = {some: "things"};

router.get('/', function(req, res, next) {
    res.render('index', { items: "Can be passed directly", variables: canAlsoBePassed });
});
The best kinds of applications for sockets are single-page apps, or replacing AJAX requests. Another great thing sockets allow is for the server to push information without the client asking for it.
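For instance, a tiny sketch of that server push (the 'tick' event name is just for illustration, using the io instance from your server code):
// Server side: push to every connected client once a second, no request needed.
setInterval(function () {
    io.emit('tick', new Date().toISOString());
}, 1000);
Each client just listens with client.on('tick', ...) and never has to ask.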
To answer your question about setTimeout: no, you don't need it.
Make sure the socket script running on your client side waits for the document to be loaded:
$(document).ready(function() {
When an io.on('connection') event fires on your server side, you know you have a new client to serve.
Emit an event from the server side, something like a welcome event, that puts the client in a specific room. Once you have them in that room, you can listen for any events emitted to that room; a rough sketch of that flow follows.
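Here is a minimal sketch of that idea (the 'welcome', 'game room', and 'room message' names are assumptions for illustration, not from your code):
// Server side: put each new client in a room and greet it.
io.on('connection', function (socket) {
    socket.join('game room');             // rooms are joined on the server
    socket.emit('welcome', 'game room');  // tell this client which room it is in
    io.to('game room').emit('room message', 'a new player joined');
});

// Client side: connect once the document is ready, then just listen.
$(document).ready(function () {
    var client = io();
    client.on('welcome', function (room) {
        console.log('joined ' + room);
    });
    client.on('room message', function (msg) {
        console.log(msg);
    });
});
From there, any route handler that has io in scope can talk to the whole room with io.to('game room').emit(...), without keeping a reference to an individual socket.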
See the official Socket.IO docs:
Custom namespaces
To set up a custom namespace, you can call the of function on the server-side:
var nsp = io.of('/my-namespace');
nsp.on('connection', function(socket){
    console.log('someone connected');
});
nsp.emit('hi', 'everyone!');
On the client side, you tell Socket.IO client to connect to that namespace:
var socket = io('/my-namespace');
Might not be the most accurate answer to your questions but I hope it pushes you in the right direction.
I've got a Vagrant box set up to port-forward a socket.io application from internal port 5000 to external port 8081. When I try to connect from the client, it starts long-polling the connection, but I don't see any kind of response from the server, and the server app never registers a connection attempt. The connection doesn't fail or return any error response code though; it just returns a 200 code with a blank response.
// Import utilities
var http = require('http'),
    socketIO = require('socket.io'),
    querystring = require('querystring');

// Init servers/external connections
var server = http.createServer(function baseHandler(req, res) {
        // console.log(req.headers);
        res.writeHead(200);
        res.end(JSON.stringify({
            message: 'This server only supports WebSocket connections'
        }));
    }),
    io = socketIO(server);

server.listen(process.env.socket_port || 5000, function() {
    var sockets = [];
    console.log('App connected');
});

io.on('connection', function (socket) {
    console.log('Socket connected');
    console.log('Socket in rooms ' + socket.rooms.join(', '));
});
The same app works just fine when I connect to it running directly on my PC, so my code doesn't seem to be the problem here, especially given that it basically duplicates the basic example in the docs. Not really sure how to solve this from here.
This is one of those really stupid bugs which crop up when you're working on two different problems with the same codebase at the same time. Here's the client-side code line which was breaking:
var socket = io('127.0.0.1:8081/?access_token=1d845e53c4b4bd2e235a66fe9c042d75ae8e3c6ae', {path: '/auth/socket.io'});
Note that the path key is set to point to a subdirectory, /auth, which is a leftover from my work on getting nginx to proxy a folder to an internal port.
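Presumably (the post doesn't show it, so this is an assumption) the fix is simply to drop the stale path option so the client falls back to the default /socket.io endpoint:
// Assumed fix: no custom path, so the default '/socket.io' path is used.
var socket = io('127.0.0.1:8081/?access_token=1d845e53c4b4bd2e235a66fe9c042d75ae8e3c6ae');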
I'm using iisnode to host a node app. I'm having trouble actually deploying it under my domain name. Here's the main file with two different starting points. The uncommented code is just a simple server that works correctly when accessed via my domain (so iisnode is mapping and handling the node app correctly). The commented code is the entry point for the Express app I am working on; this works when I view it from localhost, but when I attempt to access it via my domain I receive a 'cannot GET application.js' error.
var http = require('http');

http.createServer(function (req, res) {
    res.writeHead(200, { 'Content-Type': 'text/plain' });
    res.end('Hello, world!');
}).listen(process.env.PORT);

//require('./app/init');
//var server = require('./app/server');
//module.exports = server.start(process.env.NODE_ENV);
Here is my server.js file. I think it's a routing issue; I've substituted a console.log function for the indexRoute function, and it never fires. But I still don't understand why this works correctly when accessed via localhost but not under my domain.
var express = require('express');
var routes = require('./routes');
var app = express();

function createApplication(environment) {
    app.get('/', routes.indexRoute);
    app.listen(process.env.PORT);
    return app;
}

module.exports.start = createApplication;
I can message a git link to the full app if anyone is interested.
Try specifying that you want to listen on all IP addresses, not just localhost, by adding '0.0.0.0' as a parameter to listen. Also add a callback to see what happened:
app.listen(process.env.PORT, '0.0.0.0', function() {
    // app.url does not exist on an Express app, so log the port instead
    console.log('Started listening on port %s', process.env.PORT);
});
On the server side, I have this nodejs code. I simplified the code here to make the question clear:
var express = require('express');
var app = express();
var spawn = require('child_process').spawn;

app.get('/print_N', function(req, res) {
    var child = spawn('python', ['some_tick_data_process.py']);

    req.on('close', function() {
        console.log('req to close with pid=' + child.pid);
        child.kill('SIGTERM');
    });

    child.stdout.pipe(res);
    child.stdout.on('error', function(err) {
        console.log(child.pid + " error !!!");
        console.log(err);
    });
});

app.listen(3000);
console.log('Listening on port 3000...');
The underlying some_tick_data_process.py is quite I/O-intensive. On the client side, I have a small Python application to read the stream. The problem is that some requests hit the "req to close" handler (the request closes and the child gets killed). With a small number of processes, it is OK. I tried:
var http = require('http');
http.globalAgent.maxSockets = 100;
But it doesn't help. Please share your thoughts, thanks!
After leveraging P.T.'s advice, the fix is:
app.get('/:version/query', function(req, res) {
    res.setTimeout(4 * 60 * 60 * 1000); // raise the Node.js socket timeout to 4 hours
    // ...
});
The Express request's 'close' event means that the connection closed (I believe the Express request inherits this from the underlying http.IncomingMessage).
This could be either because the client gave up, or because the Node.js socket timeout of 2 minutes was hit.
You need to figure out which side of the connection is giving up. I would've thought it was the client side, but was surprised to discover the Node.js timeout. See http://nodejs.org/api/http.html#http_request_setsocketkeepalive_enable_initialdelay for setting the socket timeout on the request's socket.
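For reference, a rough sketch of those two knobs inside the route from the question (the 4-hour and 60-second values are illustrative, not prescriptive):
app.get('/print_N', function (req, res) {
    // Per-request: lift the default 2-minute inactivity timeout on this socket.
    req.socket.setTimeout(4 * 60 * 60 * 1000);

    // Optional: send TCP keep-alive probes so a quiet but healthy connection stays up.
    req.socket.setKeepAlive(true, 60 * 1000);

    // ... spawn the child and pipe its stdout to res, as in the question.
});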
See also: node.js http.IncomingMessage does not fire 'close' event