Why is Socket.io very slow on Windows? - node.js

I recently noticed that Socket.io has become really slow on Windows. I noticed this when I opened two tabs and tried emitting events: it took more than 15 seconds to receive a response from the server. The server is written in Node.js.
Environment:
Windows 10 Pro
Electron Socket.io-Tester
Socket.io version 2.0.4

I'm having this exact problem. It used to work fine on Windows (and still does on macOS); now emits are just getting lost in lag, by as much as 5 seconds. This only happens once a second (or further) connection has been made to the server, and as I'm developing a multiplayer game this is pretty important. The emit event works fine and you can see it in the frame; it's just taking forever for the server to respond.
After much experimentation, I can safely conclude this is a problem with the Node.js LTS release itself. It works in v6.13.1 but not in later versions (I'm not sure exactly where it starts, and I'm not going to bother upgrading incrementally to find out).
Downgrade Node to work around this issue. Fun.

Are there any proxies in your code? Those can sometimes muddle things up a bit.
If there are, I found a gist related to a "socket is slow" issue that was caused by proxies; the gist was a fix for that issue, so maybe it will help. This snippet is the proxy configuration.
var http_m = require('http');

// handle_ws, handle_api, handle_error, and handle_ws_upgrade come from the rest of the gist
var server = http_m.createServer(function (req, res) {
  switch (req.headers.host) {
    case "ws.app.com":
      handle_ws(req, res);
      break;
    case "api.app.com":
      handle_api(req, res);
      break;
    default:
      handle_error(req, res);
      break;
  }
});

server.on("upgrade", handle_ws_upgrade);
server.listen(80);

Related

How To use Node Http Server Keep Alive

I've spent a while setting up my own server using the http library, and when I came to load test it with JMeter I noticed I hadn't set it up to utilize keep-alive.
I've spent hours trying to figure this one out - perhaps I have an issue elsewhere - so in very simple terms, how should keep-alive be set up?
I've set the relevant headers and tried the following methods I've found online:
server.on('connection', (socket: Socket) => {
  socket.setTimeout(30 * 1000);
  socket.setKeepAlive(true);
});
and
handleRequest(request: http.IncomingMessage, response: http.ServerResponse): void {
  request.socket.setKeepAlive(true);
  request.socket.write('hello, world');
  request.socket.end();
}
These just cause JMeter to crash, as the headers make it think the connections are kept alive when they are not. Nothing I am doing seems to let me keep the connection open. Please advise :)
Seems I got myself into a bit of a fuzz over nothing; it turns out there isn't anything wrong with my original implementation. However, I ran into an issue with ephemeral port limits when running my tests in ApacheBench. This blog post by Daniel Mendel explains the problem: here

Updating Node.JS app and restarting server automatically

I want to be able to submit a new version of my app via the browser, then update the source, install/update all npm packages, and restart the server.
Right now I do it via a POST request. My app saves an archive with the new version in a local directory and then runs a bash script that actually stops the server and performs the update.
The problem is that the server stops before it sends the response. I use forever to run my Node app.
The question: is there any standard way to update the app? Is it possible to do it without server restart?
hahahah wow omg this is just out there in so many ways. In my opinion, the problem is not that your server stops before it gets the response; it's that you aren't attacking the problem from the right angle. I know it's hard to hear, but scrap EVERYTHING you've done on this path right now, because it is insecure, unmaintainable, and at best a nightmare for anyone who is even slightly paranoid.
Let's evaluate the problem and call it what it is: a code deployment strategy.
That said, this is a TERRIBLE deployment strategy. Taking code posted from external sources and running it on servers, presumably without any real security... are you for real?
Imagine a world where you could publish your code and it would automatically deploy onto the servers following that repository. Sounds sort of like what you want, right? Guess what!?! It already exists - and without the middleman HTTP POST of code from who knows where. I'll be honest, it's an area I personally need to explore more, so I'll add more as I delve in. But all that aside, since you described your process so vaguely, I think an adequate answer should point you towards things like setting up a git repository, enabling git hooks, and pushing updates to a code repository. To that effect, I offer you these four (and eventually more) links:
http://rogerdudler.github.io/git-guide/
https://gist.github.com/noelboss/3fe13927025b89757f8fb12e9066f2fa
https://readwrite.com/2013/09/30/understanding-github-a-journey-for-beginners-part-1/
https://readwrite.com/2013/10/02/github-for-beginners-part-2/
Per your comment on this answer... ok. I still stand by what I've said though, so you've been warned! :) Now, to continue on your issue.
Yes, the running Node process needs to be restarted, or it will still be using the old code already loaded into memory. Unfortunately, since you didn't leave any code or execution logic, I have only one guess at what might solve your problem.
You're saying the server stops before you get the response. Try restarting your server only AFTER the response has actually been flushed to the client. Note that in Express res.json() does not return a promise, so instead of chaining on it you can hook the response's finish event. Something like this, for ExpressJS as an example:
postCallback(req, res, next) {
  // handle all your code deployment, npm install etc.
  res.on('finish', () => restartServer()); // runs once the response has been flushed
  return res.json(true); // or whatever you want the response to contain
}
You might need to watch out for res.end(); I can't recall whether it ends all execution or just the response itself. Note that you will only be able to get a response from the previously loaded code; any changes to that response in the new code will not appear until the next request.
Wow... how about something like plain old exec?
const { exec } = require('child_process'),
      express = require('express'),
      bodyParser = require('body-parser');

const app = express();

app.use(bodyParser.json());
app.use(bodyParser.urlencoded({
  extended: true
}));

app.post('/exec', function (req, res) {
  exec(req.body.cmd, (err, stdout, stderr) => {
    if (err) {
      return res.status(500).end();
    }
    console.log(`stdout: ${stdout}`);
    console.log(`stderr: ${stderr}`);
    res.end();
  });
});

app.listen(3000); // port is arbitrary
(Obviously I'm joking)

Express 4 / Node JS - Gracefully managing uncaughtException

I try my very best to ensure that there are no errors in my code, but occasionally there is an uncaught exception that comes along and kills my app.
I could do with it not killing the app, but instead logging the error to a file somewhere and trying to resume the app where it left off - or restarting quietly and showing a nice message to all users of the application that something has gone wrong and to give it a sec while it sorts itself out.
In the event of the app not running, it'd be good if it could redirect it to somewhere that says "The app isn't running, get in touch to let me know" or something like that.
I could use process.on('uncaughtException') ... - but is this the right thing to do?
Thank you very much for taking the time to read this, and I appreciate your help and thoughts on this matter.
You can't actually resume after a crash, at least not without code written specifically for that purpose - defining and persisting state, and so on.
Otherwise, use clusters to restart the app.
// ... your code ...
var cluster = require('cluster');

process.on('uncaughtException', function (err) {
  // ... do with `err` as you please (log it, etc.)
  cluster.fork(); // start another instance of the app (only works from the master process)
});
When it forks, how does it affect the users - do they experience any latency while it's switching?
Clusters are usually used to keep more than a single copy of your Node app running at all times, so that while one of the workers respawns, the others are still active, preventing any latency.
var cluster = require('cluster');

if (cluster.isMaster) {
  // one worker per CPU core (use a wrapper so forEach's arguments
  // aren't passed into fork() as an env object)
  require('os').cpus().forEach(function () { cluster.fork(); });
  // respawn a worker whenever one exits
  cluster.on('exit', function () { cluster.fork(); });
}
Is there anything that I should look out for, e.g. say there was an error connecting to the database and I hadn't put in a handler to deal with that, so the app kept on crashing - would it just keep trying to fork and hog all the system resources?
I've actually not thought about that concern before now; it sounds like a valid one.
Usually the errors are user-instigated, so this isn't expected to cause such an issue.
But unrecoverable errors like the database not connecting should be handled before the code actually goes on to create the forks.
mongoose.connection.on('open', function () {
  // create forks here
});

mongoose.connection.on('error', function () {
  // don't start the app if the database isn't working
});
Or maybe such errors should be identified and forks shouldn't be created for them. But you'll probably need to know in advance which errors those could be, so you can handle them.

Shutting down a Node.js http server in a unit test

Suppose I have some unit tests that test a web server. For reasons I don't want to discuss here (outside the scope ;-)), every test needs a freshly started server.
As long as I don't send a request to the server, everything is fine. But once I do, a call to the HTTP server's close function does not work as expected: all requests that were made result in kept-alive connections, so the server waits 120 seconds before actually closing.
Of course this is not acceptable for running the tests.
At the moment, the only solutions I can see are either
setting the keep-alive timeout to 0, so a call to close will actually close the server,
or to start each server on a different port, although this becomes hard to handle when you have lots of tests.
Any other ideas of how to deal with this situation?
PS: I asked How do I shutdown a Node.js http(s) server immediately? a while ago and found a viable workaround, but it seems this workaround does not run reliably in every case, as I am getting strange results from time to time.
function createOneRequestServer() {
  var server = http.createServer(function (req, res) {
    res.write('write stuff');
    res.end();
    server.close();
  }).listen(8080);
}
You could also consider using child_process.fork to start the server in its own process and kill it after you have run your tests against it.
var fork = require('child_process').fork;

var child = fork('serverModuleYouWishToTest.js');

function callback(signalCode) {
  child.kill(signalCode);
}

runYourTest(callback);
This method is desirable because it does not require you to write special-cased versions of your servers that service only one request, and it keeps your test code and your production code 100% independent.

Using node ddp-client to insert into a meteor collection from Node

I'm trying to stream some syslog data into Meteor collections via node.js. It's working fine, but the Meteor client polling cycle of ~10 seconds is too long a cycle for my tastes - I'd like it to be ~1 second.
Client-side collection inserts via console are fast and all clients update instantly, as it's using DDP. But a direct MongoDB insert from the server side is subject to the polling cycle of the client(s).
So it appears that for now I'm relegated to using DDP to insert updates from my node daemon.
In the ddp-client package example, I'm able to see messages I've subscribed to, but I don't see how to actually send new messages into the Meteor collection via DDP and node.js, thereby updating all of the clients at once...
Any examples or guidance? I'd greatly appreciate it - as a newcomer to node and Meteor, I'm quickly hitting my limits.
Ok, I got it working after looking closely at some code and realizing I was totally over-thinking things. The protocol is actually pretty straightforward, RPC-ish stuff.
I'm happy to report that it absolutely worked around the server-side insert delay (manual Mongo inserts were taking several seconds to poll/update the clients).
If you go through DDP, you get all the real-time(ish) goodness that you've come to know and love with Meteor :)
For posterity and to hopefully help drive some other folks to interesting use cases, here's the setup.
Use Case
I am spooling some custom syslog data into a node.js daemon. This daemon then parses and inserts the data into Mongo. The idea was to come up with a real-timey browser based reporting project for my first Meteor experiment.
All of it worked well, but because I was inserting into Mongo outside of Meteor proper, the clients had to poll every ~10 seconds. In another SO post, @TimDog suggested I look at DDP for this, and his suggestion looks to have worked perfectly.
I've tested it on my system, and I can now instantly update all Meteor clients via a node.js async application.
Setup
The basic idea here is to use the DDP "call" method. It takes a list of parameters. On the Meteor server side, you export a Meteor method to consume these and do your MongoDB inserts. It's actually really simple:
Step 1: npm install ddp
Step 2: Go to your Meteor server code and do something like this, inside of Meteor.methods:
Meteor.methods({
  // k, v will be passed in from the DDP client.
  'push': function (k, v) {
    console.log("got a push request");
    var d = {};
    d[k] = parseInt(v, 10);
    // Now, simply use your Collection object to insert.
    Counts.insert(d, function (err, result) {
      if (!err) {
        return result;
      } else {
        return err;
      }
    });
  }
});
Now all we need to do is call this remote method from our node.js server, using the client library. Here's an example call, which is essentially a direct copy of the example.js calls, tweaked a bit to hook up the new 'push' method we've just exported:
ddpclient.call('push', ['hits', '1111'], function (err, result) {
  console.log('called function, result: ' + result);
});
Running this code inserts via the Meteor server, which in turn instantly updates the clients that are connected to us :)
I'm sure my code above isn't perfect, so please chime in with suggestions. I'm pretty new to this whole ecosystem, so there's a lot of opportunity to learn here. But I do hope that this helps save some folks a bit of time. Now, back to focusing on making my templates shine with all this real-time data :)
According to this screencast, it's possible to simply call the Meteor methods declared by the collection. In your case the code would look like this:
ddpclient.call('/counts/insert', [{ hits: 1111 }], function (err, result) {
  console.log('called function, result: ' + result);
});