I have a big problem with socket.io that makes my real-time game unplayable.
I want my Node.js server to send data every 20 ms to every connected client (each socket).
But I notice that the client does not receive a message every 20 ms; it receives several at a time, every 100-200 ms or more, which makes the game totally unplayable (very big lag).
Locally it works very well.
Can you help me please ?
Here is the relevant piece of my function that sends data (server side):
function sendData() {
    playersList.forEach((player) => {
        let data = [];
        // code which adds data to the "data" variable
        socketsList[player.id].emit('refreshGame', JSON.stringify(data));
    });
    setTimeout(sendData, 20);
}
sendData();
This happens when there are network bottlenecks, typically because:
The packets you send are very big.
The network is slow.
To mitigate, optimize what you send, i.e., only send the data that is actually needed. For example, in a game with moving and non-moving components, send the non-moving components only once, even though you may send the moving components many times.
The other option would be to diagnose the network, which is outside the scope of this Stack Overflow question.
However, although these are the "facts", there are some unconventional or small adjustments that, based on my own experience, seem to fix this issue:
Get rid of the JSON.stringify in the emit call. socket.emit() serializes the payload for you automatically.
Place the setTimeout at the beginning of the function rather than at the end, so the time spent emitting doesn't stretch the interval between ticks.
Also, make sure that socketsList[player.id] is the player's actual socket. If you use socketsList.splice at some point, it will shift the indexes and send packets to the wrong clients.
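Here is a minimal sketch applying those adjustments, assuming playersList and socketsList exist as in your code (with socketsList keyed by player.id):

function sendData() {
    // Schedule the next tick first, so the time spent emitting
    // doesn't stretch the interval.
    setTimeout(sendData, 20);
    playersList.forEach((player) => {
        const data = [];
        // ...fill "data" with only what changed this tick...
        // No JSON.stringify: socket.io serializes the payload itself.
        socketsList[player.id].emit('refreshGame', data);
    });
}
sendData();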
Related
I am building my first web-based node.js application - an online game - as a hobby/project to try and teach myself how it all works.
I'm using socket.io to send real-time updates (who's in the lobby, points scored, etc.) to users, but I'm not sure whether I'm managing the sockets, and the information being sent through them, in the best way.
Whenever the game is updated, I'm sending an object to each user which updates everything at once, and a lot of the time the information being updated is actually staying the same. For example, if a user scores a point, an update is sent to everyone's browser to update the leaderboard, but that same socket.on handler is re-sending information, such as usernames, that stays the same throughout the game:
exampleObject = {
    "usernames": [username1, username2], // only gets updated in the browser once, but is sent every time
    "points": {
        "username1": 1, // different value with every update
        "username2": 3
    }
}
(The real object is quite a bit bigger than this)
Would it be more sensible to have a different socket.on function for every individual piece of information which needs updating, so I can then call them individually as and when required, or is there any sense in updating everything through one function? Any thoughts/advice would be greatly appreciated.
If you are regularly sending the same piece of information over and over, it makes sense to design a specific message that contains only that information, so you aren't repeatedly sending data that doesn't need to be sent. You can have as many different message types as you want, and you should use that freedom to design efficient messages, particularly for the most common ones.
Would it be more sensible to have a different socket.on function for every individual piece of information which needs updating, so I can then call them individually as and when required
Yes. Design efficient messages specifically for things you regularly send.
or is there any sense in updating everything through one function?
Only if you need to change lots of stuff at once. It's wasteful to include data in a frequent message that never changes and doesn't need to be sent.
It's perfectly fine to have different messages you send for different purposes and then the client has different listeners for those specific messages. At the same time, if you regularly send three pieces of data together, you probably wouldn't make a separate message for each piece of data - you'd put those three together such that your message structure aligns with your usage.
And, you can also have different messages for different purposes even if some data is in both messages.
One more note here. The title of your question "How should I manage the number of sockets in a node.js application?" seems to ask about managing the number of sockets. But, the rest of your question isn't about that at all. The rest of your question is about having different messages on the same socket. You don't need a new socket in order to define and use a different message. You can have thousands of different messages that you use all on the same socket connection. That's the whole architecture of socket.io. You send a message name and some data that goes with it. You can use a limitless number of separate message names all on the same connection.
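As a rough illustration of purpose-specific messages on a single connection (the event names and the client-side helpers renderLobby/updateScore are made up for this example):

// Sent once, only when lobby membership actually changes:
io.emit('lobbyUpdate', { usernames: ['alice', 'bob'] });

// Sent frequently, containing only the data that changed:
io.emit('scoreUpdate', { username: 'alice', points: 2 });

// Client side: one listener per message name, all on the same socket.
socket.on('lobbyUpdate', (msg) => renderLobby(msg.usernames));
socket.on('scoreUpdate', (msg) => updateScore(msg.username, msg.points));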
For example, let's say I have a random game in which I have 500 independent objects and 10 players.
An independent object is an object that moves in a specific direction on every update regardless of what players do (players never need to come into contact with these objects).
Now, if a player is shooting (let's say) a bullet, it is easier, because the bullet belongs to a specific player, so it's easier to avoid in-game lag. Let's look at something simpler, though, for example a player trying to update their position. The typical thing I would do on the client and server side would be this:
client side: update the coords of the player + send a message to the server as socket X
server side: receives the message from socket X, updates the coords of the player on the server side + sends a message with the coords of that same player to all other sockets
When you do the communication like this, everyone receives the new coords of the player and there is little to no lag. (It is also sufficient for objects like bullets, because they are created by a player event: firing.)
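In socket.io terms, that flow looks roughly like this (a sketch; "players" is an assumed server-side map of connected players):

// Client side: update locally, then tell the server.
socket.emit('move', { x: player.x, y: player.y });

// Server side: update the server's state, then relay to everyone else.
io.on('connection', (socket) => {
    socket.on('move', (coords) => {
        players[socket.id].x = coords.x;
        players[socket.id].y = coords.y;
        socket.broadcast.emit('playerMoved', { id: socket.id, x: coords.x, y: coords.y });
    });
});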
How do you handle 500+ independent objects that move in random directions at random speeds all across the map, and update them for all players efficiently? (Be aware that their direction and speed can change on contact with a player.) What I've tried so far:
1) Putting all of the movement + collision logic on the server side and notifying all clients with a setTimeout loop & io.emit.
Result: causes massive lag even with only 500+ objects and 4 connected players. All of the players receive the server's response far too slowly.
2) Putting all of the movement + collision logic on the client side and notifying the server about every object's position.
Result: to be honest, I couldn't detect much lag, but I am not sure this is the correct idea, because every time an object moves, I am literally sending a message to the server from each client to update that same object (the server is notified N times, once per connected client, about the same object). Handling this entirely on the client side is also a bad idea: when a player switches tabs (goes inactive), that player's browser stops executing JavaScript and the whole scheme breaks.
I've also noticed that games like agar.io, slither.io, diep.io, etc. do not really have hundreds of objects moving in various directions. In agar.io and slither.io you mainly have static objects (food) and players; in diep.io there are dynamic objects, but none of them move at very high speeds. How do people achieve this? Is there any smart way to achieve it with minimal lag?
Thanks in advance
Convert your user interactions to enumerated actions and forward those. Player A presses the left arrow which is interpreted by the client as "MOVE_LEFT" with possible additional attributes (how much, angle, whatever) as well as a timestamp indicating when this action took place from Player A's perspective.
The server receives this and validates it as a possible action and forwards it to all the clients.
Each client then interprets the action themselves and updates their own simulation with respect to Player A's action.
Don't send the entire game state to every client every tick; that's too bloated. The other side is being able to handle late or missing actions. One way of doing that is rollback, where you keep multiple sets of state and keep the game simulation going until a mismatch (a late or missing packet) is found. Then you revert to the last known-correct state and replay all the messages since then in order to bring the state back to correct. This is the idea behind GGPO.
I suggest also reading every article related to networking that Gaffer on Games goes into, especially What Every Programmer Needs To Know About Game Networking. They're very good articles.
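A minimal sketch of the enumerated-action idea (the event name, the set of valid actions, and the message shape are illustrative, not a fixed protocol):

// Client: translate raw input into an action, not a state dump.
socket.emit('action', { type: 'MOVE_LEFT', amount: 1, ts: Date.now() });

// Server: validate the action, then fan it out to all clients.
const VALID_ACTIONS = new Set(['MOVE_LEFT', 'MOVE_RIGHT', 'FIRE']);
socket.on('action', (action) => {
    if (!VALID_ACTIONS.has(action.type)) return; // drop impossible actions
    io.emit('action', { player: socket.id, ...action });
});

// Each client listens for 'action' and applies it to its own simulation,
// instead of receiving the whole game state every tick.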
I am using the net module in Node.js.
const net = require('net');

net.createServer(function (sock) {
    sock.on('data', function (data) {
        console.log(data); // "data" is a Buffer, not a string
    });
}).listen(3000); // port chosen for illustration
Then I tried to use 2000 TCP clients to send data to the server, to test how many clients it can support. For the first 20 minutes it ran OK, but after a while the data started sticking together. For example, the data from a client is in JSON format and looks like this:
'{"value":1,"name":"tom"}'
Each client sent data with a different name, and the value was incremented each time. On the server side, the data I got looked like this:
{"value":1,"name":"tom"}{"value":2,"name":"tom"}{"value":3,"name":"tom"}.
They stick together, and I have to split them up before saving them into MongoDB.
The situation got worse the longer the server ran: eventually the server couldn't receive any data at all while the clients were still sending.
How can I make the server read one item at a time, and keep working reliably when running for a long time? Thanks a lot.
The socket server doesn't know or care about your data format. No matter what, you need to split it up. It can be arbitrarily buffered or split before reaching your application. The fact that you're ending up with nice segments under lower load is just due to your payload fitting into a single packet, and the app keeping up reading buffers as they come.
The easiest thing to do is use a delimiter. Since you're using JSON, you can use a newline delimiter. (Just make sure each JSON object is formatted on a single line.)
Then you can use a transform stream that buffers the incoming data, waits for the delimiter, and emits parsed objects.
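A minimal sketch of such a transform stream, assuming one JSON object per line (error handling is omitted; a malformed line will throw):

const net = require('net');
const { Transform } = require('stream');

class LineParser extends Transform {
    constructor() {
        super({ readableObjectMode: true }); // we push parsed objects, not bytes
        this.buffer = '';
    }
    _transform(chunk, encoding, callback) {
        this.buffer += chunk.toString('utf8');
        const lines = this.buffer.split('\n');
        this.buffer = lines.pop(); // keep the trailing partial line for next time
        for (const line of lines) {
            if (line.trim()) this.push(JSON.parse(line));
        }
        callback();
    }
}

net.createServer((sock) => {
    sock.pipe(new LineParser()).on('data', (obj) => {
        console.log(obj); // one complete parsed object at a time
    });
}).listen(3000); // port chosen for illustration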
You could also try the stick library, a Node.js package intended to solve exactly this TCP "sticky packet" problem.
I have gone through many painful months with this issue and I am now ready to let this go to the bin of "what-a-great-lie-for-websites-nodejs-is"!
All Node.js tutorials discuss how to create a website. When done, it works. For one person at a time, though. All requests sent to the port get blocked in a first-come-first-served fashion. Why? Because most requests sent to the Node.js server have to be parsed, data requested from the database, data calculated and prepared, and the response sent back to the Ajax call. (This is a mere simple website example.)
Same applies for authentication - a request is made, data is parsed, authentication takes place, session is created and sent back to the requester.
No matter how you sugar coat this - All requests are done this way. Yes you can employ async functionality which will shorten the time spent on some portions, yes you can try promises, yes you can employ clustering, yes you can employ forking/spawning, etc... The result is always the same at all times: port gets blocked.
I tried responding with a reference so that we could use sockets to pass the data back and match it with the reference; that also blocked it.
The point of the matter is this: when you ask for help, everyone wants all sorts of code examples, but nobody ever gets to the task of helping with an actual answer that works. The whole wide world! Which leads me to the answer that Node.js is not suitable for websites.
I have read many questions about this, and all have been met with "you must code properly"! Really? Is there no skilled and experienced Node.js person who can lend an answer on this one?
Most Node.js coders come from the PHP side, and websites using PHP never have to use any workaround whatsoever in order to display a web page; they never block 2 people at the same time, thanks to the web server.
So how come the Node.js community cannot come to some sort of an answer on this one other than "code carefully"?
They want examples, and each example is met with "well, that is blocking code, do it another way", or "code carefully"! Come on, people!
Think of the following scenario: one user, one page with 4 lists of records. Theoretically, all lists should be filled with records independently. Because of how data is requested, prepared, and returned, each list in reality waits for the previous one to finish. And that is in one session alone.
What about 2 people, 2 sessions at the same time?
So my question is this: is Node.js suitable for a website, and if it is, can anyone show and prove this with a short piece of code? If you can't prove it, then the answer is: "Node.js is not suitable for websites!"
Here is an example based on the simplest tutorial and it is still blocking:
var express = require('express'),
    fs = require("fs");

var app = express();

app.get('/heavyload', function (req, res) {
    var file = '/media/sudoworx/www/sudo-sails/test.zip';
    res.send('Heavy Load');
    res.end();
    fs.readFile(file, 'utf8', function (err, fileContent) {
    });
});

app.get('/lightload', function (req, res) {
    var file = '/media/sudoworx/www/sudo-sails/test.zip';
    res.send('Light Load');
    res.end();
});

app.listen(1337, function () {
    console.log('Listening!');
});
Now, if you go to "/heavyload", it responds immediately, because that is the first thing sent to the browser, and then Node.js proceeds to read a heavy (large) file. If you then hit the second route, "/lightload", at the same time, you will see that it waits for the file read from the first call to finish before it produces any browser output. This is the simplest example of how Node.js simply fails to handle what would otherwise be simple in PHP and similar scripting languages.
As mentioned before, I have tried as many as 20 different ways to do this in my career as a Node.js programmer. I totally love Node.js, but I cannot get past this obstacle. This is not a complaint; it is a call for help, because I am at the end of the road with Node.js and I don't know what to do.
I thank you kindly.
So here is what I found out. I will answer it with an example of a blocking code:
for (var k = 0; k < 15000; k++) {
    console.log('Something Output');
}
res.status(200).json({ test: 'Heavy Load' });
This will block, because it has to run the for loop for a long time, and only after it finishes does it send the output.
Now if you do the same code like this it won't block:
function doStuff() {
    for (var k = 0; k < 15000; k++) {
        console.log('Something Output');
    }
}

doStuff();
res.status(200).json({ test: 'Heavy Load' });
Why? Because the function is run asynchronously! So how do I then send the resulting response to the requesting client? Currently I am doing it as follows:
Run the doStuff function
Send a unique call reference which is then received by the ajax call on the client side.
Put the callback function of the client side into a waiting object.
Listen on a socket
When the doStuff function is completed, it should issue a socket message with the resulting response together with the unique reference
When the socket on the client side gets the message with the unique reference and the resulting response, it will then match it with the waiting callback function and run it.
Done! A bit of a workaround (as mentioned before), but it's working! It does require a socket to be listening. And that is my solution to this port-blocking situation in Node.js.
Is there some other way? I am hoping someone answers with another way, because I am still feeling like this is some workaround. Eish! ;)
is NodeJS suitable for a website
Definitely yes.
can anyone show and prove this with a short code
No.
All websites using PHP never have to utilise any workaround whatsoever
in order to display a web page and it never blocks 2 people at the
same time.
Node.js doesn't require any workarounds either; it simply works without blocking.
To more broadly respond to your question/complaint:
A single node.js machine can easily serve as a web-server for a website and handle multiple sessions (millions actually) without any need for workarounds.
If you're coming from a PHP web server, then instead of trying to migrate existing code to a new Node website, first play with a simple online example of a Node.js + Express website. If that works well, start adding code that requires long-running work, like reading from DBs or reading/writing files, and verify that visitors aren't being blocked (they shouldn't be).
See this tutorial on how to get started.
UPDATE FOR EXAMPLE CODE
To fix the supplied example code, convert your fs.readFile call to fs.createReadStream. readFile is not recommended for large files; I don't think readFile literally blocked anything, but the need to allocate and move large amounts of bytes can choke the server. createReadStream works in chunks instead, which is much easier on the CPU and RAM:
var rstream = fs.createReadStream(file); // declared, so it doesn't leak a global
var length = 0;

rstream.on('data', function (chunk) {
    length += chunk.length;
    // do something with the chunk
});

rstream.on('end', function () { // done
    console.log('file read! length = ' + length);
});
After switching your code to createReadStream, I'm able to serve continuous calls to heavyload / lightload in ~3 ms each.
THE BETTER ANSWER I SHOULD HAVE WRITTEN
Node.js has a single-process architecture, but it has multi-process capabilities: using the cluster module, you can write master/worker code that distributes the load across multiple workers in multiple processes.
You can also use pm2 to do the clustering for you; it has a built-in load balancer to distribute the work, and it also allows scaling up/down without downtime, see this.
In your case, while one process is reading/writing a large file, other processes can accept incoming requests and handle them.
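A minimal sketch of that idea with the cluster module (a rough illustration, not production code):

const cluster = require('cluster');
const http = require('http');
const os = require('os');

if (cluster.isMaster) {
    // Fork one worker per CPU core.
    for (let i = 0; i < os.cpus().length; i++) {
        cluster.fork();
    }
} else {
    // Each worker runs its own server; incoming connections are
    // distributed across the workers.
    http.createServer((req, res) => {
        res.end('Handled by worker ' + process.pid);
    }).listen(1337);
}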
Imagine Agar.io. Unlike a chat app, the list of users (or players) and other environment objects will be constantly changing, for each player, as players move around the map. That is because each client can't receive updates about every object, because the map is too large and the lag would be too much. So which of the following methods of updating clients, with Socket.IO, would be more efficient:
Send an environment array containing data, which replaces the local arrays on each client.
Send individual messages when objects appear/disappear in a player's field of view, and tinker with the local arrays object by object.
If there is a better way than the above two, please outline it.
This is a multi-vector tradeoff decision so without some measuring and probably experimentation, we can't really tell you what situation is optimal. But, we can direct your thinking which you can hopefully use to finish the analysis.
First off, to scale and reduce lag, you want to:
Send fewer messages to each client.
Send smaller payloads with each message as long as it doesn't make item #1 worse (e.g. as long as it doesn't cause you to send more messages).
Reduce how often the server runs calculations and then sends messages.
To send fewer messages to each client you want to:
Reduce the scope of the map that the client gets sent updates about to only things that are closely in view (it sounds like you're already doing some of that).
Combine as much information as you can in each message that you are going to send to a client - make sure that you're never sending more than one message to a given client for a particular update.
To send smaller messages to each client you want to:
Reduce the size of the data you send to each client. This means that if some data has not changed since that last time you communicated with this client, then don't resend that data. This would suggest that your second option (client updates its own local array) is a better way to do it because you only have to send deltas to the client and it remembers previous state.
Carefully analyze the format of the data you're sending to the client and reduce its size wherever possible. Straight JSON is often not the most efficient way to send data if you're trying to optimize transmission size.
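For example, here is a sketch of a delta update with a compact payload (the event name, the field layout, and the client-side "world" map are all illustrative assumptions):

// Instead of resending a full object as verbose JSON:
//   { "id": 17, "position": { "x": 104.2, "y": 88.9 }, "angle": 1.57 }
// send only what changed, as a flat array the client knows how to decode:
socket.emit('delta', [17, 104.2, 88.9, 1.57]); // [id, x, y, angle]

// Client side: apply the delta to the locally cached object.
socket.on('delta', ([id, x, y, angle]) => {
    const obj = world[id]; // "world" = the client's local object map (assumed)
    obj.x = x;
    obj.y = y;
    obj.angle = angle;
});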