I'm developing a multiplayer turn-based game (e.g. chess) that should support a lot of players (that's the idea). My question is about a service I'm developing: the pairing system, which is responsible for pairing 2 players so they can start a room and begin playing.
So, this is the pairing service:
matchPlayers() {
    if (this.players.length >= 2) {
        let player1 = this.players.shift();
        let player2 = this.players.shift();
        if (player1 !== undefined && player2 !== undefined) {
            player1.getSocket().emit('opponent_found');
            player2.getSocket().emit('opponent_found');
            return this.createMatchInDataBaseApiRequest(player1, player2)
                .then(function (data) {
                    let room = new RoomClass(data.room_id, player1, player2);
                    player1.setRoom(room);
                    player2.setRoom(room);
                    return room;
                });
        }
    }
    return false;
}
At the entry point of the server, I push each new socket connection into an array, "PlayersPool"; this array holds the players waiting to be matched up.
Right now my approach is to pair users as they become available (FIFO, first in first out).
The problems (and questions) I see with this pairing system are:
It depends on new users: the code executes each time a new user connects. The flow is: a user connects, gets added to the pool, and we check whether there are users waiting to be paired. If yes, a room is created and they can play; if not, he stays in the waiting pool until a new user connects, the code runs again, and so on...
What would happen if, in some weird case (not sure if this could happen), 2 players were added to the waiting pool at the exact same time? This service would find the pool empty and would not create a room. To solve this, should I maybe have another service always running and checking the pool? What would be the best approach? Could this even happen? In which scenario?
Thanks for the help.
I'm guessing this particular code snippet is on the server? If so, assuming there is only one server, then there is no "race condition": node.js is single-threaded, as IceMetalPunk mentioned, so if you're running this function every time you add a player to this.players, you should be fine.
There are other reasons to examine the player pool periodically, though: players you've added to the pool may have disconnected (due to a timeout or closing the browser), so you should remove them. You also might want to handle players who have been waiting a long time: after X seconds, should you update the player on progress, calculate an estimated wait time for them, perhaps spawn an AI player for them to interact with while they wait, etc.?
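A periodic sweep along those lines could look like the sketch below. The `joinedAt` field, the `still_searching` event name, and the thresholds are assumptions for illustration; `getSocket()` mirrors the accessor in the question's code.

```javascript
// Hypothetical sweep of the waiting pool: drop dead connections and
// reassure players who have been waiting a while. Assumes each entry
// exposes getSocket() and records a joinedAt timestamp (in ms).
const WAIT_NOTICE_MS = 30000;

function sweepPool(pool) {
  for (let i = pool.length - 1; i >= 0; i--) {
    const player = pool[i];
    if (!player.getSocket().connected) {
      pool.splice(i, 1);            // remove disconnected players
    } else if (Date.now() - player.joinedAt > WAIT_NOTICE_MS) {
      player.getSocket().emit('still_searching'); // notify long waiters
    }
  }
}
```

You would then run something like `setInterval(() => sweepPool(PlayersPool), 5000)` alongside the existing match logic.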
You can run into a "race condition"; it's explained in the documentation of this package, which provides a locking mechanism:
https://www.npmjs.com/package/async-lock
That package is useful only if you run Node.js in a single process, meaning you are not running multiple servers or a Node cluster with multiple processes.
In that case (multiple servers or processes), you would have to implement a distributed locking mechanism, which is one of the most complex things in distributed computing; but today you can use the npm package for the Redlock algorithm, set up 3 Redis servers, and go.
That's too much overhead for a game that doesn't have players yet.
Node.js is not single-threaded; here is an explanation from one of its creators:
Morning Keynote- Everything You Need to Know About Node.js Event Loop - Bert Belder, IBM
https://www.youtube.com/watch?v=PNa9OMajw9w
Conclusion: keep it simple, run it in a single Node process, and use the "async-lock" package.
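To illustrate what such a lock buys you, here is a minimal sketch of the idea: serialize every pool mutation on a promise chain so two "simultaneous" joins can never both observe an empty pool. This is a stand-in for illustration only; the real async-lock package adds keyed locks, timeouts, and queue limits on top of this idea.

```javascript
// Minimal mutex sketch: run critical sections one at a time on a
// promise chain (async-lock provides a hardened version of this).
class SimpleLock {
  constructor() { this.tail = Promise.resolve(); }
  acquire(fn) {
    const run = this.tail.then(() => fn());
    this.tail = run.catch(() => {});  // keep the chain alive after errors
    return run;
  }
}

const lock = new SimpleLock();
const pool = [];

function onPlayerJoined(player) {
  // All pool mutations go through the lock, so joins are serialized.
  return lock.acquire(async () => {
    pool.push(player);
    if (pool.length >= 2) {
      const p1 = pool.shift();
      const p2 = pool.shift();
      return { p1, p2 };            // a pair ready for a room
    }
    return null;                    // still waiting for an opponent
  });
}
```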
If your server grows into an MMO, you will need to read about distributed computing:
How to do distributed locking:
https://martin.kleppmann.com/2016/02/08/how-to-do-distributed-locking.html
A book on data-intensive apps:
http://dataintensive.net/
Related
For context, I am making a multiplayer game using Node.js, and my DB is Postgres.
Players play one by one, and I save everything in the DB.
When the first user plays, they can't play again until the other player has played too.
What I am doing now is keeping a boolean on each player in the DB called "ableToPlay", which is true and then turns to false when it's not the user's turn.
The issue is that when a user spams the play button, and my DB is on a remote server, it takes time to update from true to false, letting the user play multiple times, which then causes the app to crash.
I am using an AWS microservices architecture, so the server must be stateless.
Is there any way I can save the game progress so that it is accessible to all my microservices?
How do you check the turn? Is it something like:
select turn from db
if turn == X then
    // allow the turn
    do all the logic
    update the turn to Y
endif
If so, "do all the logic" may be called several times, as several concurrent requests will all read turn == X.
This is a very common problem in programming; there are several approaches you could take.
Two key observations to address:
the same player should not take a turn twice in a row
while one player is making their turn, the other player must wait
The easiest way is to use a transaction in the DB while the turn is happening. For example, when player X makes their turn:
start transaction
update turn=X where turn=Y   (Y is the other player)
if the update succeeded (exactly one record was updated)
    do all the logic
commit the transaction
In this approach, each update waits for the previous one to finish, and the WHERE clause makes sure the same player won't take two or more turns in a row. The transaction isolation avoids running the turn logic concurrently.
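The transactional guard above could be sketched like this in Node. The `games` table, its `turn` and `id` columns, and the function shape are assumptions for illustration; `client` can be anything with a `query(text, values)` method, such as a node-postgres client.

```javascript
// Sketch of the transactional turn guard described above. The table
// and column names are assumed, not taken from the question.
async function takeTurn(client, gameId, player, opponent, doLogic) {
  await client.query('BEGIN');
  try {
    // Atomically claim the turn: succeeds only if it was the opponent's turn.
    const res = await client.query(
      'UPDATE games SET turn = $1 WHERE id = $2 AND turn = $3',
      [player, gameId, opponent]
    );
    if (res.rowCount !== 1) {
      await client.query('ROLLBACK');   // not this player's turn
      return false;
    }
    await doLogic();                    // "do all the logic"
    await client.query('COMMIT');
    return true;
  } catch (err) {
    await client.query('ROLLBACK');
    throw err;
  }
}
```

A spammed play button then results in one successful call and many calls that return false, instead of multiple applied turns.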
If you don't want to use the transaction, you could build a state machine, with states:
waitingForTurnX
makingTurnX
waitingForTurnY
makingTurnY
This would be a nice model to code, and these transitions can be handled without transactions:
update state=makingTurnX where state=waitingForTurnX
This approach also eliminates the race condition, because in the vast majority of databases an update to a single record is atomic.
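The transactionless variant can be modeled with a single compare-and-set per transition. The in-memory `store` below stands in for the database row; with Postgres this would be the single `UPDATE ... WHERE state = ...` shown above, checking that exactly one row was updated. State names follow the list above; everything else is illustrative.

```javascript
// One atomic compare-and-set per state transition. With a real DB,
// tryTransition is an UPDATE with a WHERE clause on the current state,
// and "exactly one row updated" means the transition won.
function tryTransition(store, expected, next) {
  if (store.state !== expected) return false;  // someone else moved first
  store.state = next;
  return true;
}

function playTurnX(store, applyMove) {
  if (!tryTransition(store, 'waitingForTurnX', 'makingTurnX')) {
    return false;                  // it is not X's turn (or X already moved)
  }
  applyMove();                     // the actual game logic
  store.state = 'waitingForTurnY'; // hand the turn over to Y
  return true;
}
```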
For example, let's say I have a random game in which I have 500 independent objects and 10 players.
An independent object is an object that moves in a specific direction on each update regardless of what players do (players don't need to come into contact with these objects).
Now if a player is shooting (let's say) a bullet, it is easier, because the bullet belongs to a specific player, so it's easier to avoid in-game lag. Let's look at something simpler, though: for example, a player trying to update their position. The typical thing I would do on the client & server side would be this:
client side: update the coords of the player + send a message to the server as socket X
server side: receive the message from socket X, update the coords of the player on the server side + send a message with the coords of that same player to all other sockets
When you do the communication like this, everyone will receive the new coords of the player and there will be little to no lag. (It is also sufficient for objects like bullets, because they are created upon firing a player event)
How do you handle 500+ independent objects that move in random directions with random speed all across the map and update them for all players efficiently? (Be aware that their velocity and speed can be changed upon contact with a player). What I've tried so far:
1) Put all of the movement + collision logic on the server side & notify all clients with a setTimeout loop & io.emit.
Result: causes massive lag even with only 500+ objects and 4 connected players. All of the players receive the server's response way too slowly.
2) Put all of the movement + collision logic on the client side & notify the server about every object's position.
Result: To be honest, I couldn't detect much lag, but I am not sure this is the correct idea, because every time an object moves I am literally sending a message to the server from each client to update that same object (the server is notified N [number of connected clients] times about the same object). Handling this entirely on the client side is also a bad idea, because when a player switches tabs [goes inactive], no more JavaScript is executed in that player's browser and the whole logic breaks.
I've also noticed that games like agar.io, slither.io, diep.io, etc. do not really have hundreds of objects moving in various directions. In agar.io and slither.io you mainly have static objects (food) and players; in diep.io there are dynamic objects, but none of them move at very high speeds. How do people achieve this? Is there any smart way to achieve this with minimal lag?
Thanks in advance
Convert your user interactions to enumerated actions and forward those. Player A presses the left arrow which is interpreted by the client as "MOVE_LEFT" with possible additional attributes (how much, angle, whatever) as well as a timestamp indicating when this action took place from Player A's perspective.
The server receives this and validates it as a possible action and forwards it to all the clients.
Each client then interprets the action themselves and updates their own simulation with respect to Player A's action.
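The encode/validate/apply pipeline described above could be sketched as follows. The action name, its fields, and the validation rule are illustrative assumptions, not a fixed protocol.

```javascript
// Sketch of the action pipeline: clients send compact actions, the
// server validates and forwards them, and every peer applies the same
// deterministic step. All names here are illustrative.
const MAX_AMOUNT = 5;

// Client side: turn raw input into a compact action message.
function encodeAction(key, playerId) {
  if (key !== 'ArrowLeft') return null;
  return { type: 'MOVE_LEFT', playerId, amount: 1, ts: Date.now() };
}

// Server side: validate before forwarding; never trust the client blindly.
function validateAction(action) {
  return action.type === 'MOVE_LEFT' && action.amount <= MAX_AMOUNT;
}

// Every client (and the server) runs the same deterministic update,
// so forwarding only the action keeps all simulations in sync.
function applyAction(state, action) {
  if (action.type === 'MOVE_LEFT') {
    state.players[action.playerId].x -= action.amount;
  }
  return state;
}
```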
Don't send the entire game state to every client every tick; that's too bloated. The other side is being able to handle late or missing actions. One way of doing that is rollback, where you keep multiple sets of state and let the game simulation keep going until a discrepancy (a late or missing packet) is found. Then revert to the "right" state and replay all the messages since then to bring the state back to correct. This is the idea behind GGPO.
I also suggest reading every networking article on Gaffer on Games, especially What Every Programmer Needs To Know About Game Networking. They're very good articles.
I'm working on a simple multiplayer game with Socket.io and Node.js. I ran into a few performance problems and tried debugging to find out where there could be a memory leak.
What is happening is that the CPU usage keeps increasing by about 10% for every player added to the game, and then stays steady until I eventually add another (it increases another 10%) or disconnect a player (decreases by 10%).
After I looked into it for a while I found out that the reason for such an increase is the 'emit' on the server side. The server looks something like this:
var clients = [];

io.sockets.on('connection', function(socket){
    // logic for adding this player to the game arena; runs
    // only once (when the client connects)
    var id = addToGame();

    // add the id of this player to the 'clients' list
    clients.push({
        id: id,
        socket: socket
    });

    socket.on('disconnect', function(){
        // remove the player from the game
        // delete the client's id from the list of 'clients'
    });
});

setInterval(update, 30);

function update() {
    for (var client of clients) {
        var player = findPlayerById(client.id);
        var message = getMessageForPlayer(player);
        client.socket.emit('update client', message);
    }
}
I thought the problem was with the fact that I was using anonymous functions a couple times, so I changed that but it didn't seem to have done anything.
So then, after messing a bit with the code, I realized that when I comment out this line
client.socket.emit('update client', message);
the CPU usage doesn't seem to be increasing at all when new players come along. Which kinda makes sense because the game runs so that there is always a certain given number of players in the game arena. Those players are initially CPU controlled and do exactly the same things a human player would be able to do. When a human player joins the game, they simply take the place of an existing CPU player, so that the computations that take place in updating the game are roughly the same regardless of whether all players are CPUs or actual human players.
The only difference is in the computation that runs right before emitting the 'update clients' message. That only happens for human players. And so I thought it would make sense that because that for loop queries the game for every single human player, it could be the reason for the increase in CPU usage.
But to my big surprise, even if you leave all those computations as they are, and simply take out the 'emit' part, the problem disappears (the CPU usage stays steady no matter how many players you add).
I figured it's something I should expect, since I'm using a dedicated testing server from DigitalOcean with 1 GB of RAM and 1 CPU (also, the emitted message is an array of about 150 elements on average; consider each element to be an array containing 10 strings).
My question is: how do I go about getting a server with enough CPU capacity to host 500 players or more? With the one I currently have, the game logic itself takes about 40% of the CPU, and each added player increases usage by 10%; it's hard to reach even 10 players before CPU usage hits 100% and the server reboots.
I was about to get a 32 GB, 4-core server from OVH, but I want the point of view of people who know better than I do. What should I look for in a CPU to be able to emit those messages without the CPU maxing out?
I'm developing a game where my backend is written in NodeJS and I use socket.io for realtime communication.
One of the features of my game is matchmaking. Basically, when at least two players are in the matchmaking period, my application will create a game room (a special socket.io room) for them and start the game.
There are two ways to do this:
Create a setInterval in Node.js. In the interval callback, check the playersInMatchmaking array; if there are at least 2 players in it, create the game room. It loops for as long as the server is online.
Instead of relying on setInterval, check the playersInMatchmaking array each time I receive a call to socket.on("matchmaking start") or socket.on("matchmaking stop").
Are there any benefits to one approach over the other? setInterval sounds easier, as I can decouple the matchmaking algorithm from the socket logic; however, it will be less performant, since the matchmaking algorithm runs in a loop rather than only on socket events.
What do you think? Do you have any other ideas that would work better?
You will get the most out of socket.io and real-time communication by using events. If there is no reason other than decoupling the algorithm, you should use events: there is no reason to run a loop over and over when no users/players are currently participating in or searching for a match.
If the interval is not small (for example, 5 seconds), also consider what happens when Player1 is put into the queue, Player2 joins the queue as well, and between two intervals Player1 cancels his search for an opponent because he's tired of waiting. So the main benefit of event-triggered systems, besides the one you mentioned (performance), is that there is no real delay (except network latency, code looping, jitter, ...) and things happen immediately.
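An event-driven version could be sketched like this, with the queue examined only when a "matchmaking start" or "matchmaking stop" event actually arrives. The handler names mirror the events in the question; room creation is stubbed out via a callback.

```javascript
// Event-driven matchmaking sketch: no timer, the queue is checked only
// when a player starts or cancels matchmaking. createRoom is a stand-in
// for whatever actually sets up the socket.io room.
const queue = [];

function onMatchmakingStart(socket, createRoom) {
  queue.push(socket);
  if (queue.length >= 2) {
    const a = queue.shift();
    const b = queue.shift();
    return createRoom(a, b);       // pair found: start the game immediately
  }
  return null;                     // keep waiting for an opponent
}

function onMatchmakingStop(socket) {
  const i = queue.indexOf(socket); // cancelling removes the player at once,
  if (i !== -1) queue.splice(i, 1); // so no stale entries linger in the queue
}
```

Wiring these into socket handlers would look like `socket.on("matchmaking start", () => onMatchmakingStart(socket, createRoom))`.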
I have a multiplayer game lobby where users can create private chatrooms and start private games. Each user has a health bar in the game that is supposed to slowly regenerate x points per second.
I suppose I would need to start a server-side game loop at the beginning of each game, something like this:
setInterval(() => update('gameID'), 1000);
where update('gameID') increments the health variables for all players in a particular game, and 1000 ms = 1 second.
Question: am I right to assume this is asynchronous? I might have 50 separate games going on, with 50 of these loops running. The main process is not going to be blocked, right?
It's asynchronous, but you don't need 50 timers in the case you describe.
You can use a single timer to regenerate players in all active games. If you're also pushing the health data to clients every tick, though, this is going to be pretty inefficient.
Alternatively, you can do something like player.attackedTime = Date.now() and calculate the regeneration on each attack, as in player.health += x_points * (Date.now() - player.attackedTime) / 1000, but then you will have to do predictive regeneration on the client.
It is asynchronous, but doing it that way may kill your server.
I advise making these intervals passive, i.e. hold the start time of the game in memory and make the client ping for data. When the client pings, the server checks the current date and compares it to the stored one (and updates the stored one at the end of the request). It can evaluate the current health from that.
This solution should scale better.
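The passive approach above could be sketched as follows: instead of a timer mutating health every second, store the last time health was settled and compute the current value on demand. `REGEN_PER_SEC`, `MAX_HEALTH`, and the field names are assumed game constants, not from the question.

```javascript
// Passive (lazy) regeneration: health is derived from elapsed time
// whenever it is read, so no per-game interval is needed.
const REGEN_PER_SEC = 2;   // assumed regeneration rate
const MAX_HEALTH = 100;    // assumed health cap

function currentHealth(player, now = Date.now()) {
  const elapsedSec = (now - player.settledAt) / 1000;
  return Math.min(MAX_HEALTH, player.health + elapsedSec * REGEN_PER_SEC);
}

// Call whenever health actually changes (e.g. on an attack) to fold
// the accrued regeneration into the stored value.
function settle(player, now = Date.now()) {
  player.health = currentHealth(player, now);
  player.settledAt = now;
}
```

A damage handler would then call `settle(player)` first, subtract the damage, and the next read of `currentHealth` picks up regeneration from that moment.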
setInterval is certainly asynchronous. Most functions that take a callback are asynchronous. If you're ever in doubt, you can check the documentation or the source code.