I have almost completed a turn-based multiplayer game with node.js and socket.io.
I have express.js as the web server and another class acting as the game server, using socket.io.
My problem is that these two are running in the same application. I have a web landing page where users can log in, see their player details, and chat in the lobby. Up to this point there is nothing related to game logic, so I'm asking myself why in the hell this game server is running in the same app as the web server. Also note that, in the future, there may be different game servers (a Europe server, an Americas server, an Asia server, etc.), but the same landing page and login mechanism will stay.
So what is the perfect way to achieve this?
Should I make two separate node.js apps, one web server and one game server? In that case, will I be able to share session data between sockets and HTTP requests?
Another drawback is that, while users navigate the site, any unexpected behaviour may cause the web server to throw an exception and close the application, thus shutting down the game server as well.
You may have completely separate node.js apps and a common session store, such as Redis, to check whether a user has authenticated with the web server before allowing access to the game server. The common session store would also help with simultaneous game servers, not just one web + one game server.
You could also prevent the application from closing by catching all errors, but that is generally not recommended.
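For illustration, a minimal sketch of the shared-store idea, assuming express-session with connect-redis on the web side and a separate socket.io process that looks sessions up by the signed connect.sid cookie. The secret, ports, and the "sess:" key prefix are the usual defaults and would need to match your setup, and an older callback-style redis client is assumed:

```js
// web-server.js -- logs users in and keeps sessions in Redis
const express = require('express');
const session = require('express-session');
const RedisStore = require('connect-redis')(session);

const app = express();
app.use(session({
  store: new RedisStore({ host: '127.0.0.1', port: 6379 }),
  secret: 'keyboard cat',            // must be shared with the game server
  resave: false,
  saveUninitialized: false
}));

app.post('/login', (req, res) => {
  // ...verify credentials first, then:
  req.session.userId = 'player-42';
  res.sendStatus(200);
});
app.listen(3000);

// game-server.js -- a separate process that only admits authenticated sessions
const io = require('socket.io')(4000);
const cookie = require('cookie');
const cookieParser = require('cookie-parser');
const redisClient = require('redis').createClient();

io.use((socket, next) => {
  const cookies = cookie.parse(socket.request.headers.cookie || '');
  // unsign the connect.sid cookie with the same secret the web server used
  const sid = cookieParser.signedCookie(cookies['connect.sid'], 'keyboard cat');
  redisClient.get('sess:' + sid, (err, data) => {   // connect-redis stores sessions under "sess:<id>"
    if (err || !data) return next(new Error('not authenticated'));
    socket.session = JSON.parse(data);
    next();
  });
});
```

The same middleware works unchanged on each regional game server, since they all read from the one Redis instance.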
Later edit:
Here is a small list of advantages and disadvantages regarding session stores. It's not for node.js, but it applies here:
http://techwhizbang.com/2009/12/memcache-session-store/
I'm new to working with sockets and have a small system design question:
I have 2 separate node processes for a web app: one is a simulator that is constantly running and the second is an api server. Both share the same MongoDB database, and we have a React app running for the client, served by the api server.
I'm looking to implement socket.io for real-time notifications and so I've set up a simple connection between the api and client.
My problem is that, while the simulator runs, there are some events that I also want to trigger push notifications for, so my question is how to hook that into everything.
The file hierarchy is like:
app/
  simulator/
  api/
  client/
I saw this article for communication between node processes and I currently have 3 solutions in mind:
1. Leave the hierarchy as it is and install the socket.io package inside simulator as well. I'm not sure if sockets work this way, but can both simulator and api connect to the same socket?
2. Move the simulator file into the api file to fork as a child process, so that the two processes can communicate via child/parent messaging. simulator will message api, which will then emit updates through the socket to client.
3. Leave the hierarchy as is and communicate via node-ipc. Same situation as above, with simulator messaging api first before api emits that to client.
If 1 is possible, that seems like the best solution in my opinion. It seems like extra work to add an additional layer of messaging for 2 and 3.
Leave hierarchy as it is and install socket.io package inside simulator as well. I'm not sure if sockets work this way but can both simulator and api connect to the same socket?
The client would have to create a separate socket.io connection to the simulator process. It could then receive data from the API server over one connection and from the simulator over the other. These would be two separate, independent socket.io connections from the client, one to the API server and one to the simulator. The simulator and the API server cannot share the same socket unless they are in the same process.
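For illustration, a minimal client-side sketch of what that would look like; the ports and event names are placeholders, and both processes would each have to run their own socket.io server:

```js
// browser client holding two independent socket.io connections
// (io comes from the socket.io-client script)
const apiSocket = io('http://localhost:3000');   // socket.io server in the api process
const simSocket = io('http://localhost:3001');   // socket.io server in the simulator process

apiSocket.on('notification', (msg) => console.log('from api:', msg));
simSocket.on('sim-update', (msg) => console.log('from simulator:', msg));
```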
Move simulator file into api file to fork as a child process so that the 2 processes can communicate via child/parent messaging. simulator will message api which will then emit updates through the socket to client
This is really part of a broader option: the simulator communicates with the API server and sends it data, which the API server can then send to the client over the single socket.io connection the client made to the API server.
There are lots of different ways for the simulator process to communicate with the API server.
Since it's already an API server, you can just add an API endpoint for this (probably non-public). The simulator calls that endpoint to send data; the API server receives the data and forwards it to the client.
As you suggest, if the simulator is run from the API server as a child process, then you can use the parent/child messaging built into node.js. Note that you don't have to move the simulator files into the API file at all. You can just use child_process to launch the simulator as another node.js app from another project; you just have to know the path to that other project.
You can use any other communication mechanism you want between the simulator process and the API server process. There could be a socket.io connection between them, you could use several forms of IPC, etc.
If 1 is possible, that seems like the best solution in my opinion.
Your #1 option is not possible as separate processes can't use the same socket.io connection.
It seems like extra work to add an additional layer of messaging for 2 and 3.
My options #1 and #2 are not much code in each server. You're doing interprocess communication. You should expect to use some code to enable that. But, it's not hard at all.
If the lifetimes of the simulator server and the API server are always tied together (they have no independent uses), then I'd probably do the child process thing, where the API server launches the simulator and then uses parent/child messaging to communicate with it. You do NOT have to combine sources to do this.
The child_process module can run the simulator process by just knowing what directory it is located in.
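Here's a minimal sketch of that parent/child wiring, assuming the simulator's entry point sits at ../simulator/index.js relative to the API server; the path and event names are placeholders:

```js
// api/server.js -- parent: launch the simulator and relay whatever it reports
const { fork } = require('child_process');
const path = require('path');
const io = require('socket.io')(3000);

const simulator = fork(path.join(__dirname, '../simulator/index.js'));

simulator.on('message', (msg) => {
  // push the simulator's event out to every connected client
  io.emit('notification', msg);
});

// simulator/index.js -- child: process.send() exists because this file was fork()ed
setInterval(() => {
  process.send({ type: 'tick', at: Date.now() });
}, 5000);
```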
Otherwise, I'd probably make a small web server on a non-public port in the API server and have the simulator just send data to that other web server. I often refer to this as a control port. It's a way of "controlling or diagnosing" the API server internals and can only be accessed from within the private network and/or with credentials. The reason I'd use a separate web server (in the same nodejs app as the API server) is to make it easy to secure so it can't be accessed from the outside world like the regular public APIs can. You just put the internal web server on a port that is not exposed to the outside world.
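And a rough sketch of that control-port variant, using a made-up /notify route on an internal-only port 9000:

```js
// inside the API server: a second, localhost-only express app acting as the control port
const express = require('express');
const io = require('socket.io')(3000);   // the existing public socket.io server

const internal = express();
internal.use(express.json());

internal.post('/notify', (req, res) => {
  io.emit('notification', req.body);     // forward the simulator's payload to browsers
  res.sendStatus(204);
});

internal.listen(9000, '127.0.0.1');      // bound to localhost, not exposed to the outside world

// inside the simulator: post an event to the control port
const http = require('http');
function notify(payload) {
  const body = JSON.stringify(payload);
  http.request({
    host: '127.0.0.1', port: 9000, path: '/notify', method: 'POST',
    headers: { 'Content-Type': 'application/json', 'Content-Length': Buffer.byteLength(body) }
  }).end(body);
}
notify({ type: 'milestone', detail: 'simulation step finished' });
```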
You should check the Socket.IO docs about adapters and emitters. They let you reach sockets from different node processes and help with scalability.
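For illustration, a minimal sketch with the Redis-backed packages (shown here as socket.io-redis and socket.io-emitter; newer Socket.IO releases publish them as @socket.io/redis-adapter and @socket.io/redis-emitter, so match the package names to your version):

```js
// api process: attach the Redis adapter so emits can cross process boundaries
const io = require('socket.io')(3000);
const redisAdapter = require('socket.io-redis');
io.adapter(redisAdapter({ host: '127.0.0.1', port: 6379 }));

// simulator process: no socket.io server at all, just an emitter publishing through Redis
const emitter = require('socket.io-emitter')({ host: '127.0.0.1', port: 6379 });
emitter.emit('notification', { type: 'sim-event', at: Date.now() });
```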
I'm coding an online multiplayer game using nodejs and HTML5 and I'm at the point where I would like to have multiple maps for people to play on, but I'm having a scaling issue. The server I'm running this on isn't able to support the game loops for more than a few maps on its own, and even though it has 4 cores I can only utilize one with a single node process.
I'd like to be able to scale this to not even necessarily be limited to a single server. I'd like to be able to start up a node process for each map in the game, then have a master process that looks up what map a player is in and passes their connection to the correct sub process for handling, updating with game information, etc.
I've found a few ways to use a proxy like nginx or the built-in node cluster module to load balance, but from what I can tell the examples I've seen just hand a connection to whatever the next available process is, and I need to hand them out specifically. Is there some way for me to route a connection to a node process based on a condition like that? I'm using Express to serve my static content and socket.io for client-to-server communication currently. The information about which map the player is in will be in MongoDB along with the rest of the player data, if that makes a difference.
There are many ways to address your problem; here are two suggestions based on your description.
1 - Use a router server which dispatches player queries to "area servers": in this topology, all client queries arrive at your router server, which tags each query with a unique id and dispatches it to the right area server; the area server handles the query and sends it back to the router server, which recognizes it by the unique tag and sends the response back to the client.
This solution distributes the CPU/memory load, but not the bandwidth!
2 - Use an authentication server which redirects clients to the server with the least load: in this case you'll have multiple identical game servers and one authentication server. When a client authenticates, send the client the URL of an available server plus an auth token, and send an authentication ticket to that server.
The client then connects to that server, which recognizes it using the auth token/auth ticket (a rough sketch of this approach follows below).
This solution distributes all of the CPU/memory/bandwidth, but it might not suit every game, since you can be sent to a different server on each connection and you won't see the players in the same area if you are not on the same server.
Those are only two simple suggestions; you can mix the two approaches or add other pieces (for example, inter-communication between area servers, etc.), which will solve the mentioned issues but will add complexity.
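Here is that rough sketch of the second suggestion, adapted to the per-map case from the question: the public web server looks up the player's map in MongoDB and hands back the address of the node process hosting that map plus a one-time ticket. Everything here (the MAP_SERVERS table, the /join route, the tickets collection) is a made-up illustration:

```js
// public web/auth server: tell the client which map process to connect to
const express = require('express');
const crypto = require('crypto');
const { MongoClient } = require('mongodb');

// hypothetical table mapping map ids to the socket.io processes that run them
const MAP_SERVERS = {
  forest: 'https://game.example.com:4001',
  desert: 'https://game.example.com:4002'
};

async function main() {
  const db = (await MongoClient.connect('mongodb://localhost/game')).db();
  const app = express();

  app.get('/join', async (req, res) => {
    const player = await db.collection('players').findOne({ name: req.query.player });
    if (!player) return res.sendStatus(404);

    // one-time ticket the map server can verify against the shared database
    const ticket = crypto.randomBytes(16).toString('hex');
    await db.collection('tickets').insertOne({ ticket, player: player.name, map: player.map });

    res.json({ server: MAP_SERVERS[player.map], ticket });
  });

  app.listen(3000);
}
main();
```

The map server would then check the ticket (for example in an io.use() middleware) before letting that socket join the map's game loop.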
I'm developing a node web application, and while testing, one of the client Chrome browsers went into a hung state. The browser entered an infinite loop where it was continuously downloading all the JavaScript files referenced by the HTML page. I rebooted the web server (node.js), but once the web server came back online, it continued receiving tons of requests per second from the same browser.
Obviously, I went ahead and terminated the client browser so that the issue went away.
But I'm concerned about how to handle such problem client connections from the server side once my web application goes live/public, since I will have no access to the clients.
Is there anything (an npm module / some code?) that can make a best guess at detecting such bad client connections from within my web server code and, once detected, ignore any future requests from that particular client? I understand that handling this within the Node server might not be the best approach, but at least I could save CPU/network by not responding to the bad requests.
P.S.
Btw, I'm planning to deploy my node web application to Heroku on a small budget. So, if you know of any firewall/configuration that could handle the above scenario, please do recommend it.
I think it's important to know that this is a pretty rare case. If your application has a very large user base, or there is some other reason you are concerned about DoS/DDoS-related attacks, it looks like Heroku provides some DDoS protection for you. If you have your own server, I would suggest looking into Nginx or HAProxy as load balancers for your app, combined with fail2ban. See this tutorial.
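If you also want a first line of defence inside the Node app itself, a per-IP rate limiter is the usual tool. A minimal sketch with the express-rate-limit package (the window and limit are arbitrary numbers to tune):

```js
const express = require('express');
const rateLimit = require('express-rate-limit');

const app = express();

// answer at most 100 requests per minute per IP; the rest get HTTP 429
app.use(rateLimit({ windowMs: 60 * 1000, max: 100 }));

app.get('/', (req, res) => res.send('ok'));
app.listen(3000);
```

Behind a proxy like Heroku's router you'd likely also want app.set('trust proxy', 1) so the limiter sees the real client IP instead of the proxy's.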
I am looking to build a web application using node.js and possibly socket.io, but I am quite confused about whether to use socket.io or go with plain HTTP. In the app, the node.js server will basically be an API server which serves JSON to the JavaScript client, and maybe to mobile clients too. The web app will also have a chat messenger for its users; this is where socket.io comes in.
I am not sure whether to use socket.io for the whole app or only for the chat part. Although my app itself could benefit from socket.io, there is nothing that I think can't be done using plain HTTP with the client making more requests to the server.
I have read in several places that socket.io can sometimes be difficult to scale to more users.
Socket.io often crashes and especially creates problems when there are firewalls on the client's system.
More importantly, I checked out the socket.io user list and did not find many users, so I was curious what kind of platform the better-known chat networks like Facebook Messenger, Google Talk, etc. are built upon. Are any of them built using HTTP/AJAX with continuous polling of the server?
Please help me out with this question. Some might argue that this is an opinion-based question, but what I am actually trying to figure out is the implementation of socket.io and its limitations.
I would suggest serving your API over HTTP and leaving the real-time business to Socket.io. If you are averse to using WebSockets, as #GeoPheonix stated, you can choose from a variety of transport methods using both socket.io and sockjs (https://github.com/sockjs/sockjs-node).
As far as scaling is concerned, I deployed a socket.io-based real-time analytics/tracking service for a very large application with an average of 400+ concurrent connections with no visible performance impact, but this may depend on the implementation and hardware.
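A minimal sketch of that split, assuming a single Node process where Express handles the JSON API and Socket.IO handles only the chat (route and event names are placeholders):

```js
const express = require('express');
const http = require('http');
const socketio = require('socket.io');

const app = express();
const server = http.createServer(app);
const io = socketio(server);

// plain HTTP/JSON for the API part
app.get('/api/users/:id', (req, res) => {
  res.json({ id: req.params.id, name: 'example' });
});

// Socket.IO only for the chat part
io.on('connection', (socket) => {
  socket.on('chat message', (msg) => {
    io.emit('chat message', msg);   // broadcast to everyone in the lobby
  });
});

server.listen(3000);
```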
Socket.io is faster than plain HTTP. I recommend you use it for everything, since you need to have chat in the first place anyway.
In my case, a real-time Texas Hold'em-like game can handle up to 2500 concurrent connections with one node process. However, if you change the transport from websocket to xhr-polling, it can handle something like 10x more than pure websocket. Your application is just chat, so I guess a little slowdown wouldn't be a problem. If you're sure you will exceed this number, then yes, scaling socket.io is a pain.
This problem will only happen if you open socket.io on a port other than 80 or 443. If you already have a frontend web server in another language, you can still run socket.io on another subdomain so it can use port 80 without conflicting with your main frontend web server. Socket.io supports cross-domain connections without a problem.
Have you used trello.com? If not, try it :). It's great for task management or even some Agile workflow, and they use socket.io. https://c9.io/ is another one: an online IDE with Google Docs-like collaboration. One thing to note is that the xhr-polling transport in socket.io is the same as HTTP/AJAX long-polling (better than plain AJAX polling). You can read more at:
http://book.mixu.net/node/ch13.html
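For reference, a small sketch of pinning the transport. The first form matches the 0.9-era socket.io API this answer seems to describe; the commented line is the equivalent option in socket.io 1.x and later (treat both as version-dependent):

```js
// socket.io 0.9.x style: restrict the server to the xhr-polling transport
const io = require('socket.io').listen(80);
io.configure(function () {
  io.set('transports', ['xhr-polling']);
});

// socket.io 1.x+ equivalent: pass the allowed transports when creating the server
// const io = require('socket.io')(server, { transports: ['polling'] });
```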
Background: I am building a web app using NodeJS + Express. Most of the communication between client and server is REST (GET and POST) calls. I would typically use AJAX XMLHttpRequest as mentioned in https://developers.google.com/appengine/articles/rpc, and I don't quite understand how to make my RESTful service work with Socket.io as well.
My questions are:
1. In what scenarios should I use Socket.io over AJAX RPC?
2. Is there a straightforward way to make them work together, at least for Express.js-style REST?
3. Are there real benefits to using socket.io (if websockets are used -- TCP layer) for non-real-time web applications, like a tinyurl site (where users post queries and the server responds and forgets)?
Also, I was thinking of a tricky but perhaps nonsensical idea: what if I use REST for requests from clients, then close the connection from the server side and do socket.emit()?
Thanks in advance.
Your primary problem is that WebSockets are not request/response oriented like HTTP is. You mention REST and HTTP interchangeably; keep in mind that REST is a methodology for designing and modeling your HTTP routes.
Your questions:
1. Socket.io would be a good choice when you don't require a request/response format. For instance, if you were building a multiplayer game in which whoever could click on more buttons won, you would send the server each click from each user, without needing a response back from the server that it registered each click. As long as the WebSocket connection is open, you can assume the message is making it to the server. Another use case is when you need a server to contact a client sporadically. An analytics page would be a good use case for WebSockets, as there is no uniform pattern as to when data needs to be at the client; it could happen at any time. (A small sketch of this fire-and-forget pattern follows this list.)
2. The WebSocket connection is an HTTP GET request with a special header asking the server to upgrade it to a WebSocket connection. Distinguishing different events and messages on the WebSocket connection is up to your application logic and likely won't match REST-style URIs and methods (otherwise you are replicating HTTP request/reply in a sense).
3. No.
Not sure what you mean on the last bit.
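A tiny illustration of the fire-and-forget pattern from point 1, with a made-up button-clicking game (event names are placeholders): the client emits every click and never waits for a reply, and the server pushes score updates whenever it likes.

```js
// client (browser): send each click, don't wait for a response
const socket = io();                              // from the socket.io-client script
document.getElementById('button').addEventListener('click', () => {
  socket.emit('click');                           // fire and forget
});

// server: tally clicks per connection and push the score out on its own schedule
io.on('connection', (socket) => {
  let clicks = 0;
  socket.on('click', () => { clicks++; });

  const timer = setInterval(() => socket.emit('score', { clicks }), 1000);
  socket.on('disconnect', () => clearInterval(timer));
});
```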
I'll just explain more about when you want to use Socket.IO and leave the in-depth explanation to Tj there.
Generally you will choose Socket.IO when performance and/or latency is a major concern and you have a site that involves users polling for data often. AJAX or long-polling is far easier to implement; however, it can have serious performance problems in high-load situations. By high load, I mean something like Facebook. Imagine millions of people loading their feed, and every minute each user asking the server for new data. That could require some serious hardware and software to make it work well. With Socket.IO, each user could instead connect and just wait indefinitely for new data from the server as it arrives, resulting in far less overall server traffic.
Also, if you have a real-time application, Socket.IO would allow for a much better user experience while maintaining a reasonable server load. A common example is a chat room. You really don't want to have to constantly poll the server for new messages. It would be much better for the server to broadcast new messages as they are received. Although you can do it with long-polling, it can be pretty expensive in terms of server resources.
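To make the contrast concrete, here's the same chat feed sketched both ways on the client; the URL, event name, and message shape are placeholders:

```js
// polling approach: every client asks the server for new messages on a timer
let lastSeen = 0;
setInterval(async () => {
  const res = await fetch('/api/messages?since=' + lastSeen);
  const messages = await res.json();
  messages.forEach((m) => console.log(m.text));
  if (messages.length) lastSeen = messages[messages.length - 1].id;
}, 5000);

// push approach: connect once and just wait for the server to broadcast
const socket = io();                      // from the socket.io-client script
socket.on('chat message', (msg) => console.log(msg.text));
```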