I am creating a game with Unity and I want to upload the players' scores to MongoDB. I have therefore built a Node.js server listening on port 3000; the scores are sent to the server and stored in the database.
My question is: if I want to create a website for viewing/analyzing players' scores, which approach should I use?
create two Node.js servers, one for the web and one for the game
one Node.js server that listens on both port 80 and port 3000 (I'm not sure whether that is possible)
any other, better suggestions?
Thank you.
I would create one Node server to handle both API and web requests.
It sounds like the data served by the API and the web will be the same, or subsets of each other, so you'll probably want to share code, look up the same things in the database, and so on.
From here, you could either create separate routes for the API and the web (/api/v1/my_scores vs /my_scores), or recognize that you're just asking for different representations of the same data and do something RESTful, like checking the Accept header and sending back either server-rendered HTML or JSON.
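A minimal Express sketch of that second option; the route name, the getScores() helper, and the 'scores' template are placeholders, and the HTML branch assumes a view engine (e.g. EJS) is configured:

```js
const express = require('express');
const app = express();

// Hypothetical helper that would query the scores collection in MongoDB.
async function getScores() {
  return [{ player: 'alice', score: 420 }];
}

app.get('/my_scores', async (req, res) => {
  const scores = await getScores();

  // Content negotiation: same data, different representation.
  res.format({
    // Assumes a view engine and a 'scores' template are set up.
    'text/html': () => res.render('scores', { scores }),
    'application/json': () => res.json(scores),
    default: () => res.json(scores),
  });
});

app.listen(3000);
```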
Alternatively, you could just create an API in Node, then use a purely front-end tool like Angular or React to build the web front end for your site.
Using port 3000 is not a good idea, because many users access the internet through firewalls which block non-standard ports.
I would recommend using port 443 and HTTPS to secure the communication for both use cases.
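A minimal sketch of that, with placeholder paths for the certificate and key files (binding to port 443 usually requires elevated privileges or a port forward):

```js
const fs = require('fs');
const https = require('https');
const express = require('express');

const app = express();
// ... mount the game API and the score website routes on `app` ...

// Placeholder paths; point these at your actual certificate and key.
const options = {
  key: fs.readFileSync('/etc/ssl/private/example.key'),
  cert: fs.readFileSync('/etc/ssl/certs/example.crt'),
};

// Serve both the game API and the website over HTTPS on the standard port.
https.createServer(options, app).listen(443);
```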
If the site for analyzing scores does not share logic with the API server, then it can be created as a separate site, but at the start it is easier to manage a single application.
If I understand your question correctly, then to my (limited) knowledge I think you don't need more than one server with a database. The reason is that on the website you only want to display high scores; the end user can't insert anything through the website. The complexity is already minimal, so don't bother creating another server. Just add a separate data-fetching API endpoint for the website to use.
I want to practice creating my own RESTful API service to go along with a client-side application that I've created. My plan is to use Node and Express to create a server. On my local machine, I know how to set up a local server, but I would like to be able to host my application (client and server) online as part of my portfolio.
The data that my client application would send to the server would not be significant in size, so there wouldn't be a need for a database. It would be sufficient to just have my server save received data dynamically in an array, and I wouldn't care about having that data persist if the user exits the webpage.
Is it possible to use a service like Netlify in order to host both a client and server for my purposes? I'm picturing something similar to how I can start up a local dev server on my computer so that the front-end can interface with it. Except now I want everything hosted online for others to view. I plan to create the Express server in the same repo as the front-end code.
No, Netlify doesn't allow you to run a server or backend. However, it does allow you to run serverless functions in the cloud, which can run for up to 10 seconds at a time. Netlify also has a beta feature called "background functions" that can run for up to 15 minutes. But honestly, for a RESTful API there are surely better-suited solutions out there.
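For reference, a Netlify Function is just a small handler deployed alongside the front end. A rough sketch, assuming the default netlify/functions directory and an in-memory array like the one you describe (serverless instances are ephemeral, so the array can reset between invocations):

```js
// netlify/functions/items.js
// In-memory data only; instances are ephemeral, so this may reset at any time.
let items = [];

exports.handler = async (event) => {
  if (event.httpMethod === 'POST') {
    items.push(JSON.parse(event.body));
    return { statusCode: 201, body: JSON.stringify({ saved: true }) };
  }
  // Any other method just returns the current array.
  return { statusCode: 200, body: JSON.stringify(items) };
};
```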
If you are still looking for a "Netlify for the backend", you can consider Qovery. They explained here why it is a good fit for their users.
This is more like a design question but I have no idea where to start.
Suppose I have a realtime Node.js app that runs on multiple servers. When a user logs in, she doesn't know which server she will be assigned to. She will just log in, do something, and log out, and that's it. A user won't be interacting with other users on a different server, nor will her details be stored on another server.
In the backend, I assume the Node.js server will put the user's login details into some queue and then, when there is space, assign the user to an available server (the server with the lowest ping, or one that is not full). Because there is a limited number of users allowed on one physical server, when a user tries to log in to a "full" server it will direct her to another available server.
I am using the ws module of Node.js. Is there any service available for this purpose, or do I have to build my own? How difficult would that be?
I am not sure how WebSockets fit into this question, so I'm ignoring that part. I guess your actual question is about load balancing... Let me try paraphrasing it.
Q: Does Node.js have any load-balancing feature that I can leverage?
Yes, and it is called cluster in Node.js. Instead of the traditional single Node process listening on a single port, this module allows you to spawn a group of Node processes and have them all bound to the same port.
This means that all the user knows is the service's endpoint. He sends a request to it, and one of the available processes in the group serves him whenever possible.
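A minimal sketch of the cluster approach, following the pattern from the Node docs (the port and response body are placeholders):

```js
const cluster = require('cluster');
const http = require('http');
const os = require('os');

if (cluster.isMaster) {
  // Fork one worker per CPU core; they all share the same listening port.
  for (let i = 0; i < os.cpus().length; i++) {
    cluster.fork();
  }
  // Restart a worker if it dies.
  cluster.on('exit', (worker) => {
    console.log(`worker ${worker.process.pid} died, restarting`);
    cluster.fork();
  });
} else {
  // Each worker runs its own server; incoming connections are distributed among them.
  http.createServer((req, res) => {
    res.end(`handled by worker ${process.pid}\n`);
  }).listen(3000);
}
```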
Alternatively, using Nginx, the web server, as your load balancer is also a very popular approach to this problem.
References:
Cluster API: https://nodejs.org/api/cluster.html
Nginx as load balancer: http://nginx.org/en/docs/http/load_balancing.html
P.S
I guess the key word for googling solutions to your problem is load balancer.
Out of the two solutions, I would recommend going the Nginx way, as it is a more scalable approach.
For example, your Node processes could be spread across multiple hosts (horizontal scaling), whereas the cluster solution is aimed more at vertical scaling, taking advantage of a multi-core machine.
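A minimal nginx upstream sketch of that horizontal setup; the backend addresses and ports are placeholders for wherever your Node processes actually run:

```nginx
# Round-robin load balancing across Node processes on two hosts (placeholder addresses).
upstream node_app {
    server 10.0.0.1:3000;
    server 10.0.0.2:3000;
}

server {
    listen 80;

    location / {
        proxy_pass http://node_app;
        # Needed if the app also pushes WebSocket traffic through nginx.
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
```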
I'm coding an online multiplayer game using Node.js and HTML5, and I'm at the point where I would like to have multiple maps for people to play on, but I'm having a scaling issue. The server I'm running this on isn't able to support the game loops for more than a few maps on its own, and even though it has 4 cores I can only utilize one with a single Node process.
I'd like to scale this so it isn't necessarily limited to a single server. I'd like to be able to start up a Node process for each map in the game, then have a master process that looks up what map a player is in and passes their connection to the correct sub-process for handling, game-state updates, etc.
I've found a few ways to use a proxy like nginx or the built-in Node cluster module to load balance, but from what I can tell the examples I've seen just hand a connection to whichever process is available next, and I need to hand them out based on a specific condition. Is there some way for me to route a connection to a Node process like that? I'm using Express to serve my static content and socket.io for client-to-server communication currently. The information about which map the player is in will be in MongoDB along with the rest of the player data, if that makes a difference.
There are many ways to address your problem; here are two suggestions based on your description.
1 - Use a router server that dispatches player queries to "area servers": in this topology, all client queries arrive at your router server, which tags each query with a unique ID and dispatches it to the right area server. The area server handles the query and sends it back to the router server, which recognizes it by the unique tag and sends the response back to the client.
This solution distributes the CPU/memory load, but not the bandwidth!
2 - Use an authentication server that redirects clients to the servers with less load: in this case, you'll have multiple identical servers and one authentication server. When a client authenticates, send the client the URL of an available server along with an auth token, and send an authentication ticket to that server.
The client then connects to that server, which recognizes it using the auth token/auth ticket.
This solution distributes CPU/memory/bandwidth, but it might not suit all games, since you can be sent to a different server on each connection and you won't see the players in the same area if you are not on the same server.
Those are only two simple suggestions; you can mix the two approaches or add other pieces (for example, inter-communicating area servers) which will solve the mentioned issues but add complexity.
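As a rough illustration of the lookup idea applied to your map setup, here is a minimal sketch; the MAP_SERVERS table, the players collection, and the URLs are all hypothetical, and it assumes the Mongo connection is established before requests arrive:

```js
const express = require('express');
const { MongoClient } = require('mongodb');

const app = express();

// Hypothetical mapping from map name to the socket.io server that owns that map.
const MAP_SERVERS = {
  forest: 'https://maps1.example.com',
  desert: 'https://maps2.example.com',
};

let db;
MongoClient.connect('mongodb://localhost:27017').then((client) => {
  db = client.db('game');
});

// The client calls this first, then opens its socket.io connection to the returned URL.
app.get('/where-is-my-map/:playerId', async (req, res) => {
  const player = await db.collection('players').findOne({ _id: req.params.playerId });
  if (!player) return res.status(404).json({ error: 'unknown player' });
  res.json({ server: MAP_SERVERS[player.map] });
});

app.listen(3000);
```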
I've been doing a lot of research recently on creating a backend for all the websites that I run and a few days ago I leased a VPS running Debian.
Long-term, I'd like to use it as the back end for some web applications. However, these client-side JavaScript apps are running on completely different domains than the VPS domain. I was thinking about running the various back-end applications on the VPS as daemons. For example, daemon 1 is a Python app, daemons 2 and 3 are Node.js, etc. I have no idea how many of these I might eventually create.
Currently, I only have a single Node.js app running on the VPS. I want to implement two methods on it, listening on some arbitrary port, port 4000 for example:
/GetSomeData (GET request) - takes some params and serves back some JSON
/AddSomeData (POST request) - takes some params and adds to a back-end MySQL db
These methods should only be useable from one specific domain (called DomainA) which is different than the VPS domain.
Now one issue that I feel I'm going to hit my head against is CORS policy. It sounds like I need to include a response header for Access-Control-Allow-Origin: DomainA. The problem is that in the future, I may want to add another acceptable requester domain, for example DomainB. What would I do then? Would I need to validate the incoming request.connection.remoteAddress, and if it matched DomainA/DomainB, write the corresponding Access-Control-Allow-Origin?
About 5 minutes before posting this question, I came across this on the W3C site:
Resources that wish to enable themselves to be shared with multiple Origins but do not respond uniformly with "*" must in practice generate the Access-Control-Allow-Origin header dynamically in response to every request they wish to allow. As a consequence, authors of such resources should send a Vary: Origin HTTP header or provide other appropriate control directives to prevent caching of such responses, which may be inaccurate if re-used across-origins.
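In Express, that dynamic generation might look something like the middleware below. It keys off the Origin request header (which browsers send on cross-origin requests) rather than the remote IP; the allowlisted domains and the port are placeholders:

```js
const express = require('express');
const app = express();

// Placeholder allowlist; add DomainB here later without touching the rest of the code.
const ALLOWED_ORIGINS = ['https://domain-a.example', 'https://domain-b.example'];

app.use((req, res, next) => {
  const origin = req.headers.origin;
  if (origin && ALLOWED_ORIGINS.includes(origin)) {
    // Echo back only origins we trust.
    res.setHeader('Access-Control-Allow-Origin', origin);
    // Per the W3C note above: tell caches the response varies by Origin.
    res.setHeader('Vary', 'Origin');
  }
  next();
});

app.get('/GetSomeData', (req, res) => res.json({ ok: true }));

app.listen(4000);
```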
Even if I do this, I'm a little worried about security. By design, anyone on my DomainA website can use the web app; you don't have to be a registered user. I'm concerned about attackers spoofing their IP address to appear equal to DomainA. It seems like it wouldn't matter for the GetSomeData request, since my Node.js server would then send the data back to DomainA rather than the attacker. However, what would happen if the attackers ran a script to POST to AddSomeData a thousand times? I don't want my SQL table being filled up by malicious requests.
On another note, I've been reading about nginx and virtual hosts and how you can use them to establish different routes depending on the incoming domain but I don't BELIEVE that I need these things; however perhaps I'm mistaken.
Once again, I don't want to use the VPS as a website server; the Node.js listener is going to return a collection of JSON, hence why I'm not making use of port 80. In fact, the primary use of the VPS is to do some heavy manipulation of data (perhaps involving the local MySQL db) and then return a collection of JSON that any number of front-end client browser apps can use.
I've also read some recommendations about making use of NodeJS Restify or ExpressJS. Do I need these for what I'm trying to do?
I am looking to build a web application using Node.js and possibly socket.io, but I am quite confused about whether to use socket.io or go with plain HTTP. In the app, the Node.js server will basically be an API server that serves JSON to the JavaScript client, and maybe to mobile clients too. The web app will also have a chat messenger for its users; this is where socket.io comes in.
I am not sure whether to use socket.io for the whole app or only for the chat part. My app itself could benefit from socket.io, but there's nothing that I think can't be done using plain HTTP and the client making more requests to the server.
I have read in several places that socket.io can sometimes be difficult to scale to more users.
Socket.io often crashes and, in particular, creates problems when there are firewalls on clients' systems.
More importantly... I checked out the socket.io user list and did not find many users, so I was curious what kind of platform the better-known chat networks like Facebook Messenger, Google Talk, etc. are built upon. Are any built using HTTP/AJAX and continuous querying of the server?
Please help me out with this question. Some might argue that this is an opinion-based question, but what I am actually trying to figure out is the implementation of socket.io and its limitations.
I would suggest serving your API over HTTP and leaving the real-time business to socket.io. If you are averse to using WebSockets, as @GeoPheonix stated, you can choose from a variety of transport methods using both socket.io and SockJS (https://github.com/sockjs/sockjs-node).
As far as scaling is concerned, I deployed a socket.io-based real-time analytics/tracking service for a very large application, with an average of 400+ concurrent connections and no visible performance impact, but this may depend on the implementation and hardware.
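A minimal sketch of that split on a single Node server, with the JSON API on ordinary HTTP routes and socket.io handling only the chat (route and event names are placeholders):

```js
const express = require('express');
const http = require('http');
const { Server } = require('socket.io');

const app = express();
const server = http.createServer(app);
const io = new Server(server);

// Regular JSON API for web and mobile clients.
app.get('/api/messages', (req, res) => {
  res.json([{ from: 'alice', text: 'hi' }]);
});

// Real-time chat goes over socket.io only.
io.on('connection', (socket) => {
  socket.on('chat message', (msg) => {
    io.emit('chat message', msg); // broadcast to every connected client
  });
});

server.listen(3000);
```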
Socket.io is faster than plain HTTP. I recommend you use it for everything, since you need to have chat in the first place anyway.
In my case, a real-time Texas Hold'em-like game could handle up to 2500 concurrent connections with one Node process. However, if you change the transport from websocket to xhr-polling, it can handle something like 10x more connections than pure websocket. Your application is just chat, so I guess a little slowdown wouldn't be a problem. If you are sure you will exceed this number, then yes, scaling socket.io is a pain.
This problem will only happen if you run socket.io on a port other than 80 or 443. If you already have a frontend web server in another language, you can still run socket.io on another subdomain so it can use port 80 without conflicting with your main frontend web server. Socket.io supports cross-domain connections without a problem.
Have you used trello.com? If not, try it :). It's great for task management or even some Agile workflow, and they used socket.io. https://c9.io/ is another one; it's an online IDE with Google Docs-like collaboration. One thing to note is that the xhr-polling transport in socket.io is the same as HTTP/AJAX with long-polling (better than plain AJAX). You can read more at:
http://book.mixu.net/node/ch13.html