I have two MEAN stack apps and I'd like a suggestion on whether to run them on the same server or on the same port (which one is better). I found an article about running servers on the same port using proxies, and I'd also like to know whether that is a good idea. Running both on the same server seems troublesome, because I'm using Angular 6 as the front end: I would have to build those files into my single Node app's public folder, which I think is impossible, since each app has its own index.html (and other build files too), and there can only be one. If neither approach is a good idea, please let me know. Thanks for your help and attention.
I'm new to Node.js but willing to give it a serious try. Coming from PHP, things seem a bit confusing, as there is no index.php; instead a start script needs to be executed to fire up the server (npm start).
How is this done in production? Are there pre-run scripts? What if the server closes for some reason, how do I get it back up automatically without causing connection problems for the clients? Will it automatically work for the domain, or does it also mean someone always has to go to domain.com:3000?
Or am I thinking about this the wrong way?
What you are asking is quite broad. Let me give you an idea of how it works.
Coming from PHP things seem a bit confusing as there is no index.php, but a start-script needs to be executed to fire up the server npm start.
In Node.js we have a file with which we start our Node server and in which we decide what we want. Most people use app.js, server.js, or index.js.
When you run npm start, that means you have a package.json in the folder with a start script such as start: node app.js. When you run npm start, that script gets fired.
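For illustration, such a package.json might contain something like this (app.js and the package name are just assumptions; use whatever your entry file and project are called):
{
  "name": "my-app",
  "version": "1.0.0",
  "scripts": {
    "start": "node app.js"
  }
}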
How is this done in production? Are there pre-run scripts?
NODE_ENV=production npm start. You can access this in your Node code as
process.env.NODE_ENV. In this way you can add a dev or qa tag for each environment.
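A minimal sketch of reading that value in code (the dev/qa/production branches are just an illustration):
// NODE_ENV is set on the command line, e.g. NODE_ENV=production npm start
var env = process.env.NODE_ENV || 'development';

if (env === 'production') {
    console.log('running with production settings');
} else if (env === 'qa') {
    console.log('running with qa settings');
} else {
    console.log('running with dev settings');
}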
I also recommend having a look at
http://pm2.keymetrics.io/
What if the server closes for some reason, how do I get it back up automatically without having connection problems for the clients?
For that reason you can look at https://nodejs.org/api/cluster.html
When a worker crashes you can fork another one, since a single Node process runs your code on one thread.
You can also handle all kinds of Node.js errors this way; it lets Node.js catch uncaught exceptions and errors:
https://nodejs.org/api/process.html
process.on
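A rough sketch combining both ideas: the cluster master re-forks a worker when one dies, and each worker logs uncaught exceptions before exiting (the port and worker count are placeholders):
var cluster = require('cluster');
var http = require('http');

if (cluster.isMaster) {
    cluster.fork(); // fork more workers for real load

    // Replace any worker that dies so the app stays up
    cluster.on('exit', function (worker, code, signal) {
        console.log('worker ' + worker.process.pid + ' died, forking a new one');
        cluster.fork();
    });
} else {
    // Catch otherwise-fatal errors, log them, then let the master replace this worker
    process.on('uncaughtException', function (err) {
        console.error('uncaught exception:', err);
        process.exit(1);
    });

    http.createServer(function (req, res) {
        res.end('Hello from worker ' + process.pid + '\n');
    }).listen(3000);
}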
does it also mean someone alwasy has to go to domain.com:3000?
No. You can use any port you want: 80, 8080, whatever. I recommend putting nginx in front of the Node.js application, but for a simpler setup you can go with a plain Node application.
For example:
var http = require('http');
var port = 3000; // or any port you like: 80, 8080, ...

http.createServer(function (request, response) {
    response.writeHead(200, {
        'Content-Type': 'text/plain',
        'Access-Control-Allow-Origin': '*'
    });
    response.end('Hello World\n');
}).listen(port);
Hope this helps.
Answering your first question: there are other ways to run a Node application. I suggest you start using a package like nodemon, which is built for exactly this purpose.
Answering your second question: you can do the same for production deployments using a container system if you like. Some options are Docker, Kubernetes, and many more.
Automatic restarts can be handled either by your container manager or by the package you used for deployment.
And for redirecting all requests coming in on port 80 or 443 to your application, you can try nginx.
There are modules you can use to restart the server automatically if it closes for some reason, such as pm2 and forever.
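As a rough illustration of the restart-on-crash idea, here is a sketch using the forever-monitor package (the programmatic companion to forever); app.js and the restart limit are placeholders:
var forever = require('forever-monitor');

// Restart app.js automatically whenever it exits, up to 10 times
var child = new forever.Monitor('app.js', {
    max: 10,
    silent: false
});

child.on('restart', function () {
    console.log('app.js was restarted');
});

child.on('exit', function () {
    console.log('app.js will no longer be restarted');
});

child.start();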
Without going into much detail, you have to be VERY clear about the following:
Your node web process will die. Yes, that's right, when there is an uncaught exception it can die. Therefore, you need more than one process for failover, and for that there are many techniques and libraries. Some of them are:
Several node processes load balanced behind a web server (nginx most commonly used)
Managed cluster of node processes (https://github.com/Unitech/pm2) - see the sketch after this list
Or (not that good for production in my opinion) some process monitor which will restart your node web process if it dies:
https://github.com/foreverjs/forever
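For the managed-cluster option, a minimal pm2 configuration file could look like the sketch below (app name, script path, and instance count are placeholders; you would start it with pm2 start ecosystem.config.js):
// ecosystem.config.js
module.exports = {
    apps: [{
        name: 'web',          // placeholder app name
        script: './app.js',   // placeholder entry file
        instances: 4,         // number of node processes
        exec_mode: 'cluster', // load-balance requests across the instances
        env: {
            NODE_ENV: 'production'
        }
    }]
};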
I am creating a game with Unity and I want to upload the players' scores to MongoDB. Therefore, I have built a Node.js server listening on port 3000; the scores are sent to the server and stored in the database.
My question is that if I want to create a website for viewing/analyzing players' scores, which approach should I use?
create two node.js servers, one for the web, one for the game
one Node.js server but listening on both port 80 and port 3000 (I'm not sure whether that is possible or not)
any other better suggestions?
Thank you.
I would create one Node server that serves both API and web requests.
It sounds like the data served by the API and the web will be the same or subsets of each other. So you'll probably want to share code, look up the same things in the database, and so on.
From here, you could either create separate routes for the API and the web (/api/v1/my_scores vs /my_scores), OR realize that you're just asking for different representations of the same data and do something RESTful, like checking the Accept header and sending either server-rendered HTML or JSON back to the client.
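A small Express sketch of that content-negotiation approach (the route name and the stubbed getScores lookup are assumptions for illustration):
var express = require('express');
var app = express();

// Stub lookup; in the real app this would query the database
function getScores(callback) {
    callback(null, [{ player: 'alice', score: 42 }]);
}

app.get('/my_scores', function (req, res) {
    getScores(function (err, scores) {
        if (err) return res.status(500).end();

        // Pick the representation based on the Accept header
        res.format({
            'text/html': function () {
                res.send('<ul><li>' + scores.map(function (s) {
                    return s.player + ': ' + s.score;
                }).join('</li><li>') + '</li></ul>');
            },
            'application/json': function () {
                res.json(scores); // same data for the game client
            }
        });
    });
});

app.listen(3000);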
Alternatively, you could just create an API in Node, then use a purely front-end tool like Angular or React to build a web front end for your site.
Using port 3000 is not a good idea, because many users access the internet through firewalls that block non-standard ports.
I would recommend using port 443 and HTTPS to secure the communication for both use cases.
If the site for analyzing scores does not share logic with the API server, it can be created as a separate site, but starting out it is easier to manage a single application.
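A minimal sketch of serving a Node/Express app over HTTPS on port 443 (the certificate paths are placeholders for whatever your certificate provider gives you):
var https = require('https');
var fs = require('fs');
var express = require('express');

var app = express(); // the same app can serve both the API and the website

var options = {
    key: fs.readFileSync('/path/to/privkey.pem'),   // placeholder path
    cert: fs.readFileSync('/path/to/fullchain.pem') // placeholder path
};

// 443 is the standard HTTPS port, so firewalls generally allow it
https.createServer(options, app).listen(443);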
If I understand your question correctly, and to my limited knowledge, I don't think you need more than one server with a database. The reason is that the website only needs to display the high scores; end users don't insert anything through it. The complexity is already minimal, so don't bother creating another server. Just make a separate data-fetching API endpoint for the website to use.
Use case:
My Node.js server starts with a configuration wizard that allows the user to change the port and scheme, and even to update the Express routes.
Question:
Is it possible to apply such configuration changes on the fly? Restarting the server can definitely bring all the changes online, but I'm not sure how to trigger that from code.
Changing core configuration on the fly is rarely practiced, and Node.js and most HTTP frameworks do not support it at this point either.
Modifying the configuration and then restarting the server is a completely valid solution, and I suggest you use it.
To restart the server programmatically you have to execute logic outside of Node.js, so that the process can continue once the Node.js process is killed. Assuming you are running the Node.js server on Linux, a Bash script sounds like the best tool available to you.
Implementation will look something like this:
Client presses a switch somewhere on your site powered by node.js
Node.js then executes some JavaScript code which instructs your OS to execute a bash script, let's say script.sh
script.sh restarts node.js
Done
If any of the steps is difficult, ask about it. Though step 1 is something you are likely handling yourself already.
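A rough sketch of step 2 (the /restart route is hypothetical, and script.sh is assumed to contain whatever your system needs to stop and relaunch the Node process):
var express = require('express');
var spawn = require('child_process').spawn;

var app = express();

// Hypothetical endpoint hit by the switch on the site
app.post('/restart', function (req, res) {
    res.send('Restarting...');

    // Run script.sh detached so it survives this process being killed
    var child = spawn('bash', ['script.sh'], {
        detached: true,
        stdio: 'ignore'
    });
    child.unref();
});

app.listen(3000);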
I know this question was asked a long time ago but since I ran into this problem I will share what I ended up doing.
For my problem I needed to restart the server because the user is allowed to change the port of their website. What I ended up doing is wrapping the whole server creation (https.createServer / server.listen) in a function called startServer(port). I call this function at the end of the file with a default port. The user changes the port by accessing the endpoint /changePort?port=3000. That endpoint calls another function, restartServer(server, res, port), which in turn calls startServer(port) with the new port and then redirects the user to the site on that new port.
Much better than restarting the whole Node.js process.
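A simplified sketch of that pattern (plain HTTP is used here to keep it short, the hostname in the redirect is a placeholder, and closing the old server before reopening is my assumption about how the restart works):
var http = require('http');
var url = require('url');

function startServer(port) {
    var server = http.createServer(function (req, res) {
        var parsed = url.parse(req.url, true);

        if (parsed.pathname === '/changePort' && parsed.query.port) {
            return restartServer(server, res, parseInt(parsed.query.port, 10));
        }

        res.end('Listening on port ' + port + '\n');
    });

    server.listen(port, function () {
        console.log('Server listening on ' + port);
    });
}

function restartServer(oldServer, res, newPort) {
    // Redirect the client to the new port; 'localhost' is a placeholder host
    res.writeHead(302, {
        Location: 'http://localhost:' + newPort + '/',
        Connection: 'close' // let this socket close so close() can finish
    });
    res.end();

    oldServer.close(function () {
        startServer(newPort);
    });
}

startServer(3000); // default port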
CONTEXT
I am trying to build a new web app in NodeJS. This webapp uses two main components.
1) Code that I am writing within NodeJS - AKA my own logic flow.
2) A 3rd party open source NodeJS app - EtherCalc - that uses both HTTP & Socket.io
EtherCalc is a completely standalone web application on its own. It is a spreadsheet-in-a-browser+server meant to be used as a standalone app. Of importance, it has its own namespace (as in it has various pathnames that route to different functions).
My app and EtherCalc each run on their own ports independently of each other.
For simplicity's sake, let the domain name be localhost.
CHALLENGE I AIMED TO SOLVE
My application will be using both the spreadsheet capabilities of EtherCalc, as well as the non-spreadsheet-related logic flow of my own code. Users will have access to both interfaces. However, I want this all to appear to come from one URL/port - I don't want non-programmers to be specifying different ports to access different functionality.
The easiest way for me to tackle this was to create a namespace for EtherCalc within my app. Any path that starts with /ethercalc/ automatically gets forwarded to the port that EtherCalc is running on (with the ethercalc/ prefix removed from the request URL before it is forwarded). Anything that doesn't start with that stays within the standard server logic flow.
I'm using node-http-proxy to make this happen, using the following code:
// 'proxy' is the node-http-proxy instance created earlier (0.x API)
proxy.proxyRequest(req, res, {
    host: 'localhost',
    port: 8000
});
PROBLEM I HAVE COME ACROSS
This seems to function fine initially - the spreadsheet interface loads when I go to any /ethercalc/ url. However, EtherCalc uses either WebSockets or JSON Polling in order to keep the server & the browser on the same page. This doesn't seem to work properly. For some reason, it takes a good 10 seconds to actually get that connection going. This is especially problematic if I've worked on a spreadsheet, then load it again later - there's a 10 second window before I actually see data I've put in beforehand.
After running into this issue, I removed the URL-rewriting functionality; now my app simply forwards the requests to the port EtherCalc is running on. However, the problem remains. Connecting directly to the port EtherCalc is on removes the problem.
Rewriting the code to use node-http-proxy's httpProxy.createServer() code didn't make a difference either - same exact outcome...
What's bizarre is that when connecting to my app port (the one that has a proxy/forwarding system in place), AFTER the lengthy wait period, it functions just fine - everything is completely synced up in real time.
Does anybody have any idea what it is that's going on?
TL;DR
I have a web app in NodeJS facing HTTP port 80.
I have a 3rd party web app in NodeJS using Socket.IO (EtherCalc) running on the same server on a non-public port.
I want to serve webpages from my own webapp AND webpages from EtherCalc to the same domain name on port 80.
I have a proxy (node-http-proxy) running on my webapp to make this possible.
For some reason when I do this, the handshake that takes place between the browser and the server for EtherCalc (now running through a proxy) takes FOREVER. (or rather 10 seconds, AKA unacceptable for a consumer facing webpage)
I have no idea why this happens.
Any clue/hints/suggestions/leads? Socket.IO is completely new to me - I have used Node for a long time (2 years), but it's been entirely in HTTP world. Proxies as well are new to me (this is my first time using one - using it for the namespace/port issues). As such, I really have no idea where to begin looking.
I haven't found anything about my problem, so I'd like to ask whether the following can be solved. I have a Node.js server which displays a website with a button. Is it possible to start another Node server (which should run some SpookyJS tests and print the results to the website) when I click this button?
I found out that with nowJS you have a shared space which the server and "client" (some html page) share. Is this module helpful?
Thanks for your help,
Alex
In short - Yes!
But perhaps you can have both web servers running at all times. In fact, it'll be less of a load on your hardware.
1st Server - Application Server - runs at yoursite.com
2nd Server - SpookyJs/Test Server - runs at tests.yoursite.com
After the servers are up and running, the next thing I'd do is wrap the SpookyJS application with a simple RESTful interface/API to start tests and to respond with the result of a test.
An important thing to note here is that when you start the SpookyJS application, you should let it stay open, so that every request to the SpookyJS application (through your interface) just calls the "open" or the "then" method.
Again, this is to remedy the issue of spawning too many headless browsers.
After the request goes through, go ahead and respond to the request with the result that spooky gives you.
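A rough sketch of such an interface on the test server (the /run-test route and the runSpookyTest helper are hypothetical; the helper would wrap your long-lived SpookyJS instance and call its "open"/"then" methods):
var express = require('express');
var app = express();

// Hypothetical wrapper around the long-lived SpookyJS/headless-browser instance
function runSpookyTest(name, callback) {
    // ...drive spooky here and collect the result...
    callback(null, { test: name, passed: true }); // stubbed result
}

app.post('/run-test/:name', function (req, res) {
    runSpookyTest(req.params.name, function (err, result) {
        if (err) return res.status(500).json({ error: err.message });
        res.json(result); // respond to the application server with the result
    });
});

app.listen(4000); // placeholder port for tests.yoursite.com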
Maybe that helps?
We are doing similar things with Zombie js... so maybe it will help you (: