Is it possible to develop a web application using a mix of technologies like servlets and Node.js? For instance, using servlets in some places and Node.js in others when required.
Yes. Read about microservices. Docker can come in handy;)
Yes, consider these two use cases.
Use case 1 (Sub Domains)
You want www.example.com to use PHP & MySQL for WordPress, api.example.com to run on Node.js & MongoDB, and coders.example.com to use Servlets.
Here you'd code and start all servers (PHP, Node.js & Java) on their own specific ports and configure a frontend server (like Nginx) to receive requests and proxy them to the right backend according to the subdomain.
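A minimal Nginx sketch of that setup (the ports and domains are placeholders; the PHP/WordPress site would typically sit behind Apache or php-fpm rather than a plain proxy):

    # api.example.com -> Node.js on port 3000
    server {
        listen 80;
        server_name api.example.com;
        location / {
            proxy_pass http://127.0.0.1:3000;
        }
    }

    # coders.example.com -> Java servlet container (e.g. Tomcat) on port 8080
    server {
        listen 80;
        server_name coders.example.com;
        location / {
            proxy_pass http://127.0.0.1:8080;
        }
    }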
Use case 2 (In-App)
You want a request to come into your Node.js application, get processed, and then have the data handed to another process (a running Java app), waiting for a response back so you can reply to the user.
This can be accomplished with message queues or HTTP/web services.
Both of these approaches will let you share data back and forth.
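For instance, the HTTP approach could look like the sketch below, where the Node app forwards data to the running Java service and waits for its reply; the Java service's address and its /process route are made-up placeholders:

    const http = require('http');

    // Send a payload to the Java app over HTTP and wait for its reply.
    function callJavaService(payload, callback) {
      const body = JSON.stringify(payload);
      const req = http.request({
        host: 'localhost',
        port: 8080,          // placeholder: where the Java app listens
        path: '/process',    // placeholder route on the Java side
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
          'Content-Length': Buffer.byteLength(body),
        },
      }, (res) => {
        let data = '';
        res.on('data', (chunk) => (data += chunk));
        res.on('end', () => callback(null, JSON.parse(data)));
      });
      req.on('error', callback);
      req.end(body);
    }

    // Usage: hand the data to the Java app, reply to the user when it answers.
    callJavaService({ userId: 42, action: 'score' }, (err, result) => {
      if (err) throw err;
      console.log('Java service replied:', result);
    });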
Read more on message queues
Read more on HTTP/Web services
References
Is it possible for node.js to import a java library
https://www.quora.com/How-does-Node-js-communicate-with-Java-applications
https://www.quora.com/topic/Message-Queuing
I want to practice creating my own RESTful API service to go along with a client-side application that I've created. My plan is to use Node and Express to create a server. On my local machine, I know how to set up a local server, but I would like to be able to host my application (client and server) online as part of my portfolio.
The data that my client application would send to the server would not be significant in size, so there wouldn't be a need for a database. It would be sufficient to just have my server save received data dynamically in an array, and I wouldn't care about having that data persist if the user exits the webpage.
Is it possible to use a service like Netlify in order to host both a client and server for my purposes? I'm picturing something similar to how I can start up a local dev server on my computer so that the front-end can interface with it. Except now I want everything hosted online for others to view. I plan to create the Express server in the same repo as the front-end code.
No, Netlify doesn't allow you to run a server or backend. However, they do allow you to run serverless functions in the cloud. These can run for up to 10 seconds at a time. Furthermore, Netlify also has a beta solution called "background functions" that can run for up to 15 minutes. But honestly, for a RESTful API there are surely better solutions out there.
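For completeness, a minimal sketch of a Netlify serverless function (the netlify/functions path is Netlify's convention; the data shape is a placeholder). One caveat for your use case: functions are stateless between invocations, so an in-memory array will not reliably persist even while users stay on the page:

    // netlify/functions/scores.js
    // Module-level state may survive *warm* invocations but is not guaranteed.
    const scores = [];

    exports.handler = async (event) => {
      if (event.httpMethod === 'POST') {
        scores.push(JSON.parse(event.body));
        return { statusCode: 201, body: JSON.stringify(scores) };
      }
      // Any other method: return the current (best-effort) list.
      return { statusCode: 200, body: JSON.stringify(scores) };
    };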
If you are still looking for a "Netlify for backend", you can consider Qovery. They explained here why it is a good fit for their users.
This is more like a design question but I have no idea where to start.
Suppose I have a realtime Node.js app that runs on multiple servers. When a user logs in she doesn't know which server she will be assigned to. She will just login, do something and logout and that's it. A user won't be interacting with other users on a different server, nor will her details be stored on another server.
In the backend, I assume the Node.js server will put the user's login details into some queue, and then, when there is space, it will assign the user to an available server (the server with the lowest ping, or one that is not full). Because there is a limited number of users per physical server, when a user tries to log in to a "full" server it will direct her to another available one.
I am using the ws module of node.js. Is there any service available for this purpose, or do I have to build my own? How difficult would that be?
I am not sure how websockets fit into this question, so I'm ignoring that part. I guess your actual question is about load balancing... Let me try paraphrasing it.
Q: Does Node.js have any load-balancing feature that I can leverage?
Yes, and it is called cluster in Node.js. Instead of the traditional single node process listening on a single port, this module allows you to spawn a group of node processes and have them all bound to the same port.
What this means is that all the user knows is the service's endpoint. He sends a request to it, and one of the available processes in the group serves it.
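A minimal sketch of the cluster approach (one worker per CPU core; port 3000 is just an example):

    const cluster = require('cluster');
    const http = require('http');
    const os = require('os');

    if (cluster.isMaster) {
      // Fork one worker per CPU core; all workers share port 3000.
      os.cpus().forEach(() => cluster.fork());
    } else {
      http.createServer((req, res) => {
        res.end(`Served by worker ${process.pid}\n`);
      }).listen(3000);
    }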
Alternatively using Nginx, the web server, as your load balancer is also a very popular approach to this problem.
References:
Cluster API: https://nodejs.org/api/cluster.html
Nginx as load balancer: http://nginx.org/en/docs/http/load_balancing.html
P.S
I guess the key word for googling solutions to your problem is load balancer.
Out of the two solutions, I would recommend going the Nginx way, as it is a more scalable approach.
For example, your Node processes could be spread across multiple hosts (horizontal scaling), whereas the cluster solution is more for vertical scaling, taking advantage of a multi-core machine.
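A minimal sketch of such an Nginx configuration, including the extra headers needed to proxy websocket upgrades (relevant since you use the ws module); the backend addresses are placeholders:

    upstream node_backend {
        # ip_hash;  # uncomment for sticky sessions if users must stay on one host
        server 10.0.0.1:3000;
        server 10.0.0.2:3000;
    }

    server {
        listen 80;
        location / {
            proxy_pass http://node_backend;
            # Required so websocket upgrade requests survive the proxy.
            proxy_http_version 1.1;
            proxy_set_header Upgrade $http_upgrade;
            proxy_set_header Connection "upgrade";
        }
    }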
I think I know what a framework is, and I know some famous frameworks like Ruby on Rails and Spring, and I think I can distinguish between the meaning of web server and web application server.
But I don't know the difference between a WAS and a framework. To me a framework seems like a kind of WAS, because the framework does a lot of the dynamic work, such as database handling, for requests coming from the web server (Apache or Nginx).
I'm confused about the relationship between these two parts of web programming.
Could you explain it?
Basically, the framework is only responsible for producing a response to an HTTP request (that includes handling the database, as you said). But Rails isn't responsible for opening a new thread (or, in some implementations, a process) whenever a new HTTP request arrives - this is done by the application server (such as Puma, WEBrick, Unicorn, etc.). This is called concurrency (in a nutshell, the ability to serve multiple requests at the same time) and it is purely the job of the app server. Another thing is understanding (and parsing) the HTTP request - Rails doesn't implement HTTP; it receives a ready request from the app server, which does implement HTTP.
In Ruby land, the job of each part is defined by the Rack protocol: https://rack.github.io/. Rails, as a Rack application, simply waits for "something" (the web application server) to 'call' it with an HTTP request, and it returns the response.
So to sum up: the application server handles threading or multi-processing to serve HTTP requests to Rails (the app server is basically always listening on some socket for new requests, and provides concurrency either by forking processes, opening new threads, or both, depending on the app server). The app server therefore also needs to understand HTTP (be able to parse an HTTP request) so it can serve it to Rails.
Rails, the web framework, only needs to handle an HTTP request and return the response.
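As a loose analogy in Node.js terms (not Rack itself, just to make the division of labour concrete): the built-in http module plays the app-server role, listening on a socket, parsing HTTP and handling concurrency, while your handler function plays the framework role of turning a parsed request into a response:

    const http = require('http');

    // "Framework" role: given an already-parsed request, produce a response.
    function app(req, res) {
      res.writeHead(200, { 'Content-Type': 'text/plain' });
      res.end('Hello from the framework layer\n');
    }

    // "App server" role: listen on a socket, parse HTTP, and 'call'
    // the framework once per incoming request.
    http.createServer(app).listen(8080);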
For those who want to understand the difference between a web server and an app server, refer to: What is the difference between application server and web server?
I am creating a game in Unity and I want to upload the players' scores to MongoDB. Therefore, I have built a node.js server listening on port 3000; the scores are sent to the server and stored in the database.
My question is that if I want to create a website for viewing/analyzing players' scores, which approach should I use?
create two node.js servers, one for the web, one for the game
one node.js server but listening on both ports 80 and 3000 (I'm not sure whether that is possible or not)
any other better suggestions?
Thank you.
I would create one Node server to serve both API and web requests.
It sounds like the data served by the API and the web will be the same, or subsets of each other. So you'll probably want to share code, look up the same data in the database, and so on.
From here, you could either create separate routes for the API and the web (/api/v1/my_scores vs /my_scores), OR recognize that you're just asking for different representations of the same data and do something RESTful, like checking the Accept header and sending either server-rendered HTML or JSON back to the client.
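A minimal Express sketch of the second idea, using res.format to branch on the Accept header (the route name and data shape are placeholders):

    const express = require('express');
    const app = express();

    const scores = [{ player: 'alice', score: 42 }]; // placeholder data

    app.get('/my_scores', (req, res) => {
      // res.format picks the branch matching the request's Accept header.
      res.format({
        'application/json': () => res.json(scores),
        'text/html': () => res.send(
          '<ul>' + scores.map(s => `<li>${s.player}: ${s.score}</li>`).join('') + '</ul>'
        ),
      });
    });

    app.listen(3000);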
Alternatively, you could just create an API in Node, then use a purely front-end tool like Angular or React to build a web front end for your site.
Using port 3000 is not a good idea, because many users access the internet through firewalls that block non-standard ports.
I would recommend using port 443 and HTTPS to secure the communication, for both use cases.
If the site for analyzing scores does not share logic with the API server, then it can be created as a separate site - but at the start it is easier to manage a single application.
If I understand your question correctly then, to my limited knowledge, you don't require more than one server with a database. The reason is that on the website you only want to display the high scores; the end user can't insert anything through the website. So the complexity is minimal, and there is no need to create another server. Just expose a separate data-fetching API for the website to use.
I have a working PHP application. It allows users to create private projects and invite others into them. Now, using node.js and socket.io, I want to add real-time comments, posts, etc.
What is the best architecture?
I see two solutions now.
The first is:
User sends an AJAX query to the PHP backend:
http://example.com/comment_add.php?text=...
comment_add.php adds the comment to the database and, via AMQP (or something better?), notifies the node.js server, which broadcasts the comment to the channel's subscribers.
The second is:
User sends an AJAX query to the node.js server: http://example.com:3000/comment_add
Node.js sends a request to the PHP backend (but how? And what about authorization?), receives the response, and then broadcasts it to the channel's subscribers.
What is the best way? Are there other methods? How do I implement this properly?
When you decide to use node.js + socket.io to make a real-time web app, you don't need to think about PHP anymore, and you can forget Ajax as well... Socket.io handles the communication between client and server.
But yes, you can still use Ajax and PHP for building websites fast, and for the other functions that don't need real-time.
The second way is the best method. You can use HTTP to communicate with PHP from node.js. Authorization can be done in node.js, by passing the auth credentials to PHP every time.
Finally my working solution is #1.
When a user establishes a connection to node.js/socket.io, he just sends a 'subscribe' message to node.js with his PHP session id. Node.js checks authorization using a POST request to the PHP backend and, if all is OK, allows the user to establish the connection.
The frontend sends all requests to PHP, as it did before node.js.
PHP modifies some object, checks who can access the modified object, and sends a message (via AMQP or Redis pub/sub, etc.) to node.js:
    {
      id_object: 125342,
      users: [5, 23, 9882]
    }
node.js then checks which of the listed users have active sockets and, for each of them, sends a GET request to PHP:
    {
      userId: 5,
      id_object: 125342
    }
A special PHP controller receiving this request runs a query to fetch the object with the access rights of the given user id, and then sends a message back to node.js with the resulting answer. Node.js then sends the answer to the user's frontend via the socket.
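Putting the pieces together, a minimal node.js sketch of this flow; it assumes socket.io, a v3-style redis client, and two hypothetical PHP endpoints (auth_check.php, object_for_user.php). The auth call is shown as a GET for brevity, where the answer above uses POST:

    const io = require('socket.io')(3000);
    const redis = require('redis');
    const http = require('http');

    const socketsByUser = {}; // userId -> active socket

    // Helper: GET a PHP endpoint and parse the JSON response.
    function getJson(url, callback) {
      http.get(url, (res) => {
        let body = '';
        res.on('data', (chunk) => (body += chunk));
        res.on('end', () => callback(JSON.parse(body)));
      });
    }

    io.on('connection', (socket) => {
      socket.on('subscribe', (phpSessionId) => {
        // 1. Ask PHP to validate the session before accepting the user.
        getJson('http://example.com/auth_check.php?sid=' + phpSessionId, (auth) => {
          if (!auth.ok) return socket.disconnect(true);
          socketsByUser[auth.userId] = socket;
          socket.on('disconnect', () => delete socketsByUser[auth.userId]);
        });
      });
    });

    // 2. PHP publishes { id_object, users } after modifying an object.
    const sub = redis.createClient();
    sub.subscribe('object_updates');
    sub.on('message', (channel, message) => {
      const { id_object, users } = JSON.parse(message);
      users.forEach((userId) => {
        const socket = socketsByUser[userId];
        if (!socket) return; // user has no active socket
        // 3. Fetch the object as this user is allowed to see it, then push it.
        getJson(`http://example.com/object_for_user.php?user=${userId}&object=${id_object}`,
          (obj) => socket.emit('object_update', obj));
      });
    });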
I faced this same question a year ago when starting my final-year project at university. I realized that my project was much better suited to using Node standalone. Node is very good at dealing with I/O; this can be anything from an HTTP request to a database query. Adding a PHP-based web server behind Node is going to add unneeded complexity. If your application needs to perform CPU-intensive tasks, you can quite easily spawn 'child' node processes which perform the needed operation and return the result to the parent.
However, out of the two methods you have mentioned, I would choose #2. Node.js can communicate with your PHP server in a number of ways; you could look at creating a unix socket connection between your PHP server and Node. If that is unavailable, you could simply communicate between Node and your PHP backend using HTTP. :)
Take a look here; this is a solution to a question very similar to your own:
http://forum.kohanaframework.org/discussion/comment/57607