Is socket.io implementation possible inside REST framework? - node.js

I am building an app in which I provide functionality X, Y and chat.
Let's say that X and Y are non-interactive, e.g. reading articles, which will work fine with REST (on a node.js server), while chat is obviously interactive, so it will work best with socket.io!
Questions:
1. Is it possible for me to 'switch on' a socket between the server and the user when the user navigates to the chat part of the application?
2. Can I open up a socket inside a GET request for the url example.com/chats/usr_id on the node.js server?
3. How can this be accomplished inside a Backbone routing framework?

Yes. Just initialize the connection when the view is rendered (via a controller or script), and disconnect when the view is terminated. See the socket.io client documentation: http://socket.io/docs/client-api/
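As an illustration, here is a minimal sketch of a Backbone chat view that connects when it is created and disconnects when it is removed. The view name, server URL, and event names are assumptions for the example, not part of the original answer.

// A minimal sketch, assuming the socket.io client script is loaded on the page.
var ChatView = Backbone.View.extend({
    initialize: function () {
        // Open the socket when the chat view comes up.
        this.socket = io.connect('http://example.com');
        this.socket.on('chat message', this.onMessage.bind(this));
    },
    onMessage: function (msg) {
        // Append incoming messages to the view (rendering details omitted).
        this.$el.append($('<li>').text(msg));
    },
    remove: function () {
        // Close the socket when the view is terminated.
        this.socket.disconnect();
        return Backbone.View.prototype.remove.apply(this, arguments);
    }
});

A Backbone router can then instantiate ChatView for the chat route and call remove() when navigating away, which keeps the socket's lifetime tied to the chat page.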
You cannot open sockets with a GET request. Socket.io has its own built-in connection mechanism: it negotiates the transport itself, using the WebSocket protocol where available and falling back to long polling otherwise. You can, however, use custom URLs (namespaces) for unique things. Once again, consult the socket.io documentation: http://socket.io/docs/client-api/
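For example, here is a sketch of giving the chat its own namespace on the server; Express and the '/chat' namespace name are assumptions:

// server.js: a minimal sketch, assuming Express.
var app = require('express')();
var server = require('http').createServer(app);
var io = require('socket.io')(server);

// '/chat' is an illustrative namespace for the chat part of the app.
var chat = io.of('/chat');
chat.on('connection', function (socket) {
    socket.on('chat message', function (msg) {
        // Rebroadcast each message to everyone in the namespace.
        chat.emit('chat message', msg);
    });
});

server.listen(3000);

The client connects to the namespace with io('http://example.com/chat'), which is where a chat-specific URL fits in, rather than in a GET request.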
Here is a tutorial for building a chat application with socket.io: http://www.sitepoint.com/chat-application-using-socket-io/
P.S. I'd suggest reading up on how WebSockets work, as you don't seem to have a very strong understanding yet.

Related

IHP - How to send and receive data between clients via custom Web Socket controller?

I'm building a chat application with a custom WebSocket controller, and I want to establish two-way communication between different clients with the server in the middle: whenever a client sends a request, the update is applied on the server, and the server emits a response to all the clients.
Note: I've tried using IHP's Auto Refresh, but it is turning out to be quite expensive for my use case, so I'm trying to set up a custom WebSocket controller instead.
Check out the new IHP DataSync API: https://ihp.digitallyinduced.com/Guide/realtime-spas.html
It's higher level than raw WebSockets, but it can likely help you implement the chat app.

Best way to connect 2 separate node processes with socket.io communicating to a client

I'm new to working with sockets and have a small system design question:
I have 2 separate node processes for a web app: one is a simulator that is constantly running, and the second is an API server. Both share the same MongoDB database, and we have a React app for the client, served by the API server.
I'm looking to implement socket.io for real-time notifications, so I've set up a simple connection between the API and the client.
My problem is that while the simulator runs, there are some events that I also want to trigger push notifications for, so my question is how to hook that into everything.
The file hierarchy is like:
app/
  simulator/
  api/
  client/
I saw this article on communication between node processes, and I currently have 3 solutions in mind:
1. Leave the hierarchy as it is and install the socket.io package inside simulator as well. I'm not sure if sockets work this way, but can both simulator and api connect to the same socket?
2. Move the simulator file into the api file to fork as a child process, so that the 2 processes can communicate via child/parent messaging. simulator will message api, which will then emit updates through the socket to client.
3. Leave the hierarchy as is and communicate via node-ipc. Same situation as above, with simulator messaging api first before api emits that to client.
If 1 is possible, that seems like the best solution to me. It seems like extra work to add an additional layer of messaging for 2 and 3.
Leave the hierarchy as it is and install the socket.io package inside simulator as well. I'm not sure if sockets work this way, but can both simulator and api connect to the same socket?
The client would have to create a separate socket.io connection to the simulator process: one connection to the API server and an independent one to the simulator, receiving data from each over its own connection. The simulator and the API server cannot share the same socket unless they are in the same process.
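For illustration, the client side of that would look something like this; the ports and event names are assumptions:

// Client-side sketch: two independent socket.io connections.
// Ports 3000 (API server) and 3001 (simulator) are illustrative.
var apiSocket = io('http://localhost:3000');
var simSocket = io('http://localhost:3001');

apiSocket.on('api notification', function (data) {
    console.log('from API server:', data);
});
simSocket.on('sim notification', function (data) {
    console.log('from simulator:', data);
});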
Move the simulator file into the api file to fork as a child process, so that the 2 processes can communicate via child/parent messaging. simulator will message api, which will then emit updates through the socket to client.
This is really part of a broader option: the simulator communicates with the API server and sends it data, which the API server can then forward to the client over the single socket.io connection that the client made to the API server.
There are lots of different ways for the simulator process to communicate with the API server:
- Since it's already an API server, you can just make an API for this (probably non-public). The simulator calls an API to send data to the client; the API server receives that data and sends it on to the client.
- As you suggest, if the simulator is run from the API server as a child process, then you can use the parent/child messaging built into node.js (see the sketch after this list). Note that you don't have to move the simulator files into the API file at all. You can just use child_process to launch the simulator as another nodejs app from another project; you just have to know the path to that other project.
- You can use any other communication mechanism you want between the simulator process and the API server process: a socket.io connection between them, several forms of IPC, etc.
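Here is a minimal sketch of the child-process option; the file paths and the 'notification' event name are assumptions:

// api/server.js: launch the simulator as a child process and relay
// its messages to socket.io clients.
var fork = require('child_process').fork;
var path = require('path');
var io = require('socket.io')(3000);

// The simulator stays in its own project directory; only the path is needed.
var simulator = fork(path.join(__dirname, '../simulator/index.js'));

simulator.on('message', function (msg) {
    // Forward whatever the simulator reports to all connected clients.
    io.emit('notification', msg);
});

// simulator/index.js: report events to the parent process.
setInterval(function () {
    process.send({ event: 'tick', at: Date.now() });
}, 1000);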
If 1 is possible, that seems like the best solution to me.
Your #1 option is not possible as separate processes can't use the same socket.io connection.
It seems like extra work to add an additional layer of messaging for 2 and 3.
The first two options above (the internal API and the child process) are not much code in each server. You're doing interprocess communication, so you should expect to write some code to enable that, but it's not hard at all.
If the lifetimes of the simulator server and the API server are always tied together (they have no independent uses), then I'd probably do the child-process thing, where the API server launches the simulator and then uses parent/child messaging to communicate with it. You do NOT have to combine the sources to do this.
The child_process module can run the simulator process by just knowing what directory it is located in.
Otherwise, I'd probably make a small web server on a non-public port in the API server and have the simulator send data to that other web server. I often refer to this as a control port: a way of "controlling or diagnosing" the API server internals that can only be accessed from within the private network and/or with credentials. The reason I'd use a separate web server (in the same nodejs app as the API server) is to make it easy to secure, so it can't be accessed from the outside world like the regular public APIs can. You just put the internal web server on a port that is not exposed to the outside world.
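A minimal sketch of that control-port idea, assuming Express; the ports and the '/notify' route are illustrative:

// api/server.js: a public socket.io server plus an internal control port.
var express = require('express');
var http = require('http');

// Public app: serves the client and the socket.io connection.
var publicApp = express();
var publicServer = http.createServer(publicApp);
var io = require('socket.io')(publicServer);
publicServer.listen(3000);

// Control app: an internal-only endpoint the simulator posts data to.
var controlApp = express();
controlApp.use(express.json());
controlApp.post('/notify', function (req, res) {
    io.emit('notification', req.body);
    res.sendStatus(204);
});
// Bind to loopback so the control port is unreachable from outside.
controlApp.listen(8081, '127.0.0.1');

The simulator then just makes an HTTP POST to http://127.0.0.1:8081/notify whenever it has data for the clients.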
You should also check the Socket.IO docs on adapters and emitters. They let you emit to connected sockets from different node processes, which also helps with scaling.
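A sketch of the adapter/emitter approach, assuming a Redis instance on localhost and the socket.io-redis and socket.io-emitter packages:

// api/server.js: attach the Redis adapter to the socket.io server.
var io = require('socket.io')(3000);
var redisAdapter = require('socket.io-redis');
io.adapter(redisAdapter({ host: 'localhost', port: 6379 }));

// simulator/index.js: emit to the API server's clients through Redis,
// with no direct connection between the two processes.
var emitter = require('socket.io-emitter')({ host: 'localhost', port: 6379 });
emitter.emit('notification', { event: 'tick', at: Date.now() });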

Use socket.io and node.js between webpages

Is there any way using socket.io to share data between two different webpages?
Let's say I have page1.html and page2.html, accessed from different devices on the same network, and from page1 I send a string to page2.
From what I have seen, I could make this work using node.js and socket.io, based on the chat example from socket.io. However, I'm not sure how.
Any help? Thanks.
There must be a socket server that handles all the websocket connections, events, and so on.
The web pages cannot communicate directly; each must send its data to the server, which relays it to the other page. One important consideration is how you plan to identify the clients.
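As an illustration, a minimal relay server; the event names are assumptions:

// server.js: relay a string from page1 to every other connected page.
var io = require('socket.io')(3000);

io.on('connection', function (socket) {
    socket.on('string from page1', function (str) {
        // Broadcast to all sockets except the sender, so page2 receives it.
        socket.broadcast.emit('string for page2', str);
    });
});

Both pages load the socket.io client and connect to this server; page1 emits 'string from page1' and page2 listens for 'string for page2'.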

Setting websocket disconnect code or reason in Tornadoweb

I'm working on a robotics web application that connects to the server using WebSocket. The server is built on RosBridge and implemented with Tornado Web.
For the web UI, the websocket can be disconnected for any of the following 3 reasons:
- Server is down.
- Another user has kicked you out of the session.
- Session token has expired.
The best way for the server to convey this information to the UI is to set the close reason and/or close code (only one of the two is needed). My understanding is that the websocket protocol allows for this, but the option seems to be missing from the Tornado API.
The "close reason" field is not currently supported in Tornado. See https://github.com/facebook/tornado/issues/890

Using node.js and socket.io with PHP application

I have a working PHP application. It allows users to create private projects and invite others to them. Now, using node.js and socket.io, I want to add real-time comments, posts, etc.
What is the best architecture?
I see two solutions now.
The first is:
User sends an AJAX query to the PHP backend: http://example.com/comment_add.php?text=...
comment_add.php adds the comment to the database and, via AMQP (or something better?), notifies the node.js server, which broadcasts the comment to the channel's subscribers.
The second is:
User sends an AJAX query to the node.js server: http://example.com:3000/comment_add
Node.js sends a request to the PHP backend (but how? and what about authorization?), receives the response, and then broadcasts it to the channel's subscribers.
What is the best way? Is there another methods? How to implement this properly?
If you decide to use node.js + socket.io to make a real-time web app, you don't need to think about PHP anymore, and you can forget Ajax as well: socket.io will be the communication channel between client and server.
That said, you can still use Ajax and PHP to build the parts of the site that don't need real-time behavior.
The second way is the best method. You can use HTTP to communicate with PHP from node.js. Authorization can be done in node.js by passing the auth credentials through to PHP on every request.
In the end, my working solution was #1.
When a user establishes a connection to node.js/socket.io, he just sends a 'subscribe' message to node.js with his PHP session id. Node.js checks authorization with a POST request to the PHP backend and, if all is OK, allows the user to establish the connection.
The frontend sends all requests to PHP just as it did before node.js was added.
PHP modifies some object, checks who can access the modified object, and sends a message (via AMQP or redis pub/sub, etc.) to node.js:
{
    id_object: 125342,
    users: [5, 23, 9882]
}
node.js then checks which of the listed users have active sockets and, for each of them, sends a GET request to PHP:
{
    userId: 5,
    id_object: 125342
}
A special PHP controller receives this request, runs a query to fetch the object with the access rights of the given user id, and sends the resulting answer back to node.js. Node.js then delivers the answer to the user's frontend over the socket.
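Here is a minimal sketch of the node.js side of this flow, assuming redis pub/sub as the PHP-to-node channel; the channel name, event names, and in-memory socket map are assumptions:

// node.js side: subscribe to updates published by PHP and notify users.
var io = require('socket.io')(3000);
var redis = require('redis');
var sub = redis.createClient();

var socketsByUserId = {}; // filled in once the PHP auth check succeeds

io.on('connection', function (socket) {
    socket.on('subscribe', function (phpSessionId) {
        // Verify the session id with a POST to the PHP backend, then:
        // socketsByUserId[userId] = socket;
    });
});

sub.subscribe('object-updates');
sub.on('message', function (channel, message) {
    var update = JSON.parse(message); // e.g. { id_object: 125342, users: [5, 23, 9882] }
    update.users.forEach(function (userId) {
        var socket = socketsByUserId[userId];
        if (socket) {
            // In the full flow, node.js first GETs the per-user view of the
            // object from PHP and emits that controller's answer here.
            socket.emit('object update', { id_object: update.id_object });
        }
    });
});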
I faced this same question a year ago when starting my final year project at university, and I realized that my project was much better suited to using Node as a standalone. Node is very good at dealing with I/O, which can be anything from an HTTP request to a database query. Adding a PHP-based web server behind Node is going to add unneeded complexity. If your application needs to perform CPU-intensive tasks, you can quite easily spawn 'child' node processes that perform the needed operation and return the result to the parent node process.
However, of the two methods you mention, I would choose #2. Node.js can communicate with your PHP server in a number of ways: you could look at creating a unix socket connection between your PHP server and Node, or, if that is unavailable, simply communicate between Node and your PHP back end over HTTP. :)
Take a look here; it's a solution to a question very similar to your own:
http://forum.kohanaframework.org/discussion/comment/57607
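For the unix-socket suggestion above, here is a sketch of the Node side; the socket path and the one-JSON-message-per-write framing are assumptions (the PHP side would write to the same socket, e.g. via fsockopen):

// node.js side: listen on a unix domain socket that PHP can write to.
var net = require('net');
var io = require('socket.io')(3000);

var server = net.createServer(function (conn) {
    conn.on('data', function (buf) {
        // Assumes each write from PHP is a single JSON message.
        var msg = JSON.parse(buf.toString());
        // Forward PHP's notification to the channel's subscribers.
        io.emit('comment', msg);
    });
});

server.listen('/tmp/app.sock');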
