send Session Description from node server to client - node.js

Do I need to use a websocket to send JSON data to my client? (it's a tiny session description)
Currently my client-side code sends a session description via XHR to my Node.js server. After receipt, my node server needs to send this down to the other client in the 'room'.
I can achieve this using socket.io, but is it possible to do something a bit faster/more secure, like XHR for example?

If you just want to receive the offer from the other side and nothing else, I would suggest you try HTML5 Server-Sent Events.
But this may bring problems due to uneven browser support, so I would use a simple long-polling request. Since you only want to get the SDP offer, the implementation is pretty simple.
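For example, a long-polling version could be as small as the following sketch with Express (the /offer route and the single-offer store are invented for this example):

```js
// Long-polling sketch with Express (the /offer route and the single-offer
// store are invented for this example).
const express = require('express');
const app = express();
app.use(express.json());

let storedOffer = null;   // the SDP offer posted by the first peer
let waiting = [];         // responses held open until the offer arrives

// Peer 1 posts its offer; any peer already waiting gets it immediately.
app.post('/offer', (req, res) => {
  storedOffer = req.body;                 // e.g. { type: 'offer', sdp: '...' }
  waiting.forEach((pending) => pending.json(storedOffer));
  waiting = [];
  res.sendStatus(200);
});

// Peer 2 long-polls: the request is simply held open until the offer exists.
app.get('/offer', (req, res) => {
  if (storedOffer) return res.json(storedOffer);
  waiting.push(res);
  req.on('close', () => { waiting = waiting.filter((p) => p !== res); });
});

app.listen(3000);
```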

No, you don't need to use the WebSocket API to send JSON data from client to client via a server, but unless you use Google's proprietary App Engine Channel APIs, the WebSocket API is probably your best choice.
Also, please keep in mind that you're not only sending session descriptions, but also candidate info (multiple times) as well as other arbitrary data that you might need to start/close sessions, etc.
As far as I know, the WebSocket API is the fastest solution (faster than XHR) for signalling because all the overhead involved with multiple HTTP requests is non-existent after the initial handshake.
If you want to code things yourself, I'd start by reading the latest WebSocket draft and learning how to write the WebSocket server-side script yourself; otherwise you will pretty much have to rely on a WebSocket library like Socket.IO, or on a proprietary solution like Google's App Engine Channel APIs.
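For reference, a signalling relay over Socket.IO can be tiny. This is only a sketch: the 'join', 'sdp' and 'candidate' event names and the room handling are invented for the example.

```js
// Socket.IO signalling relay, reduced to a sketch.
const io = require('socket.io')(3000);

io.on('connection', (socket) => {
  socket.on('join', (room) => {
    socket.join(room);
    socket.room = room;
  });
  // relay session descriptions (offer/answer) to the other peer in the room
  socket.on('sdp', (description) => socket.to(socket.room).emit('sdp', description));
  // relay ICE candidates as they trickle in
  socket.on('candidate', (candidate) => socket.to(socket.room).emit('candidate', candidate));
});
```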

How about using the 303 HTTP status code?
The first client sends the session description to resource X; the server acknowledges the receipt and responds with a 303 status code that points to a newly created resource Y, which accumulates the other clients' session descriptions.
The first client polls resource X until it changes.
The second client sends its session description to resource A; the server acknowledges the receipt and updates resource Y. The first client notices the update with its next poll and will then have the second client's session information.
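A rough Express sketch of that flow, with made-up paths standing in for resources X and Y (in this sketch both clients poll Y directly after following the redirect):

```js
// Rough Express sketch of the 303 flow (paths are illustrative stand-ins).
const express = require('express');
const app = express();
app.use(express.json());

const rooms = {};   // roomId -> accumulated session descriptions
let nextId = 1;

// Resource X: client 1 posts its description, the server creates resource Y
// and answers 303 See Other pointing at it.
app.post('/descriptions', (req, res) => {
  const id = nextId++;
  rooms[id] = [req.body];
  res.redirect(303, `/rooms/${id}`);
});

// Resource Y: client 2 appends its description here...
app.post('/rooms/:id', (req, res) => {
  rooms[req.params.id].push(req.body);
  res.sendStatus(204);
});

// ...and both clients poll it until the other side's description shows up.
app.get('/rooms/:id', (req, res) => {
  res.json(rooms[req.params.id] || []);
});

app.listen(3000);
```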

Related

How to use socket.io properly with express app

I wonder how to use socket.io properly with my Express app.
I have a REST API written in Express/Node.js and I want to use socket.io to add real-time features to my app. Suppose I want to do something I can already do just by sending a request to my REST API. What should I do with socket.io? Should I send the request to the REST API and then send the result of the process to the socket.io client, or handle the whole process within the socket.io handler and then send the result to the socket.io client?
Thanks in advance.
The question is not that clear, but what I'm getting from it is that you want to know what you would use it for that you can't already do with your current API?
The short answer is: well, nothing really. WebSockets are just the natural progression of APIs and the need for a more 'real-time' interface between systems.
An older method (still used and relevant for the right use case) is long polling, where you keep checking back with the server for updated items and grab them if there are any. This works, but it can be expensive in terms of establishing a connection, performing a lookup, then closing the connection.
WebSockets keep that connection open, allowing both the client and the server to communicate in real time. So, for example, let's say you make an update to your backend data and want users to get that update. Using long polling you would rely on each client to ping back to the server, check whether there is an update, and grab it if so. This can cause lag between updates: some users have the updated data while others do not, etc.
Now take the same scenario with WebSockets: you make an update to the backend data and hit submit; this then emits to your socket server. The socket server takes the call, performs the task (grabs the updated data) and emits it to the users, so each connected user instantly gets that update.
Socket servers are typically used for things like real-time chat or polling, where packets are smaller, but they are also used for web games etc. The size of your payloads will determine how best to send data back and forth, because the larger the payload, the more resources/bandwidth it takes on the socket server, so it's something to consider.
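As an illustration of that flow (not a complete app; the /items route and the 'items:updated' event are made up), the REST endpoint can stay the source of truth while socket.io only pushes the result out:

```js
// The REST API remains the source of truth; socket.io just broadcasts updates.
const express = require('express');
const http = require('http');

const app = express();
app.use(express.json());
const server = http.createServer(app);
const io = require('socket.io')(server);

let items = [];

// The update still goes through the normal REST endpoint...
app.post('/items', (req, res) => {
  items.push(req.body);
  // ...and once it has been processed, every connected client gets it instantly.
  io.emit('items:updated', items);
  res.status(201).json(req.body);
});

server.listen(3000);
```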

Nodejs: SocketIO (websockets) vs Http

Normally I use Ajax HTTP requests to get/post data. Now I'm wondering: why shouldn't I replace all the Ajax GET requests with Socket.IO? Is there any disadvantage in following this approach?
I understand that session cookies are sent via HTTP headers between client and server with every HTTP request. During client<=>server interactions using sockets, will the session cookies in the browser automatically be sent to the server via socket headers (if such headers exist)?
In which use cases should I prefer Socket.IO over HTTP? (If you consider this a question that demands a broad answer, you can link me to some relevant articles.)
WebSockets are useful when the server needs to push real-time information to the client about events that happened on the server. This avoids the client making multiple polling AJAX calls to check whether some event has occurred on the server.
Think of a simple chat application. If the client needs to know whether the other participant in a chat session has written something in order to display it, it would need to make AJAX calls at regular intervals to check this on the server. WebSockets, on the other hand, allow the server to notify the client when this event occurs, which is much more efficient in terms of network traffic. The WebSockets protocol also allows the server to push real-time information to multiple subscribed clients at the same time: for example, you could have a web browser and a mobile application subscribed to a WebSocket and talking to each other directly through the server. Using AJAX, those kinds of scenarios would be harder to achieve and would require many more stateless HTTP calls.
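A minimal Socket.IO sketch of that chat scenario (the event name is made up; error handling omitted):

```js
// Server: push each chat message to every subscribed client as it arrives.
const io = require('socket.io')(3000);

io.on('connection', (socket) => {
  socket.on('chat message', (msg) => {
    // No polling: every connected client (browser, mobile app, ...) is
    // notified the moment the message arrives.
    io.emit('chat message', msg);
  });
});

// Client side (browser), for comparison:
//   const socket = io('http://localhost:3000');
//   socket.on('chat message', (msg) => render(msg));   // render() is hypothetical
//   socket.emit('chat message', 'hello');
```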
I understand that session cookies will be sent between client and server with every HTTP request; is this the same during client<=>server interactions using sockets?
The WebSockets protocol is different from the HTTP protocol. So after the initial handshake occurs (which happens over HTTP), there is no longer any notion of HTTP-specific things such as cookies.
There's one important thing you should be aware of when using WebSockets: they require a persistent connection to be established between the client and the server. This can make things tricky when you need to load balance your servers. Of course, the different implementations of the WebSockets protocol may offer solutions to this problem. For example, Socket.IO has a Redis adapter that allows the servers to keep track of connected clients across a cluster of nodes.
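For instance, with the classic socket.io-redis package the wiring is roughly this (host/port are placeholders):

```js
// Wiring up the Redis adapter so several Node processes share broadcasts.
const io = require('socket.io')(3000);
const redisAdapter = require('socket.io-redis');

io.adapter(redisAdapter({ host: 'localhost', port: 6379 }));

// A broadcast on any node is now relayed through Redis pub/sub to clients
// connected to the other nodes as well.
io.on('connection', (socket) => {
  socket.on('ping-all', () => io.emit('pong'));
});
```

Note that a load balancer in front of this usually also needs sticky sessions, so that a given client's polling fallback keeps hitting the same node.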

node.js built in support for handling requests for same data

In my Node.js server app I'm providing a service to my JS client that does some handling of remote APIs.
It might very well happen that two different clients request the same information. Say client 1 requests information, and then, before client 1's request is fully handled (the remote APIs haven't returned their responses yet), client 2 requests the same data. What I'd want is to wait for client 1's data to be ready and then write it to both client 1 and client 2.
This seems to me like a very common issue and I was wondering if there is any library, or built-in support in Connect or Express, that addresses it.
You might not want to use HTTP for providing the data to the client. Reasons:
If the remote API takes a lot of time to process, you risk the client request timing out, or the browser repeating the request.
You will have to share some state between requests which is not a good practice.
Have a look at WebSockets (socket.io would be a place to start). With them you can push data from the server to the client. In your scenario, clients will perform the request to the server, which will return 202, and when the remote API responds, the server will push the data to the clients using WebSockets.
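A sketch of that flow with Express and socket.io, including the deduplication of identical in-flight lookups the question asks about (fetchRemote(), the 'subscribe'/'data' events and the room-per-key scheme are placeholders, not an existing API):

```js
// Answer with 202 right away, coalesce identical in-flight lookups, and push
// the result over socket.io when the remote API responds.
const express = require('express');
const http = require('http');

const app = express();
const server = http.createServer(app);
const io = require('socket.io')(server);

const inFlight = new Map();   // key -> Promise of the remote API result

function fetchShared(key) {
  if (!inFlight.has(key)) {
    const p = fetchRemote(key)                 // placeholder for the slow remote API call
      .finally(() => inFlight.delete(key));
    inFlight.set(key, p);
  }
  return inFlight.get(key);                    // a second client reuses the pending call
}

app.get('/data/:key', (req, res) => {
  res.sendStatus(202);                         // "accepted, the result will be pushed"
  fetchShared(req.params.key)
    .then((data) => io.to(req.params.key).emit('data', data))
    .catch((err) => io.to(req.params.key).emit('data-error', err.message));
});

// Clients join a room named after the key they asked for, so the push reaches
// exactly the clients that requested that piece of data.
io.on('connection', (socket) => {
  socket.on('subscribe', (key) => socket.join(key));
});

server.listen(3000);
```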

Is it worthwhile to create a new Node app to handle socket.io?

I want to add sockets with Node.js and Socket.io to an existing project.
I already have 2 servers:
A RESTful API web service, to store and manage my data.
A public web service to return HTML and assets (js, css, images, ...).
On the first try, I created my socket server on the public one. But I think it would be better to create another one to handle only socket queries.
What do you think? Is it a good idea, or just useless and likely to add more problems than it solves (maybe duplicated internal libs, ...)?
Also, I'm using a token to communicate between the public server and the API; do I have to create another one for communication between the socket server and the API, or can I use the same one?
------[EDIT]------
As nobody understood me well, I have created a diagram of the infrastructure I was thinking about.
Is it a good way to proceed?
Do the public server and the socket server have to be the same, or can they be separate?
Do I have to create a socket connection between the API and the socket server for each connected client?
Thank you!
Thanks for explaining better.
First of all, while this seems reasonable, this way of using Socket.io is not the most common one. The biggest advantage of using Socket.io is that it keeps a channel open for 2-way communication. The main advantage of this is that the server itself can send messages to the client without the latter having to poll periodically.
Think, for example, of a mail client. Without sockets, the browser would have to poll periodically to check for new mail. With an open socket connection, instead, as soon as a new mail comes the server notifies the client immediately.
In your case, the benefits could be limited, and I'm not sure the additional complexity of a Socket.io server (and cost!) would really be worth the modest speed improvement on REST requests. However, in the end it's up to you.
In answer to your points:
1. See above.
2. If the "public server" is not written in Node.js they can't be the same application. Whether they reside on the same server is up to you and your budget. Ideally they should be separate, for bigger workloads.
3. If you just want the socket server to act as a real-time proxy, then yes, you'll have to create a socket connection for each request. How that will work is:
- The client requests a resource from the Socket.io server.
- The Socket.io server makes a normal HTTP request to the API server (e.g. using request).
- The response is returned to the client over the socket connection.
The workflow in point 3 is the reason why you should expect only a moderate performance improvement. Indeed, you'll get somewhat better latency, but most of the overhead of making an HTTP request is still there!
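For completeness, the proxy in point 3 could look something like this sketch (the API base URL and the 'api:get' event are placeholders; it uses the request package mentioned above):

```js
// The request arrives over the open socket, the socket server makes an
// ordinary HTTP call to the API, and the response goes back over the socket.
const io = require('socket.io')(3000);
const request = require('request');

const API_BASE = 'http://api.example.com';

io.on('connection', (socket) => {
  socket.on('api:get', (path, ack) => {
    request({ url: API_BASE + path, json: true }, (err, response, body) => {
      // the HTTP round trip to the API still happens, hence the modest gain
      ack(err ? { error: err.message } : body);
    });
  });
});
```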

RESTful backend and socket.io to sync

Today I had the idea of the following setup: create a Node.js server with Express and socket.io. With Express I would create a RESTful API, connected to a MongoDB. Backbone.js or similar would connect the client to that REST API.
Now, every time the MongoDB (i.e. the data in it I am interested in) changes, socket.io would fire an event to the client carrying a cursor to the data that has changed. The client would then trigger the appropriate AJAX requests to the REST API to get the new data where it needs it.
So the socket.io connection would behave like a synchronization trigger. It would be there for the entire visit and could also manage sessions that way. All the payload would be sent over HTTP.
Pros:
REST API for use with other clients than web
Auth could be done entirely over socket.io, only sending a token along with REST requests.
Use the benefits of REST.
Would also play nicely with a pub/sub service like Redis.
Cons:
Greater overhead than using pure socket.io.
What do you think, are there any major disadvantages I did not think of?
I agree with @CharlieKey, you should send the updated data rather than re-requesting it.
This is exactly what Tower is doing:
save some data: https://github.com/viatropos/tower/blob/development/src/tower/model/persistence.coffee#L77
insert into mongodb (cursor is a query/persistence abstraction): https://github.com/viatropos/tower/blob/development/src/tower/model/cursor/persistence.coffee#L29
notify sockets: https://github.com/viatropos/tower/blob/development/src/tower/model/cursor/persistence.coffee#L68
emit updated records to client: https://github.com/viatropos/tower/blob/development/src/tower/server/net/connection.coffee#L62
The disadvantage of using sockets as a trigger to re-request with Ajax is that every connected client will have to fetch the data, so if 100 people are on your site there will be 100 HTTP requests every time the data changes, whereas you could just reuse the socket connections.
I think that pushing the updated data with the socket.io event would be better than re-requesting the latest data. Even better, you could push only the modified pieces of data, decreasing the amount of data sent over the line. Overall, though, an interesting idea.
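To illustrate that idea of pushing only the modified piece (a sketch, not Tower's actual code; the collection name, event names and id handling are made up):

```js
// Push just the changed record so clients patch their local state instead of
// re-requesting everything over HTTP.
const { MongoClient, ObjectId } = require('mongodb');
const io = require('socket.io')(3000);

async function updateItem(db, id, changes) {
  const items = db.collection('items');
  await items.updateOne({ _id: new ObjectId(id) }, { $set: changes });
  const updated = await items.findOne({ _id: new ObjectId(id) });
  io.emit('item:updated', updated);   // only the modified document goes over the wire
}

MongoClient.connect('mongodb://localhost:27017').then((client) => {
  const db = client.db('app');
  io.on('connection', (socket) => {
    socket.on('item:update', ({ id, changes }) => updateItem(db, id, changes));
  });
});
```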
I'd look into Now.js since it does pretty much exactly what you need.
It creates a namespace which is shared among the client and server. The server can call functions on the client directly and vice versa.
That is, if you insist on your current infrastructure decision to use MongoDB and Node.js; otherwise there is CouchDB, which is a full web server and document database with sophisticated replication mechanisms built in.
