Result of socket.io request in IE 9 is Pending - node.js

I am running node.js/socket.io server.
I am testing with IE 9.
When I open the dev tools in IE 9, many of the socket.io requests show their result as "Pending...", even though I can see they have returned the data I expect.
I am not sure if this is the default behavior or a problem that might slow my browser down.
Any help is appreciated.

If you're doing long polling as your socket transport (xhr, jsonp, etc.), having pending HTTP requests is expected. The socket client opens an HTTP GET to the server, and the server keeps it open until there is data for the socket or the HTTP interval expires. When either of those happens, the client re-opens a GET and starts listening again. So in practice there should always be one pending HTTP request. That said, I see that there are multiple in your screenshot. I'm curious whether it's a tools issue. Have you tried the same test in Chrome or Firefox? Do you get the same result? Is socket communication working as expected apart from what's being displayed there?
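To make the pattern concrete, here is a minimal sketch of the long-polling loop described above. The /poll endpoint and handleMessage function are made up for illustration; they are not Socket.IO's actual internal protocol.

    // Minimal long-polling loop: keep one GET pending at all times.
    function handleMessage(text) {
      console.log('received:', text); // placeholder handler
    }

    function poll() {
      var xhr = new XMLHttpRequest();
      xhr.open('GET', '/poll'); // hypothetical long-poll endpoint
      xhr.onload = function () {
        if (xhr.status === 200 && xhr.responseText) {
          handleMessage(xhr.responseText); // data arrived before the interval expired
        }
        poll(); // immediately re-open the GET, so one request is always pending
      };
      xhr.onerror = function () {
        setTimeout(poll, 1000); // back off briefly on network errors
      };
      xhr.send();
    }

    poll();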

Related

Way to Non-Keep-Alive Connections with Node?

I'm trying to build a demo distributed app that shows how to load balance between many different containers. The app works fine, but I'm trying to force refreshing between different containers, and Chrome/Firefox are keeping the connection open. I thought that forcing the connection to close with .close() would work:
https://nodejs.org/api/http.html#http_request_end_data_encoding_callback
But this causes the server to close. Is there a way to tell Chrome/Firefox not to hold open the connection to a server in Node without closing the server?
Excellent question. As far as I know, there is no direct way to disable HTTP keep-alive with the built-in HTTP server. You might be able to work around it: the request object has a connection property that holds the underlying socket, so you could save a reference to that socket and close it after you have called end() on the response. That really isn't a great idea, however, because the client may be pipelining, sending multiple requests before the response to a previous one has finished.
The best thing to do is just add this to your response headers:
Connection: close
This will signal the client not to keep the connection open.
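A minimal sketch of setting that header in a plain Node HTTP server; only the standard http module is used, and the response body is just an example:

    // Sketch: tell each client not to reuse the connection, so the next
    // request can be routed to a different container by the load balancer.
    var http = require('http');

    var server = http.createServer(function (req, res) {
      res.setHeader('Connection', 'close');       // ask the client to close after this response
      res.setHeader('Content-Type', 'text/plain');
      res.end('hello from this instance\n');      // example body; identify the container however you like
    });

    server.listen(3000);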

socket.io disconnects clients when idle

I have a production app that uses socket.io (node.js back-end) to distribute messages to all logged-in clients. Many of my users are experiencing disconnections from the socket.io server. The normal use case is for a client to keep the web app open for the entire working day. Most of that time the app sits idle but still open, until the socket.io connection is lost and the app kicks them out.
Is there any way I can make the connection more reliable so my users are not constantly losing their connection to the socket.io server?
It appears that all we can do here is give you some debugging advice so that you might learn more about what is causing the problem. So, here's a list of things to look into.
Make sure that socket.io is configured for automatic reconnect. In recent versions of socket.io, auto-reconnect defaults to on, but you may need to verify that no piece of code is turning it off (a configuration sketch follows this list).
Make sure the client is not going to sleep in a way that causes all network connections to become inactive or get disconnected.
In a working client (before it has disconnected), use the Chrome debugger, Network tab, webSockets sub-tab to verify that you can see regular ping messages going between client and server. You will have to open the debug window, get to the Network tab and then refresh your web page with that debug window open in order to start seeing the network activity. You should see a funky-looking URL that has ?EIO=3&transport=websocket&sid=xxxxxxxxxxxx in it. Click on that, then click on the "Frames" sub-tab. At that point, you can watch individual webSocket packets being sent. You should see tiny packets with length 1 every once in a while (these are the ping and pong keep-alive packets). There's a sample screenshot below that shows what you're looking for. If you aren't seeing these keep-alive packets, then you need to resolve why they aren't there (likely a socket.io configuration or version issue).
Since you mentioned that you can reproduce the situation, one thing you want to find out is how the socket is getting closed (client-end initiated or server-end initiated). One way to gather information on this is to install a network analyzer on your client so you can literally watch every packet that goes over the network to/from your client. There are many different analyzers and many are free. I personally have used Fiddler, but I regularly hear people talk about Wireshark. What you want to see is exactly what happens on the network when the client loses its connection. Does the client decide to send a close-socket packet? Does the client receive a close-socket packet from someone? What happens on the network at the time the connection is lost?
[Screenshot: webSocket network view in Chrome Debugger]
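As mentioned in the first point of the list, here is a minimal sketch of the client-side auto-reconnect options. The option names are from the socket.io 1.x client and the values shown are the documented defaults; event names may differ slightly between versions.

    // Sketch: verify nothing has disabled auto-reconnect on the client.
    var socket = io('https://your-app.example.com', {
      reconnection: true,            // default; make sure no code sets this to false
      reconnectionAttempts: Infinity,
      reconnectionDelay: 1000,       // ms before the first retry
      reconnectionDelayMax: 5000     // cap on the back-off between retries
    });

    socket.on('disconnect', function () {
      console.log('socket disconnected');
    });
    socket.on('reconnect', function (attempt) {
      console.log('reconnected after ' + attempt + ' attempt(s)');
    });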
The most likely cause is one end closing the WebSocket due to inactivity. This is commonly done by load balancers, but there may be other culprits. The fix is simply to send a message to every client every so often (I use 30 seconds, but depending on the issue you may be able to go higher). This prevents the connection from appearing inactive and thus getting closed.
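A sketch of that server-side heartbeat; the 'keepalive' event name is arbitrary and clients can simply ignore it:

    var http = require('http').createServer();
    var io = require('socket.io')(http);

    io.on('connection', function (socket) {
      console.log('client connected:', socket.id);
    });

    // Emit a tiny message to all connected clients every 30 seconds so
    // intermediaries (load balancers, proxies) never see the socket as idle.
    setInterval(function () {
      io.sockets.emit('keepalive', Date.now());
    }, 30 * 1000);

    http.listen(3000);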

4 or 5 Polling requests before WebSocket protocol is activated

I'm working with socket.io (the 1.0 version) and something weird happens. The server is very basic, without any message handling (only the connection and disconnection events are used). Yet the client seems to send multiple polling requests before trying to use WebSockets. For example, here is a screenshot of the requests.
As you can see, it's really messy. There are several requests to my node.js server: first some polling requests, then the WebSocket (switching protocol, indicated by the blue dot on the left), and then more polling requests. I know it uses WebSockets after that, because there are no other polling requests once the WebSocket is set up. It makes my server send some messages twice on page load.
Has anyone experienced something like this? Maybe it will just work fine, but I don't want this kind of behaviour. If you need additional information, just ask in the comments and I'll edit the main post.
Take a look at the last paragraph of the "New engine" section. Socket.IO 1.0 first connects via XHR or JSONP and then, if possible, switches the transport to WebSocket on the fly. That explains why you see such messy network activity.
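If the initial polling requests are a real problem, you can restrict the client to the WebSocket transport using the documented transports option. A sketch is below; the caveat is that this removes the polling fallback for clients and proxies that don't support WebSocket.

    // Sketch: connect with WebSocket directly and skip the XHR polling phase.
    var socket = io('http://localhost:3000', {
      transports: ['websocket']   // default is ['polling', 'websocket']
    });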

How server push approach to browser is supported and role of websockets in that?

I had a use case where I was planning to poll from the browser to the server to check for any updates for a given customer. Then I thought of exploring a push approach, where the web server (in my case Tomcat) can push automatically whenever the servlet running on it gets an update from a third party. The first question that came to my mind was how the Java class will know which browser client it has to send the update to. Then I came across the link http://www.gianlucaguarini.com/blog/nodejs-and-a-simple-push-notification-server/.
This is an amazing link that demonstrates how the push approach can be supported, but it left me with some basic questions before going ahead with this approach:
1) Does the browser internally use WebSockets to communicate with the web server, or does it just use TCP for that? As per my understanding, the browser uses only the TCP protocol, though WebSockets are supported by some browsers like Chrome and Mozilla.
2) Is the WebSocket (provided by io.connect('url') in the example) supported by all browsers, especially IE7 and IE8?
3) To support the push approach in the browser, are WebSockets the only way to go? As per my understanding, WebSockets are mainly used to push data from the web server to the browser (only to those browsers that support them). For this, the browser first needs to make a WebSocket connection to the web server; the server then uses that WebSocket to emit any data to the browser. Right?
4) Is there a possibility of the WebSocket getting disconnected automatically, for example when a request times out or a response is awaited for a long time?
5) Do we need to disconnect the socket explicitly, or will it be closed automatically when the browser is closed?
It would be really helpful if the reply addressed each point.
1) The WebSocket protocol runs over TCP. The connection starts as an HTTP request and is then upgraded to the WebSocket protocol.
2) Internet Explorer is supposed to support WebSockets in version 10. The other major browsers (Chrome, Firefox, Safari, Opera) fully support them.
3) There are many other possibilities: simple polling, long polling (where you make one Ajax request and the server responds only when it has new data), a hidden infinite iframe, use of Flash, etc.
4) Yes.
5) Once the application using the port (in this case the browser) is killed, all of its connections are terminated as well.
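On point 3, and on the question of how the server knows which browser to push to: the usual pattern is to have each client identify itself after connecting and to keep a map from that identity to its socket. Here is a sketch with socket.io; the 'register' and 'update' event names and the customerId field are made up for the example.

    var http = require('http').createServer();
    var io = require('socket.io')(http);

    // Map a customer id to the socket of that customer's open page.
    var socketsByCustomer = {};

    io.on('connection', function (socket) {
      // Hypothetical: the client emits 'register' with its customer id after connecting.
      socket.on('register', function (customerId) {
        socketsByCustomer[customerId] = socket;
      });
      socket.on('disconnect', function () {
        Object.keys(socketsByCustomer).forEach(function (id) {
          if (socketsByCustomer[id] === socket) delete socketsByCustomer[id];
        });
      });
    });

    // Called when the third party notifies the server of an update.
    function pushUpdate(customerId, update) {
      var socket = socketsByCustomer[customerId];
      if (socket) socket.emit('update', update); // push only to that customer's browser
    }

    http.listen(3000);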

Node takes a very long time to respond to the JSON request

I've implemented a chat application using node.js. The program opens a connection with the client and responds with the new message when the EventEmitter emits a "recv" event.
The problem is that it takes a very long time to respond to other requests once the server is holding about 3 or 4 open streams. The Chrome developer tools show the status of the request as pending, and it takes anywhere from 5 to 30 seconds or more to reach the server (localhost). I use console.log to log when a new request is received by node.js.
I have no idea why there's such a long pause. Is there any limit in the Chrome browser, node.js, or anything else I should know about? Does node delay when it holds too many requests at the same time, and how should I measure this? Thank you.
Chrome supports six simultaneous connections per domain, so if those are already in use, it will have to wait for one to close. If you want to know what's going on, use a packet capture program to check the actual network traffic.
Browsers are limited to a certain number of parallel connections, and the limit applies to the whole browser context: if you have, say, more than 6 tabs each holding a connection open, the extra connections will be queued and you will see them as pending.
You can avoid this limitation, for example, by using a unique polling subdomain for each client connection. This is how Facebook works around the limit. The problem is Firefox, where this workaround doesn't help: your connections will still be queued once they reach the limit, even when you use unique subdomains.
Another solution is to use HTML5 local storage, where you can take advantage of the StorageEvent that propagates changes to the other tabs within the same browser. This is how the Stack Overflow chat is done. The advantage of this approach is that you need only one polling connection to the server; the disadvantage is the lack of HTML5 local storage support in older browsers and the different implementation in Firefox versions < 4.
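A browser-side sketch of that local-storage approach: one tab owns the real connection and re-broadcasts incoming messages through localStorage, which fires a storage event in every other tab of the same origin. The socket variable, the 'chat message' event name and renderMessage are assumptions made up for the example.

    // In the tab that owns the real socket connection:
    socket.on('chat message', function (msg) {
      // Writing to localStorage fires a 'storage' event in every OTHER tab
      // of the same origin (not in the tab doing the write). The timestamp
      // guarantees the stored value changes even for identical messages.
      localStorage.setItem('chat-message', JSON.stringify({ msg: msg, t: Date.now() }));
    });

    // In every tab (including the ones without their own connection):
    window.addEventListener('storage', function (event) {
      if (event.key === 'chat-message' && event.newValue) {
        var payload = JSON.parse(event.newValue);
        renderMessage(payload.msg);   // hypothetical UI function
      }
    });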
