Node takes a very long time to respond to JSON requests - node.js

I've implemented a chat application using node.js. The program opens a connection with the client and responds with the new message when the EventEmitter emits a "recv" event.
The problem is that it takes a very long time to respond to other requests when the server is holding about 3 or 4 open streams. The Chrome developer tools show the status of the request as pending; it takes 5-30 seconds or more to reach the server (localhost). I use console.log to log when a new request is received by node.js.
I have no idea why there's such a long pause. Is there any limit in the Chrome browser, node.js or anything else I should know about? Does node delay when it holds too many requests at the same time, and how would I measure that? Thank you.

Chrome supports six simultaneous connections per domain, so if those are already in use, it will have to wait for one to close. If you want to know what's going on, use a packet capture program to check the actual network traffic.

Browsers are limited to a certain number of parallel connections, and the limit applies to the whole browser context - for example, if you have, let's say, more than 6 tabs open, the extra connections will be queued and you will see them as pending.
You can avoid this limitation, for example, by using a unique polling subdomain for each client connection. This is how Facebook works around it. The problem is Firefox, where this workaround doesn't work: your connections will be queued once they reach the limit even when you use unique subdomains.
Another solution might be to use HTML5 local storage, where you can take advantage of the StorageEvent, which propagates changes to other tabs within the same browser. This is how the Stack Overflow chat is done. The advantage of this approach is that you need only one polling connection to the server; the disadvantage is the lack of HTML5 local storage support in older browsers, and the different implementation in Firefox versions < 4.
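As a rough illustration of that single-connection-plus-local-storage idea (the "chat-message" key name and the onMessage() handler are made up for this sketch, not part of the answer): the one tab that owns the server connection writes each incoming message to local storage, and every other tab picks it up from the storage event.

// In the tab that holds the one polling/streaming connection to the server:
function broadcastToOtherTabs(message) {
  onMessage(message); // the storage event does not fire in the tab that wrote the value
  localStorage.setItem('chat-message', JSON.stringify({
    payload: message,
    ts: Date.now() // changing value so the event fires even for identical messages
  }));
}

// In every tab:
window.addEventListener('storage', function (event) {
  if (event.key === 'chat-message' && event.newValue) {
    onMessage(JSON.parse(event.newValue).payload);
  }
});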

Related

One API call vs multiple

I have a process in the back-end which will take on average 30 to 90 seconds to complete.
Is it better to have a front-end React app make ONE API call and wait for the back-end to finish processing and return the data? Or is it better to have the front-end make multiple calls, let's say every 2 seconds, to check whether the process is complete and get back the result?
Both are valid approaches. You could also report status changes over a WebSocket so there's no need for polling.
If you do want to go the polling route, the general recommendation is to:
Return 202 Accepted from your long-running process endpoint.
Also return a Link header with a URL where the status of the process can be read.
The client can then follow that link and ping it every x seconds.
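A minimal sketch of those three steps in Node.js (Express, the /jobs routes, the in-memory job store and the startLongRunningWork() helper are assumptions made for the example, not part of the answer):

const express = require('express');
const app = express();

let nextId = 1;
const jobs = {}; // in-memory job store; a real app would persist this somewhere

app.post('/jobs', (req, res) => {
  const id = String(nextId++);
  jobs[id] = { status: 'processing', result: null };
  startLongRunningWork(id, jobs); // hypothetical helper doing the 30-90 second task

  res.status(202)                                        // 202 Accepted
     .set('Link', `</jobs/${id}/status>; rel="status"`)  // where to poll
     .json({ id });
});

app.get('/jobs/:id/status', (req, res) => {
  const job = jobs[req.params.id];
  if (!job) return res.sendStatus(404);
  res.json(job); // the client polls this every x seconds until status is "done"
});

app.listen(3000);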
I think it's not good to make a single API call and wait 30-90 seconds for a response. Instead, send a response immediately saying that the request was accepted and will be processed.
Now you can use WebSockets, or a library like socket.io, so that the server can communicate directly with the client once the requested processing is complete.
Making multiple API calls to check whether the server is done, or has any new message, is called polling. It is not very efficient, but it is still required in old browsers which don't support WebSockets. Socket.io falls back to polling automatically in old browsers.
But yes, if you want, you can make multiple calls to check whether the server is done processing. I would still prefer the server communicating back to the client; it is better.
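A rough sketch of that accept-then-notify flow with socket.io (the event names and the doHeavyProcessing() helper are invented for illustration; treat it as one way to wire it up, not the only one):

// Server: acknowledge the job right away, push the result when it's ready.
const http = require('http');
const { Server } = require('socket.io');

const httpServer = http.createServer();
const io = new Server(httpServer);

io.on('connection', (socket) => {
  socket.on('start-job', async (payload) => {
    socket.emit('job-accepted');                      // immediate acknowledgement
    const result = await doHeavyProcessing(payload);  // the 30-90 second work (assumed helper)
    socket.emit('job-done', result);                  // push the result, no polling needed
  });
});

httpServer.listen(3000);

// Client (browser):
//   const socket = io();
//   socket.emit('start-job', data);
//   socket.on('job-done', (result) => render(result));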

How do they do real time website notifications?

So on a site like, say, Stack Overflow, parts of the page update when things happen, like your reputation increasing. How do they do that lol? Does a script check from time to time or is it a push notification somehow?
About 2 years ago Stack Exchange started using web sockets, as stated here:
https://meta.stackexchange.com/questions/125677/new-feature-real-time-updates-to-questions-answers-and-inbox
If you take a look at the stackoverflow site source you will see that a JavaScript function subscribes to a web socket server.
There are many different approaches to that technology now. Microsoft, for example, introduced SignalR (http://signalr.net/), which degrades gracefully on older browsers too by falling back to other techniques, like long polling (asking every X seconds whether changes are available), where sockets don't work.
You as a Python guy would probably start looking at something like: https://pypi.python.org/pypi/websockets/1.0
Have fun with web sockets!
If I didn't want to use web sockets, I would do it like this:
Have the server maintain a queue of notifications for a session or user or whatever context you want.
Have a URL for fetching such notifications.
When a client tries to GET that URL, and there are notifications available in the queue, return them immediately.
Otherwise, have the HTTP connection block until there are notifications queued.
On the client side, then, simply try to GET the notification URL over and over again. Normally, the connection will sit blocked waiting for data to read, but I don't see why that should be a problem.
I would think this should be easier to implement on the server side than web sockets are, since the HTTP server doesn't have to support any special HTTP extensions. On the other hand, depending on the HTTP server you're using, each such open connection may be using a thread or other system resource that you want to use sparingly.
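A rough Node.js sketch of the queue-and-hold approach described above (the per-user keying via a header and the 30 second cap are assumptions made for the example):

const http = require('http');

const queues = {};  // userId -> pending notifications
const waiters = {}; // userId -> response objects currently held open

// Called by whatever part of the app produces notifications.
function notify(userId, notification) {
  const waiting = waiters[userId];
  if (waiting && waiting.length) {
    waiting.shift().end(JSON.stringify([notification])); // answer a blocked GET right away
  } else {
    (queues[userId] = queues[userId] || []).push(notification);
  }
}

http.createServer((req, res) => {
  const userId = req.headers['x-user-id']; // stand-in for a real session lookup
  const queued = queues[userId];
  if (queued && queued.length) {
    res.end(JSON.stringify(queued.splice(0)));           // notifications waiting: return now
  } else {
    (waiters[userId] = waiters[userId] || []).push(res); // nothing yet: hold the connection
    setTimeout(() => {                                   // give up after 30s so the client
      const i = waiters[userId].indexOf(res);            // simply re-polls
      if (i !== -1) { waiters[userId].splice(i, 1); res.end('[]'); }
    }, 30000);
  }
}).listen(3000);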

Node.js App with API endpoints which take 20sec+ :: Connections Left Open :: How to Optimize?

I have a Node.js RESTful API returning JSON data. One of the API calls can (and frequently does) take 10 - 20 seconds to finish. This long RTT is due to connecting to external APIs, like DiffBot, MailChimp, Facebook, Twitter, etc. I wish I could make the API call shorter, but I cannot.
Of course, I've implemented the node code in a nice async way, but the problem is that the client's inbound connection (to the node app) is alive while it waits for the server to finish, and thus might be killing my performance. In fact, I'm currently guessing that this may explain my long-running timeout issue in node.
I've already increased maxSockets to a huge number...
require('http').globalAgent.maxSockets = 9999;
For the sake of interest, I'm printing out the active sockets each time a new connection is made (here's the code).
Which gives me output like this:
SOCKETS: {} { 'graph.facebook.com:443': 5, 'api.instagram.com:443': 1 }
Nothing too enlightening there. The max connections I ever see is around 20 or so, total, across all hosts. But this doesn't really tell me anything about incoming connections, or how to optimize them so that my server does not choke when there are many of them alive at once (which I suspect it is).
You should optimize your architecture, not just the code.
First, I would change the way the client and server interact with each other. The server should end the request upon receipt and notify the client once all the tasks for that request are truly complete.
There are different ways to achieve that. For example, the client can query the status of the request using AJAX (polling) every X seconds. Another example would be to use WebSocket.
If you're going with this approach, look into Socket.IO. It supports many transports with the same API: if WebSocket is available it uses that, otherwise it falls back to other transports such as Flash Socket, long-polling, etc.
Second, you shouldn't use one process to do all this work. You should use a queue (preferably a messaging system that supports queues), then, run workers (separate processes) to do the "heavy lifting".
Personally, I use AMQP due to its features and portability (it's an open standard), but feel free to use any other queue system with a persistent backend.
That way, if one or more process(es) crash(es) and you use the right queue, you wouldn't lose any data (such as the API tasks you mentioned).
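A rough sketch of the queue/worker split using the amqplib package against a local RabbitMQ (the queue name, the message shape and the callExternalApis() helper are assumptions for illustration):

const amqp = require('amqplib');

// Web process: accept the request, queue the work, respond to the client right away.
async function enqueueApiTask(task) {
  const conn = await amqp.connect('amqp://localhost');
  const ch = await conn.createChannel();
  await ch.assertQueue('api-tasks', { durable: true });
  ch.sendToQueue('api-tasks', Buffer.from(JSON.stringify(task)), { persistent: true });
  await ch.close();
  await conn.close();
}

// Worker process (run separately): does the slow DiffBot/MailChimp/Facebook/Twitter calls.
async function runWorker() {
  const conn = await amqp.connect('amqp://localhost');
  const ch = await conn.createChannel();
  await ch.assertQueue('api-tasks', { durable: true });
  ch.consume('api-tasks', async (msg) => {
    const task = JSON.parse(msg.content.toString());
    await callExternalApis(task); // the 10-20 second external work (assumed helper)
    ch.ack(msg);                  // ack only after success, so a crash requeues the task
  });
}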
Hope it helps.

How does gmail browser client detect internet/server disconnect (speed and scalability)

We have a browser application (SaaS) where we would like to notify the user in case of internet connection or server connection loss. Gmail does this very nicely: the moment I unplug the internet cable or disable network traffic, it immediately says it's unable to reach the server and gives me a countdown for retry.
What is the best way to implement something like this? Would I want the client browser to issue AJAX requests to the application server every second, or to have a separate server that just reports back "alive"? Scalability will become an issue down the road.
Because GMail already checks for new e-mails every few seconds, and for chat information even more frequently, it can tell without a separate request whether the connection is down. If you're not using Ajax for some other sort of constant update, then yes, you would just have your server reply with some sort of "alive" signal. Note that you couldn't use a separate server because of Ajax cross-domain restrictions, however.
With the server reporting to the client (push via Comet), you have to maintain an open connection for each client. This can be pretty expensive if you have a large number of clients. Scalability can be an issue, as you mentioned. The other option is to poll. Instead of doing it every second, you can have it poll every 5-10 seconds or so.
Something else that you can look at is Web Sockets (developed as part of HTML 5), but I am not sure if it is widely supported (AFAIK only Chrome supports it).
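A minimal client-side sketch of the polling option (the /alive URL, the 10 second interval, the timeout value and the showOnline()/showOffline() UI hooks are all assumptions):

// Browser side: poll a tiny "alive" endpoint on the same origin and flip the UI
// (and start the retry countdown) when it stops answering.
function checkConnection() {
  const xhr = new XMLHttpRequest();
  xhr.open('GET', '/alive?ts=' + Date.now()); // timestamp busts any caching
  xhr.timeout = 5000;
  xhr.onload = () => showOnline();
  xhr.onerror = xhr.ontimeout = () => showOffline();
  xhr.send();
}
setInterval(checkConnection, 10000); // every 10s; tune the interval against server load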

Notifying a browser about events on server

I have a Java-based web application (Struts 1.2). I have a requirement to display a status on the front end (JSP). Now the status might change, which my server gets notified about by another server. But I want the browser to be notified of this status change.
I don't want to refresh at intervals. Rather, I have to implement something like Gmail chat does, i.e. the browser gets notified by events happening on the server.
Any ideas on how to go about this?
I was thinking along the lines of opening a request to the server for the status; at the server end I would hold the request and not respond until there is a status change. Any pointers or examples on this?
The best possible solution would be to make use of the XMPP protocol. It's standardized, and a lot of open source solutions will get you started within minutes. You can use a combination of Smack, StropheJS and Openfire to get your Java-based app working as desired.
There's a method called Long Polling (Comet). It basically sends a request to the server. The request thread created on the server simply waits for new data for the user, with a time limit of maybe 1 minute or more. When new data is available it is returned.
The main problem to tackle is on the server side: you don't want to have one thread for every user just waiting for new data. Of course, you could use asynchronous methods, depending on your back-end.
Ref: http://en.wikipedia.org/wiki/Push_technology
An alternative would be to use WebSockets. The problem is that they're not supported by all browsers today.
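On the browser side, the hold-the-request idea from above boils down to a loop like this, regardless of what the back-end is written in (the /status/longpoll URL and the updateStatusOnPage() helper are assumptions):

// Re-open the request as soon as the previous one returns (or errors out).
function pollStatus() {
  fetch('/status/longpoll')                     // the server holds this until the status changes
    .then((res) => res.json())
    .then((status) => {
      updateStatusOnPage(status);               // assumed DOM-updating helper
      pollStatus();                             // immediately open the next request
    })
    .catch(() => setTimeout(pollStatus, 5000)); // back off briefly on errors
}
pollStatus();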

Resources