Google Cloud App Engine: 1 hour latency without noticeable response times - node.js

I've got a GAE Node.js flex socket.io + express web and WebSocket server...
Everything works just fine and response times are really good, but when I go to the metrics tab I can see this for latency:
An hour of latency??? I'm guessing it's something related to socket.io long polling? Or WebSockets themselves?
Would anybody care to explain?

This is related to how long your sockets are kept alive, which seems to be 60 minutes. I assume you are using sessions in your app. Since you are using socket.io, it falls back on HTTP long polling, just as you mentioned. To get better behaviour there is a session affinity setting in app.yaml; you can take a look into it.
If your app is working just fine, just keep monitoring your memory and CPU usage, and always try to manage your resources well.
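The session affinity setting mentioned above lives in the App Engine flexible environment's app.yaml. A minimal sketch (the runtime value here is an assumption; adjust it to your deployment):

```yaml
# app.yaml — App Engine flexible environment
runtime: nodejs
env: flex

network:
  # Route requests from the same client to the same instance, so
  # socket.io's long-polling fallback keeps hitting the instance
  # that holds the session state.
  session_affinity: true
```
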

Related

GCP Cloud Run Socket.IO Timeout

I'm currently deploying my Socket.IO server with Node.js/Express on Google Cloud Platform using Cloud Build + Run, and it works pretty well.
The issue I'm having is that GCP automatically times out all Socket.IO connections after 1 hour, which is really annoying. My application needs to keep running in the background for hours on end, with multiple people in each socket room interacting with it every 30 minutes to 1 hour.
That's why I have 2 questions:
How can I gracefully handle these timeouts? I have a reconnection process set up on my client, checking if the socket is connected every 5 seconds, but for some reason it can't detect when these timeouts happen and I'm not sure why.
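One likely reason a `socket.connected` check misses these timeouts is that the client flag can stay `true` after a proxy silently drops the underlying connection. A minimal sketch of a heartbeat-based liveness check instead (the timeout value and the wiring shown in comments are assumptions, not documented GCP or socket.io behaviour):

```javascript
// Assumed threshold: how long we tolerate silence before treating the
// connection as dead. Not a documented constant.
const HEARTBEAT_TIMEOUT_MS = 30000;

// Pure helper: decide whether the connection should be considered stale,
// given the timestamp of the last heartbeat we saw.
function isStale(lastHeartbeat, now, timeoutMs = HEARTBEAT_TIMEOUT_MS) {
  return now - lastHeartbeat > timeoutMs;
}

// Hypothetical wiring into a socket.io client:
//
//   let lastHeartbeat = Date.now();
//   socket.io.on("ping", () => { lastHeartbeat = Date.now(); });
//   setInterval(() => {
//     if (isStale(lastHeartbeat, Date.now())) {
//       socket.disconnect();
//       socket.connect(); // force a fresh connection
//     }
//   }, 5000);
```

The point of the pure `isStale` helper is that the reconnect decision no longer trusts the library's connection flag, only observed traffic.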
Is there a better platform I can deploy my Socket.IO server on? I don't like the timeouts that GCP sets - would a platform like Digital Ocean or Azure be better?
Cloud Run has a maximum timeout of 3600 s to handle requests, whatever the protocol (HTTP, HTTP/2, streaming or not). If you need to keep the connection open longer, Cloud Run isn't the correct platform for this.
I would recommend having a look at App Engine Flex or GKE Autopilot. On both you have longer timeouts and the capacity to run jobs in the background, and both accept containers.

Which one is best for a chat app? WebSocket, or sending a request every 3 seconds?

I am making a chat app in React Native. I am using socket.io for this, but socket.io sometimes doesn't work reliably, so I would like to change to sending a request to the server every 3 seconds instead.
I just send a request for one chat id.
Which one is best? If I send a request every 3 seconds, will it cause any problems on the server side?
Maybe long polling is an option (it's not the same as plain polling, it's different behaviour: with long polling an API call can stay pending until a response is available), but WebSockets are far preferable.
Responses are faster, it costs fewer resources server-side, uses less bandwidth, you can subscribe to multiple streams, and so on.
Here you can evaluate some metrics:
Ref: https://blog.feathersjs.com/http-vs-websockets-a-performance-comparison-da2533f13a77
socket.io scales better, and has better performance, than any polling HTTP request mechanism. When working well, it will also have response times well under 3 seconds - that interval may not seem long, but it may actually be noticeable to users.
If your chat app is for a low number of users, then a polling mechanism is easier to implement and should work just fine.
If you intend to scale your application to a large number of users, you will need socket.io or a similar subscribe/push mechanism to connected clients.
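For the low-user case the answer mentions, a minimal sketch of the 3-second polling approach, written with the transport injected so it can be tested without a server (`fetchMessages` is a hypothetical function that would wrap `fetch()` against your chat API; the endpoint in the comment is an assumption):

```javascript
// Poll the server every `intervalMs` for new messages. Returns a
// function that stops the loop.
function startPolling(fetchMessages, onMessages, intervalMs = 3000) {
  let stopped = false;
  async function tick() {
    if (stopped) return;
    try {
      // e.g. GET /chats/:id/messages?since=<lastSeen> (hypothetical route)
      const messages = await fetchMessages();
      if (messages.length > 0) onMessages(messages);
    } catch (err) {
      // Swallow transient errors; the next tick retries.
    }
    if (!stopped) setTimeout(tick, intervalMs);
  }
  tick();
  return () => { stopped = true; };
}
```

Note the cost this design carries: every connected client issues a request every 3 seconds even when nothing has changed, so server load grows linearly with users - the scaling problem the answer above warns about.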

Socket.io huge server response time when using xhr-polling

I am trying to scale a messaging app. I'm using Node.js with Socket.io and Redis-Store on the backend. The client can be an iPhone native browser, Android browsers, etc.
I am using SSL for the node connection and Nginx to load balance the socket connections. I am not clustering my socket.io app; instead I am load balancing over 10 node servers (we have a massive number of users). Everything looks fine when the transport is WebSockets, but when it falls back to xhr-polling (in the case of old Android phones) I see a HUGE response time of up to 3000 rpm in New Relic, and I have to restart my node servers every hour or so, otherwise the server crashes.
I was wondering if I am doing anything wrong , and if there are any measures I can take to scale socket.io when using xhr-polling transport ? like increasing or decreasing the poll duration ?
You are not doing anything wrong; xhr-polling is also called long polling. The name comes from the fact that the connection is kept open longer, usually until some answer can be sent down the wire. After the connection closes, a new connection is opened, waiting for the next piece of information.
You can read more on this here: http://en.wikipedia.org/wiki/Push_technology#Long_polling
New Relic shows you the response time of the polling request. Socket.IO has a default "polling duration" of 20 seconds.
You will get a higher RPM for a smaller polling duration and a lower RPM for a higher polling duration. I would consider increasing the polling duration, or just keeping the 20-second default.
Also, to keep New Relic from displaying irrelevant data for the long polling, you can add ignore rules in the newrelic.js that you require in your app. This is also detailed in the newrelic npm module documentation here: https://www.npmjs.org/package/newrelic#rules-for-naming-and-ignoring-requests

Is socket.io and node.js performance affected on Heroku's servers (with no WebSockets)?

Since the Heroku server doesn't support WebSockets, does it mean that if we run a node.js + socket.io app on it, expecting many concurrent users, some loss of effectiveness will happen when there are more users?
I was building a multiuser app and suddenly noticed that Heroku is using long polling instead of WebSockets. I couldn't see much delay in my prototype, but I am worried: should I be building my app on a server that supports real WebSockets?
... should I be building my app on a server that supports real WebSockets?
Probably.
http://websocket.org/quantum.html says: "HTML5 Web Sockets can provide a 500:1 or—depending on the size of the HTTP headers—even a 1000:1 reduction in unnecessary HTTP header traffic and 3:1 reduction in latency."
Long polling is old and inefficient, and is slowly being replaced by WebSockets. They are supported by every server, and most of the latest browsers have already added support too. Heroku will hopefully do so soon. You can continue with your prototype; maybe WebSocket support will be added before you finish it.
The advantages of WebSockets are given here.

Benchmarking comet applications

I'm currently working on my master's thesis, which is about real-time web applications.
Now I'd like to compare Node.js with, for example, long polling.
I know some benchmarking tools such as ab, autobench, etc., but these don't really test the application: once they've made a request to the server, the request is handled and a new request is made. What I need is a benchmarking tool that will 'stay' on the webpage for a longer time, so it simulates real people.
For example: I've made a demo chat in both Node.js and long polling (PHP). Now I want to test this with 100 simultaneous users that stay in the chat for about 30 seconds.
Does anyone have suggestions for how I can reach this goal?
Thanks in advance!
Now I'd like to compare Node.js with, for example, long polling.
Long polling itself is a platform-agnostic web push technique, so you can compare a long polling application made in node.js with a similar application made in PHP, for example.
What I need is a benchmarking tool that will 'stay' on the webpage for a longer time so it'll simulate real people.
You can create another server application that simulates client connections; however, this application shouldn't be hosted on the same machine as your long-poll server application, in order to have "near real" latency between clients and server. Even this approach may not give you exactly the environment you would have with real human clients (since the application simulating client connections would be on the same origin, and also because of the famous quote "there is no test like production"), but it can give you a rough environment in which to test your long-polling server and gather some benchmark data. For example, socket.io has this kind of application for simulating a variety of browser transports.
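The simulator described above can be sketched in a few lines: N virtual clients that each keep re-issuing long-poll requests for a fixed duration, roughly how real browsers behave on a comet page. `longPoll` is injected so you can point it at either the node.js or the PHP endpoint (e.g. a wrapper around `http.get`); all names and numbers here are assumptions for illustration.

```javascript
// One virtual client: long-poll in a loop until the deadline passes,
// counting how many responses the server pushed down.
async function simulateClient(longPoll, durationMs) {
  const deadline = Date.now() + durationMs;
  let responses = 0;
  while (Date.now() < deadline) {
    await longPoll(); // resolves when the server pushes data or times out
    responses++;
  }
  return responses;
}

// Launch all virtual clients concurrently — the part ab can't do for
// long-held connections — and wait for every one to finish.
async function runBenchmark(longPoll, clients, durationMs) {
  const results = await Promise.all(
    Array.from({ length: clients }, () => simulateClient(longPoll, durationMs))
  );
  return results.reduce((a, b) => a + b, 0); // total responses received
}
```

As the answer notes, run this from a separate machine so the measured latency between clients and server stays realistic; e.g. `runBenchmark(poll, 100, 30000)` approximates the "100 users for 30 seconds" scenario from the question.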
