Express/Node.js + Mongoose app server response slow

Issue
I have an Express (Node.js) + MongoDB app with a server response load time of 4-7 seconds on average (slow).
I understand that the average server response time should be under 200ms, as per Google PageSpeed tools.
This app fetches data from MongoDB asynchronously, but the round trips to the database are extremely slow, each call averaging about 500ms-1s. These calls are simple findAll calls that retrieve fewer than 100 records.
Context
Mongoose version: 4.13.14
DB server's MongoDB version is 3.4.16
DB server is hosted on MongoDB Atlas M10 in AWS / Oregon (us-west-1)
Web server is hosted with now.sh in SFO1 (us-west-1)
Have created the recommended indexes as advised by MongoDB Atlas's Performance Advisor
Data fetching is perfectly fine in the local environment (local server + local DB), where data is queried in a matter of a few ms
Mongoose logs for the affected page can be found in this gist
Mongo Server configuration
Mongo Atlas M10
2GB Ram
10 GB Storage
100 IOPS
Encrypted
Auto-expand storage
Attempted solutions:
I have checked my DB metrics and they look fine. There are also no slow queries; these are simple findAll queries. The Performance Advisor on MongoDB Atlas reports nothing unusual.
The production application and database are both hosted in the same region.
I have already tried optimising the application layer of the query (Mongoose) by running .lean(), roughly as in the sketch below.
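For reference, this is roughly what the query layer looks like (a minimal sketch; the Item model and schema are placeholders, not the real ones):

```js
const mongoose = require('mongoose');

// Placeholder model; the real schema and collection names differ.
const Item = mongoose.model('Item', new mongoose.Schema({ name: String }));

// .lean() skips hydrating full Mongoose documents and returns plain
// JavaScript objects, trimming per-query overhead on the app side.
function findAllItems() {
  return Item.find({}).lean().exec();
}
```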
Question:
Where else should I look to improve the database latency? How can a simple query take so long? Otherwise, why is my server response time up to 4s when the expected is about 200ms?

Hey, you can try hosting your server and database in the same region. I think the network is creating overhead in this case. If the server and the database are in the same region, they are on the same network, which will reduce the latency significantly. There is a diagram on AWS for this.

I had a problem like yours with an app that I developed during my master's degree. I had to put a Node.js API online to present it in class, and I realized that every call to the API took a lot of time to respond. One of the problems was the school network, because of the firewalls. The place where I hosted the server, heroku.com, was adding some delay as well. What I did was use Redis (https://redis.io/) to improve the performance; Heroku was also giving me problems because the requests were HTTP and not HTTPS.
Run a test with the app and data on your localhost and check the performance. If you don't have any issue there, check whether anything is interfering with your requests, like the place where you host your Node server.
Let me know if this helps or if you still have issues, so I can try to help you out better.
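For what it's worth, the Redis layer was shaped roughly like this (a minimal sketch; the client defaults and the fetchFromDb callback are placeholders):

```js
const redis = require('redis');
const client = redis.createClient(); // assumes a default local Redis instance

// Cache-aside: check Redis first, fall back to the database on a miss,
// then store the result with a TTL so repeated calls skip the DB.
function getCached(key, fetchFromDb, ttlSeconds, callback) {
  client.get(key, (err, cached) => {
    if (!err && cached) return callback(null, JSON.parse(cached));
    fetchFromDb((dbErr, data) => {
      if (dbErr) return callback(dbErr);
      client.setex(key, ttlSeconds, JSON.stringify(data));
      callback(null, data);
    });
  });
}
```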

I had the same issue once with my Node.js code using the same stack (MongoDB, Node.js). I was getting late responses from the API, and after spending a lot of time I found my server was the real culprit. I changed from Heroku to an Amazon AWS EC2 instance and things started working amazingly fast. So probably your web server is the culprit.
To make sure MongoDB is not the culprit, write an API endpoint that just returns some JSON response without making any query to the database.
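Something like this (a minimal sketch; the route name and port are arbitrary):

```js
const express = require('express');
const app = express();

// Returns static JSON with no database access. If this endpoint is
// also slow, the latency is in the web server/network, not MongoDB.
app.get('/api/ping', (req, res) => {
  res.json({ ok: true, time: Date.now() });
});

app.listen(3000, () => console.log('listening on 3000'));
```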

Related

PostgreSQL IPC: MessageQueueSend delaying queries from Node.js backend

I am testing PostgreSQL with a Node.js backend server, using the pg npm module to query the database. The issue I am having is that when I run a particular query directly against the Postgres table using the query tool in pgAdmin 4, the data is fetched within 5 seconds. But when the same query is requested from the backend through my Node.js server, the work is split between parallel workers and a client backend waiting on IPC: MessageQueueSend, and it runs for almost 17 minutes before returning the data. I can't understand why the same query is fast in the query tool but delayed when it comes from my server. Is there a way to raise the priority of queries coming from the backend so they run as if they were issued inside pgAdmin? I noticed when I check pg_stat_activity that there is an application value for the query when it comes from the query tool, but when the same query comes from the Node.js server the application value is null. I do not understand why this is; I have been searching every community for answers for the past 5 days, and there is no question or answer for this. Any help will be appreciated. Thanks in advance.
I tried running the query from the backend, but it is split across IPC processes and the result comes back after 17 minutes; the same query takes only 5 seconds inside the pgAdmin query tool.
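A sketch of how the backend's queries can at least be tagged in pg_stat_activity and the parallel plan ruled out, using the pg module (the connection string is a placeholder, not from the post):

```js
const { Pool } = require('pg');

// application_name makes the backend's queries identifiable in
// pg_stat_activity (the post observed a null application value there).
const pool = new Pool({
  connectionString: 'postgres://user:pass@localhost:5432/mydb', // placeholder
  application_name: 'nodejs-backend',
});

async function runQuery(sql) {
  const client = await pool.connect();
  try {
    // Disabling parallel workers for this session tests whether the
    // parallel plan (IPC: MessageQueueSend waits) is the slow path.
    await client.query('SET max_parallel_workers_per_gather = 0');
    return (await client.query(sql)).rows;
  } finally {
    client.release();
  }
}
```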

AWS RDS seems to only process 3 requests at a time

I've got a Laravel service that loads a React page that fires off around 30+ axios calls after loading. When I look at the network tab, it looks like only 3 of the calls are being processed at a time.
I'm testing this by connecting to the AWS RDS instance from my local environment. I tried using a db.t3.medium and a db.t3.large with no noticeable change.
The application has multiple database connections. Each request uses all three connections to gather the required data. All of the requests execute the exact same query against one database, and then each request executes a query on a different table in the second database.
Is there a reason why AWS isn't processing all of my requests simultaneously?
You aren't looking at the right performance indicator; you are looking at your browser's network console. Your browser limits the number of requests it can make to the same host simultaneously.
You can find more information here: Max parallel http connections in a browser?
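If you want the concurrency to be explicit rather than left to the browser's per-host cap, you can batch the calls yourself; a sketch with axios (the URLs are placeholders):

```js
const axios = require('axios');

// Fires requests in chunks of `size`, so at most `size` calls are
// in flight at once instead of queueing behind the browser's limit.
async function fetchInBatches(urls, size) {
  const results = [];
  for (let i = 0; i < urls.length; i += size) {
    const chunk = urls.slice(i, i + size);
    results.push(...await Promise.all(chunk.map((u) => axios.get(u))));
  }
  return results;
}

// Usage (placeholder endpoints):
// fetchInBatches(['/api/a', '/api/b', '/api/c', '/api/d'], 3);
```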

PouchDB on PaaS (Heroku, Bluemix, etc.)

I've gotten some great feedback from Stackoverflow and wanted to check on one more idea.
Currently I've got a web app that runs Node.js on a PaaS (Heroku, and I'm trying out Bluemix). The server is configured to talk to CouchDB (hosted on Cloudant). There are two types of data saved to the DB: first, user data (each user will have its own database), and second, the app data itself (metrics, user account info, auth/admin stuff).
After some great feedback from here, the idea is that after the user logs in, they will sync their local (browser) PouchDB instance with Cloudant (probably proxied through my server, as was recommended here).
Now the question is, for the app/admin data, maybe I run a CouchDB instance on my server so I'm not making repeated network calls for things like user logins, metrics data, etc. The data would not be very big, and it is already separated from the user-data calls. The point is to have a faster, local instance mainly for authentication; changes/updates get synced outside of user requests.
The backend is the Express web framework, and it looks like my option is PouchDB... to sync to the Cloudant instance?
If I want local DB access (backed by a CouchDB instance) on a Node/Express server running on a PaaS, is that the recommended setup?
Thanks vm for any feedback,
Paul
Not sure if you found a solution, but this is what I would try.
Because Heroku clears any temp data, you wouldn't be able to run a default express-pouchdb database; you will need to change PouchDB from using the file system to using a LevelDOWN adapter (link to PouchDB adapters: https://pouchdb.com/adapters.html).
Some of these adapters include:
https://github.com/watson/mongodown
https://github.com/kesla/mysqldown
https://github.com/hmalphettes/redisdown
You can easily get a Heroku Mongo, MySQL, or Redis add-on and connect it to your express-pouchdb backend.
This way you will be able to keep your data.
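A rough sketch of that wiring with the Redis adapter (assuming the redisdown and express-pouchdb packages; in practice the Redis connection details come from the add-on's config):

```js
const express = require('express');
const PouchDB = require('pouchdb');
const redisdown = require('redisdown');

// Swap PouchDB's filesystem storage for the Redis-backed LevelDOWN
// adapter, so the PaaS's ephemeral disk doesn't wipe data on restart.
const RedisPouch = PouchDB.defaults({ db: redisdown });

const app = express();
app.use('/db', require('express-pouchdb')(RedisPouch));
app.listen(process.env.PORT || 3000);
```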

Improving response time in AngularJS web app using mongolab, nodejs and express

I am developing an Angular web app that receives its data from a Node.js/Express API.
This API runs Mongoose, which connects to MongoLab (the free account).
When receiving data, I experience a response time > 500ms for small data sets (1.5kB) and > 1s for "large" data sets (hundreds of kB).
This is clearly already too much, and I am afraid it will get even worse as my DB grows.
The current process is as follow:
Client goes to mysite.com/discover
Server sends the Angular app
Client does an ajax request to mysite.com/api/collections
Server connects to MongoLab, receives data
Server sends the data back to the client
This process is very fast in local development (local Node, local MongoDB) (<20ms) but takes much longer online. I investigated what was taking so long and found two equal contributions:
API response time
MongoLab response time
The MongoDB query takes no time (<1ms).
The Question
What are my options for reducing this response time? Is it possible to store the data locally and use MongoLab as a "copy" (that would remove the MongoDB latency in most cases)? If so, would you suggest temporary disk storage, a MongoDB replica, ...?
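Something like the sketch below is what I have in mind for the local "copy" (Collection is a placeholder Mongoose model and the TTL is arbitrary):

```js
// A simple in-process cache in front of the MongoLab round trip.
const cache = { data: null, fetchedAt: 0 };
const TTL_MS = 60 * 1000; // arbitrary 1-minute freshness window

async function getCollections(Collection) {
  if (cache.data && Date.now() - cache.fetchedAt < TTL_MS) {
    return cache.data; // serve from memory, skip the DB round trip
  }
  cache.data = await Collection.find({}).lean().exec();
  cache.fetchedAt = Date.now();
  return cache.data;
}
```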
What I tried
I migrated my MongoLab DB to match the physical location of my server (a VM on DigitalOcean); it improved things by about 50ms, not much more.
Thanks a lot

Very slow (~1000ms) response time. Heroku. Node.js. MongoLab. Doing almost nothing

With the free account on Heroku and the free account on MongoLab (not via the Heroku plug-in) I get a response time of ~1000ms per request (single user, it is just me still; this applies to all requests, not only the first one after a long idle time).
I've checked from my own computer with the same free MongoLab account and I get ~168ms for the same type of requests.
While that is still high, I want to ask about Heroku: is it reasonable to have such a poor response time, even with the free account?
Will the response time get significantly better when I pay them?
Mongoose, MongoDB (node.js) Native Driver.
Do you have any idea what I should check?
MongoLab helped me realize my database was set up in Europe, while the server (Heroku) is in the US. They also told me how to clone the existing database to a new one in us-east-1 (from their web console).
