Node.js Express slowing down after 5 requests - node.js

I've come across a strange issue in my app (Node.js/jQuery/Mongo/Express setup) where connections slow down after 5 attempts.
I have a page with 10 items on it; clicking any of them opens a lightbox, which in turn grabs data from the database. Clicking a button to open the lightbox shows the information fine, until you've opened/closed the lightbox about 6 times.
I'm running Node v9.0.0, and I've set maxSockets to Infinity.
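(For reference, a minimal sketch of what that setting usually looks like; note this only affects outgoing HTTP requests made from Node, and I'm assuming the default global agents are the ones in play:)
var http = require('http');
var https = require('https');

// Typical way the maxSockets advice is applied; it governs Node's own
// outgoing connections, not the browser's per-host connection limit.
http.globalAgent.maxSockets = Infinity;
https.globalAgent.maxSockets = Infinity;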
My route:
app.get('/active-item', function(req, res) {
  if (typeof req.headers.referer !== 'undefined') {
    if (req.session.user !== undefined) {
      res.render('pages/active-item', { user: req.session.user, email_address: req.session.user.emailAddress });
    } else {
      res.render('pages/active-item');
    }
  } else {
    res.redirect('/');
  }
});
On the 6th attempt it just doesn't load the data in the lightbox, and then after about 10-20 seconds I see the requests come through in my Express logs. Very strange issue; I thought it was a maxSockets problem, but it doesn't seem to be.
I've seen similar issues online, but they all seem to be solved by setting maxSockets to Infinity. Am I leaving a connection open in my route? I can't think of anything else I haven't covered.
Any ideas?

Related

Wait for Electron window to be a specific URL?

So I am writing an application in Node.js and Electron, and I am trying to log in through Google in the same session, then load another URL. I have the session working and the Google login working, but once I'm logged in to Google I want it to switch and load another URL. The current idea I have is something like this:
win.loadURL('https://accounts.google.com/').then(() => {
});

setInterval(() => {
  while (!win.webContents.getURL().includes("myaccount.google")) {
    if (win.webContents.getURL().includes("myaccount.google")) {
      break;
    }
  }
  clearInterval();
}, 100);

win.loadURL('http://' + url);
I just don't know what else to do. I know this is fairly spaghetti, but I've tried so many things and nothing seems to work correctly. I feel like I shouldn't be using a while loop at all, because it seems to freeze my browser (understandably).
Listen for the event 'did-navigate'. I would like to test this for you, but I can't at this time.
Also: electron webview navigation event
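For what it's worth, a rough sketch of what listening for 'did-navigate' could look like here; the 'myaccount.google' check is taken from the question's own loop and may need adjusting:
// Fires on every top-level navigation, including the redirects Google does after login.
win.webContents.on('did-navigate', (event, navigatedUrl) => {
  if (navigatedUrl.includes('myaccount.google')) {
    // Login flow has finished; move on to the real target.
    win.loadURL('http://' + url);
  }
});

win.loadURL('https://accounts.google.com/');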

Save state of TCP socket (connection) after page refresh

I'm building a node.js & express app that connects to an IoT device over TCP. On the index page of the app I am rendering the page and running a function that starts to ping the device. Eventually the device responds, I open a TCP socket, and I use socket.io to emit an event to the front end. This process takes much longer than the time to render the page.
When I refresh the page, I do not want to re-ping the device. I need to "save" the state of the connection. Knowing that the device is already connected, I should not need to re-run my connection function.
Possible solutions, my thoughts:
1. Boolean variable for TCP socket status. In the node.js net documentation I do not see a variable for socket connection status. Another Stack Overflow answer said ._connected is undocumented and could work, but 'this is not the node.js way'.
2. Sessions. I could save device state in a session and keep track of it on reload. However, based on my reading I can't save the session information after res.render is called. I specifically want to save the connection status after reload.
3. Use a local variable. However, this is 'reset' on page load.
4. Save state in a JSON file. Use a separate deviceState.js file with state information. I could export that file and use it as a required module in my index page.
My question is - how can I save the state of the device connection even when the page is reloaded? My hunch is there is some combo of session and local variable but I am not sure how these could work based on my points above.
Here's a simplified version of the index route. Let me know if it is missing anything that would help solve this problem:
router.get('/', function(req, res, next) {
  function connectToDevice() {
    // ping device and open TCP socket...
    // eventually the following function is called as an event listener to
    // a net socket.on('connect')...
    function onConnect(socket) {
      res.io.emit('machine-online');
    }
  }
  connectToDevice();
  res.render('index', {
    title: 'Page title'
  });
});
This is my first time posting on Stack Overflow. I am still learning the relevant keywords and have been unable to find a solution to this problem.
The way I solved this is #4, saving state in an external JSON file.
deviceStatus.js: File at the root of the app structure that holds some information in a JSON object.
var status = {};
var deviceStatus;

deviceStatus = function() {
  status = {
    "isOnline": false
  };
  return status;
};

module.exports = deviceStatus();
Then in my index.js: Require the deviceStatus module.
var status = require('../deviceStatus');
And I am using this to render the page: in the (not shown) connectToDevice() function, I set status.isOnline = true. So here, if the device is offline, I connect and then render the page. If not, I only render the page and do not connect.
if (status.isOnline == false) {
  connectToDevice();
}

res.render('index', {
  title: 'Page title',
  machineOnline: status.isOnline
});
There might be a better way to do this, but this is the method that works for me. When the app reloads, status.isOnline starts as false, which works since the device is not connected yet.
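For illustration, here is a minimal sketch of how connectToDevice might flip that flag; the host/port and the bare net.connect call are assumptions, since the real ping/connect logic is not shown in the question:
var net = require('net');
// `status` is the object exported by deviceStatus.js (require('../deviceStatus') as above)

function connectToDevice() {
  // Hypothetical device address; the real discovery/ping logic is not shown.
  var socket = net.connect({ host: '192.168.1.50', port: 5000 });

  socket.on('connect', function() {
    status.isOnline = true;   // lives with the server process, so it survives page reloads
    // emit 'machine-online' to the front end here, as in the question
  });

  socket.on('close', function() {
    status.isOnline = false;  // reconnect on the next page load
  });
}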

How to catch when a user leaves the page in Meteor and/or Iron router?

I'm trying to catch when a user leaves my Meteor application (version 1.2.0.2); something equivalent to the Socket.IO disconnect() on the server side.
The user could close their browser, go to another website, or simply refresh the page, and it would fire anyway.
Surprisingly, I've been searching the Internet and everything is mixed up; nothing works properly. I thought Meteor was literally built on this kind of live processing, so it must handle this event one way or another.
The Iron Router documentation specifies this:
onStop: Called when the route is stopped, typically right before a new
route is run.
I also found Router.load and Router.unload, but none of them work. This is my current [not working] code, which is quite simple:
Router.configure
  layoutTemplate: 'MasterLayout'
  loadingTemplate: 'Loading'
  notFoundTemplate: 'NotFound'

Router.onStop (->
  console.log('Try to stop')
  Users.insert({
    name: "This is a test"
    lat: 0
    lng: 0
  })
)
Am I doing something wrong here ? How do you catch this event in my app ?
You need to attach to the onStop of the route, not the router. For instance:
Router.route('/', {
  onStop: function() {
    console.log("someone left the '/' route");
  }
});
Another option is to use the onStop event of subscriptions. That is probably the option most similar to the socketio disconnect you mentioned. You can find an example of that in the typhone source code.
There were two working solutions; I found the second and best one by searching the API documentation for a while.
First solution: working with subscribe & publish
Anywhere on the controller / front-end side you must subscribe to a collection.
# in coffee
@subscribe('allTargets')
# in javascript
this.subscribe('allTargets')
Afterwards you just have to publish and add an onStop listener. This example uses a Targets collection I already defined somewhere before; it just gets all the entries.
# in coffee
Meteor.publish 'allTargets', ->
  @onStop ->
    # Do your stuff here
  return Targets.find()

# in javascript
Meteor.publish('allTargets', function() {
  this.onStop(function() {
    // Do your stuff here
  });
  return Targets.find();
});
You have to be careful not to return Targets.find() before you set the onStop listener, too. I don't think it's a perfect solution, since you don't listen to the connection itself but to changes in a collection.
Second solution: working with the DDP connection
I realized through the Meteor API documentation that we can directly listen to the connection and see, from the server side, if someone disconnects.
To stay well-organized and clean within my Meteor Iron project, I added a new file at app/server/connection.coffee and wrote this code:
# in coffee
Meteor.onConnection (connection) ->
  connection.onClose ->
    # Do your stuff

# in javascript
Meteor.onConnection(function(connection) {
  connection.onClose(function() {
    // Do your stuff
  });
});
You can manage data with connection.id, which is the unique identifier of your browser tab. Both solutions are working well for me.
If you use Meteor.userId through their accounts system, you can't use it outside a method on the server side, so I had to find a workaround with the connection.id.
If anyone has a better solution for managing connections while getting this kind of client data, don't hesitate to give your input.
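As a small illustration of keeping per-tab data keyed by connection.id (the connectedClients object and the fields stored in it are made up for this sketch):
var connectedClients = {};

Meteor.onConnection(function(connection) {
  // Remember something about this tab, keyed by its connection id.
  connectedClients[connection.id] = { connectedAt: new Date() };

  connection.onClose(function() {
    // The tab closed, navigated away or timed out: clean up.
    delete connectedClients[connection.id];
  });
});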

Caching responses in express

I have some real trouble caching responses in Express… I have one endpoint that gets a lot of requests (around 5k rpm). This endpoint fetches data from MongoDB, and to speed things up I would like to cache the full JSON response for 1 second so that only the first request each second hits the database while the others are served from the cache.
When abstracting out the database part of the problem, my solution looks like this: I check for a cached response in Redis. If one is found, I serve it. If not, I generate it, send it, and set the cache. The timeout is to simulate the database operation.
app.get('/cachedTimeout', function(req, res, next) {
  redis.get(req.originalUrl, function(err, value) {
    if (err) return next(err);
    if (value) {
      res.set('Content-Type', 'text/plain');
      res.send(value.toString());
    } else {
      setTimeout(function() {
        res.send('OK');
        redis.set(req.originalUrl, 'OK');
        redis.expire(req.originalUrl, 1);
      }, 100);
    }
  });
});
The problem is that this will not make only the first request each second hit the database. Instead, all requests that come in before we have had time to set the cache (within the first 100 ms) will hit the database. When adding real load to this it really blows up, with response times around 60 seconds because a lot of requests fall behind.
I know this could be solved with a reverse proxy like Varnish, but currently we are hosting on Heroku, which complicates such a setup.
What I would like is some sort of reverse-proxy cache inside Express. I would like all the requests that come in after the initial request (the one that generates the cache) to wait for the cache generation to finish and then use that same response.
Is this possible?
Use a proxy layer on top of your node.js application. Varnish Cache would be a good choice, working with Nginx, to serve your application.
p-throttle should do exactly what you need: https://www.npmjs.com/package/p-throttle
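For the in-process behaviour the question asks for (every request behind the first one waiting for the same response), a rough sketch using a shared pending promise is below; fetchFromDb is a hypothetical helper returning a promise, and the 1-second TTL mirrors the question:
var pending = null;   // promise for the response currently being generated
var cached = null;    // { body: ..., expires: timestamp }

app.get('/cachedTimeout', function(req, res, next) {
  if (cached && cached.expires > Date.now()) {
    return res.send(cached.body);
  }
  if (!pending) {
    // First request after the cache expired: generate the response exactly once.
    pending = fetchFromDb()
      .then(function(body) {
        cached = { body: body, expires: Date.now() + 1000 };
        pending = null;
        return body;
      }, function(err) {
        pending = null;
        throw err;
      });
  }
  // Every request behind the first one waits for the same in-flight generation.
  pending.then(function(body) {
    res.send(body);
  }, next);
});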

node.js runs concurrently, or does it?

Boys and girls,
I've been messing around with node.js today and I can't seem to reproduce this concurrent magic.
I wrote this rather small server:
var http = require("http");

var server = http.createServer(function(req, res) {
  setTimeout(function() {
    res.writeHead(200, {"content-type": "text/plain"});
    res.end("Hello world!");
  }, 10000);
});

server.listen(8000);
But what's strange: when running localhost:8000 in multiple Chrome tabs at the same time, it's as if the requests are 'queued'. The 1st tab takes 10 seconds, the 2nd tab takes 20 seconds, the 3rd tab takes 30 seconds, etc...
But when running this very example with Links it behaves how I expect it to (handling requests concurrently).
P.S. This seems to occur in Chrome and Firefox.
Bizarre.
Requests for the same URL/hostname get queued client-side in the browser. That has nothing to do with node.js; your code is fine.
If you use a different URL in each tab, your example should work (for a few tabs).
Also have a look at: Multiple Ajax requests for same URL
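If you want to see the concurrency without the browser's per-URL queueing getting in the way, you can fire the requests from Node itself. A small sketch (assuming the 10-second server from the question is listening on port 8000):
var http = require('http');

var start = Date.now();

// Fire three requests at once; with the 10-second setTimeout in the server above,
// all three should finish after roughly 10 s, not 10/20/30 s.
for (var i = 0; i < 3; i++) {
  http.get('http://localhost:8000/?n=' + i, function(res) {
    res.resume(); // drain the response body
    res.on('end', function() {
      console.log('finished after', Date.now() - start, 'ms');
    });
  });
}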
