Node.js Flow Process

I recently started working on Node.js, but got confused about the single-thread concept when using global (or var) variables across different files that are included with require. Below is my code.
var express = require("express");
var app = express();
var mysql = require("mysql");
var comm_fun = require("./common_functions");
var global_res = ''; // variable to send the response back to the browser from any function
var global_req = '';// variables to save request data
app.listen(1234,function(req,res){
console.log("server started");
global_res = res;
global_req = req;
mysql = '';// code to have mysql connection in this variable
});
Now, I can use the mysql, global_res and global_req variables in the different files that are included. But will these affect the values for another request, since they are global?
For example:
request 1 has value 1 for global_req,
at the same time request 2 comes in,
request 2 has value 2 for global_req.
Will these two requests collide at any point? Can global_req be overwritten from 1 to 2 because the second request has arrived, or are the two requests separate and guaranteed never to collide?
Thanks,

Yes, and here's a very simple scenario:
Request 1 comes in, sets global ID to 1
Request 1 then performs some IO-bound operation (e.g. DB query)
Meanwhile request 2 comes in and sets the global ID to 2
The IO-bound operation completes and request 1 continues but the global ID is now 2 and not 1
The argument about Node being single-threaded only really helps if your application is purely CPU-bound; since the majority of Node applications are IO-bound, their requests are in effect interleaved. The problem lies in the fact that we can't guarantee the order in which callbacks will return, and with that in mind it's fairly simple to create a race condition, e.g.
let global_id = 0;
...
app.use((req, res, next) => { global_id++; next(); });

app.post('/create', async (req, res, next) => {
  try {
    const exists = await db.query(`SELECT id FROM table WHERE id = ${global_id}`);
    if (!exists) {
      await db.query(`INSERT INTO ....`);
    }
  } catch (e) {
    return next(e);
  }
});
If 2 requests hit the /create endpoint simultaneously, it would be very unlikely that both would succeed (at least correctly).
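One way to avoid the collision is to keep per-request state in local variables or on the req/res objects rather than in module-level globals. A minimal sketch, assuming a hypothetical promise-based db.query(sql, params) client:

const express = require('express');
const app = express();
app.use(express.json());

app.post('/create', async (req, res, next) => {
  try {
    // Everything this request needs lives in local variables or on req/res,
    // so a concurrent request cannot overwrite it.
    const id = req.body.id;
    const rows = await db.query('SELECT id FROM table WHERE id = ?', [id]); // hypothetical promise-based client
    if (rows.length === 0) {
      await db.query('INSERT INTO table (id) VALUES (?)', [id]);
    }
    res.send('ok');
  } catch (e) {
    next(e);
  }
});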

Related

Custom Computed Etag for Express.js

I'm working on a simple local image server that provides images to a web application along with some JSON. The web application has pagination that does a GET request to "/images?page=X&limit=200" on an express.js server, which returns the JSON files in a single array. I want to take advantage of the browser's internal caching so that, if a user goes back to a previous page, express.js returns an ETag.
I was wondering how this could be achieved with express.js? For this application I really just want the computation of the ETag to take three parameters: the page, the directory, and the limit (it doesn't need to consider the whole JSON body). Also, this application is for local use only, so I want the server to do the heavy lifting, since I figured it would be faster than the browser. I did see https://www.npmjs.com/package/etag which seems promising, but I'm not sure how to use it with express.js.
Here's a boilerplate of the express.js code I have below:
var express = require('express');
var app = express();
var fs = require('fs');
app.get('/', async (req, res) => {
  let files = [];
  let directory = fs.readdirSync("mypath");
  let page = parseInt(req.query.page);
  let limit = parseInt(req.query.limit);
  for (let i = 0; i < limit; ++i) {
    files.push(new Promise((resolve) => {
      // directory[] holds the file names returned by readdirSync
      fs.readFile("mypath/" + directory[i + page * limit], (err, data) => {
        // format the data so it is easy to use in the UI
        resolve(JSON.parse(data));
      });
    }));
  }
  let results = await Promise.all(files);
  // compute an etag here and attach it to the results.
  res.send(results);
});
app.listen(3000);
When your server sends an ETag to the client, it must also be prepared to check the ETag that the client sends back to the server in the If-None-Match header in a subsequent "conditional" request.
If it matches, the server shall respond with status 304; otherwise there is no benefit in using ETags.
var serverEtag = "<compute from page, directory and limit>";
var clientEtag = req.get("If-None-Match");
if (clientEtag === serverEtag) res.status(304).end();
else {
  // Your code from above
  res.set("ETag", serverEtag);
  res.send(results);
}
The computation of the serverEtag could be based on the time of the last modification in the directory, so that it changes whenever any of the images in that directory changes. Importantly, this could be done without carrying out the fs.readFile statements from your code.
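For illustration, here is a sketch of that idea (computeEtag is just a name for this example); it hashes the page, limit, directory path and the directory's modification time, so no image file ever needs to be read:

const crypto = require('crypto');
const fs = require('fs');

// Hypothetical helper: derive an ETag from the query parameters and the
// directory's last-modified time.
function computeEtag(directoryPath, page, limit) {
  const { mtimeMs } = fs.statSync(directoryPath);
  const hash = crypto
    .createHash('md5')
    .update(directoryPath + ':' + page + ':' + limit + ':' + mtimeMs)
    .digest('hex');
  return '"' + hash + '"';
}

Depending on how the images change (files added or removed versus edited in place), you might hash the mtime of the newest file instead of the directory's own mtime.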

Socket.IO limiting to only 6 connections in Node.js

So I came across a problem. I am trying to send {id} to my REST API (Node.js) and in response I get data on the socket.
Problem:
For the first 5-6 times it works perfectly fine, displays the id and sends data back to the socket. But after the 6th time it does not get the id.
I tried https://github.com/socketio/socket.io/issues/1145 but it didn't solve the problem.
On recompiling the server it shows the previous {ids} that I entered after the 6th time. It's as if, after 5-6 times, it is storing the id in some form of cache.
Here is my API route.
// this route only gets {id} 5-6 times. After 5-6 times it stops displaying the received {id}.
const express = require("express");
var closeFlag = false;
const PORT = process.env.SERVER_PORT; //|| 3000;
const app = express();
var count = 1;
http = require('http');
const dgram = require('dgram'); // needed for dgram.createSocket() below
http.globalAgent.maxSockets = 100;
http.Agent.maxSockets = 100;
const serverTCP = http.createServer(app)
// const tcpsock = require("socket.io")(serverTCP)
const tcpsock = require('socket.io')(serverTCP, {
cors: {
origin: '*',
}
, perMessageDeflate: false
});
app.post("/getchanneldata", (req, res) => {
console.log("count : "+count)
count++;// for debugging purpose
closeFlag = false;
var message = (req.body.val).toString()
console.log("message : "+message);
chanId = message;
client = dgram.createSocket({ type: 'udp4', reuseAddr: true });
client.on('listening', () => {
const address = client.address();
});
client.on('message', function (message1, remote) {
var arr = message1.toString().split(',');
});
client.send(message, 0, message.length, UDP_PORT, UDP_HOST, function (err, bytes) {
if (err) throw err;
console.log(message);
console.log('UDP client message sent to ' + UDP_HOST + ':' + UDP_PORT);
// message="";
});
client.on('disconnect', (msg) => {
client.Diconnected()
client.log(client.client)
})
}
);
There are multiple issues here.
In your app.post() handler, you don't send any response to the incoming http request. That means that when the browser (or any client) sends a POST to your server, the client sits there waiting for a response, but that response never comes.
Meanwhile, the browser has a limit on how many requests it will send simultaneously to the same host (I think Chrome's limit is coincidentally 6). Once you hit that limit, the browser queues the request and waits for one of the previous connections to return its response before sending another one. Eventually (after a long time), those connections will time out, but that takes a while.
So, the first thing to fix is to send a response in your app.post() handler. Even if you just do res.send("ok");. That will allow the 7th and 8th and so on requests to be immediately sent to your server. Every incoming http request should have a response sent back to it, even if you have nothing to send, just do a res.end(). Otherwise, the http connection is left hanging, consuming resources and waiting to eventually time out.
On a separate note, your app.post() handler contains this:
client = dgram.createSocket({ type: 'udp4', reuseAddr: true });
This has a couple issues. First, you never declare the variable client so it becomes an implicit global (which is really bad in a server). That means successive calls to the app.post() handler will overwrite that variable.
Second, it is not clear from the included code when, if ever, you close that udp4 socket. It does not appear that the server itself ever closes it.
Third, you're recreating the same UDP socket on every single POST to /getchanneldata. Is that really the right design? If your server receives 20 of these requests, it will open up 20 separate UDP connections.
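A rough sketch of those fixes, reusing the UDP_HOST and UDP_PORT values from your code and assuming a single shared UDP socket fits your protocol:

const dgram = require('dgram');

// One properly declared UDP socket for the whole server,
// instead of a new implicit-global socket per POST.
const udpClient = dgram.createSocket({ type: 'udp4', reuseAddr: true });
udpClient.on('message', function (message1, remote) {
  const arr = message1.toString().split(',');
  // ... forward the parsed data to your socket.io clients here
});

app.post('/getchanneldata', (req, res) => {
  const message = String(req.body.val);
  udpClient.send(message, 0, message.length, UDP_PORT, UDP_HOST, (err) => {
    if (err) return res.status(500).send(err.message);
    // Always answer the HTTP request so the browser's per-host connection
    // limit (around 6) is never exhausted.
    res.send('ok');
  });
});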

Will a Lambda function using Express reload its middleware for each request?

TLDR: Will the middleware that gets called by app.use be recalculated for each individual request in Express?
I have a Lambda function that uses the following function as its middleware to inject the user into every request:
async function(req, res, next) {
try {
const IDP_REGEX = /.*\/.*,(.*)\/(.*):CognitoSignIn:(.*)/;
const authProvider =
req.apiGateway.event.requestContext.identity
.cognitoAuthenticationProvider;
const [, , , userId] = authProvider.match(IDP_REGEX);
const cognito = new AWS.CognitoIdentityServiceProvider();
const listUsersResponse = await cognito
.listUsers({
UserPoolId: process.env.AUTH_SAYM_USERPOOLID,
Filter: `sub = "${userId}"`,
Limit: 1,
})
.promise();
const user = listUsersResponse.Users[0];
req.user = user;
next();
} catch (error) {
console.log(error);
next(error);
}
}
My question is: will this middleware run once for each individual user? Or will this code only run each time the Lambda function "goes to sleep"? What I mean is: if the Lambda function gets pinged often and never goes idle again, will there be a bug where each request gets the user who made the first request? Or is the middleware computed for each individual request?
This middleware will run upon every request, and this has nothing to do with Lambda. Once a request reaches your Express-based Lambda function, it just acts as a regular Express server.
The middleware's responsibility is to intercept every HTTP call, regardless of where it's running.
The only thing I see here is to move the declaration of const cognito = new AWS.CognitoIdentityServiceProvider(); out of your middleware's scope, so the instance of this object is cached for as long as the container lives. You will gain a couple of microseconds by doing so (which may be irrelevant, but is indeed a good practice).
You have to keep in mind though that since modules in Node.js are singletons, anything required/declared in "global" scopes (outside a given function, for example) is going to be re-used in future invocations to the same running container. Use this in your favour but also be careful with unintentionally caching things you don't want to.
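For illustration, here is a sketch of that change (injectUser is just a name for this example): the Cognito client is created once at module scope and reused across warm invocations, while the user lookup still runs on every request.

const AWS = require('aws-sdk');

// Created once per container and reused across warm invocations.
const cognito = new AWS.CognitoIdentityServiceProvider();

async function injectUser(req, res, next) {
  try {
    const IDP_REGEX = /.*\/.*,(.*)\/(.*):CognitoSignIn:(.*)/;
    const authProvider =
      req.apiGateway.event.requestContext.identity.cognitoAuthenticationProvider;
    const [, , , userId] = authProvider.match(IDP_REGEX);
    // This lookup still runs for every single request.
    const listUsersResponse = await cognito
      .listUsers({
        UserPoolId: process.env.AUTH_SAYM_USERPOOLID,
        Filter: `sub = "${userId}"`,
        Limit: 1,
      })
      .promise();
    req.user = listUsersResponse.Users[0];
    next();
  } catch (error) {
    console.log(error);
    next(error);
  }
}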

How to properly use dataloaders across multiple users?

In the "Caching per request" section of the DataLoader docs, the following example is given that shows how to use dataloaders with Express.
function createLoaders(authToken) {
return {
users: new DataLoader(ids => genUsers(authToken, ids)),
}
}
var app = express()
app.get('/', function(req, res) {
var authToken = authenticateUser(req)
var loaders = createLoaders(authToken)
res.send(renderPage(req, loaders))
})
app.listen()
I'm confused about passing authToken to the genUsers batch function. How should a batch function be composed so that it uses authToken and returns each user's corresponding results?
What the example is saying is that genUsers should use the credentials of the current request's user (identified by their auth token) to ensure they can only fetch data they're allowed to see. Essentially, the loader gets initialised at the start of the request, is discarded at the end, and is never recycled between requests.
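A sketch of what the batch function could look like, assuming a hypothetical fetchUsersByIds(authToken, ids) call; the important parts are that it performs the fetch with the request's authToken and returns exactly one result per id, in the same order as the ids it received:

// Hypothetical batch function: one backend call per batch of ids, made with
// the current request's auth token so it only sees what that user may see.
async function genUsers(authToken, ids) {
  const users = await fetchUsersByIds(authToken, ids); // hypothetical API/DB call
  const byId = new Map(users.map(user => [user.id, user]));
  // DataLoader requires one entry per key, in the same order as `ids`.
  return ids.map(id => byId.get(id) || new Error(`No user with id ${id}`));
}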

Automatically Refresh an 'app.get()' in Node.js

I tried to find this topic everywhere, unsuccessfully.
I have an app.get() route, really simple, that reads one piece of information from a variable that is constantly changing ("equip[id].status"), and I want it to check whether the variable has changed from time to time.
The other option is to keep track of this variable and, when it changes, run app.get() again.
My codes:
This is the one I want to refresh:
app1.get("/", function(req,res){
id = 1;
res.render(__dirname + '/connected.html', {equipment: 'Esteira 1', status: equip[id].status, historico: equip[id].hist});
});
And this are the ones that change "status"
app1.get("/open", function(req,res){
id = 1;
equip[id].status = 'Conectado';
equip[id].hist = equip[id].hist + equip[id].status;
var now = new Date();
time = dateFormat(now,"h:MM:ss");
console.log(time+" - Equipment 'Esteira 1' has connected on port 3001");
res.end();
});
app1.get("/closed", function(req,res){
id = 1;
equip[id].status = 'Desconectado';
equip[id].hist = equip[id].hist + equip[id].status;
var now = new Date();
time = dateFormat(now,"h:MM:ss");
console.log(time+" - Equipment 'Esteira 1' has disconnected");
res.end();
});
app.get() is server-side code, and on its own it has no power to change the client side.
On the client side, you need JavaScript code to either poll the server regularly (AJAX) or actively listen to the server for changes through WebSockets. That way you can also choose to either refresh the whole page or just load the relevant changes (like this very site does!).
You should look into these relevant technologies: javascript, ajax, long-polling, socket.io
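For example, a minimal polling sketch, assuming a hypothetical /status JSON route and an element with id="status" in connected.html:

// Server side: expose the changing values as JSON.
app1.get('/status', function (req, res) {
  var id = 1;
  res.json({ status: equip[id].status, historico: equip[id].hist });
});

// Client side (in a <script> tag inside connected.html): poll every few
// seconds and update the page without a full reload.
setInterval(async function () {
  var data = await (await fetch('/status')).json();
  document.getElementById('status').textContent = data.status;
}, 3000);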
