Pushing query result from back end to front end with interval - node.js

I'm playing around with sockets for the first time and have created a very simple back end and front end to test with our system at work.
I want the back end to query our server every 10 seconds in this example and pass the results to the front end.
I currently have the interval set to 10 seconds, but when I run it I only get the result after ten seconds. I want the result straight away, and then to check every ten seconds for changes.
I've tried moving code around and seeing what works, but I usually get a message telling me the variable is undefined (because it is then obviously outside the function).
My code is below. I am aware it's probably a bit overkill having the setInterval in both the result and catch, so if anyone can help tidy it up so it works correctly, I'd appreciate it. Still a bit of a noob, I'm afraid!
const express = require("express");
const app = express();
const server = require("http").createServer(app);
const io = require("socket.io")(server);
const oledb = require('oledb');

const smartconn = `--myodbcconnection--`;
const db = oledb.odbcConnection(smartconn);

let command = `SELECT item FROM mytable.table LIMIT 10`;

db.query(command)
    .then(result => {
        setInterval(function(){
            io.emit("query", result.result);
        }, 10000);
    },
    err => {
        setInterval(function(){
            io.emit("query", err);
        }, 10000);
    });

io.set("origins", "*:*");

io.on("connection", async (socket) => {
    console.log("Client Successfully Connected");
});

server.listen(5000, () => {
    console.log("Backend Server is running on http://localhost:5000");
});
I expect results to show immediately. The previous method didn't use sockets and polled using setInterval from the front end, which I want to move away from.

Here's one way to do it:
// Resolves a Promise after ms milliseconds
const sleep = (ms) => {
    return new Promise((resolve, reject) => {
        setTimeout(resolve, ms);
    });
}

// Returns a db.query result with server status
const getServerStatus = async () => {
    return await db.query(`SELECT item FROM mytable.table LIMIT 10`);
}

// Runs indefinitely and emits events to the frontend every 10 seconds
(async () => {
    while (true) {
        try {
            const result = await getServerStatus();
            io.emit("query", result.result);
        } catch (error) {
            io.emit("query", error); // It would be better to change the event name to something else
        }
        await sleep(10000);
    }
})();
This way the first event will be sent immediately because unlike setInterval, my implementation waits after executing the code, not before. Also, you can be sure that queries will not overlap when they take more than 10s to execute. setInterval doesn't guarantee that the next execution will wait for the previous one to finish.
The downside is that events will be sent after (10,000ms + query_delay), so depending on your database size and performance, some of them might get delayed by a few or a few hundred milliseconds. To counter that, you can measure getServerStatus's execution time and subtract it from the wait time.
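A minimal sketch of that compensation, reusing the sleep and getServerStatus helpers from above:

// Measure how long the query took and sleep only for the remainder,
// so events fire roughly every 10,000ms regardless of query duration.
(async () => {
    while (true) {
        const start = Date.now();
        try {
            const result = await getServerStatus();
            io.emit("query", result.result);
        } catch (error) {
            io.emit("query", error);
        }
        const elapsed = Date.now() - start;
        await sleep(Math.max(0, 10000 - elapsed));
    }
})();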

Sorted it by moving the emit functions into the io.on function, as sketched below.
Cheers!
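For anyone landing here later, a rough sketch of what that fix might look like, reusing the db, command and io from the question (the per-socket interval and the disconnect cleanup are my assumptions, not the original poster's exact code):

io.on("connection", async (socket) => {
    console.log("Client Successfully Connected");
    // Emit the current result immediately on connect...
    try {
        const result = await db.query(command);
        socket.emit("query", result.result);
    } catch (err) {
        socket.emit("query", err);
    }
    // ...then re-run the query every 10 seconds for this client.
    const timer = setInterval(async () => {
        try {
            const result = await db.query(command);
            socket.emit("query", result.result);
        } catch (err) {
            socket.emit("query", err);
        }
    }, 10000);
    // Stop polling when the client disconnects so timers don't leak.
    socket.on("disconnect", () => clearInterval(timer));
});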

Related

setTimeout gets overwritten by second setTimeout in Node.js

const express = require("express");

const REFRESH_INTERVAL = 1000;
const app = express();

const recurringSetTimeout = (id) => {
    setTimeout(
        handle = async () => {
            console.log(`Starting setTimeout${id}`);
            setTimeout(handle, REFRESH_INTERVAL);
        },
        REFRESH_INTERVAL,
    )
}

const main = () => {
    app.get("/t1", async (req, res) => {
        const msg = "Starting timeout 1...";
        console.log(msg)
        recurringSetTimeout(1);
        res.send(msg);
    });

    app.get("/t2", async (req, res) => {
        const msg = "Starting timeout 2...";
        console.log(msg)
        recurringSetTimeout(2);
        res.send(msg);
    });

    app.listen(3000, () => {
        console.log("Server is running...");
    });
}

main();
I have this code that should run two different setTimeouts on two route calls: t1 runs first and t2 runs second. After calling t1 I get correct results in the logs: "Starting setTimeout1", repeated every second. But when I then call t2, I expect to get "Starting setTimeout2" as well as "Starting setTimeout1" from the previous route call. Instead, setTimeout2 somehow overrides setTimeout1: I get only "Starting setTimeout2", but two logs each second instead of one. So setTimeout1 is still running but has been taken over by setTimeout2, since I get only timeout2 logs (2 per second).
If I use setInterval instead of setTimeout, it works fine, but I want to understand this behaviour of setTimeout. Can someone please explain? Thanks!
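For what it's worth, the behaviour is most likely caused by handle = async () => {...} inside the setTimeout call: without const/let, the assignment creates a single implicit global handle shared by both route calls. Each tick reschedules whatever handle currently points to, so once t2 reassigns it, the pending chain from t1 also starts running the id=2 closure, giving two "Starting setTimeout2" logs per second. A minimal sketch of the fix, giving each chain its own local binding:

const recurringSetTimeout = (id) => {
    // A local const: each call gets its own handle,
    // so the two chains can no longer overwrite each other.
    const handle = async () => {
        console.log(`Starting setTimeout${id}`);
        setTimeout(handle, REFRESH_INTERVAL);
    };
    setTimeout(handle, REFRESH_INTERVAL);
};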

Nodejs `fs.createReadStream` as promise

I'm trying to get fs.createReadStream working as a promise, so that it resolves after the entire file has been read.
In the case below, I'm pausing the stream, executing the awaitable method, and resuming.
1. How can I make .on('end', ...) execute at the actual end?
2. If 1. is not possible, why does .on('close', ...) never fire? Maybe I can use it to resolve the promise.
function parseFile<T>(filePath: string, row: (x: T) => void, err: (x) => void, end: (x) => void) {
    return new Promise((resolve, reject) => {
        const stream = fs.createReadStream(filePath);
        stream.on('data', async data => {
            try {
                stream.pause();
                await row(data);
            } finally {
                stream.resume();
            }
        })
        .on('end', (rowCount: number) => {
            resolve(); // NOT REALLY THE END, row(data) is still being called after this
        })
        .on('close', () => {
            resolve(); // NEVER BEING CALLED
        })
        .on('error', (rowCount: number) => {
            reject(); // NEVER GETS HERE, AS EXPECTED
        })
    })
}
UPDATE
Here you can actually test it: https://stackblitz.com/edit/node-czktjh?file=index.js
run node index.js
The output should be 1000 and not 1
Thanks
Something to be aware of: you've removed the line processing from the current version of the question, so the stream is being read in large chunks. It appears to be reading the entire file in just two chunks, thus just two data events, so the expected count here is 2, not 1000.
I think the problem with this code occurs because stream.pause() does not pause the generation of the end event - it only pauses future data events. If the last data event has fired and you then await inside the processing of that data event (which causes your data event handler to immediately return a promise), the stream will think it's done and the end event will fire before you're done awaiting inside the processing of that last data event. Remember, the data event handler is NOT promise-aware, and it appears that stream.pause() only affects data events, not the end event.
You can work around this with a flag that keeps track of whether you're still processing a data event and postpones the end handling until you're done with that last data event. The code below illustrates how to use the flag.
FYI, the missing close event is another stream weirdness. Your nodejs program actually terminates before the close event gets to fire. If you put this at the start of your program:
setTimeout(() => { console.log('done with timer');}, 5000);
Then, you will see the close event because the timer will prevent your nodejs program from exiting before the close event gets to fire. I'm not suggesting this as a solution to any problem, just to illustrate that the close event is still there and wants to fire if your program doesn't exit before it gets a chance.
Here's code that demonstrates the use of flags to work around the pause issue. When you run this code, you will only see 2 data events, not 1000, because this code is not reading lines, it's reading much larger chunks than that. So the expected result of this is not 1000.
// run `node index.js` in the terminal
const fs = require('fs');

const parseFile = row => {
    let paused = true;
    let ended = false;
    let dataCntr = 0;
    return new Promise((resolve, reject) => {
        const stream = fs.createReadStream('./generated.data.csv');
        stream
            .on('data', async data => {
                ++dataCntr;
                try {
                    stream.pause();
                    paused = true;
                    await row(data);
                } finally {
                    paused = false;
                    stream.resume();
                    if (ended) {
                        console.log(`received ${dataCntr} data events`);
                        resolve();
                    }
                }
            })
            .on('end', rowCount => {
                ended = true;
                if (!paused) {
                    console.log(`received ${dataCntr} data events`);
                    resolve();
                }
            })
            .on('close', () => {
                //resolve();
            })
            .on('error', rowCount => {
                reject();
            });
    });
};

(async () => {
    let count = 0;
    await parseFile(async row => {
        await new Promise(resolve => setTimeout(resolve, 50)); // sleep
        count++;
    });
    console.log(`lines executed: ${count}, the expected is more than 1`);
})();
FYI, I still think your original version of the question had the problem I mentioned in my first comment - that you weren't pausing the right stream. What is documented here is yet another problem (where you can get end before your await in the last data event is done).
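As an aside, on Node 10+ a readable stream is async-iterable, which sidesteps the pause/end race entirely: the loop body finishes before the next chunk is pulled, and the loop only exits once the stream has truly ended. A minimal sketch under that assumption:

const fs = require('fs');

// Each chunk is fully processed before the next one is read;
// errors on the stream surface as a rejection of the loop.
async function parseFile(filePath, row) {
    let dataCntr = 0;
    for await (const chunk of fs.createReadStream(filePath)) {
        ++dataCntr;
        await row(chunk);
    }
    return dataCntr;
}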

How to make a function wait for data to appear in the DB? NodeJS

I am facing a peculiar situation.
I have a backend system (nodejs) which is called by the FE (pretty standard :) ). This endpoint (nodejs) needs to call another, external system, get the data it produces and return it to the FE. Until now it all might seem pretty usual, but here comes the catch.
The external system processes asynchronously: it responds to my request immediately but is still processing the data (it saves it in a DB), and I have to get that data from the DB and return it to the FE.
And here goes the question: what is the best (most efficient) way of doing this? It usually takes only a couple of seconds, and I am very hesitant to loop inside the function waiting for the data to appear in the DB.
Another way would be to have the external system call one of my endpoints at the end of its processing (if possible - I would need to check that with the partner) and have the original function wait until that endpoint is called. I'm not sure exactly how to implement that, so if there is any documentation, article or tutorial, I would appreciate it very much if you could share it.
Thanks for the ideas!
I can give you an example that checks the database and waits for a while if it can't find a record. I've mocked the database connection so the example can run.
// Mocking starts
ObjectID = () => {};
const db = {
    collection: {
        find: () => {
            return new Promise((resolve, reject) => {
                // Mock like no record found
                setTimeout(() => { console.log('No record found!'); resolve(false) }, 1500);
            });
        }
    }
}
// Mocking ends

const STANDBY_TIME = 1000; // 1 sec
const RETRY = 5; // Retry 5 times

const test = async () => {
    let haveFound = false;
    let i = 0;
    while (i < RETRY && !haveFound) {
        // Check the database
        haveFound = await checkDb();
        // Increment the attempt counter
        i++
    }
}

const checkDb = () => {
    return new Promise((resolve) => {
        setTimeout(async () => {
            const record = await db.collection.find({ _id: ObjectID("12345") });
            // Check whether you've found the record or not
            if (record) return resolve(true);
            resolve(false);
        }, STANDBY_TIME);
    });
}

test();
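For the second idea in the question (the external system calling you back when it finishes), one common shape is to keep a map of pending promises keyed by a job id and resolve them from the callback endpoint. A sketch with hypothetical names throughout (jobId, /external-callback, startExternalJob):

const express = require('express');
const app = express();
app.use(express.json());

// jobId -> resolve function of the promise the original request is awaiting
const pending = new Map();

const waitForJob = (jobId, timeoutMs = 30000) =>
    new Promise((resolve, reject) => {
        pending.set(jobId, resolve);
        // Don't hang forever if the partner never calls back.
        setTimeout(() => {
            if (pending.delete(jobId)) reject(new Error('Timed out waiting for job ' + jobId));
        }, timeoutMs);
    });

// The endpoint the external system calls when its processing is done.
app.post('/external-callback', (req, res) => {
    const { jobId, data } = req.body;
    const resolve = pending.get(jobId);
    if (resolve) {
        pending.delete(jobId);
        resolve(data);
    }
    res.sendStatus(200);
});

// The FE-facing endpoint: kick off the external job, then wait for the callback.
app.get('/data', async (req, res) => {
    const jobId = await startExternalJob(); // hypothetical call into the partner system
    try {
        const data = await waitForJob(jobId);
        res.json(data);
    } catch (e) {
        res.status(504).send(e.message);
    }
});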

Handling of large array of items in nodejs

I have an array of more than 100 items. Each item hits a URL, and after getting the response from that URL I insert the result into the database.
If I use async/await, I have to wait till all 100 items get processed. But if I do it asynchronously, I get the response immediately, yet because of the asynchronous behaviour the data in the DB will not be visible until the whole process has completed.
Example 1: using async await
var loc = async (locations) => {
    try {
        for (let item of locations) {
            var response = await html(item.url);
            await insertDB(response);
        }
    } catch (e) {
        console.log('error : ', e);
    }
    return 'All operations completed successfully';
};

loc(locations);
locations is an array of items.
With this I will get the response only when all the requests have completed.
Example 2: asynchronously
var loc = (locations) => {
    return new Promise((resolve, reject) => {
        locations.forEach(location => {
            html(location.url, (res) => {
                insertDB(res, (rows) => {
                });
            });
        });
        resolve('All operations completed successfully');
    });
};

loc(locations);
Here I will get the response immediately, but the requests and insertions keep processing in the background.
Considering example 1, is it possible to split the looped requests across child processes to make them execute faster in Node.js, or is there any other way to do it?
Is this what you want? It will work like your 2nd solution, but it will resolve only after every insert has completed:
var loc = async (locations) => {
    const promises = locations.map(async (location) => {
        try {
            var response = await html(location.url);
            await insertDB(response);
        } catch (e) {
            console.log('error : ', e);
        }
    });
    await Promise.all(promises);
    return 'All operations completed successfully';
};
There should be a difference. Could you do me a favor and test the performance?
solution1() is your first solution function,
solution2() is my solution function:
(async () => {
    console.time('solution1');
    await solution1(locations);
    console.timeEnd('solution1');

    console.time('solution2');
    await solution2(locations);
    console.timeEnd('solution2');
})();
Assume each insertDB costs 100ms ~ 200ms, avg = 150ms.
Solution 1:
run loop 1, wait for insertDB -> run loop 2, wait for insertDB
time = 150ms + 150ms = 300ms
Solution 2:
run loop 1 without waiting for insertDB -> run loop 2 without waiting for insertDB
resolve once all insertDB calls complete
time = 150ms
because all the insertDB calls start without waiting for one another
credit: https://stackoverflow.com/a/56993392/13703967
test: https://codepen.io/hjian0329/pen/oNxdPoZ
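If firing 100+ requests at once is too aggressive for the target server, a middle ground is to process the array in fixed-size batches: full parallelism inside a batch, batches in sequence. A sketch assuming the same html and insertDB helpers as above:

var loc = async (locations, batchSize = 10) => {
    for (let i = 0; i < locations.length; i += batchSize) {
        const batch = locations.slice(i, i + batchSize);
        // Run one batch in parallel, then move on to the next.
        await Promise.all(batch.map(async (location) => {
            try {
                const response = await html(location.url);
                await insertDB(response);
            } catch (e) {
                console.log('error : ', e);
            }
        }));
    }
    return 'All operations completed successfully';
};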

Add intentional latency in express

I'm using Express with node.js and testing certain routes. I'm following the tutorial at http://coenraets.org/blog/2012/10/creating-a-rest-api-using-node-js-express-and-mongodb/
I'm calling http://localhost:3000/wines via Ajax (the content doesn't matter), but I want to test latency. Can I do something like make Express respond after 2 seconds? (I want to simulate the Ajax loader, and since I'm running on localhost my latency is pretty much nil.)
Use as middleware, for all your requests
app.use(function(req,res,next){setTimeout(next,1000)});
Just call res.send inside of a setTimeout:
setTimeout(() => {
    res.send(items);
}, 2000);
To apply it globaly on all requests you can use the following code:
app.use((req, res, next) => {
    setTimeout(next, Math.floor((Math.random() * 2000) + 100));
});
Time values are:
Max = 2000 (sort of... the min value is added, so in reality it's 2100)
Min = 100
Try connect-pause module. It adds delay to all or only some routes in your app.
app.get('/fakeDelay', function(req, res){
    let ms = req.query.t;
    ms = (ms > 5000 || isNaN(ms)) ? 1000 : parseInt(ms);
    setTimeout(() => res.status(200).send({ delay: ms }), ms);
})
Then request the URL as: http://localhost/fakeDelay/?t=2000
(max 5000ms and default of 1000ms in this example)
Update:
Using a Promise. The function 'sleep' can be used for delaying any Express response or other async function.
const sleep = (ms) => new Promise(resolve => setTimeout(resolve, ms));

app.get('/fakeDelay', async (req, res) => {
    await sleep(500);
    res.send([]);
})
Just a note on top of the solution by @maggocnx: put this middleware early (before your route handlers):
app.use(function(req,res,next){setTimeout(next,1000)});
You could also just write your own generic delay handler using a Promise or callback (using a q promise in this case):
pause.js:
var q = require('q');

function pause(time) {
    var deferred = q.defer();

    // if the supplied time value is not a number, set it to 0,
    // else use the supplied value
    time = isNaN(time) ? 0 : time;

    // Logging that this function has been called,
    // just in case you forgot about a pause() you added somewhere,
    // and you think your code is just running super-slow :)
    console.log('pause()-ing for ' + time + ' milliseconds');

    setTimeout(function () {
        deferred.resolve();
    }, time);

    return deferred.promise;
}

module.exports = pause;
then use it however you'd like:
server.js:
var pause = require('./pause');

router.get('/items', function (req, res) {
    var items = [];

    pause(2000)
        .then(function () {
            res.send(items);
        });
});
You can use the express-delay package.
The code below will delay all incoming requests by one second.
const app = require('express')();
const delay = require('express-delay');
app.use(delay(1000));
The package also offers the possibility to introduce a random delay within specified boundaries, e.g. by calling delay(1000, 2000) for a delay between one and two seconds.
In my case I wanted a way to have the same processing time for all of my endpoints.
The solution I found is to override one of the Response methods :
/* MINIMUM DELAY */
const minDelay = 200; /* ms */
app.use((req, res, next) => {
    const tmp = res.json;
    const start = new Date().getTime();
    (res.json as any) = async (body: any) => {
        // Wait until at least minDelay has elapsed since the request started
        await new Promise((re) => setTimeout(re, minDelay - new Date().getTime() + start));
        tmp.apply(res, [body]);
    };
    next();
});
This way an attacker would not be able to differentiate failed login requests from OK requests just by looking at the response time :)
