nodejs: create same value in a short period - node.js

I have a sample app where users can access dynamic data via different URLs.
The workflow is like this:
1. When a user requests get_data?id=1234567, it first checks the DB for existing data for that id.
2. If there is none, it generates a random value.
3. If other users request the same URL within a short time (say 10 min), they receive the value that was already generated.
4. If one of the users sends a clear request, the value is cleared from the DB.
The bug is: if 2 users request the same URL at the same time, the DB query takes time, so steps 1 and 2 run concurrently for both requests and a different value is created for each user.
How can I make sure that within a short period it always generates the same value for all users?

Although Node.js is single-threaded and does not have the problem of synchronization between multiple threads, its asynchronous event model can still require you to implement some kind of locking mechanism to synchronize concurrent async operations in certain situations (like in your case).
There are a number of libraries that provide this functionality, e.g. async-mutex. Here's a very basic example of what your code could look like:
const express = require('express');
const app = express();
const Mutex = require('async-mutex').Mutex;

// one mutex per query id
const locks = new Map();

app.get('/get_data', async (req, res) => {
  const queryId = req.query.id;
  if (!queryId) {
    // handle empty queryid ...
  }
  if (!locks.has(queryId)) {
    locks.set(queryId, new Mutex());
  }
  const lockRelease = await locks
    .get(queryId)
    .acquire();
  try {
    // do the rest of your logic here
  } catch (error) {
    // handle error
  } finally {
    // always release the lock
    lockRelease();
  }
});

app.listen(4000, function () {
  console.log("Server is running at port 4000");
});

Related

How should I go about using Redis for creating notifications with express/nodejs?

Okay, so I have a Node.js/Express app with an endpoint that allows users to receive notifications by opening up a connection to said endpoint:

var practitionerStreams = []; // a list of all the streams opened by pract users to the backend

async function notificationEventsHandler(req, res) {
  const headers = {
    'Content-Type': 'text/event-stream',
    'Connection': 'keep-alive',
    'Cache-Control': 'no-cache'
  };
  const practEmail = req.headers.practemail;
  console.log("PRACT EMAIL", practEmail);
  const data = await ApptNotificationData.findAll({
    where: {
      practEmail: practEmail
    }
  });
  //console.log("DATA", data)
  res.writeHead(200, headers);
  res.write(`data:${JSON.stringify(data)}\n\n`);
  // create a new stream
  const newPractStream = {
    practEmail: practEmail,
    res
  };
  // add the new stream to the list of streams
  practitionerStreams.push(newPractStream);
  req.on('close', () => {
    console.log(`${practEmail} Connection closed`);
    practitionerStreams = practitionerStreams.filter(stream => stream.practEmail !== practEmail);
  });
  return res;
}

async function sendApptNotification(newNotification, practEmail) {
  var updatedPractitionerStream = practitionerStreams.map((stream) => {
    // iterate through the array and find the stream that matches the pract email we want,
    // then write the new notification to that stream
    if (stream["practEmail"] == practEmail) {
      console.log("IF");
      stream.res.write(`data:${JSON.stringify(newNotification)}\n\n`);
      return stream;
    } else {
      // if it isn't the stream we want, leave it unchanged
      console.log("ELSE");
      return stream;
    }
  });
  practitionerStreams = updatedPractitionerStream;
}
Basically, when the user connects, it takes the response object (which will stay open), puts it in an object along with a unique email, and writes to it in the future in sendApptNotification.
But obviously this is slow for a full app, so how exactly do I replace this with Redis? Would I still have a response object that I write to? Or would that be replaced with a Redis stream that I can subscribe to on the frontend? I also assume I would store all my streams in Redis as well.
Edit: from the examples I've seen, people are writing events from Redis to the response object.
Thank you in advance.
If you want to use Redis Streams as a notification system, you can follow this official guide:
https://redis.com/blog/how-to-create-notification-services-with-redis-websockets-and-vue-js/
To get this data in real time you need to create a websocket connection. I'd rather point you to this official guide than write it all out for you because of its quality: it's great for understanding how to build this, though of course you'll need to adapt it to your situation.
However, as I said in the comments, I believe it's simpler to poll an endpoint like /api/v1/notifications from your frontend with setInterval, making a request every 5 seconds for example. If you'd prefer a real-time notification system, make sure you understand why you need it, so you can change your system later if you find you do. Basically, it's a trade-off you have to make!
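A rough sketch of that polling approach on the frontend (the endpoint path and the render function are placeholders):

// ask the backend for unread notifications every 5 seconds
setInterval(async () => {
  const response = await fetch('/api/v1/notifications');
  const notifications = await response.json();
  renderNotifications(notifications); // placeholder for your UI update
}, 5000);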
For my example, imagine two tables in a relational database, one for Users and one for Notifications.
The tables of this example:

UsersTable
id  name
1   andrew
2   mark

NotificationTable
id  message   userId  isRead
1   message1  1       true
2   message2  1       false
3   message3  2       false
The endpoint in this example returns all cached notifications that haven't been read by the user. If the cache doesn't exist, it returns the data from the database, puts it in the cache, and returns it to the user; the next API call will get the result from the cache. There are some points left to complete in this example: the database query that fetches the notifications, and the configuration of the cache's expiration time. One more important thing: if you want the cached notifications to stay up to date, you need to create a middleware and trigger it in the parts of your code that notify users; in that case you only update the database and the cache. But I think you can complete these points.
const redis = require('redis');
const redisClient = redis.createClient();

app.get('/notifications', async (request, response) => {
  const userId = request.user.id;
  // try the cache first
  const cacheResult = await redisClient.get(`user:${userId}:notifications`);
  if (cacheResult) return response.send(JSON.parse(cacheResult));
  // cache miss: load from the database, cache it, and return it
  const notifications = await getUserNotificationsFromDatabase(userId);
  await redisClient.set(`user:${userId}:notifications`, JSON.stringify(notifications));
  response.send(notifications);
});
Besides that, there's another way: you can simply use only Redis, or only the database, to manage these notifications. A relational database with the correct indexes will return the results as fast as you expect; the main thing to think about is how many notifications you'll accumulate.

How to handle two long requests simultaneously in expressJS

I have an Express API in which one route takes a long time to get all the required data (it searches through a long JSON object):
router.get(
  "/:server/:maxCraftPrice/:minBenef/:from/:to",
  checkJwt,
  async (req, res) => {
    const getAllAstuces = new Promise(async (resolve, reject) => {
      const { EQUIPMENTS_DIR, RESOURCES_DIR } = paths[req.params.server];
      const astuces = [];
      const { from, to, maxCraftPrice, minBenef } = req.params;
      const filteredEquipments = getItemByLevel(from, to);
      for (const equipment in filteredEquipments) {
        // parsing and push to astuces array
      }
      resolve(astuces);
    });
    const resource = await getAllAstuces;
    return res.json(resource);
  }
);
Now on my website, when someone goes to the page associated with this route, every other request is locked, as if in a queue, while the data is loading.
I tried adding a Promise to handle this, but nothing changed.
Is there a way to handle requests simultaneously, or should I refactor that route to make it faster?
If your request spends a long time in synchronous processing, it will block all other requests until it is done, because Node.js runs your JavaScript on a single thread. If you can make the request take less processing time, that's a good place to start, but you're probably going to need further steps to make multiple requests faster.
There are various methods for getting around this situation. This article describes a few approaches.
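One common approach is to move the CPU-heavy part off the event loop with worker threads. Here is a minimal, self-contained sketch, assuming the slow part is CPU-bound (the counting loop is just a stand-in for the JSON search):

const { Worker, isMainThread, parentPort, workerData } = require('worker_threads');

if (isMainThread) {
  const express = require('express');
  const app = express();
  app.get('/slow', (req, res) => {
    // run the heavy work in a worker so the event loop stays free for other requests
    const worker = new Worker(__filename, { workerData: { n: 1e9 } });
    worker.once('message', (result) => res.json({ result }));
    worker.once('error', (err) => res.status(500).send(err.message));
  });
  app.listen(4000);
} else {
  // worker side: a CPU-bound loop standing in for the expensive JSON search
  let sum = 0;
  for (let i = 0; i < workerData.n; i++) sum += i;
  parentPort.postMessage(sum);
}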

Collect data from multiple requests nodejs

Environment: nodejs 17.2, expressjs 4.17
Task: data arrives at a URL of the form "/user-actions" from different servers at a rate of about 2 requests per second. It needs to be aggregated and sent to another server once a second.
For example:
Request #1: {userId: 1, action: "hitOne"}
Request #2: {userId: 2, action: "hitFive"}
Request #3: {userId:1, action: "hitFive"}
I need to end up with two objects:
const data = [{userId: 1, action: "hitOne"}, {userId: 2, action: "hitFive"}]
and
const data = [{userId: 1, action: "hitFive"}]
Each of these objects is sent to the other server once per second, something like this:
http.post('http://newserver.url/user-actions', {data});
I was thinking of making a variable in which to record everything that comes in with the requests, and sending that variable to the new server once a second on a timer.
But something tells me that either there will be problems with the variable (for example, due to concurrent requests) and it won't always hold the data I was expecting, or some nonsense will come out of the timer.
How do I implement such a scenario correctly?
So you're creating some sort of a proxy service. You have two potential issues:
data persistence, and
retries and pending requests.
I think your best bet would be to do something like this:
in this particular service (with the API route), you just receive requests and store them somewhere like Redis, RabbitMQ, or Amazon SQS;
in another service, you deal with retries, posting, etc. (see the sketch below).
Even if you don't split this into two services, you still want to put the data in a specialised storage service for things like this. E.g. if your process crashes, you lose whatever data you're holding in memory. It also simplifies all the management details: storing, sorting what came first, and tracking which requests are pending are all super easy to deal with in a RabbitMQ-type service.
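A minimal sketch of that split, assuming the node-redis v4 client, Redis >= 6.2 (for LPOP with a count), and a plain Redis list as the queue (the batch size of 100 is a made-up knob):

const { createClient } = require('redis');
const axios = require('axios');

const client = createClient();
// call client.connect() once at startup

// receiving service: persist each action instead of holding it in memory
app.post('/user-actions', async (req, res) => {
  await client.lPush('user-actions', JSON.stringify(req.body));
  res.end();
});

// forwarding service: drain up to 100 items once per second and post the batch
setInterval(async () => {
  const items = await client.lPopCount('user-actions', 100);
  if (items && items.length) {
    await axios.post('http://newserver.url/user-actions', { data: items.map(i => JSON.parse(i)) });
  }
}, 1000);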
But let's simplify things and hold everything in memory. Now you have to deal with all these things yourself.
So here's a naive proxy service.
const axios = require('axios');
const axiosRetry = require('axios-retry');

const REQUEST_INTERVAL = 1000; // every second
const MAX_PARALLEL_REQUESTS = 3;

axiosRetry(axios, { retries: 3 });

const bucket = [];
let exportingInterval;
let currentRequestsCount = 0;

const logRequest = (payload) => bucket.push(payload);

const makeRequest = (payload) => axios.post('http://remote-service/user-actions', payload);

const sendData = () => {
  // first, make sure you don't make more than X parallel requests
  if (currentRequestsCount > MAX_PARALLEL_REQUESTS) {
    return;
  }
  // drain the bucket
  const data = bucket.splice(0, bucket.length);
  if (!data.length) {
    return;
  }
  // send the data, and make sure you handle failure
  currentRequestsCount = currentRequestsCount + 1;
  makeRequest(data)
    .then(() => currentRequestsCount = currentRequestsCount - 1)
    .catch(() => {
      // what to do now? We failed three times.
      // Let's put everything back in the bucket and try with the next request.
      bucket.splice(bucket.length, 0, ...data);
      currentRequestsCount = currentRequestsCount - 1;
    });
}

const startExporting = () => exportingInterval = setInterval(() => sendData(), REQUEST_INTERVAL);
const stopExporting = () => clearInterval(exportingInterval);

module.exports = {
  logRequest,
  startExporting,
  stopExporting,
};
Now, you would use this like this:

const proxyService = require('./proxy-service');
const app = express();

proxyService.startExporting();

// ...
app.post('/user-data', (req, res) => {
  proxyService.logRequest(req.body);
  res.end();
});
Now, this is just a simple example:
You do need to make sure the retry policy is OK; you have to make sure you don't DoS whatever you're sending the data to.
You want to make sure you limit how many objects you send per call.
Maybe that 1-second interval is not a good thing: what if sending off the data takes longer than that?
What if requests start piling up? My simple counter only counts to 3; maybe it's more complicated than that.
Also, the calls to startExporting and stopExporting should go in some common place where you boot the app, and where you clean up in case of a graceful shutdown.
But it gives you an idea of how it can be done.
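For instance, the second caveat above could be handled by capping how much is drained per tick instead of taking the whole bucket (MAX_BATCH is a hypothetical constant):

// inside sendData: take at most MAX_BATCH items; the rest waits for the next interval
const MAX_BATCH = 100;
const data = bucket.splice(0, MAX_BATCH);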
It is a trade-off between time and data.
If you want to ensure you have complete data, you can use Promise.all(). When both requests have responded, you call the API to send the result. This ensures complete data, but it won't ensure that you send data to the other server exactly once a second.

let pr1 = request1();
let pr2 = request2();
const data = await Promise.all([pr1, pr2]);
requestToAnotherServer(data);

If you want to ensure that the server sends data to the other server once a second, you can set a timer and, when it fires, send whatever data the server has received so far. But this won't ensure complete data.

let sendData = [];
setInterval(() => {
  // kick off new requests; their results land in the buffer whenever they resolve
  request1().then(data => { sendData.push(data); });
  request2().then(data => { sendData.push(data); });
  // send whatever has accumulated so far, then start a fresh buffer
  const batch = sendData;
  sendData = [];
  requestToAnotherServer(batch);
}, 1000);

Node / Express generate calendar URL

I have a database with a bunch of dates and an online overview where you can view them. Now, I know I can copy a URL from my Google Agenda and import it into other calendar clients so I can view the events there.
I want to create an Express endpoint where I fetch every event each time the endpoint is called and return them in a format that can be imported by other calendar clients. Now, with packages like iCal-generator I could generate, read, and return the file whenever a user requests the URL, but it feels redundant to write a file to my storage only to read it, return it, and delete it every time it's requested.
What is the most efficient way to go about this?
Instead of generating the file/calendar data on every request, you could implement a simple caching mechanism: upon start of your Node app, you generate the calendar data and put it in your cache with a corresponding time-to-live value. Once the data has expired or new entries are inserted into your DB, you invalidate the cache, re-generate the data, and cache it again.
Here's a very simple example of an in-memory cache that uses the node-cache library:
const NodeCache = require('node-cache');
const cacheService = new NodeCache();

// ...

const calendarDataCacheKey = 'calendar-data';

// at the start of your app, generate the calendar data and cache it with a TTL of 30 min
cacheCalendarData(generateCalendarData());

function cacheCalendarData(calendarData) {
  cacheService.set(calendarDataCacheKey, calendarData, 1800);
}

// in your express handler, first try to get the value from the cache;
// if it's not there, generate it and cache it
app.get('/calendar-data', (req, res) => {
  let calendarData = cacheService.get(calendarDataCacheKey);
  if (calendarData === undefined) {
    calendarData = generateCalendarData();
    cacheCalendarData(calendarData);
  }
  res.send(calendarData);
});
If your app is scaled horizontally, you should consider using Redis.
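A rough sketch of the same pattern backed by Redis, assuming the node-redis v4 client and that the calendar data is a string (connection handling is elided):

const { createClient } = require('redis');
const redisCache = createClient();
// call redisCache.connect() once during app startup

app.get('/calendar-data', async (req, res) => {
  let calendarData = await redisCache.get('calendar-data');
  if (calendarData === null) {
    calendarData = generateCalendarData();
    // EX: expire after 30 minutes, mirroring the node-cache TTL above
    await redisCache.set('calendar-data', calendarData, { EX: 1800 });
  }
  res.set('Content-Type', 'text/calendar');
  res.send(calendarData);
});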
100% untested, but I have code similar to this that exports to a .csv from a db query, and it might get you close:
const { Readable } = require('stream');

async function getCalendar(req, res) {
  const events = await db.getCalendarEvents();
  const filename = 'some_file.ics';
  res.set({
    'Content-Type': 'text/calendar',
    'Content-Disposition': `attachment; filename=${filename}`,
  });
  // read() is a no-op because the data is pushed manually below
  const input = new Readable({ read() {} });
  input.pipe(res)
    .on('error', (err) => {
      console.error('SOME ERROR', err);
      res.status(500).end();
    });
  // each chunk must be a string (or Buffer) so it can be written to the response
  events.forEach(e => input.push(String(e)));
  input.push(null);
}
If you were going to use the iCal-generator package, you would do your transforms within the forEach method before pushing to the stream.
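Here's a rough sketch of what that could look like without the intermediate stream, assuming ical-generator's factory-style API and hypothetical event field names (start, end, title):

const ical = require('ical-generator');

async function getCalendar(req, res) {
  const events = await db.getCalendarEvents();
  const cal = ical({ name: 'My events' });
  events.forEach(e => cal.createEvent({
    start: e.start,    // hypothetical field names; adjust to your schema
    end: e.end,
    summary: e.title,
  }));
  res.set('Content-Type', 'text/calendar');
  res.send(cal.toString());
}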

Reject two in a row requests (nodejs)

Sometimes we need to prevent a user from executing repeated requests while the first request has not yet finished. For example: we want to register the user at some service, and only after that put them into the database with an external id. I would like to have a service where I can set protected routes like this.
I have solved this problem by setting a flag when the request starts and removing it after the request has completed. Anyway, I'm looking for your suggestions, guys.
I think you need to build this into the route handlers yourself:
const _ = require('lodash');

var inProgress = [];

var handleRegisterRoute = async function create(req, res, next) {
  var id = req.user.id;
  var found = _.find(inProgress, function (item) {
    return item === id;
  });
  if (found) {
    res.send("Don't think twice, it's all right");
    return;
  } else {
    inProgress.push(id);
    try {
      await completeRegister();
    } finally {
      // remove the flag even if registration fails
      inProgress = _.without(inProgress, id);
    }
    res.send();
    return;
  }
}
Again, this is pseudocode, just the gist of what I would write. You may need to store the "in progress" flags in a better data store that your entire server farm can reference, some sort of DB (see the sketch below).
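A rough sketch of that shared flag on Redis, assuming the node-redis v4 client (the key prefix and the 30-second safety expiry are placeholders):

const { createClient } = require('redis');
const client = createClient();
// call client.connect() once at startup

async function handleRegisterRoute(req, res) {
  const id = req.user.id;
  // NX: only set the key if it doesn't exist yet; EX: auto-expire as a safety net
  const acquired = await client.set(`register-in-progress:${id}`, '1', { NX: true, EX: 30 });
  if (!acquired) {
    res.send('Registration already in progress');
    return;
  }
  try {
    await completeRegister();
    res.send();
  } finally {
    await client.del(`register-in-progress:${id}`);
  }
}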
