Session expired functionality for MERN stack project - security

I want to implement session-expiry functionality: when the user stays on a page for more than 15 minutes, an alert should say that the session has expired. Meanwhile, if someone copies that URL and pastes it into another tab, another browser, or an incognito window (where they are already logged in), or refreshes the page, the countdown should not restart. For example, if the countdown is mid-way with 5 minutes left of the 15, then after pasting the URL into another tab it should continue from 5 minutes left, then 4 minutes, and so on, rather than starting again from 15 minutes.
I am not sure of the best way to implement this in a MERN stack project (should I use a library, a cookie, or local storage?), and how to do it securely.
I tried a sample implementation, but it does not work across browsers or in incognito (the countdown restarts there), and if I refresh the page the countdown timer starts over as well. A better example or suggestion on how to implement this would be really appreciated. TIA :-)
My dummy implementation example -
countdowntimer.tsx file
import * as React from "react";
import { useHistory } from "react-router-dom";
import { COUNTER_KEY, countDown } from "./countdowntimer.utils";

const CountdownTimer: React.FC = () => {
  const history = useHistory();
  const [countdownSessionExpired, setCountdownSessionExpired] = React.useState(false);

  React.useEffect(() => {
    // Resume from the value persisted in localStorage, or fall back to 10 seconds for this dummy example.
    const countDownTime = localStorage.getItem(COUNTER_KEY) || 10;
    countDown(Number(countDownTime), () => {
      console.log("countDownTime:", countDownTime);
      setCountdownSessionExpired(true);
    });
  }, [history]);

  return (
    <>
      {countdownSessionExpired ? (
        <div>session is expired</div>
      ) : (
        <div>view the page</div>
      )}
    </>
  );
};

export default CountdownTimer;
==================================================================================================
countdowntimer.utils.tsx file
export const COUNTER_KEY = "myCounter";

export function countDown(i: number, callback: () => void) {
  const timer = setInterval(() => {
    const minutes = Math.floor(i / 60);
    const seconds = i % 60;
    // Zero-pad both parts for display, e.g. "0:07:05".
    const mm = minutes < 10 ? "0" + minutes : String(minutes);
    const ss = seconds < 10 ? "0" + seconds : String(seconds);
    console.log("Time (h:min:sec) left for this station is: 0:" + mm + ":" + ss);
    if (--i > 0) {
      // Persist the remaining seconds so a refresh in the same browser resumes the countdown.
      localStorage.setItem(COUNTER_KEY, String(i));
    } else {
      localStorage.removeItem(COUNTER_KEY);
      clearInterval(timer);
      callback();
    }
  }, 1000);
}
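A countdown persisted only in localStorage cannot survive a different browser or an incognito window, because localStorage is not shared across those contexts. One common direction (a sketch, not a definitive implementation) is to anchor the expiry to an absolute timestamp the server records when the session starts, for example on the Express session document or in a JWT exp claim, and have every tab compute the remaining time from that timestamp instead of counting ticks. The hook below assumes a hypothetical GET /api/session endpoint that returns { expiresAt } in epoch milliseconds:

// useSessionExpiry.ts - a sketch, not a drop-in implementation
import * as React from "react";

// Hypothetical endpoint: the server returns the absolute expiry time it stored for this
// session (e.g. login time + 15 minutes), so every tab and browser derives the same deadline.
async function fetchExpiresAt(): Promise<number> {
  const res = await fetch("/api/session", { credentials: "include" });
  if (!res.ok) throw new Error("session lookup failed");
  const body = await res.json();
  return body.expiresAt; // epoch milliseconds
}

export function useSessionExpiry() {
  const [secondsLeft, setSecondsLeft] = React.useState<number | null>(null);

  React.useEffect(() => {
    let timer: ReturnType<typeof setInterval> | undefined;
    fetchExpiresAt().then((expiresAt) => {
      const tick = () =>
        setSecondsLeft(Math.max(0, Math.round((expiresAt - Date.now()) / 1000)));
      tick(); // compute immediately, so a refresh or a new tab resumes mid-countdown
      timer = setInterval(tick, 1000);
    });
    return () => {
      if (timer) clearInterval(timer);
    };
  }, []);

  return { secondsLeft, expired: secondsLeft === 0 };
}

Because the remaining time is always derived from expiresAt - Date.now(), a refresh or a second tab lands in the middle of the countdown instead of restarting it. The server should still reject API calls after the expiry on its side, so the client-side alert is only a convenience and not the security boundary.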

Related

MacOS Catalina freezing+crashing after running Node.JS load test script

I wrote a simple load-testing script that runs N hits against an HTTP endpoint over M async parallel lanes. Each lane waits for its previous request to finish before starting a new one. For my specific use case, the script randomly picks a numeric "width" parameter to add to the URL each time. The endpoint returns between 200k and 900k of image data on each request, depending on the width parameter, but my script does not care about this data and simply relies on garbage collection to clean it up.
const fetch = require('node-fetch');

const MIN_WIDTH = 200;
const MAX_WIDTH = 1600;
const loadTestUrl = `
http://load-testing-server.com/endpoint?width={width}
`.trim();

async function fetchAll(url) {
  const res = await fetch(url, {
    method: 'GET'
  });
  if (!res.ok) {
    throw new Error(res.statusText);
  }
}

async function doSingleRun(runs, id) {
  const runStart = Date.now();
  console.log(`(id = ${id}) - Running ${runs} times...`);
  for (let i = 0; i < runs; i++) {
    const start = Date.now();
    const width = Math.floor(Math.random() * (MAX_WIDTH - MIN_WIDTH)) + MIN_WIDTH;
    try {
      const result = await fetchAll(loadTestUrl.replace('{width}', `${width}`));
      const duration = Date.now() - start;
      console.log(`(id = ${id}) - Width ${width} Success. ${i + 1}/${runs}. Duration: ${duration}`);
    } catch (e) {
      const duration = Date.now() - start;
      console.log(`(id = ${id}) - Width ${width} Error fetching. ${i + 1}/${runs}. Duration: ${duration}`, e);
    }
  }
  console.log(`(id = ${id}) - Finished run. Duration: ` + (Date.now() - runStart));
}

(async function () {
  const RUNS = 200;
  const parallelRuns = 10;
  const promises = [];
  const parallelRunStart = Date.now();
  console.log(`Running ${parallelRuns} parallel runs`);
  for (let i = 0; i < parallelRuns; i++) {
    promises.push(doSingleRun(RUNS, i));
  }
  await Promise.all(promises);
  console.log(`Finished parallel runs. Duration ${Date.now() - parallelRunStart}`);
})();
When I run this in Node 14.17.3 on my MacBook Pro running MacOS 10.15.7 (Catalina) with even a modest parallel lane number of 3, after about 120 (x 3) hits of the endpoint the following happens in succession:
Console output ceases in the terminal for the script, indicating the script has halted
Other applications such as my browser are unable to make network connections.
Within 1 - 2 mins other applications on my machine begin to slow down and eventually freeze up.
My entire system crashes with a kernel panic and the machine reboots.
panic(cpu 2 caller 0xffffff7f91ba1ad5): userspace watchdog timeout: remoted connection watchdog expired, no updates from remoted monitoring thread in 60 seconds, 30 checkins from thread since monitoring enabled 640 seconds ago after loadservice: com.apple.logd, total successful checkins since load (642 seconds ago): 64, last successful checkin: 10 seconds ago
service: com.apple.WindowServer, total successful checkins since load (610 seconds ago): 60, last successful checkin: 10 seconds ago
I can very easily stop the progression of these symptoms by pressing Ctrl+C in the script's terminal and force-quitting it. Everything quickly gets back to normal, and I can repeat the experiment multiple times before allowing it to crash my machine.
I've monitored Activity Monitor during the progression: there is very little (~1%) CPU usage and memory usage reaches maybe 60-70 MB, though it is pretty evident that network activity peaks during the script's run.
In my search for others with this problem, only two Stack Overflow questions came close:
node.js hangs other programs on my mac
Node script causes system freeze when uploading a lot of files
Anyone have any idea why this would happen? It seems very dangerous that a single app/script could so easily bring down a machine without being killed first by the OS.
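One thing worth trying while investigating (a sketch based on a variant of the question's fetchAll; it does not explain the kernel panic, but it bounds how many sockets the script keeps open by reusing keep-alive connections, which node-fetch v2 supports through its agent option):

const http = require('http');
const fetch = require('node-fetch');

// Reuse a small pool of keep-alive sockets instead of opening a new connection per request.
const agent = new http.Agent({ keepAlive: true, maxSockets: 3 });

async function fetchAll(url) {
  const res = await fetch(url, { method: 'GET', agent });
  if (!res.ok) {
    throw new Error(res.statusText);
  }
}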

my function runs faster than the interval set

I have this code, which is supposed to send a message and add to a variable every 10 minutes
function btcb() {
  const embed = new Discord.MessageEmbed()
    .setColor('#FF9900')
    .setTitle("Bitcoin block #" + bx.blocks.btc + " was mined")
    .setAuthor('Block mined', 'https://cdn.discordapp.com/emojis/710590499991322714.png?v=1');
  client.channels.cache.get(`710907679186354358`).send(embed);
  bx.blocks.btc = bx.blocks.btc + 1;
}

setInterval(btcb, 600000);
But it actually does it every 2-3 minutes instead. What am I doing wrong?
You're better off setting the interval to 1 second and counting 600 seconds before resetting:
let sec = 0;

function btcb() {
  if (sec++ < 600) return;
  sec = 0;
  const embed = new Discord.MessageEmbed()
    .setColor('#FF9900')
    .setTitle("Bitcoin block #" + bx.blocks.btc + " was mined")
    .setAuthor('Block mined', 'https://cdn.discordapp.com/emojis/710590499991322714.png?v=1');
  client.channels.cache.get(`710907679186354358`).send(embed);
  bx.blocks.btc = bx.blocks.btc + 1;
}

setInterval(btcb, 1000);
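If the goal is strictly "once every 10 minutes of wall-clock time", another sketch is to compare timestamps instead of counting interval ticks, reusing the question's original btcb (the version without the counter); this tolerates the interval callback firing more often than expected:

let lastFire = Date.now();

setInterval(() => {
  // Only call btcb once at least 10 minutes of wall-clock time have elapsed,
  // no matter how often this callback itself runs.
  if (Date.now() - lastFire >= 600000) {
    lastFire = Date.now();
    btcb();
  }
}, 1000);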

Node.js | Trigger a function in PUG template when ioHook is activated by key hook?

I have a simple node.js app with a javascript file and a pug template file.
In the javascript file I required iohook and the code logic itself works. It detects pressed keys and I can do actions based on that.
On the other side, I have a Pug template for my layout with a few simple buttons and divs. If you click a button, its div starts a timer - imagine a 60-second timer starting on click. This also works.
Now I want to combine these two: if I press a certain key, like "1", it should start the timer for the first div. This is where I fail.
So my question is: what are your recommendations for solving this? I don't really know how to continue. Any tips are appreciated!
I tried to require iohook inside the Pug template, but that doesn't work because the template's scope doesn't know the require function.
If I try to pass ioHook to the template, it always throws an error saying it is undefined.
index.js
const ioHook = require('iohook');

ioHook.on('keydown', event => {
  if (event['keycode'] == 55) {
    // do something
  }
});

ioHook.start(true);
index.pug
html
  body
    .button
      .top-div(onclick="startTimer(60, document.querySelector('#top-span'));")
        | TOP
        span#top-span

    script.
      function startTimer(duration, display) {
        let timer = duration, minutes, seconds;
        let IntervalId = setInterval(function () {
          minutes = parseInt(timer / 60, 10);
          seconds = parseInt(timer % 60, 10);
          minutes = minutes < 10 ? "0" + minutes : minutes;
          seconds = seconds < 10 ? "0" + seconds : seconds;
          display.textContent = minutes + ":" + seconds;
          display.style.backgroundColor = "grey";
          display.parentNode.style.backgroundColor = "grey";
          if (timer > 0) {
            --timer;
          } else {
            display.style.backgroundColor = "limegreen";
            display.parentNode.style.backgroundColor = "limegreen";
            clearInterval(IntervalId);
          }
        }, 1000);
      }
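ioHook runs in the Node process while startTimer runs in the browser, so the key event has to be pushed over some channel. A minimal sketch of one way to bridge them, assuming socket.io is added to the project (npm install socket.io); the keycode below is illustrative, so log event.keycode first to find the right value for the "1" key on your platform:

// index.js (sketch): forward the key hook to the page over socket.io
const ioHook = require('iohook');
const { Server } = require('socket.io');

// socket.io server the page connects to; port 3001 is an arbitrary choice
const io = new Server(3001, { cors: { origin: '*' } });

ioHook.on('keydown', event => {
  if (event.keycode === 2) { // illustrative keycode for "1" - verify by logging event.keycode
    io.emit('start-timer', { target: 'top' }); // tell every connected page to start the top timer
  }
});

ioHook.start();

On the page, load the socket.io client script (the server above serves it at http://localhost:3001/socket.io/socket.io.js), connect with io('http://localhost:3001'), and call startTimer(60, document.querySelector('#top-span')) inside a socket.on('start-timer', ...) handler. If this is actually an Electron app rather than a server-rendered page, ipcMain/ipcRenderer would be the more natural bridge.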

Node.js Calling functions as quickly as possible without going over some limit

I have multiple functions that call different API endpoints, and I need to call them as quickly as possible without going over some limit (20 calls per second, for example). My current solution is to add a delay and call a function once every 50 milliseconds (for the example above), but I would like to make the calls as quickly as possible rather than just spacing them out evenly across the rate limit.
function-rate-limit solved a similar problem for me. It spreads out calls to your function over time without dropping any of them, and it still allows instantaneous calls until the rate limit is reached, so under normal circumstances it introduces no latency.
Example from function-rate-limit docs:
var rateLimit = require('function-rate-limit');

// limit to 2 executions per 1000ms
var start = Date.now();
var fn = rateLimit(2, 1000, function (x) {
  console.log('%s ms - %s', Date.now() - start, x);
});

for (var y = 0; y < 10; y++) {
  fn(y);
}
results in:
10 ms - 0
11 ms - 1
1004 ms - 2
1012 ms - 3
2008 ms - 4
2013 ms - 5
3010 ms - 6
3014 ms - 7
4017 ms - 8
4017 ms - 9
You can try using queue from async. Be careful when doing this: it essentially behaves like a while (true) loop in other languages:
const async = require('async');

const concurrent = 10; // At most 10 concurrent ops;
const tasks = Array(concurrent).fill().map((e, i) => i);

let pushBack; // let's create a ref to a lambda function

const myAsyncFunction = (task) => {
  // TODO: Swap with the actual implementation
  return Promise.resolve(task);
};

const q = async.queue((task, cb) => {
  myAsyncFunction(task)
    .then((result) => {
      pushBack(task);
      cb(null, result);
    })
    .catch((err) => cb(err, null));
}, tasks.length);

pushBack = (task) => q.push(task);

q.push(tasks);
What's happening here? We are saying "hey, run X tasks in parallel", and after each task completes we put it back in the queue, which is the equivalent of saying "run X tasks in parallel forever".

nodejs - every minute, on the minute

How can I wait for a specific system time before firing?
I want to fire an event when seconds = 0, i.e. every minute:
while (1 == 1) {
  var date = new Date();
  var sec = date.getSeconds();
  if (sec === 0) {
    DoSomething();
  }
}
You shouldn't do that: a while loop like this is a blocking operation. There are better tools on any JavaScript platform, such as the setInterval/setTimeout functions.
The node docs for them are here.
A quick example of how to achieve what you want:
setInterval(function() {
  var date = new Date();
  if (date.getSeconds() === 0) {
    DoSomething();
  }
}, 1000);
For more fine-grained control over scheduled processes in Node, you might check out node-cron.
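If the requirement is to fire exactly on the minute without polling every second, another sketch is to compute the delay to the next minute boundary and chain setTimeout calls (reusing the DoSomething from the question):

function scheduleOnTheMinute() {
  const msUntilNextMinute = 60000 - (Date.now() % 60000); // time left in the current minute
  setTimeout(() => {
    DoSomething();          // runs right as the seconds roll over to 0
    scheduleOnTheMinute();  // re-arm for the following minute
  }, msUntilNextMinute);
}

scheduleOnTheMinute();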
