setInterval continues even after Node.js process/server exits

I have a route in my Express app which calls some logic on an interval:
export const get = (req, res) => {
  const { ids } = req.query;
  let interval = setInterval(() => {
    let id = ids.pop();
    console.log("On id #:", id);
    // do some stuff with the id
    if (!ids.length) {
      clearInterval(interval);
    }
  }, 1000);
  interval.unref();
  res.json();
};
So if you hit the route that calls this function, the code runs every second and prints the console statement. This is all within an Express server. If the process runs to completion, great. However, I notice that even if I Ctrl+C out of the server's running process before the interval loop is done, the terminal continues printing the id every second until it's done.
I thought the .unref() method was supposed to prevent this, but it has no effect. I also tried this:
["exit", "uncaughtException", "SIGINT", "SIGTERM"]
.forEach(signal => {
process.on(signal, () => {
clearInterval(interval);
});
});
This also seems to have no effect. How can I make sure that any running intervals are stopped and cleared when I shut down my express server?
Maybe my issue has something to do with some convoluted logic for the setup and cleanup of my server. For reference:
import express from "express";
import { setRoutes } from "./routes";

let app = express();

const server = app.listen(8080, function () {
  console.log(`🎧 Server is now running on port : ${8080}`);
});

app = setRoutes(app);

function stop() {
  // Run some code to clean things up before server exits or restarts
  server.on("close", function () {
    console.log("⬇ Shutting down server");
    process.exit();
  });
  server.close();
}

process.on("SIGINT", stop);
process.on("SIGTERM", stop);
process.on("SIGQUIT", stop);

process.once("SIGUSR2", function () {
  // Run some code to do a different kind of cleanup on nodemon restart:
  process.kill(process.pid, "SIGUSR2");
});
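
For what it's worth, unref() only stops a timer from keeping the event loop alive; it does not cancel the timer, and a timer cannot fire after the process has truly exited, so the prints after Ctrl+C suggest the process is still alive while server.close() waits for open connections. One explicit-cleanup sketch (the activeIntervals registry is a made-up name, not part of the code above) is to record every interval handle and clear them all in stop():

// Sketch only: a module-level registry of interval handles.
const activeIntervals = new Set();

// In the route handler, register each interval:
let interval = setInterval(() => { /* ... */ }, 1000);
activeIntervals.add(interval);

// In stop(), clear everything before closing the server:
for (const handle of activeIntervals) {
  clearInterval(handle);
}
activeIntervals.clear();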

Related

setTimeout gets overwritten by second setTimeout in Node.js

const express = require("express");

const REFRESH_INTERVAL = 1000;
const app = express();

const recurringSetTimeout = (id) => {
  setTimeout(
    handle = async () => {
      console.log(`Starting setTimeout${id}`);
      setTimeout(handle, REFRESH_INTERVAL);
    },
    REFRESH_INTERVAL,
  );
};

const main = () => {
  app.get("/t1", async (req, res) => {
    const msg = "Starting timeout 1...";
    console.log(msg);
    recurringSetTimeout(1);
    res.send(msg);
  });

  app.get("/t2", async (req, res) => {
    const msg = "Starting timeout 2...";
    console.log(msg);
    recurringSetTimeout(2);
    res.send(msg);
  });

  app.listen(3000, () => {
    console.log("Server is running...");
  });
};

main();
I have this code that should run two different setTimeouts on two route calls: t1 runs first and t2 runs second. After calling t1 I get the correct results in the logs: "Starting setTimeout1", repeated every second. When I then call t2, I expect to get "Starting setTimeout2" as well as "Starting setTimeout1" from the previous route call. But it seems that setTimeout2 somehow overrides setTimeout1: I still get two logs per second, but both of them say "Starting setTimeout2". So setTimeout1 keeps running but appears to have been overwritten by setTimeout2.
If I use setInterval instead of setTimeout it works fine, but I want to understand this behaviour of setTimeout. Can someone please explain? Thanks!
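
A likely explanation, sketched here rather than quoted from an answer: handle is assigned without let/const, so handle = async () => ... creates a single implicit global. Each time a chain's timer fires, it reschedules whatever the global handle currently points to, and after /t2 that is the second closure, so both chains end up logging "Starting setTimeout2". Declaring the callback locally keeps each chain independent:

const recurringSetTimeout = (id) => {
  // A locally scoped function: each call gets its own `handle`
  const handle = async () => {
    console.log(`Starting setTimeout${id}`);
    setTimeout(handle, REFRESH_INTERVAL);
  };
  setTimeout(handle, REFRESH_INTERVAL);
};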

Async testing a TCP server with Node net - Why are my tests throwing 'Cannot log after tests are done'?

Context
I have spiked a TCP echo server and am trying to write integration tests for it. I'm familiar with testing, but not with asynchronous code.
Desired Behaviour
I would like my tests to spy on logs to verify that the code is being executed. Any asynchronous code should be handled properly, but this is where my understanding falls short.
Problem
I am getting asynchronous errors:
Cannot log after tests are done. Did you forget to wait for something async in your test?
Attempted to log "Server Ready".
Attempted to log "Client Connected".
And finally a warning:
A worker process has failed to exit gracefully and has been force exited. This is likely caused by tests leaking due to improper teardown. Try running with --detectOpenHandles to find leaks.
Code
import * as net from 'net';

export const runServer = async () => {
  console.log('Initialising...');
  const port: number = 4567;

  const server = net.createServer((socket: net.Socket) => {
    socket.write('Ready for input:\n');
    console.log('Client Connected');
    socket.on('data', (data) => {
      echo(data, socket);
      server.close();
    });
    socket.on('end', () => {
      console.log('Client Disconnected');
    });
  });

  server.listen(port, () => {
    console.log('Server Ready');
  });

  server.on('error', (err) => {
    console.log(err);
  });

  function echo(data: Buffer, socket: net.Socket) {
    console.log('Input received');
    socket.write(data);
  }

  return server;
};
Test
More such tests will be added when these are working as intended.
import * as index from '../../src/index';
import * as process from 'child_process';

test('the server accepts a connection', async () => {
  const consoleSpy = spyOn(console, 'log');
  try {
    const server = await index.runServer();
    await consoleConnect();
    await consoleEcho();
    await consoleStop();
  } catch (error) {
    console.log(error);
  }
  expect(consoleSpy).toHaveBeenCalledWith('Initialising...');
  expect(consoleSpy).toHaveBeenCalledWith('Client Connected');
  expect(consoleSpy).toHaveBeenCalledTimes(2);
});

const consoleConnect = async () => {
  process.exec("netcat localhost 4567");
};

const consoleEcho = async () => {
  process.exec("Echo!");
};

const consoleStop = async () => {
  process.exec("\^C");
};
My overall question is how do I manage the events in such a way that the tests are able to run without async-related errors?
You are not properly waiting for your child processes to finish. Calls to exec return a ChildProcess object as documented here. They execute asynchronously, so you need to wait for them to finish using the event emitter API.
Example from the docs:
ls.on('exit', (code) => {
  console.log(`child process exited with code ${code}`);
});
To use async/await you need to convert this to a promise. Something like:
return new Promise((resolve, reject) => {
  ls.on('exit', (code) => {
    resolve(code);
  });
  // Handle errors or ignore them. Whatevs.
});
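
Put together, a small self-contained helper might look like this (a sketch; waitForExit is a made-up name):

const { exec } = require('child_process');

// Resolve with the exit code once the child exits; reject on spawn errors.
function waitForExit(child) {
  return new Promise((resolve, reject) => {
    child.on('error', reject);
    child.on('exit', (code) => resolve(code));
  });
}

// e.g. in the test helpers above:
const consoleConnect = async () => {
  await waitForExit(exec('netcat localhost 4567'));
};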
You are closing your server on the first data event. You probably don't want to do that. At least wait until the end event so you have read all the data.
socket.on('data', (data) => {
  echo(data, socket);
  server.close(); // Remove this line
});
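
For example, the close could move into the existing 'end' handler (one possible arrangement, assuming the server should stop after serving a single client):

socket.on('data', (data) => {
  echo(data, socket);
});
socket.on('end', () => {
  console.log('Client Disconnected');
  // Stop accepting new connections once this client has finished
  server.close();
});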

How to launch/cancel a function in express by user request

I have an Express server which listens for a request from a user:
// PUG template
$("#request").click(() => {
  $.ajax({ url: "/launch", method: 'get' });
});

// server.js
app.get('/launch', (req, res) => {
  getCatalog();
});
This launches a huge do...while loop, which may literally run for hours, unless the user chooses to cancel it.
Question: what should be the proper way to launch and cancel this function by user request?
// PUG template
$("#cancel").click(() => {
  ...
});
I would approach this case with application logic rather than Express functionality.
You can create a class that handles catalog loading and also holds a state flag for the process that you can turn on and off (I assume the loading process involves multiple async function calls, so the event loop allows this).
For example:
class CatalogLoader {
  constructor() {
    this.isProcessing = false;
  }

  getCatalog() {
    this.isProcessing = true;
    while (... && this.isProcessing) {
      // Huge loading logic
    }
    this.isProcessing = false;
  }
}
And in Express you can add the API below:
app.get('/launch', (req, res) => {
  catalogLoader.getCatalog();
});

app.get('/cancelLaunch', (req, res) => {
  catalogLoader.isProcessing = false;
  ...
});
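
Note that this only works if getCatalog actually yields to the event loop between chunks; a fully synchronous while loop would block the /cancelLaunch request from ever being handled. A sketch with hypothetical async helpers (hasMorePages and loadNextPage are made-up names):

class CatalogLoader {
  constructor() {
    this.isProcessing = false;
  }

  async getCatalog() {
    this.isProcessing = true;
    // Each await hands control back to the event loop,
    // giving /cancelLaunch a chance to flip the flag
    while (this.isProcessing && await hasMorePages()) {
      await loadNextPage();
    }
    this.isProcessing = false;
  }
}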
A second possible solution uses require('child_process'), but you need to know the PID of the process you wish to cancel. Benefit: it unloads the heavy task from the main Node thread.
So, with Node's const childProcess = require('child_process'); in place:
app.get('/launch', (req, res) => {
  const getCatalog = childProcess.fork('script.js', [], {
    detached: true
  });
  res.send();
});

app.get('/kill', (req, res, next) => {
  const pid = req.query.pid;
  if (pid) {
    process.kill(Number(pid));
    res.send();
  } else {
    res.end();
  }
});

$("#requestCancel").click(() => {
  $.ajax({ url: "/kill?pid=variable*", method: 'get' });
});
* I send data to the PUG template's JS from Node via Server-Sent Events.
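
An alternative to Server-Sent Events, sketched here rather than taken from the answer above, is to return the child's PID directly in the /launch response:

app.get('/launch', (req, res) => {
  const child = childProcess.fork('script.js', [], { detached: true });
  // child.pid is the OS process id the client can later pass to /kill
  res.send({ pid: child.pid });
});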

jest doesn't wait for beforeAll resolution to start tests

What I test: an Express server's endpoints
My goal: automate API tests in a single script
What I do: I launch the Express server in a Node.js child process and would like to wait for it to be launched before the test suite is run (frisby.js endpoint testing)
What isn't working as expected: the test suite is launched before the promise resolves
I rely on the wait-on package, which polls the server and resolves once the resource(s) are available.
const awaitServer = async () => {
  await waitOn({
    resources: [`http://localhost:${PORT}`],
    interval: 1000,
  }).then(() => {
    console.log('Server is running, launching test suite now!');
  });
};
This function is used in the startServer function:
const startServer = async () => {
  console.log(`Launching server http://localhost:${PORT} ...`);
  // npmRunScripts is a thin wrapper around child_process.exec to easily
  // access node_modules/.bin like in package.json scripts
  await npmRunScripts(
    `cross-env PORT=${PORT} node -r ts-node/register -r dotenv/config src/index.ts dotenv_config_path=.env-tests`
  );
  await awaitServer();
};
And finally, I use this in something like
describe('Endpoints', () => {
  beforeAll(startServer);
  // describes and tests here ...
});
Anyway, when I launch jest, the 'Server is running, launching test suite now!' console.log never shows up and the test suite fails (as the server isn't running yet). Why does jest start testing when awaitServer obviously hasn't resolved yet?
The npmRunScripts function works fine as the test server is up and running a short while after the tests have failed. For this question's sake, here's how npmRunScripts resolves:
// From https://humanwhocodes.com/blog/2016/03/mimicking-npm-script-in-node-js/
const { exec } = require('child_process');
const { delimiter, join } = require('path');

const env = { ...process.env };
const binPath = join(__dirname, '../..', 'node_modules', '.bin');
env.PATH = `${binPath}${delimiter}${env.PATH}`;

/**
 * Executes a CLI command with `./node_modules/.bin` in the scope like you
 * would use in the `scripts` sections of a `package.json`
 * @param cmd The actual command
 */
const npmRunScripts = (cmd, resolveProcess = false) =>
  new Promise((resolve, reject) => {
    if (typeof cmd !== 'string') {
      reject(
        new TypeError(
          `npmRunScripts Error: cmd is a "${typeof cmd}", "string" expected.`
        )
      );
      return;
    }
    if (cmd === '') {
      reject(
        new Error(`npmRunScripts Error: No command provided (cmd is empty).`)
      );
      return;
    }
    const subProcess = exec(cmd, { cwd: process.cwd(), env });
    if (resolveProcess) {
      resolve(subProcess);
    } else {
      const cleanUp = () => {
        subProcess.stdout.removeAllListeners();
        subProcess.stderr.removeAllListeners();
      };
      subProcess.stdout.on('data', (data) => {
        resolve(data);
        cleanUp();
      });
      subProcess.stderr.on('data', (data) => {
        reject(data);
        cleanUp();
      });
    }
  });

module.exports = npmRunScripts;
I found the solution. After trying almost everything else, I realized jest has a timeout setting which defaults to 5 seconds. So I increased this timeout and the tests now wait for the server promise to resolve.
I simply added jest.setTimeout(3 * 60 * 1000); before the test suite.
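
For illustration, the call goes at the top level of the test module so it runs before the hooks it affects; Jest also accepts a per-hook timeout as the second argument to beforeAll (the placement shown is a sketch around the startServer function above):

jest.setTimeout(3 * 60 * 1000); // applies to hooks and tests in this file

describe('Endpoints', () => {
  // Alternatively, a per-hook timeout as the second argument:
  beforeAll(startServer, 3 * 60 * 1000);
  // describes and tests here ...
});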
In my case, it was caused by a flaw in the beforeAll part. Make sure the beforeAll doesn't contain any uncaught exceptions, otherwise it behaves as if the tests started without waiting for beforeAll to resolve.
After much digging I found a reason for why my beforeAll didn't seem to be running before my tests. This might be obvious to some, but it wasn't to me.
If you have code in your describe outside an it or other beforeX or afterY, and that code is dependent on any beforeX, you'll run into this problem.
The problem is that code in your describe is run before any beforeX. Therefore, that code won't have access to the dependencies that are resolved in any beforeX.
For example:
describe('Outer describe', () => {
  let server;

  beforeAll(async () => {
    // Set up the server before all tests...
    server = await setupServer();
  });

  describe('Inner describe', () => {
    // The below line is run before the above beforeAll, so server doesn't exist here yet!
    const queue = server.getQueue(); // Error! server.getQueue is not a function
    it('Should use the queue', () => {
      queue.getMessage(); // Test fails due to error above
    });
  });
});
To me this seems unexpected: since that code runs in the describe callback, my impression was that the callback would run after any beforeX outside the current describe.
It also seems this behavior won't be changed any time soon: https://github.com/facebook/jest/issues/4097
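
One workaround, as a sketch: defer the dependent call into a hook of its own, since hooks (unlike describe bodies) do run in registration order:

describe('Inner describe', () => {
  let queue;

  beforeAll(() => {
    // Runs after the outer beforeAll, so `server` is initialised by now
    queue = server.getQueue();
  });

  it('Should use the queue', () => {
    queue.getMessage();
  });
});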
In newer versions of jest (at least >1.3.1) you can pass a done function to your beforeAll function and call it after everything is done:
beforeAll(async (done) => {
  await myAsyncFunc();
  done();
});

it("Some test", async () => {
  // Runs after beforeAll
});
More discussions here: https://github.com/facebook/jest/issues/1256

Can you make Supertest wait for an Express handler to finish executing?

I use Supertest to test my Express apps, but I'm running into a challenge when I want my handlers to do asynchronous processing after the response is sent. Take this code, for example:
const request = require('supertest');
const express = require('express');

const app = express();

app.get('/user', async (req, res) => {
  res.status(200).json({ success: true });
  await someAsyncTaskThatHappensAfterTheResponse();
});

describe('A Simple Test', () => {
  it('should get a valid response', () => {
    return request(app)
      .get('/user')
      .expect(200)
      .then(response => {
        // Test stuff here.
      });
  });
});
If the someAsyncTaskThatHappensAfterTheResponse() call throws an error, then the test here is subject to a race condition where it may or may not fail based on that error. Even aside from error handling, it's also difficult to check for side effects if they happen after the response is sent. Imagine that you wanted to trigger database updates after sending a response: you wouldn't be able to tell from your test when to expect that the updates have completed. Is there any way to use Supertest to wait until the handler function has finished executing?
This cannot be done easily, because supertest acts like a client and you do not have access to the actual req/res objects in Express (see https://stackoverflow.com/a/26811414/387094).
As a completely hacky workaround, here is what worked for me.
Create a file which houses a callback/promise. For instance, my file test-hack.js looks like so:
let callback = null

export const callbackPromise = () => new Promise((resolve) => {
  callback = resolve
})

export default function callWhenComplete () {
  if (callback) callback('hack complete')
}
When all processing is complete, call the callWhenComplete function. For instance, my middleware looks like so:
import callWhenComplete from './test-hack'

export default function middlewareIpnMyo () {
  return async function route (req, res, next) {
    res.status(200)
    res.send()
    // async logic
    callWhenComplete()
  }
}
And finally, in your test, await the callbackPromise like so:
import { callbackPromise } from './test-hack'

describe('POST /someHack', () => {
  it.only('should handle a post request', async () => {
    const response = await request
      .post('/someHack')
      .send({ soMuch: 'hackery' })
      .expect(200)

    const result = await callbackPromise()
    // anything below this is executed after callWhenComplete() is
    // executed from the route
  })
})
Inspired by @travis-stevens, here is a slightly different solution that uses setInterval, so you can be sure the promise is set up before you make your supertest call. This also allows tracking requests by id, in case you want to use the library for many tests without collisions.
const backgroundResult = {};

export function backgroundListener(id, ms = 1000) {
  backgroundResult[id] = false;
  return new Promise(resolve => {
    // set up interval
    const interval = setInterval(isComplete, ms);
    // completion logic
    function isComplete() {
      if (false !== backgroundResult[id]) {
        resolve(backgroundResult[id]);
        delete backgroundResult[id];
        clearInterval(interval);
      }
    }
  });
}

export function backgroundComplete(id, result = true) {
  if (id in backgroundResult) {
    backgroundResult[id] = result;
  }
}
Make a call to get the listener promise BEFORE your supertest.request() call (in this case, using agent).
it('should respond with a 200 but background error for failed async', async function() {
  const agent = supertest.agent(app);
  const trackingId = 'jds934894d34kdkd';
  const bgListener = background.backgroundListener(trackingId);

  // post something but include tracking id
  await agent
    .post('/v1/user')
    .field('testTrackingId', trackingId)
    .field('name', 'Bob Smith')
    .expect(200);

  // execute the promise which waits for the completion function to run
  const backgroundError = await bgListener;

  // should have received an error
  assert.equal(backgroundError instanceof Error, true);
});
Your controller should expect the tracking id and pass it to the complete function at the end of the controller's background processing. Passing an error as the second value is one way to check the result later, but you can pass any value you like (other than false, which the listener treats as "not finished yet").
// if background task(s) were successful, promise in test will return true
backgroundComplete(testTrackingId);

// if not successful, promise in test will return this error object
backgroundComplete(testTrackingId, new Error('Failed'));
If anyone has any comments or improvements, that would be appreciated :)
