Integration testing a Node.js socket.io server

I'm trying to set up a server to respond to socket.io clients using Node.js, Express and socket.io. I want to write tests that probe the server and make sure it handles the events correctly and sends the appropriate responses to the client.
I tried writing some automated tests using jest, but I couldn't figure out how to actually emit events to the server and have it respond.
Unit testing Node.js and WebSockets (Socket.io)
I checked out the above post but it didn't work for me...

Check out this boilerplate solution that's based on promises and good practice.
You can test your server's io events with it, no sweat.
You just need to copy a boilerplate test and add your own code as needed.
Check out the repo on GitHub for the full source code.
https://github.com/PatMan10/testing_socketIO_server
const io = require("socket.io-client");
const ev = require("../utils/events");
const logger = require("../utils/logger");
// initSocket returns a promise
// success: resolve a new socket object
// fail: reject a error
const initSocket = () => {
return new Promise((resolve, reject) => {
// create socket for communication
const socket = io("localhost:5000", {
"reconnection delay": 0,
"reopen delay": 0,
"force new connection": true
});
// define event handler for successful connection
socket.on(ev.CONNECT, () => {
logger.info("connected");
resolve(socket);
});
// if connection takes longer than 5 seconds throw error
setTimeout(() => {
reject(new Error("Failed to connect within 5 seconds."));
}, 5000);
});
};
// destroySocket returns a promise
// success: resolve true
// fail: resolve false
const destroySocket = socket => {
return new Promise((resolve, reject) => {
// check if socket connected
if (socket.connected) {
// disconnect socket
logger.info("disconnecting...");
socket.disconnect();
resolve(true);
} else {
// not connected
logger.info("no connection to break...");
resolve(false);
}
});
};
describe("test suite: Echo & Bello", () => {
test("test: ECHO", async () => {
try {
// create socket for communication
const socketClient = await initSocket();
// create new promise for server response
const serverResponse = new Promise((resolve, reject) => {
// define a handler for the test event
socketClient.on(ev.res_ECHO, data4Client => {
//process data received from server
const { message } = data4Client;
logger.info("Server says: " + message);
// destroy socket after server responds
destroySocket(socketClient);
// return data for testing
resolve(data4Client);
});
// if response takes longer than 5 seconds throw error
setTimeout(() => {
reject(new Error("Failed to get response, connection timed out..."));
}, 5000);
});
// define data 4 server
const data4Server = { message: "CLIENT ECHO" };
// emit event with data to server
logger.info("Emitting ECHO event");
socketClient.emit(ev.com_ECHO, data4Server);
// wait for server to respond
const { status, message } = await serverResponse;
expect(status).toBe(200);
expect(message).toBe("SERVER ECHO");
} catch (error) {
logger.error(error);
throw error; // rethrow so a failing expect() actually fails the test
}
});
test("test: BELLO", async () => {
try {
const socketClient = await initSocket();
const serverResponse = new Promise((resolve, reject) => {
socketClient.on(ev.res_BELLO, data4Client => {
const { message } = data4Client;
logger.info("Server says: " + message);
destroySocket(socketClient);
resolve(data4Client);
});
setTimeout(() => {
reject(new Error("Failed to get response, connection timed out..."));
}, 5000);
});
const data4Server = { message: "CLIENT BELLO" };
logger.info("Emitting BELLO event");
socketClient.emit(ev.com_BELLO, data4Server);
const { status, message } = await serverResponse;
expect(status).toBe(200);
expect(message).toBe("SERVER BELLO");
} catch (error) {
logger.error(error);
throw error; // rethrow so a failing expect() actually fails the test
}
});
});

Related

How to stream json data from one nodejs server to another and process it at the receiver at runtime?

What I'm basically trying to achieve is this: get all items of a MongoDB collection on one Node.js server, stream these items (in JSON format) over REST to another Node.js server, and pipe the incoming read stream into stream-json to persist the parsed objects in another MongoDB afterwards.
(I need to use streams because my items can be deeply nested objects which would consume a lot of memory. Additionally I'm unable to access the first mongodb from the second server directly due to a strict network segmentation.)
Well, the code I have so far actually works for smaller amounts of data, but one collection holds about 1.2 GB, and for that one the processing on the receiving side keeps failing.
Here's the code of the sending server:
export const streamData = async (res: Response) => {
try {
res.type('json');
const amountOfItems = await MyModel.count();
if (amountOfItems !== 0) {
const cursor = MyModel.find().cursor();
let first = true;
cursor.on('error', (err) => {
logger.error(err);
});
cursor.on('data', (doc) => {
if (first) {
// open json array
res.write('[');
first = false;
} else {
// add the delimiter before every object that isn't the first
res.write(',');
}
// add json object
res.write(`${JSON.stringify(doc)}`);
});
cursor.on('end', () => {
// close json array
res.write(']');
res.end();
logger.info('REST-API-Call to fetchAllItems: Streamed all items to the receiver.');
});
} else {
res.write('[]');
res.end();
logger.info('REST-API-Call to fetchAllItems: Streamed an empty response to the receiver.');
}
} catch (err) {
logger.error(err);
return [];
}
};
And that's the receiving side:
import { MyModel } from '../models/my-model';
import axios from 'axios';
import { logger } from '../services/logger';
import StreamArray from 'stream-json';
import { streamArray } from 'stream-json/streamers/StreamArray';
import { pipeline } from 'stream';
const persistItems = async (items:Item[], ip: string) => {
try {
await MyModel.bulkWrite(items.map(item => {
return {
updateOne: {
filter: { 'itemId': item.itemId },
update: item,
upsert: true,
},
};
}));
logger.info(`${ip}: Successfully upserted items to mongoDB`);
} catch (err) {
logger.error(`${ip}: Upserting items to mongoDB failed due to the following error: ${err}`);
}
};
const getAndPersistDataStream = async (ip: string) => {
try {
const axiosStream = await axios(`http://${ip}:${process.env.PORT}/api/items`, { responseType: 'stream' });
const jsonStream = StreamArray.parser({ jsonStreaming : true });
let items : Item[] = [];
const stream = pipeline(axiosStream.data, jsonStream, streamArray(),
(error) => {
if ( error ){
logger.error(`Error: ${error}`);
} else {
logger.info('Pipeline successful');
}
},
);
stream.on('data', (i: any) => {
items.push(<Item> i.value);
// wait until the array contains 500 objects, then bulkWrite them to the database
if (items.length === 500) {
persistItems(items, ip);
items = [];
}
});
stream.on('end', () => {
// bulkwrite the last items to the mongodb
persistItems(items, ip);
});
stream.on('error', (err: any) => {
logger.error(err);
});
await new Promise(fulfill => stream.on('finish', fulfill));
} catch (err) {
if (err) {
logger.error(err);
}
}
}
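A side note on the receiving code above: the `data` handler fires `persistItems` without awaiting it, so several 500-item bulk writes can be in flight at once while the stream keeps pushing. A sketch of the same batching that pauses the stream while each batch is written, so only one bulk write runs at a time (the `persist` callback here is a stand-in for `persistItems`):

```javascript
// Consume an object stream in fixed-size batches, pausing the stream
// while each batch is persisted so bulk writes don't pile up.
// `persist` stands in for an async bulk write like persistItems above.
const batchConsume = (stream, persist, batchSize = 500) => {
  let items = [];
  stream.on('data', async (i) => {
    items.push(i.value);
    if (items.length >= batchSize) {
      const batch = items;
      items = [];
      stream.pause();          // stop 'data' events during the write
      await persist(batch);
      stream.resume();
    }
  });
  stream.on('end', () => {
    if (items.length > 0) persist(items); // flush the remainder
  });
};
```

Whether this fixes the premature close depends on where the pressure actually builds up, but it at least keeps memory use bounded on the receiver.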
As I said, the problem occurs only on a bigger collection holding about 1.2 GB of data.
The problem seems to occur a few seconds after the sending server closes the stream.
This is the error message I get at the receiving server:
ERROR: Premature close
err: {
"type": "NodeError",
"message": "Premature close",
"stack":
Error [ERR_STREAM_PREMATURE_CLOSE]: Premature close
at IncomingMessage.onclose (internal/streams/end-of-stream.js:75:15)
at IncomingMessage.emit (events.js:314:20)
at Socket.socketCloseListener (_http_client.js:384:11)
at Socket.emit (events.js:326:22)
at TCP.<anonymous> (net.js:676:12)
"code": "ERR_STREAM_PREMATURE_CLOSE"
}
Can I somehow prevent the read stream from closing too early?
The only workaround I can imagine right now is to save the stream locally to a file first, then create a new readstream from that file, process/persist the data and delete the file afterwards, although I would prefer not to do that. Additionally I'm not quite sure if that's going to work out or if the closing read stream issue will remain if I try to save a large dataset to a file.
Edit: Well, as I guessed, this approach results in the same error.
Is there a better approach I'm not aware of?
Thanks in advance!
Found a solution using a combination of websockets with the stream API, and websocket-express to trigger the streaming over websockets via routes.
Backend
app.ts
import router from './router/router';
import WebSocketExpress from 'websocket-express';
const app = new WebSocketExpress();
const port = `${process.env.APPLICATION_PORT}`;
app.use(router);
app.listen(port, () => {
console.log(`App listening on port ${port}!`);
});
router.ts
import { Router } from 'websocket-express';
import streamData from './streamData';
const router = new Router();
router.ws('/my/api/path', streamData);
export default router;
streamData.ts (did some refactoring to the above version)
import { MyModel } from '../models/my-model';
import { createWebSocketStream } from 'ws';
export const streamData = async (res: Response) => {
const ws = await res.accept();
try {
const duplex = createWebSocketStream(ws, { encoding: 'utf8' });
duplex.write('[');
let prevDoc: any = null;
// ignore _id since it's going to be upserted into another database
const cursor = MyModel.find({}, { _id: 0 } ).cursor();
cursor.on('data', (doc) => {
if (prevDoc) {
duplex.write(`${JSON.stringify(prevDoc)},`);
}
prevDoc = doc;
});
cursor.on('end', () => {
if (prevDoc) {
duplex.write(`${JSON.stringify(prevDoc)}`);
}
duplex.end(']');
});
cursor.on('error', (err) => {
ws.close();
});
duplex.on('error', (err) => {
ws.close();
cursor.close();
});
} catch (err) {
ws.close();
}
};
Client (or the receiving server)
import { MyModel } from '../models/my-model';
import StreamArray from 'stream-json';
import { streamArray } from 'stream-json/streamers/StreamArray';
import { pipeline } from 'stream';
import WebSocket, { createWebSocketStream } from 'ws';
export const getAndPersistDataStream = async (ip: string) => {
try {
const ws = new WebSocket(`ws://${ip}:${process.env.PORT}/my/api/path`);
try {
const duplex = createWebSocketStream(ws, { encoding: 'utf8' });
const jsonStream = StreamArray.parser({ jsonStreaming: true });
let items: Items[] = [];
const stream = pipeline(duplex, jsonStream, streamArray(), error => {
if (error) {
ws.close();
}
});
stream.on('data', (i: any) => {
items.push(<Items>i.value);
if (items.length === 500) {
persistItems(items, ip);
items = [];
}
});
stream.on('end', () => {
persistItems(items, ip);
ws.close();
});
stream.on('error', (err: any) => {
ws.close();
});
await new Promise(fulfill => stream.on('finish', fulfill));
} catch (err) {
ws.close();
}
} catch (err) {
}
};
(I removed a lot of error-logging here, which is why the catch block is empty.)

node.js How to get express.post to wait for the net.client.data event

How can I make express.post wait for a response from a socket created using a net client? By net I mean the package require("net").
Architecture
browser <-> express with a built-in unix domain client <-> some unix domain server
My express server serves the front end as usual. Sometimes, it gets information from other services running on the same machine. These services are outside my control. I use net to create a client to connect with them and this works fine.
Tried approaches
All the usual answers about express.post using, for example, a promise are not applicable, because they wait for a reply from the function you call.
For example, fs.readFile returns something related to the completion of fs.readFile, and this can easily be promisified using the examples on the internet.
Node.js promise request return
https://www.intuz.com/blog/promises-in-node-js-with-examples
https://dzone.com/articles/how-to-interact-with-a-database-using-promises-in
https://www.turtle-techies.com/using-promises-with-express-js/
The problem
However, with net, the reply comes from somewhere else.
client.write is the method for sending, but it just returns true, not the response from the server.
We become aware of the response in the client 'data' event, and I can't figure out how to get a promise that watches the 'data' event and fulfills when it fires.
My best attempt
Using turtle-techies as an example (all above are similar though differing syntax):
const readFilePromise = (filename) => new Promise((resolve, reject) => {
fs.readFile(filename, 'utf-8', (err, data) => {
if (err) {
reject(err);
} else {
resolve(data);
}
});
});
const app = express();
//call the promisified long-running function. ONLY works if the function returns the data you need. client.write does not.
app.post('/get-file', (req, res) => {
readFilePromise('important_file.txt')
.then(data => res.end(data))
.catch(err => res.end(`could not serve data: ${err}`))
});
Now I try rewriting the promisified function or method to suit net.client:
var net = require('net');
var socketName = '/tmp/dp.sock';
var client = net.createConnection(socketName);
client.on("connect", function() {
client.write('client on connect');
});
client.on("data", function(data) {
console.log("client.on data: ", data.toString());
//I don't see a way to get a handle here on express.post req,res so
// I think the answer lies elswhere. Anyway, it would be messy to put it here.
});
client.on('close', function() {
console.log('client on close');
});
//========= end of client building ===========
//========= promisified client.write =========
const readFilePromise = (filename) => new Promise((resolve, reject) => {
client.write('my message'); //This just returns true which I don't care about
//now what to do here to become aware of client.ondata and handle its data?
});
I have a feeling the answer is in front of my face.
The answer is here: https://techbrij.com/node-js-tcp-server-client-promisify
In principle, wrap the client creation and client.write in a new class in which you can then do any promisification you like.
In case the link breaks, here's what they say (modified by me to use a unix domain socket; techbrij.com used TCP):
//in a module.js file
const net = require('net');
const DEF_socketName = '/tmp/echo.sock';
class Client {
constructor(socketName = '/tmp/echo.sock') {
this.socket = net.createConnection(socketName); // createConnection is a factory, no `new` needed
this.address = socketName || DEF_socketName;
this.init();
}
init() {
var client = this;
client.socket.on("connect", function() {
client.socket.write('hello from unix client!');
});
client.socket.on("data", function(data) {
console.log("client.on data: ", data.toString());
//client.destroy();
});
client.socket.on('close', function() {
console.log('Connection closed');
});
}
sendMessage(message) {
console.log('client.sendMessage: ' + message);
var client = this;
return new Promise((resolve, reject) => {
client.socket.write(message);
client.socket.on('data', (data) => {
resolve(data);
if (data.toString().endsWith('exit')) {
client.socket.destroy();
}
});
client.socket.on('error', (err) => {
reject(err);
});
});
}
}
module.exports = Client;
Then in a client.js file:
const Client = require('./client_mod');
const client = new Client();
client.sendMessage('A')
.then((data)=> { console.log(`Received: ${data}`); return client.sendMessage('B');} )
.then((data)=> { console.log(`Received: ${data}`); return client.sendMessage('C');} )
.then((data)=> { console.log(`Received: ${data}`); return client.sendMessage('exit');} )
.catch((err) =>{ console.error(err); })

Cannot emit event with socket id

I am trying to write an integration test for socket.io. I am using try-catch in the server's event handler; when an error is caught, I emit an event so the client can handle the error.
io.of('/rooms').on('connection', socket => {
const socketId = socket.id;
console.log('server socketid', socketId);
const userRepository = getCustomRepository(UserRepository);
const conversationToUserRepository = getCustomRepository(ConversationToUserRepository);
socket.on('initGroupChat', async (users_id, fn) => {
try {
const [user, listConversationToUser] = await Promise.all([
userRepository.findOne({
where: {
users_id,
},
}),
conversationToUserRepository.find({
where: {
users_id,
},
}),
]);
if (!user) {
throw new NotFoundError('User not found with id: ' + users_id);
}
console.log(2);
user.socket_id = socket.id;
await userRepository.save(user);
for (const item of listConversationToUser) {
socket.join(item.conversation_id.toString());
}
fn('init group chat success');
} catch (error) {
io.to(socketId).emit('error', errorHandlerForSocket(error));
console.log(10);
}
});
});
But on the socket client, nothing happens. Here is the client code:
it.only('Init Group Chat with error', done => {
socket = io.connect(`http://localhost:${env.app.port}/rooms`, {
transports: ['websocket']
});
const id = 11111;
socket.emit('initGroupChat', id, (response: any) => {
console.log('response', response);
});
socket.on('error', (error: any) => {
console.log('error3', error);
done();
});
});
On the error event, the console.log does not show up in the terminal; the client never catches the event I emit on the server.
Can anyone help me fix this issue?
Every time the client refreshes, the socket id changes, so don't target the socket id directly; broadcast and filter on a stable user id instead:
Server :
io.emit("send_to_client", {
userId: 112233,
data: "Hello user 112233"
});
Client :
var userLogin = 112233;
socket.on("send_to_client", function(res) {
if(res.userId === userLogin)
//somethingElse
});

Node.js: reuse the last TLS connection to the same server

My Node daemon connects to the same TLS server multiple times. I want to save the TLS connection setup time.
// repeat every 5 seconds
setInterval(() => {
const socket = tls.connect(443, 'test.com')
socket.on('connect', () => {
socket.write(someUniqueData)
})
}, 5000)
Is it possible to reuse the last TLS connection?
You can save the socket easily enough; here's an example:
const tls = require('tls');
const someUniqueData = 'someUniqueData';
var savedSocket = null;
function getSocket() {
return new Promise((resolve, reject) => {
if (savedSocket) {
console.log('getSocket: Reusing saved socket..');
resolve(savedSocket);
return;
}
console.log('getSocket: Creating new socket..');
const socket = tls.connect(443, 'test.com');
socket.on('connect', () => {
console.log('Connected to host..');
savedSocket = socket;
resolve(savedSocket);
});
socket.on('error', (err) => {
console.log('Error connecting to host..');
reject(err);
});
});
}
async function connectAndSend() {
try {
let socket = await getSocket();
console.log('connectAndSend: Sending data..');
socket.write(someUniqueData)
} catch (err) {
console.error('connectAndSend: Error occurred: ', err);
}
}
// repeat every 5 seconds
setInterval(() => {
connectAndSend();
}, 5000)
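One caveat with caching the socket this way: if the server drops the connection between intervals, savedSocket still references a dead socket and every later write will fail. A small addition (sketched standalone here, with its own savedSocket mirroring the variable above) that clears the cache when the socket closes:

```javascript
let savedSocket = null; // mirrors the cache variable in the answer above

// Remember a connected socket, and forget it again when the connection
// closes, so the next getSocket() call dials a fresh connection instead
// of reusing a dead one.
function rememberSocket(socket) {
  savedSocket = socket;
  socket.on('close', () => {
    if (savedSocket === socket) savedSocket = null;
  });
  return socket;
}
```

In getSocket() above, the `savedSocket = socket;` line would become `rememberSocket(socket);`.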

Unit testing Node.js and WebSockets (Socket.io)

Could anyone provide a rock-solid, dead-simple unit test for Node.js using WebSockets (Socket.io)?
I'm using socket.io for Node.js, and have looked at socket.io-client for establishing the client connection to a server in the test. However, I seem to be missing something.
In the example below, "worked..." never gets printed out.
var io = require('socket.io-client')
, assert = require('assert')
, expect = require('expect.js');
describe('Suite of unit tests', function() {
describe('First (hopefully useful) test', function() {
var socket = io.connect('http://localhost:3001');
socket.on('connect', function(done) {
console.log('worked...');
done();
});
it('Doing some things with indexOf()', function() {
expect([1, 2, 3].indexOf(5)).to.be.equal(-1);
expect([1, 2, 3].indexOf(0)).to.be.equal(-1);
});
});
});
Instead, I simply get:
Suite of unit tests
First (hopefully useful) test
✓ Doing some things with indexOf()
1 test complete (26 ms)
Any suggestions?
After further poking and prodding, I found some incredibly useful information. In the author's example, he points out the critical step of establishing socket listeners in the before hooks.
This example works:
Assuming a server is listening for socket connections at localhost:3001, of course
var io = require('socket.io-client')
, assert = require('assert')
, expect = require('expect.js');
describe('Suite of unit tests', function() {
var socket;
beforeEach(function(done) {
// Setup
socket = io.connect('http://localhost:3001', {
'reconnection delay' : 0
, 'reopen delay' : 0
, 'force new connection' : true
});
socket.on('connect', function() {
console.log('worked...');
done();
});
socket.on('disconnect', function() {
console.log('disconnected...');
})
});
afterEach(function(done) {
// Cleanup
if(socket.connected) {
console.log('disconnecting...');
socket.disconnect();
} else {
// There will not be a connection unless you have done() in beforeEach, socket.on('connect'...)
console.log('no connection to break...');
}
done();
});
describe('First (hopefully useful) test', function() {
it('Doing some things with indexOf()', function(done) {
expect([1, 2, 3].indexOf(5)).to.be.equal(-1);
expect([1, 2, 3].indexOf(0)).to.be.equal(-1);
done();
});
it('Doing something else with indexOf()', function(done) {
expect([1, 2, 3].indexOf(5)).to.be.equal(-1);
expect([1, 2, 3].indexOf(0)).to.be.equal(-1);
done();
});
});
});
I found that the placement of done() in the beforeEach, socket.on('connect'...) listener was crucial to having the connection get established. For example, if you comment out done() in the listener, then add it one scope out (just before exiting the beforeEach), you'll see the "no connection to break..." message instead of the "disconnecting..." message. Like so:
beforeEach(function(done) {
// Setup
socket = io.connect('http://localhost:3001', {
'reconnection delay' : 0
, 'reopen delay' : 0
, 'force new connection' : true
});
socket.on('connect', function() {
console.log('worked...');
//done();
});
socket.on('disconnect', function() {
console.log('disconnected...');
});
done();
});
I'm new to Mocha, so there's probably a very obvious reason to the initiated for placing done() within the socket scope itself. Hopefully that little detail will save others in my shoes from hair pulling.
For me, the above test (with correct scoping of done()) outputs:
Suite of unit tests
First (hopefully useful) test
◦ Doing some things with indexOf(): worked...
✓ Doing some things with indexOf()
disconnecting...
disconnected...
◦ Doing something else with indexOf(): worked...
✓ Doing something else with indexOf()
disconnecting...
disconnected...
2 tests complete (93 ms)
Offering an extension of the accepted answer here. It has basic client-to-server communication useful as boilerplate for future tests, using mocha and chai's expect.
var io = require('socket.io-client')
, io_server = require('socket.io').listen(3001);
describe('basic socket.io example', function() {
var socket;
beforeEach(function(done) {
// Setup
socket = io.connect('http://localhost:3001', {
'reconnection delay' : 0
, 'reopen delay' : 0
, 'force new connection' : true
, transports: ['websocket']
});
socket.on('connect', () => {
done();
});
socket.on('disconnect', () => {
// console.log('disconnected...');
});
});
afterEach((done) => {
// Cleanup
if(socket.connected) {
socket.disconnect();
}
io_server.close();
done();
});
it('should communicate', (done) => {
// once connected, emit Hello World
io_server.emit('echo', 'Hello World');
socket.once('echo', (message) => {
// Check that the message matches
expect(message).to.equal('Hello World');
done();
});
io_server.on('connection', (socket) => {
expect(socket).to.not.be.null;
});
});
});
Dealing with callbacks and promises yourself can be difficult, and non-trivial examples quickly become very complex and hard to read.
There is a tool called socket.io-await-test available via NPM that allows you to suspend/wait in a test until events have been triggered using the await keyword.
describe("wait for tests", () => {
it("resolves when a number of events are received", async () => {
const tester = new SocketTester(client);
const pongs = tester.on('pong');
client.emit('ping', 1);
client.emit('ping', 2);
await pongs.waitForEvents(2) // Blocks until the server emits "pong" twice.
assert.equal(pongs.get(0), 2)
assert.equal(pongs.get(1), 3)
})
})
Check out this boilerplate solution that's based on promises and good practice.
You can test your server's io events with it, no sweat.
You just need to copy a boilerplate test and add your own code as needed.
Check out the repo on GitHub for the full source code.
https://github.com/PatMan10/testing_socketIO_server
const io = require("socket.io-client");
const ev = require("../utils/events");
const logger = require("../utils/logger");
// initSocket returns a promise
// success: resolve a new socket object
// fail: reject a error
const initSocket = () => {
return new Promise((resolve, reject) => {
// create socket for communication
const socket = io("localhost:5000", {
"reconnection delay": 0,
"reopen delay": 0,
"force new connection": true
});
// define event handler for successful connection
socket.on(ev.CONNECT, () => {
logger.info("connected");
resolve(socket);
});
// if connection takes longer than 5 seconds throw error
setTimeout(() => {
reject(new Error("Failed to connect within 5 seconds."));
}, 5000);
});
};
// destroySocket returns a promise
// success: resolve true
// fail: resolve false
const destroySocket = socket => {
return new Promise((resolve, reject) => {
// check if socket connected
if (socket.connected) {
// disconnect socket
logger.info("disconnecting...");
socket.disconnect();
resolve(true);
} else {
// not connected
logger.info("no connection to break...");
resolve(false);
}
});
};
describe("test suite: Echo & Bello", () => {
test("test: ECHO", async () => {
// create socket for communication
const socketClient = await initSocket();
// create new promise for server response
const serverResponse = new Promise((resolve, reject) => {
// define a handler for the test event
socketClient.on(ev.res_ECHO, data4Client => {
//process data received from server
const { message } = data4Client;
logger.info("Server says: " + message);
// destroy socket after server responds
destroySocket(socketClient);
// return data for testing
resolve(data4Client);
});
// if response takes longer than 5 seconds throw error
setTimeout(() => {
reject(new Error("Failed to get response, connection timed out..."));
}, 5000);
});
// define data 4 server
const data4Server = { message: "CLIENT ECHO" };
// emit event with data to server
logger.info("Emitting ECHO event");
socketClient.emit(ev.com_ECHO, data4Server);
// wait for server to respond
const { status, message } = await serverResponse;
// check the response data
expect(status).toBe(200);
expect(message).toBe("SERVER ECHO");
});
test("test: BELLO", async () => {
const socketClient = await initSocket();
const serverResponse = new Promise((resolve, reject) => {
socketClient.on(ev.res_BELLO, data4Client => {
const { message } = data4Client;
logger.info("Server says: " + message);
destroySocket(socketClient);
resolve(data4Client);
});
setTimeout(() => {
reject(new Error("Failed to get response, connection timed out..."));
}, 5000);
});
const data4Server = { message: "CLIENT BELLO" };
logger.info("Emitting BELLO event");
socketClient.emit(ev.com_BELLO, data4Server);
const { status, message } = await serverResponse;
expect(status).toBe(200);
expect(message).toBe("SERVER BELLO");
});
});
---- Foot Note ----
Depending on how you set up your server environment, you may experience conflicts between socket.io and socket.io-client running from the same project simultaneously. In that case it is better to split the project into a "test client" and a server. Check out the repo below if you hit this issue.
https://github.com/PatMan10/testing_socketIO_server_v2
In OP's code,
socket.on('connect', function(done) {
console.log('worked...');
done();
});
the done was applied to the wrong callback. It should be removed from the socket.on callback and added to Mocha's it block callback:
it('First (hopefully useful) test', function (done) {
var socket = io.connect('http://localhost:3001');
socket.on('connect', function () {
console.log('worked...');
done();
});
});
A complete example
Existing answers are great but don't show the server ultimately being tested. Here's a complete version with console.logs to illustrate what's going on. Explanation follows.
src/server.js:
const express = require("express");
const createServer = (port=3000) => {
const app = express();
const http = require("http").Server(app);
const io = require("socket.io")(http);
io.on("connection", socket => {
console.log("[server] user connected");
socket.on("message", msg => {
console.log(`[server] received '${msg}'`);
socket.emit("message", msg);
});
socket.on("disconnect", () => {
console.log("[server] user disconnected");
});
});
http.listen(port, () =>
console.log(`[server] listening on port ${port}`)
);
return {
close: () => http.close(() =>
console.log("[server] closed")
)
};
};
module.exports = {createServer};
test/server.test.js:
const {expect} = require("chai");
const io = require("socket.io-client");
const {createServer} = require("../src/server");
const socketUrl = "http://localhost:3000";
describe("server", function () {
this.timeout(3000);
let server;
let sockets;
beforeEach(() => {
sockets = [];
server = createServer();
});
afterEach(() => {
sockets.forEach(e => e.disconnect())
server.close();
});
const makeSocket = (id=0) => {
const socket = io.connect(socketUrl, {
"reconnection delay": 0,
"reopen delay": 0,
"force new connection": true,
transports: ["websocket"],
});
socket.on("connect", () => {
console.log(`[client ${id}] connected`);
});
socket.on("disconnect", () => {
console.log(`[client ${id}] disconnected`);
});
sockets.push(socket);
return socket;
};
it("should echo a message to a client", done => {
const socket = makeSocket();
socket.emit("message", "hello world");
socket.on("message", msg => {
console.log(`[client] received '${msg}'`);
expect(msg).to.equal("hello world");
done();
});
});
it("should echo messages to multiple clients", () => {
const sockets = [...Array(5)].map((_, i) => makeSocket(i));
return Promise.all(sockets.map((socket, id) =>
new Promise((resolve, reject) => {
const msgs = [..."abcd"].map(e => e + id);
msgs.slice().forEach(e => socket.emit("message", e));
socket.on("message", msg => {
console.log(`[client ${id}] received '${msg}'`);
expect(msg).to.equal(msgs.shift());
if (msgs.length === 0) {
resolve();
}
});
})
));
});
});
In summary, the server exports a function that lets a server app be created from scratch, allowing each it block to be idempotent and preventing server state from carrying over between tests (assuming no persistence on the server otherwise). Creating an app returns an object with a close function. socket.disconnect() must be called per socket in each test to avoid timeouts.
Given these requirements, the testing suite follows this per-test setup/teardown workflow:
let server;
let sockets;
beforeEach(() => {
sockets = [];
server = createServer();
});
afterEach(() => {
sockets.forEach(e => e.disconnect())
server.close();
});
makeSocket is an optional helper to reduce the repeated boilerplate of connecting and disconnecting a socket client. It does produce a side effect on the sockets array for cleanup later, but this is an implementation detail from the it block's perspective. Test blocks shouldn't touch the server or sockets variables, although other workflows are likely depending on need. The critical takeaways are test case idempotency and closing all connections after each test case.
The options object passed to socket.connect on the client lets you choose the transport and behavior of the socket. "force new connection": true creates a new Manager per socket instead of reusing an existing one, and transports: ["websocket"] upgrades to the WS protocol immediately instead of starting with long polling.
Use it("should ... ", done => { /* tests */ }); and invoke done() after all work is completed in callbacks or return a promise (and omit the done parameter to the it callback). The example above shows both approaches.
Used in this post:
node: 12.19.0
chai: 4.2.0
express: 4.16.4
mocha: 5.2.0
socket.io: 2.2.0
socket.io-client: 2.2.0
I had this problem: how do you unit test with socket.io-client if you don't know how long the server takes to respond?
I solved it using mocha and chai:
var os = require('os');
var should = require("chai").should();
var socketio_client = require('socket.io-client');
var end_point = 'http://' + os.hostname() + ':8081';
var opts = {forceNew: true};
describe("async test with socket.io", function () {
this.timeout(10000);
it('Response should be an object', function (done) {
setTimeout(function () {
var socket_client = socketio_client(end_point, opts);
socket_client.emit('event', 'ABCDEF');
socket_client.on('event response', function (data) {
data.should.be.an('object');
socket_client.disconnect();
done();
});
socket_client.on('event response error', function (data) {
console.error(data);
socket_client.disconnect();
done();
});
}, 4000);
});
});
