I am starting apollo-server in unit tests and would like to stop the server gracefully, but in the unit tests I get Error: listen EADDRINUSE: address already in use :::4000 (and Mocha complains that done() was called multiple times).
Here are the start and stop functions:
let apollo_server = null;

exports.start = function (options, callback) {
  options.schema = schema;
  options.formatError = (error) => {
    console.log('Fail to start graphql server', error);
    return error;
  };
  apollo_server = new ApolloServer(options);
  apollo_server.listen().then(({ url, subscriptionsUrl }) => {
    console.log('GraphQL server started.', url, subscriptionsUrl);
  });
  callback();
};

exports.stop = function (callback) {
  apollo_server.stop().then(() => {
    console.log("server stopped");
    callback();
  });
};
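For reference, a minimal sketch of how start could be sequenced so that the callback only fires once the server is actually listening and listen errors are surfaced; this is an assumption about intent, not the original code:

// Sketch only: signal "started" after listen() resolves and propagate errors.
exports.start = function (options, callback) {
  options.schema = schema;
  apollo_server = new ApolloServer(options);
  apollo_server.listen().then(({ url, subscriptionsUrl }) => {
    console.log('GraphQL server started.', url, subscriptionsUrl);
    callback();                 // only tell Mocha once the port is bound
  }).catch(callback);           // surface EADDRINUSE instead of swallowing it
};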
Testing is done with apollo-link, apollo-link-ws and subscriptions-transport-ws. The client starts with the following config to access the graphql server:
const serverConfig = { serverUrl: 'http://localhost:4000/', subscriptionUrl: 'ws://localhost:4000/graphql' };
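The link used in the test below isn't shown; assuming it is built roughly like this, with apollo-link-http for queries/mutations and apollo-link-ws for subscriptions (node-fetch and ws as the Node implementations; the exact construction is an assumption):

const { execute } = require('apollo-link');
const { HttpLink } = require('apollo-link-http');
const { WebSocketLink } = require('apollo-link-ws');
const fetch = require('node-fetch');
const ws = require('ws');

// Hypothetical construction of the link used by execute(link, owner_query) below.
const link = new HttpLink({ uri: serverConfig.serverUrl, fetch });
const wsLink = new WebSocketLink({
  uri: serverConfig.subscriptionUrl,
  options: { reconnect: false },
  webSocketImpl: ws
});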
The test looks like the following:
describe('Start query', function () {
  before('Start the graphql server', function (callback) {
    const options = { playground: false };
    graphql_server.start(options, callback);
  });

  before('Create owner', function (callback) {
    datasource.new_owner({ name: "test1" });
    callback();
  });

  it('Should return created entry of a query', (callback) => {
    owner_query.variables = { id: '4711' };
    execute(link, owner_query).subscribe({
      next: ({ data }) => {
        expect(data.owners.length).to.be.equal(1);
        callback();
      },
      error: error => { expect(error).to.be.null },
      complete: () => {},
    });
  });

  after('Stop server', function (callback) {
    graphql_server.stop((data) => {
      console.log("stop1", data);
      callback();
    });
  });
});
This is the second test. The first is similar, for a mutation, and does not throw an error. The second test case ends with the listen EADDRINUSE error on port 4000.
What am I missing?
Related
What I'm trying to do is emit an event based on the progress of my jobs. It works on the Gateway (proxy) side, the logs appear, etc., but when I'm consuming the event on the front-end, sometimes it works and sometimes it doesn't, and it throws a 'ping timeout' error in the console. If I restart the nodejs service a few times, it works.
I'm open to ideas for alternative ways to implement this feature.
SocketController
import { Server } from "socket.io";
import QueueService from "./QueueService"; // import path assumed

export default class SocketController {
  socket: any;
  interval: any;
  instance: any;
  queue: any;

  constructor(server) {
    // Creating Websocket connection
    this.instance = new Server(server, {
      cors: { origin: process.env.FRONTEND_URL },
      path: "/socket.io/"
    });
    this.socket = null;
    this.queue = null;

    this.instance.on("connection", (socket) => {
      let connectedUsersCount =
        Object.keys(this.instance.sockets.sockets).length + 1;
      let oneUserLeft = connectedUsersCount - 1;
      console.log("New client connected ", connectedUsersCount);

      // Assign socket to the class
      this.socket = this.socket == null ? socket : this.socket;

      /*
      if (this.interval) {
        clearInterval(this.interval);
      }
      */

      // initialize Queue
      this.queue = this.queue === null ? new QueueService(socket) : this.queue;

      socket.on("disconnect", () => {
        console.log("Client disconnected ", oneUserLeft);
        // clearInterval(this.interval);
      });
    });
  }
}
QueueService
export default class QueueService {
  channels: any;
  socket: any;

  constructor(socket: any) {
    this.channels = ["integrationProgress", "news"];
    this.socket = socket;

    integrationQueueEvents.on("progress", (job: any) => {
      console.log("Job Progressing", job);
      this.socket.emit("integrationProgress", { status: true, data: job.data });
    });

    integrationQueueEvents.on("active", ({ jobId }) => {
      console.log(`Job ${jobId} is now active`);
    });

    integrationQueueEvents.on("completed", ({ jobId, returnvalue }) => {
      console.log(`${jobId} has completed and returned ${returnvalue}`);
      this.socket.emit("integrationComplete", {
        status: true,
        message: returnvalue
      });
    });

    integrationQueueEvents.on("failed", ({ jobId, failedReason }) => {
      console.log(`${jobId} has failed with reason ${failedReason}`);
      this.socket.emit("integrationProgress", {
        status: false,
        message: failedReason
      });
    });
  }
}
Front-End
const socket = io(process.env.GATEWAY_URL, {
  path: "/socket.io/"
});

socket.on("connect_error", (err) => {
  console.log(`connect_error due to ${err.message}`);
  socket.connect();
});

socket.on("disconnect", (socket) => {
  console.log(socket);
  console.log("Client disconnected ");
});

socket.on("connect", (socket) => {
  console.log("Client Connected ");
  console.log(socket);
});

socket.on("integrationProgress", async (socket) => {
  try {
    console.log(`Progress: ${socket.data}`);
    updateJob(socket.data);
  } catch (err) {
    console.log(err);
  }
});
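Since QueueService is created only for the first socket that ever connects, one alternative sketch is to bind the queue events once to the Server instance and broadcast, so clients that connect later (or reconnect after a ping timeout) still receive the events. This assumes every connected client should see the progress events and that integrationQueueEvents is the same emitter as above:

// Sketch: broadcast queue events to all connected clients instead of one captured socket.
export default class QueueService {
  io: any;

  constructor(io: any) {                       // pass this.instance here, not a single socket
    this.io = io;
    integrationQueueEvents.on("progress", (job: any) => {
      this.io.emit("integrationProgress", { status: true, data: job.data });
    });
    integrationQueueEvents.on("completed", ({ jobId, returnvalue }) => {
      this.io.emit("integrationComplete", { status: true, message: returnvalue });
    });
    integrationQueueEvents.on("failed", ({ jobId, failedReason }) => {
      this.io.emit("integrationProgress", { status: false, message: failedReason });
    });
  }
}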
I am a relative newcomer to React and am implementing a chat app with React, socket.io, and Node.
When the user is chatting with someone, he connects to a socket. It works fine, but if the user leaves for another page and RE-ENTERS the same chat again, a second socket is connected (bad), and I get the "Can't perform a React state update on an unmounted component" error.
I did implement code to have the socket leave the room (in the second useEffect) if the user leaves the page, but that doesn't seem to work.
Thanks in advance,
Partial React frontend code:
useEffect(() => {
  socket.emit("enter chatroom", { conversationId: props.chatId });
  setChatId(props.chatId);
  getMessagesInConversation(props.chatId, (err: Error, result: any) => {
    if (err) {
      setCerror(err.message);
    } else {
      setMessages(buildMessages(result.messages).reverse());
    }
  });
  socket.on("received", (data: any) => {
    setMessages((messages: any) => {
      console.log("messages");
      console.log(messages);
      return [...messages.slice(-4), ...buildMessages(data.newMsg)];
    });
  });
}, []);

useEffect(() => {
  return () => {
    socket.emit("leaveChatroom", {
      conversationId: chatId,
    });
  };
}, [chatId]);
simplified nodejs code
private ioMessage = () => {
  this._io.on("connection", (socket) => {
    console.log("socket io connected");
    const socketUser = socket.request.user;

    socket.on("enter chatroom", (data) => {
      const room_status = Object.keys(socket.rooms).includes(
        data.conversationId
      );
      if (!room_status) { // should only join if socket is not already joined.
        socket.join(data.conversationId);
      }
    });

    socket.on("leaveChatroom", (data) => {
      socket.leave(data.conversationId);
    });

    socket.on("chat message", async (msg) => {
      const database = getDB();
      const newMessage = {
        ...msg,
        createdAt: Date.now(),
      };
      await database.collection("message").insertOne(newMessage);
      const messages = await database
        .collection("message")
        .find({ conversationId: msg.conversationId })
        .toArray();
      this._io.to(msg.conversationId).emit("received", {
        newMsg: [messages[messages.length - 1]],
      });
    });

    socket.on("disconnect", (socket) => {
      delete this._users[socketUser.userId];
      console.log("--- disconnect : ", socketUser);
      console.log("--- active users: ", this._users);
    });
  });
};
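A likely contributor to both symptoms is that the "received" listener registered in the first useEffect is never removed, so a remounted component leaves the old handler (with its stale state setters) attached. A minimal sketch of a cleanup, assuming socket is a module-level socket.io-client instance as above:

useEffect(() => {
  const onReceived = (data: any) => {
    setMessages((messages: any) => [...messages.slice(-4), ...buildMessages(data.newMsg)]);
  };

  socket.emit("enter chatroom", { conversationId: props.chatId });
  socket.on("received", onReceived);

  return () => {
    // remove only this component's listener and leave the room on unmount
    socket.off("received", onReceived);
    socket.emit("leaveChatroom", { conversationId: props.chatId });
  };
}, [props.chatId]);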
I have an LdapJS server which implements the standard operations and an extended operation to check health:
const server = ldap.createServer();
server.exop('healthcheck', (req, res, next) => {
  res.end();
  console.log('ended');
  return next();
});
...
Then I wrote a simple client script to ping the healthcheck service:
const { createClient } = require('ldapjs');

const client = createClient({
  url: 'ldap://localhost:1389',
  timeout: 2000,
  connectTimeout: 2000
});

client.exop('healthcheck', (err, value, res) => {
  if (err) {
    console.log(`ERROR: ${err.message}`);
    process.exit(1);
  } else {
    console.log(`STATUS: ${res.status}`);
    process.exit(0);
  }
});
The problem is that the exop is correctly received by the server (I can see the log inside its callback), but the client always logs: ERROR: request timeout (client interrupt).
Why is the request not correctly terminated?
EDIT
I wrote a mocha test for the exop and it works. It seems that the problem is related to the standalone call in the healthcheck script.
describe('#healthcheck()', function () {
  before(function () {
    server = createServer();
    server.listen(config.get('port'), config.get('host'), () => {});
  });

  after(function () {
    server.close();
  });

  it('should return status 0', function (done) {
    const { createClient } = require('ldapjs');
    const client = createClient({
      url: 'ldap://localhost:1389',
      timeout: 2000,
      connectTimeout: 2000
    });
    client.exop('healthcheck', (err, value, res) => {
      should.not.exist(err);
      res.status.should.be.equal(0);
      client.destroy();
      return done();
    });
  });
});
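For comparison, a standalone script that mirrors the passing test (same client options, explicit destroy instead of process.exit, plus a connection-level error handler) might look like the sketch below; whether this changes the timeout behaviour is an assumption to verify:

const { createClient } = require('ldapjs');

const client = createClient({
  url: 'ldap://localhost:1389',
  timeout: 2000,
  connectTimeout: 2000
});

client.on('error', (err) => {
  // connection-level errors are emitted here rather than in the exop callback
  console.log(`CONNECTION ERROR: ${err.message}`);
  process.exitCode = 1;
});

client.exop('healthcheck', (err, value, res) => {
  if (err) {
    console.log(`ERROR: ${err.message}`);
    process.exitCode = 1;
  } else {
    console.log(`STATUS: ${res.status}`);
    process.exitCode = 0;
  }
  client.destroy();   // close the socket so the process can exit on its own
});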
I cannot make MongoClient.connect await the results before going on.
I am trying to pass db from server.js, where I connect my MongoClient, to my routes/api.js where I do my POST requests. But it does not work; I always get:
TypeError: Cannot read property 'collection' of undefined
Here is my routes/api.js:
var db = require("../server");

router.post('/video_url', async (req, res) => {
  const cursor = db.collection('movie').findOne({ link: req.body.videoURL }, function (findErr, result) {
    if (findErr) throw findErr;
    console.log(cursor);
  });
});
server.js:
var db = async function () {
  return await MongoClient.connect(MONGODB_URI, function (err, client) {
    try {
      if (err) throw err;
      db = client.db('sub-project');
      // Start the application after the database connection is ready
      app.listen(PORT, () => console.log(`Server listening on port ${PORT}`));
      return db;
    } catch (ex) {
      console.log(ex);
    }
  });
};

module.exports = db;
EDIT:
var dbObject = (async function () {
  var connection = await new Promise(function (resolve, reject) {
    MongoClient.connect(MONGODB_URI, { useNewUrlParser: true }, function (err, client) {
      try {
        if (err) throw err;
        db = client.db('sub-project');
        // Start the application after the database connection is ready
        app.listen(PORT, () => console.log(`Server listening on port ${PORT}`));
        resolve(db);
      } catch (ex) {
        console.log(ex);
        reject(ex);
      }
    });
  });
  return connection;
})();
console.log("TYPEOF DB IN", typeof(dbObject))
console.log("TYPEOF DB.COLLECTION IN", typeof(dbObject.collection))
Both console.log() calls print undefined... is that normal?
Use this code for your server.js. Your code was not working because your function was never actually called when you required the module.
var dbObject;

(function () {
  MongoClient.connect(MONGODB_URI, { useNewUrlParser: true }, function (err, client) {
    try {
      if (err) throw err;
      db = client.db('sub-project');
      // Start the application after the database connection is ready
      app.listen(PORT, () => console.log(`Server listening on port ${PORT}`));
      dbObject = db;
    } catch (ex) {
      console.log(ex);
    }
  });
})();

setTimeout(function () {
  console.log("TYPEOF DB IN", typeof(dbObject));
  console.log("TYPEOF DB.COLLECTION IN", typeof(dbObject.collection));
}, 2000);

module.exports = dbObject;
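Note that module.exports = dbObject still runs before the connection callback fires, so requiring the module immediately would still yield undefined. A common alternative is to export a promise that resolves to the db and await it inside each route handler; the sketch below assumes the same MONGODB_URI, router, and collection names as above:

// server.js (sketch): export a promise that resolves once the connection is ready.
const dbReady = new Promise((resolve, reject) => {
  MongoClient.connect(MONGODB_URI, { useNewUrlParser: true }, (err, client) => {
    if (err) return reject(err);
    resolve(client.db('sub-project'));
  });
});
module.exports = dbReady;

// routes/api.js (sketch): await the shared promise inside the handler.
const dbReady = require('../server');

router.post('/video_url', async (req, res) => {
  const db = await dbReady;
  const movie = await db.collection('movie').findOne({ link: req.body.videoURL });
  res.json(movie);
});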
I have the following nodejs function using Sequelize:
var processDatabase = function (dbConnection, schema, recordsets) {
  var myLogTable = dbConnection.define(schema.tableName, schema.myLogSchema, schema.myLogSchemaIndex);
  myLogTable.sync({
    force: false,
    freezeTableName: true,
    logging: console.log
  }).then(function () {
    console.log('Table synced...');
    for (k = 0; k < recordsets.length; k++) {
      var query = "Some query";
      dbConnection.query(
        query, {
          type: dbConnection.QueryTypes.SELECT
        }
      )
      .then(function (results) {
        console.log('MYSQL Selection Done');
      })
      .catch(function (err) {
        console.log('MYSQL Error: ' + err.message);
      });
    }
  }).catch(function (err) {
    console.log('MYSQL Sync Error: ' + err.message);
  });
};
I am new to mocking and don't really know how to test the catch part.
This is the unit test I could come up with, but I do not know how a call to sync can be made to go to the catch part:
describe('when call processDatabase', function () {
  it('should process successfully when sync fails', function (done) {
    seqConnection.define = function (tableName, schema, schemaIndex) {
      return mockMyLogModel;
    };
    processProfilesNotMapped(seqConnection, {
      tableName: 'SomeTable',
      myLogSchema: myLogSchema,
      myLogSchemaIndex: myLogSchemaIndex
    }, []);
    done();
  });
});
How would I write my mocking so that both the then and the catch paths are covered?
You need to have your mock return a rejected promise, since sync uses promises. You can use the q library or any other. That way, when the sync function is executed, execution will go to the catch section.
Example using q:
const q = require('q');

describe('when call processDatabase', function () {
  it('should process successfully when sync fails', function (done) {
    seqConnection.define = function (tableName, schema, schemaIndex) {
      const mock = {
        sync: function () {
          // Return a rejected promise so processDatabase ends up in the catch branch.
          const deferred = q.defer();
          deferred.reject(new Error('Some error'));
          return deferred.promise;
        }
      };
      return mock;
    };
    processProfilesNotMapped(seqConnection, {
      tableName: 'SomeTable',
      myLogSchema: myLogSchema,
      myLogSchemaIndex: myLogSchemaIndex
    }, []);
    done();
  });
});
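The same idea works without an extra dependency: a mock whose sync returns a native rejected promise exercises the catch path, and one that resolves covers the then path. A sketch, assuming the processDatabase function from the question and the same seqConnection and schema fixtures (adjust the function name if your code under test is processProfilesNotMapped):

// Sketch: cover both branches with native promises instead of q.
it('hits the catch branch when sync rejects', function () {
  seqConnection.define = () => ({
    sync: () => Promise.reject(new Error('sync failed'))   // catch path
  });
  processDatabase(seqConnection, {
    tableName: 'SomeTable',
    myLogSchema: myLogSchema,
    myLogSchemaIndex: myLogSchemaIndex
  }, []);
});

it('hits the then branch when sync resolves', function () {
  seqConnection.define = () => ({
    sync: () => Promise.resolve()                           // then path
  });
  seqConnection.query = () => Promise.resolve([]);          // stub the SELECTs too
  seqConnection.QueryTypes = { SELECT: 'SELECT' };
  processDatabase(seqConnection, {
    tableName: 'SomeTable',
    myLogSchema: myLogSchema,
    myLogSchemaIndex: myLogSchemaIndex
  }, [{}]);
});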