Correct usage of events in Node.js concerning the "this" context

I am designing a communication server in Node that handles incoming messages (sent by client1) and forwards them to another client (client2), who answers the message and sends the answer back, via the server, to client1.
The communication happens over WebSockets, which implies an open connection from each client to the server.
So I implemented a ConnectionManager with which I register new connections whenever a client comes online. Every connection gets assigned a messageQueue in which all incoming messages are cached before processing.
At the end of processing there is a ServerTaskManager, which generates output tasks for the server, telling it which message to send and which receiver should get it.
This ServerTaskManager emits a Node event (it inherits from EventEmitter) whenever a new server task is registered, and the server listens for that event.
Now I would like my ConnectionManager to listen for the ServerTaskManager's event as well, so that it can push the next message in the messageQueue into processing.
The problem is that I can catch the ServerTaskManager event within the ConnectionManager just fine, but, of course, "this" within the listener is the ServerTaskManager, not the ConnectionManager. So calling any "this.someFunction()" functions that belong to the ConnectionManager won't work.
Here is some code:
var util = require('util');
var EventEmitter = require('events').EventEmitter;

/**
 * ServerTaskManager - Constructor
 * Implements Singleton pattern.
 */
function ServerTaskManager()
{
    var __instance;
    ServerTaskManager = function ServerTaskManager()
    {
        return __instance;
    };
    ServerTaskManager.prototype = this;
    __instance = new ServerTaskManager();
    __instance.constructor = ServerTaskManager;
    return __instance;
}
util.inherits(ServerTaskManager, EventEmitter);
/**
 * ConnectionManager - Constructor
 * Also implements Singleton pattern.
 */
function ConnectionManager()
{
    var __instance;
    ConnectionManager = function ConnectionManager()
    {
        return __instance;
    };
    ConnectionManager.prototype = this;
    __instance = new ConnectionManager();
    __instance.constructor = ConnectionManager;
    __instance.currentConnections = [];

    // Listen for new serverInstructions on the serverTaskManager
    serverTaskManager.on('newInstruction', function(messageObject, currentReceiver)
    {
        this.processNextMessage(currentReceiver);
    });

    return __instance;
}
util.inherits(ConnectionManager, EventEmitter);
Now when I run this and the 'newInstruction' event is triggered by the serverTaskManager, Node throws:
TypeError: Object #<ServerTaskManager> has no method 'processNextMessage'
Which is of course true. The function I want to call belongs to the ConnectionManager:
/**
 * Starts processing the next message
 *
 * @param connectionId (int) - The ID of the connection whose next message should be processed.
 */
ConnectionManager.prototype.processNextMessage = function (connectionId)
{
    // Some code...
}
So obviously, when listening to the ServerTaskManager event, "this" within the listener is the ServerTaskManager. Now how do I call my ConnectionManager's function from within the listener?
I hope I am not completely misled by how events and listeners and/or prototypal inheritance work (in Node). This project is by far the most advanced I have worked on in JavaScript. Normally I only code PHP with a little bit of client-side JS.
Thanks in advance for any hints!
Worp

Like this, using the __instance variable that is already in scope inside your ConnectionManager constructor:
serverTaskManager.on('newInstruction', function(messageObject, currentReceiver)
{
    __instance.processNextMessage(currentReceiver);
});
Or like this, relying on your singleton constructor returning the instance:
serverTaskManager.on('newInstruction', function(messageObject, currentReceiver)
{
    ConnectionManager().processNextMessage(currentReceiver);
});
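More generally, the usual pattern is to either keep a reference to the object whose method you want to call, or bind the listener to it explicitly, so "this" inside the listener is what you expect. A rough sketch (variable names are just illustrative):

var connectionManager = new ConnectionManager();

// Option 1: close over a reference instead of relying on "this"
serverTaskManager.on('newInstruction', function (messageObject, currentReceiver) {
    connectionManager.processNextMessage(currentReceiver);
});

// Option 2: bind the listener so "this" inside it is the ConnectionManager instance
serverTaskManager.on('newInstruction', function (messageObject, currentReceiver) {
    this.processNextMessage(currentReceiver);
}.bind(connectionManager));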
PS: your question is unnecessarily long. When posting code, don't post your whole example; boil it down to the simplest form that exhibits the behavior you are seeing. You'll get more quality responses this way.

Related

socket.on event gets triggered multiple times

var express = require('express');
var app = express();
var server = app.listen(3000);
var replyFromBot;

app.use(express.static('public'));

var socket = require('socket.io');
var io = socket(server);
io.sockets.on('connection', newConnection);

function newConnection(socket) {
    console.log(socket.id);
    listen = true;
    socket.on('Quest', reply);
    function reply(data) {
        replyFromBot = bot.reply("local-user", data);
        console.log(socket.id + " " + replyFromBot);
        socket.emit('Ans', replyFromBot);
    }
}
I've created a server-based chat-bot application using node.js, socket.io and express. The problem is that the first time the client asks a question, the socket.on handler runs once; the second time it runs twice, the third time three times, and so on. I've worked around this by setting a flag on my client so that the answer is only displayed once. I just want to know whether my code is logically correct, i.e. is this good code? Because if the client asks a question for the 10th time, the listeners array will contain 10+9+8+...+1 listeners, and it keeps growing with the number of questions the client asks, which is not good.
I tried using removeListener, but it just removes the listener once and the callback isn't invoked the second time. What do you recommend? Should I keep my current approach, or is there a way to add the listener when socket.on is called, remove it once it has executed, and add it again for the next call?
Thank you.
client code:
function reply() {
    socket.emit('Quest', Quest);
    flag = true;
    audio.play();
    socket.on('Ans', function(replyFromBot) {
        if (flag) {
            console.log("hi");
            var para = document.createElement("p2");
            x = document.getElementById("MiddleBox");
            para.appendChild(document.createTextNode(replyFromBot));
            x.appendChild(para);
            x.scrollTop = x.scrollHeight;
            flag = false;
        }
    });
}
The problem is caused by your client code. Each time you call the reply() function in the client, you set up an additional socket.on('Ans', ...) event handler, which means they accumulate. You can change that to socket.once() and it will remove itself each time after it gets the Ans message. You can then also remove your flag variable.
function reply() {
    socket.emit('Quest', Quest);
    audio.play();
    // change this to .once()
    socket.once('Ans', function(replyFromBot) {
        console.log("hi");
        var para = document.createElement("p2");
        x = document.getElementById("MiddleBox");
        para.appendChild(document.createTextNode(replyFromBot));
        x.appendChild(para);
        x.scrollTop = x.scrollHeight;
    });
}
Socket.io is not really built as a request/response system, which is what you are trying to use it as. An even better way to implement this would be to use the ack capability that socket.io has, so you can get a direct response back to the Quest message you send.
You also need to fix your shared variables replyFromBot and listen on your server because those are concurrency problems waiting to happen as soon as you have multiple users using your server.
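As a rough sketch of that last point (keeping the same handler structure as above), moving the variables inside the connection handler gives every socket its own copy instead of sharing module-level state:

function newConnection(socket) {
    // per-connection state instead of module-level globals
    var replyFromBot;
    socket.on('Quest', function (data) {
        replyFromBot = bot.reply("local-user", data);
        console.log(socket.id + " " + replyFromBot);
        socket.emit('Ans', replyFromBot);
    });
}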
Better Solution
A better solution would be to use the ack capability that socket.io has to get a direct response to a message you sent. To do that, you'd change your server to this:
function newConnection(socket) {
    console.log(socket.id);
    socket.on('Quest', function(data, fn) {
        let replyFromBot = bot.reply("local-user", data);
        console.log(socket.id + " " + replyFromBot);
        // send ack response
        fn(replyFromBot);
    });
}
And, change your client code to this:
function reply() {
    audio.play();
    socket.emit('Quest', Quest, function(replyFromBot) {
        console.log("hi");
        var para = document.createElement("p2");
        x = document.getElementById("MiddleBox");
        para.appendChild(document.createTextNode(replyFromBot));
        x.appendChild(para);
        x.scrollTop = x.scrollHeight;
    });
}
Doing it this way, you're hooking into a direct reply to the message, so it works much better as request/response than the way you were doing it.
Instead of socket.on('Quest', reply); try socket.once('Quest', reply);
The bug in your code is that each time newConnection() is called, Node registers another event listener for 'Quest'. So the first time newConnection() is called, the number of listeners for the 'Quest' event is one; the second time the function is called, it increases to two, and so on.
socket.once() ensures that exactly one 'Quest' listener is bound to the socket.

Redis Connections May Not be Closing with C#

I'm connecting to Azure Redis and they show me the number of open connections to my redis server. I've got the following c# code that encloses all my Redis sets and gets. Should this be leaking connections?
using (var connectionMultiplexer = ConnectionMultiplexer.Connect(connectionString))
{
    lock (Locker)
    {
        redis = connectionMultiplexer.GetDatabase();
    }
    var o = CacheSerializer.Deserialize<T>(redis.StringGet(cacheKeyName));
    if (o != null)
    {
        return o;
    }
    lock (Locker)
    {
        // get lock but release if it takes more than 60 seconds to complete to avoid deadlock if this app crashes before release
        //using (redis.AcquireLock(cacheKeyName + "-lock", TimeSpan.FromSeconds(60)))
        var lockKey = cacheKeyName + "-lock";
        if (redis.LockTake(lockKey, Environment.MachineName, TimeSpan.FromSeconds(10)))
        {
            try
            {
                o = CacheSerializer.Deserialize<T>(redis.StringGet(cacheKeyName));
                if (o == null)
                {
                    o = func();
                    redis.StringSet(cacheKeyName, CacheSerializer.Serialize(o),
                        TimeSpan.FromSeconds(cacheTimeOutSeconds));
                }
                redis.LockRelease(lockKey, Environment.MachineName);
                return o;
            }
            finally
            {
                redis.LockRelease(lockKey, Environment.MachineName);
            }
        }
        return o;
    }
}
}
You can keep the connectionMultiplexer in a static variable instead of creating it for every get/set. That will keep one connection to Redis open at all times and make your operations faster.
Update:
Please, have a look at StackExchange.Redis basic usage:
https://github.com/StackExchange/StackExchange.Redis/blob/master/Docs/Basics.md
"Note that ConnectionMultiplexer implements IDisposable and can be disposed when no longer required, but I am deliberately not showing using statement usage, because it is exceptionally rare that you would want to use a ConnectionMultiplexer briefly, as the idea is to re-use this object."
It works nicely for me, keeping a single connection to Azure Redis (sometimes it creates 2 connections, but this is by design). Hope it helps.
I would suggest trying the Close (or CloseAsync) method explicitly. In a test setting you may be using different connections for different test cases and not want to share a single multiplexer. A search of public code using the Redis client shows a pattern of Close followed by Dispose calls.
Note that in the XML method documentation of the Redis client, Close is described as doing more than Dispose:
//
// Summary:
// Close all connections and release all resources associated with this object
//
// Parameters:
// allowCommandsToComplete:
// Whether to allow all in-queue commands to complete first.
public void Close(bool allowCommandsToComplete = true);
//
// Summary:
// Close all connections and release all resources associated with this object
//
// Parameters:
// allowCommandsToComplete:
// Whether to allow all in-queue commands to complete first.
[AsyncStateMachine(typeof(<CloseAsync>d__183))]
public Task CloseAsync(bool allowCommandsToComplete = true);
...
//
// Summary:
// Release all resources associated with this object
public void Dispose();
And then I looked up the code for the client, found it here:
https://github.com/StackExchange/StackExchange.Redis/blob/master/src/StackExchange.Redis/ConnectionMultiplexer.cs
And we can see the Dispose method calling Close (not the usual overridable protected Dispose(bool)), furthermore with the wait for in-queue commands to complete set to true. It is an atypical dispose-pattern implementation: by attempting to close all the connections and waiting on them, it risks throwing an exception, even though the Dispose contract says it should never throw.

How to remove a listener in chrome.webRequest API in dart?

Related to `chrome.webRequest.onBeforeRequest.removeListener`? -- How to stop a chrome web listener, I am trying to deregister a listener using dart:js.
After invoking onBeforeRequest.callMethod('removeListener', [callback]); I notice that the listener is still being called. Furthermore, directly after adding a listener, hasListener returns false (even though the listener is being registered).
var callback = (map) { /* some code */ };
var filter = new JsObject.jsify({"key": "value"});
var opt_extraInfoSpec = new JsObject.jsify(["extra opt"]);
// chrome.webRequest.onBeforeRequest.addListener
JsObject onBeforeRequest = context['chrome']['webRequest']['onBeforeRequest'];
onBeforeRequest.callMethod('addListener', [callback, filter, opt_extraInfoSpec]);
Logger.root.fine('main(): does callback exist: ${onBeforeRequest.callMethod('hasListener', [callback])}');
It seems to be necessary to follow the dart:js recommendations on how to use a Dart Function in the JavaScript environment exactly. I guess my problem was that the original Dart function is automatically wrapped in a proxy, so the callMethod for addListener used a different proxy object than the callMethod for hasListener, even though both of them were based on the same original Dart object (i.e. callback).
The solution is to use the JsFunction and define the callback as following:
var callback = new JsFunction.withThis((that, map) { /* some code */ });

Socket.IO server throttling a fast client

I have a server that uses socket.io and I need a way of throttling a client that is sending the server data too quickly. The server exposes both a TCP interface and a socket.io interface - with the TCP server (from the net module) I can use socket.pause() and socket.resume(), and this effectively throttles the client. But with socket.io's socket class there are no pause() and resume() methods.
What would be the easiest way of getting feedback to a client that it is overwhelming the server and needs to slow down? I liked socket.pause() and socket.resume() because they didn't require any additional code on the client side - back up the TCP socket and things naturally slow down. Any equivalent for socket.io?
Update: I provide an API to interact with the server (there is currently a python version which runs over TCP and a JavaScript version which uses socket.io). So I don't have any real control over what the client does. Which is why using socket.pause() and socket.resume() is so great - backing up the TCP stream slows the python client down no matter what it tries to do. I'm looking for an equivalent for a JavaScript client.
With enough digging I found this:
this.manager.transports[this.id].socket.pause();
and
this.manager.transports[this.id].socket.resume();
Granted this probably won't work if the socket.io connection isn't a web sockets connection, and may break in a future update, but for now I'm going to go with it. When I get some time in the future I'll probably change it to the QUOTA_EXCEEDED solution that Pascal proposed.
Here is a dirty way to achieve throttling. Although this is an old post, some people may benefit from it.
First, register a middleware:
io.on("connection", function (socket) {
socket.use(function (packet, next) {
if (throttler.canBeServed(socket, packet)) {
next();
}
});
//You other code ..
});
canBeServed is a simple throttler as seen below:
function canBeServed(socket, packet) {
    if (socket.markedForDisconnect) {
        return false;
    }
    var previous = socket.lastAccess;
    var now = Date.now();
    if (previous) {
        var diff = now - previous;
        // Check diff and disconnect if needed.
        if (diff < 50) {
            socket.markedForDisconnect = true;
            setTimeout(function () {
                socket.disconnect(true);
            }, 1000);
            return false;
        }
    }
    socket.lastAccess = now;
    return true;
}
You can use process.hrtime() instead of Date.now().
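A sketch of what that substitution might look like (process.hrtime() returns a [seconds, nanoseconds] pair, so the difference has to be converted to milliseconds yourself):

// store a high-resolution timestamp on the socket
socket.lastAccess = process.hrtime();

// later, compute the elapsed time in milliseconds
var delta = process.hrtime(socket.lastAccess);
var diffMs = delta[0] * 1000 + delta[1] / 1e6;
if (diffMs < 50) {
    // too many messages too quickly; throttle as above
}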
If you have a callback on your server somewhere which normally sends back the response to your client, you could try and change it like this:
before:
var respond = function (res, callback) {
    res.send(data);
};
after:
var respond = function (res, callback) {
    setTimeout(function () {
        res.send(data);
    }, 500); // or whatever delay you want.
};
Looks like you should slow down your clients. If one client can send too fast for your server to keep up, this is not going to go very well with 100s of clients.
One way to do this would be to have the client wait for the reply to each emit before emitting anything else. This way the server can control how fast the client can send, for example by only answering when it is ready, or only answering after a set time.
If this is not enough, once a client exceeds x requests per second, start replying with something like a QUOTA_EXCEEDED error and ignore the data they send. This will force external developers to make their apps behave the way you want.
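A rough sketch of both ideas, with illustrative event names and a hypothetical rate check (not the poster's code, just the shape of it):

// Server: ack each message when ready, and reject clients that send too fast
socket.on('data', function (payload, ack) {
    if (tooManyRequests(socket)) {   // hypothetical per-socket rate check
        return ack({ error: 'QUOTA_EXCEEDED' });
    }
    handleMessage(payload);          // hypothetical processing function
    ack({ ok: true });               // client may send the next message now
});

// Client: only emit the next message after the previous one was acknowledged
function sendNext(queue) {
    if (queue.length === 0) return;
    var message = queue[0];
    socket.emit('data', message, function (response) {
        if (response && response.error === 'QUOTA_EXCEEDED') {
            setTimeout(function () { sendNext(queue); }, 1000); // back off, retry same message
        } else {
            queue.shift();           // message accepted, move on
            sendNext(queue);
        }
    });
}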
As another suggestion, I would propose a solution like this:
It is common, for example with MySQL, to receive requests faster than they can be applied.
The server can record the requests in a database table (assuming this write is fast enough to keep up with the incoming rate) and then process the queue at a rate the server can sustain. This buffering allows the server to run slowly but still process all the requests.
But if you want something sequential, then the request callback should be acknowledged before the client can send another request. In that case there should be a server-ready flag; if the client sends a request while the flag is still red, the server can reply with a message telling the client to slow down.
Simply wrap your client emitter in a function like the one below,
let emit_live_users = throttle(function () {
    socket.emit("event", "some_data");
}, 2000);
using a throttle function like the one below:
function throttle(fn, threshold) {
    threshold = threshold || 250;
    var last, deferTimer;
    return function () {
        var now = +new Date, args = arguments;
        if (last && now < last + threshold) {
            clearTimeout(deferTimer);
            deferTimer = setTimeout(function () {
                last = now;
                fn.apply(this, args);
            }, threshold);
        } else {
            last = now;
            fn.apply(this, args);
        }
    };
}

Best way using events in node.js

What is the best approach for listening to and emitting events in node.js?
I've been testing event emitting and listening in node.js by extending a model with EventEmitter, and I'm wondering whether this approach makes sense, since the events are only listened for while there is an instance of the model.
How can I make sure events are listened for as long as the node app is alive?
Example of extending the model using EventEmitter:
// myModel.js
var util = require('util');
var events2 = require('events').EventEmitter;

var MyModel = function() {
    events2.call(this);
    // Create an event listener
    this.on('myEvent', function(value) {
        console.log('hi!');
    });
};

// inherit before adding prototype methods, otherwise util.inherits overwrites them
util.inherits(MyModel, events2);

MyModel.prototype.dummyFunction = function(params) {
    // just a dummy function.
};

module.exports = MyModel;
EDIT: A clearer way to put the question: how do I keep a permanent listener alive during the whole app execution, with global scope (something like a running event manager that listens for events produced anywhere in the app)?
Would requiring the file myModel.js in app.js be a solution? How is this kind of thing solved in node.js?
I'm not entirely sure what you mean about events only being active when there is an instance of a model since without something to listen and react to them, events cannot occur.
Having said that, it is certainly reasonable to:
util.inherits(global, EventEmitter);

global.on('myEvent', function () {
    /* do something useful */
});
which would allow you to:
global.emit('myEvent')
or even:
var ee=new EventEmitter();
ee.on('myEvent',...)
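If you want such an emitter to live for the whole app (the "permanent event manager" part of the question), one common pattern is to export a single shared EventEmitter instance from a module; Node caches modules, so every file that requires it gets the same emitter. A minimal sketch (file names are just illustrative):

// eventBus.js
var EventEmitter = require('events').EventEmitter;
module.exports = new EventEmitter();

// app.js
var bus = require('./eventBus');
bus.on('myEvent', function (value) {
    console.log('got myEvent:', value);
});

// anywhere else in the app
var bus = require('./eventBus');
bus.emit('myEvent', 42);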
As for how to properly use EventEmitter: it's defined as
function EventEmitter() {}
which does not provide for initialization, so it should be sufficient to:
var Thing=function(){};
util.inherits(Thing,EventEmitter);
which will extend instances of Thing with:
setMaxListeners(num)
emit(type,...)
addListener(type,listener) -- aliased as on()
once(type,listener)
removeListener(type,listener)
removeAllListeners()
listeners(type)
The only possible "gotcha" is that EventEmitter adds its own _events object property to any extended object (this), which suggests you should not give any of your own object properties that name, or you may see unexpected behavior.
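A minimal sketch of that gotcha (exact internals may vary between Node versions):

var util = require('util');
var EventEmitter = require('events').EventEmitter;

var Thing = function () {};
util.inherits(Thing, EventEmitter);

var t = new Thing();
t.on('ping', function () {});

// '_events' now exists as a property on t, so defining your own
// t._events elsewhere would clobber the emitter's bookkeeping
console.log('_events' in t); // true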
