Node.js Express framework reads the same event multiple times

I'm working with a service (WSO2CEP) that sends events to a Node.js program that I developed, let's call it receiver.js, which then stores these events in a MongoDB. The communication between WSO2CEP and receiver.js happens over an HTTP connection. The problem I'm facing is that when WSO2 sends an event, receiver.js captures it and stores it in the db, and then, a few seconds/minutes later, it detects that a new event has arrived, which is not true, and stores it again in the db. This second event is identical to the first one.
When I saw that I thought the problem was that WSO2 was sending the same event multiple times, but I've debugged it and I'm 100% sure that only one event is being sent, so the problem seems to be the HTTP connection.
The HTTP connection is handled by receiver.js acting as a server and WSO2 as a client, which sends the events through HTTP POST requests. The HTTP server in receiver.js is implemented with the "express" framework, as can be seen in the code chunk below.
'use strict';

const express = require('express');
const bodyParser = require('body-parser');
const EventEmitter = require('events');

const port = Whatever;

module.exports = class WSO2Server extends EventEmitter {
    constructor () {
        super();
        const app = express();
        app.use(bodyParser.json()); // to support JSON-encoded bodies
        app.route('/Whatever').post( (req, res) => {
            let event = req.body;
            this.emit('event', event);
        });
        this.server = app.listen(port);
    }

    destroy () {
        this.server.close();
    }
}
I suspect that the events are being stored in a queue (or something similar) and are being retransmitted every so often. Any idea about that? Thank you.

Looking at your code, I can't see you using the response object at all. After you've called this.emit('event', event); you should call something like res.status(201).end(); which will dispatch an HTTP status 201 back to the calling client.
Because you're not setting any information on the response object, your application hangs and never responds to the HTTP call. Thus something like nginx or Apache is re-issuing the request to your application after a specific timeout.
If you explicitly end the response with something like res.status(201).end(); then your request will end correctly and a duplicate call will not be made.
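A minimal sketch of what the fixed handler could look like, keeping the route, body, and emit from the question (the 201 status is just one reasonable choice):

app.route('/Whatever').post((req, res) => {
    let event = req.body;
    this.emit('event', event);
    // Ending the response tells the client the event was accepted,
    // so nothing upstream retries the request after a timeout.
    res.status(201).end();
});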

Related

Socket.IO limiting to only 6 connections in Node.js

So I came across a problem. I am trying to send {id} to my REST API (Node.js) and in response I get data on the socket.
Problem:
For the first 5-6 times it works perfectly fine: it displays the id and sends data back to the socket. But after the 6th time it does not get the id.
I tried https://github.com/socketio/socket.io/issues/1145 but it didn't solve the problem.
On recompiling the server it shows the previous {ids} which I entered after the 6th time. It's like after 5-6 times it is storing the id in some form of cache.
Here is my API route.
//this route only gets {id} 5-6 times. After 5-6 times it does not display the received {id}.
const express = require("express");
var closeFlag = false;
const PORT = process.env.SERVER_PORT; //|| 3000;
const app = express();
var count = 1;
http = require('http');
http.globalAgent.maxSockets = 100;
http.Agent.maxSockets = 100;
const serverTCP = http.createServer(app)
// const tcpsock = require("socket.io")(serverTCP)
const tcpsock = require('socket.io')(serverTCP, {
cors: {
origin: '*',
}
, perMessageDeflate: false
});
app.post("/getchanneldata", (req, res) => {
console.log("count : "+count)
count++;// for debugging purpose
closeFlag = false;
var message = (req.body.val).toString()
console.log("message : "+message);
chanId = message;
client = dgram.createSocket({ type: 'udp4', reuseAddr: true });
client.on('listening', () => {
const address = client.address();
});
client.on('message', function (message1, remote) {
var arr = message1.toString().split(',');
}
});
client.send(message, 0, message.length, UDP_PORT, UDP_HOST, function (err, bytes) {
if (err) throw err;
console.log(message);
console.log('UDP client message sent to ' + UDP_HOST + ':' + UDP_PORT);
// message="";
});
client.on('disconnect', (msg) => {
client.Diconnected()
client.log(client.client)
})
}
);
There are multiple issues here.
In your app.post() handler, you don't send any response to the incoming HTTP request. That means that when the browser (or any client) sends a POST to your server, the client sits there waiting for a response, but that response never comes.
Meanwhile, the browser has a limit on how many requests it will send simultaneously to the same host (I think Chrome's limit is, coincidentally, 6). Once you hit that limit, the browser queues the request and waits for one of the previous connections to return its response before sending another one. Eventually (after a long time), those connections will time out, but that takes a while.
So, the first thing to fix is to send a response in your app.post() handler, even if it's just res.send("ok");. That will allow the 7th, 8th, and subsequent requests to be sent to your server immediately. Every incoming HTTP request should get a response sent back to it; even if you have nothing to send, just do a res.end(). Otherwise, the HTTP connection is left hanging, consuming resources and waiting to eventually time out.
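A minimal sketch of that first fix, with the UDP work left where it is (the express.json() middleware and the fallback port here are assumptions, not part of the original code):

const express = require("express");
const app = express();
app.use(express.json()); // needed for req.body.val to be populated

app.post("/getchanneldata", (req, res) => {
    // ... existing UDP work from the question goes here ...
    // Always answer the HTTP request, even with nothing useful to return,
    // so the browser frees up the connection immediately.
    res.send("ok");
});

app.listen(process.env.SERVER_PORT || 3000);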
On a separate note, your app.post() handler contains this:
client = dgram.createSocket({ type: 'udp4', reuseAddr: true });
This has a couple issues. First, you never declare the variable client so it becomes an implicit global (which is really bad in a server). That means successive calls to the app.post() handler will overwrite that variable.
Second, it is not clear from the included code when, if ever, you close that udp4 socket. It does not appear that the server itself ever closes it.
Third, you're recreating the same UDP socket on every single POST to /getchanneldata. Is that really the right design? If your server receives 20 of these requests, it will open up 20 separate UDP connections.
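And a hedged sketch of how the UDP part could be scoped and cleaned up per request (UDP_HOST and UDP_PORT are assumed to be defined elsewhere, as in the question, and whether you really want one socket per request is the design question above):

const dgram = require("dgram");

app.post("/getchanneldata", (req, res) => {
    const message = String(req.body.val);
    // const keeps the socket local to this request instead of an implicit global
    const client = dgram.createSocket({ type: "udp4", reuseAddr: true });

    client.on("message", (reply) => {
        // Use the reply, then release both the socket and the HTTP request.
        client.close();
        res.send(reply.toString());
    });

    client.send(message, 0, message.length, UDP_PORT, UDP_HOST, (err) => {
        if (err) {
            client.close();
            res.status(500).end();
        }
    });
});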

How do I send a message to a specific user with the ws library?

I'm exploring different WebSocket libraries for self-learning and I found that the ws library (ws-node) is really amazing. I'm building a basic 1-on-1 chat with it.
My question is: what is the equivalent in ws of the socket.io call socket.to().emit()? I want to send a message to a specific user.
Frontend - socket.io
socket.emit("message", { message: "my name is dragon", userID: "123"});
Serverside - socket.io
// listening on Message sent by users
socket.on("message", (data) => {
    // Send to a specific user for 1 on 1 chat
    socket.to(data.userID).emit(data.message);
});
WS - backend
const express = require('express');
const http = require('http');
const WebSocket = require('ws');

const app = express();
const server = http.createServer(app);
const wss = new WebSocket.Server({ server });

wss.on('connection', (ws) => {
    ws.on('message', (data) => {
        // I can't give it an extra parameter so that I can listen on the client side,
        // and how do I send to a specific user?
        ws.send(`Hello, you sent -> ${data.message}`);
    });
});
Honestly, the best approach is to abstract away the WebSocket using a pub/sub service.
The issue with client<=(server)=>client communication using WebSockets is that client connections are specific to the process (and machine) that "owns" the connection.
The moment your application expands beyond a single process (i.e., due to horizontal scaling requirements), the WebSocket "collection" becomes irrelevant at best. The array / dictionary in which you stored all your WebSocket connections now only stores some of the connections.
The correct approach would be to use pub/sub, perhaps with something like Redis.
This allows every User to "subscribe" to a private "channel" (or "subject"). Users can subscribe to more than one "channel" (for example, a global notification channel).
To send a private message, another user "publishes" to that private "channel" - and that's it.
The pub/sub service routes the messages from the "channels" to the correct subscribers - even if they don't share the same process or the same machine.
This allows a client connected to your server in Germany to send a private message to a client connected to your server in Oregon (USA) without anyone being worried about the identity of the server / process that "owns" the connection.
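A rough sketch of that pub/sub idea with ws and Redis (ioredis is just one possible client, and getUserIdSomehow() is a hypothetical hook for however you authenticate the connection):

const WebSocket = require('ws');
const Redis = require('ioredis');

const wss = new WebSocket.Server({ port: 8080 });
const pub = new Redis();

wss.on('connection', (ws, req) => {
    const userId = getUserIdSomehow(req); // hypothetical: resolve the user from the request
    const sub = new Redis();

    // Every user listens on a private channel; any process can publish to it.
    sub.subscribe(`user:${userId}`);
    sub.on('message', (channel, payload) => {
        if (ws.readyState === WebSocket.OPEN) ws.send(payload);
    });

    ws.on('message', (data) => {
        const { userID, message } = JSON.parse(data);
        // Publishing reaches the recipient even if another process owns their socket.
        pub.publish(`user:${userID}`, message);
    });

    ws.on('close', () => sub.quit());
});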
There isn't an equivalent method. socket.io comes with a lot of helpers and functionality that make your life easier, such as rooms, events, and so on.
socket.io is a realtime application framework, while ws is just a bare WebSocket implementation.
You will need to make your custom wrapper:
const sockets = {};

function to(user, data) {
    if (sockets[user] && sockets[user].readyState === WebSocket.OPEN)
        sockets[user].send(data);
}

wss.on('connection', (ws) => {
    const userId = getUserIdSomehow(ws);
    sockets[userId] = ws;
    ws.on('message', function incoming(message) {
        // Or get the user in here
    });
    ws.on('close', function incoming(message) {
        delete sockets[userId];
    });
});
And then use it like this:
to('userId', 'some data');
In my opinion, if you need that functionality, you should use socket.io, which is easy to integrate, has a lot of support, and has client libraries for multiple languages.
If your front-end uses socket.io, you must use it on the server too.

Express server doesn't close

I'm creating an Express server with reloadable endpoints.
To make this possible I created an endpoint for such a purpose.
But when I call server.close() on the Express HTTP server, it keeps serving requests even though server.listening says it has stopped.
Here is a simplified version of my script (Not working fully, but you get the gist):
class simpleServer {
    constructor() {
        let express = require('express');
        this.app = express();
        this.app.get('/reload', this.reload);
        this.server = this.app.listen(3000);
    }

    reload(req, res) {
        console.log('Closing server');
        this.server.close(function() {
            console.log('Closed server');
        });
        // Re-init & stuff
        res.json({
            message: 'Reloaded'
        });
    }
}
let server = new simpleServer();
When I call the endpoint, the server outputs 'Closing server', but 'Closed server' takes a long time (around 5 minutes) to appear. And when I reload the page, the server still responds, even though server.listening is equal to false.
I'm using Node.js version 6.0.0 with Express version 4.14.0.
Some updates:
I fixed the issue by calling req.destroy() after sending the response; does this have any side effects, though?
A cleaner fix would be to keep a record of the current connections and close those in the reload function, instead of destroying every request immediately. This will probably be lighter if you are under higher load.
When you call .close(), it only stops the server from accepting new connections; it does not terminate existing connections.
The reason it may take some time to actually close is that existing connections may have set Connection: keep-alive and are being kept open for further requests.
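A sketch of that connection-tracking idea (assuming server is the http.Server returned by app.listen(); the names here are made up):

const sockets = new Set();

// Track every raw TCP connection the HTTP server accepts.
server.on('connection', (socket) => {
    sockets.add(socket);
    socket.on('close', () => sockets.delete(socket));
});

function forceClose(callback) {
    // Stop accepting new connections, then drop the keep-alive ones
    // so close() can actually finish.
    server.close(callback);
    for (const socket of sockets) {
        socket.destroy();
    }
}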
You can use process.exit().
Or you can try
this.server.close(function() {
    console.log('Closed server');
});

Socket.IO & private messages

This must have been asked already a thousand times, but I do not find any of the answers satisfying, so I'll try having another go, being as clear as possible.
I am starting out with a clean Express app, the kind that is usually generated via the following terminal commands:
user$ express
user$ npm install
Then I proceed to install socket.io, like this:
user$ npm install socket.io --save
In my main.js file I then have the following:
//app.js
var express = require('express'),
    http = require('http'),
    path = require('path'),
    io = require('socket.io'),
    routes = require('./routes');

var app = express();
I start my socket.io server by attaching it to my express one:
//app.js
var server = http.createServer(app).listen(app.get('port'), function(){
    console.log('express server started!');
});
var sIo = io.listen(server);
What I do now is to set the usual routes for Express to work with:
//app.js
app.get('/', routes.index);
app.get('/send/:recipient/:text', routes.sendMessage);
Now, since I like to keep things organized, I want to put my socket.io code in another file, so instead of using the usual code:
//app.js
sIo.sockets.on('connection', function(socket){
    console.log('got a connection');
});
I use the following, to be able to access both the socket and the sIo object (as that object contains all the connection info, which is important):
//app.js
sIo.sockets.on('connection', function(socket){
    routes.connection(sIo, socket);
});

// index.js (./routes)
exports.connection = function(sIo, socket){
    console.log('got a connection.');
};
This way I can do all my socket.io work in here. I know that I can now access all my clients' information through the sIo object, but of course it does not contain any of their session data.
My questions now are the following:
Suppose a user makes an HTTP request to send a message and the handler in my routes is like this:
exports.sendMessage = function(req, res){
    //do stuff here
};
How can I get this to "fire" something in my socket.io to send a message? I do not want to know all the underlying work that needs to be done, like keeping track of messages, users, etc. I only want to understand how to "fire" socket.io to do something.
How can I make sure that socket.io sends the message only to one particular person, and be 100% sure that nobody else gets it? From what I can see, there is no way to get the session info from the sIo object.
Thanks in advance.
Question one: The cleanest way to separate the two would probably be to use an EventEmitter. You create an EventEmitter that emits whenever an HTTP message comes in. You can pass session information along with the event to tie it back to the user who sent the message, if necessary.
// index.js (./routes)
var EventEmitter = require('events').EventEmitter;

var messageEmitter = module.exports.messageEmitter = new EventEmitter();

module.exports.sendMessage = function(req, res) {
    messageEmitter.emit('private_message', req.params.recipient, req.params.text);
};
Question two: You can access the socket when the initial connection is made. Here is an example, mostly borrowed from this answer:
var connect = require('connect'),
    userMap = {};

routes.messageEmitter.on('private_message', function(recipient, text) {
    userMap[recipient].emit('private_message', text);
});

io.on('connection', function(socket_client) {
    var cookie_string = socket_client.request.headers.cookie;
    var parsed_cookies = connect.utils.parseCookie(cookie_string);
    var connect_sid = parsed_cookies['connect.sid'];
    if (connect_sid) {
        session_store.get(connect_sid, function (error, session) {
            userMap[session.username] = socket_client;
        });
    }
    socket_client.on('private_message', function(username, message) {
        userMap[username].emit('private_message', message);
    });
});
So we're just creating a map between a session's username and a socket connection. Now whenever you need to send a message, you can easily look up which socket is associated with that user and send the message over their socket. Just make sure to handle disconnects, reconnects, connecting from multiple tabs, and so on; one way of doing that is sketched below.
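As a sketch of the multi-tab / disconnect part, one option is to keep an array of sockets per username instead of a single socket (this replaces the single-socket userMap above; registerSocket would be called from inside the session_store.get callback, and the names are assumptions):

var userMap = {}; // username -> array of sockets, one per open tab

function registerSocket(username, socket) {
    (userMap[username] = userMap[username] || []).push(socket);
    // Drop the socket from the list when that tab disconnects.
    socket.on('disconnect', function () {
        userMap[username] = userMap[username].filter(function (s) {
            return s !== socket;
        });
    });
}

function sendToUser(username, event, data) {
    // Deliver to every open tab the user currently has.
    (userMap[username] || []).forEach(function (s) {
        s.emit(event, data);
    });
}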
I have built something like what you are describing. If a user can make a socket request, the client pushes the message via the socket and the server then broadcasts or emits it. But if a user can't connect to the socket, the client falls back to an HTTP POST, like what you are describing with sendMessage. What I have done, rather than having sendMessage fire off a socket event, is to also have my clients do an AJAX request every 5 seconds or so. That brings back new messages, and if any of them were not received via socket.io, I add them to my client-side array. This acts as a kind of safety net, so I don't have to fully trust socket.io.
See below in pseudocode.
client

if canSendSocketMessage()
    sendSocketMessage(message)
else
    sendAjaxMessage(message)

setInterval( ->
    // ajax call
    getNewMessages()
), 5000

server

// socket stuff
socket.on 'message' ->
    saveMessage()
    socket.emit(message)

// ajax endpoints
app.post 'sendMessage'
    saveMessage()

app.get 'getNewMessages'
    res.send getNewMessages()
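In concrete (but hedged) terms, the server half of that pseudocode might look roughly like this in Express + socket.io, with saveMessage() and getNewMessages() standing in for your own persistence layer:

// socket path: save, then push to everyone else
io.on('connection', function (socket) {
    socket.on('message', function (message) {
        saveMessage(message);
        socket.broadcast.emit('message', message);
    });
});

// HTTP fallback for clients that could not open a socket
app.post('/sendMessage', function (req, res) { // assumes a JSON body parser is mounted
    saveMessage(req.body);
    res.end();
});

app.get('/getNewMessages', function (req, res) {
    res.json(getNewMessages());
});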

Get SESSIONID in nodeJS

Now, after some hours of playing around with Node.js and socket.io, I'm getting a couple more problems. One is that I need to get the session ID inside Node.js without using app.get('/', ..., since that event doesn't seem to fire when socket.io connects; only .on('connection', function( ... fires.
var express = require('express')()
express.set('port', process.env.PORT || 8080)

var server = require('http').createServer(express)
var socket = require('socket.io').listen(server)

server.listen(express.get('port'))

// this event is fired, get('/', ... isn't!
server.on('connection', function(stream) {
    // ??
})
The session is initially created by the PHP application, as the user logs in. Session data is stored in the database, and the key I need to access that data is the session ID. What's the easiest way to get to it? As mentioned, I found a couple of examples that used app.get('/', ... but I couldn't get that event to fire at all.
Thanks.
If the session data is being stored in a cookie (most likely), then you should be able to re-parse that data during the socket handshake. I posted code for that in this answer, but have copied it here as well:
io.configure(function () {
    io.set('authorization', function (handshakeData, callback) {
        var cookie = handshakeData.headers.cookie;
        // parse the cookie to get user data...
        // the second argument to the callback decides whether to authorize the client
        callback(null, true);
    });
});
If the session data is being propagated in the URL, then you may be able to gather this information from handshakeData.url or handshakeData.query. You'll probably have to do your own experimentation.
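As a sketch of that parsing step for this PHP setup (the cookie module, the PHPSESSID cookie name, and sessionStore.get() standing in for your own database lookup are all assumptions):

var cookie = require('cookie');

io.set('authorization', function (handshakeData, callback) {
    var cookies = cookie.parse(handshakeData.headers.cookie || '');
    var sid = cookies['PHPSESSID']; // or whatever name PHP gave the session cookie
    if (!sid) return callback(null, false);

    // Look the session up in the shared database by its id.
    sessionStore.get(sid, function (err, session) {
        if (err || !session) return callback(null, false);
        handshakeData.session = session; // make it available after the handshake
        callback(null, true);
    });
});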
