Combining a Node.js net socket with Socket.IO

I have a Windows application (built in C# as a Windows service) that sends data to a Node.js net socket. Since Socket.IO keeps a web application live without the need to reload, how can I have Socket.IO stream the data received by the Node.js net socket to the web application at the exact moment the net socket receives data from C#?
Here is the code that receives the socket data from C#:
var net = require('net');

net.createServer(function (socket) {
    socket.on('data', function (data) {
        broadcast(socket.name + "> \n" + data + " \n", socket);
        socket.end("<EOF>");
        //send data to web interface, does it work that way?
        //SomeFooToSendDataToWebApp(Data)
    });
});
Furthermore, for Socket.IO I have these lines, which I can't really figure out how to deal with:
//Should it listen to net socket or web socket?
server.listen(8080);

// Loading socket.io
var io = require('socket.io').listen(server);

// It works but only for one request
io.sockets.on('connection', function (socket2) {
    socket2.emit('message', 'Message Text');
});
P.S.: I am new to Node.js and Socket.IO, so if possible, please also explain their behavior.
Edit 1: My front-end JavaScript, in case it has any problems:
//for now it listens to the http port, which Socket.IO listens to
var socket = io.connect('http://localhost:8080');
var myElement = document.getElementById("news");

socket.on('message', function (message) {
    document.getElementById("news").innerHTML = message;
});
Edit 2: I followed jfriend00's answer, since it seems my previous attempts were sending messages to an unknown socket. I needed the data to be sent to all connected clients, so only one added line fixed it!
socket.on('data', function (data) {
    broadcast(socket.name + "> \n" + data + " \n", socket);
    socket.end("<EOF>");
    //send data to web interface
    //The added code here:
    io.emit('message', data + " more string");
});

It's a bit hard to tell exactly what you're asking.
If you have some data you want to send to all connected socket.io clients (no matter where the data came from), then you can do that with:
io.emit("someMessage", dataToSend);
If you want to send to only one specific connected client, then you have to somehow get the socket object for that specific client and then do:
socket.emit("someMessage", dataToSend);
How you get the specific socket object for the desired connected client depends entirely upon how your app works and how you know which client it is. Every socket connection on the server has a socket.id associated with it. In some cases, server code uses that id to keep track of a given client (such as putting the id in the session or saving it in some other server-side data). If you have the id for a socket, you can get to the socket with the .to() method such as:
io.to(someId).emit("someMessage", dataToSend);
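For instance, here is a rough sketch of tracking clients by an application-level id so you can target one later (the userId query parameter and the Map are purely illustrative, not something from your code):

const userSockets = new Map();

io.on('connection', function (socket) {
    // however you identify the user is up to your app
    const userId = socket.handshake.query.userId;
    userSockets.set(userId, socket.id);

    socket.on('disconnect', function () {
        userSockets.delete(userId);
    });
});

// later, target just that user's socket
function sendToUser(userId, dataToSend) {
    const id = userSockets.get(userId);
    if (id) {
        io.to(id).emit("someMessage", dataToSend);
    }
}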
Your question asked about how you send data received from some C# service over a normal TCP socket. As far as sending it to a socket client, it does not matter at all where the data came from or how you received it. Once you have the data in some Javascript variable, it's all the same from there whether it came from a file, from an http request, from an incoming TCP connection in your C# service, etc... It's just data you want to send.
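Putting those pieces together, here is a minimal sketch of forwarding data from the TCP connection to all connected web clients (the port numbers and the 'message' event name are assumptions, not taken from your setup):

const net = require('net');
const http = require('http').createServer();
const io = require('socket.io')(http);

// socket.io clients (the browsers) connect here
http.listen(8080);

// the C# service connects to this plain TCP server
net.createServer(function (tcpSocket) {
    tcpSocket.on('data', function (data) {
        // once the data is in a variable, its origin no longer matters:
        // just emit it to every connected socket.io client
        io.emit('message', data.toString());
    });
}).listen(9000);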

You can try the following simple server:
const io = require('socket.io')(8080);

io.on('connection', socket => {
    console.log('client connected');
    socket.on('data', data => {
        io.emit('message', data);
    });
});

console.log('server started at port 8080');
It should work if I understand the problem correctly.
And maybe use document.getElementById("news").innerHTML += message; in the HTML client code to see what really happens there?
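A matching browser-side sketch (reusing the "news" element and the event names from the question) could look like this:

var socket = io('http://localhost:8080');

// whatever the client emits as 'data' is what the server above re-broadcasts
socket.emit('data', 'hello from the browser');

socket.on('message', function (message) {
    // append instead of replace, so every broadcast stays visible
    document.getElementById("news").innerHTML += message;
});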

socket2 refers to the client that just connected, so you can store these connections in order to send data to them later (helpful for broadcasting).
If you get the data from the Windows service via some polling mechanism, that is the step at which you can send the message to your connected clients. So keep your connections in an array so you can send specific messages to each client afterwards, as sketched below.
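A rough sketch of that idea (the 'message' event name is just an example):

var clients = [];

io.sockets.on('connection', function (socket) {
    clients.push(socket);

    socket.on('disconnect', function () {
        clients.splice(clients.indexOf(socket), 1);
    });
});

// call this whenever new data arrives from the Windows service
function pushToClients(data) {
    clients.forEach(function (client) {
        client.emit('message', data);
    });
}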

Related

Node.js with Express returns 'Connection closed before receiving a handshake response'

I have a socket server running in Node.js and I use this socket in an HTML page. This works fine, but sometimes I receive an error in the developer console like:
failed: Connection closed before receiving a handshake response. When this happens, my update does not get reflected on the user's screen. Whenever changes are made on the admin screen, I have written the logic in Laravel to store the values in Redis and I use Laravel event broadcasting; in Node.js, Socket.IO reads the Redis value change and pushes the values to the users' screens.
My Laravel code is as follows.
Laravel controller:
public function updatecommoditygroup(Request $request)
{
    $request_data = array();
    parse_str($request, $request_data);
    app('redis')->set("ssahaitrdcommoditydata", json_encode($request_data['commodity']));
    event(new SSAHAITRDCommodityUpdates($request_data['commodity']));
}
In the controller above, when the API call is received, the values are simply stored under this Redis key and the event is broadcast.
In my event class:
public $updatedata;

public function __construct($updatedata)
{
    $this->updatedata = $updatedata;
}

public function broadcastOn()
{
    return ['ssahaitrdupdatecommodity'];
}
Finally, I have written my Socket.IO file as below:
var app = require('express')();
var http = require('http').Server(app);
var io = require('socket.io')(http);
var Redis = require('ioredis');
var redis = new Redis({ port: 6379 });

redis.subscribe('ssahaitrdupdatecommodity', function(err, count) {
});

io.on('connection', function(socket) {
    console.log('A client connected');
});

redis.on('pmessage', function(subscribed, channel, data) {
    data = JSON.parse(data);
    io.emit(channel + ':' + data.event, data.data);
});

redis.on('message', function(channel, message) {
    message = JSON.parse(message);
    io.emit(channel + ':' + message.event, message.data);
});

http.listen(3001, function() {
    console.log('Listening on Port 3001');
});
When I update the data from the admin side, I pass it to the Laravel controller; the controller stores the received data in the Redis database and fires the event broadcast. The event broadcast passes the values to the socket server, and the socket server pushes the data to the client page whenever the Redis key changes.
On the client page I have written the code below:
<script src="../assets/js/socket.io.js"></script>

var socket = io('http://ip:3001/');
socket.on("novnathupdatecommodity:App\\Events\\NOVNATHCommodityUpdates", function(data) {
    //received data processing in client
});
Everything works fine most of the time, but sometimes I face an issue like
**VM35846 socket.io.js:7 WebSocket connection to 'ws://host:3001/socket.io/?EIO=3&transport=websocket&sid=p8EsriJGGCemaon3ASuh' failed: Connection closed before receiving a handshake response**
Because of this issue, the user page does not get updated with new data. Could anyone please help me solve this issue and suggest the best solution?
I think this is because your socket connection times out.
new io({
    path: /* ... */,
    serveClient: /* ... */,
    origins: /* ... */,
    pingTimeout: /* ... */,
    pingInterval: /* ... */
});
The above is the socket configuration. If you do not configure the socket, it sometimes behaves strangely. I do not know the core reason, but I too have faced similar issues that implementing the socket configuration solved (see the Socket.IO server documentation).
Similar configuration should be done on the client side; there is a timeout option there as well (see the Socket.IO client documentation).
For example, say this is your front-end code. You connect to the socket server using the following command:
io('http://ip:3001', { path: '/demo/socket' });
On your server side, when creating the connection:
const io = require("socket.io");
const http = require("http").createServer();

const socket = new io({
    path: "/demo/socket",
    serveClient: false, // whether to serve the client files (true/false)
    origins: "*", // allowed origins, i.e. lets clients connect from a different origin
    pingTimeout: 6000, // how many ms without a ping/pong from the client before the server closes that client's connection
    pingInterval: 6000 // how many ms before sending a new ping packet
});

// start the http server first, then attach the socket server (see the note below)
http.listen(3001);
socket.listen(http);
Note: to avoid complications, start your HTTP server first and then start your sockets.
There are other options available, but the above are the most common ones. I am just describing what I see in the Socket.IO configuration documentation available on GitHub. Hope this helps.
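On the client side, a corresponding sketch might look like the following (the values are illustrative, not recommendations):

// timeout: how long (in ms) to wait for the connection before giving up
// reconnectionAttempts: how many retries before firing reconnect_failed
var socket = io('http://ip:3001', {
    path: '/demo/socket',
    timeout: 10000,
    reconnectionAttempts: 5
});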

Node.js error: This socket has been ended by the other party

I have set up a simple server and client; however, whenever I close the client, it seems impossible to reconnect. Here's my client:
const net = require('net');

var client = net.connect({ port: 8080, host: '127.0.0.1' });
var response = '';

// events
client.on('data', function (chunk) { response += chunk; });
client.on('end', function () {
    console.log(response);
    client.end();
});

// main execution
client.write('test');
And here's my server:
const net = require('net');

var server = net.createServer();
server.listen(8080, '127.0.0.1');

server.on('connection', function (sock) {
    sock.on('data', function (chunk) {
        sock.write('test received');
        sock.end();
    });
});
This is just outline code representing my issue. When I execute my client the first time, everything works correctly. However, when I execute it again, the server outputs the error mentioned in the title and crashes. The same happens if I remove 'client.end()' and instead Ctrl+C out of the client program to cause it to end.
My understanding of sockets is that they represent endpoints in a stream between the client and the server. When that stream is no longer necessary (i.e., when the client has done what it needs to do), I want that stream to be completely removed. I would think that calling end() on both the client and server endpoints of that single stream would achieve this, like sending two FIN messages, but as explained it does not. The reasons I want to do this are (a) so the client file will actually finish execution, and (b) so the server no longer keeps its socket endpoint of that stream around, wasting resources listening to it.
Any insight into the source of my problem would be appreciated.
You should use the 'connect' event on the client side to be sure that you perform requests only when your socket is ready. So, in the callback for that event, you can invoke the write() function.
You can also check that the socket is not destroyed before writing:
if (!socket.destroyed) socket.write("something");
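For example, a minimal reworking of the client from the question along those lines:

const net = require('net');

const client = net.connect({ port: 8080, host: '127.0.0.1' });
let response = '';

// write only once the connection is actually established
client.on('connect', function () {
    client.write('test');
});

client.on('data', function (chunk) {
    response += chunk;
});

client.on('end', function () {
    console.log(response);
    client.end();
});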
Your server only closes the socket when data is received, and your client never sends any.

Setting up a Stateless connection with NodeJS and Socket.IO

After prototyping my project using PHP and Unity3D, I've decided to build the production version using Cordova and Node.js.
I'm currently using Socket.IO with Node.js and I'm having some confusion about connections. The way I expected this to work was the following procedure:
The client would connect to the server with a request
The server would respond to the request
The connection would be closed
However, it seems that the connection likes to stay open, and if the connection is closed, it continuously attempts to reconnect, which is not what I am looking for. I'm attempting to establish a single, stateless data transfer, similar to what happens when you make a web request to a PHP file.
The source code of the project is pretty much boilerplate code:
var application = require('express')();
var http = require('http').Server(application);
var server = require('socket.io')(http);

http.listen(8080, function() {
    console.log('Listening on *:8080');
});

server.on('connection', function(socket) {
    console.log('SERVER: A new connection has been received.');
    // listen on the individual socket for its disconnect
    socket.on('disconnect', function() {
        console.log('SERVER: A connection has been closed.');
    });
});
I do not need a persistent connection, nor do I want one.
Thoughts: I could send a close handshake from the client. For example:
Send some data to the server
Receive some data from the server
Send a close request to the server / just close the socket
Continue application logic once the socket is closed
Would this be the proper way to handle this? However, then the question arises: what if the data gets lost? Then there's a permanently open socket. Would implementing a basic timeout be ideal in this situation? (i.e., if a response isn't received within 10 seconds, there was an error or the server was not available.)
Then Socket.IO is the wrong tool for your scenario. Socket.IO needs to keep the socket open to get events from the server back to the client (and vice versa). As a matter of fact, even if the server does not support WebSockets, Socket.IO will fall back to other mechanisms, such as polling.
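If a single request/response exchange is really all you need, a plain HTTP endpoint already behaves that way: the client sends a request, the server answers, and nothing stays connected afterwards. A minimal sketch with Express (the /data route and port are made up):

const express = require('express');
const app = express();
app.use(express.json());

// one request in, one response out - no persistent connection is kept
app.post('/data', function (req, res) {
    res.json({ received: req.body });
});

app.listen(8080, function () {
    console.log('Listening on *:8080');
});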
I'm not sure why you're using Socket.IO for this. Socket.IO serves a different purpose and doesn't fit your criteria here; I have mainly seen it used in real-time applications and binary streaming. You can try a TCP socket in Node.js instead:
var net = require('net');

var HOST = '127.0.0.1';
var PORT = 6969;

// Create a server instance, and chain the listen function to it
// The function passed to net.createServer() becomes the event handler for the 'connection' event
// The sock object the callback function receives is UNIQUE for each connection
net.createServer(function(sock) {
    // We have a connection - a socket object is assigned to the connection automatically
    console.log('CONNECTED: ' + sock.remoteAddress + ':' + sock.remotePort);

    // Add a 'data' event handler to this instance of socket
    sock.on('data', function(data) {
        console.log('DATA ' + sock.remoteAddress + ': ' + data);
        // Write the data back to the socket, the client will receive it as data from the server
        sock.write('You said "' + data + '"');
    });

    // Add a 'close' event handler to this instance of socket
    sock.on('close', function(data) {
        console.log('CLOSED: ' + sock.remoteAddress + ' ' + sock.remotePort);
    });
}).listen(PORT, HOST);

console.log('Server listening on ' + HOST + ':' + PORT);
See the Node.js net module documentation for more details.
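And a matching client sketch for the request/response/close flow described in the question (the 10-second timeout covers the "what if the data gets lost" case):

var net = require('net');

var client = net.connect({ port: 6969, host: '127.0.0.1' }, function () {
    client.write('hello from the client');
});

// if nothing arrives within 10 seconds, give up and destroy the socket
client.setTimeout(10000, function () {
    console.log('No response, closing.');
    client.destroy();
});

client.on('data', function (data) {
    console.log('Server replied: ' + data);
    client.end(); // close our side once the response is in
});

client.on('close', function () {
    console.log('Connection closed, continuing application logic.');
});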

Socket.io with multiple Node.js hosts, emit to all clients

I am new to Socket.io and trying to get my head around the best approach to solve this issue.
We have four instances of a Node.js app running behind a load balancer.
What I am trying to achieve is for another app to POST some data to the load balancer URL, which will hand it off to one of the instances.
The receiving instance will store the data, then use Socket.io to emit the data to the connected clients.
The issue is that a browser/client can only be connected to a single instance at a time.
I am trying to determine if there is a way to emit to all clients at once?
Or have the clients connect to multiple servers using io.connect?
Or is this a case for Redis?
Publish/subscribe is what you need here. Redis will give you the functionality you're looking for out of the box. You just need to create a Redis client and subscribe to an update channel on each of your app server nodes. Then, publish the update when a POST is successful (or whatever). Finally, have the Redis client subscribe to the update channel and, on each message, emit a Socket.IO event:
(truncated for brevity)
var express = require('express')
  , http = require('http')
  , socketio = require('socket.io')
  , redis = require('redis')
  , rc = redis.createClient()   // subscriber connection
  , pub = redis.createClient()  // separate connection for publishing
  ;

var app = express();
var server = http.createServer(app);
var io = socketio.listen(server);
server.listen(3000);

app.post('/targets', function(req, res) {
    // publish the posted data so every app server instance sees it
    pub.publish('update', JSON.stringify(req.body));
    res.sendStatus(200);
});

io.sockets.on('connection', function(socket) {
    // put every client in the 'update' room so the emit below reaches them
    socket.join('update');
});

rc.on('connect', function() {
    // subscribe to the update channel
    rc.subscribe('update');
});

rc.on('message', function(channel, msg) {
    // util.log('Channel: ' + channel + ' msg: ' + msg);
    msg = JSON.parse(msg);
    io.sockets.in('update').emit('message', {
        channel: channel,
        msg: msg
    });
});
Then in the JS app, listen for that emitted message:
socket.on('message', function(data) {
    debugger;
    // do something with the updated data
});
Of course, introducing this new Redis server adds another single point of failure. A more robust implementation might use a message broker with AMQP or ZeroMQ, or some similar networking library that provides pub/sub capabilities.

How can I send packets between the browser and server with socket.io, but only when there is more than one client?

In my normal setup, the client will emit data to my server regardless of whether or not there is another client to receive it. How can I make it so that it only sends packets when the user count is greater than 1? I'm using Node with Socket.IO.
To do this you would want to listen for the connection event on your server (as well as disconnect) and maintain a list of connected clients in a 'global' variable. When more than one client is connected, send out a message to all connected clients to let them know they can start sending messages, like so:
var app = require('express').createServer(),
    io = require('socket.io').listen(app);

app.listen(80);
//setup express

var clients = [];

io.sockets.on('connection', function (socket) {
    clients.push(socket);
    if (clients.length > 1) {
        io.sockets.emit('start talking');
    }

    socket.on('disconnect', function () {
        var index = clients.indexOf(socket);
        clients = clients.slice(0, index).concat(clients.slice(index + 1));
        if (clients.length <= 1) {
            io.sockets.emit('quiet time');
        }
    });
});
Note: I'm assuming here that the socket is passed to the disconnect event; I'm pretty sure it is, but I haven't had a chance to test.
The disconnect event won't receive the socket passed into it, but because the event handler is registered within the closure scope of the initial connection, you will have access to it.
