How can I write a simple stream which intercepts messages?
For example, say I want to log (or eventually transform) a message being sent over the wire by a user's socket.write(...) call.
Following is a minimal program which attempts to do this:
const net = require('net');
const stream = require('stream');
const socket = new net.Socket();
const transformer = new stream.Transform({
  transform(chunk, e, cb) {
    console.log("OUT:" + chunk.toString());
    cb();
  }
});
//const client = socket.pipe(transformer); // <= prints "OUT:" on client, but nothing on server
const client = transformer.pipe(socket); // <= prints nothing on client, but "hello world" on server
socket.on('data', (data)=>{ console.log("IN:"+data.toString()); });
socket.connect(1234, 'localhost', ()=>{ client.write("hello world"); });
When I do socket.pipe(transformer), the client prints "OUT:" (like I want), but doesn't actually send anything to the server. When I swap the pipe locations, transformer.pipe(socket), nothing gets printed to the client but the message gets sent to the server.
Although not shown here, I also tried using a Writable stream, which does print the message on the client, but nothing is ever sent to the server (even if I call this.push(...) inside the Writable stream, it still doesn't seem to reach the server).
What am I missing here?
EDIT: Reformatted the code for clarity and updated the text
It looks like I needed to change the following line
socket.connect(1234, 'localhost', ()=>{ client.write("hello world"); });
to this
socket.connect(1234, 'localhost', ()=>{ transformer.write("hello world"); });
This is based on @Mr.Phoenix's comment. I expected .pipe() to return a new stream which I could use. I believe that is how Java's Netty framework does it, and I kept expecting Node streams to work the same way.
You're not writing any data out of the stream.
You need to either call this.push(chunk) or change the call to cb to cb(null, chunk).
See the docs about implementing transform streams for more info.
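For reference, here is a minimal sketch (not from the original answer) of the fully corrected version, combining the transformer.write(...) change from the edit above with passing the chunk along via cb(null, chunk); the port, host, and message are the question's own placeholders:

const net = require('net');
const stream = require('stream');

const socket = new net.Socket();

// Pass the chunk along after logging it, otherwise nothing reaches the socket.
const transformer = new stream.Transform({
  transform(chunk, encoding, cb) {
    console.log("OUT:" + chunk.toString());
    cb(null, chunk); // equivalent to this.push(chunk); cb();
  }
});

transformer.pipe(socket); // transformer -> socket -> wire

socket.on('data', (data) => { console.log("IN:" + data.toString()); });

// Write to the transformer, not to the return value of pipe()
// (pipe() returns its destination, here the socket itself).
socket.connect(1234, 'localhost', () => { transformer.write("hello world"); });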
Related
I'm trying to create a print server written with Electron and Node.js.
My goal is to capture the body of a print job sent from a POS to an Epson thermal printer.
As far as I understand from Epson's documentation, the printer communicates on TCP port 9100 and UDP port 3289 by default.
So I created a socket server listening on the TCP port with the "net" module.
The connection is established successfully and I also receive some Buffer data.
My question for now is: how can I decode this buffer? It doesn't seem possible to decode it with the default encodings Node.js supports.
Or would you recommend using a virtual printer that prints to a file, and then trying to read the data from it?
Which modules or virtual printers are recommended?
I've already searched for quite a while without finding any useful results.
Here is my current code from the net server:
var net = require('net');

var server = net.createServer(function(socket) {
  socket.setEncoding('utf8')
  socket.on('data', function(buffer) {
    var decoded = buffer
    console.log(decoded)
  })
  socket.on('end', socket.end)
});
server.on('connection', handleConnection);
server.listen(9100, function() {
  console.log('server listening to %j', server.address());
});

function handleConnection(conn) {
  var remoteAddress = conn.remoteAddress + ':' + conn.remotePort;
  console.log('new client connection from %s', remoteAddress);
  conn.on('data', onConnData);
  conn.once('close', onConnClose);
  conn.on('error', onConnError);
}
OK, I've got this running.
The problem was that the POS (cash register) system first sends a request for the printer status ("DLE EOT n").
So I responded to the POS with the corresponding status byte (0x16).
Afterwards the POS sent the print job, which I decoded from CP437 to UTF-8 so my script could read the incoming print request.
Hope this post helps anyone developing something similar (kitchen monitors, print servers, etc.), as I found very little information on the web about this topic.
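Not part of the original answer, but a minimal sketch of how this could look, assuming the status request arrives as the raw ESC/POS bytes 0x10 0x04 n (DLE EOT n) and using the iconv-lite package for the CP437 decoding (an assumption; any CP437-capable decoder would do):

var net = require('net');
var iconv = require('iconv-lite'); // assumed dependency for CP437 decoding

var server = net.createServer(function(socket) {
  socket.on('data', function(buffer) {
    // DLE EOT n (0x10 0x04 n) is the real-time status request.
    if (buffer.length >= 2 && buffer[0] === 0x10 && buffer[1] === 0x04) {
      // Reply with the status byte mentioned above (0x16).
      socket.write(Buffer.from([0x16]));
      return;
    }
    // Everything else is treated as the print job body: decode CP437 -> UTF-8.
    var text = iconv.decode(buffer, 'cp437');
    console.log(text);
  });
});

server.listen(9100);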
I have a Windows application (built in C# as a Windows service) that sends data to a Node.js net socket. Since Socket.IO makes a web application live, without the need to reload, how can I have Socket.IO stream the data received on the Node.js net socket to the web application at the exact moment the net socket receives it from C#?
So in the code that receives the socket data from C#:
var net = require('net');
net.createServer(function (socket) {
  socket.on('data', function (data) {
    broadcast(socket.name + "> \n" + data + " \n", socket);
    socket.end("<EOF>");
    // send data to web interface, does it work that way?
    // SomeFooToSendDataToWebApp(Data)
  });
});
Furthermore, for Socket.IO I have these lines, which I can't really figure out how to deal with:
//Should it listen to net socket or web socket?
server.listen(8080);
// Loading socket.io
var io = require('socket.io').listen(server);
// It works but only for one request
io.sockets.on('connection', function (socket2) {
  socket2.emit('message', 'Message Text');
});
P.S.: I am new to Node.js and Socket.IO, so if possible, please also explain their behavior.
Edit 1: My front-end JavaScript, to check if it has any problems:
//for now it listens to http port , which Socket.IO listens to
var socket = io.connect('http://localhost:8080');
var myElement = document.getElementById("news");
socket.on('message', function(message) {
  document.getElementById("news").innerHTML = message;
})
Edit 2: I followed jfriend00's answer, as it seems my previous attempts were sending messages to an unknown socket. I only added the io.emit line below, since I needed the data to be sent to all connected clients, so a single line fixed it!
socket.on('data', function (data) {
  broadcast(socket.name + "> \n" + data + " \n", socket);
  socket.end("<EOF>");
  // send data to web interface
  // The added code here:
  io.emit('message', data + " more string");
});
It's a bit hard to tell exactly what you're asking.
If you have some data you want to send to all connected socket.io clients (no matter where the data came from), then you can do that with:
io.emit("someMessage", dataToSend);
If you want to send to only one specific connected client, then you have to somehow get the socket object for that specific client and then do:
socket.emit("someMessage", dataToSend);
How you get the specific socket object for the desired connected client depends entirely upon how your app works and how you know which client it is. Every socket connection on the server has a socket.id associated with it. In some cases, server code uses that id to keep track of a given client (such as putting the id in the session or saving it in some other server-side data). If you have the id for a socket, you can get to the socket with the .to() method such as:
io.to(someId).emit("someMessage", dataToSend);
Your question asked about how you send data received from some C# service over a normal TCP socket. As far as sending it to a socket client, it does not matter at all where the data came from or how you received it. Once you have the data in some Javascript variable, it's all the same from there whether it came from a file, from an http request, from an incoming TCP connection in your C# service, etc... It's just data you want to send.
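As an illustration (not from the original answer), here's a minimal sketch of that idea applied to this question's setup, assuming the C# service connects to a plain TCP server on a made-up port 5000 while Socket.IO serves browsers on port 8080:

var net = require('net');
var io = require('socket.io')(8080); // browsers connect here

// Plain TCP server for the C# Windows service (port 5000 is just an example).
net.createServer(function (socket) {
  socket.on('data', function (data) {
    // Forward whatever the service sent to every connected browser.
    io.emit('message', data.toString());
  });
}).listen(5000);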
You can try the following, simple server:
const io = require('socket.io')(8080);
io.on('connection', socket => {
  console.log('client connected');
  socket.on('data', data => {
    io.emit('message', data);
  });
});
console.log('server started at port 8080');
It should work if I understand the problem correctly.
And maybe use document.getElementById("news").innerHTML += message; in the HTML client code to see what really happens there?
socket2 refers to the client that just connected. So you can store these connections in order to send data to them later (helpful for broadcasting).
If you get data from the Windows service via some polling mechanism, at that point you can send the message to your connected clients. So keep your connections in an array so you can send specific messages to each client afterwards.
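A small sketch of that suggestion (not from the original answer; the clients array and sendToAll helper are made-up names): keep every connected client in an array, remove it on disconnect, and loop over the array when new data arrives from the service.

var clients = [];

io.sockets.on('connection', function (socket) {
  clients.push(socket);
  socket.on('disconnect', function () {
    clients.splice(clients.indexOf(socket), 1);
  });
});

// Later, when the Windows service delivers data:
function sendToAll(data) {
  clients.forEach(function (client) {
    client.emit('message', data);
  });
}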
I'm building a realtime visualization using Redis as a pub/sub messenger between Python and Node. There's a Python script always running which sets a Redis hash with HMSET. That side of the app is working fine: if I enter the example command "HGETALL 'sellers-80183917'" in a Redis client, I get the proper data.
The problem is in the js side. I'm using socketio and redis nodejs libraries to listen to the redis instance and publish the results online through a d3js viz.
I run the following code with node:
var express = require('express');
var app = express();
var redis = require('redis');
app.use(express.static(__dirname + '/public'));
var http = require('http').Server(app);
var io = require('socket.io')(http);
var sredis = require('socket.io-redis');
io.adapter(sredis({ host: 'localhost', port: 6379 }));
redisSubscriber = redis.createClient(6379, 'localhost', {});
redisSubscriber.on('message', function(channel, message) {
  io.emit(channel, message);
});
app.get('/sellers/:seller_id', function(req, res){
  var seller_id = req.params.seller_id;
  redisSubscriber.subscribe('sellers-'.concat(seller_id));
  res.render( 'seller.ejs', { seller:seller_id } );
});
http.listen(3000, '127.0.0.1', function(){
  console.log('listening on *:3000');
});
And this is the relevant part of the seller.ejs file that's receiving the user requests and outputting the viz:
var socket = io('http://localhost:3000');
var stats;
var seller_key = 'sellers-'.concat(<%= seller %>);
socket.on(seller_key, function(msg){
  stats = [];
  console.log('Im in');
  var seller = $.parseJSON(msg);
  var items = seller['items'];
  for(item in items) {
    var item_data = items[item];
    stats.push({'title': item_data['title'], 'today_visits': item_data['today_visits'], 'sold_today': item_data['sold_today'], 'conversion_rate': item_data['conversion_rate']});
  }
  setupData(stats);
});
The problem is that the socket.on() callback never receives anything, and I don't see where the problem is, as everything else seems to be working fine.
I think that you might be confused as to what Pub/Sub in Redis actually is. It's not a way to listen to changes on hashes; you can have a Pub/Sub channel called sellers-1, and you can have a hash with the key sellers-1, but those are unrelated to each other.
As documented here:
Pub/Sub has no relation to the key space.
There is a thing called keyspace notifications that can be used to listen to changes in the key space (through Pub/Sub channels); however, this feature isn't enabled by default because it'll take up more resources.
Perhaps an easier method would be to publish a message after the HMSET, so any subscribers would know that the hash got changed (they would then retrieve the hash contents themselves, or the published message would contain the relevant data).
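As an illustration of that approach (not part of the original answer; the channel name sellers-updates is made up, and since the question's producer is a Python script, the publishing step there would be the Python equivalent of the commands shown in the comment):

var redis = require('redis');

// A connection in subscriber mode can't run normal commands,
// so use a separate client for reading the hash.
var sub = redis.createClient(6379, 'localhost');
var db  = redis.createClient(6379, 'localhost');

sub.subscribe('sellers-updates');
sub.on('message', function (channel, sellerKey) {
  // The published message carries the key of the hash that changed.
  db.hgetall(sellerKey, function (err, hash) {
    if (err) throw err;
    console.log('hash %s changed:', sellerKey, hash);
  });
});

// The producer (Python in the question) would do the equivalent of:
//   HMSET sellers-80183917 field value ...
//   PUBLISH sellers-updates sellers-80183917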
This brings us to the next possible issue: you only have one subscriber connection, redisSubscriber.
From what I understand from the Node.js Redis driver, calling .subscribe() on such a connection would remove any previous subscriptions in favor of the new one. So if you were previously subscribed to the sellers-1 channel and subscribe to sellers-2, you wouldn't be receiving messages from the sellers-1 channel anymore.
You can listen on multiple channels by either passing an array of channels, or by passing them as arguments:
redisSubscriber.subscribe([ 'sellers-1', 'sellers-2', ... ])
// Or:
redisSubscriber.subscribe('sellers-1', 'sellers-2', ... )
You would obviously have to track each "active" seller subscription. Either that, or create a new connection for each subscription, which also isn't ideal.
It's probably a better idea to have a single Pub/Sub channel on which all changes would get published, instead of a separate channel for each seller.
Finally: if your seller id's aren't hard to guess (for instance, if it's based on an incremental integer value), it would be trivial for someone to write a client that would make it possible to listen in on any seller channel they'd like. It might not be a problem, but it is something to be aware of.
I have software running on customers' servers on premises. There are multiple pieces of software, and I want an email sent to me when any of them fails.
It can be a pain enabling and configuring this to work with the customers' mail servers.
So I thought I would write a simple socket program in Node.js to read the error log file and push those messages to my server, which would handle sending the email,
or maybe a web service I could call to send the email.
Has anyone used something like this? Or is there an easy solution that already exists somewhere?
Updating my question
As per the comments, I tried to implement the same solution. Here is my main Node.js server file; where exactly I am facing a problem now is the socket event emit. I want to emit a socket event whenever the log.xml file changes, but this runs only one time.
var app = require('http').createServer(handler),
    io = require('socket.io').listen(app),
    parser = new require('xml2json'),
    fs = require('fs');
app.listen(8030);
console.log('server listening on localhost:8030');
// creating a new websocket to keep the content updated without REST call
io.sockets.on('connection', function (socket) {
  console.log(__dirname);
  // reading the log file
  fs.readFile(__dirname + '/var/home/apache/log.xml', function (err, data) {
    if (err)
      throw err;
    // parsing the new xml data and converting it into json
    var json = parser.toJson(data);
    // send the new data to the client
    socket.emit('error', json);
  });
});
/* Email sending service. This code runs on my client's server, outside of the main socket server. This part is working fine; I tested it on a different server.
var socket = io.connect('http://localhost:8030');
socket.on('error', function (data) {
  // convert the json string into a valid javascript object
  var _data = JSON.parse(data);
  mySendMailTest(_data);
});
*/
Please excuse me, as I am new to the Stack Overflow community.
I think there is no problem in your socket code; you need to use fs.watchFile before reading the file. It is a watch function (similar to Angular's watch): it detects any change to your file and runs a callback, in which you can read the file and emit the socket event.
https://nodejs.org/docs/latest/api/fs.html#fs_fs_watchfile_filename_options_listener
// creating a new websocket to keep the content updated without REST call
io.sockets.on('connection', function (socket) {
  console.log(__dirname);
  // watching the log file
  fs.watchFile(__dirname + '/var/home/apache/log.xml', function(curr, prev) {
    // on file change, just read it
    fs.readFile(__dirname + '/var/home/apache/log.xml', function (err, data) {
      if (err)
        throw err;
      // parsing the new xml data and converting it into json
      var json = parser.toJson(data);
      // send the new data to the client
      socket.emit('error', json);
    });
  });
});
I have set up a simple server and client; however, whenever I close the client, it seems impossible to reconnect. Here's my client:
const net = require('net');
var client = net.connect({port: 8080, host: '127.0.0.1'});
var response = '';
// events
client.on('data', function(chunk) { response += chunk });
client.on('end', function() {
  console.log(response);
  client.end()
});
// main execution
client.write('test');
And here's my server:
const net = require('net');
var server = net.createServer();
server.listen(8080, '127.0.0.1');
server.on('connection', function(sock) {
  sock.on('data', function(chunk) {
    sock.write('test received');
    sock.end();
  });
});
This is just outline code representing my issue. When I execute my client the first time, everything works correctly. However, when I execute it again, the server outputs the error mentioned in the title and crashes. The same happens if I remove 'client.end()' and instead Ctrl+C out of the client program to cause it to end.
My understanding of sockets is that they represent endpoints in a stream between the client and the server. When that stream is no longer necessary (i.e., when the client does what it needs to do), I want that stream to be completely removed. I would think that calling end() on both the client and server endpoints of that single stream would achieve this, like sending two FIN messages, but as explained it does not. The reasons I want to do this are: (a) so the client process will actually finish execution, and (b) so the server will no longer keep its socket endpoint for that stream around and waste resources listening to it.
Any insight into the source of my problem would be appreciated.
You should use the 'connect' event on the client side to be sure that you perform requests only when your socket is ready. So, in the callback for that event, you can invoke the write() function.
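A minimal sketch of that suggestion, applied to the client from the question (the host, port, and payload are the question's own values):

const net = require('net');

const client = net.connect({port: 8080, host: '127.0.0.1'});
var response = '';

client.on('connect', function() {
  // Only write once the connection is actually established.
  client.write('test');
});
client.on('data', function(chunk) { response += chunk; });
client.on('end', function() {
  console.log(response);
  client.end();
});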
You can check if socket is not destroyed before writing.
if (!socket.destroyed) socket.write("something");
Your server only closes the socket when data is received, and your client never sends any.