How to access a Child Process' sockets in NodeJS? - node.js

I am currently running into a very strange problem.
I am trying to start VLC as a NodeJS child process and then access its Remote Control (RC) interface over a socket. The problem occurs when connecting to this socket: I get a connection refused error, even though the port is open and the application is allowed through the firewall.
The tricky part is that when I start VLC manually with this interface and only do the socket connection from Node, it works. I am assuming something about the spawned process makes things different and somehow causes the error.
Here is the code I am trying to run:
var spawn = require('child_process').spawn;
var file_dir = "V:\\TEST\\";
var files = ["Ika.mkv", "Nami.mkv", "Azu.mkv"];
var player = spawn("C:\\Program Files (x86)\\VideoLAN\\VLC\\vlc.exe", ['--intf="rc"', '--rc-host="localhost:3000"', '--fullscreen', file_dir + files[0]]);
var net = require('net');
var client = net.createConnection(3000, "localhost");
client.on('connect', function() {
  console.log('connected to VLC on port 3000');
  client.write("add " + file_dir + files[1] + "\n");
  client.write("enqueue " + file_dir + files[2] + "\n");
  client.write("help" + "\n");
});
client.on('data', function(data) {
  console.log(data.toString());
});
client.on('end', function() {
  console.log('disconnected from server');
});
I have tried this code on two machines, and I am running into the same problem.
Some questions you may ask:
What operating system? Windows 8.1
Why do I need to use a socket?
VLC doesn't have any interfaces that read and write from standard in or standard out. I have tried many different options and they simply do nothing.
What am I trying to build?
A media center with a web interface to it. I am using VLC as the media player.
Can't you use the built in HTTP interface?
It doesn't suit what I want to build. I want more control over managing my media.
Any and all help would be welcome. My thanks.

It turns out that, for some strange reason, not all command line arguments were getting passed to the VLC instance.
I solved it by combining the argument that starts the RC interface with the one that sets the RC host to localhost:3000.
This is the new line that spawns the process:
var player = spawn("C:\\Program Files (x86)\\VideoLAN\\VLC\\vlc.exe", ['-I rc --rc-host=\"localhost:3000\"','--fullscreen', file_dir + files[0]]);
This works; the only downside is that it also creates an RC console window, but I can live with that.
Thanks to @jfriend00 for helping solve this strange mystery.
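One more note, which is my own addition rather than part of the original fix: since net.createConnection() runs immediately after spawn(), VLC may not have opened the RC port yet, so a short retry loop can avoid a race-related ECONNREFUSED:

// Sketch only: retry the RC connection a few times while VLC starts up.
// Uses the net, file_dir and files variables already defined above.
function connectToVLC(retriesLeft) {
  var client = net.createConnection(3000, "localhost");
  client.on('connect', function() {
    console.log('connected to VLC on port 3000');
    client.write("add " + file_dir + files[1] + "\n");
  });
  client.on('error', function(err) {
    if (err.code === 'ECONNREFUSED' && retriesLeft > 0) {
      // VLC is not listening yet -- try again shortly.
      setTimeout(function() { connectToVLC(retriesLeft - 1); }, 500);
    } else {
      console.log('could not reach VLC: ' + err.message);
    }
  });
}
connectToVLC(10);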

Related

trying to connect to TCP server using Node.js

I'm trying to connect to a server/specific port using Node.js, and I don't even get past var net = require('net');
I'm using Node.js v16.15.0 (the REPL greets me with "Welcome to Node.js v16.15.0.").
When I run the line above, I get undefined back. As far as I know, I've installed everything I need (including socket.io), and I'm working inside the Node.js REPL in iTerm.
My goal is to connect to a TCP server, receive a list of files, and then download each of them over a persistent socket. But I'm a little stuck as I can't even seem to get into the TCP server in the first place.
This is what I think I'm supposed to run to get in (obviously with my correct port and IP info which is omitted below).
var HOST = 'IP';
var PORT = 'PORT'
var FILEPATH = 'myfilepathhereIwilltweakitwhenIgettothispoint';
Can anyone point me in the right direction?
From what you said, I think you are typing your code directly into the Node.js REPL, i.e. the interactive prompt you get when you just run node on the command line. You get undefined because the REPL prints the value of each statement you enter, and a var declaration statement itself has no value, so it prints undefined even though the module was assigned to your variable. You can read more about this in this subject : link
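For example, in the REPL the first line prints undefined because the declaration statement has no value, but the second line shows the module really was assigned:

> var net = require('net')
undefined
> typeof net
'object'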
But what we usually do in NodeJS development is create a file, let's call it index.js. Inside that file we write our code, for example:
const net = require('net');
const client = net.createConnection({ port: 8124 }, () => {
  // 'connect' listener.
  console.log('connected to server!');
  client.write('world!\r\n');
});
client.on('data', (data) => {
  console.log(data.toString());
  client.end();
});
client.on('end', () => {
  console.log('disconnected from server');
});
Code sample from NodeJS Documentation.
Then we run our code from the command line like this: node path/to/index.js
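To tie it back to your original goal (connecting with your HOST/PORT and saving what the server sends to FILEPATH), a rough sketch could look like the following; the "request a file by name" line is an assumption, since that part depends entirely on your server's protocol:

const net = require('net');
const fs = require('fs');

const HOST = '127.0.0.1';      // your server's IP
const PORT = 9000;             // your server's port (a number, not a string)
const FILEPATH = '/tmp/out';   // where to write the received bytes

const client = net.createConnection({ host: HOST, port: PORT }, () => {
  console.log('connected to server');
  client.write('GET somefile\r\n'); // whatever your server actually expects
});

// Stream everything the server sends into a local file.
client.pipe(fs.createWriteStream(FILEPATH));

client.on('end', () => console.log('done'));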
Hope it helps!

open serialport and log data automatically under various events

I am using a COM3 port and the serialport module to read from the serial port. What I want is an automatic logging system that logs the data from the serial port under the following events:
1. closing and reopening the program
2. restarting my PC
3. unplugging the USB cable and plugging it back in
4. restarting the Arduino device
I want to open the serial port and automatically log the data received from it whenever the above events occur. How do I handle these cases?
Currently, this is the code I am using:
var fs = require('fs');
const SerialPort = require('serialport')
const Readline = require('@serialport/parser-readline')
const port = new SerialPort('COM3')
const parser = new Readline()
port.pipe(parser)

parser.on('data', function (data) {
  const index = data.indexOf('*#SENSOR_DATA')
  if (index != -1) {
    fs.appendFileSync("sensor_data.txt", new Date().toString(), 'utf8')
    fs.appendFileSync("sensor_data.txt", data, 'utf8')
  }
})

port.on('error', function (err) {
  fs.appendFileSync("sensor_data.txt", new Date().toString(), 'utf8')
  fs.appendFileSync("sensor_data.txt", String(err), 'utf8')
  console.log(err)
})

port.write('ROBOT PLEASE RESPOND\n')
How can I handle the above mentioned cases?
close program and reopening the program
Logging the data from the port when the program starts should be straightforward, since that is just your script running.
Closing of the program depends on what scenarios you want to handle: graceful or non-graceful exits. To implement this you can add event listeners on the Node.js process object, registering handlers that perform some action when the process is about to exit.
process.on('SIGINT', () => {
  // read my data from Serial
  process.exit(-1);
})
restart my pc
Depending on your OS, you could configure your Node.js process as a service, or run it via cron; cron, for example, has an option for scheduling jobs to run on reboot. Either way, you will be able to schedule a job or create a service to ensure your process starts when your machine restarts.
unplug the usb cable and plug it again &
restart the Arduino device
For these I would look at the different parsers that are available for serialport, for example the ReadyParser.
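Another option, instead of a parser, is simply to reopen the port yourself whenever it closes. A rough sketch, assuming the same serialport version as in the question (one that supports the autoOpen: false option and emits 'close' when the device disappears):

const SerialPort = require('serialport')
const Readline = require('@serialport/parser-readline')

const port = new SerialPort('COM3', { autoOpen: false })
const parser = port.pipe(new Readline())

function openPort() {
  port.open(function (err) {
    if (err) {
      // Device not there yet (unplugged, PC just booted, Arduino resetting): retry.
      setTimeout(openPort, 2000)
    }
  })
}

port.on('open', function () {
  console.log('port open, logging resumed')
})

port.on('close', function () {
  // Cable unplugged or Arduino restarted: keep retrying until it is back.
  setTimeout(openPort, 2000)
})

parser.on('data', function (data) {
  // same appendFileSync logging as in the question
})

openPort()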

Connecting to socket.io 1.x manually using websockets, capacity testing

I am working with a nodejs express server which uses socket.io to communicate with an iOS client, and am having a little trouble trying to test how many clients can connect and exchange data at any one time.
My goal is to be able to run a script which connects to socket.io with thousands of different sessions, as well as send and receive data to understand our system's scale. Currently we are using a single dyno on Heroku but will likely be considering other options on AWS soon.
I have found code which should do what I am trying to do for earlier versions of socket.io, such as this, but have had issues since it seems v1.x has a very different handshake protocol. I tried out using the socket.io-client package, but trying to connect multiple times only simulates one session; I need to simulate many independent users.
I have been picking apart the socket.io-client code, but have only gotten so far as creating a connection - I am stuck on the sending data part. If anyone has any knowledge or could point to some written resources on how data is sent between a client and a socket.io server, it would help me out a lot.
Here's what I have so far:
var needle = require('needle'),
    WebSocket = require('ws'),
    BASE_URL = 'url-to-socket-host:5002';

var connectionNo = 0;

needle.get('http://' + BASE_URL + '/socket.io/?EIO=3&transport=polling&t=1416506501335-0', function (err, resp) {
  // parse the sid
  var resp = JSON.parse(resp.body.toString().substring(5, resp.body.toString().length));

  // use the sid to connect using websockets
  var url = 'ws://' + BASE_URL + '/socket.io/?EIO=3&transport=websocket&sid=' + resp.sid;
  console.log(connectionNo + ' with sid: ' + resp.sid);

  var socket = new WebSocket(url, void(0), {
    agent: false
  });

  socket.on('open', function () {
    console.log('Websocket connected: ' + connectionNo);
    // I don't understand how to send data to the server here,
    // from looking at the source code it should use some kind
    // of binary encoding, any ideas?
    socket.on('message', function (msg) {
      console.log(msg);
    });
  });
});
I will continue deconstructing the socket.io-client code, but if anyone has any clues or resources that may help, let me know. Thanks.
I ended up settling for the socket.io-client npm package, which has the ability to connect with a new session on every connection. I found an example benchmark in this issue.
There is not much need for me to manually connect to socket.io using pure websockets and HTTP any more, but thanks to Yannik for pointing out the parser in use. The spec of the inner workings of v1.x can be found here.
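For reference, the trick with socket.io-client is the forceNew option; something along these lines (a sketch only, the client count and event names are made up) opens an independent session per client:

var io = require('socket.io-client');

var URL = 'http://url-to-socket-host:5002';
var CLIENTS = 1000;

for (var i = 0; i < CLIENTS; i++) {
  (function (n) {
    // forceNew gives each call its own connection/session instead of reusing
    // a shared manager, which is what simulates independent users.
    var socket = io(URL, { forceNew: true, transports: ['websocket'] });
    socket.on('connect', function () {
      socket.emit('my event', { client: n });
    });
    socket.on('disconnect', function () {
      console.log('client ' + n + ' disconnected');
    });
  })(i);
}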
Thanks!
The problem may reside in the fact that you are not using socket.io in your client code. You have imported ('ws'), which is a different module whose docs are here: https://www.npmjs.org/package/ws.
You probably want ws.send('something');. When you receive a message in ws, it also comes with an object that has a property indicating whether it is binary data or not. If it is, you will need to concatenate the chunks incrementally. There is a canonical way to do this which you can find via Google, but it looks a little like this:
var message = ''; // start from an empty string so the chunks concatenate cleanly
socketConnection.on('data', function (chunk) { message += chunk; });
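If you do want to stay on raw WebSockets, the missing piece in the question is the framing. Going from the engine.io/socket.io 1.x protocol specs (so treat the exact packet codes as an assumption to verify against your version), the upgrade handshake and an event emit look roughly like this inside the 'open' handler:

socket.on('open', function () {
  socket.send('2probe');              // engine.io ping probe
  socket.on('message', function (msg) {
    if (msg === '3probe') {
      socket.send('5');               // engine.io "upgrade" packet
      // '4' = engine.io message, '2' = socket.io EVENT,
      // followed by a JSON array of [eventName, ...args]
      socket.send('42' + JSON.stringify(['my event', { hello: 'world' }]));
    } else {
      console.log(msg);
    }
  });
  // engine.io v3 expects the client to send heartbeats ('2' = ping).
  setInterval(function () { socket.send('2'); }, 25000);
});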

Creating web socket server on Azure hosting using Node.js?

I have a node application that creates a websocket server, and I am trying to host it on Azure. I am creating the websocket server using the code below:
var socket = require('websocket').server;
...
server.listen(1111, function() {
  console.log('Http server is listening on port 1111');
});
When I do this on my local machine it works fine, but once it's up on Azure the following client-side javascript:
var connection = new WebSocket('ws:' + document.domain + ':1111');
connection.send(msg); //throws error
Throws the error:
InvalidStateError: An attempt was made to use an object that is not,
or is no longer, usable
Can I get the desired functionality if I'm using azure? If so, any suggestions on where to start looking for a fix?
You can see the broken app in action here.
It will work on Azure, but you need to make a change. Try this:
server.listen(process.env.port, function () {
  var addr = app.address();
  console.log('Server listening on http://' + addr.address + ':' + addr.port);
});
Here's a tutorial on using the chat example from Socket.IO's source in Azure:
http://www.windowsazure.com/en-us/develop/nodejs/tutorials/app-using-socketio/
It assumes you're on Windows, which isn't strictly necessary. To use Node.js on Azure from a Mac or Linux system, just publish from Git:
http://www.windowsazure.com/en-us/develop/nodejs/common-tasks/publishing-with-git/
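On the client side, two more changes follow from this (these are my assumptions about your setup, not something from the tutorial): drop the hard-coded :1111, since Azure serves the WebSocket traffic through the normal HTTP port once you listen on process.env.port, and only call send() after the connection has opened, because sending while the socket is still connecting is exactly what raises that InvalidStateError:

// Connect back to the same host the page came from, without an explicit port.
var connection = new WebSocket('ws://' + document.domain);

connection.onopen = function () {
  connection.send(msg); // safe now: the socket is open
};

connection.onmessage = function (event) {
  console.log(event.data);
};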

I'm receiving duplicate messages in my clustered node.js/socket.io/redis pub/sub application

I'm using Node.js, Socket.io with RedisStore, Cluster from the Socket.io guys, and Redis.
I have a pub/sub application that works well on just one Node.js node. But when it comes under heavy load, it maxes out just one core of the server, since Node.js isn't written for multi-core machines.
As you can see below, I'm now using the Cluster module from Learnboost, the same people who make Socket.io.
But when I fire up 4 worker processes, each browser client that comes in and subscribes gets 4 copies of each message that is published in Redis. If there are three worker processes, there are three copies.
I'm guessing I need to move the redis pub/sub functionality to the cluster.js file somehow.
Cluster.js
var cluster = require('./node_modules/cluster');

cluster('./app')
  .set('workers', 4)
  .use(cluster.logger('logs'))
  .use(cluster.stats())
  .use(cluster.pidfiles('pids'))
  .use(cluster.cli())
  .use(cluster.repl(8888))
  .listen(8000);
App.js
var redis = require('redis'),
    sys = require('sys');

var rc = redis.createClient();

var path = require('path')
  , connect = require('connect')
  , app = connect.createServer(connect.static(path.join(__dirname, '../')));

// require the new redis store
var sio = require('socket.io')
  , RedisStore = sio.RedisStore
  , io = sio.listen(app);

io.set('store', new RedisStore);

io.sockets.on('connection', function(socket) {
  sys.log('ShowControl -- Socket connected: ' + socket.id);
  socket.on('channel', function(ch) {
    socket.join(ch)
    sys.log('ShowControl -- ' + socket.id + ' joined channel: ' + ch);
  });
  socket.on('disconnect', function() {
    console.log('ShowControll -- Socket disconnected: ' + socket.id);
  });
});

rc.psubscribe('showcontrol_*');

rc.on('pmessage', function(pat, ch, msg) {
  io.sockets.in(ch).emit('show_event', msg);
  sys.log('ShowControl -- Publish sent to channel: ' + ch);
});

// cluster compatibility
if (!module.parent) {
  app.listen(process.argv[2] || 8081);
  console.log('Listening on ', app.address());
} else {
  module.exports = app;
}
client.html
<script src="http://localhost:8000/socket.io/socket.io.js"></script>
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.7.0/jquery.min.js"></script>
<script>
  var socket = io.connect('localhost:8000');
  socket.emit('channel', 'showcontrol_106');
  socket.on('show_event', function (msg) {
    console.log(msg);
    $("body").append('<br/>' + msg);
  });
</script>
I've been battling with cluster and socket.io. Every time I use clustering (I use the built-in Node.js cluster module though) I get a lot of performance problems and issues with socket.io.
While researching this, I've been digging through the bug reports and similar on the socket.io GitHub, and anyone putting clusters or external load balancers in front of their servers seems to have problems with socket.io.
It seems to produce the problem "client not handshaken client should reconnect", which you will see if you increase the verbose logging. This appears a lot whenever socket.io runs in a cluster, so I think it comes back to this: the client gets connected to a random instance in the socket.io cluster every time it makes a new connection (it makes several http/socket/flash connections when authorizing, and more all the time later when polling for new data).
For now I've reverted to using only one socket.io process at a time; this might be a bug, but it could also be a shortcoming of how socket.io is built.
Added: My way of solving this in the future will be to assign a unique port to each socket.io instance inside the cluster and then cache the port selection on the client side.
It turns out this isn't a problem with Node.js/Socket.io; I was just going about it completely the wrong way.
Not only was I publishing into the Redis server from outside the Node/Socket stack, I was still directly subscribed to the Redis channel. On both ends of the pub/sub situation I was bypassing the "Socket.io cluster with Redis Store on the back end" goodness.
So, I created a little app (with Node.js/Socket.io/Express) that took messages from my Rails app and 'announced' them into a Socket.io room using the socket.io-announce module. Now, by using Socket.io's routing magic, each Node worker only gets and sends messages to the browsers connected to it directly. In other words, no more duplicate messages, since both the pub and the sub happen within the Node.js/Socket.io stack.
After I get my code cleaned up I'll put an example up on a github somewhere.
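Until then, here is a rough sketch of the idea, not my actual code, using a plain Socket.IO 0.x instance with RedisStore instead of socket.io-announce: a tiny Express 3-era app takes the message from Rails over HTTP and emits it from inside the Socket.IO stack, so the RedisStore handles the fan-out and each browser gets the message exactly once.

// announce.js -- sketch only: Rails POSTs { channel, msg } here and the emit
// goes through Socket.IO's RedisStore, so every cluster worker delivers it
// to its own connected browsers exactly once.
var express = require('express'),
    http = require('http'),
    sio = require('socket.io');

var app = express(),
    server = http.createServer(app),
    io = sio.listen(server),
    RedisStore = sio.RedisStore;

io.set('store', new RedisStore);

app.use(express.bodyParser());

app.post('/announce', function (req, res) {
  io.sockets.in(req.body.channel).emit('show_event', req.body.msg);
  res.send(200);
});

server.listen(8090);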
