Rrdtool with Node.js

Let's say I have a program running on my local machine, and I would like it to listen for HTTP requests that trigger the program to run and respond with its output. For example, I have rrdtool (a command-line utility for accessing a round-robin database) installed on my Linux box, and I want web clients to request the web server to run the rrdtool program and respond with the output of rrdtool.
Questions:
I know programming languages are used to generate dynamic html contents to be sent back to clients, but rrdtool is an already existing program that just needs to be triggered by a web request. In fact, rrdtool provides various programming language bindings such as python, but I would like to use Node.js on the server side and javascript binding to rrdtool isn't supported. So how is the interface between my javascript code and the rrdtool program (CLI) done?
I thought about using a Java implementation of rrdtool functionalities such as rrd4j, but portability isn't really a priority for me and I would like to run the official rrdtool program (written in C) for a better performance. However, I'm not sure if the cost of the interface between javascript on the server side and rrdtool would outweigh the performance benefits of running the C program.
Any help/feedback/pointers would be appreciated.

Just use Node's "child_process" module to run your rrdtool. Then capture its output with child.stdout.on('data', function (data) {...}) (note: stdout, not stdin). Grab the output, format it into HTML, and send it as the reply to the web request that initiated it.

If you are using RRDTool to generate graph images, then you should be able to call rrdtool via your NodeJS handler, having it write to a temporary file in a web accessible directory. Then, send back the URL of the newly created image file, which your web frontend can use to create a new IMG tag for display.

Related

Node.js I/O streams: piping output all the way back to web server

A little bit of background
Below you will find a diagram of the relationship between the different components of a Node app I'm currently working on. Here is the link on GitHub. It is an application that I use to archive videos of strong journalistic importance; between the moment I watch them and the moment I get the time to use them for my reports, they are usually removed from YouTube. By archiving them, this information is no longer lost.
What I'm trying to achieve in plain English
download_one_with_pytube.py is basically a piece of code that downloads a video given an id, and reports its download progress by printing the percentage to the console.
What I'm trying to achieve in terms of output piping
Here is a pseudo shell set of piped commands
Array of IDs of videos | for each URL | python download(video Ids) | print progress | response.send(progress)
The difficulty I have is to actually spawn the python code passing it the video id dynamically, and then pipe the progress all the way back to the server's response.
Resources I've consulted & Stuff I tried
I spent the whole day yesterday trying, without success, to implement my own classes inheriting from EventEmitter, and even my own duplex stream class, to pipe that output all the way back to my Express web server so that the progress can be served to the browser.
Advanced Node.js | Working with Streams : Implementing Readable and Writable Streams
Util | Node.js v9.3.0 Documentation
How to create duplex streams with Node.js - a Nodejs programming tutorial for web developers | CodeWinds
class Source extends Readable
Pipe a stream to a child process · Issue #4374 · nodejs/node-v0.x-archive
Developers - Pipe a stream to a child process
Deferred process.exit() to allow STDOUT pipe to flush by denebolar · Pull Request #1408 · jsdoc3/jsdoc
The problem
I think the problem is that I get confused about which direction the pipes should go.
What I've managed so far
All I've managed to do is 'pipe' the output of the python script back to downloadVideos.js
How the app is structured
Through express (server.js in the diagram), I exposed my node app (running through a forever daemon) so that devices on the same LAN as the server can access [server IP address]:3333/startdownload and trigger the app execution.
Looking at concrete lines of code in my repo
How can I pipe the output of this console.log here all the way back to server at this line of code here ?
A simple working example using Node's included http
I've got a GIST here of a running http server that illustrates what I'm trying to achieve. However, since my app architecture is more real-world than this simple example, I have several files and require statements between the output I'm trying to pipe and the res.send statement.
Conclusion
I really appreciate any help anyone can provide me on this.
We could code together live using Cloud9 shared workspaces making this process easier.
Here is the link to the application, but I would have to send an invite for it to be accessible, I guess.

Is there a way to run a node task in a child process?

I have a node server, which needs to:
Serve the web pages
Keep querying an external REST API, saving data to the database, and sending data to clients for certain updates from the REST API.
Task 1 is just a normal node task. But I don't know how to implement task 2. This task won't expose any interface to the outside; it's more like a background task.
Can anybody suggest? Thanks.
To make a second node.js app that runs at the same time as your first one, you can just create another node.js app and then run it from your first one using child_process.spawn(). It can regularly query the external REST API and update the database as needed.
The part about "Send data to clients for certain updates from REST API" is not so clear what you're trying to do.
If you're using socket.io to send data to connected browsers, then the browsers have to be connected to your web server which I presume is your first node.js process. To have the second node.js process cause data to be sent through the socket.io connections in the first node.js process, you need some interprocess way to communicate. You can use stdout and stdin via child_process.spawn(), you can use some feature in your database or any of several other IPC methods.
Because querying a REST API and updating a database are both asynchronous operations, they don't take much of the CPU of a node.js process. As such, you don't really have to do these in another node.js process. You could just have a setInterval() in your main node.js process, query the API every once in a while, update the database when results are received and then you can directly access the socket.io connections to send data to clients without having to use a separate process and some sort of IPC mechanism.
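That setInterval approach might look like the sketch below; `queryFn` and `handleFn` are stand-ins for your actual REST call and your "save to the database, then push to socket.io clients" logic:

```javascript
// Poll an API on a timer in the main process; no second process or IPC needed.
function startPolling(queryFn, handleFn, intervalMs) {
  const tick = () => queryFn((err, data) => {
    if (!err) handleFn(data); // e.g. save to the db, then io.emit('update', data)
  });
  tick(); // poll once right away
  return setInterval(tick, intervalMs);
}

// e.g. startPolling(fetchFromRestApi, saveAndBroadcast, 60 * 1000);
```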
Task 1:
Express is good way to accomplish this task.
You can explore:
http://expressjs.com/
Task 2:
If you are using Express.js, you can write your logic within the Express framework.
This task can then be done with the node module forever. It's a simple tool that runs your background scripts continuously, whether they are written in node.js or not.
Have a look:
https://github.com/foreverjs/forever

Pass data between multiple NodeJS servers

I am still pretty new to NodeJS and want to know if I am looking at this in the wrong way.
Background:
I am making an app that runs once a week, generates a report, and then emails that out to a list of recipients. My initial reason for using Node was because I have an existing front end already built using angular and I wanted to be able to reuse code in order to simplify maintenance. My main idea was to have 4+ individual node apps running in parallel on our server.
The first app would use node-cron in order to run every Sunday. This would check the database for all scheduled tasks and retrieve the stored parameters for the reports it is running.
The next app is a simple queue that would store the scheduled tasks and pass them to the worker tasks.
The actual pdf generation would be somewhat CPU intensive, so this would be a cluster of n apps that would retrieve and run individual reports from the queue.
When done making the pdf, they would pass to a final email app that would send the file out.
My main concern is communication between the apps. At the moment I am setting up the 3 lower levels (i.e. all but the scheduler) on separate ports with express, and opening http requests to them when needed. Is there a better way to handle this? Would the basic 'net' module work better than the 'http' package? Is Express even necessary for something like this, or would I be better off running everything as a basic http/net server? So far the only real use I've made of Express is to listen on a path for put requests and to parse the incoming json. I was led to ask here because, tracking the logs so far, I see the http request is reset every so often, which doesn't appear to affect the data received by the child process, but I'd still like to avoid errors in my code.
I think this kind of decoupling could leverage some sort of stateful priority queue with features like retry on failure, clustering, etc.
I've used Kue.js in the past with great success; it's redis-backed and has nice documentation and a nice interface: http://automattic.github.io/kue/

If Node.js is server side then is it not visible to the client?

I understand that Node.js is a server-side implementation of JavaScript. Server side means it executes on the server (like, say, PHP or Python). Does that mean the code you write in JavaScript in Node is "invisible" to the client? I'm not very familiar with server-side stuff and this subject interests me. So say you write something really simple such as console.log("Hello World"); then that gets executed on the server and doesn't get shown to the client (via View Source, etc.)? Am I right?
I'm asking this here to seek an easier (small) explanation of the idea. Also, is this possibly something I'm looking for?
Yes, Node.js code runs entirely on the server (just like Python). In your link, the goal is to encrypt the source code in case a client gains access to the server's filesystem. To communicate with the client you will need another component, like the http module.

Sending and performing commands from node.js to bash

I'm developing a sort of Flash Operator Panel for Asterisk, but with Node.js and Socket.IO instead of depending on Flash.
I've polished the node server and the front end, BUT I don't know how I could send events from Asterisk to the node server and do things that will then be sent over the socket.
Given the fact that we have a heavily tuned Asterisk to suit our company needs, neither connecting to the AMI nor to the Asterisk socket will solve my problem, because we aren't working with real extensions.
So, the Asterisk part aside, I want to know how I could send info to node through bash, curl calls, or whatever.
I thought about sending curl requests to the server, but then someone who knows the commands (pretty unlikely) could alter the application flow with unreal data.
EDIT: Rethinking it, I would just want to be able to receive requests through the socket/server (?) and then be able to perform actions that will be emitted through socket.io.
Is that even possible?
The answer really depends upon what specific data you are trying to get from Asterisk to Node. You're trying to replace the Flash Operator Panel, yet you don't have real extensions. I'm guessing that you are using Asterisk as an SBC/proxy of sorts.
If you truly want an event-driven approach, I suggest modifying your dialplan to reach out to Node whenever needed, with whatever data you want. This would most easily be achieved by calling an AGI script with some number of arguments (written in whatever language) that then connects to Node via an HTTP POST, socket, or other.
If you want a more passive approach, you could have Node stream-read the asterisk log files for data, or, as already suggested, connect to the Asterisk Manager Interface (AMI) and stream from there. Contrary to what has been stated previously, I don't consider this to be a very daunting task.
You want to open a socket from Node to Asterisk's AMI (asterisk manager interface). I never used Node, but I would imagine the code would look roughly like this:
var net = require('net');
var astman = net.connect(5038); // connect to AMI's default port 5038 on localhost
astman.on('data', function (data) {
  // do something with received data
});
One of the best-maintained AMI libraries is FreePBX's php-astmanager. While it's written in PHP, it should give you a pretty good idea of what you need to do.
You could certainly set up your node.js program to listen on a socket for messages from Asterisk. But you'd have to roll your own connection management scheme, authentication scheme, message durability (possibly), etc.
Alternatively, and especially if the node server and asterisk server are not on the same machine, you could use a message queue program like RabbitMQ. That takes care of a lot of the important details involved in interprocess communications. It's pretty easy, too. On the node side, check out https://github.com/postwait/node-amqp
I've never used Asterisk but running command line programs can be done with the child_process module.
http://nodejs.org/docs/latest/api/child_processes.html
