nodejs ssh2: handling data responses in a persistent shell

I want to open a persistent SSH connection, type commands, and handle their responses. Commands will likely build on each other, such as changing directories and then running another command, so exec does not seem to be an option from what I understand. With PHP and phpseclib it was simple; I could just do:
$ssh->sftp('cd /some/dir');
$response = $ssh->sftp('ls');
However, with ssh2 and Node.js there appears to be only one handler for all incoming data, so no matter what I write, it all comes back to the same function, which makes it hard to determine what is what, especially since I cannot control what comes back. If I did an 'ls' I would get a list of files and folders, but if I did a grep or tail I would get a different kind of output, and my handler would not know which is which in order to handle/parse them properly.
How can I solve this issue?
Perhaps I am looking at this the wrong way and just need someone to take the PHP glasses off. My goal is to build a small local app that will connect to my servers through SSH and do complex tasks like grabbing my access logs and parsing the data into a more readable format, or creating a new sites-available config file and then a2ensite'ing it, or dumping my databases and downloading the files to back them up locally, etc.

ssh2's connection.shell() is used to get access to a remote shell in an interactive manner. For connections that just need to do some work and come back, shell is not the right option.
Read this:
https://github.com/mscdex/ssh2/issues/210
You can do the same with the ssh2 npm package:
conn.sftp(function(err, sftp) {
  if (err) throw err;
  // List the contents of the 'foo' directory over SFTP
  sftp.readdir('foo', function(err, list) {
    if (err) throw err;
    console.dir(list);
    conn.end();
  });
});
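If the goal is to run dependent commands while keeping each command's output separate, another common approach is to chain them into a single exec() call, so each logical task gets its own stream. A minimal sketch, assuming the host, credentials, and commands are placeholders:

const fs = require('fs');
const { Client } = require('ssh2');

const conn = new Client();
conn.on('ready', () => {
  // 'cd' and 'ls' run in one shell invocation, so the directory change applies
  conn.exec('cd /some/dir && ls', (err, stream) => {
    if (err) throw err;
    let output = '';
    stream.on('data', chunk => (output += chunk));
    stream.stderr.on('data', chunk => console.error(`stderr: ${chunk}`));
    stream.on('close', () => {
      console.log(output); // this output belongs to this command alone
      conn.end();
    });
  });
}).connect({
  host: 'example.com',
  username: 'user',
  privateKey: fs.readFileSync('/path/to/key')
});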

Related

How to slurp a value in a file and assign this to a variable inside a puppet module

I'm pretty new to puppet and have run into an issue.
We have a proprietary, home-grown, API-based secrets management platform. We can either query the API directly or configure things so that the secrets for that host are mounted on the root filesystem.
My problem is that I can't figure out how to get that information, within the context of a Puppet module, into a variable so that I can use it. It seems you can't get stdout/stderr back from exec (or can you?); otherwise this would be cake.
So for simplicity, let's say my secret is /etc/app/example/foo.
$roles.each |$role| {
  case downcase($role) {
    'foo': {
      # SOMEHOW I NEED TO GET TOKEN FROM FILESYSTEM OR API CALL HERE
      $token = <GET TOKEN SOMEHOW>
      # here I need to do something with my value
      exec { "my description":
        command     => '/bin/foo',
        environment => ["TOKEN=${token}"],
      }
    }
  }
}
This is basically what I need to do, at a basic level. It doesn't matter whether I call curl directly (the preferred approach) or read a mounted file.
Thanks for any help.
you can't get stdout/stderr back from exec (or can you) otherwise this would be cake.
You cannot capture the standard output or error of an Exec's command for reuse, but Puppet's built-in generate() function serves exactly this purpose: executing a command and capturing its output. Normally that runs the command on the server, during catalog compilation, but if you want it to run on the agent instead, you can defer its execution. One of the primary purposes of deferred functions is interaction with secret stores.
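A minimal sketch of the generate() approach, assuming the secret is mounted at /etc/app/example/foo (the resource title is a placeholder; /bin/foo comes from the question):

# generate() runs the command during catalog compilation and returns its output.
# Note: by default this executes on the server; if the file only exists on the
# agent, defer the lookup instead, as described above.
$token = generate('/bin/cat', '/etc/app/example/foo')

exec { 'run foo with token':
  command     => '/bin/foo',
  environment => ["TOKEN=${token}"],
}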
With that said, you might want to consider wrapping the whole thing up in a custom resource type. That's maybe a bit more work (especially if you don't speak Ruby), but it's a lot more flexible, and it should make for cleaner and clearer code on the Puppet DSL side, too.

Using gcloud commands in nodejs application

Some gcloud commands don't have API or client library support (for example - this one).
In these cases, is there a simple way to run gcloud commands from a nodejs application?
The gcloud endpoints services commands for IAM policy are difficult for me to check quickly but, if I recall correctly (and if this is similar to the gcloud projects commands for IAM policy), it's not that there's no API, but that there's no single API call.
What you can always do with gcloud is append --log-http to see what happens under the covers. With IAM policy mutations (off the top of my head), you get the policy, mutate it, and then apply the changes back using the etag the GET gave you. The backend checks the policy's state (the etag is like a hash of the policy) and, if it's unchanged, applies your change.
If this is what's happening here, you should be able to reproduce the functionality in Node.js using the existing (!) APIs and, if you're using API Client Libraries (rather than Cloud Client Libraries), the functionality will be available.
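For illustration, here is a minimal sketch of that get-mutate-set flow in Node.js (18+, for the global fetch); the endpoint, service name, and access token are assumptions, not a confirmed API surface:

// Placeholder endpoint; substitute the actual service and API
const ENDPOINT = 'https://servicemanagement.googleapis.com/v1/services/SERVICE_NAME';

async function addBinding(accessToken, role, member) {
  const headers = {
    Authorization: `Bearer ${accessToken}`,
    'Content-Type': 'application/json',
  };
  // 1. GET the current policy; the response carries its etag
  const res = await fetch(`${ENDPOINT}:getIamPolicy`, { method: 'POST', headers });
  const policy = await res.json();
  // 2. Mutate the policy locally, keeping the etag intact
  policy.bindings = [...(policy.bindings || []), { role, members: [member] }];
  // 3. Write it back; the server rejects the write if the etag has changed
  await fetch(`${ENDPOINT}:setIamPolicy`, {
    method: 'POST',
    headers,
    body: JSON.stringify({ policy }),
  });
}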
Apart from the complexity involved in shelling out to gcloud, you'll also need to authenticate it, (un)marshal data to and from the shell, and manage errors. Ergo, it's messy and generally discouraged.
In Node.js we have the child_process module. As the name suggests, child_process provides functions like spawn and exec that create a new child process to execute a shell command as an independent process. spawn is a function that takes the main command as its first argument and the other command-line options as an array in its second parameter.
So, with respect to the link that you shared, you might end up writing something like this:
const { spawn } = require("child_process");

// Each argument goes in the array; do not embed spaces in a single string
const listening = spawn('gcloud', ['endpoints', 'services', 'blah', '--option', 'someValue']);

listening.stdout.on("data", data => {
  console.log(`stdout: ${data}`);
});

listening.stderr.on("data", data => {
  console.log(`stderr: ${data}`);
});

listening.on('error', (error) => {
  console.log(`error: ${error.message}`);
});
References:
https://nodejs.org/api/child_process.html#child_process_child_process_spawn_command_args_options
I'm not sure this directly answers your question but there is an npm package that can help you run unix commands from within the app.
Check out shelljs.
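For example, a minimal sketch with shelljs; the gcloud subcommand here is just an illustration:

const shell = require('shelljs');

if (!shell.which('gcloud')) {
  shell.echo('gcloud CLI not found in PATH');
  shell.exit(1);
}

// Run the command asynchronously and collect its output
shell.exec('gcloud endpoints services list --format=json', (code, stdout, stderr) => {
  if (code !== 0) return console.error(stderr);
  console.log(JSON.parse(stdout));
});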

Trying to write to a json file using Node fs.writeFile

I hope I'm saying this correctly. What I'm trying to do is write to a JSON file using fs.writeFile.
I can get it to work from the command line, but what I want to do is call a function, maybe on a button click, to update the JSON file.
I figure I would need some kind of call to the Node server, which is on local port 8080. I was researching and saw somebody mention using .post, but I still can't wrap my head around how to write the logic.
$(".button").on("click", function(event) {
fs.writeFile("./updateme.json", "{test: 1}", function(err) {
if(err) {
return console.log(err);
}
console.log("The file was saved!");
});
});
Using jQuery along with fs? Wow, that would be great! Unfortunately, it is not as simple as that!
Let me introduce you to server-side vs. client-side JavaScript. There are actually a lot of resources on the net about this - just google it, or check the answers to this other StackOverflow question. Basically, JavaScript can run either in a browser (Chrome, Firefox...) or as a standalone program (usually a server written in NodeJS), and while the language is (almost) the same, the two platforms don't have the same features.
The script that you're showing should run in a browser, because it's using jQuery and interacting with buttons and stuff (aka the DOM). Can you imagine what a mess it would be if that script could interact with the file system? Any page you visited would be able to crawl around in your holiday pictures and other personal files you keep on your computer. Bad idea! That is why modules like fs are not available in the browser.
Similarly, some libraries like jQuery are not available (or simply useless) in the server, because there is no HTML and user interaction, only headless programs running.
So, what can I do to write a JSON file after a user clicks on a button?
You can:
Set up a NodeJS server that will write the JSON file
Make jQuery call this server with the data to be written when the user clicks the button
If you want further guidance on this, tell me in the comments! I'll be ready to edit my answer to include instructions on setting up such an environment.
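A minimal sketch of that two-part setup, assuming an Express server on port 8080 (the /save route and the payload are placeholders):

// server.js (Node)
const express = require('express');
const fs = require('fs');
const app = express();
app.use(express.json());

app.post('/save', (req, res) => {
  // Write the posted body to disk as JSON
  fs.writeFile('./updateme.json', JSON.stringify(req.body), err => {
    if (err) return res.status(500).send(err.message);
    res.send('The file was saved!');
  });
});

app.listen(8080);

// client.js (browser)
$('.button').on('click', function() {
  $.ajax({
    url: 'http://localhost:8080/save',
    method: 'POST',
    contentType: 'application/json',
    data: JSON.stringify({ test: 1 })
  });
});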

Tail a file efficiently in nodejs

I am creating an application which would watch a file and fetch the contents from that file (similar to tail but with the possibility of paging in previous data as well). I read up on quite a few solutions ranging from spawning a new process to getting only the updated bytes of the file but I am still a little confused on a few parts.
What I want to do exactly is the following:
Watch a file and trigger an event/callback whenever new data comes into the file
Read this new data from the file and efficiently send it to a client, using a websocket or something else (please suggest a good way to do this)
At the client end, take this data and display it to user and keep updating it with new data as it comes
If the user requests older data a way to fetch that data from the file we are watching
I am looking for efficient solutions for the above sub problems and any suggestions for a better approach are also welcome.
FYI I am new to nodejs so verbosity in your solutions would be highly appreciated.
Watch for changes
I suggest you look at chokidar; it is an optimized wrapper around fs.watch and fs.watchFile, the native Node.js file-watching APIs.
const chokidar = require('chokidar');
const log = console.log.bind(console);

// Initialize watcher ('config' holds your chokidar options).
const config = { ignoreInitial: true };
const watcher = chokidar.watch('some/directory/**/*.xml', config);

// Add event listeners.
watcher
  .on('add', path => log(`File ${path} has been added`))
  .on('change', path => log(`File ${path} has been changed`))
  .on('unlink', path => log(`File ${path} has been removed`));
To get the changed value
Here you can look at the diff module. You will need to store the previous and the current state of the file in order to compute the changes.
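For instance, a minimal sketch with the diff (jsdiff) package; the two snapshots are placeholder strings:

const Diff = require('diff');

const previous = 'line 1\nline 2\n';
const current = 'line 1\nline 2\nline 3\n';

// diffLines marks each part as added, removed, or unchanged
Diff.diffLines(previous, current).forEach(part => {
  if (part.added) process.stdout.write(`+ ${part.value}`);
  else if (part.removed) process.stdout.write(`- ${part.value}`);
});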
To notify the client
You will need to create a websocket server; I recommend socket.io. In your application you will compute the diff and send it in a websocket message to the server, which will then notify/broadcast it to the relevant clients.
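Putting the pieces together, a minimal sketch that reads only the newly appended bytes and broadcasts them over socket.io (the file path, port, and event name are placeholders):

const fs = require('fs');
const chokidar = require('chokidar');
const { Server } = require('socket.io');

const io = new Server(3000);
const file = '/var/log/app.log';
let lastSize = fs.existsSync(file) ? fs.statSync(file).size : 0;

chokidar.watch(file).on('change', path => {
  const { size } = fs.statSync(path);
  if (size <= lastSize) { lastSize = size; return; } // file truncated/rotated
  // Stream only the bytes appended since the last read
  const stream = fs.createReadStream(path, { start: lastSize, encoding: 'utf8' });
  lastSize = size;
  let chunk = '';
  stream.on('data', d => (chunk += d));
  stream.on('end', () => io.emit('tail', chunk)); // broadcast to all clients
});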

express-formidable access to events

So, being a novice in Express and Node, I fear I must be misunderstanding something fundamental about middleware. I am trying to upload a file using express-formidable, and I've got it to work (as far as taking the file and uploading it to the directory of my choice). However, I would love a progress bar, or to be able to do something at the start of the upload, such as choosing the file name. I see a lot of examples of regular formidable using the 'progress' or 'file' event, but for the life of me I can't figure out how to access these with express-formidable. Since I don't create an IncomingForm myself, I don't know where the events are or how to bind to them.
What am I missing about how express-formidable works? Does it cut out everything about formidable and just stuff a file where you tell it? Or do you have access to everything in formidable?
Here's an example of a route I have called ingest, which is where I receive an uploaded file and process it.
app.post('/ingest', function(req, res) {
  /* I want to be able to show progress here, and do other things
     like set the file name before it's saved, but by the time I get here
     the file is already processed and saved, and I can't figure out how
     to access events using '.on' if there's no form object */
});
