SSH to a created EC2 instance programmatically from Node.js

My code looks like this:
var AWS = require('aws-sdk');
AWS.config.update({ region: 'us-east-1' });
var ec2 = new AWS.EC2();

// Create the EC2 instance (ImageId is a placeholder)
ec2.runInstances({
    ImageId: 'ami-xxxxxxxx',
    InstanceType: 't2.micro',
    MinCount: 1,
    MaxCount: 1
}, function (err, data) {
    if (err) {
        res.status(500).json(err);
    } else {
        res.status(201).json(data);
    }
});
The above code creates the EC2 instance perfectly. Now, my requirement is that I want to "ssh to the created instance" from my Node.js code programmatically. What steps should I follow to achieve this? BTW, the whole idea is that once I can SSH to the EC2 instance programmatically, the next step will be to install Docker and other software on that instance programmatically.
Thanks

As long as you have all the necessary information to be able to connect to and authenticate with your EC2 instance via SSH, you could use a module like ssh2 to connect programmatically to execute commands, transfer files, etc.
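For example, a minimal sketch with ssh2 (the host, the command to run, and the key path are all placeholders you'd replace with your own values):

const { readFileSync } = require('fs');
const { Client } = require('ssh2');

const conn = new Client();
conn.on('ready', () => {
    // Run a command once authenticated, e.g. installing Docker (placeholder command)
    conn.exec('sudo yum install -y docker', (err, stream) => {
        if (err) throw err;
        stream.on('data', (data) => process.stdout.write(data))
              .on('close', (code) => {
                  console.log('command exited with code', code);
                  conn.end();
              });
    });
}).connect({
    host: 'PUBLIC_DNS_OR_IP',                        // placeholder
    port: 22,
    username: 'ec2-user',
    privateKey: readFileSync('ec2-instance-key.pem') // key pair created with the instance
});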

I think I'm a little late for this question but I also had this problem a few months ago and it really took me a few days to find a solution.
The solution is :
ssh -tt -o StrictHostKeyChecking=no -i "ec2-instance-key.pem" ec2-user@PUBLIC_DNS sh ./shellScript.sh
This line of code connects to the EC2 instance and executes a shell script. You can either run a single shell script which has all the commands you want to execute or execute them via the ssh command.
You'll need the instance's authentication key (the .pem file), as you can see in the command.
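To run that same command from Node.js, one option is to shell out with child_process (a sketch; PUBLIC_DNS and the key file must be replaced with your own values):

const { execFile } = require('child_process');

execFile('ssh', [
    '-tt',
    '-o', 'StrictHostKeyChecking=no',
    '-i', 'ec2-instance-key.pem',
    'ec2-user@PUBLIC_DNS',   // placeholder host
    'sh', './shellScript.sh'
], (err, stdout, stderr) => {
    if (err) throw err;
    console.log(stdout);
});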
Hope this helps someone someday!

Related

How to build a Docker image without having to use the sudo keyword

I'm building a node.js app which allows people to run code on my server, and I'm using Docker to containerise the user's code so that it can't steal data or, in general, do something it shouldn't. I have a Docker image template that is copied into the user's personal app directory, and I want to build the image using this function I've written:
const util = require("util");
const exec = util.promisify(require("child_process").exec);

async function buildContainer(path, dockerUser) {
    return await exec(`sudo docker build -t user_app_${dockerUser} ${path}`);
}
However, when I go to use it, it requires me to enter my sudo password as if I were executing it manually in a terminal window.
Is there any way I can run this function without having to include the sudo keyword?
Thanks in advance.
You can use Podman instead of Docker; with Podman you don't need sudo. Most commands are the same as Docker's, for example:
podman build
podman run
and so on.
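Applied to the question's helper, the sudo can then simply be dropped (a minimal sketch, assuming rootless Podman is set up):

const util = require("util");
const exec = util.promisify(require("child_process").exec);

// Same helper as in the question; rootless Podman needs no sudo
async function buildContainer(path, dockerUser) {
    return await exec(`podman build -t user_app_${dockerUser} ${path}`);
}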
Hope that helps :)
Regards

Create a file on remote server with nodejs

What is the easiest way to create a file on a remote server (can be accessed with ssh) in NodeJS?
Example: I work on my local computer, and there exists a remote server with ip 192.168.1.100. I would like to create an empty text file on this server, in path "/home/users/share".
I tried using an scp library in NodeJS, but could not copy my file to the remote server.
What do you mean?
If the file is created in a path that SSH has access to, and the permissions are set correctly, then the file will be accessible.
Do you want to generate data on the server and save it?
https://nodejs.org/api/fs.html#fs_fs_writefile_file_data_options_callback
Or do you want to choose a file from disk and upload it to the server? E.g.: https://www.w3schools.com/nodejs/nodejs_uploadfiles.asp
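For the first case, where the Node.js process runs on the server itself, plain fs is enough (a minimal sketch; the filename is a placeholder):

const fs = require('fs');

// Create an empty text file in the path from the question ('empty.txt' is a placeholder)
fs.writeFile('/home/users/share/empty.txt', '', (err) => {
    if (err) throw err;
    console.log('File created');
});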
If I understand correctly, you want to create a file on a different server using Node.js and access the file using SSH.
If your server runs Linux, you can use the rsync command from the terminal,
e.g. rsync --chmod=u+rwx,g+rwx,o+rwx /path/to/file server:/path/to/file.
And if you want it done from Node.js, you can use the child_process module to execute terminal commands.
const { exec } = require('child_process');

exec('rsync --chmod=u+rwx,g+rwx,o+rwx /path/to/file server:/path/to/file', (err) => {
    if (err) {
        // node couldn't execute the command
        return;
    }
});
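Alternatively, to stay in Node.js without shelling out, the ssh2 module's SFTP subsystem can create the file directly (a sketch; the username, key path, and filename are placeholders):

const { readFileSync } = require('fs');
const { Client } = require('ssh2');

const conn = new Client();
conn.on('ready', () => {
    conn.sftp((err, sftp) => {
        if (err) throw err;
        // Write an empty file on the remote host
        sftp.writeFile('/home/users/share/empty.txt', '', (err) => {
            if (err) throw err;
            conn.end();
        });
    });
}).connect({
    host: '192.168.1.100',
    port: 22,
    username: 'youruser',                    // placeholder
    privateKey: readFileSync('/path/to/key') // placeholder; `password` works too
});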

Conditionally detecting whether a Node server is running inside a Docker Container

I have my node.js code where I establish mongodb connections like this: mongodb://localhost:27017/mycollection
Now, I put my server in one container and db in another container and I am able to connect to my db from the server like this: mongodb://mycontainer:27017/mycollection
I have this connection string configured in my server code/config.
Now, how do I detect whether someone is running the server in a container or not, and accordingly pick the right connection string for the db?
If they are running it on the host machine, I want to use the first connection string with localhost and connect to the db on the host machine; and if they connect through a container, I want to use the container link name to connect, as mentioned in the second case.
Is there any way to do this?
Personally, when I want to accomplish that, I set an ENV variable in the Dockerfile like the following:
ENV DATABASE_HOST db
You can have the full documentation on the Dockerfile reference.
Then, in your Node.js source code, you need to check whether DATABASE_HOST is set (see Jayesh's Stack Overflow post: Read environment variables in Node.js):
var dbHost = 'localhost';
if (process.env.DATABASE_HOST) {
    dbHost = process.env.DATABASE_HOST;
}
or in one line:
var dbHost = process.env.DATABASE_HOST || 'localhost';
Then, for MongoDB connection:
var mongodbConnection = 'mongodb://' + dbHost + ':27017/mycollection'
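For completeness, opening the actual connection with that string could look like this (a sketch using the callback-style MongoClient.connect of the mongodb 2.x driver of that era):

var MongoClient = require('mongodb').MongoClient;

MongoClient.connect(mongodbConnection, function (err, db) {
    if (err) throw err;
    // Use db here, e.g. db.collection('mycollection')
    db.close();
});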
Now, when you run the container, you must link the container in the docker run command with --link <your mongodb container>:db (since db is the value set in the ENV variable).
But you can also use the option -e DATABASE_HOST=<something else> (again with the docker run command) and use a MongoDB container under another name: -e DATABASE_HOST=anotherOne --link mongo:anotherOne.
And again, you can use an external MongoDB without linking any container if you want (which is not in another container maybe): -e DATABASE_HOST=www.mymongo.com.
EDIT: This solution may be better than just detecting whether the application runs in a Docker container, because it keeps your code usable anywhere.
is-docker is a popular npm package for accomplishing this.
import isDocker from 'is-docker';

if (isDocker()) {
    console.log('Running inside a Docker container');
}
A typical reason for using the dependency is to determine which host to use for your database:
import isDocker from "is-docker";
const host = isDocker() ? "host.docker.internal" : process.env.NODE_DB_HOST;
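If you'd rather avoid the dependency, a rough equivalent of what is-docker checks internally is the /.dockerenv marker file and the process's cgroup (a Linux-only sketch, an approximation of the package's logic):

const fs = require('fs');

function isInDocker() {
    try {
        fs.statSync('/.dockerenv'); // marker file Docker creates inside containers
        return true;
    } catch (_) {
        // no marker file; fall through to the cgroup check
    }
    try {
        return fs.readFileSync('/proc/self/cgroup', 'utf8').includes('docker');
    } catch (_) {
        return false;
    }
}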

SSH agent forwarding on Windows (pageant)

When I try to connect through SSH from any language (tried with Golang & Nodejs) to one of my servers from Windows the agent forwarding doesn't work.
I'm saying this because some commands like git pull throw errors (Permission denied (publickey)), while there are none if I log in directly using PuTTY.
I tried to use the env variable SSH_AUTH_SOCK, but it seems there's no such variable set on Windows. I expected Pageant to do the job.
Code example in NodeJS (simple-ssh lib):
this.ssh = new SSH({
    // other unimportant variables
    agent: process.env.SSH_AUTH_SOCK, // which is undefined
    agentForward: true
});
How does this work on Windows?
For Pageant on Windows, you should use the special 'pageant' value for agent instead:
this.ssh = new SSH({
    // other unimportant variables
    agent: 'pageant',
    agentForward: true
});

How to upload local code to EC2

I want to run my Node.js code on Amazon EC2.
I use this code to test (using vi to code on 64-bit Amazon Linux):
var http = require('http');
http.createServer(function (req, res) {
    res.writeHead(200, {'Content-Type': 'text/plain'});
    res.end('Hello World\n');
}).listen(80);
console.log('Server running at http://12.34.56.78/');
and it really works.
But now, how can I upload my local Node.js code (currently on my computer) to Amazon EC2? I use 64-bit Amazon Linux.
There's no single way to achieve that task. You could use various approaches, each with its pros and cons.
An easy solution would be to use a bare git repository on the server you want to upload the code to, and push your code to that remote repository. You could even use git hooks to automate the deployment and npm install when pushing new code.
One thing I'd recommend: as EC2 instance storage is volatile, you should probably automate the server setup and configuration using something like Opscode's Chef. Either that, or implement incremental backups for your EBS volumes.
You can also use something like Fabric (http://docs.fabfile.org/en/1.8/). I've found it very quick for getting things done:
from fabric.api import put, run, task

def run_your_app():
    run("node js command to run your app")

@task
def put_your_file():
    put("localfilename", "remoteFilename")
    run_your_app()
Save this to 'fabfile.py' and then run it from the command line:
fab -H <your hostname or ip> put_your_file
More on the Fabric operations here:
http://docs.fabfile.org/en/1.8/api/core/operations.html
