SSH agent forwarding on Windows (pageant) - node.js

When I try to connect through SSH from any language (I tried with Golang and Node.js) to one of my servers from Windows, agent forwarding doesn't work.
I'm saying this because some commands like git pull throw errors (Permission denied (publickey)), while they don't if I log in directly using PuTTY.
I tried to use the environment variable SSH_AUTH_SOCK, but it seems no such variable is set on Windows. I expected Pageant to do the job.
Code example in NodeJS (simple-ssh lib):
this.ssh = new SSH({
  // other unimportant variables
  agent: process.env.SSH_AUTH_SOCK, // which is undefined
  agentForward: true
});
How does this work on Windows?

For Pageant on Windows, you should use the special 'pageant' value for agent instead:
this.ssh = new SSH({
  // other unimportant variables
  agent: 'pageant',
  agentForward: true
});

Related

Conditionally detecting whether a Node server is running inside a Docker Container

I have my node.js code where I establish mongodb connections like this: mongodb://localhost:27017/mycollection
Now, I put my server in one container and db in another container and I am able to connect to my db from the server like this: mongodb://mycontainer:27017/mycollection
I have this connection string configured in my server code/config.
Now, how do I detect whether the server is running inside a container or not, and pick the right connection string for the db accordingly?
If it is running on the host machine, I want to use the first connection string with localhost and connect to the db on the host machine, and if it runs in a container, I want to use the container link name to connect, as in the second case.
Is there any way to do this?
Personally, when I want to accomplish that, I set an ENV variable in the Dockerfile, like the following:
ENV DATABASE_HOST db
You can find the full documentation in the Dockerfile reference.
Then, in your Node.js source code, you need to check whether DATABASE_HOST is set or not (I can redirect you to this Stack Overflow post by Jayesh: Read environment variables in Node.js):
var dbHost = 'localhost';
if (process.env.DATABASE_HOST) {
  dbHost = process.env.DATABASE_HOST;
}
or in one line:
var dbHost = process.env.DATABASE_HOST || 'localhost';
Then, for MongoDB connection:
var mongodbConnection = 'mongodb://' + dbHost + ':27017/mycollection';
Now, when you run the container, you must link the containers in the docker run command with --link <your mongodb container>:db (since db is the value set in the ENV variable).
But you can also use the option -e DATABASE_HOST=<something else> (again with the docker run command) and run the MongoDB container under another name: -e DATABASE_HOST=anotherOne --link mongo:anotherOne.
And you can point to an external MongoDB that isn't running in a container at all, without linking anything: -e DATABASE_HOST=www.mymongo.com.
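Putting it together, a minimal sketch of the connection logic using the official mongodb driver (the database name mycollection and the DATABASE_HOST variable come from above; everything else is illustrative):

const { MongoClient } = require('mongodb');

// Falls back to localhost when DATABASE_HOST is not set (i.e. running outside Docker).
const dbHost = process.env.DATABASE_HOST || 'localhost';
const url = 'mongodb://' + dbHost + ':27017/mycollection';

MongoClient.connect(url)
  .then((client) => {
    console.log('Connected to', url);
    // ... use client.db() here ...
    return client.close();
  })
  .catch((err) => console.error('Connection failed:', err));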
EDIT: This approach may be better than just detecting whether the application is running in a Docker container, because this way your code is usable anywhere.
is-docker is a popular npm package to accomplish this.
import isDocker from 'is-docker';

if (isDocker()) {
  console.log('Running inside a Docker container');
}
The reason I use this dependency is to determine which host to use for the database:
import isDocker from "is-docker";

// Inside Docker, reach the host via host.docker.internal; otherwise use the configured host.
const host = isDocker() ? "host.docker.internal" : process.env.NODE_DB_HOST;

Paramiko cannot open an ssh connection even with load_system_host_keys + WarningPolicy

I am connecting to a remote server with the following code:
ssh = paramiko.SSHClient()
ssh.load_system_host_keys()
ssh.set_missing_host_key_policy(paramiko.WarningPolicy())
ssh.connect(
    hostname=settings.HOSTNAME,
    port=settings.PORT,
    username=settings.USERNAME,
)
When I'm on local server A, I can ssh onto the remote from the command line, suggesting it is in known_hosts. And the code works as expected.
On local server B, I can also ssh onto the remote from the command line. But when I try to use the above code I get:
/opt/mysite/virtualenv/lib/python3.5/site-packages/paramiko/client.py:763: UserWarning: Unknown ssh host key for [hostname]:22: b'12345'
  key.get_fingerprint())))
...
  File "/opt/mysite/virtualenv/lib/python3.5/site-packages/paramiko/client.py", line 416, in connect
    look_for_keys, gss_auth, gss_kex, gss_deleg_creds, t.gss_host,
  File "/opt/mysite/virtualenv/lib/python3.5/site-packages/paramiko/client.py", line 702, in _auth
    raise SSHException('No authentication methods available')
paramiko.ssh_exception.SSHException: No authentication methods available
Unlike "SSH - Python with paramiko issue" I am using both load_system_host_keys and WarningPolicy, so I should not need to programatically add a password or key (and I don't need to on local server A).
Is there some system configuration step I've missed?
Try using Fabric (which is built on top of Invoke and Paramiko) instead of Paramiko directly, and set the following parameters:
con = fabric.Connection('username@hostname', connect_kwargs={'password': 'yourpassword', 'allow_agent': False})
If it keeps failing, check that your password is still valid and that you're not being required to change it.
I tested with the wrong user on local server B. The user running the Python process did not have ssh permissions after all. (Command line ssh failed for that user.) Once I gave it permissions, the connection worked as expected.

node-ansible fails to connect to the host when multiple connections coexist

I have an API server which may trigger multiple node-ansibles simultaneously to connect to a remote machine to do something.
Here's the node.js code:
// app.js
const Ansible = require('node-ansible')
let ansibleNum = 10
for (let i = 0; i < ansibleNum; i += 1) {
  let command = new Ansible.Playbook().playbook('test')
  command.inventory('hosts')
  command.exec()
    .then(successResult => {
      console.log(successResult)
    })
    .catch(err => {
      console.log(err)
    })
}
And the ansible playbook:
# test.yml
---
- hosts: all
  remote_user: ubuntu
  become: true
  tasks:
    - name: Test Ansible
      shell: echo hello
      register: result # store the result into a variable called "result"
    - debug: var=result.stdout_lines
As ansibleNum increases, the probability that a playbook run fails also increases.
The failure message is:
fatal: [10.50.123.123]: UNREACHABLE! => {"changed": false, "msg": "Failed to connect to the host via ssh: Shared connection to 10.50.123.123 closed.\r\n", "unreachable": true}
I've read another similar question here, but the solutions provided there don't work in my case.
Another way to trigger the problem is by executing
ansible-playbook -i hosts test.yml & ansible-playbook -i hosts test.yml.
This command runs ansible without node.js.
I've pushed the code to github. You can download it directly.
Anyone knows why the shared connection got closed?
I've set the ControlMaster argument to auto by following the documentation here.
It's strange that setting the connection type to paramiko solves my problem.
Here's the config file located in ~/.ansible.cfg:
[defaults]
transport = paramiko
Based on this document, it seems that paramiko doesn't support persistent connections.
I'm still confused about why this setting solves my problem.
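If you'd rather not rely on a global ~/.ansible.cfg, the same setting can in principle be supplied through Ansible's ANSIBLE_TRANSPORT environment variable before node-ansible spawns ansible-playbook. A hedged sketch, assuming the spawned process inherits process.env (the default child_process behaviour):

const Ansible = require('node-ansible')

// Assumption: ansible-playbook reads ANSIBLE_TRANSPORT (the env var for [defaults] transport)
// and node-ansible passes the current environment through to the child process.
process.env.ANSIBLE_TRANSPORT = 'paramiko'

const command = new Ansible.Playbook().playbook('test')
command.inventory('hosts')
command.exec()
  .then(result => console.log(result))
  .catch(err => console.error(err))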

SSH to a created EC2 instance through NodeJS implementation

My code looks like below:
AWS.config.update({ region: 'us-east-1' });
var ec2 = new AWS.EC2();
// Create the EC2 instance
ec2.describeInstances(function (err, data) {
  if (err) {
    res.status(500).json(err);
  } else {
    res.status(201).json(data);
  }
});
The above code creates the EC2 instance perfectly. Now, my requirement is that I want to SSH to the created instance from my Node.js code programmatically. What steps should I follow to achieve this? BTW, the whole idea is that once I can SSH to the EC2 instance programmatically, the next step is to install Docker and other software on that created instance programmatically.
Thanks
As long as you have all the necessary information to be able to connect to and authenticate with your EC2 instance via SSH, you could use a module like ssh2 to connect programmatically to execute commands, transfer files, etc.
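A minimal sketch with ssh2, assuming an Amazon Linux instance (ec2-user), its public DNS, and a local key file; the host, user, key path, and the command being run are all placeholders:

const fs = require('fs');
const { Client } = require('ssh2');

const conn = new Client();
conn
  .on('ready', () => {
    // Example command; replace with whatever you need to run on the instance.
    conn.exec('sudo yum install -y docker', (err, stream) => {
      if (err) throw err;
      stream
        .on('close', (code) => {
          console.log('Command exited with code', code);
          conn.end();
        })
        .on('data', (data) => process.stdout.write(data))
        .stderr.on('data', (data) => process.stderr.write(data));
    });
  })
  .connect({
    host: 'ec2-xx-xx-xx-xx.compute-1.amazonaws.com', // public DNS of the created instance
    username: 'ec2-user',                            // default user on Amazon Linux
    privateKey: fs.readFileSync('ec2-instance-key.pem'),
  });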
I think I'm a little late for this question, but I also had this problem a few months ago and it took me a few days to find a solution.
The solution is :
ssh -tt -o StrictHostKeyChecking=no -i "ec2-instance-key.pem" ec2-user@PUBLIC_DNS sh ./shellScript.sh
This line connects to the EC2 instance and executes a shell script. You can either run a single shell script that has all the commands you want to execute, or execute them via the ssh command.
You'll need the authentication key for the instance, as you can see in the command.
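To invoke that same command from Node.js, a hypothetical sketch with child_process (the key file, user, and script name are taken from the command above; PUBLIC_DNS is a placeholder):

const { execFile } = require('child_process');

execFile(
  'ssh',
  [
    '-tt',
    '-o', 'StrictHostKeyChecking=no',
    '-i', 'ec2-instance-key.pem',
    'ec2-user@PUBLIC_DNS',     // replace with your instance's public DNS
    'sh', './shellScript.sh',  // script that installs Docker, etc.
  ],
  (err, stdout, stderr) => {
    if (err) {
      console.error('ssh failed:', stderr);
      return;
    }
    console.log(stdout);
  }
);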
Hope this helps someone someday!

How to run intern tests with sauceconnect on a remote server?

I've been trying to run intern tests (intern 2.1) against a SauceConnect sc instance running on another machine.
I tried various configurations, e.g.:
tunnel: "SauceLabsTunnel",
tunnelOptions: {
  hostname: 'remotehost'
}
But that doesn't work. I also tried the proxyUrl, proxy, and proxyPort options, with the same problem: I always receive the error "failed to remove matching tunnels".
I'm closing this as I've answered it myself: use tunnel: 'NullTunnel'.
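For reference, in an Intern 2.x config module that might look roughly like the following sketch; the hostname and port are placeholders for wherever your already-running Selenium/Sauce Connect endpoint is reachable:

define({
  // Don't let Intern manage a tunnel; just point at an endpoint that is already running.
  tunnel: 'NullTunnel',
  tunnelOptions: {
    hostname: 'remotehost', // placeholder
    port: 4444              // placeholder
  }
});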
