Not able to make a connection from Jenkins to AWS EC2 servers - Linux

I am trying to use the SSH plugin in Jenkins to execute a script on a remote host. I have added the remote user's credentials in Jenkins, but I am not able to make a connection to the AWS EC2 instance.
Screenshot of the error:
Am I missing any steps in the configuration? How can I resolve this issue?

Some things to consider:
Does the Security Group on the target 3.56.98.1 allow SSH access from the Jenkins IP address?
Do you have the public SSH key of Jenkins added to the target's authorized_keys file?
Have you set the correct username that Jenkins will use on the target (ubuntu, ec2-user, other)?
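A quick way to test all three at once is a verbose manual SSH attempt from the Jenkins host (the key path here is a placeholder for your own); the -v output shows whether the failure is network-level (security group) or authentication (wrong key/username):
ssh -v -i /path/to/jenkins_key.pem ec2-user@3.56.98.1 'echo ok'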

Related

Git clone gives "ssh: connect to host github.com port 22: Connection timed out" Linux /opt directory Amazon EC2 Instance

Issue
I am trying to use git in the /opt/jamf2snipe directory on an EC2 instance. I have tried the following command:
sudo git clone git@github.com:MYUSERNAME/jamf2snipe-school.git
It says connection timed out:
Cloning into 'jamf2snipe-school'...
ssh: connect to host github.com port 22: Connection timed out
fatal: Could not read from remote repository.
Please make sure you have the correct access rights
and the repository exists.
If I try to run this in my home directory, it works fine. It seems to be a permissions issue with /opt, and I am wary of changing permissions on /opt.
Setup
I am trying to do this on an Amazon EC2 instance. Currently SSH is limited to certain IP addresses (not including GitHub). I followed this article from GitHub to use SSH over the HTTPS port. I tested to make sure I had things set up correctly by using:
$ ssh -T git@github.com
and received:
Hi USERNAME! You've successfully authenticated, but GitHub does not provide shell access.
I did this in /opt/jamf2snipe and in the home directory successfully.
First, make sure, if possible, not to use sudo.
In addition to executing commands as root (which is dangerous), sudo uses its own environment variables and SSH settings (in /root/.ssh), which differ from those of your normal EC2 user.
Likewise, /opt, which might be writable only by root, is not the best spot to clone a repository.
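If you do need the clone under /opt, one option (a sketch; adjust the paths and remote URL to your setup) is to create a directory there owned by your normal user, then clone without sudo:
sudo mkdir -p /opt/jamf2snipe
sudo chown "$USER" /opt/jamf2snipe
cd /opt/jamf2snipe
git clone git@github.com:MYUSERNAME/jamf2snipe-school.git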
Second, using SSH over the HTTPS port is the usual solution (like this one from 2018) on EC2, where the firewall can block SSH egress traffic by default.
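For reference, GitHub's documented way to do this is an entry in ~/.ssh/config that routes SSH for github.com over port 443:
Host github.com
  Hostname ssh.github.com
  Port 443
  User git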

jenkins.plugins.publish_over.BapPublisherException: Failed to connect and initialize SSH connection Message [Auth fail]

I am learning to use Jenkins to deploy a .NET 5.0 application on an AWS EC2 server. This is the first time I am using a Linux server and Jenkins for .NET (I'm a lifelong Windows guy), and I am facing an error while trying to publish my artifacts over SSH to the web server.
My setup:
The Jenkins server is an AWS EC2 Linux AMI server.
The web server is also an AWS EC2 Linux AMI server.
My Jenkins is correctly installed and working. I am able to build and run unit tests without any issues.
For deployment, I am using the 'Publish Over SSH' plugin, and I have followed all the steps to configure it as described here: https://plugins.jenkins.io/publish-over-ssh/.
However, when I try 'Test Configuration', I get the below error:
Failed to connect or change directory
jenkins.plugins.publish_over.BapPublisherException: Failed to connect and initialize SSH connection. Message: [Failed to connect session for config [WebServer]. Message [Auth fail]]
I did a ping test from the Jenkins server to the web server, and it succeeds.
I'm using the .pem key in the 'Key' section of 'Publish over SSH'. This is the same key I use to SSH into the web server.
The below link suggests many different solutions, but none works in my case:
Jenkins Publish over ssh authentification failed with private key
I was also looking at the below link, which describes the same problem:
Jenkins publish over SSH failed to change to remote directory
However, in my case I have left 'Remote Directory' empty. I don't know if I have to specify any directory here. Anyway, I tried creating a new directory under the home directory of user ec2-user as '/home/ec2-user/publish' and then used this path as the Remote Directory, but it still didn't work.
Screenshot of my settings in Jenkins:
I would appreciate it if anyone could point me in the right direction or highlight any mistake I'm making in my configuration.
In my case, the following steps solved the problem.
The solution is based on Ubuntu 22.04. This is likely because OpenSSH 8.8+ (the default there) disables ssh-rsa SHA-1 signatures, which older SSH clients, including the library used by this plugin, still rely on for RSA .pem keys.
Add two lines to /etc/ssh/sshd_config:
PubkeyAuthentication yes
PubkeyAcceptedKeyTypes +ssh-rsa
Restart the sshd service:
sudo service sshd restart
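You can also sanity-check the edited file before restarting; sshd -t validates the configuration and prints nothing when it is valid:
sudo sshd -t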
You might consider the following:
a. From the screenshot you've provided, it seems that you have checked the 'Use password authentication, or use a different key' option, which requires you to fill in the key and password fields (inputs from these fields will be used when connecting to your server via SSH). If you use the same SSH key and passphrase/password on all of your servers, you can untick that box and just use the config you specified above.
b. You might also check whether port 22 of your web server allows inbound traffic from the security group your Jenkins server/EC2 instance is running in. See reference here.
c. Also, make sure that the remote directory you have specified exists; otherwise the connection may fail.
Here's the sample config:
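The original screenshot is not reproduced here; as a rough, illustrative sketch (all values are placeholders), the Publish over SSH global settings look like:
Key:               <paste the contents of your .pem private key>
SSH Server:
  Name:              WebServer
  Hostname:          <web server IP or DNS name>
  Username:          ec2-user
  Remote Directory:  /home/ec2-user/publish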

How to configure users/keys to allow Ansible to run against multiple hosts?

I'm currently using a sandbox environment to help gain an understanding of Linux and Ansible.
I have a RHEL 7.6 VM where Ansible is installed and run, which I connect to via MobaXterm. I then have 2 test VMs that I'd like to run Ansible against.
I cannot SSH from the Ansible VM to either of the test VMs (Permission denied (publickey)), but I can connect directly to the test VMs.
How do I set up the keys/hosts? Does the private key need to be uploaded to the Ansible VM?
Deploy the ~/.ssh/id_rsa.pub key from the Ansible control machine to each of your VMs by copying its contents into ~/.ssh/authorized_keys on the target host. You can use the ssh-copy-id command to do this for you, as long as you already have access to the target host via some other method.
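For example (the username and hostname are placeholders for your test VMs):
ssh-copy-id -i ~/.ssh/id_rsa.pub youruser@test-vm-1
ssh youruser@test-vm-1   # should now log in without a password prompt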
Another method, different from the best-practice id_rsa.pub deployment above, is configuring inventory variables for your hosts/groups by setting ansible_user, ansible_ssh_pass (with Vault usage), ansible_become_user, and ansible_become_pass (with Vault usage), as sketched below.
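A minimal sketch of such an inventory (the hostnames, IPs, and vault-managed variable names are placeholders):
[test_vms]
test-vm-1 ansible_host=192.0.2.10
test-vm-2 ansible_host=192.0.2.11

[test_vms:vars]
ansible_user=youruser
ansible_ssh_pass='{{ vault_ssh_pass }}'
ansible_become_user=root
ansible_become_pass='{{ vault_become_pass }}'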

How do I remove default ssh host from ssh configuration?

I used to connect to Amazon Web Services using the ssh command and an application.pem key. Now when I try to connect to other platforms such as GitHub, my SSH client looks for the same application.pem key and tries to connect to AWS. How do I connect to GitHub, or change the default host and key configuration? I am using an Ubuntu 13.10 system and the following is my ssh output.
pranav@pranav-SVF15318SNW:~/.ssh$ ssh
Warning: Identity file application.pem not accessible: No such file or directory.
You need the identity file to log in to the box. Use the command:
ssh -i <identity_file> username@hostname
This worked for me. Write just the filename (without any slashes), unlike the Amazon EC2 tutorial, which asks you to enter:
ssh -i /path/key_pair.pem ec2-user@public_dns_name
Also check the file's permissions.
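To stop the SSH client from picking up the AWS identity for every host, you can also scope keys per host in ~/.ssh/config (a sketch; the host alias, DNS name, and key paths are placeholders):
Host aws
    HostName ec2-xx-xx-xx-xx.compute-1.amazonaws.com
    User ec2-user
    IdentityFile ~/.ssh/application.pem

Host github.com
    User git
    IdentityFile ~/.ssh/id_rsa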

CoreOS Vagrant Virtual box SSH password

I'm trying to SSH into CoreOS on VirtualBox using PuTTY. I know the username appears in the output when I do vagrant up, but I don't know what the password is.
I've also tried overriding it with config.ssh.password settings in the Vagrantfile, but when I do vagrant up again it comes up with an authentication failure warning and retries endlessly.
How do we use PuTTY to log into this box instance?
By default there is no password set for the core user, only key-based authentication. If you'd like to set a password, this can be done via cloud-config.
Place the cloud-config contents in a user-data file within the cloned repository. View user-data.sample for an example.
A better method would be to follow this guide so that you can use vagrant ssh as it was designed.
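As a rough sketch of the cloud-config approach (the hash value is a placeholder; generate a real one with, e.g., openssl passwd -1):
#cloud-config
users:
  - name: core
    passwd: $1$abcdefgh$placeholderhash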
By default for Vagrant:
user: vagrant
password: vagrant
"...vagrant up again it comes up with Authentication failure warning and retries endlessly."
I think this is because it connects with the wrong SSH public key.
To change it, read this: https://stackoverflow.com/a/23554973/3563993
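To see exactly which key, user, and port Vagrant uses (so you can point PuTTY at the same identity), run:
vagrant ssh-config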
