CoreOS Vagrant VirtualBox SSH password - Linux

I'm trying to SSH into CoreOS on VirtualBox using PuTTY. I know the username appears in the output when I do vagrant up, but I don't know what the password is.
I've also tried overriding it with the config.ssh.password setting in the Vagrantfile, but when I do vagrant up again it comes up with an Authentication failure warning and retries endlessly.
How do we use PuTTY to log into this box instance?

By default there is no password set for the core user, only key-based authentication. If you'd like to set a password, this can be done via cloud-config.
Place the cloud-config in a user-data file within the cloned repository; see user-data.sample for an example.
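A minimal sketch of such a cloud-config, assuming you generate the password hash yourself (for example with openssl passwd -1); the hash below is a placeholder, not a working value:

#cloud-config

users:
  - name: core
    # replace with your own hash, e.g. the output of: openssl passwd -1
    passwd: $1$abcdefgh$placeholder.hash.value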
A better method would be to follow this guide so that you can use vagrant ssh as it was designed.

By default for Vagrant:
user: vagrant
password: vagrant
"...vagrant up again it comes up with Authentication failure warning and retries endlessly."
I think that's because it connects with the wrong SSH public key.
To change it, read this: https://stackoverflow.com/a/23554973/3563993
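As a sketch, the Vagrantfile settings involved might look like this (assuming the box really has a vagrant/vagrant account; config.ssh.insert_key = false keeps Vagrant from swapping in a new random key, which is one common cause of the endless retry loop):

Vagrant.configure("2") do |config|
  config.vm.box = "coreos-stable"    # hypothetical box name
  config.ssh.username = "vagrant"    # account the SSH client logs in as
  config.ssh.password = "vagrant"    # only used if the box allows password auth
  config.ssh.insert_key = false      # keep using the default insecure keypair
end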

Related

Not able to make a connection from Jenkins to AWS EC2 servers

I am trying to use the SSH plugin in Jenkins to execute a script on a remote host server. I have added the remote user credentials in Jenkins, but I am not able to make the connection to the AWS EC2 instance.
[screenshot of the error]
Am I missing any steps in the configuration? How can I resolve this issue?
Some things to consider:
Does the Security Group on the target 3.56.98.1 allow SSH access from the Jenkins IP address?
Do you have the public SSH key of Jenkins added to the target's authorized_keys file?
Have you set the correct username for Jenkins to use on the target (ubuntu, ec2-user, other)?
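A quick way to test all three points from the Jenkins host itself (the key path below is an assumption; substitute your own):

# from the Jenkins server: does the target accept this user and key?
ssh -i ~/.ssh/jenkins_key -o ConnectTimeout=5 ec2-user@3.56.98.1 whoami
# add -v to see which authentication methods the server actually offered
ssh -v -i ~/.ssh/jenkins_key ec2-user@3.56.98.1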

jenkins.plugins.publish_over.BapPublisherException: Failed to connect and initialize SSH connection Message [Auth fail]

I am learning to use Jenkins to deploy a .NET 5.0 application on an AWS EC2 server. This is the first time I am using a Linux server and Jenkins for .NET (I'm a lifelong Windows guy), and I am facing an error while trying to publish my artifacts over SSH to the web server.
My setup:
The Jenkins server is an AWS EC2 Linux AMI server.
The web server is also an AWS EC2 Linux AMI server.
My Jenkins is correctly installed and working. I am able to build and run unit test cases without any issues.
For the deploy step, I am using the 'Publish Over SSH' plugin, and I have followed all the steps to configure it as described here: https://plugins.jenkins.io/publish-over-ssh/.
However, when I try 'Test Configuration', I get the below error:
Failed to connect or change directory
jenkins.plugins.publish_over.BapPublisherException: Failed to connect and initialize SSH connection. Message: [Failed to connect session for config [WebServer]. Message [Auth fail]]
I did a ping test from the Jenkins server to the web server, and it succeeds.
I'm using the .pem key in the 'Key' section of 'Publish over SSH'. This is the same key I use to SSH into the web server.
The link below suggests many different solutions, but none works in my case:
Jenkins Publish over ssh authentification failed with private key
I was also looking at the link below, which describes the same problem:
Jenkins publish over SSH failed to change to remote directory
However, in my case I have kept 'Remote Directory' empty; I don't know if I have to specify a directory here. Anyway, I tried creating a new directory under the home directory of user ec2-user, '/home/ec2-user/publish', and then used this path as the Remote Directory, but it still didn't work.
[screenshot of my settings in Jenkins]
I would appreciate it if anyone could point me in the right direction or highlight any mistake I'm making in my configuration.
In my case, the following steps solved the problem (solution based on Ubuntu 22.04). The plugin's bundled JSch library may only offer ssh-rsa signatures, which newer OpenSSH versions disable by default, so:
1) Add two lines to /etc/ssh/sshd_config:
PubkeyAuthentication yes
PubkeyAcceptedKeyTypes +ssh-rsa
2) Restart the sshd service:
sudo service sshd restart
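To confirm the change took effect, a generic check (not from the original answer): sshd -t validates the file, and sshd -T prints the effective settings:

sudo sshd -t                     # syntax-check /etc/ssh/sshd_config
sudo sshd -T | grep -i pubkey    # show the effective public-key options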
You might consider the following:
a. From the screenshot you've provided, it seems that you have checked the 'Use password authentication, or use a different key' option, which requires you to add your key and password (inputs from these fields will be used when connecting to your server via SSH). If you use the same SSH key and passphrase/password on all of your servers, you can untick that box and just use the config you specified above.
b. You might also check whether port 22 of your web server allows inbound traffic from the security group in which your Jenkins server/EC2 instance is running. See reference here.
c. Also, make sure that the remote directory you have specified exists; otherwise the connection may fail.
Here's the sample config (shown as a screenshot in the original answer).
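As a rough text rendering, the plugin's server entry might look like this (every value below is an illustrative assumption, not taken from the screenshot):

SSH Server
  Name:              WebServer
  Hostname:          3.56.98.1                  (your web server's address)
  Username:          ec2-user
  Remote Directory:  /home/ec2-user/publish     (must already exist)
  Key:               paste the full contents of your .pem file,
                     including the BEGIN/END RSA PRIVATE KEY lines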

unable to connect to Ubuntu ami without using KeyPair

I have built an AMI in AWS using
Ubuntu Server 16.04 LTS (HVM), SSD Volume Type - ami-0d77397e
Now I might be misunderstanding this, but I don't want to use a keypair, as we are sharing this AMI around a team. It is in a security group that is locked down to our IPs, so I just want to be able to log in using user/pass.
When I try to connect I get the username prompt, at which I enter the username ubuntu; on pressing Enter I get this:
Disconnected: No supported authentication methods available (server sent: publickey)
Instead of sharing keys you can create Unix users:
1) sudo adduser username -- it will ask you to enter a password and other details
2) Edit /etc/ssh/sshd_config, setting
PasswordAuthentication yes
3) Restart the SSH daemon with
sudo service ssh restart
Now log back in with ssh username@ec2_ip and enter the password you created in step 1.
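If the login still gets rejected, you can confirm the daemon actually picked up the change (a generic check, not from the original answer; sshd -T prints the effective configuration):

sudo sshd -T | grep -i passwordauthentication    # should print: passwordauthentication yes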
You should use key pairs (multiple, no need to share them), but if you really are resistant then you can enable password logins.

Difference between connecting through the Jenkins SSH plugin and normal ssh

I have a remote server.
If I use ssh to connect to the server as the jenkins user, it works perfectly:
ssh jenkins@remoteserver.com
The jenkins user is allowed to change to user jboss WITHOUT being asked for a password:
sudo su jboss
This works perfectly, with no need to enter a password. Everything works as expected.
If I make a Jenkins build, connecting to the remote server through a SSH plugin, the connection works fine. I can also run a testscript, it works also!
But if I run sudo su jboss through Jenkins on my remote server, it's not working.
Jenkins does not throw any error; there is just a spinning circle.
It never stops unless I cancel the job.
Does anyone have an idea what the difference is between running ssh in Jenkins and connecting through a plugin?
Is the connection lost when changing the username? (It looks like it.)
The SSH plugin and the ssh command provide two completely different implementations of the SSH protocol:
Your ssh command will probably run the OpenSSH client
The SSH plugin uses the SSH protocol implementation provided by JSch
I'm not into JSch, but I'd suspect there's either a problem in the way the plugin configures JSch terminal handling, or a problem related to terminal handling in JSch itself. Either may break the behaviour of sudo:
sudo is somewhat sensitive to terminal/tty settings; see e.g. this discussion, which also contains a few hints which may help to work around the issue.
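If the hang is sudo waiting on a terminal (or su opening an interactive shell that the non-interactive Jenkins session never feeds), two generic things to try; a sketch, not something from the plugin's docs:

# run a single command as jboss instead of opening an interactive shell
sudo -u jboss whoami
# when testing from a plain ssh client, -t forces a pseudo-terminal,
# which sudo needs if sudoers contains "Defaults requiretty"
ssh -t jenkins@remoteserver.com "sudo -u jboss whoami"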

Can't understand vagrant ssh implementation

I recently started to use Vagrant (and recently moved from Windows to Ubuntu, too). My goal is to understand the fundamentals of vagrant ssh.
So, I'm trying to understand what vagrant ssh actually does.
I've read "What does vagrant ssh actually do?", but I haven't understood anything.
I'll try to explain with an example:
The first time, I connected to the Vagrant machine with ssh vagrant@192.168.0.x and typed the password.
Next, I configured a keypair and connected to the guest with ssh vagrant@192.168.0.x without typing a password.
Next, I tried to understand how Vagrant implements SSH into its own guest machine:
In /etc/ssh/sshd_config, I set PasswordAuthentication no, but vagrant ssh still works.
I deleted insecure_private_key placed in ~/.vagrant.d on the host machine, but Vagrant restores it and vagrant ssh still works.
I removed openssh-server in the Vagrant machine, and now vagrant ssh really doesn't work :)
Could anybody please explain in plain English how Vagrant implements vagrant ssh?
Update: Vagrant Docs: SSH explains actually what I need.
Maybe I didn't get the point of your question, but I'll try to explain the main differences between vagrant ssh and ssh.
vagrant ssh is actually the same as normal ssh, but there are several differences between them:
the port the SSH client tries to access;
the private key the SSH client uses for authentication;
host-key checking is switched off for Vagrant, so you will not get the initial "the host is unknown" message;
other minor differences.
If you know the port where Vagrant runs and where the private key Vagrant uses is located, you can use ssh instead of vagrant ssh.
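You can see the exact values Vagrant uses with vagrant ssh-config and feed them to a plain ssh client yourself; the port and key path below are typical VirtualBox defaults, not guaranteed values:

vagrant ssh-config    # prints HostName, Port, User and IdentityFile
# typical equivalent of `vagrant ssh` for a VirtualBox provider:
ssh vagrant@127.0.0.1 -p 2222 -i .vagrant/machines/default/virtualbox/private_key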
