Unable to transfer a local file from the WSL Ubuntu terminal to a remote server using Windows Subsystem for Linux

I have a file called test1.zip in the /mnt/c/Users/test/ folder of my local laptop [on which Ubuntu under Windows Subsystem for Linux is installed]. The local Ubuntu WSL hostname is lauda.
Now, I would like to transfer this zip file called test1.zip to my remote server named stuff.
PLEASE NOTE THAT ALL COMMANDS WERE RUN FROM MY LOCAL LAPTOP WSL SCREEN [ubuntu screen]
So, I tried the below command from my WSL [local laptop Ubuntu WSL terminal]:
scp user1@lauda:/mnt/c/Users/test/test1.zip user1@stuff:/home/test/codes/test1
and got the error ssh: Could not resolve hostname lauda: Name or service not known
So I tried the below [replacing the lauda hostname with its IP]:
scp user1@172.xx.xxx.xxx:/mnt/c/Users/test/test1.zip user1@stuff:/home/test/codes/test1
This resulted in the error ssh: connect to host 172.xx.xxx.xxx port 22: Connection refused
Then I tried the same command the opposite way, as shown below:
scp user1@stuff:/home/test/codes/ user1@lauda:/mnt/c/Users/test/test1.zip
and got the below error
ssh: Could not resolve hostname lauda: Temporary failure in name resolution
Later, I tried with the IP address:
scp user1@stuff:/home/test/codes/ user1@172.xx.xxx.xxx:/mnt/c/Users/test/test1.zip
And I got the below error
ssh: connect to host 172.xx.xxx.xxx port 22: No route to host lost connection
Later, I tried the below command as well:
scp /mnt/c/Users/test/test1.zip user1@stuff:/home/test/codes/
and got an error scp: /home/test/codes/test1.zip: Permission denied
So, I tried again like below:
scp user1@stuff:/home/test/codes/ /mnt/c/Users/test/test1.zip
and got an error scp: /home/test/codes: not a regular file
How can I transfer local files/folders from my local Ubuntu WSL terminal to the remote server?

scp /mnt/c/Users/test/test1.zip user1@stuff:/home/test/codes/ is the closest attempt to working. The error you get could be due to one of two reasons:
Firstly, user1 does not have permission to write to /home/test on stuff - this makes sense, as usually only the test user would be able to write there. (Note that the test user on your WSL instance is not the same profile as the test user on the remote.)
Secondly, the /home/test/codes/ folder may not even exist yet.
Instead (if you know test's password) copy as the test user:
scp /mnt/c/Users/test/test1.zip test@stuff:/home/test/codes/
Or copy to user1's home directory (after ensuring you have created /home/user1/codes/):
scp /mnt/c/Users/test/test1.zip user1@stuff:/home/user1/codes/
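If the directory might not exist yet, you can create it over ssh first and then copy; a minimal sketch, reusing the user1 and stuff names from the question:
ssh user1@stuff 'mkdir -p /home/user1/codes'
scp /mnt/c/Users/test/test1.zip user1@stuff:/home/user1/codes/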

Related

Git clone gives "ssh: connect to host github.com port 22: Connection timed out" in the Linux /opt directory on an Amazon EC2 Instance

Issue
I am trying to use git in the /opt/jamf2snipe directory on an EC2 Instance. I have tried the following command:
sudo git clone git@github.com:MYUSERNAME/jamf2snipe-school.git
It says connection timed out:
Cloning into 'jamf2snipe-school'...
ssh: connect to host github.com port 22: Connection timed out
fatal: Could not read from remote repository.
Please make sure you have the correct access rights
and the repository exists.
If I try to run this in my home directory it works fine. It seems to be a permission issue with /opt. I am wary of changing permissions for /opt.
Setup
I am trying to do this on an Amazon EC2 Instance. Currently SSH is limited to certain IP addresses (not including GitHub). I followed this article from GitHub to use SSH over HTTPS. I tested to make sure I had things set up correctly by using:
$ ssh -T git@github.com
received
Hi USERNAME! You've successfully authenticated, but GitHub does not provide shell access.
I did this in /opt/jamf2snipe and the home directory successfully.
First, if possible, avoid using sudo.
In addition to executing commands as root (which is dangerous), sudo uses its own environment variables and SSH settings (in /root/.ssh), which differ from those of your normal EC2 user.
Also, /opt, which might be writable only by root, is not the best spot to clone a repository.
Second, using SSH over the HTTPS port is the usual solution (like this one from 2018) on EC2, where the firewall can block SSH egress traffic by default.
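For reference, GitHub's documented ~/.ssh/config entry for SSH over the HTTPS port looks like this:
Host github.com
    Hostname ssh.github.com
    Port 443
    User git
You can verify the fallback works before editing the config with ssh -T -p 443 git@ssh.github.com; with the entry in place, plain git@github.com URLs transparently connect to ssh.github.com on port 443 instead of port 22.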

scp connection timed out to Azure Ubuntu VM

So I'm trying to copy a file from my directory to an Azure Ubuntu VM. SSH works just fine, but the scp command takes a lot of time and then I get this message:
connect to host 10.x.x.x port 22: Connection timed out lost connection
This is the command I used:
scp -vvv -i .ssh/id_rsa BaltimoreCyberTrustRoot.crt.pem azureuser@10.x.x.x:/var/www/html
• AFAIK, the scp command that you are using might not be correct, as the correct command to copy files from your local machine to your Ubuntu Linux VM is as follows:
scp -r ./tmp/ azureuser@10.xxx.xxx.xxx:/home/file/user/local
In the above command, the scp connection is established after the private key is supplied, and the files in the local './tmp' directory are recursively copied into the '/home/file/user/local' directory on the Azure Ubuntu VM. Thus, the whole directory tree as specified is copied from the local system to the Azure Ubuntu VM.
• Also, if you want to use the private key in the scp command through SSH, then you will have to use the below command to copy files from the local system to the Azure Ubuntu VM:
sudo scp -i ~/.ssh/id_rsa /path/cert.pem azureuser@10.xxx.xxx.xxx:/home/file/user/local
When using 'sudo' to access a root-owned file, scp is going to look for the identity file 'id_rsa' in '/root/.ssh/' instead of in '/home/user/.ssh/'. That is why you have to specify the identity file (private key) explicitly in the scp command to connect to the Azure Ubuntu VM and transfer files from the local system to the VM.
• Other than this, kindly ensure that port 22 is open in an inbound NSG rule on the Azure Ubuntu VM, and that the VM's default page is accessible on port 80/443 over the public IP address and the assigned Azure FQDN.
For more information, kindly refer to the links below:
Can't scp to Azure's VM
https://learn.microsoft.com/en-us/azure/virtual-machines/linux/copy-files-to-linux-vm-using-scp#scp-a-directory-from-a-linux-vm
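If the NSG turns out to be the blocker, the Azure CLI can open the port; a minimal sketch, where myResourceGroup and myVM are placeholders for your actual resource group and VM name:
az vm open-port --resource-group myResourceGroup --name myVM --port 22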

ssh: Could not resolve hostname sama5d27-som1-ek-sd:

I am trying to set up a remote connection between my desktop machine (Windows) and a remote machine (Linux).
For that, I am following the steps described here :
https://learn.microsoft.com/en-us/cpp/linux/set-up-fips-compliant-secure-remote-linux-development?view=vs-2019#to-create-and-use-an-rsa-key-file
But in step 2, when I try, from Windows, to copy the public key to the Linux machine using this command:
scp C:\Users\wiemz/.ssh/id_rsa.pub root@sama5d27-som1-ek-sd:
This error occurs :
ssh: Could not resolve hostname sama5d27-som1-ek-sd: Hôte Unknown.
lost connection
I verified the Linux machine's internet connection with the ping command and it works fine. Besides, when I typed this command on the Linux machine:
ssh root@sama5d27-som1-ek-sd
it says :
ssh: Could not resolve hostname sama5d27-som1-ek-sd: Temporary failure
in name resolution
How can I fix this problem, please?
The problem is that your OS can't resolve the hostname you are using. You should provide a FQDN like web.example.com or the IP address of the machine. For example:
scp file root@IP:
or
scp file root@web.example.com:
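If the board has no DNS entry on your network, you can also map the name yourself in /etc/hosts (or C:\Windows\System32\drivers\etc\hosts on Windows); a minimal sketch, where 192.168.1.50 is a placeholder for the board's actual address:
192.168.1.50   sama5d27-som1-ek-sd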

EC2 instance access failed due to change in owner

When I log in to the server, even though port 22 is already open for all incoming connections, I still get the error below:
ssh Server_Name
ssh: connect to host Server-IP port 22: Connection refused
I mistakenly changed the owner of the system files and changed root privileges with jenkins. So right now I am not able to log into the system; port 22 is closed and it is throwing the error.
I understand the issue occurred because of a wrong fstab file and wrong edits to the sshd conf (not sure), and the authorized_keys directory has been messed up. I tried this solution but it is not working.
I tried accessing via the public DNS, via the private IP address, and detaching and re-attaching the volume after attaching it to another instance (but once I attached it, I could not ssh into that instance either), etc., but no luck. I also tried logging in with the Jenkins user, still not working. Jenkins is still running fine on the server: I can access the Jenkins Dashboard and run a shell on my instance. But if I try any sudo command, it shows sudo: effective uid is not 0, is sudo installed setuid root?
Build step 'Execute shell' marked build as failure
Questions
Is there any way to get my instance's port 22 running fine as before?
Is there a way I can run sudo commands as the Jenkins user by creating a job (running a shell) inside Jenkins?
I ran a trace on the IP, which clearly shows port 22 is closed, and I cannot do anything because of it. Thanks in advance.
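For what it's worth, the sudo error itself points at one concrete fix: sudo only works when its binary is owned by root and setuid root. A minimal recovery sketch from a second (rescue) instance with the broken root volume attached, assuming it shows up as /dev/xvdf1 (the device name varies):
# on the rescue instance, mount the broken volume
sudo mkdir -p /mnt/rescue
sudo mount /dev/xvdf1 /mnt/rescue
# restore root ownership and the setuid bit on sudo
sudo chown root:root /mnt/rescue/usr/bin/sudo
sudo chmod 4755 /mnt/rescue/usr/bin/sudo
While the volume is mounted, you can also inspect its /etc/fstab and sshd_config before unmounting and re-attaching it to the original instance as the root device.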

Issues with using Jump Host

How do I transfer a file from my local machine to a remote host that I need to reach through a jump host? These are the steps I follow to connect to the remote host:
1. ssh myname#jump-host
2. enter password
3. sudo su - another-random-name
4. ssh name#remote-host
Now I want to transfer a file from my local machine to the remote-host. How would I achieve this? I have already tried scp -o ProxyCommand, but I don't quite know where I should include step 3 as part of this command.
Use port forwarding to expose the third host's ssh port on your localhost, in this way:
ssh -L 2222:remote-host:22 myname@jump-host
then (in another tab/shell on your local machine):
scp -P 2222 file name@localhost:
will copy the file directly to remote-host.
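As an alternative closer to the -o ProxyCommand route mentioned in the question, OpenSSH's ProxyJump option does the hop in one command; a sketch, with the caveat that it cannot replay step 3 (sudo su - another-random-name), so name must be able to log in to remote-host directly:
scp -o ProxyJump=myname@jump-host file name@remote-host: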
On the jump host, under another-random-name, run
ssh -g -L 2222:remote-host:22 myname@jump-host
(-g lets hosts other than the jump host itself connect to the forwarded port; without it the listener binds only to the jump host's loopback)
then on your local computer you can run
scp -P 2222 file name@jump-host:
scp will try to connect to jump-host, while in fact this connection will be forwarded to remote-host, and will use name as it is connecting to remote-host.
You will probably still face an authentication problem for another-random-name. You can create a key pair on your machine for your local user and put the public key in the allowed keys for name on remote-host.
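A quick way to install that public key, assuming password authentication for name on remote-host still works, is to push it through the forwarded port set up above; a sketch:
ssh-keygen -t ed25519
ssh-copy-id -p 2222 name@jump-host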
