So I'm trying to copy a file from my directory to an Azure Ubuntu VM. SSH works just fine, but the scp command takes a long time and then I get this message:
connect to host 10.x.x.x port 22: Connection timed out lost connection
This is the command I used:
scp -vvv -i .ssh/id_rsa BaltimoreCyberTrustRoot.crt.pem azureuser@10.x.x.x:/var/www/html
• AFAIK, the scp command you are using to connect to your Ubuntu Azure VM might not be correct. The correct command to copy files from your local machine to your Ubuntu Linux VM is as follows:
scp -r ./tmp/ azureuser@10.xxx.xxx.xxx:/home/file/user/local
In the above command, the scp connection is established once the private key is accepted, and the files in the local ‘./tmp/’ directory are then copied recursively into the ‘/home/file/user/local’ directory on the Azure Ubuntu VM. In other words, the whole directory tree is copied from the local system to the Azure Ubuntu VM.
• Also, if you want to use the private key with scp over SSH, use the command below to copy files from the local system to the Azure Ubuntu VM:
sudo scp -i ~/.ssh/id_rsa /path/cert.pem azureuser@10.xxx.xxx.xxx:/home/file/user/local
When you run scp with ‘sudo’ to access a root-owned file, scp looks for the identity file ‘id_rsa’ in ‘/root/.ssh/’ instead of in ‘/home/user/.ssh/’. That's why you have to specify the identity file (private key) explicitly in the scp command when connecting to the Azure Ubuntu VM to transfer files from the local system.
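Since the original command targets /var/www/html (usually root-owned on the VM), another option is to copy to a location the login user can write and then move the file with sudo on the VM. A minimal sketch, assuming azureuser has sudo rights on the VM and reusing the paths from the question:
# copy to a world-writable location first, then move it into place as root on the VM
scp -i ~/.ssh/id_rsa BaltimoreCyberTrustRoot.crt.pem azureuser@10.x.x.x:/tmp/
ssh -i ~/.ssh/id_rsa azureuser@10.x.x.x 'sudo mv /tmp/BaltimoreCyberTrustRoot.crt.pem /var/www/html/'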
Other than this, kindly ensure that port 22 is open in an inbound NSG rule on the Azure Ubuntu VM, and that the VM's default page is reachable on ports 80/443 over the public IP address and the assigned Azure FQDN.
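If port 22 turns out to be blocked, one way to open it is the Azure CLI's vm open-port command. A minimal sketch, where the resource group and VM names are placeholders you would replace with your own:
# placeholder names; adjust to your resource group and VM
az vm open-port --resource-group myResourceGroup --name myUbuntuVM --port 22 --priority 900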
For more information, kindly refer to the links below:
Can't scp to Azure's VM
https://learn.microsoft.com/en-us/azure/virtual-machines/linux/copy-files-to-linux-vm-using-scp#scp-a-directory-from-a-linux-vm
Related
I have created a simple VM in Azure on which I have to host a very simple server written in C.
To send the folder on my computer containing the server to the virtual machine, I use this command from PowerShell:
scp -r <path_to_key.pem> <path_to_folder_on_my_pc> <azureuser@ip:/home/azureuser/>
The result of this command is
azureuser@ip: Permission denied (publickey).
lost connection
Would anyone who has had this problem have a solution?
You need to copy your private key to the ~/.ssh/ directory on the host from which you want to transfer the file. Once you have done that, you can use the following command:
scp -i ~/.ssh/<name of your key>.pem <path of file to transfer> user@azureip:<target directory>
So, for example, if you want to transfer file.txt to your Azure VM (IP 10.10.10.10) with the private key named key.pem:
scp -i ~/.ssh/key.pem file.txt user@10.10.10.10:/home/user/
To pull a file from your Azure VM to your local host, reverse the order: put user@azureip and the remote file first, and the local destination last.
scp -i ~/.ssh/key.pem user@10.10.10.10:/home/user/file.txt /home/user/
This problem may be caused by your public key. Please ensure that the public key is also present in the home directory on the VM when you create the Azure virtual machine with a public key. In other words, the public key must be kept on both your local computer and the virtual machine; once your local workstation's key is accepted, you can SSH into your Azure virtual machine using it.
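For instance, a small sketch of putting your public key on the VM, assuming your key pair is ~/.ssh/id_rsa / id_rsa.pub and that you can currently still log in some other way (e.g. password), with <vm_public_ip> as a placeholder:
# copies ~/.ssh/id_rsa.pub into ~/.ssh/authorized_keys on the VM
ssh-copy-id -i ~/.ssh/id_rsa.pub azureuser@<vm_public_ip>
# or append it manually
cat ~/.ssh/id_rsa.pub | ssh azureuser@<vm_public_ip> 'mkdir -p ~/.ssh && cat >> ~/.ssh/authorized_keys'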
Reference: linux - Can't scp to Azure's VM - by ale93p
If you want to use the private key with scp, then use the command below to copy files from the local system to the Azure VM:
sudo scp -i ~/.ssh/id_rsa /path/cert.pem azureuser@ip.xxx.xxx.xxx:/home/file/user/local
Make sure that the Azure VM's inbound NSG rule has port 22 open and that, by default, the VM's page is reachable through ports 80/443 over the public IP address.
For more information, please refer to these links:
Use SSH keys to connect to Linux VMs - Azure Virtual Machines | Microsoft Docs
Use SCP to move files to and from a VM - Azure Virtual Machines | Microsoft Docs
I have a file called test1.zip in the /mnt/c/Users/test/ folder of my local laptop [on which the Ubuntu Windows Subsystem for Linux is installed]. The local Ubuntu WSL terminal's hostname is lauda.
Now, I would like to transfer this zip file called test1.zip to my remote server named stuff.
PLEASE NOTE THAT ALL COMMANDS ARE TRIED FROM MY LOCAL LAPTOP WSL SCREEN [ubuntu screen]
So, I tried the below command from my WSL [local laptop ubuntu WSL terminal]
scp user1@lauda:/mnt/c/Users/test/test1.zip user1@stuff:/home/test/codes/test1
and got the error ssh: Could not resolve hostname lauda: Name or service not known
So I tried the below [replacing lauda, the local laptop Ubuntu terminal hostname, with its IP]
scp user1@172.xx.xxx.xxx:/mnt/c/Users/test/test1.zip user1@stuff:/home/test/codes/test1
which resulted in the error ssh: connect to host 172.xx.xxx.xxx port 22: Connection refused
Now I tried the same command as above, but the opposite way, as shown below
scp user1@stuff:/home/test/codes/ user1@lauda:/mnt/c/Users/test/test1.zip
and got the below error
ssh: Could not resolve hostname lauda: Temporary failure in name resolution
Later, I tried with the IP address
scp user1@stuff:/home/test/codes/ user1@172.xx.xxx.xxx:/mnt/c/Users/test/test1.zip
And I got the below error
ssh: connect to host 172.xx.xxx.xxx port 22: No route to host lost connection
Later, I tried the below commands as well
scp /mnt/c/Users/test/test1.zip user1@stuff:/home/test/codes/
and got an error scp: /home/test/codes/test1.zip: Permission denied
So, I tried again as below
scp user1@stuff:/home/test/codes/ /mnt/c/Users/test/test1.zip
and got an error scp: /home/test/codes: not a regular file
PLEASE NOTE THAT ALL COMMANDS ARE TRIED FROM MY LOCAL LAPTOP WSL SCREEN [ubuntu screen]
How can I transfer local files/folders from my local ubuntu WSL terminal to remote server?
scp /mnt/c/Users/test/test1.zip user1@stuff:/home/test/codes/ is the closest attempt to working. The error you get could be due to one of two reasons:
Firstly, user1 does not have permission to write to /home/test on stuff, which makes sense, as usually only the test user would be able to write there. (Note that the test user on your WSL instance is not the same profile as the test user on the remote.)
Secondly, the /home/test/codes/ folder may not even exist yet.
Instead (if you know test's password), copy as the test user:
scp /mnt/c/Users/test/test1.zip test@stuff:/home/test/codes/
Or copy to user1's home directory (after ensuring you have created /home/user1/codes/, see the sketch below):
scp /mnt/c/Users/test/test1.zip user1@stuff:/home/user1/codes/
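If /home/user1/codes/ does not exist yet, you can create it over ssh first; a quick sketch using the same names from the question:
# create the target directory as user1, then copy into it
ssh user1@stuff 'mkdir -p /home/user1/codes'
scp /mnt/c/Users/test/test1.zip user1@stuff:/home/user1/codes/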
So I am new to tech and, as previous posts suggest, I am working with OCI. Currently I run a Linux 8 VM on OCI. My goal is to run Terraform scripts on the VM and have the resources created in OCI.
Current problem:
The .tf files I will be writing will be written on my local Windows 10 machine and saved in a local directory. I need a way of transferring these local files to a directory on my Linux machine in order to execute them!
Is anybody good with OCI? Is there capability for an SFTP transfer using WinSCP? I'm just not sure where to start. Anybody with good advice, please aid me!
It depends on your OCI network configuration.
If your OCI compute VM is in a public subnet and you have an internet gateway, then you can use ssh to connect to it (using PuTTY, for instance). That means you can also use scp, which copies files over ssh. As you mentioned, WinSCP lets you connect to your OCI compute VM over ssh using scp or SFTP. After installing it, you can create a new connection using the public IP of your OCI compute VM and the private key.
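For example, if the OpenSSH client is installed on Windows, a rough sketch from a PowerShell prompt (the key path, folder name, and <oci_vm_public_ip> are just assumptions here) would be:
# copies the whole terraform folder recursively to opc's home directory on the VM
scp -r -i D:\my_folder\oci_api_key.pem D:\my_folder\terraform opc@<oci_vm_public_ip>:/home/opc/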
My personal preference is to use MobaXterm to connect over ssh to my OCI compute VMs. Once connected to a remote host using ssh, the left pane directly displays a file browser for the remote host. Drag-and-dropping a file there initiates an SFTP transfer automatically.
Please also note that the scp protocol has been considered deprecated since 2019; SFTP or rsync can be used instead. Using MobaXterm, this can be done by opening a new terminal tab (which is local to your Windows machine) and typing the rsync command you want, for instance:
rsync -v -P -e 'ssh -i "D:/my_folder/oci_api_key.pem"' /cygdrive/d/my_folder/*.tf opc@<oci_vm_ip>:/home/opc/my_folder
-v increases verbosity, to display more information. -P displays partial progress for each file transferred. -e lets you specify which command to use to run rsync; in this case I use ssh and pass the private key. More options are available; you can check them by typing man rsync.
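If you prefer SFTP over rsync, a minimal interactive sketch from the same local terminal (same key-path assumption; main.tf is a placeholder file name) looks like this:
sftp -i "D:/my_folder/oci_api_key.pem" opc@<oci_vm_ip>
sftp> put /cygdrive/d/my_folder/main.tf /home/opc/my_folder/
sftp> bye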
If your OCI compute VM is in a private subnet, you need to set up a bastion VM in a public subnet and first access the bastion, then the VM. Here is a blog post about how to achieve that using PuTTY and WinSCP: https://www.ateam-oracle.com/ssh-tunnel-to-a-private-vm-using-a-bastion-host-in-oci
How do I transfer a file from my local machine to a remote host to which I need to get through a jump host? These are the steps I follow to connect to the remote host
1. ssh myname@jump-host
2. enter password
3. sudo su - another-random-name
4. ssh name@remote-host
Now I want to transfer a file from my local machine to the remote-host. How would I achieve this? I have already tried scp -oProxyCommand but I don't quite know where I should include step 3 as part of this command?
Use port forwarding to bring the third host's ssh port to your localhost, like this:
ssh -L 2222:remote-host:22 myname@jump-host
then (in another tab/shell on your local machine):
scp -P 2222 file name@localhost:
will copy directly to remote host.
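As a side note, if your key is accepted directly by remote-host for the name user (i.e. the sudo su step isn't actually needed for the copy itself), OpenSSH's ProxyJump option can do this in one command. A sketch, assuming OpenSSH 7.3 or newer on your machine:
# hop through the jump host transparently; ssh handles the intermediate connection
scp -o ProxyJump=myname@jump-host file name@remote-host:
Newer OpenSSH releases also accept the shorter -J flag for scp.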
On the jump host under another-random-name run
ssh -L 2222:remote-host:22 myname@jump-host
then on your local computer you can run
scp -P 2222 file name@jump-host:
scp will try to connect to jump-host, while in fact this connection will be forwarded to remote-host, and it will use name as the user since it is actually connecting to remote-host.
You are probably still facing a key problem for another-random-name. You can create a key pair on your machine for your local user and put the public key into the allowed keys of the target user on remote-host.
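A minimal sketch of that (the key type and file names are just one common choice):
# on your local machine: generate a key pair
ssh-keygen -t ed25519 -f ~/.ssh/id_ed25519
# print the public key, then paste it into ~/.ssh/authorized_keys for name on remote-host
cat ~/.ssh/id_ed25519.pub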
I used to connect to Amazon Web Services using the ssh command and an application.pem key. Now when I try to connect to other platforms such as GitHub, my ssh client looks for the same application.pem key and tries to connect to AWS. How do I connect to GitHub or change the default host and key configuration? I am using an Ubuntu 13.10 system, and the following is my ssh output.
pranav@pranav-SVF15318SNW:~/.ssh$ ssh
Warning: Identity file application.pem not accessible: No such file or directory.
You need the identity file to login to the box. Use the command:
ssh -i (identity_file) username@hostname
This worked for me. Write just the filename (without any slashes), unlike the Amazon EC2 tutorial, which asks you to enter:
ssh -i /path/key_pair.pem ec2-user@public_dns_name
and also check the key file's permissions.
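For the default-key part of the question, one common approach is per-host entries in ~/.ssh/config, so the AWS key is only used for the AWS box and GitHub gets its own key. A sketch with placeholder host names and paths:
# ~/.ssh/config (placeholder values)
Host my-aws-box
    HostName ec2-xx-xx-xx-xx.compute-1.amazonaws.com
    User ec2-user
    IdentityFile ~/.ssh/application.pem

Host github.com
    User git
    IdentityFile ~/.ssh/id_rsa
Also make sure the key file itself is locked down, e.g. chmod 400 ~/.ssh/application.pem, or ssh will refuse to use it.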