PiCloud scp with RSA file not working - Linux

I am using PiCloud, a high-performance cloud service that runs on top of Amazon EC2. I am trying to copy files into a newly created "environment" in my account, but I am unable to use the scp command to copy files from my local machine into the PiCloud environment I have created.
The usual way to SSH into the PiCloud environment is as follows:
ssh -i picloud_rsa picloud@ec2-54-242-89-28.compute-1.amazonaws.com
But when I try to replace ssh with scp using the following format:
scp -r ~/path_to_the_directory -i picloud_rsa picloud@ec2-54-242-89-28.compute-1.amazonaws.com
I get the following error:
cp: -i: No such file or directory
cp: picloud@ec2-54-242-89-28.compute-1.amazonaws.com/picloud_rsa: Permission denied
And if I try the following:
scp -r ~/Desktop/AllFolders/GMU/Fall\ 2013/yelp_phoenix_academic_dataset_duplicated/ picloud_rsa picloud@ec2-54-242-89-28.compute-1.amazonaws.com
I get just the permission denied error:
cp: picloud@ec2-54-242-89-28.compute-1.amazonaws.com/picloud_rsa: Permission denied
I have absolutely no idea how to use scp in this case and would really appreciate any help.
Thanks in advance!

The query I used was:
scp -i picloud_rsa -r ~/my_path_to_directory picloud@ec2-54-242-89-28.compute-1.amazonaws.com:/home/picloud
The solution was to pass the -i flag, along with the RSA key file, at the beginning, right after the scp call, and to give the destination in user@host:path form with an explicit remote path.

Related

Copy folder from local machine to a server in Linux

I am trying to copy a non-empty folder from my local machine to an AWS server.
So I used the following command, but it was not working:
scp -r IPADTEST.pem oafolder ec2-user@__________.compute.amazonaws.com:testfolder
The error was:
ec2-user@_________.compute.amazonaws.com: Permission denied (publickey,gssapi-keyex,gssapi-with-mic).
lost connection
I am sure the IPADTEST.pem is working okay, because I can SSH from the same location:
$ ssh -i IPADTEST.pem ec2-user@_____________.compute.amazonaws.com
Also, I can copy a file (not folder), for example I can copy index.html:
sudo scp -i IPADTEST.pem index.html ec2-user@______________.compute.amazonaws.com:testfolder/
You absolutely need the "-i <keyfile>.pem" option.
Q: Have you tried scp -r -i IPADTEST.pem oafolder ec2-user@__________.compute.amazonaws.com:testfolder?

How to properly upload a local file to a server using mobaXterm?

I'm trying to upload a file from my local desktop to a server and I'm using this command:
scp myFile.txt cooluser@192.168.10.102:/opt/nicescada/web
following the structure: scp filename user@ip:/remotePath.
But I get "Permission denied". I tried using sudo, but I get the same message. I am able to download from the server to my local machine, so I assume I have all the permissions needed.
What can be wrong in that line of code?
In case /desired/path on your destination machine is writable only by root, and you have an account on the destination machine with sudo privileges, you could also do it the following way:
Option 1 based on scp:
copy the file to a location on your destination machine where you have write access like /tmp:
scp file user@destinationMachine:/tmp
Login to your destination machine with:
ssh user@destinationMachine
Move the file to your /desired/path with:
sudo mv /tmp/file /desired/path
In case you have a passwordless sudo setup, you could also combine steps 2 and 3 into:
ssh user@destination sudo mv /tmp/file /desired/path
Option 2 based on rsync
Another, maybe even simpler, option would be to use rsync:
rsync -e "ssh -tt" --rsync-path="sudo rsync" file user@destinationMachine:/desired/path
with -e "ssh -tt" added to force pseudo-terminal allocation, so sudo can run even though rsync itself does not provide an interactive terminal.
Try and specify the full destination path:
scp myFile.txt cooluser@192.168.10.102:/opt/nicescada/web/myFile.txt
Of course, double-check that cooluser has the right to write (not just read) in that folder: 755, not 644, for the web parent folder.
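The write-permission requirement can be checked entirely locally with a plain cp, which fails the same way scp does when the target directory lacks the write bit. A minimal sketch, using throwaway paths under /tmp (run it as a non-root user, since root bypasses permission bits):

```shell
# Local sketch of the permission problem (hypothetical paths under /tmp).
demo=/tmp/perm_demo
rm -rf "$demo" && mkdir -p "$demo/web"
echo data > "$demo/myFile.txt"

chmod 555 "$demo/web"                          # r-xr-xr-x: no write bit on the dir
cp "$demo/myFile.txt" "$demo/web/" 2>/dev/null \
  && echo "copied" || echo "denied"            # "denied" for a non-root user

chmod 755 "$demo/web"                          # rwxr-xr-x: owner can write
cp "$demo/myFile.txt" "$demo/web/" && echo "copied"
```

The same fix applies on the server side: give the scp target directory mode 755 (owned by the connecting user) before retrying the transfer.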

SCP not working permission denied even with SSH key given

I am trying to get scp to work to transfer a file from a remote server to my local machine. I tried looking around, and this post helped the most, but it still is not working. Here is the current output:
<HOSTNAME>:chef4-rep
<USERNAME>$ sudo scp -i ./.chef/<NAME>.pem <USERNAME>@<IP>:/home/postgres/post_0604_dump/db0604_schema_and_data.sql ~/
<USERNAME>@<IP>: Permission denied (publickey).
The issue turned out not to be with my command, but with the fact that I was trying to copy a file in another user's directory. I ended up SSHing in and using sudo to copy the file to my home directory, and then used scp with no issues.
Use the command below to get it done:
root@localhost# scp -r "source_file/directory" "user@remote-host:/destination/location"
And if you are using passwordless SSH, make sure you are using the correct user, whose public key is shared with the remote server.
Thanks
I had the same issue with scp and got Permission denied (publickey):
This worked:
ssh -i "mykey.pem" ubuntu@??.??.??.???
But this didn't: scp -i "mykey.pem" test.php ubuntu@??.??.??.???:
I solved it by removing the quotes off my key file:
scp -i mykey.pem test.php ubuntu@??.??.??.???:

Google cloud scp permission denied

I am trying to transfer files to my Google-cloud-hosted Linux (Debian) instance via secure copy (scp). I did exactly what the documentation says about connecting from a local machine to an instance: https://cloud.google.com/compute/docs/instances/connecting-to-instance.
Created an SSH key pair
Added the public key to my instance
I can log in successfully by:
ssh -i ~/.ssh/my-keygen [USERNAME]@[IP]
But when I want to copy files to the instance I get a message "permission denied".
scp -r -i ~/.ssh/my-keygen /path/to/directory/ [USERNAME]@[IP]:/var/www/html/
It looks like the user with which I login has no permissions to write files, so I already tried to change the file permissions of /var/www/, but this still gives the permission denied message.
I also tried to add the user to the root group, but this still gives the same problem.
usermod -G root myuser
The command line should be
scp -r -i ~/.ssh/my-keygen /path/to/directory/ [USERNAME]@[IP]:/var/www/html/
Assuming your files are in the local /path/to/directory/ and the /var/www/html/ is on the remote server.
The permissions do not allow writing to /var/www/html/. Writing to /tmp/ should work. Then you can copy the files with sudo, with root privileges, to the desired destination.
If SSH isn't working, install gcloud CLI and run the following locally: gcloud compute scp --recurse /path/to/directory [IP] --tunnel-through-iap. This will dump the directory into your /home/[USERNAME]/ folder. Then log into the console and use sudo to move the directory to /var/www/html/.
For documentation, see https://cloud.google.com/sdk/gcloud/reference/compute/scp.

How to copy entire folder from Amazon EC2 Linux instance to local Linux machine?

I connected to an Amazon Linux instance over SSH using a private key. I am trying to copy an entire folder from that instance to my local Linux machine.
Can anyone tell me the correct scp command to do this?
Or do I need something more than scp?
Both machines are Ubuntu 10.04 LTS
Another way to do it is:
scp -i "insert key file here" -r "insert ec2 instance here" "your local directory"
One mistake I made was writing scp -ir. The key file has to come right after the -i, and the -r after that.
So:
scp -i amazon.pem -r ec2-user@ec2-##-##-##:/source/dir /destination/dir
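The reason scp -ir fails is standard option parsing: -i expects an argument, so in the combined form the letter r is swallowed as the identity-file name instead of being treated as -r. A small sketch of the same parsing rule using the shell's own getopts (the parse function here is illustrative, not part of scp):

```shell
# Illustrative parser mimicking scp's -i (takes an argument) and -r (a flag).
parse() {
  local OPTIND opt id="" recursive=no
  while getopts "i:r" opt; do
    case $opt in
      i) id=$OPTARG ;;      # -i consumes the rest of the word (or the next token)
      r) recursive=yes ;;
    esac
  done
  echo "identity=$id recursive=$recursive"
}

parse -ir amazon.pem    # "r" becomes -i's argument: identity=r recursive=no
parse -i amazon.pem -r  # identity=amazon.pem recursive=yes
```

This is why the key file must directly follow -i, with -r given separately.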
Call scp from client machine with recursive option:
scp -r user#remote:src_directory dst_directory
scp -i {key path} -r ec2-user#54.159.147.19:{remote path} {local path}
For EC2 Ubuntu:
Go to your .pem file's directory, then:
scp -i "yourkey.pem" -r ec2user@DNS_name:/home/ubuntu/foldername ~/Desktop/localfolder
You could even use rsync.
rsync -aPSHiv remote:directory .
This is how I copied a file from Amazon EC2 to a local Windows PC:
pscp -i "your-key-pair.pem" username@ec2-ip-compute.amazonaws.com:/home/username/file.txt C:\Documents\
For Linux to copy a directory:
scp -i "your-key-pair.pem" -r username@ec2-ip-compute.amazonaws.com:/home/username/dirtocopy /var/www/
Connecting to Amazon requires key pair authentication.
Note:
The username is most probably ubuntu.
I use sshfs to mount the remote directory on my local machine, and then you can do whatever you want with the files locally. The exact commands may differ on your system.
This is also important and related to the above answer.
Copying all files in a local directory to EC2. This is a Unix answer.
Copy the entire local folder to a folder in EC2:
scp -i "key-pair.pem" -r /home/Projects/myfiles ubuntu@ec2.amazonaws.com:/home/dir
Copy only the entire contents of local folder to folder in EC2:
scp -i "key-pair.pem" -r /home/Projects/myfiles/* ubuntu@ec2.amazonaws.com:/home/dir
I do not like to use scp for a large number of files, as it does a 'transaction' for each file. The following is much better:
cd local_dir; ssh user@server 'cd remote_dir_parent; tar -c remote_dir' | tar -x
You can add a z flag to tar to compress on the server and uncompress on the client.
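The tar pipe can be tried entirely locally before pointing it at a server. This sketch (throwaway paths under /tmp, with the ssh hop removed) replicates a directory through a compressed tar stream, exactly the mechanism the command above uses over the network:

```shell
# Local stand-in for: ssh user@server 'cd parent; tar -cz remote_dir' | tar -xz
rm -rf /tmp/tarpipe_demo
mkdir -p /tmp/tarpipe_demo/parent/src_dir /tmp/tarpipe_demo/dest
echo "hello" > /tmp/tarpipe_demo/parent/src_dir/file.txt

# tar -c writes one archive stream to stdout; the pipe feeds it to tar -x,
# which unpacks it on the other side; z gzips/gunzips the stream in transit.
(cd /tmp/tarpipe_demo/parent && tar -cz src_dir) \
  | (cd /tmp/tarpipe_demo/dest && tar -xz)

cat /tmp/tarpipe_demo/dest/src_dir/file.txt   # prints "hello"
```

For the remote case, the first subshell is simply replaced by the ssh invocation; everything travels as a single stream instead of one round trip per file.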
One way I found on YouTube is to connect a local folder with a shared folder in the EC2 instance; the video gives the full instructions. The sharing is instantaneous.
