Copy file from local machine to remote machine - linux

I have a pretty novice question, but I'm really out of resources right now.
I'm trying to send a script to a remote machine. This script is in my local machine and it's called model.py.
I can access the remote machine by typing ssh csousa@headtop.ncc.unesp.br (the remote machine knows my public key).
I read about the usage of the scp and rsync commands. I tried:
rsync -v -e ssh /home/ecaue/ParticlePhysics/TCC/model.py csousa@headtop.ncc.unesp.br
and
scp -r csousa@headtop.ncc.unesp.br /home/ecaue/ParticlePhysics/TCC/model.py
but what I get is just a copy of my original file with another name ("csousa@headtop.ncc.unesp.br") in the same folder as my original file (like a clone with a different name).
I can access the remote machine and create conda environments and do all kinds of stuff (install tensorflow, keras, etc.), but I'm not able to send my script and my dataset.
I would really appreciate any help!!

The scp command should be:
scp PATH_TO_YOUR_SCRIPT/SCRIPT_NAME user@host:DESTINATION_PATH
So in your case:
scp /home/ecaue/ParticlePhysics/TCC/model.py csousa@headtop.ncc.unesp.br:~/
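If you prefer rsync, the layout is the same: the remote host comes last, followed by a colon and the destination path. A minimal sketch, copying into your remote home directory (the destination is just an assumption):
# same as the scp command above, but with rsync; ~/ is the remote home
rsync -v -e ssh /home/ecaue/ParticlePhysics/TCC/model.py csousa@headtop.ncc.unesp.br:~/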

Related

Bash script to transfer files to a server with user access changing

I want to connect and transfer files from my machine to a server using ssh.
With my normal user, I only have access to certain directories. To get access to the destination directory, I have to run the command sudo -su superUser, and then I can transfer files into the transfer directory.
Can I do that with my bash script?
#!/bin/bash
file_to_upload="*.txt"
remote_username="normal.user"
remote_hostname="123.45.67.8.."
remote_destination_dir="/data/apps/transfer"
scp "$file_to_upload" "$remote_username#$remote_hostname:$remote_destination_dir"
After the transfer, scp closes the connection, so I don't know if this is the best solution.
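One way to script that manual step, sketched below, is to copy the files somewhere your normal user can write and then move them into place with sudo over ssh. The /tmp/staging path is only an assumption (it must exist and be writable by your normal user), and -t gives sudo a terminal so it can prompt for a password:
# $file_to_upload is left unquoted so the *.txt glob expands locally
scp $file_to_upload "$remote_username@$remote_hostname:/tmp/staging/"
# move the files into the protected directory with sudo on the remote side
ssh -t "$remote_username@$remote_hostname" "sudo mv /tmp/staging/*.txt '$remote_destination_dir'"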

On Linux, how can I share scripts across an SSH connection for the session only?

For work, I have to connect to dozens of Linux machines via SSH (to perform maintenance, monitor the system, install software, etc).
I have a small arsenal of scripts that help me do some of these tasks, and these are located in a folder on my Mac in /Users/me/bin. I want to be able to run these scripts on the remote Linux machine, but for several reasons I do not want these scripts permanently located on these machines (e.g., other people also connect to these remote machines, and it would be unwise to let them execute these files).
So, is it possible to share scripts across an SSH connection for the lifetime of the session only?
I have a couple of ideas on how to do this, but I don't know if any of them will work. Firstly, if SSH allows file mounting, I could automatically mount me@mymac:/Users/me/bin to me@linux:/remote_bin when I connect to the remote Linux box, and set my PATH variable to "$PATH:/remote_bin". Secondly, I could set up port forwarding in the connection string (e.g., ssh me@linux -R 9999:127.0.0.1:<SMBPORT|ETC>) and, every time I connect, mount the share and set the $PATH variable.
EDIT: I've come up with a semi-solution. On the Linux machine, edit /etc/ssh/sshd_config to add the following subsystem: Subsystem shareduserbinary sudo su -l -c "/bin/mount -t cifs -o port=9999,username=me,nounix,sec=ntlmssp //127.0.0.1/exported_bin /mnt/remote_bin" && bash -l -i -s. When connecting to the remote machine, set up a reverse port forward and invoke the subsystem, e.g.: ssh -R 9999:127.0.0.1:445 -s shareduserbinary me@linux.
EDIT 2: You can make the solution above cleaner, by removing the -l from the sudo command and changing the path from /mnt/remote_bin to $HOME/rbin.
Interesting question. Perhaps you can add a command to ~/.bash_login (assuming you are using bash) to copy the scripts from a remote host (such as your mac) when you login, then add a command to ~/.bash_logout to delete the scripts when you logout. But, as bmargulies points out, it would be a good idea to go a step further and make sure that nobody else has permissions to read or execute the scripts.
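A minimal sketch of that idea, assuming the Mac is reachable from the Linux box as me@mymac and using an assumed ~/session_bin directory (chmod 700 keeps other users out):
# in ~/.bash_login on the Linux machine: fetch the scripts at login
mkdir -p ~/session_bin && chmod 700 ~/session_bin
scp -q me@mymac:/Users/me/bin/* ~/session_bin/
export PATH="$PATH:$HOME/session_bin"
# in ~/.bash_logout: remove them again when the session ends
rm -rf ~/session_bin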
You can use OpenSSH's LocalCommand to upload the files (using e.g. scp or rsync) when initiating an SSH session (see man ssh_config and this):
Host server1 server2 [...]
PermitLocalCommand yes
LocalCommand scp -q /Users/me/bin/* %h:temp_bin/
and use .bash_logout or an EXIT-trap that you specify in your .bashrc to delete the contents of the directory on logout.
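For the cleanup half, a sketch of the EXIT-trap variant (placed in the remote ~/.bashrc; temp_bin matches the LocalCommand above):
# wipe the uploaded scripts when the login shell exits
trap 'rm -rf "$HOME"/temp_bin/*' EXIT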

Download a file from a Server with double ssh

I connect to a Server with ssh
Step 1:
$ ssh userid@something.com
and then it asks for password and everything is ok
Then I connect to a DB
Step2:
$ ssh user1@smthing_else
and then it asks for password and everything is ok
Now when I type ls I can see the file that I want to download...
How can I download this file to my Desktop?
You need to scp the file twice to bring it to your local machine if you don't have direct access. First ssh to server 1 and run the command below to download it there, then run the same command again from your local machine.
scp -r -i path-to-secret-key ubuntu@ec2-address:/home/ubuntu/app-folder-location /home/user/local-mc-location
As you don't have the key, use the command below:
scp -r ubuntu@ec2-address:/home/ubuntu/app-folder-location /home/user/local-mc-location
Update:
path-to-secret-key is the path to the private key used to ssh into EC2 instances. These keys are used for authentication, usually live in ~/.ssh/, have permissions of 400, and use a .pem extension on Unix machines or a .ppk extension on Windows machines.
I guess you can't directly reach the "inner" host from the outside? In that case you have to ssh into the outer host, then you can use scp to copy the file from the inner host to the outer one. After that, you can copy the file from the outer host to your local PC with scp (or whatever else works in that case).
scp works like this (to copy a local file to a remote host):
scp myfile.txt user@somehost.com:/home/user/whatever
resp. like this (remote to local):
scp user@somehost.com:/home/user/whatever/myfile.txt .
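With a reasonably recent OpenSSH client you can also skip the intermediate copy and jump through the outer host in a single command. This is a different approach from the two-step copy above, and the file path is just a placeholder:
# copy from the inner host to the local Desktop, jumping through the outer host
scp -o ProxyJump=userid@something.com user1@smthing_else:/path/to/file ~/Desktop/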

How to run a shell script over SSH when the resources (.txt files) are on one machine and the script is on another machine?

I want to run a shell script over SSH that uses a resource from another machine, while the script itself is on a different machine; all machines are on the same network. I don't want to copy the resource to the local machine.
Note: the shell script takes a .txt file as input.
If you have script.sh on server1 and file.txt on server2, you can connect through ssh to server1, and then do:
[user@server1]$ ssh user@server2 "cd mydir && cat file.txt" | ./script.sh
Try this:
ssh USER_NAME@HOST_ADDRESS "BASH_SCRIPT_FILE_PATH"
You will need to provide a password when required.
If your script is on Machine A, you can't run it on Machine B without copying it over. First, copy the script over to Machine B using scp:
[user@machineA]$ scp /path/to/script user@machineB:/home/user/path
Then, just run the script:
[user@machineA]$ ssh user@machineB "/home/user/path/script"
This will work if you have given executable permission to the script.
OR
Try this one..
<hostA_shell_prompt>$ ssh user@hostB "ls -la"
That will prompt you for a password, unless you have copied your hostA user's public key to the authorized_keys file in the user's ~/.ssh directory on hostB. That allows passwordless authentication (if it is accepted as an auth method in the ssh server's configuration).
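Setting up that passwordless login is usually done with ssh-copy-id, roughly:
# run on hostA; appends your public key to ~/.ssh/authorized_keys on hostB
ssh-copy-id user@hostB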
I don't fully understand your question. The other answers explain how to run a remote script, but I think the question is that the remote script has to use a remote file, though I'm not sure about this.
Log in to the remote PC using ssh.
Install sshfs if it is not installed.
Then mount the other remote machine's directory (the one holding the file the script needs) onto a local directory; this can be done with sshfs.
Then run the script against the file in the locally mounted directory.
Then unmount the directory when you are finished.
It is a somewhat long procedure; see the sketch below.
For mounting a remote directory with sshfs, see:
man sshfs
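A sketch of those steps, with placeholder hostnames and paths:
# on the remote PC you logged into: mount the other machine's data directory
mkdir -p ~/remote_data
sshfs user@other-machine:/path/to/data ~/remote_data
# run the script against the mounted file
./script.sh ~/remote_data/file.txt
# unmount when finished
fusermount -u ~/remote_data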

How to copy entire folder from Amazon EC2 Linux instance to local Linux machine?

I connected to an Amazon Linux instance over ssh using a private key. I am trying to copy an entire folder from that instance to my local Linux machine.
Can anyone tell me the correct scp command to do this?
Or do I need something more than scp?
Both machines are Ubuntu 10.04 LTS
Another way to do it is:
scp -i "insert key file here" -r "insert ec2 instance here" "your local directory"
One mistake I made was scp -ir. The key has to be after the -i, and the -r after that.
so
scp -i amazon.pem -r ec2-user@ec2-##-##-##:/source/dir /destination/dir
Call scp from client machine with recursive option:
scp -r user@remote:src_directory dst_directory
scp -i {key path} -r ec2-user@54.159.147.19:{remote path} {local path}
For EC2 Ubuntu:
Go to your .pem file's directory, then:
scp -i "yourkey.pem" -r ec2user@DNS_name:/home/ubuntu/foldername ~/Desktop/localfolder
You could even use rsync.
rsync -aPSHiv remote:directory .
This is how I copied a file from Amazon EC2 to a local Windows PC:
pscp -i "your-key-pair.pem" username@ec2-ip-compute.amazonaws.com:/home/username/file.txt C:\Documents\
For Linux, to copy a directory:
scp -i "your-key-pair.pem" -r username@ec2-ip-compute.amazonaws.com:/home/username/dirtocopy /var/www/
Connecting to Amazon requires key pair authentication.
Note:
The username is most probably ubuntu.
I use sshfs to mount the remote directory on the local machine and then work with it directly; the exact commands may differ on your system.
This is also important and related to the above answer.
Copying all files in a local directory to EC2. This is a Unix answer.
Copy the entire local folder to a folder in EC2:
scp -i "key-pair.pem" -r /home/Projects/myfiles ubuntu#ec2.amazonaws.com:/home/dir
Copy only the entire contents of local folder to folder in EC2:
scp -i "key-pair.pem" -r /home/Projects/myfiles/* ubuntu#ec2.amazonaws.com:/home/dir
I do not like to use scp for a large number of files, as it does a 'transaction' for each file. The following is much better:
cd local_dir; ssh user@server 'cd remote_dir_parent; tar -c remote_dir' | tar -x
You can add the z flag to both tar invocations to compress on the server and uncompress on the client.
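The compressed variant would look roughly like this (same placeholder names as above):
# tar -cz compresses on the server; tar -xz uncompresses on the client
cd local_dir; ssh user@server 'cd remote_dir_parent; tar -cz remote_dir' | tar -xz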
One way I found on YouTube is to connect a local folder to a shared folder on the EC2 instance; the sharing is instantaneous.
