I need to look at server logs on a number of different machines (around 20), and the server logs are stored in a location with the hostname in its path (I am using SuperPuTTY).
So I don't have a single command to change directory; instead I have to do it individually on each machine.
With the hostname command I can get the machine name, but I am not able to use it as a variable in my cd command.
>hostname
mymachinename
>cd /opt/$"hostname"/logs
no directory name /opt/hostname/logs
Any help on this?
Pardon me if it's a duplicate. I searched but didn't find any questions related to this.
It should be:
cd /opt/$(hostname)
See:
root@mongodbServer1:~# cd /opt/$(hostname)
root@mongodbServer1:/opt/mongodbServer1# pwd
/opt/mongodbServer1
root@mongodbServer1:/opt/mongodbServer1#
Use $HOSTNAME or $(hostname) or `hostname` (backquotes) to retrieve the hostname.
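Putting it together for the original question, each machine can then reach its own log directory with the same one-liner (a sketch; it assumes the per-host directory exists on every box):
cd "/opt/$(hostname)/logs" || echo "no log directory for $(hostname)" >&2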
scp -r /Users/Brain/Desktop/tree.png account@address:/home/directory
I successfully connect to server and enter password, but receive this message "/Users/Brain/Desktop/tree.png: No such file or directory found"
I know the file exists, it is sitting on my desktop and I can open it.
Any guidance would be much appreciated!!
Tried looking at this post, but it did not help: scp files from local to remote machine error: no such file or directory
Typo? For a location like /Users, the odds favor a person named Brian over one named Brain. After swapping the vowels, what happens with this command?
ls -l /Users/Brian/Desktop/tree.png
When presented with unexpected errors for file(s) known to exist, there's usually an issue with one pathname component. Start with the full path and drop trailing components until there's no error, e.g.:
ls /Users/Brain/Desktop/tree.png
ls /Users/Brain/Desktop
ls /Users/Brain
ls /Users
Some shells can trim a pathname component from the previous command with :h; try repeating this:
!!:h
After typing the above, another possible shortcut is UP-arrow, RETURN.
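For example, a session might look like this (a sketch; the exact error text varies by system, and the shell echoes the expanded command before running it):
$ ls /Users/Brain/Desktop/tree.png
ls: /Users/Brain/Desktop/tree.png: No such file or directory
$ !!:h
ls /Users/Brain/Desktop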
I can SSH into the EC2 instance:
ssh -i "my_key.pem" ec2-user@my-public-ip
However, scp doesn't work:
scp -r -i "my_key.pem" ./my_file ec2-user@my-public-ip:/home/ec2-user/my_file
Permission denied (publickey).
lost connection
I've also tried using the public instance DNS name, but nothing changes.
Any idea why is this happening and how to solve it?
The most likely cause is that the private key my_key.pem is not found in the current directory. You may have run ssh from a different directory than scp.
Try the following with the full path to your key:
scp -r -i /path/to/my_key.pem ./my_file ec2-user@my-public-ip:/home/ec2-user/my_file
If it fails, post the output with the -v option; it will tell you exactly where the problem is:
scp -v -r -i /path/to/my_key.pem ./my_file ec2-user@my-public-ip:/home/ec2-user/my_file
I am a bit late, but this might be helpful to someone.
Do not use the full /home/ec2-user path. Instead, use the file or folder name directly, relative to the remote home directory.
E.g. the following command will put my_file in the home folder (i.e. /home/ec2-user):
scp -r –i "my_key.pem" ./my_file ec2-user#my-public-ip:my_file
Or, say you have a folder at /home/ec2-user/my_data;
then use the following command to copy your file into that folder:
scp -r –i "my_key.pem" ./my_file ec2-user#my-public-ip:my_data
Stupidly late addendum:
To avoid specifying the private key every time, just add the following to your ~/.ssh/config file (create it if it's not already there), leaving out the // comments:
Host testserver                    // a memorable alias
    Hostname 12.34.56.67           // your server IP
    User ec2-user                  // the user to connect as
    IdentityFile /path/to/key.pem  // path to the private key
    PasswordAuthentication no
Then a simple ssh testserver should work from anywhere (and consequently your scp too).
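For example, the scp from the question then reduces to:
scp -r ./my_file testserver:/home/ec2-user/my_file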
I use it to connect with Vim via scp using:
vim scp://testserver/relative/file/path
or
vim scp://testserver//absolute/file/path
and
vim scp://testserver/relative/dir/path/ (note the trailing slash)
to edit files and browse folders, respectively, directly from my local machine (thus using my precious .vimrc <3 configuration).
Hope this helps! :)
I was facing this issue today and found a solution (not elegant, but it worked). This approach is fine if you want to download something once and roll back all the settings afterwards.
Solution:
When I specified the -v option while using scp, I noticed the key was being denied for some reason, so on the server I went to /etc/ssh/sshd_config and set PasswordAuthentication yes, then ran systemctl restart sshd.
After this procedure I went to my local machine and used:
scp -v -r myname@VPC:/home/{user}/filename.txt path/on/local/machine
I entered the password, and the file transfer was successful.
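If you go this route, remember to revert the change once the transfer is done. A minimal sketch, assuming sudo access and GNU sed on the server:
sudo sed -i 's/^PasswordAuthentication yes/PasswordAuthentication no/' /etc/ssh/sshd_config
sudo systemctl restart sshd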
Hope this helps to someone :)
On a computer with an IP address like 10.11.12.123, I have a folder document. I want to copy that folder to my local folder /home/my-pc/doc/ using the shell.
I tried like this:
scp -r smb:10.11.12.123/other-pc/document /home/my-pc/doc/
but it's not working.
You can use the command below to copy your files:
scp -r <source> <destination>
(-r: Recursively copy entire directories)
e.g.:
scp -r user@10.11.12.123:/other-pc/document /home/my-pc/doc
To identify your current location you can use the pwd command, e.g.:
kasun#kasunr:~/Downloads$ pwd
/home/kasun/Downloads
If you want to copy from B to A while logged into B, then:
scp /source username@a:/destination
If you want to copy from B to A while logged into A, then:
scp username@b:/source /destination
In addition to the comment, when you look at your host-to-host copy options on Linux today, rsync is by far the most robust solution around. It is brought to you by the SAMBA team [1] and continues to enjoy active development. Most distributions include the rsync package by default (if not, you should find an easily installable package for your distro, or you can download it from rsync.samba.org).
The basic use of rsync for host-to-host directory copy is:
$ rsync -uav srchost:/path/to/dir /path/to/dest
-uav recursively copies only new or changed files (-ua), preserving file and directory times and permissions, while providing verbose output (-v). You will be prompted for the username/password on 10.11.12.123 unless you have set up ssh keys to allow public/private key authentication (see ssh-keygen for key generation).
If you notice, the syntax is basically the same as that for scp, with a slight difference in the options (e.g. scp -rv srchost:/path/to/dir /path/to/dest). rsync uses ssh for secure transport by default, so you will want to ensure sshd is running on your srchost (10.11.12.123 in your case). If you have name resolution working (or a simple entry in /etc/hosts for 10.11.12.123), you can use the hostname for the remote host instead of the remote IP. Regardless, you can always transfer the files you are interested in with:
$ rsync -uav 10.11.12.123:/other-pc/document /home/my-pc/doc/
Note: do NOT include a trailing / after document if you want to copy the directory itself. If you do include a trailing / after document (i.e. 10.11.12.123:/other-pc/document/), you are telling rsync to copy the contents (i.e. the files and directories under document) into /home/my-pc/doc/ without also creating the document directory there.
The reason rsync is far superior to other copy apps is it provides options to truly synchronize filesystems and directory trees both locally and between your local machine and remote host. Meaning, in your case, if you have used rsync to transfer files to /home/my-pc/doc/ and then make changes to the files or delete files on 10.11.12.123, you can simply call rsync again and have the changes/deletions reflected in /home/my-pc/doc/. (look at the several flavors of the --delete option for details in rsync --help or in man 1 rsync)
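For example, to make the local copy mirror the remote side again, deletions included (a sketch of the --delete usage described above):
$ rsync -uav --delete 10.11.12.123:/other-pc/document /home/my-pc/doc/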
For these, and many more reasons, it is well worth the time to familiarize yourself with rsync. It is an invaluable tool in any Linux user's hip pocket. Hopefully this will solve your problem and get you started.
Footnotes
[1] the same folks that "Opened Windows to a Wider World", allowing seamless connections between Windows/Linux hosts via the native Windows Server Message Block (SMB) protocol. samba.org
If the two directories (document and /home/my-pc/doc/) you mentioned are on the same 10.11.12.123 machine,
then:
cp -ai document /home/my-pc/doc/
else:
scp -r document/ root@10.11.12.123:/home/my-pc/doc/
I have a few virtual machines running Fedora 20 and one NAS box from which the HOME directories are mounted on all VMs. It works great, as users have to maintain only one copy of git settings, VNC settings, SSH keys and so on. The problem is that regardless of which VM they use, they always use the same .bash_profile, which references the same .bashrc. I say it's a problem because I want to apply some settings there on a per-user basis, but only for one specific VM. To be specific, I want to change umask for a few users on one specific isolated server.
So I thought there must be a way to conditionally load some settings through .bash_profile depending on the IP/hostname of the server.
I thought that, just like .bash_profile references .bashrc, I could reference another file containing this setting, a file which would exist only on the server it should affect. On the other servers the reference would not be valid, so this is not an elegant solution; it would throw errors.
I've tried googling this and the results usually point me to Stack Overflow, where somebody had a similar question, but this time I couldn't find anything.
Does somebody have an idea how to solve this?
Thanks in advance for any help.
You can put your host-specific commands in .bashrc_myhostname and source it if it exists:
[ -f "$HOME/.bashrc_$HOSTNAME" ] && . "$HOME/.bashrc_$HOSTNAME"
or you can hard-code commands for certain hosts with an if or case statement:
if [ "$HOSTNAME" = "mytestserver" ]
then
ulimit -c unlimited
fi
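The case form mentioned above would look like this (a sketch; the hostname and settings are placeholders):
case "$HOSTNAME" in
mytestserver)
    ulimit -c unlimited
    ;;
*)
    umask 027
    ;;
esac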
This is what I tested, and it does what I want it to do.
Most of my users look into .bashrc when changing their aliases, so to make sure this won't be accidentally altered I left it in .bash_profile:
TargetVM="vm1.example.lan"
if [ "$TargetVM" = "$HOSTNAME" ]; then
umask 077
else
umask 027
fi
scp user@server:/home/loghost??/logfiles.log .
I'm using the above scp command in my Unix script to download all the logs from the loghost folders.
There are multiple loghost folders on my server (i.e. loghost01, loghost02, loghost03).
The log name is the same in every loghost folder, so while scping, the logs get overwritten. Is there a way to change the log name while copying?
for server in loghost01 loghost02 loghost03; do
mkdir -p $server;
scp user@$server:/home/$server/logfiles.log $server/;
done
I think something like that might help.
It takes a list of your servers and scps the files over to per-server paths, loghost##/logfiles.log.
If you have a list of servers in a text file, replace the top line with:
for server in `cat file_containing_servers`; do
Put logs from different servers into different directories:
for server in loghost{01,02,03}
do
mkdir -p $server
scp user@$server:/home/$server/logfiles.log ./$server/
done
Put logs from different servers into the same directory with different names:
for server in loghost{01,02,03}
do
scp user@$server:/home/$server/logfiles.log ./$server.log
done
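If, as in the original command, all the loghost folders actually live on a single server, the same renaming trick works against that one host (a sketch; user and server are the placeholders from the question):
for d in loghost01 loghost02 loghost03
do
scp user@server:/home/$d/logfiles.log ./$d.log
done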