Linux server user details

On a Linux server someone copied a home directory to a different location. How can I find out which user carried out the copy operation on the RHEL 6 server?

It depends on various things. If sudo was used, the command should be logged in /var/log/secure.
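For example (a rough check, assuming the copy was done with cp or mv under sudo and RHEL 6's default sudo logging), you could search the sudo log directly:
sudo grep -E 'COMMAND=.*(cp|mv) .*home' /var/log/secure*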
You can run the script below to check the command history of each user on the server:
# the command that moved/copied the home directory
# COMMAND=mv
COMMAND=cp
for user in $(ls /home/); do
    # I assume that users use bash as their shell
    sudo grep --with-filename "${COMMAND}" "/home/${user}/.bash_history" 2>/dev/null
done
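Note that this only checks accounts with a home directory under /home; if root itself may have run the copy, you can check its history as well (an additional check, not part of the script above):
sudo grep cp /root/.bash_history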

Related

ssh and sudo to a different user to execute commands on a remote Linux server

We have passwordless authentication between the servers for the root user. I am trying to run the alias on the remote server as below:
# ssh remoteserver runuser -l wasadmin wasstart
But it is not working. Any suggestions, or any other method to achieve it?
Based on your comments, since you need to sudo to wasadmin in order to run wasstart, you can try this:
ssh remoteserver 'echo /path/to/wasadmin wasstart | sudo su - wasadmin'
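An alternative sketch (my own, not from the answer above; it assumes your login user is allowed to sudo to wasadmin) is to allocate a tty so sudo can prompt for a password and run the command directly:
ssh -t remoteserver 'sudo -i -u wasadmin /path/to/wasadmin wasstart'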
To add an alias in Linux you must run
alias yourcommandname='command'
Notice:
This only works until you close or exit the current shell. To fix this, add the alias to your .bash_profile and run source .bash_profile.
The exact profile file name also depends on which shell you are using: bash, zsh, ...
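For example (a minimal illustration; the alias name and command are taken from the question above):
echo "alias wasstart='runuser -l wasadmin wasstart'" >> ~/.bash_profile
source ~/.bash_profile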

cd to directory and su to particular user on remote server in script

I have some tasks to do on a remote Ubuntu CLI-only server in our offices every 2 weeks. I usually type the commands one by one, but I am trying to find a way (write a script maybe?) to decrease the time I spend in repeating those first steps.
Here is what I do:
ssh my_username@my_local_server
# asks for my_username password
cd /path/to/particular/folder
su particular_user_on_local_server
# asks for particular_user_on_local_server password
And then I can do my tasks (run some Ruby script on Rails applications, copy/remove files, restart services, etc.)
I am trying to find a way to do this in a one-step script/command:
"ssh connect then cd to directory then su to this user"
I tried to use the following:
ssh username@server 'cd /some/path/to/folder ; su other_user'
# => does not keep my connection open to the server, it just executes my `cd`
# and then tells me `su: must be run from terminal`
ssh username@server 'cd /some/path/to/folder ; bash ; su other_user'
# => keeps my connection open to the server but doesn't switch to the other user
# and I don't see the usual `username:~/current/folder` prefix in the CLI
Is there a way to open a terminal (keep the connection) on a remote server via ssh and change directory + switch to a particular user in an automated way? (To make things harder, I'm using Yakuake.)
You can force allocation of a pseudo-terminal with -t, change to the desired directory and then replace the shell with one where you are the desired user:
ssh -t username@server 'cd /some/path/to/folder && exec bash -c "su other_user"'
sudo preserves the current working directory (and -H sets HOME to the target user's home), so you could do:
ssh -t login_user@host.com 'cd /path/to/dir/; sudo -H -u other_user bash'
The -t parameter of ssh is needed, otherwise sudo won't be able to ask you for your password.

rsyncing between two non-login users

I have two machines A and B, both of which have 3 users:
root (I don't know its password, but I can switch to it using sudo su -)
login (used for sshing into both machines, has a password, is a sudoer)
mysql (standard non-interactive user running mysql server)
What I need to do is to rsync data directory (dir) belonging to mysql from machine A to machine B.
Obviously I can't just do:
rsync -avpE /dir/ B:/dir/
Because the login user has no read access to dir on either A or B.
I can't do:
sudo -u mysql rsync -avpE /dir/ B:/dir/
Because now the rsync on A runs as mysql and can read dir, but the rsync on B still runs as login and can't write to it.
So is it possible to construct an rsync command that copies the data across without using temporary space?
rsync has an option called --rsync-path that might help you:
$ rsync |& grep rsync-path
--rsync-path=PROGRAM specify the rsync to run on the remote machine
The idea is to ask the (local) rsync to ssh to the remote machine (as user login) and then, when it wants to call rsync on the remote machine, have it call sudo -u mysql rsync instead of plain rsync. So something like this:
sudo -u mysql rsync -avpE --rsync-path="sudo -u mysql rsync" /dir/ login@B:/dir/
Of course, for this to work, the user login on the remote machine must be able to run sudo -u mysql without a password.
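For reference, a matching sudoers entry on B might look like this (a sketch; it assumes rsync lives at /usr/bin/rsync, so adjust the path to your system):
login ALL=(mysql) NOPASSWD: /usr/bin/rsync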

Copy files from Linux server using ssh client with different user name

I have a Linux machine with an ssh server installed; I can access the server using the username "ubuntu". The ssh server blocks clients that try to connect with the "root" username.
So connection can be made by:
ssh -i mykey ubuntu@myserver
I can get files that belong to "ubuntu" using:
scp -i mykey ubuntu@myserver:<file location> ./
However, what I really want is to get files that belong to the "root" user (note: I can't access the server with the username "root", for obvious security reasons).
So is there a way to download files that are owned by "root"?
I was thinking of doing some magic on the server side that enables me to do that (I don't know how :) ).
If this helps: I have root access and I can also create files on the server side, but I'm not allowed to change the permissions of files owned by root (if someone got hold of these files I'd be fired).
You can try a monster like this:
ssh ubuntu@myhost 'sudo cat /path/to/file | uuencode file' | uudecode -o path/to/local
You need uuencode and uudecode on the corresponding hosts.
Or, if the file is plain text, you can skip the uuencode part.
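An alternative that avoids uuencode entirely (my suggestion, not part of the original answer; it assumes GNU coreutils on both ends) is base64:
ssh ubuntu@myhost 'sudo cat /path/to/file | base64' | base64 -d > path/to/local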
You could do it the other way around.
Log into the PC with the file you want:
ssh ubuntu@myserver
Then gain superuser privileges:
sudo su
and then copy the files you want:
scp /the_file_you_want ubuntu@myhost:/the_location_and_filename_you_want
You can find some other ways here:
https://unix.stackexchange.com/questions/106480/how-to-copy-files-from-one-machine-to-another-using-ssh
Enable ssh on your local machine (this is for Fedora; for Ubuntu you can easily find the command on Google):
service sshd start
From your local machine:
ssh -i mykey ubuntu@myserver
Change to root:
su
Enter the root password, then copy the files using scp:
scp somefile.extension randomuser@localmachine:/some/path/
I hope it helps

On Linux, how can I share scripts across an SSH connection for the session only?

For work, I have to connect to dozens of Linux machines via SSH (to perform maintenance, monitor the system, install software, etc).
I have a small arsenal of scripts that help me do some of these tasks, and these are located in a folder on my Mac in /Users/me/bin. I want to be able to run these scripts on the remote Linux machine, but for several reasons I do not want these scripts permanently located on these machines (e.g., other people also connect to these remote machines, and it would be unwise to let them execute these files).
So, is it possible to share scripts across an SSH connection for the lifetime of the session only?
I have a couple of ideas on how to do this, but I don't know if any of them will work. Firstly, if SSH allows file mounting, I could automatically mount me@mymac:/Users/me/bin to me@linux:/remote_bin when I connect to the remote Linux box, and set my PATH variable to "$PATH:/remote_bin". Secondly, I could set up port forwarding in the connection string (e.g., ssh me@linux -R 9999:127.0.0.1:<SMBPORT|ETC>) and mount the share and set the $PATH variable every time I connect.
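A rough sketch of the first idea (my own, not from the question; it assumes sshfs is installed on the Linux box, the Mac runs an SSH server, and the port 10022 and ~/rbin names are illustrative):
ssh -R 10022:127.0.0.1:22 me@linux
# then, on the Linux box:
mkdir -p ~/rbin
sshfs -p 10022 me@127.0.0.1:/Users/me/bin ~/rbin
export PATH="$PATH:$HOME/rbin"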
EDIT: I've come up with a semi-solution. On the linux machine, edit /etc/ssh/sshd_config to add the following subsystem: Subsystem shareduserbinary sudo su -l -c "/bin/mount -t cifs -o port=9999,username=me,nounix,sec=ntlmssp //127.0.0.1/exported_bin /mnt/remote_bin" && bash -l -i -s. When connecting to the remote machine, set up a reverse port forward and invoke the subsystem. E.g.: ssh -R 9999:127.0.0.1:445 -s shareduserbinary me@linux.
EDIT 2: You can make the solution above cleaner, by removing the -l from the sudo command and changing the path from /mnt/remote_bin to $HOME/rbin.
Interesting question. Perhaps you can add a command to ~/.bash_login (assuming you are using bash) to copy the scripts from a remote host (such as your mac) when you login, then add a command to ~/.bash_logout to delete the scripts when you logout. But, as bmargulies points out, it would be a good idea to go a step further and make sure that nobody else has permissions to read or execute the scripts.
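A minimal sketch of that suggestion (assuming bash on the remote machine, that the Mac's SSH server is reachable from it as me@mymac, and a throwaway directory ~/session_bin; the names are illustrative):
# in ~/.bash_login on the remote machine
mkdir -p ~/session_bin && chmod 700 ~/session_bin
scp -q me@mymac:/Users/me/bin/* ~/session_bin/ 2>/dev/null
export PATH="$PATH:$HOME/session_bin"
# in ~/.bash_logout
rm -rf ~/session_bin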
You can use OpenSSH's LocalCommand to upload the files (using e.g. scp or rsync) when initiating an SSH session (see man ssh_config):
Host server1 server2 [...]
    PermitLocalCommand yes
    LocalCommand scp -q /Users/me/bin/* %h:temp_bin/
and use .bash_logout or an EXIT-trap that you specify in your .bashrc to delete the contents of the directory on logout.
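The EXIT-trap variant could look like this (a sketch, assuming the temp_bin upload directory used above):
# in ~/.bashrc on the remote machine
trap 'rm -rf "$HOME/temp_bin"' EXIT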
