Automating mkdir, chmod and scp across all the servers - linux

This seems to be a simple issue, but I'm not able to figure it out. I am trying to run a couple of small scripts on a server and I'm having issues with them. I have an allhosts file that contains the list of servers; it is in the same location as the .sh file.
Script to create a directory structure across all 20 servers with 777 permissions:
#!/bin/bash
for q in `cat allhosts`
do
ssh $q "mkdir -p /opt/acd/hgf/tom/hanks/"
chmod -R 777 $q "/opt/acd/hgf/tom/hanks/" >/dev/null 2>&1
done
In the above script, it is only creating the directory paths and not changing the permissions on that path. I tried running that chmod command in a separate script, but to no avail.
Script to scp the contents of hanks to the hanks folder created on the new server:
#!/bin/bash
for q in `cat allhosts`
do
scp /opt/acd/hgf/tom/hanks/* $q:/opt/acd/hgf/tom/hanks/ >/dev/null 2>&1
done
In this script too, when I run it, it's not copying anything to any of the servers.
I know this is a very small issue, but please check and let me know where I am going wrong. Thanks in advance.

The first script is failing because it is running the chmod on the local machine. You should run it on the remote machine via ssh - you could combine this with the other ssh invocation as follows:
ssh $q "mkdir -p /opt/acd/hgf/tom/hanks/ ; chmod -R 777 /opt/acd/hgf/tom/hanks/"
I'd guess the second script is failing because the first script isn't setting permissions; the second script itself looks okay.
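For completeness, the whole corrected first script might look something like this (just a sketch, keeping the asker's allhosts layout and paths, and assuming key-based ssh so the loop can run unattended):
#!/bin/bash
# Loop over every host listed in allhosts (one hostname per line).
for q in `cat allhosts`
do
    # Create the directory and set permissions in a single remote command.
    ssh $q "mkdir -p /opt/acd/hgf/tom/hanks/ && chmod -R 777 /opt/acd/hgf/tom/hanks/"
done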

Related

How to make Ubuntu bash script wait on password input when using scp command

I want to run a script that deletes files on my computer and copies over another file from a connected host using the scp command.
Here is the script:
#!/bin/bash
echo "Moving Production Folder Over"
cd ~
sudo rm -r Production
scp -r host@192.168.123.456:/home/user1/Documents/Production/file1 /home/user2/Production
I would want to cd into the Production directory after it is copied over. How can I go about this? Thanks!
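No answer is recorded here, but the usual way to express "do the next step only if the copy worked" is to chain the commands with &&; a sketch using the asker's own paths and placeholder host might be:
#!/bin/bash
echo "Moving Production Folder Over"
cd ~
sudo rm -r Production
# The cd runs only if the scp succeeded.
scp -r host@192.168.123.456:/home/user1/Documents/Production/file1 /home/user2/Production && cd /home/user2/Production
Bear in mind that a cd inside a script only changes the script's own working directory; it does not leave the invoking shell inside Production after the script exits.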

Passwordless execution of local script on remote machine as root into a local file

I've been working on a bash script that automatically runs certain scripts on remote machines and saves the logs to certain folders. As of now I have been copying the local script to the remote machine, executing it into a remote log, copying the remote log into a local folder, and then deleting the remote log and remote copy of the script.
This works, but I know it can work better if I can avoid doing all the in-between steps. The one caveat is I need this to be automatic and passwordless (meaning no user input at all). One of the scripts needs to be run as root or it won't display all the necessary information and will userlock the machine temporarily.
The code I am currently using to execute the remoteScript into a log that I later retrieve with scp is below.
sshpass -f password.txt ssh user@1.1.1.1 "echo $password | sudo -S /home/user/remoteScript.sh > remoteLog.txt"
And in my testing, execution of a local script on the remote machine into a local log file works like below:
sshpass -f password.txt ssh user@1.1.1.1 "bash -s" < /home/user/localScript.sh >> localLog.txt
How could I combine the elements of the two code examples above in order to make a local script run on a remote machine with root privilege and log the output into a local text file?
Some things I have tried that do not work include:
sshpass -f password.txt ssh user@1.1.1.1 "bash -s" < "echo $password | sudo -S /home/user/script.sh >> log.txt"
sshpass -f password.txt ssh user@1.1.1.1 "echo $password | sudo -S /home/user/script.sh" >> log.txt
and notably
sshpass -f password.txt ssh user@1.1.1.1 echo $password | sudo -S /home/user/script.sh >> log.txt
which just executes the local script with root privilege on the local machine.
I have tried many variations of the above commands and I believe it's some sort of piping or flow issue, but I cannot figure it out. Is there any way to do this?
Machines are Ubuntu 16.04 and you cannot ssh in already as root.
Thanks in advance
A) It might be worth looking into an orchestration/config management solution (e.g. Ansible). It's a steep learning curve at first, but the initial outlay will pay off in spades down the line if you're managing multiple servers.
B) Setup password-less sudo for the scripts you want to execute, so you don't have to pass around the password in plaintext, and can run without any input. In sudoers:
user ALL=(ALL) NOPASSWD:/home/user/script.sh
C) Setup an SSH key, so you don't need to use a password at all.
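For (C), the key setup is typically just the following (using the same placeholder host as above):
# Generate a key pair locally; leave the passphrase empty if the job must run unattended.
ssh-keygen -t ed25519
# Copy the public key to the remote host so future logins no longer prompt for a password.
ssh-copy-id user@1.1.1.1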
But in a nutshell, the code you're looking for is something like:
cat /home/user/localScript.sh | ssh user@1.1.1.1 "sudo bash" > log.txt
This executes a non-interactive bash shell as root on the remote machine, which takes the commands to execute on standard input; the standard output comes back over the ssh channel for you to write to your local log.
Look into &> or 2>&1 if you want standard error too.
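Putting the pieces together with the sshpass call from the question, a sketch of the full invocation (assuming NOPASSWD sudo is in place on the remote side so nothing prompts) could be:
# Feed the local script to a root shell on the remote host; collect stdout and stderr in a local log file.
cat /home/user/localScript.sh | sshpass -f password.txt ssh user@1.1.1.1 "sudo bash" > localLog.txt 2>&1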

Running linux commands inside bash script throws permission denied error

We have a Linux script in our environment which SSHes to a remote machine as a common user and copies a script from the base machine to the remote machine through scp.
Script Test_RunFromBaseVM.sh
#!/bin/bash
machines=$1
for machine in $machines
do
ssh -tt -o StrictHostKeyChecking=no ${machine} "mkdir -p -m 700 ~/test"
scp -r bin conf.d ${machine}:~/test
ssh -tt ${machine} "cd ~/test; sudo bash bin/RunFromRemotevm.sh"
done
Script RunFromRemotevm.sh
#!/bin/bash
echo "$(date +"%Y/%m/%d %H:%M:%S")"
Before running the Test_RunFromBaseVM.sh script on the base VM, we run the two commands below.
eval $(ssh-agent)
ssh-add
Executing ./Test_RunFromBaseVM.sh "<list_of_machine_hosts>" gives a permission denied error.
[remote-vm-1] bin/RunFromRemotevm.sh:line 2: /bin/date: Permission denied
Any clues or insights on this error will be of great help.
Thanks.
I believe the problem is the presence of the NOEXEC: tag in the sudoers file, corresponding to the user (or group) that's executing the "cd ~/test; sudo bash bin/RunFromRemotevm.sh" command. This causes any further execv(), execve() and fexecve() calls to be refused; in this case, the refused call is the exec of /bin/date.
The solution is obviously to remove the NOEXEC: tag from the main /etc/sudoers file or from a file under /etc/sudoers.d, wherever this is defined.
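For illustration only, a hypothetical sudoers entry that would produce exactly this symptom, and the same entry with the tag dropped (the user name and command here are made up for the example):
# With the NOEXEC: tag the script starts, but its attempt to exec /bin/date is refused:
commonuser ALL=(ALL) NOEXEC: /bin/bash
# Without the tag, child processes such as /bin/date run normally:
commonuser ALL=(ALL) /bin/bash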

How to sudo run a local script over ssh

I am trying to sudo-run a local script over ssh:
ssh $HOST < script.sh
and I tried
ssh -t $HOST "sudo -s && bash" < script.sh
Actually, I searched a lot on Google and found some similar questions; however, I didn't find a solution that can sudo-run a local script.
Reading the error message of
$ ssh -t $HOST "sudo -s && bash" < script.sh
Pseudo-terminal will not be allocated because stdin is not a terminal.
makes it pretty clear what's going wrong here.
You can't use the ssh parameter -t (which sudo needs in order to ask for a password) while redirecting your script to the stdin of bash in your remote session.
If it is acceptable for you, you could transfer the local script via scp to your remote machine and then execute the script without the need of I/O redirection:
scp script.sh $HOST:/tmp/ && ssh -t $HOST "sudo -s bash /tmp/script.sh"
Another way to fix your issue is to use sudo in non-interactive mode (-n), but for this you need to set NOPASSWD for the executing user in the remote machine's sudoers file. Then you can use:
ssh $HOST "sudo -n -s bash" < script.sh
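The matching sudoers entry on the remote machine would be something along these lines (the user name is a placeholder, and the ALL can be narrowed to a specific command):
# Let the executing user run sudo without a password prompt.
youruser ALL=(ALL) NOPASSWD: ALL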
To make Edward Itrich's answer more scalable and geared towards frequent use, you can set up a system where you only run a one line script that can be quickly ported to any host, file or command in the following manner:
Create a script in your Scripts directory, if you have one, with whatever name you want the script to have (I use this format frequently: change one word for the script name, then create the file, set permissions and open it for editing):
newscript="runlocalscriptonremotehost.sh"
touch $newscript && chmod +x $newscript && nano $newscript
In nano, fill out the script as follows, placing the directory and name of the script you want to run remotely in the variable lines of runlocalscriptonremotehost.sh (you only need to edit lines 1-3):
HOSTtoCONTROL="sudoadmin@192.168.0.254"
PATHtoSCRIPT="/home/username/Scripts/"
SCRIPTname="scripttorunremotely.sh"
scp $PATHtoSCRIPT$SCRIPTname $HOSTtoCONTROL:/tmp/ && ssh -t $HOSTtoCONTROL "sudo -s bash /tmp/$SCRIPTname"
Then just run:
sh ./runlocalscriptonremotehost.sh
Keep runlocalscriptonremotehost.sh open in a tabbed text editor for quick updating, go ahead and create a bash alias for the script, and you have yourself an app-ified version of this frequently used operation.
First of all, divide your objective into two parts: 1) ssh to the host, and 2) run the command you want as sudo. Once you are certain that you can 1) access the host and 2) have sudo privileges, you can combine the two commands with &&. What x_cmd && y_cmd does is execute y_cmd only after x_cmd has exited successfully.
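A minimal illustration of that x_cmd && y_cmd pattern, with a placeholder host and script path:
# The sudo run (step 2) is attempted only after the plain ssh check (step 1) exits successfully.
ssh user@host true && ssh -t user@host "sudo bash /tmp/script.sh"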

Loop in shell script not working on remote server

The code tries to ssh from my local server to a remote server and run some commands.
ssh root@$remoteip 'bash -s' <<END3
gcdadirs=`strings binary | egrep '.gcda$'`
for dir in ${gcdadirs[@]}; do
directory=$(dirname ${dir})
echo $dir >> dirstr.txt
mkdir -p $directory
chown $root:$root $directory
chmod 777 $directory
done
END3
The above creates a directory structure on the remote server, which is working fine.
I want to tar up the same directory structure, so I'm using the same logic as above:
ssh root@$remoteip 'bash -s' <<END3
touch emptyfile
tar -cf gcda.tar emptyfile
gcdadirs=`strings binary | egrep '.gcda$'`
for dir in ${gcdadirs[@]}; do
tar -rf gcda.tar $dir
done
END3
The above piece of code should create a tar with all the directories returned by the for loop included. I tried the code logic by copying the code to the remote server and running it there, and it worked. But if I ssh from my local server to the remote server and try it, it is not entering the for loop; it is not appending anything to the tar file created with the empty file on the second line.
Try <<'END3'
Note the quotes around END3: they prevent shell substitutions inside the here-document. You want the $-signs to be transferred to the other side of the ssh connection, not interpreted locally. Same for the backticks.
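Applied to the second snippet, that looks like this; only the delimiter has changed, so the backticks and $-expansions inside the here-document now run on the remote host rather than locally:
ssh root@$remoteip 'bash -s' <<'END3'
touch emptyfile
tar -cf gcda.tar emptyfile
gcdadirs=`strings binary | egrep '.gcda$'`
for dir in ${gcdadirs[@]}; do
tar -rf gcda.tar $dir
done
END3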
Extracted from the comments as the accepted answer. Posting as community wiki
