Hi, I have a shell script on Ubuntu 12.04 LTS which contains an s3cmd command.
I configured a cron job for this shell script. The local part works fine, but the file never gets pushed to S3. When I run the shell script manually, it pushes the file to S3 without any error. I checked the logs and found nothing about this. Here is my shell script.
#!/bin/bash
User="abc"
datab="abc_xyz"
pass="abc#123"
Host="abc1db.instance.com"
FILE="abc_rds`date +%d_%b_%Y`.tar.gz"
S3_BKP_PATH="s3://abc/db/"
cd /abc/xyz/scripts/
mysqldump -u "$User" -h "$Host" -p"$pass" "$datab" 2>> /abc/xyz/logs/app-bkp.log | gzip -c > "$FILE"
s3cmd --recursive put "/abc/xyz/scripts/$FILE" "$S3_BKP_PATH" | tee -a /abc/xyz/logs/app-bkp.log
mv /abc/xyz/scripts/$FILE /abc/xyz/backup2015/Database/
#END
This is really weird. Any suggestion would be a great help.
Check whether the user configured in crontab has the correct permissions and keys in its environment.
I am guessing the keys are configured in an env file, since they are not here in the script.
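Also note that cron runs with a minimal environment: PATH is usually just /usr/bin:/bin and HOME may differ, so s3cmd may not be found or may fail to locate its ~/.s3cfg. A sketch of a crontab entry that sidesteps both (the schedule and script name here are illustrative):
0 2 * * * /bin/bash /abc/xyz/scripts/backup.sh >> /abc/xyz/logs/cron-bkp.log 2>&1
Inside the script, you can also point s3cmd at its config explicitly, e.g. s3cmd --config /home/abc/.s3cfg put ... (the config path is an assumption; adjust it to the crontab user's home).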
Related
I've been working on a bash script that automatically runs certain scripts on remote machines and saves the logs to certain folders. As of now I have been copying the local script to the remote machine, executing it into a remote log, copying the remote log into a local folder, and then deleting the remote log and remote copy of the script.
This works, but I know it can work better if I can avoid all the in-between steps. The one caveat is that I need this to be automatic and passwordless (meaning no user input at all). One of the scripts needs to be run as root, or it won't display all the necessary information and will user-lock the machine temporarily.
The code I am currently using to execute the remoteScript into a log that I later retrieve with scp is below.
sshpass -f password.txt ssh user#1.1.1.1 "echo $password | sudo -S /home/user/remoteScript.sh > remoteLog.txt"
And in my testing, executing a local script on the remote machine into a local log file works like below:
sshpass -f password.txt ssh user@1.1.1.1 "bash -s" < /home/user/localScript.sh >> localLog.txt
How could I combine the elements of the two code examples above in order to make a local script run on a remote machine with root privilege and log the output into a local text file?
Some things I have tried that do not work include:
sshpass -f password.txt ssh user#1.1.1.1 "bash -s" < "echo $password | sudo -S /home/user/script.sh >> log.txt"
sshpass -f password.txt ssh user#1.1.1.1 "echo $password | sudo -S /home/user/script.sh" >> log.txt
and notably
sshpass -f password.txt ssh user@1.1.1.1 echo $password | sudo -S /home/user/script.sh >> log.txt
which just executes the local script with root privilege on the local machine.
I have tried many variations of the above commands and I believe it's some sort of piping or flow issue, but I cannot figure it out. Is there any way to do this?
The machines are Ubuntu 16.04 and you cannot SSH in as root directly.
Thanks in advance
A) It might be worth looking into an orchestration/config management solution (e.g. Ansible). It's a steep learning curve at first, but the initial outlay will pay off in spades down the line if you're managing multiple servers.
B) Set up password-less sudo for the scripts you want to execute, so you don't have to pass the password around in plaintext and can run without any input. In sudoers:
user ALL=(ALL) NOPASSWD:/home/user/script.sh
C) Set up an SSH key, so you don't need to use a password at all.
But in a nutshell, the code you're looking for is something like:
cat /home/user/localScript.sh | ssh user@1.1.1.1 "sudo bash" > log.txt
This executes a non-interactive bash shell as root on the remote machine; it takes the commands to execute on standard input, and the standard output comes back over the SSH channel for you to write to your local log.
Look into &> or 2>&1 if you want standard error too.
We have a Linux script in our environment which SSHes to a remote machine as a common user and copies a script from the base machine to the remote machine via scp.
Script Test_RunFromBaseVM.sh
#!/bin/bash
machines=$1
for machine in $machines
do
ssh -tt -o StrictHostKeyChecking=no ${machine} "mkdir -p -m 700 ~/test"
scp -r bin conf.d ${machine}:~/test
ssh -tt ${machine} "cd ~/test; sudo bash bin/RunFromRemotevm.sh"
done
Script RunFromRemotevm.sh
#!/bin/bash
echo "$(date +"%Y/%m/%d %H:%M:%S")"
Before running the Test_RunFromBaseVM.sh script on the base VM, we run the two commands below.
eval $(ssh-agent)
ssh-add
Executing ./Test_RunFromBaseVM.sh "<list_of_machine_hosts>" gives a permission denied error:
[remote-vm-1] bin/RunFromRemotevm.sh:line 2: /bin/date: Permission denied
Any clue or insight on this error will be of great help.
Thanks.
I believe the problem is the presence of the NOEXEC: tag in the sudoers file for the user (or group) that's executing the "cd ~/test; sudo bash bin/RunFromRemotevm.sh" command. This tag causes any further execv(), execve() and fexecve() calls to be refused; in this case the refused call is for /bin/date.
The solution is obviously to remove the NOEXEC: tag from the main /etc/sudoers file or from a file under /etc/sudoers.d, wherever it is defined.
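For illustration, a sudoers entry like this would produce exactly that symptom (the user name is hypothetical):
user ALL=(ALL) NOEXEC: /bin/bash
With NOEXEC, sudo lets bash itself start but refuses any binary bash then tries to exec, such as /bin/date. Dropping the NOEXEC: tag, or overriding it with EXEC:, restores normal behavior.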
bash -c "$(curl -s https://install.prediction.io/install.sh)"
I ran the above command on an EC2 instance running Amazon Linux to install PredictionIO. Nothing happens, no error. Does anyone know what's going on?
That command downloads the script in silent mode, $(curl -s https://install.prediction.io/install.sh), and then runs its contents with bash -c.
You should run the command in separate steps to check what is going on:
curl -O https://install.prediction.io/install.sh
chmod +x install.sh
./install.sh
First, check whether the link https://install.prediction.io/install.sh is accessible from your Amazon EC2 instance:
curl -v https://install.prediction.io/install.sh
If it's working, try this:
curl -s https://install.prediction.io/install.sh | bash
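One more note: -s also silences curl's error messages, so a failed download dies quietly inside the $(...) substitution. Keeping -s but adding -S (show errors) and -f (fail on HTTP errors) makes failures visible:
curl -fsS https://install.prediction.io/install.sh | bash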
This seems to be a simple issue, but I'm not able to figure it out. I am trying to run a couple of small scripts against a set of servers and I'm having issues. I have an allhosts file containing the list of servers, in the same location as the .sh file.
Script to create a directory structure across all 20 servers with 777 permissions:
#!/bin/bash
for q in `cat allhosts`
do
ssh $q "mkdir -p /opt/acd/hgf/tom/hanks/"
chmod -R 777 $q "/opt/acd/hgf/tom/hanks/" >/dev/null 2>&1
done
In the above script, it only creates the directory paths and does not change the permissions on them. I tried running the chmod command in a separate script, but no use.
Script to scp the contents of hanks to the hanks folder created on each new server:
#!/bin/bash
for q in `cat allhosts`
do
scp /opt/acd/hgf/tom/hanks/* $q:/opt/acd/hgf/tom/hanks/ >/dev/null 2>&1
done
In this script too, when I run it, it's not copying anything to any of the servers.
I know this is a very small issue, but please check and let me know where I am going wrong. Thanks in advance.
The first script is failing because it is running the chmod on the local machine. You should run it on the remote machine via ssh - you could combine this with the other ssh invocation as follows:
ssh $q "mkdir -p /opt/acd/hgf/tom/hanks/ ; chmod -R 777 /opt/acd/hgf/tom/hanks/"
I'd guess the second script is failing because the first script isn't setting permissions; it looks okay.
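Put together, a corrected sketch of both loops (same allhosts file and paths as above) would be:
#!/bin/bash
# create the tree and set permissions in one remote call, then copy the files over
for q in `cat allhosts`
do
ssh $q "mkdir -p /opt/acd/hgf/tom/hanks/ ; chmod -R 777 /opt/acd/hgf/tom/hanks/"
scp /opt/acd/hgf/tom/hanks/* $q:/opt/acd/hgf/tom/hanks/ >/dev/null 2>&1
done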
I configured Cygwin on Windows Server 2008; now we need to implement automation.
I am writing a batch script to add a user to the cygwin\etc\passwd file using the following command:
mkpasswd -l -u %username% -p /home >> /etc/passwd
Please help me execute the following commands in a batch file:
echo off
C:
chdir C:\cygwin\bin
bash --login -i
mkpasswd -l -u %username% -p /home >> /etc/passwd
It's not working.
You're mixing Windows and Unix in your Windows batch file. The batch file runs as a Windows command, as does the mkpasswd command in it. Windows has no concept of /etc/passwd and will throw an error, probably something like:
D:\cygwin\bin>mkpasswd -l -u testusr -p /home >> /etc/passwd
The system cannot find the path specified.
Given what you want to do with mkpasswd I'd suggest you find a way to run your automation from within Cygwin. Perhaps setting up a cron job.
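If it has to stay a batch file, one approach (a sketch, assuming Cygwin lives under C:\cygwin) is to hand the whole command line, including the redirection, to a single non-interactive Cygwin bash, so that /etc/passwd is resolved inside Cygwin:
@echo off
REM run mkpasswd inside Cygwin's bash so the Unix path resolves
C:\cygwin\bin\bash --login -c "mkpasswd -l -u '%username%' -p /home >> /etc/passwd"
Note that %username% is expanded by the Windows shell before bash ever sees the command; the single quotes just guard against stray characters in the name.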