Simple bash Linux script to run stored procedure

I need a bash/Linux shell script which will run my stored procedure only on Wednesday.
I am new to Bash shell scripting.
Below is what I came up with:
test.sh
#!/bin/bash
MYSQL="/usr/bin/mysql --compress -hlocalhost -utest -ptest test";
dayofweek=`date +%a`
#if [ ${dayofweek} ='Wed' ] ; then ${MYSQL} -e "CALL testSummary();"; fi ;
When I run it:
sh test.sh, it says:
: command not found
Thank you.

How about running CALL testSummary() from the mysql prompt?
Besides [ ${dayofweek} ='Wed' ], which should be [ ${dayofweek} = 'Wed' ], there is no other issue with your script.
I created a simple testSummary procedure that only prints the value of SELECT 1 + 1, and running your script gives me the correct output.
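For reference, the corrected test.sh with that spacing fixed (and the variable quoted, to be safe) would look like this; it is just the script from the question with the fix applied:
#!/bin/bash
MYSQL="/usr/bin/mysql --compress -hlocalhost -utest -ptest test"
dayofweek=$(date +%a)
# run the stored procedure only on Wednesday
if [ "${dayofweek}" = 'Wed' ]; then ${MYSQL} -e "CALL testSummary();"; fi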

I don't see why you'd be getting the command not found error.
Couldn't the -e "CALL testSummary();" part be what's causing it?
Anyway:
You don't need
#!/bin/bash
If you're running it with
$ sh test.sh
Now if you made that script executable with
$ chmod +x test.sh
then doing $ ./test.sh would be tantamount to calling /bin/bash test.sh.
Make up your mind as to whether you want to use bash or sh. They are different.
If you want to run something periodically on Linux, you might want to take a look at this:
https://help.ubuntu.com/community/CronHowto
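For example, a crontab entry like the one below runs the script every Wednesday at 09:00 (the time and the path are placeholders), which would make the day-of-week check inside the script unnecessary:
# m h  dom mon dow   command
0 9 * * 3 /path/to/test.sh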

Related

How to handle errors in shell script

I am writing a shell script to install my application. I have a number of commands in my script, such as copy, unzip, move, if, and so on. I want to know the error if any of the commands fails. Also, I don't want to return exit codes other than zero.
Order of script installation (root-file.sh):
./script-to-install-mongodb
./script-to-install-jdk8
./script-to-install-myapplicaiton
Sample script file:
cp sourceDir destinationDir
unzip filename
if [ true ]; then
    # success code
fi
I want to know, via a variable or a message, if any command in the scripts called from root-file.sh failed.
I don't want to write code to check every command's status. Sometimes the cp or mv command may fail due to an invalid directory. At the end of the script execution, I want to know whether all commands were executed successfully or there was an error.
Is there a way to do it?
Note: I am using shell script (sh), not bash.
The status of your last command is stored in the special variable $?; you can save it into a named variable with export var=$?:
unzip filename
export unzipStatus=$?
./script1.sh
export script1Status=$?
if [ "$unzipStatus" -eq 0 ] && [ "$script1Status" -eq 0 ]
then
echo "Everything successful!"
else
echo "unsuccessful"
fi
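Applying the same idea to the root-file.sh from the question (a sketch, with the script names taken from there) would look like:
#!/bin/sh
./script-to-install-mongodb
mongoStatus=$?
./script-to-install-jdk8
jdkStatus=$?
./script-to-install-myapplicaiton
appStatus=$?
# report overall success only if every installer exited with 0
if [ "$mongoStatus" -eq 0 ] && [ "$jdkStatus" -eq 0 ] && [ "$appStatus" -eq 0 ]
then
    echo "Everything successful!"
else
    echo "unsuccessful"
fi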
Well, as you are using a plain shell script to achieve this, there's not much external tooling, so the default $? should be of help. You may want to check the return value in between the scripts. The code will look like this:
./script_1
retval=$?
if [ "$retval" -eq 0 ]; then
    echo "script_1 successfully executed ..."
else
    echo "script_1 failed with error exit code !"
    exit 1
fi
./script_2
Lemme know if this added any value to your scenario.
Exception handling in Linux shell scripting can be done as follows:
command || fallback_command
If you have multiple commands then you can do
(command_one && command_two) || fallback_command
Here fallback_command can be an echo or log details in a file etc.
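For instance, using the commands from the question (the log file name is only an example):
# if either the copy or the unzip fails, record it and keep going
(cp sourceDir destinationDir && unzip filename) || echo "copy/unzip step failed" >> install-errors.log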
I don't know if you have tried putting set -x at the top of your script to see detailed execution.
Want to give my 2 cents here. Run your script like this:
sh root-file.sh 2> errors.txt
Then grep for patterns in errors.txt:
grep -e "root-file.sh: line" -e "script-to-install-mongodb.sh: line" -e "script-to-install-jdk8.sh: line" -e "script-to-install-myapplicaiton.sh: line" errors.txt
The output of the above grep command will display the commands which had errors, along with their line numbers. Let's say the output is:
test.sh: line 8: file3: Permission denied
You can just go and check the line number (here it is 8) which had the issue; see this for going to a line number in vi.
Or this can also be automated: grep the specific line from your shell script. Here the line with the issue is 8:
head -8 test1.sh |tail -1
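An equivalent single command, if you prefer sed over the head/tail pipeline, would be:
sed -n '8p' test1.sh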
hope it helps.

How to pass shell variables as Command Line Argument to a shell script

I have tried passing the shell variables to a shell script via command line arguments.
Below is the command written inside the shell script.
LOG_DIRECTORY="${prodName}_${users}users"
mkdir -m 777 "${LOG_DIRECTORY}"
and I'm trying to run this as:
prodName='DD' users=50 ./StatCollection_DBServer.sh
The command is working fine and creating the directory as per my requirement. But the issue is that I don't want to execute the shell script as shown above.
Instead, I want to run it like
DD 50 ./StatCollection_DBServer.sh
and have the script variables get their values from there, so that the directory created will be "DD_50users".
Any help on how to do this?
Thanks in advance.
Bash scripts take arguments after the script name, not before, so you need to call the script like this:
./StatCollection_DBServer.sh DD 50
Inside the script, you can access the arguments as $1 and $2, so the script could look like this:
#!/bin/bash
LOG_DIRECTORY="${1}_${2}users"
mkdir -m 777 "${LOG_DIRECTORY}"
I hope this helps...
Edit:
Just a small explanation of what happened in your approach:
prodName='DD' users=50 ./StatCollection_DBServer.sh
In this case, you set the environment variables prodName and users before calling the script. That is why you were able to use these variables inside your code.
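A tiny demonstration of that behavior (the script name env-demo.sh is just for illustration):
#!/bin/sh
# env-demo.sh - prints the two variables passed in via the environment
echo "prodName=$prodName users=$users"
Running prodName='DD' users=50 ./env-demo.sh prints prodName=DD users=50, because both assignments are placed in the environment of that single command.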
#!/bin/sh
prodName=$1
users=$2
LOG_DIRECTORY="${prodName}_${users}users"
echo $LOG_DIRECTORY
mkdir -m 777 "$LOG_DIRECTORY"
and call it like this:
chmod +x script.sh
./script.sh DD 50
Or simply call it like this: sh script.sh DD 50
This script will read the command line arguments:
prodName=$1
users=$2
LOG_DIRECTORY="${prodName}_${users}users"
mkdir -m 777 "$LOG_DIRECTORY"
Here $1 will contain the first argument and $2 will contain the second argument.

command not found when executing nested bash scripts

I have a bash script that executes another bash script:
ex:
a script named "rotator" is calling a script named "s3-get" like below
!# /bin/bash
...
./s3-get {and params here}
All commands such as "cat", "basename", etc. run correctly here.
Within the "s3-get" script there is code as:
!# /bin/bash
cat > /dev/null << EndOfLicense
...
readonly weAreKnownAs="$(basename $0)"
...
main "$#"
So, if I simply execute the s3-get script directly from the shell, it runs perfectly. When I try to execute it from the "rotator" script, I get the error "cat: command not found". I can fix this by changing "cat" to "/bin/cat", but I don't think this is correct since, as I stated above, the script runs correctly when executed standalone. If I fix the "cat" command as above, the next error that is raised is "basename: command not found", then "main: command not found".
I am pretty new to shell programming, so any help is appreciated.
Thank you
Try $ echo 'export PATH=$PATH:/root/scripts/RotateVideos' >> ~/.bashrc && source ~/.bashrc on the command line, and then just call it as s3-get in your script. Alternatively, use cd /root/scripts/RotateVideos && bash s3-get.
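If the "rotator" script happens to run with a stripped-down environment (for example from cron), another option, sketched below, is to prepend the standard system directories to PATH at the top of the calling script itself:
#!/bin/bash
# make sure the usual system directories are on PATH before calling s3-get
export PATH="/usr/local/bin:/usr/bin:/bin:$PATH"
./s3-get {and params here}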

Exporting a script variable using `sh -cx`

I'm trying to export a variable from a script in the following manner:
sh -xc "<script here>"
But cannot get it to work at all. I've tried several techniques such as:
sh -xc "./xxx.sh"(exporting a variable yyy from the file itself)
sh -xc "./xxx.sh && export yyy=1"
(had xxx.sh exit 0)
sh -xc ". ./xxx.sh"
As well as several permutations of the above, but no dice on any of them.
Unfortunately, I must conform to the sh -xc "<script here>" style. Any script I execute will be placed inside of the quotations, file and/or command(s). There's no way around this.
Is what I'm asking even possible? If so, how?
Thanks!
You can't do an export through a shell script, because a shell script runs in a child shell process, and only children of the child shell would inherit the export.
The reason for using source is to have the current shell execute the commands.
It's very common to place export commands in a file such as .bashrc, which bash will source on startup (or similar files for other shells).
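A minimal sketch of the difference, assuming a hypothetical one-line script setvar.sh that just contains export yyy=1:
sh -xc "./setvar.sh"   # setvar.sh (hypothetical) runs in a child process; yyy is NOT set here afterwards
. ./setvar.sh          # sourced into the current shell; yyy IS set here afterwards
echo "$yyy"            # prints 1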
But you can do as follows:
shell$ cat > script.sh
#!/bin/sh
echo export myTest=success
^D
shell$ chmod u+x script.sh
And then have the current shell execute that output:
shell$ `./script.sh`
shell$ echo $myTest
success
shell$ /bin/sh
$ echo $myTest
success
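The same trick can be written without backticks; in the transcript style above it would read:
shell$ eval "$(./script.sh)"
shell$ echo $myTest
success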

How to create file with instruction of commands?

I have a file named myFirstFile that contains certain commands, but I am not able to execute them.
If I want to execute this as a program, what do I need to add?
If you want to execute your program, it should start with:
#!/bin/sh
It's the generic script file "header". It indicates that the script is a shell script (if it's a bash script you should have #!/bin/bash, etc.). If you want to execute it, you should call chmod +x ./myFirstFile to give it permission to run as a program, and then you can start your script normally: ./myFirstFile.
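For example, a minimal myFirstFile could look like this (the two commands are just placeholders for whatever your file contains):
#!/bin/sh
# list the current directory, then print the date
ls
date
After chmod +x ./myFirstFile, running ./myFirstFile executes both commands.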
Make this file executable* and give it a *.sh extension, like:
"myFirstFile.sh"
Then run it from the terminal (or from crontab - it can do things for you while you sleep :) ) like:
cd directory/you/have/that/file
sh ./myFirstFile.sh
*I'm not sure that making it executable is the most secure thing you can do. All my sh scripts are, and I never dug into this issue, so make sure it's fine for your case.
Also make sure you have "#!/bin/bash" in the first line - sometimes it helps (don't know why, Google it).
Edit: for example, my script for starting a Minecraft server looks like this:
start.sh
#!/bin/bash
BINDIR=$(dirname "$(readlink -fn "$0")")
cd "$BINDIR"
while true
do
java -Xmx3584M -jar craftbukkit.jar
echo -e 'If you want to completely stop the server process now, press ctrl-C'
echo "Rebooting in:"
for i in {5..1}
do
echo "$i..."
sleep 1
done
echo 'Restarting now!'
done
You have to make the file executable:
chmod +x myFirstFile
Then you can execute the commands in it:
./myFirstFile
