I have an input file which contains user names and subjects.
input
sk7865 /opt/apps/login
sk4888 /opt/apps/info
I am writing a shell script that takes input from the above file and sends a mail.
shell script
#!/bin/bash
while read a b
echo echo ""$a"" | mail -s ""$b"" "$a"@example.com
done < input
In the above script, the actual command I wanted to use is:
echo "hello world" | mail -s "a subject" someone#example.com
I want the variables a and b to take the place of "hello world" and "a subject", and the mail to go to the address built from $a, as in the script above. But the variables are not being substituted. I think it is something to do with the double quotes. Please provide me with a proper script.
You seem to be missing a do after your while:
#!/bin/bash
while read -r a b
do
echo "$a" | mail -s "$b" "$a"#somewhere.com
done < input
Remember to always run shellcheck on your bash scripts before posting here.
Also, you had an echo echo in there, and double-double-quotes all over the place (""xyz"").
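As a quick illustration of what those doubled quotes actually do (the value is just sample data):
a="hello world"
printf '[%s]\n' ""$a""   # "" + unquoted $a + "": word-splits into two arguments, [hello] and [world]
printf '[%s]\n' "$a"     # properly quoted: one argument, [hello world]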
I have a problem with the small script I am working on. Can you please explain why this is not working:
#!/bin/bash
var1=$( linux command to list ldap users | grep "user: $1")
echo $var1
So, when I run my script (./mycript.sh $michael), it should use that value in place of $1 and print the output via echo $var1? In my case that is not working.
Can you please explain how I should use a positional parameter inside the variable assignment?
I tried this solution, but it did not help:
#!/bin/bash
var1=$( linux command to list ldap users | grep user: $1)
echo $var1
If you invoke your script as ./mycript.sh $michael and the variable michael is not set in the shell, then you are calling your script with no arguments. Perhaps you meant ./mycript.sh michael, to pass the literal string michael as the first argument. A good way to protect against this sort of error in your script is to write:
#!/bin/bash
var1=$( linux command to list ldap users | grep "user: ${1:?}")
echo "$var1"
The ${1:?} will expand to $1 if that parameter is set and non-empty. If it is not, the shell prints an error message and the script exits.
If you'd like the script to terminate if no values are found by grep, you might want:
var1=$( linux command to list ldap users | grep "user: ${1:?}") || exit
But it's probably easier/better to actually validate the arguments and print an error message. (Personally, I find the error messages from ${1:?} constructs a bit less than ideal.) Something like:
#!/bin/bash
if test $# -lt 1; then echo 'Missing arguments' >&2; exit 1; fi
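A slightly fuller sketch of that validation, with a usage message ("linux command to list ldap users" is still the question's placeholder, not a real command):
#!/bin/bash
if [ $# -lt 1 ]; then
    echo "usage: $0 username" >&2
    exit 1
fi
var1=$( linux command to list ldap users | grep "user: $1" )
echo "$var1"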
This is my script. Even after using the export command I am not able to use the variable outside of the block. Below is the code that I have tried. I also tried other options like declare -x var, but that is not working either.
Can someone please comment on this? Am I doing this right?
#!/bin/bash
{
var="123"
export var # exporting the variable so that I can access it from anywhere
echo "var is $var" # able to get the value of this variable
} | tee log.txt
echo "var is "$var # not able to get the value of this variable
Because the pipe causes the code between the braces to execute in a sub-shell, you need to find another way to capture that data, as a variable set inside the sub-shell is not accessible from the rest of the code. An example would be to store the output of a function in a variable via command substitution. If you have script.sh as such:
#!/bin/bash
function get_pizza() {
echo "Pizza"
}
myvar=$(get_pizza)
printf "myvar is '%s'\n" "$myvar"
echo "Plain echo follows:"
echo "$(get_pizza)"
and then run bash script.sh you will get output as such:
[user@host]$ bash ./script.sh
myvar is 'Pizza'
Plain echo follows:
Pizza
Then if you still want to write to a file via tee, you can pipe your whole script to tee:
bash ./script.sh | tee foo.log
If you only want parts of the script to go to a file, you can also handle that with I/O redirection within the script: echo pizza > foo.log
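If you'd rather keep the brace-group structure from the question, one bash-specific alternative is to replace the pipe with process substitution, so the braces run in the current shell and the variable survives. A minimal sketch, assuming bash (not plain sh):
#!/bin/bash
{
    var="123"
    echo "var is $var"
} > >(tee log.txt)   # redirection, not a pipe: no sub-shell for the braces
echo "var is $var"   # now prints 123 (tee may still be flushing log.txt)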
Please bear with me if my terminology or syntax is less than stellar (still learning). I currently have a simple bash script that checks the arguments of the command and outputs file names with matching text. This part of my script works correctly, via a grep command piped to xargs for proper formatting.
When running the script, I do a simple check for whether the value is null, and move on to running my variable/search if it is not.
My question is: is it possible to have this script output via stdout AND also save to a new file each time it is run, named with the user input and date/time (but not overwriting)? E.g. report-bob-0729161500.rpt
I saw some other suggestions to use tee with the command, but I was trying to get it to work within the script. Similarly, another suggestion was to utilize exec > >(tee -i logfile.txt), but I am unsure how to format this properly so that each run of the script creates a new file that includes the date/time and the $1 input.
Any help or suggested resources?
Thank you.
SEARCH=`[search_variable]`
if [ -z "$SEARCH" ]
then
echo "$1 not found."
else
echo -e "REPORT LISTING\n\n"
echo "$SEARCH"
fi
EDIT: I did try simply piping the echo statements to the tee command, which does work. However, I am still curious if anyone has other suggestions to accomplish this same task via alternative methods. Thank you.
With echo statements piped to tee:
SEARCH=`[search_variable]`
DATE=`date +"%m%d%y%H%M"` # %H avoids the leading space that %k puts in the file name for single-digit hours
if [ -z "$SEARCH" ]
then
echo "$1 not found."
else
echo -e "REPORT LISTING\n\n" | tee tps-list-$1-$DATE.rpt
echo "$SEARCH" | tee tps-list-$1-$DATE.rpt
fi
If you want to do it within the script, why not just write to both standard output and the file (using append where appropriate)? Maybe a bit more writing, but it gives complete control.
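For the exec > >(tee ...) form the question mentions, a minimal sketch (bash-specific; the report-$1-$STAMP.rpt name just follows the question's example):
#!/bin/bash
STAMP=$(date +"%m%d%y%H%M")
exec > >(tee "report-$1-$STAMP.rpt")   # duplicate all stdout into a per-run file
echo -e "REPORT LISTING\n\n"
echo "search results would go here"    # e.g. echo "$SEARCH" as in the question's script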
Leon
I have little knowledge of Linux.
I have a file "iplast.txt", and I need the script to send me a mail when the text in this file is not equal to the result of the command curl http://ipinfo.io/ip.
I've tried this:
if
[ 'cat iplast.txt' = 'curl http://ipinfo.io/ip' ]
then
echo 'ip same'
else
#send mail command that i already know
fi
but this does not compare the contents of the file iplast.txt with the curl output; it compares the literal quoted words instead.
One last thing: it needs to work with FFP (Funz Fun Plug).
I have tried for three days, but as I already said, I only know a little Linux, so please help me. Thanks!
Try this:
#!/usr/bin/env bash
curl http://ipinfo.io/ip > ipnow.txt
OUTPUT="$(diff iplast.txt ipnow.txt)"
if [ "$OUTPUT" = ""]; then
echo "ip same"
else
echo "ip different"
fi
I also suggest using Python rather than bash scripting if possible. This bash stuff is quite cryptic.
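For what it's worth, a shorter sketch that compares the two values directly with command substitution, avoiding the temporary file (curl -s just silences the progress meter):
#!/usr/bin/env bash
if [ "$(cat iplast.txt)" = "$(curl -s http://ipinfo.io/ip)" ]; then
    echo "ip same"
else
    echo "ip different"   # put the send-mail command here
fi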
Two questions: how can I write a shell variable from this script into its child script?
Are there any easier ways to do this?
If you can't follow what I'm doing, I'm:
1) starting with a list of directories whose names will be stored as values taken by $i
2) cd'ing to every value of $i and ls'ing its contents
3) echoing its contents into a new script named after the directory, via cat
4) using echo and cat to write a new script that contains the ls'd values of $i and sends them all to a blogging email address called $i@tumblr.com
#!/bin/sh
read -d '' commands <<EOF
#list of directories goes here
dir1
dir2
dir3
etc...
EOF
for i in $commands
do
cd $SPECIALPATH/$i
echo ("#/bin/sh \n read -d '' directives <<EOF \n") | cat >> $i.sh
ls | cat >> $i.sh
echo ("EOF \n for q in $directives \n do \n uuencode $q $q | sendmail $i \n done \n") | cat >> $i.sh
# NB -- I am asking the script to write the shell variable $i into the new
# script, called $i.sh, as the email address specified, in the middle of an
# echo statement... I am well aware that it doesn't work as is
chmod +x $i.sh
./$i.sh
done
You are abusing felines a lot - you should simply redirect (>> file) rather than pipe to cat to append (| cat >> file).
You can avoid the intermediate $i.sh file by bundling all the output that goes to the file into a single I/O redirection that pipes directly into a shell - then there is no intermediate file to clean up (you didn't show that happening) and no chmod operation.
I would have done this using braces:
{
echo "..."
ls
echo "..."
} | sh
However, when I looked at the script in that form, I realized that wasn't necessary. I've left the initial part of your script unchanged, but the loop is vastly simpler like this:
#!/bin/sh
read -d '' commands <<EOF
#list of directories goes here
dir1
dir2
dir3
etc...
EOF
for i in $commands
do
(
cd $SPECIALPATH/$i
ls |
while read q
do uuencode $q $q | sendmail $i
done
)
done
I'm assuming the sendmail command works - it isn't the way I'd try sending email. I'd probably use mailx or something similar, and I'd avoid using uuencode too (I'd use a base-64 encoding, left to my own devices):
do uuencode $q $q | mailx -s "File $q" $i@tumblr.com
The script also uses parentheses around the cd command. This means that the cd command and what follows it run in a sub-shell, so the parent script does not change directory. In this case, with an absolute pathname in $SPECIALPATH, it would not matter much. But as a general rule, it often makes life easier if you isolate directory changes like that.
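A quick illustration of that sub-shell behaviour (the paths are just examples):
pwd           # e.g. /home/me
( cd /tmp )   # the cd happens in a sub-shell
pwd           # still /home/me - the parent shell never changed directory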
I'd probably simplify it still further for general reuse (though I'd need to add something to ensure that SPECIALPATH is set appropriately):
#!/bin/sh
for i in "$#"
do
(
cd $SPECIALPATH/$i
ls |
while read q
do uuencode $q $q | sendmail $i
done
)
done
I can then invoke it with:
script-name $(<list-of-dirs)
That means that without editing the script, it can be reused for any list of directories.
Intermediate step 1:
for i in $commands
do
(
cd $SPECIALPATH/$i
{
echo "read -d '' directives <<EOF"
ls
echo "EOF"
echo "for q in $directives"
echo "do"
echo " uuencode $q $q | sendmail $i"
echo "done"
} |
sh
)
done
Personally, I find it easier to read the generated script if the code that generates it makes the generated script clear - using multiple echo commands. This includes indenting the code.
Intermediate Step 2:
for i in $commands
do
(
cd $SPECIALPATH/$i
{
echo "ls |"
echo "while read q"
echo "do"
echo "  uuencode \$q \$q | sendmail $i"
echo "done"
} |
sh
)
done
I don't need to read the data into a variable in order to step through each item in the list once - I can simply read each line in turn. The while read mechanism is often useful for splitting a line into multiple variables too: while read var1 var2 var3 junk reads the first field into $var1, the second into $var2, the third into $var3, and anything left over goes into $junk. If you've generated the data accurately, there won't be any junk; but sometimes you have to deal with other people's data.
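For example (the field values are arbitrary):
echo "alpha beta gamma extra stuff" |
while read var1 var2 var3 junk
do
    echo "var1=$var1 var2=$var2 var3=$var3 junk=$junk"
done
# prints: var1=alpha var2=beta var3=gamma junk=extra stuff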
If the generated script is meant to be temporary, I would not use files. Besides, chmodding them executable sounds unsafe. When I needed to parallelize my scripting, I used a bash script to form a set of commands (in an array: split the array in two, then implode the array) into a single \n-separated string, and then passed that to a new bash instance.
Basically, in bash:
for orig in "$@"
do
commands="$commands echo \"echoing stuff here for arguments $orig\" \n"
done
echo -e "$commands" | bash
And a small tip: if the script doesn't need supervising, throw in a & after the piped bash, so your first script can quit while the rest of the work runs forked in the background.
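That is, roughly:
echo -e "$commands" | bash &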
If you export a variable
export VAR1=FOO
it'll be present in any child processes.
If you take a look at the init scripts, /etc/init..d/* you'll notice that many source another file full of "external" definitions. You could set up a file like that and have your child script source these files.
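A minimal sketch of that pattern (the file names are illustrative):
# defs.sh - the file full of "external" definitions
export VAR1=FOO

# child.sh - sources the definitions before using them
. /path/to/defs.sh
echo "$VAR1"   # prints FOO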