echoing "-n" in the shell doesn't print the right thing - linux

I know that this is some kind of special-character issue, but I do not know how to solve it.
I type in the console
echo "-n"
and nothing gets printed :(
I also tried
echo -e "-n"
to have the escape sequences interpreted, but again nothing happened.
How can I print "-n"?

Try
printf "%s\n" -n
or
printf "%s\n" '-n'

Here is one way:
aix@aix:~$ echo -e '\x2dn'
-n
It escapes the - as \x2d.
A more verbose way is to print the two characters separately:
aix@aix:~$ echo -n -; echo n
-n
Here, the -n instructs the first echo to not print a newline; it is not related to the -n being printed. :)

About egrep command

How can I create a bash script that accepts a file as a command-line argument and prints to the screen all lines longer than 12 characters, using the egrep command?
You can use:
egrep '.{13}'
The . will match any character, and the {13} repeats it exactly 13 times. You can put this in a shell script like:
#!/bin/sh
# Make sure the user actually passed an argument. This is useful
# because otherwise grep will try and read from stdin and hang forever
if [ -z "$1" ]; then
echo "Filename needed"
exit 1
fi
egrep '.{13}' "$1"
The $1 refers to the first command-line argument. You can also use $2, $3, etc., and "$@" expands to all of the command-line arguments (useful if you want to run it over multiple files):
egrep '.{13}' "$@"
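If you would rather loop over several files explicitly instead of handing the whole list to egrep, a minimal sketch (the script layout and usage message are made up) could look like:
#!/bin/sh
# Print lines longer than 12 characters from every file named on the command line.
if [ $# -eq 0 ]; then
    echo "usage: $0 file..." >&2
    exit 1
fi
for f in "$@"; do
    egrep '.{13}' "$f"
done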

How do I echo "-e"?

I want to echo a string that might contain the same parameters as echo. How can I do it without modifying the string?
For instance:
$ var="-e something"
$ echo $var
something
... didn't print -e
A surprisingly deep question. Since you tagged bash, I'll assume you mean bash's internal echo command, though the GNU coreutils' standalone echo command probably works similarly enough.
The gist of it is: if you really need to use echo (which would be surprising, but that's how the question is phrased), it all depends on exactly what your string can contain.
The easy case: -e plus non-empty string
In that case, all you need to do is quote the variable before passing it to echo.
$ var="-e something"
$ echo "$var"
-e something
If the string isn't exactly an echo option (or combination of options), i.e. it has any non-option suffix, it won't be recognized as an option by echo and will be printed out.
Harder: string can be -e only
If your case can reduce to just "-e", it gets trickier. One way to do it would be:
$ echo -e '\055e'
-e
(escaping the dash as an octal sequence so it doesn't get interpreted as an option)
That's rewriting the string. It can be done automatically and non-destructively, so it feels acceptable:
$ var="-e something"
$ echo -e ${var/#-/\\055}
-e something
You'll notice I'm actually using the -e option to interpret the octal sequence, so this won't work if what you intended to echo was -E. It will work for the other options, though.
The right way
Seriously, you're not restricted to echo, are you?
printf '%s\n' "$var"
The proper bash way is to use printf:
printf "%s\n" "$var"
By the way, your echo didn't work because when you run:
var="-e something"
echo $var
(without quoting $var), echo sees two arguments: -e and something. When echo meets -e as its first argument, it treats it as an option (this is also true for -n and -E) and processes it as such. If you had quoted $var, as shown in other answers, it would have worked.
Quote it:
$ var="-e something"
$ echo "$var"
-e something
If what you want is to get echo -e's behaviour (enable interpretation of backslash escapes), then you have to leave the $var reference without quotes:
$ var="hi\nho"
$ echo $var
hi
ho
Or use eval:
$ var="hi\nho"
$ eval echo \${var}
hi\nho
$ var="-e hi\nho"
$ eval echo \${var}
hi
ho
Since we're using bash, another alternative to echo is to simply cat a "here string":
$ var="-e something"
$ cat <<< "$var"
-e something
$ var="-e"
$ cat <<< "$var"
-e
$
printf-based solutions will almost certainly be more portable though.
Try the following:
$ env POSIXLY_CORRECT=1 echo -e
-e
Due to shell aliases and built-in echo command, using an unadorned
echo interactively or in a script may get you different functionality
than that described here. Invoke it via env (i.e., env echo ...)
to avoid interference from the shell.
The environment variable POSIXLY_CORRECT was introduced to allow the user to force the standards-compliant behaviour. See: POSIX at Wikipedia.
Or use printf:
$ printf '%s\n' "$var"
Source: Why is bash swallowing -e in the front of an array at stackoverflow SE
Use printf instead:
var="-e bla"
printf "%s\n" "$var"
Using just echo "$var" will still fail if var contains just a -e or similar. If you need to be able to print that as well, use printf.
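To see that corner case in a bash session: with the quoted lone "-e", echo still takes its only argument as an option and prints nothing but an empty line, while printf prints it literally:
$ var="-e"
$ echo "$var"

$ printf '%s\n' "$var"
-e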

I keep getting a 'while syntax' error on the output of the at job in unix and I have no idea why

#!/usr/dt/bin/dtksh
while getopts w:m: option
do
    case $option in
        w) wflag=1
           wval="$OPTARG";;
        m) mflag=1
           mval="$OPTARG";;
        ?) printf 'BAD\n' $0
           exit 2;;
    esac
done
if [ ! -z "$wflag" ]; then
    printf "W and -w arg is $wval\n"
fi
if [ ! -z "$mflag" ]; then
    printf "M and -m arg is $mval\n"
fi
shift $(($OPTIND - 1))
printf "Remaining arguments are: $* \n"
at $wval <<ENDMARKER
echo $* >> Search_List
tr " " "\n" <Search_List >Usr_List
while true; do
    if [ -s Usr_List ]; then
        for i in $(cat Usr_List); do
            if finger -m | grep $i; then
                echo '$i is online' | elm user
                sed '/$i/d' <Usr_List >tmplist
                mv tmplist Usr_List
            fi
        done
    else
        break
    fi
done
ENDMARKER
Essentially I want to keep searching through until it is empty. Each time an element of the list is found, it is deleted. Once the list is empty quit.
There are no error messages when I first run the command; the error only shows up in an email containing the output of the at job.
Thanks in advance for any advice
EDIT: The script uses getopts and takes one argument for -w and one for -m, the w value is set as the time for the at job, the m still has to be used. Any arguments after the one for m are sent to a file called Search_List, Search_List is edited and saved as Usr_List. Then in the while loop, while Usr_List is not empty, the script checks the results of finger -m against the names in Usr_List. If a name is found, it is removed from Usr_List. Once Usr_List is empty, the program should stop.
elm is a way to send an email, so elm user sends an email to user.
The error is :
while: Expression syntax
at uses /bin/sh by default.
at now <<ENDMARKER
<code here>
ENDMARKER
All of this executes under /bin/sh, which on some systems can be Bourne Shell (Solaris for example).
You need to figure out what /bin/sh is on your system, then modify things accordingly. Also, read the guarantees about what is and what is not available in your at environment; I think the problem lies there. You have both UNIX and Linux tags, so I cannot give much more specific help than that.
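A quick way to check what /bin/sh actually is on a Linux box (on other Unix systems the commands may differ):
$ ls -l /bin/sh          # often a symlink, e.g. to dash or bash
$ readlink -f /bin/sh    # resolve the link (GNU coreutils)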
You can enable logging -- the way YOU need it -- of the at code chunk:
exec > /tmp/somefile.log 2>&1
Then write debugging messages to stdout or stderr.
Your HEREDOC is being interpolated. Try quoting the delimiter:
at $wval << 'ENDMARKER'
Although (I haven't looked closely) it appears that you want some interpolation. But you definitely do not want it on the line in which you reference $i, so quote that $ if you do not quote the entire heredoc:
if finger -m | grep \$i; then
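A quick way to see the difference between an unquoted and a quoted heredoc delimiter (the variable name is arbitrary):
$ i=outer
$ cat <<EOF
> $i
> EOF
outer
$ cat <<'EOF'
> $i
> EOF
$i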
You need to pass the -k option to at:
...
at -k $wval <<ENDMARKER
...
at is otherwise defaulting to your login shell which is csh or one of its derivatives.
It turns out that the while command and the if command needed to be combined.
while [[ -s Usr_List ]]; do
......
done
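For reference, here is a minimal sketch of that polling loop written with the portable [ ] test, so it also runs under the plain /bin/sh that at tends to use (finger, elm and the file names are taken from the question):
while [ -s Usr_List ]; do
    for i in $(cat Usr_List); do
        if finger -m | grep "$i" > /dev/null; then
            echo "$i is online" | elm user
            sed "/$i/d" < Usr_List > tmplist
            mv tmplist Usr_List
        fi
    done
done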

Forcing bash to expand variables in a string loaded from a file

I am trying to work out how to make bash (force?) expand variables in a string (which was loaded from a file).
I have a file called "something.txt" with the contents:
hello $FOO world
I then run
export FOO=42
echo $(cat something.txt)
this returns:
hello $FOO world
It didn't expand $FOO even though the variable was set. I can't eval or source the file - as it will try and execute it (it isn't executable as it is - I just want the string with the variables interpolated).
Any ideas?
I stumbled on what I think is THE answer to this question: the envsubst command:
echo "hello \$FOO world" > source.txt
export FOO=42
envsubst < source.txt
This outputs: hello 42 world
If you would like to continue working with the data in a file destination.txt, push it back out to a file like this:
envsubst < source.txt > destination.txt
In case it's not already available in your distro, it's in the
GNU package gettext.
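GNU envsubst also accepts a shell-format string that restricts which variables get substituted; a small sketch reusing the file names above:
export FOO=42 BAR=7
# Only $FOO is replaced; any other $NAME in source.txt is left untouched.
envsubst '$FOO' < source.txt > destination.txt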
@Rockallite
I wrote a little wrapper script to take care of the '$' problem.
(BTW, there is a "feature" of envsubst, explained at
https://unix.stackexchange.com/a/294400/7088
for expanding only some of the variables in the input, but I
agree that escaping the exceptions is much more convenient.)
Here's my script:
#! /bin/bash
## -*-Shell-Script-*-
CmdName=${0##*/}
Usage="usage: $CmdName runs envsubst, but allows '\$' to keep variables from
being expanded.
With option -sl '\$' keeps the back-slash.
Default is to replace '\$' with '$'
"
if [[ $1 = -h ]] ;then echo -e >&2 "$Usage" ; exit 1 ;fi
if [[ $1 = -sl ]] ;then sl='\' ; shift ;fi
sed 's/\\\$/\${EnVsUbDolR}/g' | EnVsUbDolR=$sl\$ envsubst "$@"
Many of the answers using eval and echo kind of work, but break on various things, such as multiple lines, attempts to escape shell meta-characters, escapes inside the template that are not intended to be expanded by bash, etc.
I had the same issue, and wrote this shell function, which as far as I can tell, handles everything correctly. This will still strip only trailing newlines from the template, because of bash's command substitution rules, but I've never found that to be an issue as long as everything else remains intact.
apply_shell_expansion() {
    declare file="$1"
    declare data=$(< "$file")
    declare delimiter="__apply_shell_expansion_delimiter__"
    declare command="cat <<$delimiter"$'\n'"$data"$'\n'"$delimiter"
    eval "$command"
}
For example, you can use it like this with a parameters.cfg which is really a shell script that just sets variables, and a template.txt which is a template that uses those variables:
. parameters.cfg
printf "%s\n" "$(apply_shell_expansion template.txt)" > result.txt
In practice, I use this as a sort of lightweight template system.
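As a concrete (hypothetical) illustration, assuming the function above has been defined in the current shell:
$ cat parameters.cfg
FOO=42
$ cat template.txt
hello $FOO world
$ . parameters.cfg
$ printf "%s\n" "$(apply_shell_expansion template.txt)"
hello 42 world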
You can try
echo $(eval echo $(cat something.txt))
You don't want to print each line, you want to evaluate it so that Bash can perform variable substitutions.
FOO=42
while read; do
    eval echo "$REPLY"
done < something.txt
See help eval or the Bash manual for more information.
Another approach (which seems icky, but I am putting it here anyway):
Write the contents of something.txt to a temp file, with an echo statement wrapped around it:
something=$(cat something.txt)
echo "echo \"" > temp.out
echo "$something" >> temp.out
echo "\"" >> temp.out
then source it back in to a variable:
RESULT=$(source temp.out)
and $RESULT will have it all expanded. But it seems so wrong!
Single-line solution that doesn't need a temporary file:
RESULT=$(source <(echo "echo \"$(cat something.txt)\""))
#or
RESULT=$(source <(echo "echo \"$(<something.txt)\""))
If you only want the variable references to be expanded (an objective that I had for myself) you could do the below.
contents="$(cat something.txt)"
echo $(eval echo \"$contents\")
(The escaped quotes around $contents are key here.)
If something.txt has only one line, here is a bash method (a shorter version of Michael Neale's "icky" answer) using process and command substitution:
FOO=42 . <(echo -e echo $(<something.txt))
Output:
hello 42 world
Note that export isn't needed.
If something.txt has one or more lines, a GNU sed evaluate method:
FOO=42 sed 's/"/\\\"/g;s/.*/echo "&"/e' something.txt
The following solution:
replaces variables that are defined
leaves the placeholders of variables that are not defined unchanged (this is especially useful during automated deployments)
supports replacement of variables in the following formats:
${var_NAME}
$var_NAME
reports which variables are not defined in the environment, and returns an error code in that case
TARGET_FILE=someFile.txt;
ERR_CNT=0;
for VARNAME in $(grep -P -o -e '\$[\{]?(\w+)*[\}]?' ${TARGET_FILE} | sort -u); do
    VAR_VALUE=${!VARNAME};
    VARNAME2=$(echo $VARNAME| sed -e 's|^\${||g' -e 's|}$||g' -e 's|^\$||g' );
    VAR_VALUE2=${!VARNAME2};
    if [ "xxx" = "xxx$VAR_VALUE2" ]; then
        echo "$VARNAME is undefined ";
        ERR_CNT=$((ERR_CNT+1));
    else
        echo "replacing $VARNAME with $VAR_VALUE2" ;
        sed -i "s|$VARNAME|$VAR_VALUE2|g" ${TARGET_FILE};
    fi
done
if [ ${ERR_CNT} -gt 0 ]; then
    echo "Found $ERR_CNT undefined environment variables";
    exit 1
fi
foo=45
file=something.txt # the file contains: Hello $foo world!
eval echo $(cat $file)
$ eval echo $(cat something.txt)
hello 42 world
$ bash --version
GNU bash, version 3.2.57(1)-release (x86_64-apple-darwin17)
Copyright (C) 2007 Free Software Foundation, Inc.
envsubst is a great solution (see LenW's answer) if the content you're substituting is of "reasonable" length.
In my case, I needed to substitute a file's contents in place of the variable name. envsubst requires that the content be exported as an environment variable, and bash has a problem exporting environment variables that are larger than a megabyte or so.
awk solution
Using cuonglm's solution from a different question:
needle="doc1_base64" # The "variable name" in the file. (A $ is not needed.)
needle_file="doc1_base64.txt" # Will be substituted for the needle
haystack=$requestfile1 # File containing the needle
out=$requestfile2
awk "BEGIN{getline l < \"${needle_file}\"}/${needle}/{gsub(\"${needle}\",l)}1" $haystack > $out
This solution works for even large files.
expenv () {
    LF=$'\n'
    echo "cat <<END_OF_TEXT${LF}$(< "$1")${LF}END_OF_TEXT" | bash
    return $?
}
expenv "file name"
The following works: bash -c "echo \"$(cat something.txt)\""

How to properly handle wildcard expansion in a bash shell script?

#!/bin/bash
hello()
{
    SRC=$1
    DEST=$2
    for IP in `cat /opt/ankit/configs/machine.configs` ; do
        echo $SRC | grep '*' > /dev/null
        if test `echo $?` -eq 0 ; then
            for STAR in $SRC ; do
                echo -en "$IP"
                echo -en "\n\t ARG1=$STAR ARG2=$2\n\n"
            done
        else
            echo -en "$IP"
            echo -en "\n\t ARG1=$SRC ARG2=$DEST\n\n"
        fi
    done
}
hello $1 $2
The above is the shell script to which I provide the source (SRC) and destination (DEST) paths. It worked fine as long as I did not put a wildcard '*' in the SRC path. When I run this shell script and give '*.pdf' or '*' as follows:
root@ankit1:~/as_prac# ./test.sh /home/dev/Examples/*.pdf /ankit_test/as
I get the following output:
192.168.1.6
ARG1=/home/dev/Examples/case_Contact.pdf ARG2=/home/dev/Examples/case_howard_county_library.pdf
The DEST is /ankit_test/as, but DEST also gets mangled because of the '*'. The expected answer is
ARG1=/home/dev/Examples/case_Contact.pdf ARG2=/ankit_test/as
So, if you understand what I am trying to do, please help me out to solve this bug.
I'll be grateful to you.
Thanks in advance!
I need to know exactly how to use '*.pdf' in my program, one file at a time, without disturbing DEST.
Your script needs more work.
Even after escaping the wildcard, you won't get your expected answer. You will get:
ARG1=/home/dev/Examples/*.pdf ARG2=/ankit_test/as
Try the following instead:
for IP in `cat /opt/ankit/configs/machine.configs`
do
    for i in $SRC
    do
        echo -en "$IP"
        echo -en "\n\t ARG1=$i ARG2=$DEST\n\n"
    done
done
Run it like this:
root@ankit1:~/as_prac# ./test.sh "/home/dev/Examples/*.pdf" /ankit_test/as
The shell will expand wildcards unless you escape them, so for example if you have
$ ls
one.pdf two.pdf three.pdf
and run your script as
./test.sh *.pdf /ankit_test/as
it will be the same as
./test.sh one.pdf two.pdf three.pdf /ankit_test/as
which is not what you expect. Doing
./test.sh \*.pdf /ankit_test/as
should work.
If you can, change the order of the parameters passed to your shell script as follows:
./test.sh /ankit_test/as /home/dev/Examples/*.pdf
That would make your life a lot easier since the variable part moves to the end of the line. Then, the following script will do what you want:
#!/bin/bash
hello()
{
    SRC=$1
    DEST=$2
    for IP in `cat /opt/ankit/configs/machine.configs` ; do
        echo -en "$IP"
        echo -en "\n\t ARG1=$SRC ARG2=$DEST\n\n"
    done
}
arg2=$1
shift
while [[ "$1" != "" ]] ; do
    hello $1 $arg2
    shift
done
You are also missing a final "done" to close your outer for loop.
OK, this appears to do what you want:
#!/bin/bash
hello() {
    SRC=$1
    DEST=$2
    while read IP ; do
        for FILE in $SRC; do
            echo -e "$IP"
            echo -e "\tARG1=$FILE ARG2=$DEST\n"
        done
    done < /tmp/machine.configs
}
hello "$1" $2
You still need to escape any wildcard characters when you invoke the script
The double quotes are necessary when you invoke the hello function, otherwise the mere fact of evaluating $1 causes the wildcard to be expanded, but we don't want that to happen until $SRC is assigned in the function
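A tiny self-contained demonstration of the same effect (the function and file names are made up): unquoted, the glob is expanded before the function ever sees it; quoted, the function receives the literal pattern:
$ touch a.pdf b.pdf
$ f() { echo "got: $1"; }
$ f *.pdf
got: a.pdf
$ f "*.pdf"
got: *.pdf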
Here's what I came up with:
#!/bin/bash
hello()
{
    # DEST will contain the last argument
    eval DEST=\$$#
    while [ $1 != $DEST ]; do
        SRC=$1
        for IP in `cat /opt/ankit/configs/machine.configs`; do
            echo -en "$IP"
            echo -en "\n\t ARG1=$SRC ARG2=$DEST\n\n"
        done
        shift || break
    done
}
hello $*
Instead of passing only two parameters to the hello() function, we'll pass in all the arguments that the script got.
Inside the hello() function, we first assign the final argument to the DEST var. Then we loop through all of the arguments, assigning each one to SRC, and run whatever commands we want using the SRC and DEST arguments. Note that you may want to put quotation marks around $SRC and $DEST in case they contain spaces. We stop looping when SRC is the same as DEST because that means we've hit the final argument (the destination).
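A hypothetical invocation to make that concrete (paths reused from the question): the shell expands the glob, the last argument becomes DEST, and everything before it is processed as SRC in turn:
./test.sh /home/dev/Examples/*.pdf /ankit_test/as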
For multiple input files using a wildcard such as *.txt, I found this to work perfectly, with no escaping required. It should work just like a native tool such as ls or rm. This was not documented anywhere I could find, and since I spent the better part of 3 days figuring it out, I decided to post it for future readers.
Directory contains the following files (output of ls)
file1.txt file2.txt file3.txt
Run script like
$ ./script.sh *.txt
Or even like
$ ./script.sh file{1..3}.txt
The script
#!/bin/bash
# store default IFS, we need to temporarily change this
sfi=$IFS
# set IFS to $'\n' - newline
IFS=$'\n'
if [[ $# -eq 0 ]]
then
    echo "Error: Missing required argument"
    echo
    exit 1
fi
# Put the file glob into an array
file=("$@")
# Now loop through them
for (( i=0 ; i < ${#file[*]} ; i++ ));
do
    if [ -w ${file[$i]} ]; then
        echo ${file[$i]} " writable"
    else
        echo ${file[$i]} " NOT writable"
    fi
done
# Reset IFS to its default value
IFS=$sfi
The output
file1.txt writable
file2.txt writable
file3.txt writable
The key was switching the IFS (Internal Field Separator) temporarily. You have to be sure to store this before switching and then switch it back when you are done with it as demonstrated above.
Now you have a list of expanded files (spaces and all) in the file[] array, which you can then loop through. I like this solution the best; it is the easiest to program for and the easiest for the users.
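For what it's worth, the same result can be had without touching IFS at all, by quoting every expansion; a minimal sketch (not part of the original answer):
#!/bin/bash
# "$@" already holds one expanded file name per element,
# so quoting each expansion handles spaces without changing IFS.
for f in "$@"; do
    if [ -w "$f" ]; then
        echo "$f is writable"
    else
        echo "$f is NOT writable"
    fi
done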
There's no need to spawn a shell to look at the $? variable, you can evaluate it directly.
It should just be:
if [ $? -eq 0 ]; then
You're running
./test.sh /home/dev/Examples/*.pdf /ankit_test/as
and your interactive shell is expanding the wildcard before the script gets it. You just need to quote the first argument when you launch it, as in
./test.sh "/home/dev/Examples/*.pdf" /ankit_test/as
and then, in your script, quote "$SRC" anywhere where you literally want the things with wildcards (ie, when you do echo $SRC, instead use echo "$SRC") and leave it unquoted when you want the wildcards expanded. Basically, always put quotes around things which might contain shell metacharacters unless you want the metacharacters interpreted. :)
