Capturing command output in a shell variable isn't working [duplicate] - linux

This question already has answers here:
How do I set a variable to the output of a command in Bash?
(15 answers)
Closed 5 years ago.
I am new to shell scripting and need to capture the output of a command in a local variable in a shell script, but I keep failing. For instance, I want the output of grep -c in a variable that will be used in an if statement. If anyone can point me to a source that explains the process, I would appreciate it.
#!/bash/sh
myVar = ls ~/bin | grep -c $0

Posting your code at shellcheck.net gives you valuable pointers quickly:
myVar = ls ~/bin | grep -c $0
^-- SC2037: To assign the output of a command, use var=$(cmd) .
^-- SC1068: Don't put spaces around the = in assignments.
^-- SC2086: Double quote to prevent globbing and word splitting.
If we implement these pointers:
myVar=$(ls ~/bin | grep -c "$0")
Also note that your shebang line has an incorrect path: the #! must be followed by the full path to the executing shell's binary, e.g. #!/bin/sh.
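Putting those pointers together, here is a minimal runnable sketch of the pattern; the temporary directory and the filename myscript are made up for illustration:

```shell
#!/bin/sh
# Capture the output of a pipeline with $(...) and branch on it in an if.
dir=$(mktemp -d)
touch "$dir/myscript" "$dir/other"
count=$(ls "$dir" | grep -c myscript)   # number of matching entries
if [ "$count" -gt 0 ]; then
    echo "found $count match(es)"
else
    echo "no matches"
fi
rm -rf "$dir"
```

In the asker's case the directory would be ~/bin and the pattern "$(basename "$0")".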
Resources for learning bash, the most widely used POSIX-compatible shell:
Introduction: http://www.faqs.org/docs/Linux-HOWTO/Bash-Prog-Intro-HOWTO.html
Guide: http://mywiki.wooledge.org/BashGuide
Cheat sheet: http://mywiki.wooledge.org/BashSheet


Bash loop with grep containing variable [duplicate]

This question already has an answer here:
Tilde not expanded when quoting on the right hand side of a Bash variable assignment [duplicate]
(1 answer)
Closed 1 year ago.
I have a function which should loop through every customer in an array and execute a grep to that directory.
test(){
set -x;
export customers=( customer1 customer2 customer3 );
export repo_path="~/repodir";
export output_path='/tmp';
for i in "${customers[@]}"
do
echo "${repo_path}/PEC/repo-${i}/build.yml"
grep 'link: ' $repo_path/PEC/repo-$i/build.yml | cut -d '/' -f 2 | sed 's/.git//g'
done | sort -u > ${output_path}/repos.txt
echo "${output_path}/repos.txt";
}
For some reason I get the following error message:
grep: ~/repodir/PEC/customer1/build.yml: No such file or directory
But when I check that exact same path I can see and read the file...
The first echo command also doesn't seem to be executing.
When I replace grep 'link: ' $repo_path/PEC/repo-$i/build.yml with grep 'link: ' ~/repodir/PEC/repo-$i/build.yml it does work.
I have tried various ways to define the variable like ${repo_path}, adding different types of quotes, ... So I basically don't know what I can do to make it work anymore.
$HOME is an environment variable that is set to the home folder of the current user. The terminal session has access to this environment variable at all times, unless it has been removed.
~ is a shell expansion symbol, one of the symbols that is processed before the actual command is performed. ~ alone expands to the value of $HOME.
In your code,
export repo_path="~/repodir";
the ~ is inside a quoted string, so it is not expanded. Either leave the ~ outside the quotes, or, for readability, simply use the $HOME variable instead.
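A minimal sketch of the fix, using the asker's path:

```shell
# Tilde is not expanded inside quotes, so build the path from $HOME instead:
repo_path="$HOME/repodir"   # expands immediately, no literal ~ left behind
echo "$repo_path"
```

With this, grep 'link: ' "$repo_path"/PEC/repo-$i/build.yml receives a real absolute path instead of one starting with a literal ~.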

File read not interpreting Environment variables [duplicate]

This question already has an answer here:
Expand ENV variable of a string, run command and store in variable?
(1 answer)
Closed 6 years ago.
When I try to read a file which contains the environment variable HOSTNAME, its value is not interpreted while reading the file.
For example, if my hostname is linux1.com.
When I try to read a sample file(Test.txt) below
/var/log/$HOSTNAME
using the code below
while read line
do
ls -l $line
done < Test.txt
I am expecting it to interpret the $HOSTNAME variable and print it, but it is not working. It literally runs ls -l /var/log/$HOSTNAME, instead of
ls -l /var/log/linux1.com
The same command interprets the hostname when I run it directly in the shell.
Any leads on this are highly appreciated.
Thanks.
Parameter expansion is not recursive. When $line is expanded, its value is subject to pathname expansion (globbing) and word-splitting, but not further parameter expansions.
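One way around it, sketched below, is to force a second round of expansion with eval. This is only safe if you completely trust the file's contents, since eval will execute anything on the line:

```shell
# Each line read from the file still contains the literal text $HOSTNAME;
# eval forces a second round of expansion on it.
HOSTNAME=linux1.com                       # stands in for the real hostname
line='/var/log/$HOSTNAME'                 # what `read` returns from Test.txt
expanded=$(eval "printf '%s' \"$line\"")  # second expansion happens here
echo "$expanded"                          # /var/log/linux1.com
```

The same caveat applies inside the while loop: eval "ls -l \"$line\"" would work, but a hostile line in Test.txt could then run arbitrary commands.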

how to pass asterisk into ls command inside bash script

Hi… Need a little help here…
I tried to emulate the DOS' dir command in Linux using bash script. Basically it's just a wrapped ls command with some parameters plus summary info. Here's the script:
#!/bin/bash
# default to current folder
if [ -z "$1" ]; then var=.;
else var="$1"; fi
# check file existence
if [ -a "$var" ]; then
# list contents with color, folder first
CMD="ls -lgG $var --color --group-directories-first"; $CMD;
# sum all files size
size=$(ls -lgGp "$var" | grep -v / | awk '{ sum += $3 }; END { print sum }')
if [ "$size" == "" ]; then size="0"; fi
# create summary
if [ -d "$var" ]; then
folder=$(find $var/* -maxdepth 0 -type d | wc -l)
file=$(find $var/* -maxdepth 0 -type f | wc -l)
echo "Found: $folder folders "
echo " $file files $size bytes"
fi
# error message
else
echo "dir: Error \"$var\": No such file or directory"
fi
The problem is when the argument contains an asterisk (*), the ls within the script acts differently compared to the direct ls command given at the prompt. Instead of returning the whole file list, the script only returns the first file. See the video below for the comparison in action. I don't know why it behaves like that.
Anyone knows how to fix it? Thank you.
Video: problem in action
UPDATE:
The problem has been solved. Thank you all for the answers. Now my script works as expected. See the video here: http://i.giphy.com/3o8dp1YLz4fIyCbOAU.gif
The asterisk * is expanded by the shell when it parses the command line. In other words, your script doesn't get a parameter containing an asterisk, it gets a list of files as arguments. Your script only works with $1, the first argument. It should work with "$@" instead.
This is because when you retrieve $1 you assume the shell does NOT expand *.
In fact, when * (or another glob) matches, it is expanded into one word per matching filename, and those words are passed as $1, $2, etc.
So your script only ever sees the first matching file, no matter how many the glob produced. (Each filename arrives as a single argument, even if it contains spaces, as long as you quote your expansions.)
Seriously, read this and especially this. Really.
And please don't do things like
CMD=whatever you get from user input; $CMD;
You are begging for trouble. Don't execute arbitrary string from the user.
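To make the "$@" advice concrete, here is a sketch (not the full dir.sh, and the byte-counting is just an example) of handling every argument the shell passes in after glob expansion:

```shell
#!/bin/bash
# After the shell expands test*, the script receives one argument per
# matching file; "$@" preserves each of them as a separate word.
total=0
for f in "$@"; do
    [ -f "$f" ] || continue          # skip non-files
    size=$(wc -c < "$f")             # size of this file in bytes
    total=$((total + size))
done
echo "$# argument(s), $total bytes total"
```

Called as ./script test*, the loop body runs once per matching file, whether the glob expanded to one name or fifty.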
Both answers above already answered your question, so I'm going to be a bit more verbose.
Your terminal is (probably) running the bash interpreter. This is the program which parses your input lines and does things based on your input.
When you enter a line, bash starts the following workflow:
parsing and lexical analysis
expansion
brace expansion
tilde expansion
variable expansion
arithmetic and other substitutions
command substitution
word splitting
filename generation (globbing)
quote removal
Only after all of the above will bash
execute some external command, like ls or dir.sh, etc.,
or perform some "internal" action for the known keywords and builtins like echo, for, if, etc.
As you can see, the second-to-last step is filename generation (globbing). So, in your case, if test* matches some files, bash expands the wildcard characters (aka does the globbing).
So,
when you enter dir.sh test*,
and test* matches some files,
bash does the expansion first,
and afterwards executes the command dir.sh with the already expanded filenames,
e.g. the script gets executed (in your case) as: dir.sh test.pas test.swift
BTW, it acts exactly the same way for your ls example:
bash expands ls test* to ls test.pas test.swift,
then executes ls with the above two arguments,
and ls prints the result for the two arguments it got.
In other words, ls doesn't even see the test* argument: wherever possible, bash expands the wildcard characters (* and ?).
Now back to your script: add after the shebang the following line:
echo "the $0 got these arguments: $@"
and you will immediately see the real arguments your script was executed with.
Also, in such cases it is good practice to try executing the script in debug mode, e.g.
bash -x dir.sh test*
and you will see exactly what the script does.
You can also do the same for your current interpreter: just enter into the terminal
set -x
and try running dir.sh test*, and you will see how bash executes the dir.sh command. (To stop debug mode, just enter set +x.)
Everybody is giving you valuable advice which you should definitely follow!
But here is the real answer to your question.
To pass unexpanded arguments to any executable you need to single quote them:
./your_script '*'
The best solution I have is to use the eval command, in this way:
#!/bin/bash
cmd="some command \"with_quotes_and_asterisk_in_it*\""
echo "$cmd"
eval $cmd
The eval command takes its arguments and evaluates them into the command as the shell does.
This solves my problem when I need to call a command with asterisk '*' in it from a script.

Extract all variable values in a shell script

I'm debugging an old shell script. I want to check the values of all the variables used; it's a huge, ugly script with more than 140 variables. Is there any way I can extract the variable names from the script and put them in a convenient pattern like:
#!/bin/sh
if [ ${BLAH} ....
.....
rm -rf ${JUNK}.....
to
echo ${BLAH}
echo ${JUNK}
...
Try running your script as follows:
bash -x ./script.bash
Or enable the setting in the script:
set -x
You can dump all the variables you're interested in with one command:
set | grep -w -e BLAH -e JUNK
To dump all the variables to stdout use:
set
or
env
from inside your script.
You can extract a (sub)list of the variables declared in your script using grep:
grep -Po "([a-z][a-zA-Z0-9_]+)(?==\")" ./script.bash | sort -u
Disclaimer: why a "sublist"?
The expression given matches a string followed by an equals sign (=) and a double quote ("). So if you don't use syntax such as myvar="my-value" it won't match.
But you get the idea.
grep Options
-P, --perl-regexp: interpret PATTERN as a Perl-compatible regular expression (PCRE) (experimental);
-o, --only-matching: print only the matched (non-empty) parts of a matching line, with each such part on a separate output line.
Pattern
I'm using a positive lookahead, (?==\"), to require an equals sign followed by a double quote.
In bash, but not sh, compgen -v will list the names of all variables assigned (compare this to set, which has a great deal of output other than variable names, and thus needs to be parsed).
Thus, if you change the top of the script to #!/bin/bash, you will be able to use compgen -v to generate that list.
That said, the person who advised you use set -x did well. Consider this extension on that:
PS4=':$BASH_SOURCE:$LINENO+'; set -x
This will print the source file and line number before every command (or variable assignment) which is executed, so you will have a log not only of which variables are set, but just where in the source each one was assigned. This makes tracking down where each variable is set far easier.
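Building on that, here is a hedged sketch combining compgen -v with a before/after snapshot to list only the variables the script itself assigns; demo.sh below is a stand-in for the real 140-variable script:

```shell
#!/bin/bash
# Snapshot variable names with compgen -v before and after sourcing the
# script, then print only the newly assigned ones with their values.
demo=$(mktemp)
printf 'BLAH=hello\nJUNK=/tmp/junk\n' > "$demo"   # stand-in for the big script
before=""                        # define first so it doesn't show up as "new"
before=$(compgen -v | sort)
. "$demo"                        # only do this if the script is safe to source
after=$(compgen -v | sort)
for name in $(comm -13 <(printf '%s\n' "$before") <(printf '%s\n' "$after")); do
    printf '%s=%s\n' "$name" "${!name}"
done
rm -f "$demo"
```

This avoids parsing the script's text at all, so it catches variables assigned via read, for, and the like, at the cost of actually executing the script.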

Accessing shell variable in a Perl program

I have this Perl script:
#!/usr/bin/perl
$var = `ls -l \$ddd` ;
print $var, "\n";
And ddd is a shell variable
$ echo "$ddd"
arraytest.pl
When I execute the Perl script I get a listing of all files in the directory, instead of just the one file whose name is contained in the shell variable $ddd.
What's happening here? Note that I am escaping $ddd in the backticks in the Perl script.
The variable $ddd isn't set in the shell that you invoke from your Perl script.
Ordinary shell variables are not inherited by subprocesses. Environment variables are.
If you want this to work, you'll need to do one of the following in your shell before invoking your Perl script:
ddd=arraytest.pl ; export ddd # sh
export ddd=arraytest.pl # bash, ksh, zsh
setenv ddd arraytest.pl # csh, tcsh
This will make the environment variable $ddd visible from your Perl script. But then it probably makes more sense to refer to it as $ENV{ddd}, rather than passing the literal string '$ddd' to the shell and letting it expand it:
$var = `ls -l $ENV{ddd}`;
You forgot to export ddd:
Mark each name to be passed to child processes in the environment.
So ddd is not automatically available to child processes.
The hash %ENV contains your current environment.
$var = `ls -l $ENV{ddd}`;
Edit: it works, checked; of course ddd needs to be exported before running the script:
export ddd='arraytest.pl'
perl script.pl