Get version from file name - linux

I'm creating a script that lists all the JBoss versions, but I ran into a problem.
JBoss usually uses different naming schemes for its versions:
jboss-4.0.0.tar.gz
jboss-4.0.4.GA.tar.gz
I managed to obtain the version (for example 4.0.0 or 4.0.4), but I need to obtain the full version, e.g. 4.0.4.GA. Here is what I have so far:
ls -1 | grep jboss |sed -r 's/^.*-([0-9.]+)\..*/\1/'
Thanks

Don't parse ls output.
ls is a tool for interactively looking at file information. Its output is formatted for humans and will cause bugs in scripts. Use globs (like I do here) or find instead. Understand why: http://mywiki.wooledge.org/ParsingLs
$ ls -1
jboss-4.0.0.tar.gz
jboss-4.0.4.GA.tar.gz
foobar
Using grep:
$ printf -- '%s\n' * | grep -oP 'jboss-\K.*(?=\.tar\.gz)'
Or using awk:
$ printf -- '%s\n' * | awk -F'jboss-|.tar.gz' '/jboss/{print $2}'
Or using perl:
$ printf -- '%s\n' * | perl -lne '/jboss-(.*?)\.tar\.gz/ && print $1'
Outputs
4.0.0
4.0.4.GA
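And since find is mentioned above as an alternative to globs, a similar extraction is possible with it; a sketch (assumes GNU find for -printf):
$ find . -maxdepth 1 -name 'jboss-*.tar.gz' -printf '%f\n' | sed -e 's/^jboss-//' -e 's/\.tar\.gz$//'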

$ cat test.sh
#!/bin/bash
for file in jboss-*.tar.gz; do
    [ -f "${file}" ] || continue          # skip the literal pattern when nothing matches
    version="${file#*-}"                  # strip everything up to and including the first '-'
    version="${version%.tar.gz}"          # strip the trailing '.tar.gz'
    echo "${version}"
done
Example:
$ find
.
./test.sh
./jboss-4.0.0.tar.gz
./jboss-4.0.4.GA.tar.gz
$ ./test.sh
4.0.0
4.0.4.GA

Related

Concatenating xargs with the use of if-else in bash

I've got two test files, namely ttt.txt and ttt2.txt, the content of which is shown below:
#ttt.txt
(132) 123-2131
543-732-3123
238-3102-312
#ttt2.txt
1
2
3
I've already tried the following commands in bash and it works fine:
if grep -oE "(\(\d{3}\)[ ]?\d{3}-\d{4})|(\d{3}-\d{3}-\d{4})" ttt.txt ; then echo "found"; fi
# with output 'found'
if grep -oE "(\(\d{3}\)[ ]?\d{3}-\d{4})|(\d{3}-\d{3}-\d{4})" ttt2.txt ; then echo "found"; fi
But when I combine the above command with xargs, it complains with the error '-bash: syntax error near unexpected token `then''. Could anyone give me some explanation? Thanks in advance!
ll | awk '{print $9}' | grep ttt | xargs -I $ if grep --quiet -oE "(\(\d{3}\)[ ]?\d{3}-\d{4})|(\d{3}-\d{3}-\d{4})" $; then echo "found"; fi
$ is a special character in bash (it marks variables), so don't use it as your xargs marker; you'll only get confused.
The real problem here, though, is that you are passing if grep --quiet -oE "(\(\d{3}\)[ ]?\d{3}-\d{4})|(\d{3}-\d{3}-\d{4})" $ as the argument to xargs, and the remainder of the line is then treated as a new command, because parsing breaks at the ;.
You can wrap the whole thing in a sub-invocation of bash, so that xargs sees the whole command:
$ ll | awk '{print $9}' | grep ttt | xargs -I xx bash -c 'if grep --quiet -oE "(\(\d{3}\)[ ]?\d{3}-\d{4})|(\d{3}-\d{3}-\d{4})" xx; then echo "found"; fi'
found
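As a side note (not part of the original answer), embedding the xx placeholder directly inside the quoted script can misbehave with unusual filenames; a hedged variant passes the name to bash -c as a positional parameter instead:
$ ll | awk '{print $9}' | grep ttt | xargs -I xx bash -c 'if grep --quiet -oE "(\(\d{3}\)[ ]?\d{3}-\d{4})|(\d{3}-\d{3}-\d{4})" "$1"; then echo "found"; fi' _ xx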
Finally, ll | awk '{print $9}' | grep ttt is a needlessly complicated way of listing the files that you're looking for. You actually don't need any of the code above; just do this:
$ if grep --quiet -oE "(\(\d{3}\)[ ]?\d{3}-\d{4})|(\d{3}-\d{3}-\d{4})" ttt*; then echo "found"; fi
found
Alternatively, if you want to process each file in turn (which you don't need here, but you might want when this gets more complicated):
for file in ttt*
do
if grep --quiet -oE "(\(\d{3}\)[ ]?\d{3}-\d{4})|(\d{3}-\d{3}-\d{4})" "$file"
then
echo "found"
fi
done

getting SW version by bash script for uninstalling preinstalled software / notifying easily by assigning a variable to it. Please share more ideas

Please share more ideas for getting a software version from a bash command and using it as a variable later.
su --version
su (GNU coreutils) 5.97
Copyright etc.
and create a variable from the result of it.
Something like what I tried below:
su --version >/tmp/temp.txt
if [ -f /tmp/temp.txt ]; then
elv=`cat /tmp/temp.txt | gawk 'BEGIN {FS="(GNU coreutils)"} {print $2}' | gawk 'BEGIN {FS="."} {print $1}'`
#Version String. Just a shortcut to be used later
els=el$elv
else
echo "Unable to determine version. I can't continue"
exit 1
fi
if [ `rpm -qa | egrep -c -i "^mysql-"` -gt 0 ]; then
cat << EOF
It appears that the distro-supplied version of MySQL is at least partially installed,
or a prior installation attempt failed.
Please remove these packages, as well as their dependencies (often postfix), and then
retry this script:
$(rpm -qa | egrep -i "^mysql-")
EOF
exit 1
fi
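A more direct variant of the same idea (a sketch, untested, assuming the "su (GNU coreutils) 5.97" output format shown above) captures the version with command substitution instead of a temp file:
#!/bin/bash
# grab the last field of the first line ("5.97") and keep only the major number
elv=$(su --version | awk 'NR==1 {print $NF}' | cut -d. -f1)

if [ -z "$elv" ]; then
    echo "Unable to determine version. I can't continue"
    exit 1
fi

els="el$elv"   # version string shortcut, as in the original snippet
echo "$els"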

Searching a string in shell script

I am trying to learn shell scripting, so sorry if my question is too simple.
I have a file called one.txt, and if either of the strings 1.2 or 1.3 is present in the file, then I have to display a success message, else a failure message.
The code I tried is as follows:
#!/bin/bash
echo "checking"
if grep -q 1.2 /root/one | grep -q 1.3 /root/one; then
echo " vetri Your NAC version"
fi
What am I doing wrong here?
You can also include the OR in your grep pattern like so:
grep '1.2\|1.3' /root/one
Update:
as twalberg pointed out in the comment, my answer was not precise enough. The better pattern is:
grep '1\.2\|1\.3' /root/one
Or even better, because it is more compact:
grep '1\.[23]' /root/one
You have to use ||
#!/bin/bash
echo "checking"
if grep -q 1.2 /root/one || grep -q 1.3 /root/one; then
echo " vetri Your NAC version"
fi
A single | is a pipe: it passes the output of the command before the | to the command after it. Since grep -q produces no output, the second grep gets empty input and never matches, so the if only sees that last grep's failing exit status.
It is better to join these greps with | (the OR operator):
grep '1.2\|1.3'
or
grep -E '1.2|1.3'
I guess the easier way to do this is to create a variable to check the count of occurrences:
#!/bin/bash
echo "checking"
CHECK=`egrep -c '1\.(2|3)' /root/one`
if [ "$CHECK" -gt 0 ]; then
echo "vetri Your NAC version"
fi
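For the failure message the question also asks for, an else branch does it; a sketch using the compact pattern from above (the failure text is just a placeholder):
#!/bin/bash
echo "checking"
if grep -q '1\.[23]' /root/one; then
    echo "vetri Your NAC version"      # success
else
    echo "version string not found"    # failure
fi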

dynamically run linux shell commands

I have a command that should be executed by a shell script.
Actually the command itself does not matter; the only thing that is important is the further command execution and the right escaping of the critical parts.
The command that is usually executed in PuTTY is something like this (maybe with some additional flags for ls):
rm -r `ls /test/parse_first/ | awk '{print $2}' | grep trash`
but now I have a batch of such commands, so I would like to execute them in a loop, like:
for i in {0..100}
do
str=str$i
${!str}
done
where str is:
str0="rm -r `ls /test/parse_first/ | awk '{print $2}' | grep trash`"
str1="rm -r `ls /test/parse_second/ | awk '{print $2}' | grep trash`"
and that gives me a lot of headache, because the execution done by ${!str} breaks the quotations and the inline shell between the `...` marks.
my_rm() { rm -r `ls /test/$1 | awk ... | grep ... `; }
for i in `whatevr`; do
my_rm $i
done;
Getting this right is surprisingly tricky, but it can be done:
for i in $(seq 0 100)
do
str=str$i
eval "eval \"\$$str\""
done
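To see why plain ${!str} indirection is not enough here, a small sketch (not from the original answer): the stored backticks are only evaluated when the string goes through eval.
str0='echo `date`'       # command string with embedded backticks, like in the question
str=str0
${!str}                  # prints the literal text: `date`
eval "eval \"\$$str\""   # first eval produces: eval "$str0"; the second runs it, so `date` executes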
You can also do:
for i in {0..10}
do
<whatevercommand>
done
It's actually simpler to place them in arrays and use glob patterns:
#!/bin/bash
shopt -s nullglob
DIRS=("/test/parse_first/" "/test/parse_second/")
for D in "${DIRS[#]}"; do
for T in "$D"/*trash*; do
rm -r -- "$T"
done
done
And since rm accepts multiple arguments, you don't need to have an extra loop:
for D in "${DIRS[@]}"; do
    rm -r -- "$D"/*trash*
done
UPDATE:
#!/bin/bash
readarray -t COMMANDS <<'EOF'
rm -r `ls /test/parse_first/ | awk '{print $2}' | grep trash`
rm -r `ls /test/parse_second/ | awk '{print $2}' | grep trash`
EOF
for C in "${COMMANDS[#]}"; do
eval "$C"
done
Or you could just read commands from another file:
readarray -t COMMANDS < somefile.txt
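For example, a usage sketch of that last form (somefile.txt is hypothetical, one command per line):
printf '%s\n' 'echo first command' 'echo second command' > somefile.txt
readarray -t COMMANDS < somefile.txt
for C in "${COMMANDS[@]}"; do
    eval "$C"
done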

command not working as expected if run via /bin/sh -c

I have to concatenate a set of files. Directory structure is like this:
root/features/xxx/multiple_files... -> root/xxx/single_file
What I have written (and it works fine):
for dirname in $(ls -d root/features/*|awk -F/ '{print $NF}');do;mkdir root/${dirname};cat root/features/${dirname}/* > root/${dirname}/final.txt;done
But when I run the same thing via the sh shell
/bin/sh -c "for dirname in $(ls -d root/features/*|awk -F/ '{print $NF}');do;mkdir root/${dirname};cat root/features/${dirname}/* > root/${dirname}/final.txt;done"
it gives me errors:
/bin/sh: -c: line 1: syntax error near unexpected token `201201000'
/bin/sh: -c: line 1: `201201000'
My process always prepends /bin/sh -c before running any command. Any suggestions what might be going wrong here? Any alternate ways? I have spent a really long time on this, without making much headway!
EDIT:
ls -d root/features/*|awk -F/ '{print $NF}' returns
201201
201201000
201201001
201201002
201201003
201201004
201201005
201201006
201201007
201202000
201205000
201206000
201207000
201207001
201207002
Always use sh -c 'cmd1 | cmd2' with single quotes.
Always use sh -eu -xv -c 'cmd1 | cmd2' to debug.
Always use bash -c 'cmd1 | cmd2' if your code is Bash-specific (cf. process substitution, ...).
Remove ; after do in for ... ; do; mkdir ....
Escape possible single quotes within single quotes like so: ' --> '\''.
(And sometimes just formatting your code clarifies a lot.)
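The root cause, roughly: inside double quotes the outer shell expands $(...) and ${dirname} before /bin/sh ever sees the script, so sh receives an already-mangled, multi-line command. A tiny demo of the difference (not from the original answer):
$ dirname=outer
$ /bin/sh -c "echo ${dirname}"   # the calling shell substitutes first; sh just runs 'echo outer'
outer
$ /bin/sh -c 'echo ${dirname}'   # sh itself expands ${dirname}, which is unset there: prints an empty line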
Applied to your command this should look somewhat like this ...
# test version
/bin/sh -c '
for dirname in $(ls -d /* | awk -F/ '\''{print $NF}'\''); do
printf "%s\n" "mkdir root/${dirname}";
printf "%s\n" "cat root/features/${dirname}/* > root/${dirname}/final.txt";
echo
done
' | nl
# test version using 'printf' instead of 'ls'
sh -c '
printf "%s\000" /*/ | while IFS="" read -r -d "" file; do
dirname="$(basename "$file")"
printf "%s\n" "mkdir root/${dirname}";
printf "%s\n" "cat root/features/${dirname}/* > root/${dirname}/final.txt";
echo
done
' | nl
I got this to run in the little test environment I set up on my box. Turns out it didn't like the double quotes. The issue I ran into was the quotes around the awk statement: if you wrap it in double quotes it prints the whole thing. I used cut to get the desired result, but my guess is you'll have to change the -f arg to 3 instead of 2, I think.
/bin/sh -c 'for dirname in $(ls -d sh_test/* | awk -F/ '\''{print $NF}'\''); do mkdir sh_test_root/${dirname}; cat sh_test/${dirname}/* > sh_test_root/${dirname}/final.txt;done'
edit: Tested edit proposed by nadu and it works fine. The above reflects that change.
