Bash: Variable contains executable path -> Convert to string

I have this problem and haven't found a sufficient solution yet; maybe you can help me.
I need to do this:
find -name some.log
It returns a lot of hits. So now I would like to go through them with a "for" like this:
for a in $(find -name vmware.log)
do
XXXXXXX
done
After that, I would like to cut the path stored in variable $a. Let's assume $a has the following content:
./this/is/a/path/some.log
I'll cut this variable with
cut -d/ -f2 $a
The finished code is like this:
for a in $(find -name vmware.log)
do
cutpath=`cut -d/ -f2 $a`
done
When I do this, bash uses the content of $a as a filesystem path and not as a string. So "cut" tries to access the file directly, but it should only cut the path string in $a. The error I get on VMware ESXi is:
-sh: ./this/is/a/path/some.log: Device or resource busy
What am I doing wrong? Can anybody help me out?

Firstly, looping through the output of find with for is discouraged, since it's not going to work for filenames containing spaces or glob metacharacters (such as *).
This achieves what you want, using the -exec switch. The name of the file {} is passed to the script as $0.
find -name 'vmware.log' -exec sh -c 'echo "$0" | cut -d/ -f2' {} \;
# or with bash
find -name 'vmware.log' -exec bash -c 'cut -d/ -f2 <<<"$0"' {} \;
It looks like you want to use cut on the filename, rather than the file's contents, so you need to pass the name to cut on standard input, not as an argument. This can be done using a pipe | or using Bash's <<< herestring syntax.
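As a quick illustration (using a made-up path string; no such file needs to exist), both forms feed the string, not the file, to cut:

```shell
# Hypothetical path string from the question.
p='./this/is/a/path/some.log'

# Pipe the string into cut ...
via_pipe=$(printf '%s\n' "$p" | cut -d/ -f2)

# ... or use a bash herestring; both print the second path component
# (field 1 is the "." before the first slash).
via_herestring=$(cut -d/ -f2 <<<"$p")

echo "$via_pipe"        # this
echo "$via_herestring"  # this
```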

You should try something like this:
#!/bin/bash
VAR1="$1"
VAR2="$2"
MOREF=`sudo run command against $VAR1 | grep name | cut -c7-`
echo $MOREF
Using backticks (command substitution) will execute the command and store its output in the variable; the modern equivalent is $(...).
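The difference is easy to see with a harmless stand-in command (no sudo needed; the values here are made up):

```shell
VAR1="eth0"

# Single quotes: no expansion, no execution -- the variable gets the literal text.
literal='echo interface: $VAR1 | cut -c1-9'
echo "$literal"   # echo interface: $VAR1 | cut -c1-9

# Command substitution ($(...) or backticks): the pipeline actually runs
# and its output is stored in the variable.
MOREF=$(echo "interface: $VAR1" | cut -c1-9)
echo "$MOREF"     # interface
```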

Preserve '\n' newline in returned text over ssh

If I execute a find command, with grep and sort etc. in the local command line, I get returned lines like so:
# find ~/logs/ -iname 'status' | xargs grep 'last seen' | sort --field-separator=: -k 4 -g
0:0:line:1
0:0:line:2
0:0:line:3
If I execute the same command over ssh, the returned text prints without newlines, like so:
# VARcmdChk="$(ssh ${VARuser}@${VARserver} "find ~/logs/ -iname 'status' | xargs grep 'last seen' | sort --field-separator=: -k 4 -g")"
# echo ${VARcmdChk}
0:0:line:1 0:0:line:2 0:0:line:3
I'm trying to understand why ssh is sanitising the returned text, so that newlines are converted to spaces. I have not yet tried outputting to a file and then using scp to pull that back. That seems a waste, since I just want to view the remote results locally.
When you echo the variable VARcmdChk, you should enclose it in double quotes:
$ VARcmdChk=$(ssh ${VARuser}@${VARserver} "find tmp/ -iname status -exec grep 'last seen' {} \; | sort --field-separator=: -k 4 -g")
$ echo "${VARcmdChk}"
last seen:11:22:33:44:55:66:77:88:99:00
last seen:00:99:88:77:66:55:44:33:22:11
Note that I've replaced your xargs with -exec.
OK, the question is a duplicate of Why does shell Command Substitution gobble up a trailing newline char?, so it is partly answered there.
However, I say partly, as those answers explain why this happens, but the only clue to a solution is a small answer right at the end.
The solution is to quote the echo argument, as the solution suggests:
# VARcmdChk="$(ssh ${VARuser}@${VARserver} "find ~/logs/ -iname 'status' | xargs grep 'last seen' | sort --field-separator=: -k 4 -g")"
# echo "${VARcmdChk}"
0:0:line:1
0:0:line:2
0:0:line:3
but there is no explanation of why this works, since the assumption is that the variable is a string, so it should print as expected. However, reading Expansion of variable inside single quotes in a command in Bash provides the clue about preserving newlines etc. in a string: placing the variable in double quotes when printing it with echo preserves the contents exactly, and you get the expected output.
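A self-contained demonstration of the word splitting that eats the newlines (using the sample lines from the question):

```shell
# Build a variable containing embedded newlines.
v=$'0:0:line:1\n0:0:line:2\n0:0:line:3'

# Unquoted: word splitting turns each line into a separate argument,
# and echo joins its arguments with single spaces.
unquoted=$(echo $v)
echo "$unquoted"   # 0:0:line:1 0:0:line:2 0:0:line:3

# Quoted: the variable is passed as one argument, newlines intact.
quoted=$(echo "$v")
echo "$quoted"     # prints three separate lines
```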
The unquoted echo of the variable is what puts it all on one line. Running the following command will output the results as expected:
ssh ${VARuser}@${VARserver} "find ~/logs/ -iname 'status' | xargs grep 'last seen' | sort --field-separator=: -k 4 -g"
To get the command output to have each result on a new line, like it does when you run the command locally, you can use awk to split the results onto new lines:
awk '{print $1"\n"$2}'
This method can be appended to your command like this:
echo ${VARcmdChk} | awk '{print $1"\n"$2"\n"$3"\n"$4}'
Alternatively, you can put quotes around the variable as per your answer:
echo "${VARcmdChk}"

How to remove the extension of multiple files using cut in a shell script?

I'm studying about how to use 'cut'.
#!/bin/bash
for file in *.c; do
name = `$file | cut -d'.' -f1`
gcc $file -o $name
done
What's wrong with this code?
There are a number of problems on this line:
name = `$file | cut -d'.' -f1`
First, to assign a shell variable, there should be no whitespace around the assignment operator:
name=`$file | cut -d'.' -f1`
Secondly, you want to pass $file to cut as a string, but what you're actually doing is trying to run $file as if it were an executable program (which it may very well be, but that's beside the point). Instead, use echo or shell redirection to pass it:
name=`echo $file | cut -d. -f1`
Or:
name=`cut -d. -f1 <<< $file`
I would actually recommend that you approach it slightly differently. Your solution will break if you get a file like foo.bar.c. Instead, you can use shell expansion to strip off the trailing extension:
name=${file%.c}
Or you can use the basename utility:
name=`basename $file .c`
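The difference matters for names with more than one dot, as in the foo.bar.c example above:

```shell
file="foo.bar.c"

# cut keeps only the text before the FIRST dot -- the ".bar" part is lost.
with_cut=$(echo "$file" | cut -d. -f1)
echo "$with_cut"        # foo

# Parameter expansion and basename strip only the trailing .c suffix.
with_expansion=${file%.c}
with_basename=$(basename "$file" .c)
echo "$with_expansion"  # foo.bar
echo "$with_basename"   # foo.bar
```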
You should use command substitution (https://www.gnu.org/software/bash/manual/bashref.html#Command-Substitution) to execute a command in a script.
With this, the code will look like this:
#!/bin/bash
for file in *.c; do
name=$(echo "$file" | cut -f 1 -d '.')
gcc $file -o $name
done
echo sends $file to standard output.
The pipe then feeds that output to the next command.
cut with the . delimiter splits the file name and keeps the first part,
which is assigned to the name variable.
Hope this answer helps

change name of file in nested folders

I have been trying to think of a way to rename files that are listed in nested folders and am having trouble resolving this. As a test, I have been able to cut out the part of the name I would like to alter, but I can't think of how to put that into a variable and chain the name together. The file format looks like this:
XXX_XXXX_YYYYYYYYYY_100426151653-all.mp3
I have been testing this to cut out the part I was looking to change, but I am not sure it is the best way of doing it:
echo XXX_XXXX_YYYYYYYYYY_100426095135-all.mp3 | awk -F_ '{print $4}' | cut -c 1-6
I would like to change the 100426151653 to this 20100426-151653 format in the name.
I have tried to rename the file using rename with the format 's/ //g', but that did not work; I had to resort to rename ' ' '' filename to remove a blank space.
So the file would start as this:
XXX_XXXX_YYYYYYYYYY_100426151653-all.mp3
and end like this
XXX_XXXX_YYYYYYYYYY_20100426-151653-all.mp3
How about using find and a bash function
#!/bin/bash
modfn () {
suffix=$2
fn=$(basename $1)
path=$(dirname $1)
fld1=$(echo $fn | cut -d '_' -f1)
fld2=$(echo $fn | cut -d '_' -f2)
fld3=$(echo $fn | cut -d '_' -f3)
fld4=$(echo $fn | cut -d '_' -f4)
fld5=${fld4%$suffix}
l5=${#fld5}
fld6=${fld5:0:$(($l5 - 6))}
fld7=${fld5:$(($l5 - 6)):6}
newfn="${fld1}_${fld2}_${fld3}_20${fld6}-${fld7}${suffix}"
echo "moving ${path}/${fn} to ${path}/${newfn}"
mv "${path}/${fn}" "${path}/${newfn}"
}
export -f modfn
suffix="-all.mp3"
export suffix
find . -type f -name "*${suffix}" ! -name "*-*${suffix}" -exec bash -c 'modfn "$0" ${suffix}' {} \;
The above bash script uses find to search the current folder and its contents for files like WWW_XXXX_YYYYYYYYYY_AAAAAABBBBBB-all.mp3, yet excludes ones that are already renamed and look like WWW_XXXX_YYYYYYYYYY_20AAAAAA-BBBBBB-all.mp3.
W, X, Y, A, B can be any character other than underscore or dash.
All the found files are renamed.
NOTE: There are ways to shrink the above script but doing that makes the operation less obvious.
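One such shorter form uses bash's own regex matching instead of repeated cut calls. This is a sketch under the same assumptions about the name layout (prefix, six-digit date, six-digit time, -all.mp3 suffix); it only computes and prints the new name, so you can inspect the result before wiring in the mv:

```shell
suffix="-all.mp3"
fn="XXX_XXXX_YYYYYYYYYY_100426151653-all.mp3"   # sample name from the question

# Capture the prefix, the YYMMDD date, and the HHMMSS time in one match.
if [[ $fn =~ ^(.*_)([0-9]{6})([0-9]{6})-all\.mp3$ ]]; then
    newfn="${BASH_REMATCH[1]}20${BASH_REMATCH[2]}-${BASH_REMATCH[3]}${suffix}"
    echo "$newfn"   # XXX_XXXX_YYYYYYYYYY_20100426-151653-all.mp3
fi
```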
This perl one-liner does the job:
find . -name "XXX_XXXX_YYYYYYYYYY_*-all.mp3" -printf '%P\n' 2>/dev/null | perl -nle '$o=$_; s/_([0-9]{6})([0-9]{6})-/_20$1-$2-/; $n=$_; rename($o,$n) if !-e $n'
Note: I only came up with the find command and regex part. The credit for the perl one-liner goes to a perlmonks user at http://www.perlmonks.org/?node=823355

How to get only filenames without Path by using grep

I have the following problem.
I'm doing a grep like:
$command = grep -r -i --include=*.cfg 'host{' /omd/sites/mesh/etc/icinga/conf.d/objects
I got the following output:
/omd/sites/mesh/etc/icinga/conf.d/objects/testsystem/test1.cfg:define host{
/omd/sites/mesh/etc/icinga/conf.d/objects/testsystem/test2.cfg:define host{
/omd/sites/mesh/etc/icinga/conf.d/objects/testsystem/test3.cfg:define host{
...
for all *.cfg files.
With exec($command,$array)
I pass the result into an array.
Is it possible to get only the filenames as the result of the grep command?
I have tried the following:
$Command= grep -l -H -r -i --include=*.cfg 'host{' /omd/sites/mesh/etc/icinga/conf.d/objects
but I got the same result.
I know that a similar topic exists on the forum (How can I use grep to show just filenames (no in-line matches) on linux?), but the solution doesn't work for me.
With "exec($Command,$result_array)" I try to get an array with the results.
The mentioned solutions all work, but I can't get a result array with exec().
Can anyone help me?
Yet another simpler solution:
grep -l whatever-you-want | xargs -L 1 basename
or you can avoid xargs and use a subshell instead, if you are not using an ancient version of the GNU coreutils:
basename -a $(grep -l whatever-you-want)
basename is the bash straightforward solution to get a file name without path. You may also be interested in dirname to get the path only.
GNU Coreutils basename documentation
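For example, on one of the paths from the question:

```shell
p="/omd/sites/mesh/etc/icinga/conf.d/objects/testsystem/test1.cfg"

name=$(basename "$p")   # strips the directory part
dir=$(dirname "$p")     # keeps only the directory part

echo "$name"   # test1.cfg
echo "$dir"    # /omd/sites/mesh/etc/icinga/conf.d/objects/testsystem
```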
Is it possible to get only the filenames as the result of the grep command?
With grep you need the -l option to display only file names.
Using find ... -execdir grep ... \{} + you might prevent displaying the full path of the file (is this what you need?)
find /omd/sites/mesh/etc/icinga/conf.d/objects -name '*.cfg' \
-execdir grep -r -i -l 'host{' \{} +
In addition, concerning the second part of your question, to read the result of a command into an array, you have to use the syntax: IFS=$'\n' MYVAR=( $(cmd ...) )
In that particular case (I formatted it as a multiline statement to clearly show the structure of the expression -- of course you could write it as a "one-liner"):
IFS=$'\n' MYVAR=(
$(
find objects -name '*.cfg' \
-execdir grep -r -i -l 'host{' \{} +
)
)
You then have access to the result in the array MYVAR as usual. While I was testing (3 matches in that particular case):
sh$ echo ${#MYVAR[@]}
3
sh$ echo ${MYVAR[0]}
./x y.cfg
sh$ echo ${MYVAR[1]}
./d.cfg
sh$ echo ${MYVAR[2]}
./e.cfg
# ...
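On bash 4 and newer, mapfile (a.k.a. readarray) reads lines into an array without any IFS juggling. A sketch with a hypothetical fixture standing in for the objects tree (the file names and contents here are made up to mirror the question):

```shell
# Hypothetical fixture: a tiny objects/ tree with two .cfg files,
# only one of which contains 'host{'.
tmp=$(mktemp -d)
mkdir -p "$tmp/objects/testsystem"
printf 'define host{\n}\n'    > "$tmp/objects/testsystem/test1.cfg"
printf 'define service{\n}\n' > "$tmp/objects/testsystem/test2.cfg"

# mapfile reads one matching filename per array element.
mapfile -t MYVAR < <(cd "$tmp" && grep -r -i -l --include='*.cfg' 'host{' objects)

echo "${#MYVAR[@]}"   # 1
echo "${MYVAR[0]}"    # objects/testsystem/test1.cfg
```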
This should work:
grep -r -i --include=*.cfg 'host{' /omd/sites/mesh/etc/icinga/conf.d/objects | \
awk '{print $1}' | sed -e 's|[^/]*/||g' -e 's|:define$||'
The awk portion keeps the first field, and the sed command trims off the path and the trailing :define.
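Traced on one sample line from the question's output:

```shell
line='/omd/sites/mesh/etc/icinga/conf.d/objects/testsystem/test1.cfg:define host{'

# awk keeps the first whitespace-separated field (path:define),
# the first sed expression strips every "segment/" prefix,
# and the second drops the trailing ":define".
out=$(echo "$line" | awk '{print $1}' | sed -e 's|[^/]*/||g' -e 's|:define$||')
echo "$out"   # test1.cfg
```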

passing grep into a variable in bash

I have a file named email.txt like this one:
Subject:My test
From:my email <myemail@gmail.com>
this is third test
I want to extract only the email address from this file using a bash script. So I put this in my bash script named myscript:
#!/bin/bash
file=$(myscript)
var1=$(awk 'NR==2' $file)
var2=$("$var1" | (grep -Eio '\b[A-Z0-9._%+-]+@[A-Z0-9.-]+\.[A-Z]{2,4}\b'))
echo $var2
But I failed to run this script. When I run this command manually in bash, I can obtain the email address:
echo $var1 | grep -Eio '\b[A-Z0-9._%+-]+@[A-Z0-9.-]+\.[A-Z]{2,4}\b'
I need to store the email address in a variable so I can use it in another function. Can someone show me how to solve this problem?
Thanks.
I think this is an overly complicated way to go about things, but if you just want to get your script to work, try this:
#!/bin/bash
file="email.txt"
var1=$(awk 'NR==2' $file)
var2=$(echo "$var1" | grep -Eio '\b[A-Z0-9._%+-]+@[A-Z0-9.-]+\.[A-Z]{2,4}\b')
echo $var2
I'm not sure what file=$(myscript) was supposed to do, but on the next line you want a file name as argument to awk, so you should just assign email.txt as a string value to file, not execute a command called myscript. $var1 isn't a command (it's just a line from your text file), so you have to echo it to give grep anything useful to work with. The additional parentheses around grep are redundant.
What is happening is this:
var2=$("$var1" | (grep -Eio '\b[A-Z0-9._%+-]+@[A-Z0-9.-]+\.[A-Z]{2,4}\b'))
^^^^^^^ Execute the program named (what is in variable var1).
You need to do something like this:
var2=$(echo "$var1" | grep -Eio '\b[A-Z0-9._%+-]+@[A-Z0-9.-]+\.[A-Z]{2,4}\b')
or even
var2=$(awk 'NR==2' $file | grep -Eio '\b[A-Z0-9._%+-]+@[A-Z0-9.-]+\.[A-Z]{2,4}\b')
There are very helpful flags for bash: -xv
The line with
var2=$("$var1" | (grep...
should be
var2=$(echo "$var1" | (grep...
Also, my version of grep doesn't have the -o flag.
And since grep patterns are "greedy", even when the following code runs, its output is not exactly what you want.
#!/bin/bash -xv
file=test.txt
var1=$(awk 'NR==2' $file)
var2=$(echo "$var1" | (grep -Ei '\b[A-Z0-9._%+-]+@[A-Z0-9.-]+\.[A-Z]{2,4}\b'))
echo $var2
Use Bash parameter expansion:
var2="${var1#*:}"
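This strips the shortest prefix up to and including the first colon; for the sample file's From: line it leaves the display name plus the bracketed address:

```shell
var1='From:my email <myemail@gmail.com>'

# ${var1#*:} removes the shortest leading match of "*:" -- here, "From:".
var2="${var1#*:}"
echo "$var2"   # my email <myemail@gmail.com>
```

Note that this keeps the display name; further expansions (e.g. stripping up to the `<` and the trailing `>`) would be needed to isolate the bare address.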
There's a cruder way:
cat $file | grep @ | tr '<>' '\012\012' | grep @
That is, extract the line(s) with @ signs, turn the angle brackets into newlines, then grep again for anything left with an @ sign.
Refine as needed...
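Run against the sample file from the question (recreated here in a temp file), the pipeline isolates the bare address:

```shell
# Recreate the question's email.txt in a temporary file.
file=$(mktemp)
cat > "$file" <<'EOF'
Subject:My test
From:my email <myemail@gmail.com>
this is third test
EOF

# First grep keeps the From: line, tr breaks it at the angle brackets,
# second grep keeps only the piece that still contains an @.
addr=$(cat "$file" | grep @ | tr '<>' '\012\012' | grep @)
echo "$addr"   # myemail@gmail.com
```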