Looks like the Busybox shell is not doing quote removal - linux

This is about the following setup: a Linux host machine, bash, adb, and an embedded Linux target system with Busybox.
On the target system the following applies:
adb shell echo $SHELL
/bin/sh
adb shell echo $0
/bin/sh
The problem is that the find command in one of my scripts does not find anything (it has been verified by other means that the items being looked for do in fact exist on the target). My command:
adb -s $AdbID shell find / -type f \( -name "'"*audio*"'" -or -name "'"*alsa*"'" \) \
\( -path "'"/usr/lib/*"'" -or -path "'"/usr/bin/*"'" -or -path "'"/etc/*"'" \)
If I debug with echo, echo gets the following string as its input arguments:
$ adb -s $AdbID shell echo find / -type f \( -name "'"*audio*"'" -or -name "'"*alsa*"'" \) \
\( -path "'"/usr/lib/*"'" -or -path "'"/usr/bin/*"'" -or -path "'"/etc/*"'" \)
find / -type f ( -name '*audio*' -or -name '*alsa*' ) \
( -path '/usr/lib/*' -or -path '/usr/bin/*' -or -path '/etc/*' )
Note: the transcript above uses escaped newlines so you don't need to scroll much here; they are not used in the original command.
I assume the same applies to find if echo is removed from the command string.
To me it looks like Busybox is not doing the quote removal I am used to from Bash, performed after all other kinds of expansion. Ash seems to be the Busybox shell, and its manual says nothing about quote removal, so I have no idea how ash behaves in this regard.
If the Linux host is replaced with a Windows desktop machine and its command prompt, with the remaining elements as in the case above*, the command works fine. The difference can be seen at the following point:
*) Actually a different target system as well, but nothing was intentionally changed on that side, so both setups should be identical as far as the target system is concerned.
If I debug with echo, echo gets the following string:
c:\adb_shell>adb -s 2233445 shell echo find / -type f \
\( -name "'"*audio*"'" -or -name "'"*alsa*"'" \) \
\( -path "'"/usr/lib/*"'" -or -path "'"/usr/bin/*"'" -or -path "'"/etc/*"'" \)
find / -type f ( -name *audio* -or -name *alsa* ) \
( -path /usr/lib/* -or -path /usr/bin/* -or -path /etc/* )
Here the command gets its input arguments from the shell with the quotes removed.
I realize, just as I write this question, that command grouping will also not work as expected (the Busybox shell will consume the parentheses, so find won't receive them); let's address that question elsewhere. Possibly the command has more errors of this kind.
I believe the lack of quote removal with two Linux shells in a chain is also a real problem for my command string. What are the possible reasons and solutions?

Note that in
"'"*audio*"'"
the asterisks are not escaped from the shell. The shell will happily glob them, should a file matching *audio* be in the current working directory. If the ' characters are really part of the filenames, you want to use "'*audio*'" instead. Then find will search for files named '*audio*' (where find, not the shell, does the globbing for *).
If you simply want to find files matching *audio* (audio anywhere in the name), then use -name "*audio*" etc.
It would help if you told us exactly what the file names you want to find are.
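One robust way around the double-parsing problem is to hand adb shell the whole command as a single quoted string, so only the remote Busybox shell parses it. The adb line in the comment below is an untested sketch; the sh -c call simulates the same two-shell chain locally with a throwaway directory:

```shell
#!/bin/sh
# Sketch: quoting the whole command once, as in
#   adb -s "$AdbID" shell "find / -type f \( -name '*audio*' -o -name '*alsa*' \)"
# hands ONE string to the device shell. Simulate that chain locally:
tmp=$(mktemp -d)
touch "$tmp/libaudio.so" "$tmp/notes.txt"
cmd="find $tmp -type f -name '*audio*'"
sh -c "$cmd"     # the inner shell removes the quotes; find gets *audio* literally
rm -rf "$tmp"
```

The inner shell performs its own round of quote removal, so find receives the pattern *audio* unexpanded, which is exactly what the multi-layer quoting in the question was trying to achieve.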

Related

Find workspace and delete everything with the name, except for filename and everything in a directory pattern

I'm trying to create a cronjob that will delete everything with a pattern *.jar, except for master.jar and anything in a directory pattern */jarkeeper/*/staging/*
I'm close, but no luck finding the correct command. Here's what I have so far:
find /var/lib/jenkins/workspace/ ! -path "*/jarkeeper/*/staging/*" -or -type f ! -name master.jar -name \*.jar
and
find /var/lib/jenkins/workspace/ \( ! -path "*/jarkeeper/*/staging/*" \) -or \( -type f ! -name master.jar \) -name \*.jar
What should the correct format be?
The issue is that find's implicit -and binds more tightly than -or, so the conditions are not grouped the way you intend. Every file you delete must match *.jar and must not be master.jar and must not be under the staging path, so the tests should all be ANDed together, with the exclusions negated:
find /var/lib/jenkins/workspace/ -type f -name "*.jar" ! -name master.jar ! -path "*/jarkeeper/*/staging/*"
As an idea, I've always felt more comfortable combining more primitive tools than to use find's complex syntax, like:
find "$somewhere" -name \*.jar | grep -v master.jar | \
grep -vE "jarkeeper/.*/staging/" | xargs rm -rf
This also comes with the advantage that you can test/check/debug your script part by part.
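If filenames may contain spaces or newlines, the same pipeline idea can be made NUL-safe. This is a sketch assuming GNU grep (-z) and xargs (-0, -r); a throwaway tree stands in for $somewhere:

```shell
#!/bin/sh
# NUL-delimited variant of the pipeline, safe for odd filenames.
somewhere=$(mktemp -d)
mkdir -p "$somewhere/jarkeeper/x/staging"
touch "$somewhere/app.jar" "$somewhere/master.jar" \
      "$somewhere/jarkeeper/x/staging/keep.jar"
find "$somewhere" -name '*.jar' -print0 \
  | grep -zv 'master\.jar$' \
  | grep -zvE 'jarkeeper/.*/staging/' \
  | xargs -0 -r rm -f
ls "$somewhere"        # app.jar is gone; master.jar survives
rm -rf "$somewhere"
```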

For loop won't repeat itself

I have this block, and the for loop doesn't repeat even if the path has more than 2 files. It executes only once and that's all. What's the problem? How can I make it run for all files in the list?
list=$(find $path -type f \( -name "*.c" -or -name "*.cpp" -or -name "*.cxx" -or -name "*.cc" \))
for file in "$list";do
#commands
done
You can avoid the use of find entirely here (Assuming the only files with those extensions are regular files; no directories etc.), via bash's extended globbing:
shopt -s extglob globstar
for file in "$path"/**/*.@(c|cpp|cxx|cc); do
# commands
done
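A runnable demo of that pattern on a throwaway tree (nullglob added here so the loop simply runs zero times when nothing matches; the filenames are invented):

```shell
#!/bin/bash
# Count C/C++ sources via extglob+globstar instead of find.
shopt -s extglob globstar nullglob
path=$(mktemp -d)
mkdir -p "$path/src"
touch "$path/src/main.cpp" "$path/src/util.cc" "$path/src/readme.md"
count=0
for file in "$path"/**/*.@(c|cpp|cxx|cc); do
  count=$((count + 1))
done
echo "$count"    # 2
rm -rf "$path"
```

Note that shopt -s extglob must take effect before the line using @(...) is parsed, which is why it sits on its own line at the top of the script.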
Putting $list in quotes makes it just one word, so it doesn't loop.
But if you take out the quotes, it won't work properly if any of the filenames contain whitespace, since they'll be split into multiple words.
Instead of assigning to a variable, pipe the output to a while read loop.
find "$path" -type f \( -name "*.c" -or -name "*.cpp" -or -name "*.cxx" -or -name "*.cc" \) | while read -r file
do
# commands
done
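One caveat worth noting with the pipe version (a side note, not part of the original answer): the while loop runs in a subshell, so variables set inside it don't survive the loop. Feeding the loop through process substitution keeps it in the current shell:

```shell
#!/bin/bash
# A pipeline runs the loop in a subshell; process substitution does not.
count=0
printf '%s\n' a.c b.cpp | while read -r f; do count=$((count + 1)); done
echo "$count"    # 0: the increments happened in a subshell
count=0
while read -r f; do count=$((count + 1)); done < <(printf '%s\n' a.c b.cpp)
echo "$count"    # 2
```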

Append output of Find command to Variable in Bash Script

I'm trying to append the output of a find command to a variable in a Bash script.
I can append the output of the find command to a log file fine, but I can't append it to a variable, i.e.
This works ok:
find $DIR -type d -name "*" >> $DIRS_REMOVED_LOG
But this won't:
FILES_TO_EVAL=find $DIR -type f \( -name '*.sh' -or -name '*.txt' -or -name '*.xml' -or -name '*.log' \)
ENV=`basename $PS_CFG_HOME | tr "[:lower:]" "[:upper:]"`
FILE_TYPES=(*.log *.xml *.txt *.sh)
DIRS_TO_CLEAR="$PS_CFG_HOME/data/files $PS_CFG_HOME/appserv/prcs/$ENV/files $PS_CFG_HOME/appserv/prcs/$ENV/files/CQ"
FILES_REMOVED_LOG=$PS_CFG_HOME/files_removed.log
DIRS_REMOVED_LOG=$PS_CFG_HOME/dirs_removed.log
##Cycle through directories
##Below for files_removed_log works ok but can't get the find into a variable.
for DIR in `echo $DIRS_TO_CLEAR`
do
echo "Searching $DIR for files:"
FILES_TO_EVAL=find $DIR -type f \( -name '*.sh' -or -name '*.txt' -or -name '*.xml' -or -name '*.log' \)
find $DIR -type d -name "*" >> $DIRS_REMOVED_LOG
done
I expected FILES_TO_EVAL to be populated with the results of the find command, but it is empty.
Run your scripts through ShellCheck. It finds lots of common mistakes, much like a compiler would.
FILES_TO_EVAL=find $DIR -type f \( -name '*.sh' -or -name '*.txt' -or -name '*.xml' -or -name '*.log' \)
SC2209: Use var=$(command) to assign output (or quote to assign string).
In addition to the problems that shellcheck.net will point out, there are a number of subtler problems.
For one thing, you're using all-caps variable names. This is dangerous, because there are a large number of all-caps variables that have special meanings to the shell and/or other tools, and if you accidentally use one of those, it can have weird effects. Lower- or mixed-case variables are much safer (except when you specifically want the special meaning).
Also, you should almost always put double-quotes around variable references (e.g. find "$dir" ... instead of find $dir ...). Without them, the variables will be subject to word splitting and wildcard expansion, which can have a variety of unintended consequences. In some cases, you need word splitting and/or wildcard expansion on a variable's value, but usually not quite the way the shell does it; in these cases, you should look for a better way to do the job.
In the line that's failing,
FILES_TO_EVAL=find $DIR -type f \( -name '*.sh' -or -name '*.txt' -or -name '*.xml' -or -name '*.log' \)
the immediate problem is that you need to use $(find ...) to capture the output from the find command. But this is still dangerous, because it's just storing a newline-delimited list of file paths, and the standard way to expand this (just using an unquoted variable reference) has all the problems I mentioned above. In this case, it will lead to trouble if any filenames contain spaces or wildcards (which are perfectly legal in filenames). If you're in a controlled environment where you can guarantee this won't happen, you'll get away with it... but it's really not the best idea.
Correctly handling a list of filepaths from find is a little complicated, but there are a number of ways to do it. There's a lot of good info in BashFAQ #20: "How can I find and safely handle file names containing newlines, spaces or both?" I'll summarize some common options below:
If you don't need to store the list, just run commands on individual files, you can use find -exec:
find "$dir" -type f \( -name '*.sh' -or -name '*.txt' -or -name '*.xml' -or -name '*.log' \) -exec somecommand {} \;
If you need to run something more complex, you can use find -print0 to output the list in an unambiguous form, and then use read -d '' to read them. There are a bunch of potential pitfalls here, so here's the version I use to avoid all the trouble spots:
while IFS= read -r -d '' filepath <&3; do
dosomethingwith "$filepath"
done 3< <(find "$dir" -type f \( -name '*.sh' -or -name '*.txt' -or -name '*.xml' -or -name '*.log' \) -print0)
Note that the <(command) syntax (known as process substitution) is a bash-only feature, so use an explicit bash shebang (#!/bin/bash or #!/usr/bin/env bash) on your script, and don't override it by running the script with sh.
If you really do need to store the list of paths for later, store it as an array:
files_to_eval=()
while IFS= read -r -d '' filepath; do
files_to_eval+=("$filepath")
done < <(find "$dir" -type f \( -name '*.sh' -or -name '*.txt' -or -name '*.xml' -or -name '*.log' \) -print0)
...or, if you have bash v4.4 or later, it's easier to use readarray (aka mapfile):
readarray -td '' files_to_eval < <(find "$dir" -type f \( -name '*.sh' -or -name '*.txt' -or -name '*.xml' -or -name '*.log' \) -print0)
In either case, you should then expand the array with "${files_to_eval[@]}" to get all the elements without subjecting them to word splitting and wildcard expansion.
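A tiny demonstration of why the quoted array expansion matters (the filenames are invented):

```shell
#!/bin/bash
# Quoted array expansion keeps "a file.txt" as one word; unquoted splits it.
files_to_eval=("a file.txt" "b.txt")
set -- "${files_to_eval[@]}"
echo "$#"    # 2
set -- ${files_to_eval[@]}
echo "$#"    # 3 ("a file.txt" was split on the space)
```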
On to some other problems. In this line:
FILE_TYPES=(*.log *.xml *.txt *.sh)
In this context, the wildcards will be expanded immediately to a list of matches in the current directory. You should quote them to prevent this:
file_types=("*.log" "*.xml" "*.txt" "*.sh")
In these lines:
DIRS_TO_CLEAR="$PS_CFG_HOME/data/files $PS_CFG_HOME/appserv/prcs/$ENV/files $PS_CFG_HOME/appserv/prcs/$ENV/files/CQ"
...
for DIR in `echo $DIRS_TO_CLEAR`
You're storing a list as a single string with entries separated by spaces, which has all the word-split and wildcard problems I've been harping on. Also, the echo here is a complication that doesn't do anything useful, and actually makes the wildcard problem worse. Use an array, and avoid all the mess:
dirs_to_clear=("$ps_cfg_home/data/files" "$ps_cfg_home/appserv/prcs/$env/files" "$ps_cfg_home/appserv/prcs/$env/files/CQ")
...
for dir in "${dirs_to_clear[@]}"

I want to know exact command of "find . -name '*.c' -or -name '*.cpp'" in Linux

I'm studying shell in Linux these days. and I've had one question.
Please, look at below command:
$ find . -name '*.c' -or -name '*.cpp'
Is the above command processed exactly like the command below?
$ find . -name '*.c' -and -print -or -name '*.cpp' -and -print
You are combining different search expressions with the logical operator -or.
Basically your command will find all files in the current directory ending with .c or .cpp and will print them to stdout. Since no explicit action is given, find applies an implicit -print to the whole expression, so your command is equivalent to find . \( -name '*.c' -or -name '*.cpp' \) -print. Your rewritten version is also equivalent, because -and binds more tightly than -or.
For further info check the man page of the find command.
Also note that this question would be more suitable to ask here.
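This equivalence can be checked directly; the following sketch builds a throwaway directory and compares the implicit and explicit forms:

```shell
#!/bin/sh
# Show that the implicit -print wraps the whole -o expression.
tmp=$(mktemp -d)
touch "$tmp/a.c" "$tmp/b.cpp" "$tmp/c.txt"
implicit=$(find "$tmp" -name '*.c' -o -name '*.cpp' | sort)
explicit=$(find "$tmp" \( -name '*.c' -o -name '*.cpp' \) -print | sort)
[ "$implicit" = "$explicit" ] && echo equivalent
rm -rf "$tmp"
```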

What is wrong with my find command usage?

I'm trying to find all files whose names match certain C++ file extensions, but exclude certain directories matching a pattern, with this:
find /home/palchan/code -name "*.[CcHh]" -o -name "*.cpp" -o -name "*.hpp" -a ! -name "*pattern*"
and this still outputs files like:
/home/palchan/code/libFox/pattern/hdr/fox/RedFox.H
which has the pattern in it?
Here is an example:
> ls -R .
.:
libFox
./libFox:
RedFox.C RedFox.H pattern
./libFox/pattern:
RedFox.C RedFox.H
and then I run:
> find . \( -name "*.[HC]" -a ! -name "*pattern*" \)
./libFox/pattern/RedFox.C
./libFox/pattern/RedFox.H
./libFox/RedFox.C
./libFox/RedFox.H
The following should work:
find /home/palchan/code \( -name "*pattern*" \) -prune -o -type f \( -name "*.[CcHh]" -o -name "*.cpp" -o -name "*.hpp" \) -print
From man find:
-name pattern
Base of file name (the path with the leading directories removed) matches shell pattern pattern. The metacharacters (`*', `?', and `[]') match
a `.' at the start of the base name (this is a change in findutils-4.2.2; see section STANDARDS CONFORMANCE below). To ignore a directory and
the files under it, use -prune; see an example in the description of -path. Braces are not recognised as being special, despite the fact that
some shells including Bash imbue braces with a special meaning in shell patterns. The filename matching is performed with the use of the
fnmatch(3) library function. Don't forget to enclose the pattern in quotes in order to protect it from expansion by the shell.
So, basically, you should use -prune to exclude directories instead of ! -name something
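Recreating the question's tree shows the difference; this sketch uses a temporary directory in place of the real code tree:

```shell
#!/bin/sh
# -prune skips the pattern/ directory entirely, unlike ! -name.
tmp=$(mktemp -d)
mkdir -p "$tmp/libFox/pattern"
touch "$tmp/libFox/RedFox.C" "$tmp/libFox/RedFox.H" \
      "$tmp/libFox/pattern/RedFox.C" "$tmp/libFox/pattern/RedFox.H"
find "$tmp" -name '*pattern*' -prune -o -type f -name '*.[CH]' -print
# prints only libFox/RedFox.C and libFox/RedFox.H
rm -rf "$tmp"
```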
You can also try grouping the tests so that the exclusion applies to all of them; note that -name only matches the base name, so to exclude anything under a pattern directory you need -path:
find /home/palchan/code \( -name "*.[CcHh]" -o -name "*.cpp" -o -name "*.hpp" \) ! -path "*pattern*"
