I want to delete all files via shell except a few, but it results in Syntax error: "(" unexpected - linux

rm -rf * ! ( "update.sh" | "new_update" ) #or
rm -rf ! ( "update.sh" | "new_update" ) #or
rm -rf ! ( update.sh | new_update ) #or
I want to delete all files except update.sh and new_update.
I have tried all of the above lines one by one in a shell script, but each returns the error
unexpected token (
and when run directly in the terminal it sometimes executes and sometimes gives the same error as above.

First, you have to enable extended globbing with
shopt -s extglob
Then note that you can't have spaces between the parts of the wildcard:
rm -rf !(update.sh|new_update)
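If it still fails inside a script, check which shell actually runs it: the wording Syntax error: "(" unexpected is what dash (the /bin/sh on Debian/Ubuntu) prints, and dash has no extglob at all, so the script must run under bash. A minimal sketch (assuming the script is itself named update.sh, so it survives the deletion):

#!/bin/bash
# extglob must be enabled before bash parses the !(...) pattern.
shopt -s extglob
# Delete everything in the current directory except the two listed names.
rm -rf !(update.sh|new_update)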

You can use grep -v to invert the match, -w to match whole words, and -E for regex, so you can add more files.
rm $( printf '%s\n' * | grep -Ewv "update.sh|new_update" )
I don't guarantee it to work with filenames containing spaces and other crazy characters.
A safer option, which requires GNU extensions, is:
printf '%s\0' * | grep -zEv '^(update.sh|new_update)$' | xargs -0 rm --
Edit: since a commenter accused ls and xargs of being "dangerous", I keep the original version here just for the record.
ls * | grep -Ev "update.sh|new_update" | xargs rm

find and grep: get filenames

I need to find the reports (.docx files), read them with docx2txt, find the second match of "passed" (excluding "not passed"), and save those filenames to a text file. Here is what I tried:
OIFS="$IFS"
IFS=$'\n'
for f in $(find . -wholename '*_done/(*Report*.docx' |grep -v appendix)
do
docx2txt "$f" - | (grep -q -m2 passed || grep -q -v "not passed") || echo $f >> failed
done
IFS="$OIFS"
But this script gives me an empty file. If I replace || with && before the echo, all filenames are stored in the file. grep works fine when it is not in the script, as does docx2txt. What am I doing wrong here?
There are quite a lot of problems with the grep commands.
grep -q always exits successfully on the first match.
With -q the -m2 has no effect. If there is one match grep exits successfully. It does not check if there is a second match.
To check that there are (at least) two matches, count the matches and then use test/[ ] to check the number of found matches. If there is at most one passed per line, grep -c is sufficient. If there can be multiple matches per line, you need grep -o ... | wc -l.
-q and -v together means: Is there at least one line that does not contain the pattern? When grep finds such a line it exits successfully. The only way for this command to fail is an input in which every line contains not passed (this includes the empty file).
Matching passed but not not passed is trickier than one might suspect. If there can be at most one passed/not passed per line, you can use grep -v 'not passed' | grep passed. Otherwise you need a negative lookbehind, which is only available in Perl-compatible regular expressions (PCRE).
In addition, command | (grep ... || grep ...) might not do what you expect. command produces its output only once. Whatever the first grep reads from the pipe is gone; the second grep continues reading where the first one stopped.
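You can watch this stream consumption happen with a minimal sketch (using the read builtin, which consumes exactly one line, in place of the first grep):

# read eats the first line of the shared pipe; cat only sees the rest.
printf 'passed\nnot passed\n' | { read -r first; cat; }
# prints only: not passed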
BTW: for … in $(find … | grep -v …) can be turned into a single, safe find command using -not and -exec.
Solution
If each line contains at most one passed/not passed, use
find . -wholename '*_done/(*Report*.docx' -not -wholename '*appendix*' \
-exec sh -c '[ $(docx2txt "$0" - | grep -v "not passed" | grep -cm2 passed) = 2 ]' {} \; -print
If there can be multiple passed/not passed per line, you need GNU grep or pcregrep:
find . -wholename '*_done/(*Report*.docx' -not -wholename '*appendix*' \
-exec sh -c '[ $(docx2txt "$0" - | grep -Pom2 "(?<!not )passed" | wc -l) = 2 ]' {} \; -print
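One note on the construction: find substitutes {} as the first argument after the sh -c script, which becomes $0 inside it, and that is why the embedded one-liner refers to "$0". The trailing -print then lists only the files for which the bracketed test succeeded.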
When you run into a problem like this, it's a good idea to remove as much code as possible. If we just take that one line with the multiple grep statements, we can first verify that the current expression doesn't work:
$ echo passed | (grep -q -m2 passed || grep -q -v "not passed") || echo failed
$ echo not passed | (grep -q -m2 passed || grep -q -v "not passed") || echo failed
We can see that neither of these commands produces any output.
Let's think carefully about the logic:
The || operator means "if the first command doesn't succeed, run the second command". In both cases the first grep succeeds, because both passed and not passed contain the phrase passed. So the second grep never runs, and since the first command succeeded, the entire grep ... || grep ... command succeeds as well, which means the final echo $f never runs.
I was trying to think of a clever way to solve this, but it seems simplest if we make use of a temporary file:
OIFS="$IFS"
IFS=$'\n'
tmpfile=$(mktemp docXXXXXX)
trap "rm -f $tmpfile" EXIT
for f in $(find . -wholename '*_done/(*Report*.docx' |grep -v appendix)
do
docx2txt "$f" - | head -2 > $tmpfile
if grep -q passed $tmpfile && ! grep -q 'not passed' $tmpfile; then
echo $f >> failed
fi
done
IFS="$OIFS"

How to parse file for filenames and remove interactively

I want to read a file, parse out filenames, and remove them. In my case this means removing everything after the first tab on each line of the file to get the filenames, and then calling rm -i on those files.
This is what I have so far, but it just removes them all without prompting... and if I add -i to xargs rm, it gives me a wall of text without letting me choose y/n:
while IFS=' ' read -r line; do
#echo ${line%*}
sed -e 's/\t.*$//' | xargs rm
done < $1
The problem is that rm -i asks for yes/no on stdin. You redirect to the while loop and pipe to xargs, both of which will override stdin for rm -i.
You can rewrite to avoid xargs and also use a different FD for your loop:
while IFS=$'\t' read -u 3 -r file _
do
rm -i "$file"
done 3< yourfile.txt
You can avoid rm -i and use xargs -p, which prompts the user for each file to be deleted:
cut -f1 file | xargs -n1 -p rm
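With -p, xargs prints each constructed command and waits for y/n confirmation on the terminal; a sample session (the log names here are made up) might look like:

$ cut -f1 file | xargs -n1 -p rm
rm old1.log ?...y
rm old2.log ?...n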

Perl Script to Grep Directory For String and Print

I would like to create a Perl or bash script that will read keyboard input into a variable, perform a fixed-string grep for it recursively within the current directory full of Snort logs, and then automatically run tcpdump on the matched files, grep its output, and print the specified lines to the terminal. Does anyone have a good idea of how this should work?
Here is an example of the methodology I want from the script:
step 1: Read keyboard input and assign it to variable named string.
step 2 command: grep -Fr "$string"
step 2 output: snort.log.1470609906 matches
step 3 command: tcpdump -r snort.log.1470609906 | grep -F "$string" -C10
step 3 output:
Snort log
Here's some bash code that does that:
s="google.com"
grep -Frl "$s" | \
while IFS= read -r x; do
tcpdump -r "$x" | grep -F "$s" -C10
done
I don't know about Perl, but you can do it easily enough just in shell:
str="google.com"
find . -type f -name 'snort.log.*' -exec grep -FlZ "$str" {} + |
xargs -0 -I {} sh -c 'tcpdump -r "{}" | grep -F '"$str"' -C10'
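Neither snippet includes the keyboard-input step from the question; a minimal sketch of that (assuming bash; the prompt text is made up), wrapped around the first approach:

# Step 1: read the search string from the keyboard.
read -rp "Search string: " s
# Steps 2-3: find matching Snort logs, then grep each file's tcpdump output.
grep -Frl "$s" . |
while IFS= read -r x; do
tcpdump -r "$x" | grep -F "$s" -C10
done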

How to store --exclude arguments for grep in an environment variable in a bash script

My goal is to search a file-hierarchy for certain text patterns, excluding certain file-name patterns, and recursively copy just the matching files to a local directory named confs. The following script does the job:
#!/bin/bash
export FEXCLUDE="{*edit,*debug,*orig,*BAK,*bak,*fcs,*NOPE,*tomcat,*full.xml,*-ha.xml}";
export SRCDIR=/opt/jboss-as-7.1.1.Final/standalone;
confshow() {
for ii in `grep -rlZ \
--exclude={*edit,*debug,*orig,*BAK,*bak,*fcs,*NOPE,*tomcat,*full.xml,*-ha.xml} \
--exclude-dir={log,tmp,i2b2.war,*.log,*_history,*.old} "<datasource\|username\|password\|user-name" \
$SRCDIR/* | xargs -0 ls {}` ;
do cp --parents $ii confs;
done;
}
However, the exclusion patterns are likely to need frequent updates and may need to be shared with other functions, so I would prefer to have them all in variables declared at the beginning of the script. When I do the following, files that should be excluded get copied to the confs directory:
#!/bin/bash
export FEXCLUDE="{*edit,*debug,*orig,*BAK,*bak,*fcs,*NOPE,*tomcat,*full.xml,*-ha.xml}";
export SRCDIR=/opt/jboss-as-7.1.1.Final/standalone;
confshow() {
for ii in `grep -rlZ \
--exclude=$FEXCLUDE \
--exclude-dir={log,tmp,i2b2.war,*.log,*_history,*.old} "<datasource\|username\|password\|user-name" \
$SRCDIR/* | xargs -0 ls {}` ;
do cp --parents $ii confs;
done;
}
Any idea how to obtain the desired behavior? Or how to see what grep sees when it gets passed the $FEXCLUDE argument (echo doesn't show anything wrong)?
Thanks.
Brace expansion is nice for interactive use, but if you are writing a script, just use your editor to quickly copy the necessary --exclude options and store them in an array. Parameter expansions do not undergo brace expansion, as you may have noticed.
#!/bin/bash
# You didn't need to export these anyway, since only your script uses them
FEXCLUDE=( --exclude '*edit'
--exclude '*debug'
# etc
)
DEXCLUDE=( --exclude-dir log
--exclude-dir tmp
# etc
)
SRCDIR=/opt/jboss-as-7.1.1.Final/standalone
confshow() {
while IFS= read -d '' -r ii; do
cp --parents "$ii" confs
done < <( grep -rlZ "${FEXCLUDE[@]}" "${DEXCLUDE[@]}" "<datasource\|username\|password\|user-name" "$SRCDIR"/* )
}
Also, using ls defeats the purpose of using null-delimited output from grep in the first place.
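This also answers the "how to see what grep sees" part of the question: expand the arrays with printf before running grep (a quick diagnostic, not part of the pipeline):

printf '<%s> ' "${FEXCLUDE[@]}" "${DEXCLUDE[@]}"; echo
# <--exclude> <*edit> <--exclude> <*debug> <--exclude-dir> <log> <--exclude-dir> <tmp>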
I know this will raise frowns, but this can also be solved with eval, and here it may not carry the usual risks, since we control the patterns going into the --exclude= arguments.
#!/bin/bash
fexclude='{*edit,*debug,*orig,*BAK,*bak,*fcs,*NOPE,*tomcat,*full.xml,*-ha.xml}'
dexclude='{log,tmp,i2b2.war,*.log,*_history,*.old}'
srcdir=/opt/jboss-as-7.1.1.Final/standalone
confshow() {
eval grep -rlZ \
--exclude="$fexclude" \
--exclude-dir="$dexclude" \
"'<datasource\|username\|password\|user-name'" \
$srcdir/* | xargs -0 -I {} cp --parents '{}' confs
}

Bash: Move files to specific folder if name contains one of a list of strings

I have a script that queries the Twitter API for several queries, and then writes the raw data to a file with the query in the name, plus a timestamp. I'd like a script that, given the list of query strings (regexes?), looks at all files in a folder and, if one of the query strings is a substring of a file's name, moves it to a specific folder. Right now I have a script with just a few dozen mv commands, but I'd like a simpler and more maintainable version. Here's an example of what I'm doing now:
mv /home/nick/TwitterSearchToDatabase/queries_for_amita/*femin* /home/nick/TwitterSearchToDatabase/queries_for_amita/feminism
mv /home/nick/TwitterSearchToDatabase/queries_for_amita/*patriarchy* /home/nick/TwitterSearchToDatabase/queries_for_amita/feminism
mv /home/nick/TwitterSearchToDatabase/queries_for_amita/*yesallwomen* /home/nick/TwitterSearchToDatabase/queries_for_amita/feminism
mv /home/nick/TwitterSearchToDatabase/queries_for_amita/*womanpower* /home/nick/TwitterSearchToDatabase/queries_for_amita/feminism
I would use a for loop:
for i in femin patriarchy yesallwomen womanpower; do
mv /home/nick/TwitterSearchToDatabase/queries_for_amita/*$i* /home/nick/TwitterSearchToDatabase/queries_for_amita/feminism
done
That way the list is in the first line so it is easy to amend.
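One caveat: *femin* also matches the target directory feminism itself, and a keyword that matches nothing makes mv fail. A slightly more defensive sketch (assuming bash; nullglob makes unmatched globs expand to nothing):

shopt -s nullglob
dir=/home/nick/TwitterSearchToDatabase/queries_for_amita
for i in femin patriarchy yesallwomen womanpower; do
for f in "$dir"/*"$i"*; do
# Skip directories such as the feminism target, which also matches *femin*.
[ -f "$f" ] && mv "$f" "$dir"/feminism/
done
done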
I would separate the data (the words that send a file to feminism) from the code.
When you have more categories (feminism, so, and so on), you can put the keywords into files, one file per category, and check those keyword files against the files you are considering moving.
With ${from} where the files come from, ${to} where you want them, and ${keyfiledir} holding the keyword files, you get something like
for keyfile in ${keyfiledir}/*; do
key="${keyfile##*/}"
find $from -type f | sed 's#.*/##' | while read -r file; do
echo "${file}" | grep -q -f "${keyfiledir}"/"${key}" && mv "${from}"/"${file}" "${to}"/"${key}"
done
done
How does that work? I tested the solution above with the following script.
from=fromdir
to=todir
keyfiledir=keyfiledir
rm -rf ${from} ${to} ${keyfiledir}
mkdir ${from} ${to} ${keyfiledir}
mkdir ${to}/feminism ${to}/so
touch ${from}/yesallwomen ${from}/women ${from}/some_femin ${from}/"help move"
cat <<# > ${keyfiledir}/feminism
femin
patriarchy
yesallwomen
womanpower
#
touch ${from}/yesallwomen ${from}/women ${from}/some_femin
cat <<# > ${keyfiledir}/so
stack
exchange
help
#
test ! -d "${from}" && echo " Wrong dir ${from}" && exit 1
test ! -d "${to}" && echo " Wrong dir ${to}" && exit 1
test ! -d "${keyfiledir}" && echo " Wrong dir ${keyfiledir}" && exit 1
for keyfile in ${keyfiledir}/*; do
key="${keyfile##*/}"
find $from -type f | sed 's#.*/##' | while read -r file; do
echo "${file}" | grep -q -f "${keyfiledir}"/"${key}" && mv "${from}"/"${file}" "${to}"/"${key}"
done
done
echo "Not moved"
ls ${from}
echo "Moved"
ls -R ${to}
A simple combination of mv and egrep should suffice. egrep can take a pattern list from a file (and then you get to use full regexp syntax, not just glob syntax). Make sure to exclude the name of the target folder.
cd /home/nick/TwitterSearchToDatabase/queries_for_amita
mv $(ls | egrep -f patterns.txt | grep -v '^feminism$') feminism
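Since this relies on word splitting of the command substitution, filenames containing spaces would break it; a more defensive variant (assuming GNU xargs and GNU mv, for -d and -t) might be:

cd /home/nick/TwitterSearchToDatabase/queries_for_amita
ls | egrep -f patterns.txt | grep -v '^feminism$' | xargs -d '\n' mv -t feminism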
