I have been trying to gzip .BRIK files on my iMac. However, the files with this extension are scattered everywhere, several levels deep. What I have been doing is going folder by folder and running this:
gzip *BRIK
However, it is tedious and will take a long time to do one folder at a time. I also tried
gzip -r *BRIK
or
gzip -r *BRIK ./
They did not work. Any suggestions?
The tools you're looking for are find, to discover the files, and xargs, to call gzip with their names.
find . -name '*.BRIK' -print0 | xargs -0 gzip
The use of -print0 and -0 here makes the pipeline NUL-delimited, so it works smoothly even when directory or file names contain spaces or newlines.
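If you prefer a single command, find can also batch the file names itself via the + terminator of -exec, with no xargs needed:
find . -name '*.BRIK' -exec gzip {} +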
In addition to find, recursive globbing in zsh is very handy and easy to use. In your case, in zsh you can simply run:
gzip **/*.BRIK
(Since Catalina, zsh has been the default shell on macOS.)
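If some of the matches could be directories or symlinks, a zsh glob qualifier restricts the pattern to plain files, for example:
gzip **/*.BRIK(.)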
The problem is that I have a directory full of HTML files. However, when I open the folder in Firefox it is difficult to navigate, because all of the associated HTML resource folders appear alongside the files.
I tested using ln -s to link just the HTML files into a separate viewing directory, and it worked.
Now my problem is setting up these ln -s links across hundreds of files, and I cannot figure out how to do it. I thought the best way would be to use xargs on the output of ls, but I cannot seem to get the syntax to work.
I believe my problem is that I need to pass two sets of arguments to ln -s, and I cannot get that to work.
I have tried many different variations of the below, but can't get the syntax right. I've also tried using GNU parallel, with no more luck.
ls Downloads (filenames) | grep html | xargs ln -s ~\Downloads\(filenames) ~\ViewingDirectory\(filename)
Any help would be appreciated. Thank you.
You misunderstand the use of xargs, and you are parsing the output of ls, which is generally considered a bad idea.
A better solution would be:
for f in ~/Downloads/*.html ; do
    b=$(basename "$f")
    ln -s "$f" ~/ViewingDirectory/"$b"
done
If you insist on using xargs, you could do it, for example, as follows:
find ~/Downloads/ -type f -name '*.html' \
    | xargs -I{} sh -c 'ln -s "$1" ~/ViewingDirectory/"$(basename "$1")"' _ {}
Now, with xargs you could run the ln calls in parallel by using the -P flag:
find ~/Downloads/ -type f -name '*.html' \
    | xargs -P"$(nproc)" -I{} sh -c 'ln -s "$1" ~/ViewingDirectory/"$(basename "$1")"' _ {}
where nproc returns the number of processing units available.
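If any of the file names could contain newlines, the null-delimited variant (as in the gzip answer above) is more robust; a sketch assuming GNU find and xargs:
find ~/Downloads/ -type f -name '*.html' -print0 \
    | xargs -0 -P"$(nproc)" -I{} sh -c 'ln -s "$1" ~/ViewingDirectory/"$(basename "$1")"' _ {}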
I have a bash script that loops through the files in the raw folder and puts them into the audio folder. This works just fine.
#!/bin/bash
PATH_IN='/nas/data/customers/test2/raw/'
PATH_OUT='/nas/data/customers/test2/audio/'
mkdir -p "$PATH_OUT"
IFS=$'\n'
find "$PATH_IN" -type f -name '*.wav' -exec basename {} \; | while read -r file; do
    sox -S "${PATH_IN}${file}" -e signed-integer "${PATH_OUT}${file}"
done
My issue is that, as the folders grow, I do not want to run the script on files that have already been converted, so I would like to loop over only the files that have not been converted yet, i.e. the files in raw but not in audio.
I found the command
diff audio raw
which can do just that, but I cannot find a good way to incorporate it into my bash script. Any help or nudges in the right direction would be highly appreciated.
You could do:
diff <(ls -1a "$PATH_OUT") <(ls -1a "$PATH_IN") | grep '^>' | sed 's/^> //'
The first part diffs the listings of the two folders, the second filters the output down to only the additions (lines starting with >), and the third strips the diff markers to leave just the file names.
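Alternatively, you can skip diff entirely and fold the check into your existing loop by testing whether the converted file already exists; a minimal sketch based on your script:
find "$PATH_IN" -type f -name '*.wav' -exec basename {} \; | while read -r file; do
    # only convert files that are not already in the audio folder
    [ -e "${PATH_OUT}${file}" ] && continue
    sox -S "${PATH_IN}${file}" -e signed-integer "${PATH_OUT}${file}"
done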
I want to generate docs for my CoffeeScript files using Docco.
When I use:
docco client/coffee/*
it throws an error, I think because there are folders in the file list.
When I use:
docco client/coffee/*.coffee
it can't find some files, because I don't have anything in the root folder (the .coffee files are in subfolders).
How can I pass all *.coffee files recursively to a command in the console?
There are several ways to do it:
$ find client/coffee/ -name '*.coffee' -exec docco {} +
$ find client/coffee/ -name '*.coffee' | xargs docco
However, note that the latter does not work if there is a space in a file name, unless you use find -print0 in combination with xargs -0.
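That safer combination looks like this:
find client/coffee/ -name '*.coffee' -print0 | xargs -0 docco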
Additionally, if you are using bash, you can use **/*.coffee after setting shopt -s globstar.
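For example, in bash 4 or later:
shopt -s globstar
docco client/coffee/**/*.coffee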
When I grep my Subversion working copy directory, the results include a lot of files from the .svn directories. Is it possible to recursively grep a directory, but exclude all results from .svn directories?
If you have GNU Grep, it should work like this:
grep -R --exclude-dir=".svn" "whatever you like" .
If you happen to be on a Unix system without GNU grep, try the following:
grep -R "whatever you like" * | grep -v '\.svn/'
For grep >=2.5.1a
You can put this into your environment (e.g. .bashrc)
export GREP_OPTIONS='--exclude-dir=".svn"'
PS: thanks to Adrinan for pointing out the extra quotes in my version; they are not needed:
export GREP_OPTIONS='--exclude-dir=.svn'
PPS: This env option is marked for deprecation: https://www.gnu.org/software/grep/manual/html_node/Environment-Variables.html "As this causes problems when writing portable scripts, this feature will be removed in a future release of grep, and grep warns if it is used. Please use an alias or script instead."
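Following that advice, an alias gives the same effect without the environment variable:
alias grep='grep --exclude-dir=.svn'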
If you use ack (a 'better grep') it will handle this automatically (and do a lot of other clever things too!). It's well worth checking out.
psychoschlumpf is correct, but it only works if you have the latest version of grep. Earlier versions do not have the --exclude-dir option. However, if you have a very large codebase, double-grep-ing can take forever. Drop this in your .bashrc for a portable .svn-less grep:
alias sgrep='find . -path "*/.svn" -prune -o -type f -print0 | xargs -0 grep'
Now you can do this:
sgrep some_var
... and get expected results.
Of course, if you're an insane person like me who just has to use the same .bashrc everywhere, you could spend 4 hours writing an overcomplicated bash function to put there instead. Or, you could just wait for an insane person like me to post it online:
http://gist.github.com/573928
grep --exclude-dir=".svn"
works because the name ".svn" is fairly distinctive. But this can fail for a more generic name.
grep --exclude-dir="work"
is not bulletproof: if you have "/home/user/work" and "/home/user/stuff/work", it will skip both. It is not possible to write "/*/work/*" to restrict the exclusion to only the former folder.
As far as I could experiment, in GNU grep the simple --exclude won't exclude directories.
On my GNU grep 2.5, --exclude-dirs is not a valid option. As an alternative, this worked well for me:
grep -R --exclude="*.svn-base" "whatever you like" .
This should be a better solution than excluding all lines which contain .svn/ since it wouldn't accidentally filter out such lines in a real file.
Two greps will do the trick:
The first grep gets everything.
The second grep uses the output of the first as input (via the pipe). With the -v flag, grep selects the lines which DON'T match the search term. Voilà: you are left with all the output from the first grep that does not contain .svn in the file path.
-v, --invert-match
Invert the sense of matching, to select non-matching lines.
grep -r the_text_you_want_to_search_for * | grep -v '\.svn'
I tried double grep'ing on my huge code base and it took forever, so I got this solution with the help of my co-worker.
Pruning is much faster, as it stops find from descending into those directories at all, whereas grep -v processes everything and only excludes displaying the results:
find . -name .svn -prune -o -type f -print0 | xargs -0 egrep 'YOUR STRING'
You can also alias this command in your .bashrc as
alias sgrep='find . -name .svn -prune -o -type f -print0 | xargs -0 egrep '
Now simply use
sgrep 'whatever'
Another option, albeit one that may not be perceived as an acceptable answer, is to clone the repo into git and use git grep.
Rarely, I run into svn repositories that are so massive that it's just impractical to clone via git-svn. In those rare cases I use a double-grep solution, svngrep, but as many answers here indicate, this can be slow on large repositories and can exclude '.svn' occurrences that aren't directories. I would argue those are extremely rare, though...
Also, regarding the slow performance of multiple greps: once you've used something like git, pretty much everything in svn seems slow!
One last thing: my variation of svngrep passes through colorization; beware, the implementation is ugly! Roughly:
grep -rn "$what" $where | egrep -v "$ignore" | grep --color "$what"
For grep version 2.5.1 you can add multiple --exclude items to filter out the .svn files.
$ grep -V | grep grep
grep (GNU grep) 2.5.1
GREP_OPTIONS="--exclude=*.svn-base --exclude=entries --exclude=all-wcprops" grep -l -R whatever ./
I think the --exclude option for recursive grep is what you are searching for.