I have been trying to gzip .BRIK files on my iMac. However, the files with this extension are scattered everywhere and nested multiple levels deep. What I have been doing is going folder by folder and using this:
gzip *BRIK
However, it is tedious and will take a long time to do one folder at a time. I also tried
gzip -r *BRIK
or
gzip -r *BRIK ./
They did not work. Any suggestions?
The tools you're looking for are find, to discover the files, and xargs, to call gzip with their names.
find . -name '*.BRIK' -print0 | xargs -0 gzip
The use of -print0 and -0 here allows this to work smoothly with directories and files with spaces in their names.
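If you'd rather not involve xargs at all, find can hand the matches to gzip itself (a minor variation on the same idea, not part of the original answer):
find . -name '*.BRIK' -exec gzip {} +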
In addition to find, recursive globbing in zsh is super handy and easy to use. In your case, in zsh, you can simply run:
gzip **/*.BRIK
(Since Catalina, zsh has been the default shell in macOS.)
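If some matches might be directories rather than regular files, a zsh glob qualifier can filter them out (an optional refinement, not part of the original answer):
gzip **/*.BRIK(.)
Here the (.) qualifier restricts the glob to plain files.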
Related
Here is the situation: I have a folder containing a lot of subfolders, some of which contain .gz compressed files (NOT tar, just compressed text files). I want to recursively decompress all these .gz files into the root folder, but I can't figure out the exact way to do it.
My folders look like this:
/folderX/subfolder1/file.gz
by using
gzip -c -d -r *.gz
I can probably extract all the files at once, but they will remain in their respective subfolders. I want them all in /folderX/
find -name *.gz
gives me the correct list of the files I am looking for, but I have no idea how to combine the two commands. Should I combine them in a script? Or is there some gzip functionality I have missed that allows decompressing everything into the folder from which you run the command?
Thanks for the help!
You can use a while...done loop that iterates over the input:
find dirname -name '*.gz' | while IFS= read -r i; do gzip -c -d "$i"; done
You can also use xargs; with its -0 parameter it has the additional benefit of handling spaces (" ") in file names:
find dirname -name '*.gz' -print0 | xargs -0 -L 1 gzip -c -d -r
The "-print0" output all the files found separated by NULL character. The -0 switch of xargs rebuild the list parsing the NULL character and applies the "gzip..." command to each of them. Pay attention to the "-L 1" parameter which tells xargs to pass only ONE file at a time to gzip.
Alright, so I have a web server running CentOS at work that is hosting a few websites internally only. It's our development server and thus has lots [read: tons] of old junk websites and whatnot.
I was trying to put together a command that would find files that haven't been modified for over 6 months, group them all in a tarball, and then delete them. So far I have tried many different kinds of find commands with various arguments. Our structure looks like this:
/var/www/joomla/username/fileshere/temp
/var/www/username/fileshere
So I tried something along the lines of:
find /var/www -mtime -900 ! -mtime -180 | xargs tar -cf test4.tar
Only to end up with a 10 MB tar, when the expected result would be over 50 GB.
I tried using gzip instead, but I ended up zipping MY WHOLE SERVER, thus making it unusable; I had to transfer the whole filesystem, reinstall a completely new server, and go through a lot of trouble and... you get the idea. So I want to find the right command that won't blow up our server but will find all FILES and DIRECTORIES that haven't been modified for over 6 months.
Be careful with ctime.
ctime is related to changes made to inodes (changing permissions, owner, etc.)
atime is when a file was last accessed (check whether your file system is mounted with the noatime or relatime options; in that case atime may not work in the expected way)
mtime is when the data in a file was last modified.
Depending on what you are trying to do, the mtime option could be your best choice.
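As a quick way to see whether noatime or relatime is in play on your mounts (my addition, not part of the original answer):
mount | grep -E 'noatime|relatime'
If the filesystem holding /var/www shows up in that output, atime-based tests will not behave the way you might expect.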
Besides, you should check out the -print0 option. From man find:
-print0
True; print the full file name on the standard output, followed by a null character (instead of the newline character that -print uses). This allows file names that contain newlines or other types of white space to be correctly interpreted by programs that process the find output. This option corresponds to the -0 option of xargs.
I do not know exactly what you are trying to do, but this command could be useful for you:
find /var/www -mtime +180 -print0 | xargs -0 tar -czf example.tar.gz
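One caveat worth adding (my note, assuming GNU tar as shipped with CentOS): if the file list is long enough that xargs splits it across several invocations, each tar run overwrites the archive created by the previous one. Letting tar read the NUL-separated list itself avoids that:
find /var/www -mtime +180 -print0 | tar --null -czf example.tar.gz -T -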
Try this:
find /var/www -ctime +180 | xargs tar cf test.tar
The ctime test compares the current time with each file's change time (ctime), and if you use + instead of - it will give you the files changed more than x days ago.
Then just pass it to tar with xargs and you should be set.
I want to generate docs for CoffeeScript files. I want to use Docco.
When I use:
docco client/coffee/*
it throws an error, I think because there are folders in the file list.
When I use:
docco client/coffee/*.coffee
it can't find some files, because I don't have anything in the root folder.
How can I pass all *.coffee files recursively to a command in the console?
There are several ways to do it
$ find client/coffee/ -name '*.coffee' -exec docco {} +
$ find client/coffee/ -name '*.coffee' | xargs docco
However, note that the latter does not work if there is a space in a file name, unless you use find -print0 in combination with xargs -0.
Additionally, if you are using bash, you can use **/*.coffee after setting shopt -s globstar.
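For completeness, that globstar route looks like this (a minimal sketch, assuming bash 4 or later, where globstar is available):
shopt -s globstar
docco client/coffee/**/*.coffee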
I have a backup location which uses hard links to store existing or changed files. The layout of these backups mimics the Linux file system, with a date part in it.
For example, I have files
/backup/servername/2012-06-26T00.43.01/www.website.com/file1.html
/backup/servername/2012-06-26T06.43.01/www.website.com/file1.html
/backup/servername/2012-06-26T06.43.01/www.website.com/file2.html
/backup/servername/2012-06-26T12.43.01/www.website.com/file1.html
/backup/servername/2012-06-26T12.43.01/www.website.com/file2.html
How can I find all files which have www.website.com in their path, so I can delete them?
I have this command combination to delete files that I can find with find, but I can't figure out how to find these particular files.
find . -name 'filename.*' -print0 | xargs -0 rm
You're being a little loose with your terminology, so it's kind of tough to understand exactly what you want. However, if I understood you correctly, you want to delete all the files within a directory called www.website.com:
find . -wholename '*/www.website.com/*.html' -delete
If I understood you right, you can use something like this:
find /backup/servername/2012-06-26T12.43.01/www.website.com/ -iname '*file*' -print0 | xargs -0 rm
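If the goal is to catch www.website.com under every timestamped backup directory rather than a single snapshot (which is how I read the question), a -path pattern over the whole backup tree is another option; the paths below are just the ones from the example:
find /backup/servername -path '*/www.website.com/*' -type f -print0 | xargs -0 rm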
Can anyone give me a bash script or one-line command I can run on Linux to recursively go through each folder from the current folder and delete all files or directories starting with '._'?
Change directory to the root directory you want (or change . to the directory) and execute:
find . -name "._*" -print0 | xargs -0 rm -rf
xargs allows you to pass several parameters to a single command, so it will be faster than using the find -exec ... \; syntax. Also, you can run the find part alone first (everything before the |) to view the files it will delete and make sure it is safe.
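For reference, that preview step is just the find half of the pipeline:
find . -name "._*"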
find . -name '._*' -exec rm -Rf {} \;
I had a similar problem a while ago (I assume you are trying to clean up a drive that was connected to a Mac, which creates a lot of these files), so I wrote a simple Python script which deletes these and other useless files; maybe it will be useful to you:
http://github.com/houbysoft/short/blob/master/tidy
find /path -name "._*" -exec rm -fr "{}" +;
Instead of deleting the AppleDouble files, you could merge them with the corresponding files. You can use dot_clean.
dot_clean -- Merge ._* files with corresponding native files.
For each dir, dot_clean recursively merges all ._* files with their corresponding native files according to the rules specified with the given arguments. By default, if there is an attribute on the native file that is also present in the ._ file, the most recent attribute will be used.
If no operands are given, a usage message is output. If more than one directory is given, directories are merged in the order in which they are specified.
Because dot_clean works recursively by default, use:
dot_clean <directory>
If you want to turn off the recursive merge, use -f for a flat merge.
dot_clean -f <directory>
find . -name '._*' -delete
A bit shorter, and it performs better in the case of an extremely long list of files.