In a directory, is it possible to gzip only the files whose names contain "foo"? I can find them all with find . -name "*foo*", but I need a way to archive them.
Thank you.
I assume you mean archive into a single tar file? Not individual .gz files?
Try this (assuming the list isn't so long that xargs has to split it into several tar invocations, since each invocation would overwrite the previous archive):
find . -name "*foo*" | xargs tar cvzf archive.tar.gz
An alternative is to do something like:
find . -name "*foo*" > list.txt
tar cvzf archive.tar.gz -T list.txt # (-T reads the file list from a file; it works with GNU tar, but I don't think BSD tar supports it)
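If the filenames might contain spaces or newlines, a null-delimited variant is safer and also avoids the xargs size limit, since tar reads the whole list itself (a sketch that assumes GNU tar, which provides --null and -T -):
find . -name "*foo*" -print0 | tar --null -T - -czvf archive.tar.gz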
I'm on a Red Hat Linux 6 machine, running Elasticsearch and Logstash. I have a bunch of log files that were rotated daily from back in June until August. I am trying to figure out the best way to tar them up to save some disk space, without manually tarring up each one. I'm a bit of a newbie at scripting, so I was wondering if someone could help me out? The files are named elasticsearch-cluster.log.datestamp. Ideally they would each end up in their own tar file, so that it would be easier to go back and look at a particular day's logs if needed.
You could use a loop:
for file in elasticsearch-cluster.log.*
do
tar zcvf "$file".tar.gz "$file"
done
Or if you prefer a one-liner (this is recursive):
find . -name 'elasticsearch-cluster.log.*' -print0 | xargs -0 -I {} tar zcvf {}.tar.gz {}
Or, as @chepner mentions, with the -exec option:
find . -name 'elasticsearch-cluster.log.*' -exec tar zcvf {}.tar.gz {} \;
or, if you want to exclude files that are already zipped:
find . -name 'elasticsearch-cluster.log.*' -not -name '*.tar.gz' -exec tar zcvf {}.tar.gz {} \;
If you don't mind all the files being in a single tar.gz file, you can do:
tar zcvf backups.tar.gz elasticsearch-cluster.log.*
All these commands leave the original files in place. After you validate the tar.gz files, you can delete the originals manually.
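As a hedged sketch of that validate-then-delete step (assuming GNU tar, and that anything already ending in .tar.gz should be skipped):
for file in elasticsearch-cluster.log.*
do
    # skip archives created by a previous run
    case "$file" in *.tar.gz) continue ;; esac
    # create the archive, list it to confirm it is readable,
    # and only then delete the original
    tar zcf "$file".tar.gz "$file" &&
        tar tzf "$file".tar.gz > /dev/null &&
        rm -- "$file"
done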
For example, the current directory is /A/B/, and there are scripts with the suffixes .py and .sh in /A/B/C/, /A/B/C/D/ and /A/B/E/.
How can I generate a single compressed file that preserves the directory structure and contains the Python/shell scripts?
Use find with your compression, e.g.:
zip outfile -r `find . -name '*.py'` `find . -name '*.sh'`
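A single find can also feed zip directly through its -@ option, which reads the file list from standard input and avoids the backtick expansion above; a sketch (outfile.zip is just a placeholder name):
find . \( -name '*.py' -o -name '*.sh' \) -print | zip -@ outfile.zip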
find ./someDir -name "*.php" -o -name "*.html" | tar -cf my_archive -T -
as seen here in a question very similar to yours.
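Adapted to the .py/.sh question above and compressed in one step, that approach might look like this sketch (it assumes GNU tar for the -T - and -z options; scripts.tar.gz is a placeholder name):
find . \( -name '*.py' -o -name '*.sh' \) | tar -czf scripts.tar.gz -T -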
How to tar certain file types in all subdirectories?
I am trying to zip all PHP files including those in subfolders. But
zip -r PHP.zip *.php
only zips the PHP files in the current folder. If I do
zip -r ALL.zip *
it zips all files, including those in subfolders. So what is wrong with "*.php"?
You can use either
find -iname '*.php' -print0 | xargs -0 zip -r php.zip
or
zip -r php.zip . --include \*.php
Both of the above commands will do the job. The reason your original attempt only picked up the current folder is that the shell expands *.php before zip ever runs, so zip is handed only the matches from the current directory; -r recurses into directory arguments, it does not re-apply the pattern in subfolders (that is what --include does).
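If your shell is bash 4 or later (an assumption, so treat this as a sketch), the globstar option is another way to make a pattern reach into subdirectories; note it may hit the argument-length limit on very large trees:
shopt -s globstar    # bash 4+: ** matches files at any depth
zip php.zip **/*.php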
I have a directory with many sub-directories. In some of those sub-directories I have files with *.asc extension and some with *.xdr.
I want to create a SINGLE tarball/gzip file which maintains the directory structure but excludes all files with the *.xdr extension.
How can I do this?
I did something like find . -depth -name *.asc -exec gzip -r9 {} + but this gzips every *.asc file individually which is not what I want to do.
You need to use the --exclude option:
tar -zc -f test.tar.gz --exclude='*.xdr' *
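As a quick sanity check (not part of the answer itself), you can list the archive afterwards and confirm no .xdr files slipped in:
tar -tzf test.tar.gz | grep '\.xdr$'    # should print nothing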
gzip will always handle files individually. If you want a bundled archive you will have to tar the files first and then gzip the result, hence you will end up with a .tar.gz file or .tgz for short.
To get better control over what you are doing, you can first find the files using the command you already posted (with -print instead of the gzip command) and put them into a file, then use that file (filelist.txt here) to tell tar what to archive:
tar -T filelist.txt -c -v -f myarchive.tar
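Put together, the two-step approach might look like this sketch (filelist.txt and myarchive are just the placeholder names from above; adding -z compresses in the same step, assuming GNU tar):
find . -name '*.asc' -print > filelist.txt           # collect the files to archive
tar -T filelist.txt -c -z -v -f myarchive.tar.gz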
I have a directory. It has about 500K .gz files.
How can I extract all .gz in that directory and delete the .gz files?
This should do it, although with roughly 500K files the expanded glob may exceed the shell's argument-list limit and fail with "Argument list too long" (the find-based answers below avoid that). Note that gunzip deletes each .gz file after extracting it, so this also covers the delete step:
gunzip *.gz
@techedemic's answer is correct but is missing the '.' that names the current directory, and this command goes through all subdirectories:
find . -name '*.gz' -exec gunzip '{}' \;
There's more than one way to do this obviously.
# This will find files recursively (you can limit it with additional 'find' parameters; see the man pages)
# The trailing \; is required to terminate the -exec command
find . -name '*.gz' -exec gunzip '{}' \;
# This will do it only in the current directory
for a in *.gz; do gunzip "$a"; done
I'm sure there are other ways as well, but this is probably the simplest.
And to remove any leftover .gz files, just run rm *.gz in the applicable directory (gunzip already deletes each .gz it successfully extracts, so there is usually nothing left to remove).
Extract all .gz files in the current directory and its subdirectories:
find . -name "*.gz" | xargs gunzip
If you want to extract a single file use:
gunzip file.gz
It will extract the file and remove the .gz file.
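With a directory this large it may also be worth batching and parallelising the work; a sketch using xargs (the -P flag for parallel jobs exists in GNU and BSD xargs, but check yours):
# -n 100 hands each gunzip 100 files, -P 4 runs four gunzip processes at once
find . -maxdepth 1 -name '*.gz' -print0 | xargs -0 -n 100 -P 4 gunzip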
If the .gz files are actually gzipped tarballs (.tar.gz), you can loop over them and untar each one:
for foo in *.gz
do
tar xf "$foo"
rm "$foo"
done
Try:
ls -1 | grep -E "\.tar\.gz$" | xargs -n 1 tar xvfz
Then try:
ls -1 | grep -E "\.tar\.gz$" | xargs -n 1 rm
This will untar all .tar.gz files in the current directory and then delete all the .tar.gz files. If you want an explanation, the "|" takes the stdout of the command before it, and uses that as the stdin of the command after it. Use "man command" w/o the quotes to figure out what those commands and arguments do. Or, you can research online.
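Parsing ls output breaks on filenames containing spaces; here is a find-based sketch that extracts each archive and deletes it only if tar succeeded (it relies on -maxdepth and -delete, which GNU and BSD find both provide):
find . -maxdepth 1 -name '*.tar.gz' -exec tar xvzf {} \; -delete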