I'm using Amazon to serve a few files. I want to gzip these files before I upload them.
First I copy the templates into a new folder:
cp -r templates/ templatesGZIP
then I gzip that folder:
gzip -r templatesGZIP
The problem is that this appends .gz to every file name, so for example homeTemplate.html becomes homeTemplate.html.gz.
Is there any way, when running gzip -r templatesGZIP, to state that I want to keep the extensions the same?
Thanks
Dan
Bash script: Gzip an entire folder and keep files extensions same
This would surely help: first compress them with gzip, then rename them.
Thanks & Regards,
Alok
gzip does just one thing: it turns a single file into a .gz file. What you need is a .tar.gz archive. Your friend here is tar, which can use gzip as well:
cp -r templates templatesGZIP
tar czf templatesGZIP.tar.gz templatesGZIP
Background: tar also does one thing well: it turns a directory structure into a single file. The tar flags above, explained:
c = create a new archive
z = compress it, gzip by default
f FILE = name of the archive file
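A minimal, self-contained sketch of the approach (a throwaway templates directory is created here so the example runs on its own; the file names are just placeholders):

```shell
tmp=$(mktemp -d)
cd "$tmp"
mkdir -p templates
echo '<html></html>' > templates/homeTemplate.html

tar czf templates.tar.gz templates   # c = create, z = gzip, f = archive name
tar tzf templates.tar.gz             # t = list the archive contents to verify
```

The listing shows the directory structure is preserved inside the single .tar.gz file.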
After copying the directory:
find templatesGZIP -type f ! -wholename '*images*' -exec gzip {} \; -exec mv {}.gz {} \;
(The pattern is quoted so the shell doesn't expand it before find sees it.)
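A runnable sketch of this gzip-then-rename approach, using a hypothetical layout with one HTML file plus an images/ subtree that the `! -wholename` test leaves untouched:

```shell
tmp=$(mktemp -d)
cd "$tmp"
mkdir -p templatesGZIP/images
echo '<html></html>' > templatesGZIP/homeTemplate.html
echo 'png-bytes' > templatesGZIP/images/logo.png

# Compress every file outside images/, then rename the .gz back over
# the original name so extensions stay the same.
find templatesGZIP -type f ! -wholename '*images*' \
    -exec gzip {} \; -exec mv {}.gz {} \;

ls templatesGZIP                        # homeTemplate.html keeps its name...
gzip -dc templatesGZIP/homeTemplate.html   # ...but now holds gzip data
```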
You can use stdout as a temporary buffer and write the output to a file of your choice:
gzip -cr templatesGZIP > outputfile.extension
There is no option in gzip to do this. Using the two-directory method, the following should do the trick
for in_file in $(find templates -type f)
do
out_file="templatesGZIP/${in_file#templates/}"
mkdir -p "$(dirname "$out_file")"
gzip -c <"$in_file" >"$out_file"
done
With this method, there's no need for the cp command.
This has been tested.
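One caveat worth sketching: the `$(find ...)` loop above splits paths on whitespace. A variant of the same two-directory idea using `-print0` and `read -d ''` handles such paths too (the "sub dir" name here is just a contrived example):

```shell
tmp=$(mktemp -d)
cd "$tmp"
mkdir -p "templates/sub dir"
echo 'hello' > "templates/sub dir/page.html"

# NUL-delimited paths survive spaces that an unquoted $(find ...) would split.
find templates -type f -print0 |
while IFS= read -r -d '' in_file
do
    out_file="templatesGZIP/${in_file#templates/}"
    mkdir -p "$(dirname "$out_file")"
    gzip -c <"$in_file" >"$out_file"
done

gzip -dc "templatesGZIP/sub dir/page.html"
```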
One comment on this: it's uncommon to see gzipped files without a .gz extension.
Related
My requirement is to send attachments in mail from Unix.
There are many attachments, so I need to zip all the files into a single archive.
So far I tried:
gzip -c abc.txt > xyz.gz
gzip -c cde.txt >> xyz.gz
But the result, when decompressed, behaves as if I had run:
cat abc.txt cde.txt
You can use the command below.
gzip -c abc.txt cde.txt > xyz.gz
If you want more information about gzip, see its man page.
The common way to do this would be to bundle the files into a .tar file, then gzip it. This can be done by the tar command alone:
tar -cvzf xyz.tar.gz abc.txt cde.txt
The -z flag tells tar to gzip its output.
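A self-contained sketch, reusing the file names from the question (the contents are placeholders):

```shell
tmp=$(mktemp -d)
cd "$tmp"
echo 'first attachment' > abc.txt
echo 'second attachment' > cde.txt

tar -cvzf xyz.tar.gz abc.txt cde.txt   # bundle both files and gzip in one step
tar -tzf xyz.tar.gz                    # list members to confirm both are inside
```

Unlike concatenated `gzip -c` output, extracting this archive gives back the two files separately.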
I have a folder of files and they will be in a pattern similar to this:
original.jpg
original.200px.jpg
original.300px.jpg
original.preview.jpg
original.slider.jpg
filetwo.jpg
filetwo.200px.jpg
filetwo.300px.jpg
filetwo.preview.jpg
filetwo.slider.jpg
imagethree.jpg
imagethree.200px.jpg
imagethree.300px.jpg
imagethree.preview.jpg
imagethree.slider.jpg
I want to ONLY select the original file (original.jpg, filetwo.jpg, imagethree.jpg) and omit the server generated files. I'm trying to create a tar file of just those original files and not the dynamically generated copies.
tar -tf file.tar --wildcards '*.jpg' --exclude '*.*.jpg'
Output:
filetwo.jpg
imagethree.jpg
original.jpg
Just change -t to -x to extract instead.
To create the archive:
tar -cf file.tar --wildcards --exclude '*.*.jpg' *.jpg
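A quick way to check the exclusion pattern, using a few touch-created stand-ins for the generated images (options are placed before the file operands so GNU tar honors them regardless of environment):

```shell
tmp=$(mktemp -d)
cd "$tmp"
touch original.jpg original.200px.jpg filetwo.jpg filetwo.preview.jpg

# '*.*.jpg' matches anything with a second dot before .jpg, i.e. the
# server-generated variants, so only the originals are archived.
tar -cf file.tar --wildcards --exclude '*.*.jpg' *.jpg
tar -tf file.tar
```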
I have a directory with many sub-directories. In some of those sub-directories I have files with *.asc extension and some with *.xdr.
I want to create a SINGLE tarball/gzip file which maintains the directory structure but excludes all files with the *.xdr extension.
How can I do this?
I tried something like find . -depth -name '*.asc' -exec gzip -r9 {} + but this gzips every *.asc file individually, which is not what I want to do.
You need to use the --exclude option:
tar -zc -f test.tar.gz --exclude='*.xdr' *
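A self-contained sketch of that command against a hypothetical sub-directory with one file of each type:

```shell
tmp=$(mktemp -d)
cd "$tmp"
mkdir -p sub
echo 'keep me' > sub/data.asc
echo 'skip me' > sub/data.xdr

tar -zc -f test.tar.gz --exclude='*.xdr' *
tar -tzf test.tar.gz   # sub/data.xdr should be absent from the listing
```

The directory structure is kept; only the *.xdr files are filtered out.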
gzip will always handle files individually. If you want a bundled archive you will have to tar the files first and then gzip the result, hence you will end up with a .tar.gz file or .tgz for short.
To get better control over what you are doing, you can first find the files using the command you already posted (with -print instead of the gzip command) and write them into a file, then use that file (filelist.txt) to tell tar what to archive:
tar -T filelist.txt -c -v -f myarchive.tar
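Both steps together, as a runnable sketch over an assumed directory layout (and compressing with -z so the result is a single .tar.gz):

```shell
tmp=$(mktemp -d)
cd "$tmp"
mkdir -p deep/dir
echo 'ascii data'  > deep/dir/run1.asc
echo 'binary data' > deep/dir/run1.xdr

# -print (instead of -exec gzip ...) writes matching paths to a file;
# quoting '*.asc' keeps the shell from expanding the pattern early.
find . -type f -name '*.asc' -print > filelist.txt
tar -T filelist.txt -c -z -v -f myarchive.tar.gz
tar -tzf myarchive.tar.gz   # only the .asc file, with its path, is inside
```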
In a directory, is it possible to gzip only files whose names contain "foo"? I can find them all with find . -name "*foo*", but I need a way to archive them.
Thank you.
I assume you mean archive into a single tar file? Not individual .gz files?
Try this: (assumes there aren't too many files)
find . -name "*foo*" | xargs tar cvzf archive.tar.gz
An alternative is to do something like:
find . -name "*foo*" > list.txt
tar cvzf archive.tar.gz -T list.txt #(works only with gnu tar, not bsd i think)
I have a bunch of zip files, and I'm trying to make a bash script to automate the unzipping of certain files from it.
Thing is, although I know the name of the file I want, I don't know the name of the folder it's in; it is one level deep.
How can I extract these files, preferably discarding the folder?
Here's how to unzip any given file at any depth and junk the folder paths on the way out:
unzip -j somezip.zip '*somefile.txt'
The -j junks any folder structure in the zip file and the asterisk gives a wildcard to match along any path.
If you're in:
some_directory/
and the zip files are in any number of subdirectories, say:
some_directory/foo
find ./ -name myfile.zip -exec unzip {} -d /directory \;
Edit: As for the second part, removing the directory that contained the zip file I assume?
find ./ -name myfile.zip -exec unzip {} -d /directory \; -exec sh -c 'echo rm -rf "$(dirname "$1")"' _ {} \;
Notice the "echo": that's a sanity check. I always echo first when executing something destructive like rm -rf in a loop/iterative sequence like this. (The dirname call is wrapped in sh -c so it runs once per found file; a plain backquoted dirname {} would be expanded by the shell before find ever runs.) Good luck!
Have you tried unzip somefile.zip "*/blah.txt"?
You can use find to find the file that you need to unzip, and xargs to call unzip:
find /path/to/root/ -name 'zipname.zip' -print0 | xargs -0 unzip
-print0 makes the command work with files or paths that contain whitespace; -0 is the xargs option that makes it consume that NUL-delimited output.