Using IIS, how can we check the size of a script file, e.g. the jQuery library, after gzip compression?
On the OS X or Linux command line:
gzip app.js && du -h app.js.gz
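If you'd rather not replace the original file, a non-destructive sketch (app.js is just a placeholder name) is to compress to standard output and count the bytes:
gzip -c app.js | wc -c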
I need to change the compression level for bzip2 compression with tar. I found that we can set the compression level and then run the tar command to compress. I tried different compression levels using the following command, but it seems that BZIP2=-<compression level> does not change the compression level.
BZIP2=-1
tar -cjvf <output_file> <input_file>
How to do it correctly?
Since you tagged this "linux", I will assume that you are using GNU tar. Then you can pass the compression command, with options, using -I:
tar -I 'bzip2 -1' -cvf out.tar.bz2 files
(Note: with the short option -I, don't write an = sign, or the = becomes part of the command name; the equivalent long form is --use-compress-program='bzip2 -1'.)
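As an aside, the BZIP2 environment variable from your question can also work, since bzip2 reads default options from it, but the variable has to be visible to the bzip2 process that tar spawns, i.e. exported or set on the same command line:
BZIP2=-1 tar -cjvf out.tar.bz2 files
(or export BZIP2=-1 first, then run tar -cjvf as usual).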
This command works fine on my local Linux machine:
gzip -d omega_data_path_2016-08-10.csv.gz
I would like to decompress a file with the extension .csv.gz to an HDFS location.
I tried the command below and I get this error:
[cloudera#client08 localinputfiles]$ gzip -d omega_data_path_2016-08-10.csv.gz | hadoop dfs -put /user/cloudera/inputfiles/
gzip: omega_data_path_2016-08-10.csv already exists; do you wish to overwrite (y or n)? DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
put: `/user/cloudera/inputfiles/': No such file or directory
Could someone help me to fix this?
To make gzip write the output to standard output, use the -c flag.
So the command would be:
gzip -dc omega_data_path_2016-08-10.csv.gz | hdfs dfs -put - /user/cloudera/omega_data_path_2016-08-10.csv
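The trailing - tells hdfs dfs -put to read from standard input. If you have several such files, a hypothetical batch variant (local directory and HDFS paths assumed from the question):
for f in *.csv.gz; do
  gzip -dc "$f" | hdfs dfs -put - "/user/cloudera/inputfiles/${f%.gz}"
done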
I have a simple bash script to download a lot of log files over a pretty slow network. I can compress the logs on the remote side. Basically it's:
ssh: compress whole directory
scp: download archive
ssh: rm archive
Using lzma gives great compression, but compressing the whole directory is slow. Is there any tool, or an easy way to write a script, that lets me compress single files (or a bunch of files) and start downloading them while other files/chunks are still being compressed? I was thinking about launching compression for every single file in the background, and downloading/rsyncing the files with the correct extension in a loop. But then I don't know how to check whether a compression process has finished its work.
The easiest way would be to compress them in transit using ssh -C. However, if you have a large number of small files, you are better off tarring and gzipping/bzipping the whole directory at once using tar zcf or tar jcf. You may be able to start downloading the file while it's still being written, though I haven't tried it.
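If you don't want to wait for the archive to finish before transferring, a minimal streaming sketch (host and paths are placeholders) compresses on the remote side and writes straight down the ssh pipe, so the transfer overlaps with the compression:
ssh user@example.com 'tar cjf - /var/log/myapp' > logs.tar.bz2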
The best solution I found is here. In my case it was:
ssh -T user@example.com 'tar ... | lzma -5 -' > big.compressed
Try sshing into your server, going to the log directory, and using GNU Parallel to compress all the logs in parallel; as each one is compressed, its name is changed to add the .done suffix so you can rsync it. So, on the server you would run:
cd <LOG DIRECTORY>
rm -f ALL_COMPRESSED.marker
parallel 'lzma {}; mv {}.lzma {}.lzma.done' ::: *.log
touch ALL_COMPRESSED.marker
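Then on your own machine, a hypothetical polling loop (host and paths are placeholders) keeps pulling finished files until the marker appears:
while ! ssh user@example.com 'test -f /var/log/myapp/ALL_COMPRESSED.marker'; do
  rsync -av 'user@example.com:/var/log/myapp/*.done' ./logs/   # errors harmlessly until the first .done exists
  sleep 10
done
rsync -av 'user@example.com:/var/log/myapp/*.done' ./logs/   # one final pass after the marker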
It seems that if I run gzip somefile.js, I get somefile.js.gz as the output.
However, I need to preserve the original file extension, i.e. for somefile.js, I need it to remain somefile.js after gzipping.
How can I do that?
Edit: To be clear, I need to do this for dozens of files, so I can't just mv each one. I simply want to gzip all the static css / js files and then upload them to my CDN, so I can serve them as regular js / css files.
If you really want to do so, you could simply use a for construct, which exists in almost every shell (even in cmd.exe!). In Bourne or POSIX sh flavour, it gives:
for file in *.js *.css ; do gzip "$file" ; mv "$file.gz" "$file"; done
In Windows cmd.exe it would be (provided you've got a gzip command in your path; note that cmd.exe loop variables are single letters, and you'd double the % in a batch file):
for %f in (*.js *.css) do gzip %f && move %f.gz %f
But BEWARE: as others warned you, you will have binary gzipped files named foo.js or fee.css. If you serve them to standard browsers without a Content-Encoding: gzip header, it definitely will not work!
Be sure to make a backup copy before trying this: it can easily be reversed (gunzip), but you could at least lose time...
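If you want that backup built into the loop, a minimal sketch (the backup/ directory name is just an example):
mkdir -p backup
for file in *.js *.css ; do
  cp -p "$file" backup/ && gzip "$file" && mv "$file.gz" "$file"
done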
EDIT: added quotes to the shell command, as suggested by gniourf_gniourf.
Can someone please explain to me how to use ">" and "|" in Linux commands, and convert these three lines into one line of code?
mysqldump --user=*** --password=*** $db --single-transaction -R > ${db}-$(date +%m-%d-%y).sql
tar -cf ${db}-$(date +%m-%d-%y).sql.tar ${db}-$(date +%m-%d-%y).sql
gzip ${db}-$(date +%m-%d-%y).sql.tar
rm ${db}-$(date +%m-%d-%y).sql   # after conversion I guess this line will be useless
The GNU tar program can itself do the compression normally done by gzip; you can use the -z flag to enable this. So the tar and gzip steps could be combined into:
tar -zcf ${db}-$(date +%m-%d-%y).sql.tar.gz ${db}-$(date +%m-%d-%y).sql
Getting tar to read from standard input for archiving is not a simple task, but I would question its necessity in this particular case.
The intent of tar is to package up a multitude of files into a single archive file but, since you're only processing one file (the output stream from mysqldump), you don't need to tar it up; you can just pipe it straight into gzip itself:
mysqldump blah blah | gzip > ${db}-$(date +%m-%d-%y).sql.gz
That's because gzip will compress standard input to standard output if you don't give it any file names.
This removes the need for any (possibly very large) temporary files during the compression process.
You can use the following script:
#!/bin/sh
USER="***"
PASS="***"
DB="***"
mysqldump --user="$USER" --password="$PASS" "$DB" --single-transaction -R | gzip > "${DB}-$(date +%m-%d-%y).sql.gz"
You can learn more about "|" here: http://en.wikipedia.org/wiki/Pipeline_(Unix). In short, this construction connects the standard output of the mysqldump command to the standard input of the gzip command; the pipeline feeds one command's output into the other's input.
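As for ">": it redirects a command's standard output into a file instead of the terminal, for example:
echo "hello" > hello.txt   # writes hello into hello.txt, overwriting any previous contents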
I don't see the point in using tar: you just have one file, and for compression you call gzip explicitly. tar is used to archive/pack multiple files into one.
Your command line should be (the dump command is shortened, but I guess you will get it):
mysqldump .... | gzip > filename.sql.gz
To chain the commands together on one line, I'd put && between them. That way, if one fails, the rest stop executing. You could also use a semicolon after each command, in which case each will run regardless of whether the prior command fails.
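For example (cmd1 and cmd2 stand in for any two commands):
cmd1 && cmd2   # cmd2 runs only if cmd1 succeeds
cmd1 ; cmd2    # cmd2 runs no matter what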
You should also know that tar will do the gzip for you with the z option, so you don't need the extra command.
Paxdiablo makes a good point that you can just pipe mysqldump directly into gzip.