How to gzip a file [closed] - linux

How can I gzip this file? I get an error:
find /users/tnea01/logfile10.log -type f -exec sh -c \ 'gunzip /users/tnea01/logfile_archive/$(basename $0)_$(date -r {} +%F).gz $0' {} \;
Here is the error I get:
gzip: /users/tnea01/logfile10.tar.gz: No such file or directory gzip: /users/tnea01/logfile10.log: unknown suffix -- ignored

If you didn't know the exact filename, you might do something like this:
find /users/tnea01 -maxdepth 1 -name '*.log' -type f -exec sh -c \
'for f; do
gzip -c <"$f" >"/users/tnea01/logfile_archive/${f##*/}_$(date -r "$f" +%F).gz"
done' _ {} +
To explain the moving pieces:
The only secure way to use sh -c is with a completely constant command string; substituting variables into it (including filenames) creates shell-injection vulnerabilities. Thus, we don't substitute anything into the code; we pass the filename(s) as extra arguments instead (see the sketch after this list).
for f; do is the same as for f in "$@"; do -- it iterates over all command-line arguments.
${f##*/} evaluates to everything after the last / in $f; see the bash-hackers page on parameter expansion.
Expansions, including $(date ...), need to be inside a double-quoted context to be safe; here, we're putting the entire destination filename in such quotes.
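To make the injection risk concrete, here is a minimal sketch (the malicious filename is hypothetical):
# Hypothetical malicious filename:
f='$(rm -rf ~).log'
# UNSAFE: the filename is pasted into the code string, so the inner
# shell performs the $( ... ) command substitution inside the name:
sh -c "gzip -c <$f >out.gz"
# SAFE: the code string is constant; the filename arrives as "$1"
# and is only ever treated as data:
sh -c 'gzip -c <"$1" >out.gz' _ "$f"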
However, since you do know the exact filename, all of that is needless:
f=/users/tnea01/logfile10.log
d=/users/tnea01/logfile_archive
gzip -c <"$f" >"$d/${f##*/}_$(date -r "$f" +%F).gz"
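For instance, assuming logfile10.log was last modified on 2019-03-05 (a hypothetical date), date -r "$f" +%F expands to 2019-03-05 and the command creates:
/users/tnea01/logfile_archive/logfile10.log_2019-03-05.gz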

Related

Rich globbing `ls [G-S]*` in fish shell? [closed]

In Bash it is possible to
ls [G-S]*
and it would list all files starting with g-s and G-S.
How is that done in Fish shell?
Fish currently does not support a rich glob syntax. The current thinking is that a glob command should be added, in keeping with the fish goal of doing things via commands rather than magic syntax; see, for example, https://github.com/fish-shell/fish-shell/issues/3681. In the meantime, the solution is to create a function that filters the results. For example, the ** glob matches all files and directories in and below the CWD. I frequently want just the plain files and want to ignore the .git subdirectory, so I wrote this function:
function ff --description 'Like ** but only returns plain files.'
    # Prune .git directories, keep plain files, then strip the
    # leading "./" from each path (and skip any stray .git entries).
    find . \( -name .git -type d -prune \) -o -type f | sed -n -e '/\/\.git$/n' -e 's/^\.\///p'
end
Which I can then use like this: grep something (ff). You could create a similar function that uses the find -name pattern matching feature or filter the results with string match --regex.
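For the [G-S]* case specifically, a minimal sketch along those lines (the function name lsrange is hypothetical):
function lsrange --description 'List entries whose names start with G-S or g-s'
    for f in *
        # string match -q only sets the exit status; print on success
        string match -q --regex '^[G-Sg-s]' -- $f; and echo $f
    end
end
You would then run lsrange in place of ls [G-S]*.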
You can use find -iregex "./[G-S].*". Fish is quite limited in this regard.

Find empty files, if found update files with [closed]

Can someone with more Linux knowledge answer this for me?
On our web server, we host and run a lot of web scripts.
We control these via datestamp files, so a script is not over-run or run more than once.
A lot of the files are 0 KB. I wanted to know if there is a quick way in Linux to locate these files and update them.
I have located the files using:
find /var/www/vhosts/DOMAINNAME.co.uk/httpdocs -name "datestamp.*" -type f -empty
I have a long list of files. Can I update them with a simple datestamp format:
i.e.
20150923114046
You can use the -exec option of find, passing each filename to bash as an argument rather than splicing {} into the code string (which would carry the shell-injection risk discussed above):
find /var/www/vhosts/DOMAINNAME.co.uk/httpdocs -name "datestamp.*" -type f -empty \
    -exec bash -c 'echo 20150923114046 > "$1"' _ {} \;
To get the timestamp dynamically, use date:
bash -c 'date +%Y%m%d%H%M%S > "$1"' _ {}
To use the file's own last-modified timestamp, use date's -r option; the command substitution is expanded before the redirection truncates the file, so date still sees the original mtime:
bash -c 'echo "$(date +%Y%m%d%H%M%S -r "$1")" > "$1"' _ {}
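Putting it together, a sketch assuming GNU date and bash, stamping every empty datestamp file with its own last-modified time:
find /var/www/vhosts/DOMAINNAME.co.uk/httpdocs -name "datestamp.*" -type f -empty \
    -exec bash -c 'for f; do echo "$(date +%Y%m%d%H%M%S -r "$f")" > "$f"; done' _ {} +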

How to search for a string in entire linux system? [closed]

I am looking to search for a string, for example "/uniquexx", across the entire hard drive and find the files where it is referenced. How can I do that? I tried grep and find / but had no luck.
grep -r blablastring /
-r makes grep search recursively through all subdirectories.
If you want to search only text files, try ack. It's like grep, but defaults to skipping file types it recognizes as binary. It also highlights matches by default, when searching recursively in a directory.
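For example (assuming ack is installed):
ack '/uniquexx' /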
Some answers suggest combining find with grep. What that actually does is build a list of every file on the system and then execute grep once for each file, which is slow. It is faster to narrow the candidates first and batch them with xargs:
find / -xdev '(' -type f -a -name '*.txt' -a -size -2M -a -mtime -5 ')' -print0 | xargs -0 grep -H "800x600"
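For comparison, a sketch of the same search using find's built-in batching (-exec ... {} +), with grep -l to print just the filenames:
find / -xdev -type f -name '*.txt' -size -2M -exec grep -l "800x600" {} +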
Read more: How to search text throughout entire file system?
You can try:
grep -r -H "your string" /home/yourdir
-H makes grep print the name of each file that contains your string.
If you want to search the whole Linux filesystem, you will need sudo privileges.

Trying to rename .JPG to .jpg in shell CLI [closed]

I'm trying to rename all files in a directory from having the .JPG ext to .jpg but it isn't working.
I have looked around the net and found a few things but I can't seem to get any to work. The latest one I tried was:
rename -n .JPG .jpg *.JPG
I used the -n flag to see what would be modified but I got no response (no files).
What am I doing wrong here!?
If you don't want to use rename (you mention you have tried various things), you can do this with standard shell tools:
for x in `find . -maxdepth 1 -type f -name "*.JPG"` ; do mv "$x" `echo $x|sed 's/JPG/jpg/g'`; done
The backticks around find run the command and let the for loop iterate x over each result. There are various switches you can use with find to limit by time, size, etc., if you need more sophisticated searching than just every .JPG in the current directory. -maxdepth 1 limits the search to the current directory.
EDIT:
As pointed out by Adrian, using sed here is unnecessary and wasteful, since it spawns an extra process for every file; instead, this can all be compressed to:
for x in `find . -maxdepth 1 -type f -name "*.JPG"` ; do mv "$x" "${x%.JPG}.jpg"; done
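Note that iterating over backticked find output still breaks on filenames containing whitespace; here is a sketch that avoids the word-splitting entirely by letting find hand the names straight to a shell:
find . -maxdepth 1 -type f -name '*.JPG' \
    -exec sh -c 'for f; do mv "$f" "${f%.JPG}.jpg"; done' _ {} +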
The proper Perl rename expects a regular expression, so you would achieve this with:
$ rename 's#\.JPG$#.jpg#' *.JPG
The much more limited util-linux version of rename does not have an -n switch, so you would have to do:
$ rename .JPG .jpg *.JPG
Consult the man page to check which implementation is actually installed on your system.

Linux/Unix Command Needed for finding files on a particular date [closed]

I need help finding files in a directory which contain a word/string and on a particular date.
Currently I am using this command:
find . -exec grep -l .string. {} \;
This command returns all the files containing that string in that directory. I would like to narrow it to files from a particular date, for example 12/24/2013.
You can use:
find . -type f -exec grep 'string' {} \; -exec ls -l {} \; | grep 'Dec 24'
This searches for any files that contain the string string, executes ls -l on only those files, and finally greps out the ones dated Dec 24.
It works because find applies its predicates in order, so each -exec runs only for the files that matched the preceding tests.
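Note that plain grep also prints the matching lines themselves; if you want only the ls -l output, a sketch using grep -q (which is silent and only sets the exit status):
find . -type f -exec grep -q 'string' {} \; -exec ls -l {} \; | grep 'Dec 24'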
Maybe this could help you, combined with grep:
find /path/to/find -type d -atime -7
The last parameter is a number of days (here, within the last 7 days); modify it for the particular date you need. -atime tests the file access time, and -type d searches for directories; to find regular files, replace d with f. Give the path to search under, and then pipe the output into grep for the string you want.
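For an exact calendar day rather than a relative age, a sketch assuming GNU find (which supports -newermt):
# Files modified on 24 Dec 2013 that contain the string:
find . -type f -newermt 2013-12-24 ! -newermt 2013-12-25 \
    -exec grep -l 'string' {} +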
