Trying to rename .JPG to .jpg in shell CLI

I'm trying to rename all files in a directory from having the .JPG ext to .jpg but it isn't working.
I have looked around the net and found a few things but I can't seem to get any to work. The latest one I tried was:
rename -n .JPG .jpg *.JPG
I used the -n flag to see what would be modified but I got no response (no files).
What am I doing wrong here!?

If you don't want to use rename (you mention you have tried various things), then with only standard shell utilities you can do this:
for x in `find . -maxdepth 1 -type f -name "*.JPG"` ; do mv "$x" `echo $x|sed 's/JPG/jpg/g'`; done
The backticks around find run that command, and the loop assigns each resulting filename to the variable x in turn. There are various switches you can use with find to limit by time, size, etc., if you need more sophisticated searching than just all .JPG files in the current directory (a sketch follows below). -maxdepth 1 limits the search to the current directory.
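For instance, a sketch of a narrower search (the -mtime and -size thresholds here are illustrative, not from the original question):
find . -maxdepth 1 -type f -name "*.JPG" -mtime -7 -size +1M    # only files changed in the last 7 days and over 1 MiB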
EDIT:
As pointed out by Adrian, using sed is unnecessary and wasteful, as it spawns extra processes for every file, so instead this could all be compressed to:
for x in `find . -maxdepth 1 -type f -name "*.JPG"` ; do mv "$x" "${x%.JPG}.jpg"; done
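If everything sits directly in the current directory, a glob alone will do; a minimal sketch using the same ${x%.JPG} parameter expansion:
# Pure-glob version: no find needed, and no word-splitting problems with spaces in filenames.
for x in *.JPG ; do mv -- "$x" "${x%.JPG}.jpg"; done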

The proper perl rename expects a regular expression so you would achieve this doing:
$ rename 's#\.JPG$#.jpg#' *.JPG
The shitty util-linux version of rename does not have an -n switch so you would have to do:
$ rename .JPG .jpg *.JPG
Consult the man page to check which implementation is actually installed on your system.
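If the man page leaves you unsure, a quick sketch for telling them apart at the prompt (exact output varies by distribution):
$ rename --version                     # util-linux prints something like: rename from util-linux 2.38
$ rename -n 's/\.JPG$/.jpg/' *.JPG     # Perl rename accepts a regex, and -n previews the changes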

Related

Linux shell select files in directory and subdirectory

For selecting all php files in a folder, we use:
vim *.php
How can I use this command to select *.php files in this directory and its subdirectories?
With the shell option `globstar` in bash you can use
vim **/*.php
To enable the shell option use e.g.
shopt -s globstar
You can use find combined with xargs:
find . -type f -name '*.php' -print0 | xargs -0 vi
The find will locate all regular files matching *.php, in and below the current directory, and send all the names in a stream to xargs separated by NUL characters.
The xargs program (with a -0 matching the -print0) will then separate them into individual file names and pass as many as possible to a single vi invocation. If it can't fit them all in one invocation (unlikely, unless the number of files is truly massive), it will make multiple invocations as needed.
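An alternative sketch that sidesteps xargs entirely is find's -exec ... + form, which batches names the same way; it also avoids the wrinkle that vim's standard input would otherwise be the pipe rather than the terminal:
find . -type f -name '*.php' -exec vim {} +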
You could use brace expansions:
vim {,*/}*.php
How does it work
{,*/} expands to <empty> dir1/ dir2/ ...
the final command line is then: vim *.php dir1/*.php dir2/*.php ...
This only matches immediate subdirectories, so subdirectories of a subdirectory won't be included. As mentioned in the other answers, find or globstar is better suited for that; a short demonstration follows.
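A quick sketch of the difference, using a hypothetical layout created only for illustration:
mkdir -p a/b && touch x.php a/y.php a/b/z.php
echo {,*/}*.php    # x.php a/y.php  (a/b/z.php is missed)
shopt -s globstar
echo **/*.php      # x.php a/y.php a/b/z.php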

Rich globbing `ls [G-S]*` in fish shell?

In Bash it is possible to
ls [G-S]*
and it would list all files starting with g-s or G-S.
How is that done in Fish shell?
Fish currently does not support a rich glob syntax. The current thinking is that a glob command should be added in keeping with the fish goal of doing things via commands rather than magic syntax. See, for example, https://github.com/fish-shell/fish-shell/issues/3681. The solution is to create a function that filters the results. For example, the ** glob matches all files and directories in and below the CWD. I frequently want just the plain files and want to ignore the .git subdir. So I wrote this function:
function ff --description 'Like ** but only returns plain files.'
    # This also ignores .git directories.
    find . \( -name .git -type d -prune \) -o -type f | sed -n -e '/\/\.git$/n' -e 's/^\.\///p'
end
Which I can then use like this: grep something (ff). You could create a similar function that uses the find -name pattern matching feature or filter the results with string match --regex.
You can use find -iregex "./[G-S].*". Fish is quite limited in this regard.
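If you just need the [G-S] behavior once, a sketch using fish's string match to filter the glob results (--entire makes it print the whole filename rather than only the matched portion):
ls (string match --entire --regex '^[G-Sg-s]' -- *)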

How to search for a string in the entire Linux system?

I am looking to search for a string, for example "/uniquexx", across the entire hard drive and find the files where it is referenced. How could I do that? I tried grep and find / but had no luck.
grep -r blablastring /
-r means search recursively through all subdirectories.
If you want to search only text files, try ack. It's like grep, but defaults to skipping file types it recognizes as binary. It also highlights matches by default, when searching recursively in a directory.
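For example (a sketch; like grep, ack's -l flag lists just the names of matching files, and the starting directory here is only illustrative):
ack -l '/uniquexx' /etc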
Some answers point to using plain grep. What that actually does is make a list of every file on the system and then execute grep on each one. To narrow the work down, use:
find / -xdev '(' -type f -a -name '*.txt' -a -size -2M -a -mtime -5 ')' -print0 | xargs -0 grep -H "800x600"
This example stays on one filesystem (-xdev) and restricts the search to regular .txt files under 2 MB that were modified in the last 5 days; grep -H prints the filename with each match.
Read more: How to search text throughout entire file system?
You can try:
grep -r -H "your string" /home/yourdir
-H means the output will show the filename containing your string.
Anyway, if you want to search the WHOLE Linux filesystem, you need sudo privileges; a sketch follows below.
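A minimal sketch (GNU grep's --exclude-dir keeps it out of pseudo-filesystems, and 2>/dev/null discards errors from unreadable or special files):
sudo grep -rH --exclude-dir={proc,sys,dev} "/uniquexx" / 2>/dev/null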

Linux/Unix command needed for finding files on a particular date

I need help finding files in a directory which contain a word/string and on a particular date.
Currently I am using this command:
find . -exec grep -l .string. {} \;
This command returns all the files containing that string in that directory. I would like to get those files from a particular date, for example 12/24/2013.
You can use:
find . -type f -exec grep -q 'string' {} \; -exec ls -l {} \; | grep 'Dec 24'
This will quietly (grep -q) test each file for the string string, execute ls -l on only the files that matched, and finally grep out those dated Dec 24.
This works because find applies its predicates in order, so only files that passed the previous test are handed to the next one.
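If your find supports -newermt (GNU findutils), a sketch that matches the modification date directly instead of parsing ls output:
find . -type f -newermt 2013-12-24 ! -newermt 2013-12-25 -exec grep -l 'string' {} +
The two -newermt bounds select files last modified on 24 December 2013, and grep -l then lists only those that contain the string.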
Maybe this could help you, combined with grep:
find /path/to/find -type d -atime -7
The last parameter is the number of days (here, within the last 7 days); -atime tests the file access time. -type d searches for directories; replace d with f to search for regular files. Give the path to search under, and finally pipe the result into grep to look for the string (see the sketch below).
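A sketch of that pipeline spelled out (xargs is used because grep expects filenames as arguments, not on standard input):
find /path/to/find -type f -atime -7 -print0 | xargs -0 grep -l 'string'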

find -mtime returns wrong listing

When I run this command:
root:/home/mws 0$ ls -lrt `find /home/data/ll_misc_logs/ -mtime +20`
and there are no files meeting the -mtime setting (older than 20 days), it lists the contents of the current directory, /home/mws.
Why?
Is there a way to just return nothing or a message?
When there are no files meeting the -mtime setting, the output of find ... expands to nothing, in which case your command becomes plain ls -lrt, which will always list the current directory.
If there aren't too many files on a typical run, this might work better:
find /home/data/ll_misc_logs -mtime +20 -print0 | xargs -0 -r ls -ltr
But, if you get so many files that xargs decides to split it into multiple invocations, it probably won't do exactly what you want, either.
Which leads me to... What exactly are you trying to do? On the surface, it looks like "show me the old files, in order by modification time", but it's likely part of something bigger that might be solved in a more efficient (and less error-prone) manner...
If you just want a list of files older than 20 days sorted by oldest first:
find /home/data/ll_misc_logs -mtime +20 -exec ls -l --time-style=+%s {} \; | sort -n -k 6
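With GNU find you can skip the per-file ls entirely; a sketch using -printf, where %T@ is the modification time in seconds since the epoch:
find /home/data/ll_misc_logs -mtime +20 -printf '%T@ %p\n' | sort -n | cut -d' ' -f2-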
