`find` claims it deleted the files, `ls` claims it didn't [closed] - linux

I tried to delete some images by matching them against a regular expression, and I have tried two similar approaches so far, both of which pass the results of find to rm. First I found all the images that I wanted to delete with this:
find . -type f -regex ".+-[0-9]+x[0-9]+\.jpg"
This found a lot of results, so I tried to delete them like this:
find . -type f -regex ".+-[0-9]+x[0-9]+\.jpg" -exec rm -rf {} \;
And then like this:
find . -type f -regex ".+-[0-9]+x[0-9]+\.jpg" | xargs rm
After both attempts, the find command no longer sees the images that I wanted to delete (when I run the first command again), but ls sees them, and so does Nautilus. Is there some kind of commit I should run in order to actually delete them from the hard disk?
I tried searching the rm man page for "commit" and the find man page for "remove", but haven't found anything significant.

Your regex doesn't match these filenames...
$ touch yellow-zone-etna-36x36.png yellow-zone-etna-615x250.png
$ find . -type f -regex ".+-[0-9]+x[0-9]+\.jpg"
$ # no output
because you have PNGs, you're looking for JPEGs, and you additionally have JPEGs that don't match the regex either.
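If the goal is to catch those leftover files as well, the pattern can be widened. A sketch, assuming GNU find and its -regextype option:
find . -type f -regextype posix-extended -regex ".+-[0-9]+x[0-9]+\.(jpe?g|png)"
This matches .jpg, .jpeg, and .png names ending in a WIDTHxHEIGHT suffix; adjust the alternation to whatever extensions you actually have.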

Related

Linux find specific data to a file and entire system [closed]

I need to find specific data inside a file, and I want to search the entire Linux system for it. Is this possible?
The above answer will work, but it will also try to grep directories for the pattern, which in turn will throw an error. A better solution is to search regular files only, which also considerably reduces the search time.
find / -type f -exec grep -i <pattern> {} \;
If you are only interested in listing the files containing the pattern, you can pass the -l switch to grep.
find / -type f -exec grep -il <pattern> {} \;
If you would like to see both the file name and the matching line, you can pass the -H switch to grep.
find / -type f -exec grep -iH <pattern> {} \;
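If the search covers the whole filesystem, spawning one grep per file gets slow. POSIX find can batch files into fewer grep invocations by terminating -exec with + instead of \;, for example:
find / -type f -exec grep -il <pattern> {} +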
#alvits - Thanks for the suggestion.
find / -name "*" -exec grep -q <pattern> '{}' \; -print
This command searches from the root directory (/) and therefore every subdirectory. Substitute your search pattern for <pattern> in the command, and it will print every file that contains the pattern.
If you know the extension of the file you are looking for, you can limit the search by replacing * with *.extension in the command.
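For example, to search only .conf files with the same command shape (".conf" is just an illustrative extension here):
find / -name "*.conf" -exec grep -q <pattern> '{}' \; -print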

Trying to rename .JPG to .jpg in shell CLI [closed]

I'm trying to rename all files in a directory from having the .JPG ext to .jpg but it isn't working.
I have looked around the net and found a few things but I can't seem to get any to work. The latest one I tried was:
rename -n .JPG .jpg *.JPG
I used the -n flag to see what would be modified but I got no response (no files).
What am I doing wrong here!?
You mention you have tried various things; if you don't want to use rename, you can do this with standard command-line tools (find, mv, and sed) alone:
for x in `find . -maxdepth 1 -type f -name "*.JPG"` ; do mv "$x" `echo "$x" | sed 's/\.JPG$/.jpg/'`; done
The backticks around find run the command, and the loop assigns each result in turn to the variable x. There are various switches you can use with find to limit by time, size, etc., if you need more sophisticated searching than just all JPG files in the current directory. -maxdepth 1 limits the search to the current directory.
EDIT:
As pointed out by Adrian, using sed is unnecessary and wasteful as it uses another subshell, so instead this can all be compressed to:
for x in `find . -maxdepth 1 -type f -name "*.JPG"` ; do mv "$x" "${x%.JPG}.jpg"; done
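And if the files are all in the current directory anyway, find can be dropped too; a minimal sketch using only the shell's own globbing:
for f in *.JPG; do mv -- "$f" "${f%.JPG}.jpg"; done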
The proper Perl rename expects a regular expression, so you would achieve this with:
$ rename 's#\.JPG$#.jpg#' *.JPG
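Since you are already using -n for a dry run, the same flag works with the Perl rename too, e.g.:
$ rename -n 's#\.JPG$#.jpg#' *.JPG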
The more limited util-linux version of rename does not have an -n switch, so you would have to do:
$ rename .JPG .jpg *.JPG
Consult the man page to check which implementation is actually installed on your system.

find a pattern in files and rename them [closed]

I use this command to find files matching a given pattern and then rename them to something else:
find . -name '*-GHBAG-*' -exec bash -c 'echo mv $0 ${0/GHBAG/stream-agg}' {} \;
As I run this command, I see some outputs like this
mv ./report-GHBAG-1B ./report-stream-agg-1B
mv ./reoprt-GHBAG-0.5B ./report-stream-agg-0.5B
However at the end, when I run ls, I see the old file names.
You are echoing your mv command, not actually executing it. Change it to:
find . -name '*-GHBAG-*' -exec bash -c 'mv $0 ${0/GHBAG/stream-agg}' {} \;
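A slightly more robust variant of the same command passes the filename as a positional parameter and quotes the expansions, so names containing spaces survive (a sketch of the same approach):
find . -name '*-GHBAG-*' -exec bash -c 'mv -- "$1" "${1/GHBAG/stream-agg}"' _ {} \;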
I would suggest using the rename command to perform this task. rename renames the filenames supplied according to the rule specified as a Perl regular expression.
In this case, you could use:
rename 's/GHBAG/stream-agg/' *-GHBAG-*
In reply to anumi's comment, you could in effect search recursively down directories by matching '**':
rename 's/GHBAG/stream-agg/' **/*-GHBAG-*
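Note that ** only recurses in shells that support it; zsh does so by default, while in bash you would enable globstar first. A sketch assuming bash 4+ and the Perl rename:
shopt -s globstar
rename 's/GHBAG/stream-agg/' **/*-GHBAG-*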
This works for my needs, replacing all matching files or file types. Be warned, this is a very greedy search:
# bashrc
function file_replace() {
  for file in $(find . -type f -name "$1*"); do
    mv "$file" "$(echo "$file" | sed "s/$1/$2/")";
  done
}
I will usually run with find . -type f -name "MYSTRING*" in advance to check the matches out before replacing.
For example:
file_replace "Slider.js" "RangeSlider.ts"
renamed: packages/react-ui-core/src/Form/Slider.js -> packages/react-ui-core/src/Form/RangeSlider.ts
renamed: stories/examples/Slider.js -> stories/examples/RangeSlider.ts
or ditch the filetype to make it even greedier
file_replace Slider RangeSlider
renamed: packages/react-ui-core/src/Form/Slider.js -> packages/react-ui-core/src/Form/RangeSlider.js
renamed: stories/examples/Slider.js -> stories/examples/RangeSlider.js
renamed: stories/theme/Slider.css -> stories/theme/RangeSlider.css

List of All Folders and Sub-folders [closed]

In Linux, I want to list all folder and sub-folder names and redirect the output to a text file.
I tried ls -alR > list.txt, but it lists files as well as folders.
You can use find
find . -type d > output.txt
or tree
tree -d > output.txt
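If the listing gets too deep, most tree implementations can also cap the depth with -L (a sketch, assuming that option is available):
tree -d -L 2 > output.txt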
If tree is not installed on your system:
On Ubuntu:
sudo apt-get install tree
On macOS:
brew install tree
find . -type d > list.txt
This will list all directories and subdirectories under the current path. If you want to list all of the directories under a path other than the current one, change the . to that other path.
If you want to exclude certain directories, you can filter them out with a negative condition:
find . -type d ! -name "~snapshot" > list.txt
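The same negative-condition idea can skip hidden directories as well, assuming a find that supports -path (GNU find does):
find . -type d ! -path '*/.*' > list.txt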
As well as the find approaches listed in other answers, better shells allow both recursive globs and filtering of glob matches, so in zsh for example...
ls -lad **/*(/)
...lists all directories while keeping all the "-l" details that you want, which you'd otherwise need to recreate using something like...
find . -type d -exec ls -ld {} \;
(not quite as easy as the other answers suggest)
The benefit of find is that it's more independent of the shell: more portable, even for system() calls from within a C/C++ program, etc.

How can I use the `find` command in Linux to remove non-empty directories? [closed]

I have temp directories full of junk that all start with __temp__ (e.g. __temp__user_uploads), which I want to delete with a cleanup function. My function attempt is to run:
find . -name __temp__* -exec rm -rf '{}' \;
If I run the command and there are multiple __temp__ directories (__temp__foo and __temp__bar), I get the output:
find: __temp__foo: unknown option
If I run the command and there is only 1 __temp__ directory (__temp__foo), it is deleted and I get the output:
find: ./__temp__foo: No such file or directory
Why doesn't the command work, why is it inconsistent like that, and how can I fix it?
Use a depth-first search and quote (or escape) the shell metacharacter *:
find . -depth -name '__temp__*' -exec rm -rf '{}' \;
Explanation
Without the -depth flag, your find command will remove matching filenames and then try to descend into the (now unlinked) directories. That's the origin of the "No such file or directory" in your single __temp__ directory case.
Without quoting or escaping the *, the shell will expand that pattern, matching several __temp__whatever filenames in the current working directory. This expansion confuses find, which expects the rest of its expression (tests and actions) rather than more filenames at that point in its argument list.
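An alternative sketch that sidesteps the problem entirely: prune matched directories so find never tries to descend into them, and batch the removals into fewer rm invocations:
find . -type d -name '__temp__*' -prune -exec rm -rf {} +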
