List of All Folders and Sub-folders [closed] - linux

In Linux, I want to find all folder/sub-folder names and redirect them to a text file.
I tried ls -alR > list.txt, but it lists files as well as folders.

You can use find
find . -type d > output.txt
or tree
tree -d > output.txt
If tree is not installed on your system:
If you are using Ubuntu:
sudo apt-get install tree
If you are using macOS:
brew install tree

find . -type d > list.txt
Will list all directories and subdirectories under the current path. If you want to list all of the directories under a path other than the current one, change the . to that other path.
If you want to exclude certain directories, you can filter them out with a negative condition:
find . -type d ! -name "~snapshot" > list.txt
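If you want to skip an entire subtree rather than just filter out a name, -prune is handy. A sketch, assuming a hypothetical .git directory whose contents you do not want to descend into:
find . -type d -name .git -prune -o -type d -print > list.txt
Here the matched .git directories are pruned (find never enters them), and every other directory falls through to -print.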

As well as find, covered in other answers, better shells allow both recursive globs and filtering of glob matches, so in zsh for example...
ls -lad **/*(/)
...lists all directories while keeping all the "-l" details that you want, which you'd otherwise need to recreate using something like...
find . -type d -exec ls -ld {} \;
(not quite as easy as the other answers suggest)
The benefit of find is that it is more independent of the shell: more portable, even for system() calls from within a C/C++ program, etc.
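If you are on bash 4 or newer rather than zsh, the globstar option gives a rough equivalent (a sketch; the trailing slash restricts the matches to directories, and hidden directories are skipped unless dotglob is also set):
shopt -s globstar
ls -ld **/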

Related

linux remove all files except hidden files and folders [closed]

I have a directory structure like:
/Folder1/file1
/file2
/file3
/.file4
I need to remove only /file2 and /file3. I want to keep /Folder1/file1 and /.file4.
If dotglob is disabled, rm * will work: the glob will not match hidden files such as .file4, and rm without -r will refuse to touch the Folder1 directory. You can check whether dotglob is enabled with the shopt command. If it is turned on, disable it first:
shopt -u dotglob
rm *
shopt -s dotglob
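A quick illustration of what dotglob changes (a sketch using the layout from the question):
shopt -u dotglob
echo *        # Folder1 file2 file3  (.file4 is not matched)
shopt -s dotglob
echo *        # .file4 is now matched as well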
find . -maxdepth 1 -type f -name "[^.]*" -delete
-maxdepth 1 keeps the search out of Folder1, and the "[^.]*" pattern skips hidden files such as .file4.
(Do it without -delete first, to be sure you typed it right.)
find . -type f \( -name file2 -o -name file3 \) -delete
-type f searches only for files, ignoring directories
\( -name file2 -o -name file3 \) matches file2 and file3; a brace expansion such as file{2,3} will not work here, because the shell expands it into two separate words before find ever sees them
or use a glob range
-name "file[2-6]" matches file2 through file6
Unless you put a wildcard before the file name (like "*file"), this will not delete hidden files.
To run only in the current directory, so that /Folder1/file1 is not deleted, add -maxdepth 1:
find . -maxdepth 1 -type f \( -name file2 -o -name file3 \) -delete
But to work only in the current directory, I prefer plain old rm:
rm file*

Trying to rename .JPG to .jpg in shell CLI [closed]

I'm trying to rename all files in a directory from having the .JPG ext to .jpg but it isn't working.
I have looked around the net and found a few things but I can't seem to get any to work. The latest one I tried was:
rename -n .JPG .jpg *.JPG
I used the -n flag to see what would be modified but I got no response (no files).
What am I doing wrong here!?
If you don't want to use rename (you mention you have tried various things), you can do this with nothing more than find, mv and the shell:
for x in `find . -maxdepth 1 -type f -name "*.JPG"` ; do mv "$x" `echo $x|sed 's/JPG/jpg/g'`; done
The backticks around find run the command, and the for loop assigns each resulting filename to the variable x in turn. There are various switches you can use with find to limit by time, size, etc., if you need more sophisticated searching than just every .JPG in the current directory. -maxdepth 1 limits the search to the current directory.
EDIT:
As pointed out by Adrian, using sed is unnecessary and wasteful, as it spawns extra processes, so instead this can all be compressed to:
for x in `find . -maxdepth 1 -type f -name "*.JPG"` ; do mv "$x" "${x%.JPG}.jpg"; done
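If any of the filenames contain spaces, a null-delimited loop is safer than iterating over the backtick output (a sketch along the same lines; note it only rewrites the extension at the end of the name):
find . -maxdepth 1 -type f -name "*.JPG" -print0 |
while IFS= read -r -d '' x; do
    mv -- "$x" "${x%.JPG}.jpg"
done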
The proper perl rename expects a regular expression so you would achieve this doing:
$ rename 's#\.JPG$#.jpg#' *.JPG
The more limited util-linux version of rename does not have an -n switch, so you would have to do:
$ rename .JPG .jpg *.JPG
Consult the man page to check which implementation is actually installed on your system.

Recursively doing the command ls without -R [closed]

I am trying to find a way to recreate the output of ls -R (Linux) without using the -R option, i.e. without the built-in recursion. Is this at all possible?
There are no other constraints.
shopt -s globstar nullglob
printf "%s\n" **
or
find .
The closest I can think of right now is to recurse through all given directories using find and to perform a listing on each. I used ls -1 because I noticed that ls -R defaults to a single column when redirected into a file; you may choose to omit the -1 option.
for dir in `find . -type d`; do
echo $dir:
ls -1 $dir
done
However, it doesn't work with filenames that contain spaces. I'm still looking for a way around that...
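One way around the whitespace problem (a sketch) is to let find hand each directory straight to ls instead of looping over its output in the shell:
find . -type d -exec sh -c 'echo "$1:"; ls -1 "$1"; echo' _ {} \;
Here _ fills the $0 slot of the inner shell and each directory name arrives, correctly quoted, as $1.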

Change filenames to lowercase in Ubuntu in all subdirectories [closed]

I know it's been asked before, but what I've found has not worked out so far.
The closest I came is this: rename -n 'y/A-Z/a-z/' *
which works for the current directory. I'm not too good with the Linux terminal, so what should I add to this command to apply it to all of the files in all the sub-directories below where I am? Thanks!
Here's one way using find and tr:
for i in $(find . -type f -name "*[A-Z]*"); do mv "$i" "$(echo $i | tr A-Z a-z)"; done
Edit: added -name "*[A-Z]*"
This ensures that only files containing capital letters are found. Otherwise, when a file whose name is already all lowercase is "moved" onto itself, mv complains that the source and destination are the same file.
Perl has a locale-aware lc() function which might work better:
find . -type f | perl -n -e 'chomp; system("mv", $_, lc($_))'
Note that this script handles whitespace in filenames, but not newlines. And there's no protection against collisions, if you have "ASDF.txt" and "asdf.txt" one is going to get clobbered.
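A sketch that works around both limitations: it reads null-delimited names (so whitespace and newlines are fine), lowercases only the file's basename, and skips a rename when the lowercase target already exists instead of clobbering it:
find . -type f -name "*[A-Z]*" -print0 |
while IFS= read -r -d '' f; do
    dir=$(dirname -- "$f")
    base=$(basename -- "$f")
    target="$dir/$(printf '%s' "$base" | tr '[:upper:]' '[:lower:]')"
    if [ -e "$target" ]; then
        echo "skipping $f: $target already exists" >&2
    else
        mv -- "$f" "$target"
    fi
done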

How can I use the `find` command in Linux to remove non-empty directories? [closed]

I have temp directories full of junk that all start with __temp__ (e.g. __temp__user_uploads), which I want to delete with a cleanup function. The function runs:
find . -name __temp__* -exec rm -rf '{}' \;
If I run the command and there are multiple __temp__ directories (__temp__foo and __temp__bar), I get the output:
find: __temp__foo: unknown option
If I run the command and there is only 1 __temp__ directory (__temp__foo), it is deleted and I get the output:
find: ./__temp__foo: No such file or directory
Why doesn't the command work, why is it inconsistent like that, and how can I fix it?
Use a depth-first search and quote (or escape) the shell metacharacter *:
find . -depth -name '__temp__*' -exec rm -rf '{}' \;
Explanation
Without the -depth flag, your find command will remove matching filenames and then try to descend into the (now unlinked) directories. That's the origin of the "No such file or directory" in your single __temp__ directory case.
Without quoting or escaping the *, the shell will expand that pattern, matching several __temp__whatever filenames in the current working directory. This expansion will confuse find, which is expecting options rather than filenames at that point in its argument list.
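An alternative to -depth (a sketch) is to -prune the matches, so find never tries to descend into a directory it has just handed to rm; the + terminator also batches the matches into fewer rm invocations:
find . -name '__temp__*' -prune -exec rm -rf '{}' +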
