How to list directories and files in a Bash script? - linux

I would like to list a directory tree, but I have to write a script for it, and the script should take the path to the base directory as a parameter. The listing should start from this base directory.
The output should look like this:
Directory: ./a
File: ./a/A
Directory: ./a/aa
File: ./a/aa/AA
Directory: ./a/ab
File: ./a/ab/AB
So I need to print the path from the base directory for every directory and file inside it.
UPDATED
To run the script I should type this in the terminal: "./test.sh /home/usr/Desktop/myDirectory" or "./test.sh myDirectory" - since I run test.sh from the Desktop level.
And right now the script should be run from the level of /home/usr/Desktop/myDirectory.
I have the following command in my test.sh file:
find . | sed -e "s/[^-][^\/]*\// |/g"
But it is a single command, not a shell script, and it prints output like this:
DIR: dir1
DIR: dir2
fileA
DIR: dir3
fileC
fileB
How can I print the path from the base directory for every directory or file under it? Could someone help me work it out?

It's not entirely clear what you want; maybe:
find . -type d -printf 'Directory: %p\n' -o -type f -printf 'File: %p\n'
However, to see the subtree of a directory, I find it more useful to run
find "$dirname" -type f
To answer the comment: it can also be done in pure bash (builtins only, no external commands), using a recursive function.
rec_find() {
    local f
    for f in "$1"/*; do
        [[ -d $f ]] && echo "Directory: $f" && rec_find "$f"
        [[ -f $f ]] && echo "File: $f"
    done
}

rec_find "$1"

You can use the tree command. The -L option sets the maximum depth. Examples:
tree
.
├── 1
│   └── test
├── 2
│   └── test
└── 3
    └── test
3 directories, 3 files
Or
tree -L 1
.
├── 1
├── 2
└── 3
3 directories, 0 files

Create your test.sh with the code below. Here you read the command-line parameter from the positional variable $1 and pass it to the find command.
#!/bin/bash
# the shebang above tells the system which shell should execute this script
find "$1" | sed -e "s/[^-][^\/]*\// |/g"
Now, how will it work?
./test.sh /home/usr/Desktop/myDirectory   # you execute this command
Here the command-line parameter is assigned to $1. For more than one parameter you can use $1 through $9, and after that you have to use the shift command (see the short sketch below; you will find more detailed information online).
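As a small illustration of positional parameters and shift (this demo script and its name are made up for the example):

#!/bin/bash
# args.sh -- hypothetical demo of $1..$9 and shift
echo "first argument:  $1"
echo "second argument: $2"
shift                              # drop $1; every remaining argument moves down one position
echo "after shift, \$1 is: $1"
echo "remaining arguments: $*"

Running ./args.sh a b c prints a and b, then b and "b c" after the shift.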
So your command will now be:
#!/bin/bash
# the shebang above tells the system which shell should execute this script
find /home/usr/Desktop/myDirectory | sed -e "s/[^-][^\/]*\// |/g"
Hope this will help you.

Related

find files based on extension but display name without extension (no basename, sed, awk, grep or ; allowed)

I need to write a script that lists all the files with a .gif extension in the current directory and all its sub-directories, BUT it must NOT use ANY of:
basename
grep
egrep
fgrep
rgrep
&&
||
;
sed
awk
AND still include hidden files.
I tried find . -type f -name '*.gif' -printf '%f\n', which successfully displays the .gif files, but it still shows the extension. Here's the catch: if I try to use cut -d . -f 1 to remove the file extension, I also lose the hidden files (which I don't want to), because their names start with ".".
Then I turned to use tr -d '.gif' but some of the files have a 'g' or a '.' in their name.
I also tried to use some of these answers BUT all of them include either basename, sed, awk or use some ";" in their script.
With so many restrictions I really don't know whether it's even possible, but I'm supposed to do it.
How would you do it?
files/dirs structure:
$ tree -a
.
├── bar
├── bar.gif
├── base
│   └── foo.gif
├── foo
│   └── aaa.gif
└── .qux.gif
3 directories, 4 files
Code
find -type f -name '*.gif' -exec bash -c 'printf "%s\n" "${@%.gif}"' bash {} +
Output
./bar
./.qux
./foo/aaa
./base/foo
Explanations
Parameter Expansion expands parameters: $foo, $1. You can use it to perform string or array operations: "${file%.mp3}", "${0##*/}", "${files[@]: -4}". They should always be quoted. See: http://mywiki.wooledge.org/BashFAQ/073 and "Parameter Expansion" in man bash. Also see http://wiki.bash-hackers.org/syntax/pe.
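A short, self-contained demonstration of the expansion used above (the sample file names are invented):

#!/bin/bash
# Strip a known suffix from each positional parameter, as the bash -c '...' bash {} + call does.
set -- ./bar.gif ./.qux.gif ./foo/aaa.gif   # pretend these were the paths passed in by find
printf '%s\n' "${@%.gif}"                   # prints ./bar, ./.qux, ./foo/aaa

file=./music/song.mp3
echo "${file%.mp3}"    # ./music/song  (remove matching suffix)
echo "${file##*/}"     # song.mp3      (remove longest */ prefix, like basename)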
Something like:
find . -name '*.gif' -type f -execdir bash -c 'printf "%s\n" "${@%.*}"' bash {} +
Using perl:
perl -MFile::Find::Rule -E '
    say s/\.gif$//r for File::Find::Rule
        ->file()
        ->name(qr/\.gif\z/)
        ->in(".")
'
Output:
bar
.qux
foo/aaa
base/foo

Bash find fails to return all matching files when called from a script

Running the same command from the command line and from a bash script produces different results on Ubuntu 16.04.
I have a folder with the following contents:
├── audio
│   └── delete_me.mp3
├── words
│   ├── audio
│   │   └── delete_me.mp3
│   └── images
│       └── delete_me.jpg
└── keep_me.txt
I have a bash script named findKeepers.sh:
#!/usr/bin/env bash
findKeepers () {
  local dir=$1
  echo "$(find $dir -type f ! -name delete_me*)"
}

findKeepers /path/to/directory
I expect it to output the path to the keep_me.txt file. Instead, I get a blank line.
 
If I run what seems to me to be identical commands from the command line, I get what I expect:
dir=/path/to/directory; echo "$(find $dir -type f ! -name delete_me*)"
/path/to/directory/keep_me.txt
If I search instead for all files not called keep_me, the bash script ignores the audio folder. Here's another bash script called findUnwanted.sh:
#!/usr/bin/env bash
findUnwanted () {
  local dir=$1
  echo "$(find $dir -type f ! -name keep_me*)"
}
findUnwanted /path/to/directory
Here's the result:
$ ./findUnwanted.sh
/path/to/directory/words/audio/delete_me.mp3
/path/to/directory/words/images/delete_me.jpg
If I run the same thing from the command line, I get all three delete_me files:
$ dir=/path/to/directory; echo "$(find $dir -type f ! -name keep_me*)"
/path/to/directory/words/audio/delete_me.mp3
/path/to/directory/words/images/delete_me.jpg
/path/to/directory/audio/delete_me.mp3
It seems to me that the bash script starts by going deep into the words folder, and then does not come out again to search adjacent folders or files. Is there something special about the #!/usr/bin/env bash environment that makes it do this? Or is there some other difference that I'm not seeing?
CODA: I'm guessing it was pilot error, because after more modifications it started working for me again. For anyone who is interested, the final version of my function is shown below.
#!/usr/bin/env bash
# Returns 1 if the given directory contains only placeholder files, or
# 0 if the directory contains something worth keeping
checkForDeletion () {
  local _dir=$1
  local _temp=$(find "$_dir" -type f ! -regex '.*\(unused.txt\|delete_me.*\)')
  if [ -z "$_temp" ]
  then
    return 1
  fi
}
I use it like this:
parent=/path/to/parent/
for dir in $parent*/
do
  checkForDeletion $dir
  if [ $? = 1 ]
  then
    echo "DELETE? $dir" # rm -rf $dir
  fi
done
I am guessing that your '!' is breaking the whole pipe. Try using '-not' instead, so your first code snippet would look like this:
echo "$(find $dir -type f -not -name delete_me*)"
I am not that good at explaining where you should escape special characters and where you should not, but the fact that things work differently when you run the same command outside a function suggests that escaping may be the issue.
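If escaping really is the culprit, the usual fix is to quote the -name pattern so the shell cannot expand delete_me* itself before find runs. A sketch of the function with only that change (everything else is taken from the question):

findKeepers () {
    local dir=$1
    # Quoting the pattern stops the shell from expanding delete_me* against
    # the current directory; find now receives the pattern literally.
    find "$dir" -type f ! -name 'delete_me*'
}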

Bash script to rename file names with correct date format in all sub folders in Linux

I have a bunch of logs with names in the form "filename.logdate month year" (for example, filename.log25 Aug 2015; note there are spaces between the date/month/year) and I'd like to change them to "filename.logmonthdateyear" (for example filename.logOct052015, with no spaces).
These files are in a bunch of sub folders which makes it more challenging.
Parent Folder
--- sub folder1
      file1
      file2
--- sub folder2
      file3
      file4
etc.
Can anyone suggest a bash script that can do this?
Thank you!
find and rename should do the trick
strawman example:
to go from
...
├── foo/
│   ├── file name with spaces
│   └── bar/
│       └── another file with spaces
...
you can use
find foo/ -type f -exec rename 's/ //g' '{}' \;
to get
...
├── foo/
│   ├── filenamewithspaces
│   └── bar/
│       └── anotherfilewithspaces
...
in your case:
it would be something like
find path/to/files/ -type f -exec rename 's/ //g' '{}' \;
but you can use fancier filters in your find command like
find path/to/files/ -type f -name '*.log*' -exec rename 's/ //g' '{}' \;
to select only .log files in case there are other file names with spaces you don't want to touch
heads up:
as pointed out in the comments, there's the potential to overwrite files if their names differ only by space placement (e.g., "a bc.log" and "ab c.log", if carelessly renamed, would both end up as a single abc.log).
for your case, you have two things on your side:
rename will give you a heads up as long as you're not using its --force option,
and will give you a helpful message like ./ab c.log not renamed: ./abc.log already exists
your files are named programmatically, and you're stripping the spaces in dates, so, assuming that's all you have in there, you shouldn't have any problems
regardless, it's good to be mindful of this sort of thing (see the dry-run sketch below)
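For that dry run: the Perl rename used in these examples accepts -n (alias --nono), which prints the planned renames without touching anything. A small sketch, assuming that variant of rename is the one installed:

# preview only: show what would be renamed, but change nothing
find path/to/files/ -type f -name '*.log*' -exec rename -n 's/ //g' '{}' \;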
This is a way to do it with just Bash (4+) and 'mv':
# Prevent breakages when nothing matches patterns
shopt -s nullglob
# Enable '**' matches (requires Bash 4)
shopt -s globstar

topdir=$PWD
for folder in **/ ; do
    # Work in the directory to avoid problems if its path has spaces
    cd -- "$folder"
    for file in *' '*' '* ; do
        # Use the '-i' option to prevent silent clobbering
        mv -i -- "$file" "${file// /}"
    done
    cd -- "$topdir"
done
If there is just one level of subfolders (as stated in the question), the requirement for Bash 4+ can be dropped: remove the shopt -s globstar line, and change the first line of the outer loop to for folder in */ ; do. A sketch of that variant follows.
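A minimal sketch of that single-level variant (same assumptions as the script above, minus globstar):

#!/bin/bash
# One level of subfolders only, so no '**' support is needed
shopt -s nullglob

topdir=$PWD
for folder in */ ; do
    cd -- "$folder"
    for file in *' '*' '* ; do
        mv -i -- "$file" "${file// /}"
    done
    cd -- "$topdir"
done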

Bash - finding files with spaces and rename with sed [duplicate]

This question already has answers here:
Recursively rename files using find and sed
(20 answers)
Closed 9 years ago.
I have been trying to write a script to rename all files that contain a space and replace the space with a dash.
Example: "Hey Bob.txt" to "Hey-Bob.txt"
When I used a for-loop, it just split up the file name at the space, so "Hey Bob.txt" gave separate arguments like "Hey" and "Bob.txt".
I tried the following script but it keeps hanging on me.
#!/bin/bash
find / -name '* *' -exec mv {} $(echo {} | sed 's/ /-g')\;
Building off OP's idea:
find ${PATH_TO_FILES} -name '* *' -exec bash -c 'eval $(echo mv -v \"{}\" $(echo {} | sed "s/ /-/g"))' \;
NOTE: need to specify the PATH_TO_FILES variable
EDIT: BroSlow pointed out the need to take the directory structure into account:
find ${PATH_TO_FILES} -name '* *' -exec bash -c 'DIR=$(dirname "{}" | sed "s/ /-/g" ); BASE=$(basename "{}"); echo mv -v \"$DIR/$BASE\" \"$DIR/$(echo $BASE | sed "s/ /-/g")\"' \; > rename-script.sh ; sh rename-script.sh
Another way:
find . -name "* *" -type f |while read file
do
new=${file// /}
mv "${file}" $new
done
Not one line, but avoids sed and should work just as well if you're going to be using it for a script anyway. (replace the mv with an echo if you want to test)
In bash 4+
#!/bin/bash
shopt -s globstar

for file in **/*; do
    filename="${file##*/}"
    if [[ -f $file && $filename == *" "* ]]; then
        onespace=$(echo $filename)
        dir="${file%/*}"
        [[ ! -f "$dir/${onespace// /-}" ]] && mv "$file" "$dir/${onespace// /-}" || echo "$dir/${onespace// /-} already exists, so not moving $file" 1>&2
    fi
done
Older bash
#!/bin/bash
find . -type f -print0 | while read -r -d '' file; do
    filename="${file##*/}"
    if [[ -f $file && $filename == *" "* ]]; then
        onespace=$(echo $filename)
        dir="${file%/*}"
        [[ ! -f "$dir/${onespace// /-}" ]] && mv "$file" "$dir/${onespace// /-}" || echo "$dir/${onespace// /-} already exists, so not moving $file" 1>&2
    fi
done
Explanation of algorithm
**/* This recursively lists all files in the current directory (** technically does it, but /* is added at the end so the current directory itself isn't listed)
${file##*/} Searches for the longest leading match of */ in file and removes it from the string, e.g. /foo/bar/test.txt becomes test.txt
$(echo $filename) Without quoting, echo squeezes any run of spaces down to a single space, which makes them easier to replace with a single -
${file%/*} Removes everything after and including the last /, e.g. /foo/bar/test.txt becomes /foo/bar
mv "$file" ${onespace// /-} Replaces every space in the filename with - (we check beforehand whether the hyphenated version exists and, if it does, echo a message to stderr; note that && and || have equal precedence in bash and are evaluated left to right). A short demo of these expansions follows this list.
find . -type f -print0 | while read -r -d '' file This is used to avoid breaking up file names that contain spaces, by using a NUL delimiter and not treating \ specially
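A quick demo of those expansions on a made-up path:

file='/foo/bar/some   name.txt'
filename="${file##*/}"        # -> some   name.txt
dir="${file%/*}"              # -> /foo/bar
onespace=$(echo $filename)    # unquoted echo squeezes the run of spaces -> some name.txt
echo "$dir/${onespace// /-}"  # -> /foo/bar/some-name.txt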
Sample Output
$ tree
.
├── bar
│   ├── some dir
│   │   ├── some-name-without-space1.pdf
│   │   └── some name with space1.pdf
│   ├── some-name-without-space1.pdf
│   ├── some name with space1.pdf
│   └── some-name-with-space1.pdf
└── space.sh
$ ./space.sh
bar/some-name-with-space1.pdf already exists, so not moving bar/some name with space1.pdf
$ tree
.
├── bar
│   ├── some dir
│   │   ├── some-name-without-space1.pdf
│   │   └── some-name-with-space1.pdf
│   ├── some-name-without-space1.pdf
│   ├── some name with space1.pdf
│   └── some-name-with-space1.pdf
└── space.sh

How to use 'mv' command to move files except those in a specific directory?

I am wondering - how can I move all the files in a directory except those files in a specific directory (as 'mv' does not have a '--exclude' option)?
Let's assume the dir structure is like this,
|parent
|--child1
|--child2
|--grandChild1
|--grandChild2
|--grandChild3
|--grandChild4
|--grandChild5
|--grandChild6
And we need to move things around so that it appears like this,
|parent
|--child1
| |--grandChild1
| |--grandChild2
| |--grandChild3
| |--grandChild4
| |--grandChild5
| |--grandChild6
|--child2
In this case, you need to exclude the two directories child1 and child2, and move the rest of the directories into the child1 directory.
use,
mv !(child1|child2) child1
This will move all of the rest of the directories into the child1 directory.
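Note that !(child1|child2) is an extended-glob pattern, so a script may need to enable extglob first (it is frequently switched on in interactive shells by bash-completion, but it is off by default otherwise). A minimal sketch, with /path/to/parent standing in for your actual parent directory:

shopt -s extglob                 # enable !(...) patterns
cd /path/to/parent
mv !(child1|child2) child1       # move everything except child1 and child2 into child1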
Since find does have an exclude option, use find + xargs + mv:
find /source/directory -name ignore-directory-name -prune -print0 | xargs -0 mv --target-directory=/target/directory
Note that this is almost copied from the find man page (I think using mv --target-directory is better than cpio).
First get the names of files and folders and exclude whichever you want:
ls --ignore=file1 --ignore=folder1 --ignore=regular-expression1 ...
Then pass filtered names to mv as the first parameter and the second parameter will be the destination:
mv $(ls --ignore=file1 --ignore=folder1 --ignore=regular-expression1 ...) destination/
This isn't exactly what you asked for, but it might do the job:
mv the-folder-you-want-to-exclude somewhere-outside-of-the-main-tree
mv the-tree where-you-want-it
mv the-excluded-folder original-location
(Essentially, move the excluded folder out of the larger tree to be moved.)
So, if I have a/ and I want to exclude a/b/c/*:
mv a/b/c ../c
mv a final_destination
mkdir -p a/b
mv ../c a/b/c
Or something like that. Otherwise, you might be able to get find to help you.
This will move all files at or below the current directory not in the ./exclude/ directory to /wherever...
find -E . -not -type d -and -not -regex '\./exclude/.*' -exec echo mv {} /wherever \;
ls | grep -v exclude-dir | xargs -t -I '{}' mv {} exclude-dir
rename your directory to make it hidden so the wildcard does not see it:
mv specific_dir .specific_dir
mv * ../other_dir
#!/bin/bash
touch apple banana carrot dog cherry
mkdir fruit
F="apple banana carrot dog cherry"
mv ${F/dog/} fruit
# this removes 'dog' from the list F, so it remains in the
# current directory and is not moved to 'fruit'
Inspired by user13747357's answer.
First you can list the files and filter them with:
ls | egrep -v '(dir_name|file_name.ext)'
Then you can run the following command to move the files except the specific ones:
mv $(ls | egrep -v '(dir_name|file_name.ext)') target_dir
* Note that I tested this inside a specific directory. Cross-directory operation should be more carefully executed :)
Suppose your directory is
.
├── dir1
│   └── a.txt
├── dir2
│   ├── b.txt
│   └── hello.c
├── file1.txt
├── file2.txt
└── file3.txt
and you want to put file1, file2, and file3 into dir2.
you can use
mv $(ls -p | grep -v /) dir2 to finish it, because
ls -p | grep -v / will print all the files, but no directories, in the cwd.
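To see why that filter works (using the file names from the example tree above): ls -p appends a trailing / to every directory name, and grep -v / then drops exactly those lines:

$ ls -p
dir1/  dir2/  file1.txt  file2.txt  file3.txt
$ ls -p | grep -v /
file1.txt
file2.txt
file3.txt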
For example, if I want to move all files/directories - except a specified file or directory - inside "/var/www/html" to a sub-folder named "my_sub_domain", then I use "mv" with the pattern "!(what_to_exclude)":
$ cd /var/www/html
$ mv !(my_sub_domain) my_sub_domain
To exclude more I use "|" to separate file/directory names:
$ mv !(my_sub_domain|test1.html) my_sub_domain
mv * exclude-dir
was the perfect solution for me
