Creating a list of files in every subfolder of each folder, bash - linux

I have a problem creating a list of files matching the template *.cbf in the subfolders of every folder.
I wrote the script below in shell, but it always exits with "no such file or directory".
The path structure is /dir/*/*/*.cbf
#!/usr/bin/env bash
input_dir=$1
for i in `ls $input_dir/*/*/*_00001.cbf`; do
cbf=$(readlink -e $i)
cbf_fn=$(basename $cbf)
cbf_path=$(dirname $cbf)
cbf_path_p2=$(basename $cbf_path)
cbf_path_p1=$(basename $(dirname $cbf_path))
find `$input_dir/$cbf_path_p1/$cbf_path_p2` -name "*.cbf" -print > files.lst
done

The main reason is that the directory probably does not exist. I'll go through your code:
Suppose your input_dir is /hoppa and your link is /hoppa/1/2/a_00001.cbf, a symlink that points to /level1/level2/level3/filename.ext.
for i in `ls $input_dir/*/*/*_00001.cbf`; do
It is in general a bad idea to process the output of ls. Also, for those who once did Fortran (punch cards, ah, those days...), i suggests an integer; f or file would probably be a better choice. So, assuming that your input_dir does not contain spaces,
for file in $input_dir/*/*/*_00001.cbf ; do
cbf=$(readlink -e $i)
(those who suggested find probably missed the readlink)
cbf_fn=$(basename $cbf) # cbf_fn=filename.ext
cbf_path=$(dirname $cbf) # cbf_path=/level1/level2/level3
cbf_path_p2=$(basename $cbf_path)
# cbf_path_p2=level3
cbf_path_p1=$(basename $(dirname $cbf_path))
# cbf_path_p1=level2
find `$input_dir/$cbf_path_p1/$cbf_path_p2` -name "*.cbf" -print > files.lst
So the find will look in /hoppa/level2/level3, a directory which may not exist. Two more problems hide in that line: the backticks make the shell try to execute the path as a command (itself a likely source of the "no such file or directory" message), and > truncates files.lst on every pass through the loop, so only the last directory's results survive; use >> inside the loop, or redirect once after done.
done
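Putting the pieces together, a minimal corrected sketch. It assumes the simplest reading of the goal, namely listing every .cbf in the directory each link resolves into, and it appends so the list is not overwritten on each iteration:
#!/usr/bin/env bash
# For every *_00001.cbf two levels below $1, list all .cbf files
# in the directory its link target lives in.
input_dir=$1
: > files.lst  # start with an empty list, then append in the loop
for file in "$input_dir"/*/*/*_00001.cbf; do
    cbf=$(readlink -e "$file") || continue  # skip dangling links
    cbf_path=$(dirname "$cbf")
    find "$cbf_path" -name '*.cbf' -print >> files.lst
done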

Related

for loop in Linux treats pattern as filename when no files exist

I ran the following in a directory with no files:
for file in *.20191017.*;do echo ${file}; done
what it returned was this:
*.20191017.*
which is a little awkward, since this was just a pattern and not an actual filename.
Can anyone please help with this?
Found the reason for this anomaly (source: https://www.cyberciti.biz/faq/bash-loop-over-file/)
You can do filename expansion in a loop, for example to work on all PDF files in the current directory:
for f in *.pdf; do
echo "Removing password for pdf file - $f"
done
However, there is one problem with the above syntax. If there are no PDF files in the current directory, the pattern will expand to *.pdf (i.e. f will be set to *.pdf). To avoid this problem, add the following statement before the for loop:
#!/bin/bash
# Usage: remove all utility bills pdf file password
shopt -s nullglob # expands the glob to empty string when there are no matching files in the directory.
for f in *.pdf; do
echo "Removing password for pdf file - $f"
pdftk "$f" output "output.$f" user_pw "YOURPASSWORD-HERE"
done
The for loop simply iterates over the words between in and ; (possibly expanded by bash). Here, file is just the variable name. If you want to iterate over only those files that actually exist, you can, for example, add an if to check that ${file} really exists:
for file in *.20191017.*
do
if [ -e "${file}" ]
then
echo "${file}"
fi
done
Or you can use, e.g., find:
find . -maxdepth 1 -name '*.20191017.*'
-maxdepth 1 is to avoid recursion (note that GNU find wants positional options like -maxdepth before tests like -name, and warns otherwise).
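Besides nullglob, bash also offers failglob. A small sketch of the difference (my example, not from the original thread):
#!/usr/bin/env bash
# nullglob: an unmatched glob expands to nothing, so the loop never runs.
# failglob: an unmatched glob is reported as an error and the command is skipped.
shopt -s failglob
for file in *.20191017.*; do
    echo "${file}"
done
# with no matching files, bash reports an error and skips the loop entirely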

Deleting all files in a directory except the ones mentioned in a list [duplicate]

I have a directory called a00 containing 3000 files with the extension .SAC. I have a text file called gd.list containing the names of 88 of those 3000 files. I am trying to write a script that will delete all .SAC files except those mentioned in gd.list.
How do I do that using shell/bash?
The rm command is commented out so that you can check and verify that it's working as needed. Then just un-comment that line.
The check directory section will ensure you don't accidentally run the script from the wrong directory and clobber the wrong files.
You can remove the echo deleting line to run silently.
#!/bin/bash
cd /home/me/myfolder2tocleanup/
# Exit if the directory isn't found.
if (($? > 0)); then
echo "Can't find work dir... exiting"
exit 1
fi
for i in *.SAC; do
if ! grep -qxFe "$i" gd.list; then
echo "Deleting: $i"
# the next line is commented out; test first, then uncomment to remove the files
# rm -- "$i"
fi
done
You can find the answer here https://askubuntu.com/questions/830776/remove-file-but-exclude-all-files-in-a-list by L. D. James
There are a few alternatives.
I'd prefer the null-delimited route (find -print0 feeding grep -z) as it more clearly demarcates the file names:
find . -maxdepth 1 -name '*.sac' -print0 | grep -z -x -F -v -f gd.list | xargs -0 echo rm
Again, test this first. Perhaps sort the output and make sure it is unique versus the original file. One caveat: find prints each name with a leading ./, so the entries in gd.list must be in that form too (with GNU find you could emit bare names via -printf '%f\0').
For a smaller list of filenames I would recommend just using find with -and -not -name and -delete, but with a larger list that can be tricky.
You could tag the files you want to keep as read-only, then delete everything matching the wildcard with the appropriate setting in rm or find to skip read-only files. That assumes you own the read-only flag. If the read-only flag is not for you, you could tag the files as executable instead and use find.
Another option would be to move the matching files to a temp folder, delete the wildcard, then move the files you want to keep back. That is assuming you can afford for the files to disappear temporarily.
To make them disappear for a shorter time, move the kept files out to a temp directory, move the original directory out, move the temp directory in, then delete the moved-out directory.
If you are feeling brave, try something like
ls *.sac | fgrep -v -f gd.list | xargs echo rm
Note that I've put an echo in that xargs, just to make sure no one has a cut and paste accident.
Note also the limitations of this approach mentioned in the comments. As I said, if you are feeling brave...
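A somewhat safer variant of the same idea uses comm to compute the set difference. A sketch, assuming gd.list holds bare file names, one per line, and that no name contains a newline:
#!/usr/bin/env bash
# Print an rm command for every *.SAC not named in gd.list.
cd /path/to/a00 || exit 1
comm -23 <(printf '%s\n' *.SAC | sort) <(sort gd.list) |
while IFS= read -r f; do
    echo rm -- "$f"  # drop the echo once the output looks right
done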

How to recursively get all files filtered by multiple extensions within a folder including working folder without using find in Bash script

I have this question after quite a day of searching the net; perhaps I'm doing something wrong. Here is my script:
#!/bin/bash
shopt -s extglob
FILE_EXTENSIONS=properties\|xml\|sh\|sql\|ksh
SOURCE_FOLDER=$1
if [ -z "$SOURCE_FOLDER" ]; then
SOURCE_FOLDER=$(pwd)
fi # Set directory to current working folder if no input parameter.
for file in $SOURCE_FOLDER/**/*.*($FILE_EXTENSIONS)
do
echo Working with file: $file
done
Basically, I want to recursively get all the files filtered by a list of extensions within a directory that is passed as an argument, including files in the directory itself.
I would like to know if there is a way of doing this, and how, without using the find command.
Imagine I have this file tree:
bin/props.properties
bin/xmls.xml
bin/source/sources.sh
bin/config/props.properties
bin/config/folders/moreProps.xml
My script, as it is right now and running from /bin, would echo:
bin/source/sources.sh
bin/config/props.properties
bin/config/folders/moreProps.xml
Leaving the ones in the working path aside.
P.S. I know this can be done with find but I really want to know if there's another way for the sake of learning.
Thanks!
You can use find with grep, like this:
#!/bin/bash
SOURCE_FOLDER=$1
EXTENSIONS="properties|xml|sh|sql|ksh"
find "$SOURCE_FOLDER" | grep -E "\.(${EXTENSIONS})$"
# or even better
find "$SOURCE_FOLDER" -regextype posix-egrep -regex ".*\.(${EXTENSIONS})$"
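Since the question explicitly asks for a way without find, here is a sketch of how the original script can be repaired. The missing pieces are globstar (** only recurses when it is set; without it, ** behaves like a single *) and the extglob form @(...), which matches exactly one of the alternatives. Requires bash 4+; with globstar set, ** also matches zero directories, so files directly in the working folder are included:
#!/usr/bin/env bash
shopt -s globstar extglob nullglob
source_folder=${1:-$PWD}  # default to the current directory
for file in "$source_folder"/**/*.@(properties|xml|sh|sql|ksh); do
    echo "Working with file: $file"
done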

How to make this (l)unix script dynamically accept directory name in for-loop?

I am teaching myself more (l)unix skills and wanted to see if I could begin to write a program that will eventually read all .gz files and expand them. However, I want it to be super dynamic.
#!/bin/bash
dir=~/derp/herp/path/goes/here
for file in $(find "$dir" -name '*gz')
do
echo $file
done
So when I execute this file, I simply run
bash derp.sh.
I don't like this. I feel the script is too brittle.
How can I rework my for loop so that I can say
bash derp.sh ~/derp/herp/path/goes/here (1)
I tried re-coding it as follows:
for file in $*
However, I don't want to have to type
bash derp.sh ~/derp/herp/path/goes/here/*.gz
How could I rewrite this so I could simply type what is in (1)? I feel I must be missing something simple?
Note
I tried
for file in $*/*.gz and that obviously did not work. I appreciate your assistance, my sources have been a wrox unix text, carpentry v5, and man files. Unfortunately, I haven't found anything that will what I want.
Thanks,
GeekyOmega
for dir in "$@"
do
for file in "$dir"/*.gz
do
echo "$file"
done
done
Notes:
In the outer loop, dir is assigned successively to each argument given on the command line. The special form "$@" is used so that directory names containing spaces will be processed correctly.
The inner loop runs over each .gz file in the given directory. By placing $dir in double-quotes, the loop will work correctly even if the directory name contains spaces. This form will also work correctly if the gz file names have spaces.
#!/bin/bash
for file in $(find "$@" -name '*.gz')
do
echo "$file"
done
You'll probably prefer "$@" instead of $*; if you were to have spaces in filenames, like with a directory named My Documents and a directory named Music, $* would effectively expand into:
find My Documents Music -name '*.gz'
where "$@" would expand into:
find "My Documents" "Music" -name '*.gz'
Requisite note: Using for file in $(find ...) is generally regarded as a bad practice, because it does tend to break if you have spaces or newlines in your directory structure. Using nested for loops (as in John's answer) is often a better idea, or using find -print0 and read as in this answer.
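For reference, a sketch of that find -print0 / read -d '' pattern (my rendering of the idea, not the linked answer verbatim):
#!/usr/bin/env bash
# Space- and newline-safe iteration over .gz files under each argument.
for dir in "$@"; do
    while IFS= read -r -d '' file; do
        echo "$file"  # e.g. replace with: gunzip -k -- "$file"
    done < <(find "$dir" -name '*.gz' -print0)
done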

How to open all files in a directory in Bourne shell script?

How can I use the relative path or absolute path as a single command line argument in a shell script?
For example, suppose my shell script is on my Desktop and I want to loop through all the text files in a folder that is somewhere in the file system.
I tried sh myshscript.sh /home/user/Desktop, but this doesn't seem to work. And how would I cope with directory names and file names containing whitespace?
myshscript.sh contains:
for i in `ls`
do
cat $i
done
Superficially, you might write:
cd "${1:-.}" || exit 1
for file in *
do
cat "$file"
done
except you don't really need the for loop in this case:
cd "${1:-.}" || exit 1
cat *
would do the job. And you could avoid the cd operation with:
cat "${1:-.}"/*
which lists (cats) all the files in the given directory, even if the directory or the file names contains spaces, newlines or other difficult to manage characters. You can use any appropriate glob pattern in place of * — if you want files ending .txt, then use *.txt as the pattern, for example.
This breaks down if you might have so many files that the argument list is too long. In that case, you probably need to use find:
find "${1:-.}" -maxdepth 1 -type f -exec cat {} +
(Note that -maxdepth is a GNU find extension.)
Avoid using ls to generate lists of file names, especially if the script has to be robust in the face of spaces, newlines etc in the names.
Use a glob instead of ls, and quote the loop variable:
for i in "$1"/*.txt
do
cat "$i"
done
PS: ShellCheck automatically points this out.
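For completeness, the pieces above combine into a small script; a sketch in plain POSIX sh, since the question asks about the Bourne shell (the .txt pattern matches the question's example):
#!/bin/sh
# Usage: sh myshscript.sh /path/to/folder
cd "${1:-.}" || exit 1
for file in *.txt; do
    [ -e "$file" ] || continue  # skip the unexpanded pattern if nothing matches
    cat "$file"
done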
