Display contents of files which are greater than 0 bytes - Linux

I need to write a shell script.
I have a bunch of files in a directory. From there, I need to display the contents of the files which are greater than 0 bytes in size, and delete the files which are 0 bytes in size.
Please help. Thanks in advance.

I found an answer which works fine, but any more inputs will be welcome.
The answer is the following, which I need to use in a shell script.
find . -size 0c -delete
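For reference, a minimal script wrapping both requirements (a sketch; -maxdepth 1 and -delete are GNU find extensions, and it only looks at the top level of the directory):
#!/bin/bash
# show the contents of non-empty regular files, then delete the empty ones
find . -maxdepth 1 -type f -size +0c -exec cat {} +
find . -maxdepth 1 -type f -size 0c -delete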

Here's something that will work:
#!/bin/bash
for f in * ; do
    if [ -f "$f" ] ; then
        if [ -s "$f" ] ; then
            ls "$f"
        else
            rm "$f"
        fi
    fi
done
Note, it only looks in the current directory. You could also pass a directory in as an argument, or use some other method. Also, this won't pick up hidden files (.*).
The key to how it works is the pair of Bash conditional expressions -s (true if the file has a size greater than 0) and -f (true if it is a regular file).
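A quick way to see those two tests in action (a sketch; the file names are made up):
touch empty.txt && echo data > full.txt
[ -s empty.txt ] && echo "empty.txt has content"     # prints nothing: -s is false for an empty file
[ -s full.txt ] && echo "full.txt has content"       # prints
[ -f full.txt ] && echo "full.txt is a regular file" # prints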

To display the contents of all files under topdir that have non-zero size:
find topdir -type f \! -size 0c -exec cat {} +
To delete all completely empty files under the same directory:
find topdir -type f -size 0c -ok rm {} \;
Replace -ok with -exec (and change the \; at the end to +) if you don't want to confirm each removal.
This solution assumes a POSIX find.
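If you'd rather do both in one traversal, the two commands can be merged (a sketch, assuming GNU find, since -delete is not POSIX): empty files are deleted, and everything else falls through to cat.
find topdir -type f \( -size 0c -delete -o -exec cat {} + \)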

for i in * ; do
    if [ -s "$i" ] ; then
        cat "$i"
    else
        rm -f "$i"
    fi
done
Note that iterating over `ls` output breaks on filenames containing spaces (you would need to change the IFS env variable); the glob above avoids that, or think about using the "find" command instead.
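For completeness, a space-safe loop using find (a sketch; -print0 and -maxdepth are GNU/BSD extensions, and read -d '' is a bashism):
while IFS= read -r -d '' f; do
    if [ -s "$f" ]; then
        cat "$f"
    else
        rm -f -- "$f"
    fi
done < <(find . -maxdepth 1 -type f -print0)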

Related

Bash Globbing Pattern Matching for Imagemagick recursive convert to pdf

I have the following 2 scripts, which recursively convert folders of images to PDFs for my wife's Japanese manga Kindle, using find and ImageMagick convert:
#!/bin/bash
_d="$(pwd)"
echo "$_d"
find . -type d -exec echo "Will convert in the following order: {}" \;
find . -type d -exec echo "Converting: '{}'" \; -exec convert '{}/*.jpg' "$_d/{}.pdf" \;
and the same for PNG
#!/bin/bash
_d="$(pwd)"
echo "$_d"
find . -type d -exec echo "Will convert in the following order: {}" \;
find . -type d -exec echo "Converting: '{}'" \; -exec convert '{}/*.png' "$_d/{}.pdf" \;
Unfortunately I am not able to make one universal script that works for all image formats.
How do I make one script that works for both?
I would also need JPG and PNG, as well as jpeg and JPEG.
Thanks in advance
I wouldn't use find at all, just a loop:
#!/usr/bin/env bash
# enable recursive globs
shopt -s globstar
for dir in **/*/; do
    printf "Converting jpgs in %s\n" "$dir"
    convert "$dir"/*.jpg "$dir/out.pdf"
done
If you want to combine .jpg and .JPG in the same pdf, add nocaseglob to the shopt line. Want .jpeg in the mix too? Add extglob and change "$dir"/*.jpg to "$dir"/*.@(jpg|jpeg)
You can do more complicated actions if you turn the find exec into a bash function (or even a standalone script).
#!/bin/bash
do_convert()(
    shopt -s nullglob
    for dir in "$@"; do
        files=("$dir"/*.{jpg,JPG,PNG,jpeg,JPEG})
        if (( ${#files[@]} == 0 )); then
            echo 1>&2 "no suitable files in $dir"
            continue
        fi
        echo "Converting $dir"
        convert "${files[@]}" "$dir.pdf"
    done
)
export -f do_convert
pwd
echo "Will convert in the following order:"
find . -type d
# find . -type d -exec bash -c 'do_convert {}' \;
find . -type d -exec bash -c 'do_convert "$@"' -- {} \+
nullglob makes *.xyz return nothing if there is no match, instead of returning the original string unchanged
p/*.{a,b,c} expands into p/*.a p/*.b p/*.c before the *s are expanded
x()(...) instead of the more normal x(){...} uses a subshell so we don't have to remember to unset nullglob again or clean up any variable definitions
export -f x makes function x available in subshells
we skip conversion if there are no suitable files
with the slightly more complicated find command, we can reduce the number of invocations of bash (probably doesn't save a great deal in this particular case)
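A quick illustration of the nullglob and brace-expansion points, in a hypothetical empty directory:
mkdir demo && cd demo
shopt -s nullglob
echo *.{jpg,png}   # brace expansion gives *.jpg *.png; nullglob makes the unmatched globs vanish
shopt -u nullglob
echo *.{jpg,png}   # without nullglob the patterns stay literal: *.jpg *.png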
how about a one-liner
dry-run
find -name \*.jpg -or -name \*.png | xargs -I xxx echo "xxx =>" xxx.pdf
run
find -name \*.jpg -or -name \*.png | xargs -I xxx convert xxx xxx.pdf
help
-name match name
-or logical or => both jpg and png
xargs maps each input into a name on which to execute a command
-I select a name; it is like {} in find
NOTE
instead of $(pwd), which is a command substitution, you can use the variable $PWD
xxx maps to a name, and xxx.pdf still has the matched extension found by find, which means filename.png becomes filename.png.pdf. If this is not desired, you can sed it (see below)
to run the convert command in parallel you can use -P 0 with xargs -- see xargs --help and the sketch below
With sed to remove extensions
dry-run
find -name \*.jpg -or -name \*.png | sed 's/\.\(png\|jpg\)$//' | xargs -I xxx echo "xxx =>" xxx.pdf
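Putting the pieces together, a parallel run might look like this (a sketch, assuming GNU find and xargs; -print0/-0 keep names with spaces intact, and -P 0 runs as many convert processes as possible):
find . \( -name '*.jpg' -o -name '*.png' \) -print0 |
    xargs -0 -P 0 -I xxx convert xxx xxx.pdf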
@shawn Your solution works; just as I stated in the comments, I am too stupid to name the resulting pdf properly (after the folder) and save it in the script caller's directory. Nevertheless, it solves my case-insensitive jpg, jpeg, png problems just fine.
Here is shawn's solution:
#!/bin/bash
# enable recursive globs
shopt -s globstar nocaseglob extglob
for dir in **/*/; do
    printf "Converting (jpg|jpeg|png) in %s\n" "$dir"
    convert "$dir"/*.@(jpg|jpeg|png) "$dir/out.pdf"
done
@jhnc Your solution works out of the box; it does exactly what I intended, and I really like calling functions, or even standalone scripts, to increase complexity. One drawback is that I cannot Ctrl-C the process; is that because it is threaded, or runs in a subshell? I think you were missing an exit statement at the end of the function; it never stopped.
#!/bin/bash
do_convert()(
    shopt -s nullglob
    for dir in "$@"; do
        files=("$dir"/*.{jpg,JPG,png,PNG,jpeg,JPEG})
        if (( ${#files[@]} == 0 )); then
            echo 1>&2 "no suitable files in $dir"
            continue
        fi
        echo "Converting $dir"
        convert "${files[@]}" "$dir.pdf"
    done
    exit
)
export -f do_convert
pwd
echo "Will convert in the following order:"
find . -type d
# find . -type d -exec bash -c 'do_convert {}' \;
find . -type d -exec bash -c 'do_convert "$@"' -- {} \+
@everyone else: it's already after midnight again; I guess this is a trivial question for you guys, and I am very grateful for ALL your answers. I didn't have the time to try everything.
I find Linux bash very challenging.
A lot of ways to skin this cat. My thought is:
for F in $(find . -type f -print)
do
    TYPE=$(file -b --mime-type "$F")
    if [ "$TYPE" = image/png ]
    then
        : ## do png conversion here
    elif [ "$TYPE" = image/jpeg ]
    then
        : ## do jpg conversion here
    fi
done

How to rename directories and subdirectories recursively in Linux?

Say I have 200 directories, each with a variable hierarchy of subdirectories. How can I rename the directories and their subdirectories using the mv command, with find or some other combination?
for dir in ./*/; do (i=1; cd "$dir" && for dir in ./*; do printf -v dest %s_%02d "$dir" "$((i++))"; echo mv "$dir" "$dest"; done); done
This handles 2 levels of subdirectories; is there a cleaner way to do it for deeper hierarchies? Any other one-line command suggestions/solutions are welcome.
I had a specific task: to replace non-ASCII symbols and square brackets in directory names as well as in file names. It works fine.
First, exactly my case, as a working example:
find . -depth -execdir rename -v 's/([^\x00-\x7F]+)|([\[\]]+)/\_/g' {} \;
or separately non-ascii and brackets:
find . -depth -execdir rename -v 's/[^\x00-\x7F]+/\_/g' {} \;
find . -depth -execdir rename -v 's/[\[\]]+/\_/g' {} \;
If we'd like to work only with directories, add -type d (after the -depth option)
Now, in more generalized view:
find . -depth [-type d] [-type f] -execdir rename [-v] 's/.../.../g' '{}' \;
Here we can control dirs/files and verbosity. Quotes around {} may or may not be needed on your machine (the backslash before the ; serves the same purpose and may be replaced with quotes).
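As a further illustration of the generalized form, a sketch (using the same Perl rename as above) that replaces spaces with underscores in every file and directory name, deepest entries first:
find . -depth -execdir rename -v 's/ /_/g' '{}' \;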
You have two options when you want to do recursive operations in files/directories:
Option 1 : Find
while IFS= read -r -d '' subd; do
    #do your stuff here with var $subd
done < <(find . -type d -print0)
In this case we use find to return only dirs, using -type d.
We can ask find to return only files using -type f, or not specify any type, in which case both directories and files will be returned.
We also use the find option -print0 to force null separation of the results, which ensures correct name handling in case names include special characters like spaces.
Testing:
$ while IFS= read -r -d '' s;do echo "$s";done < <(find . -type d -print0)
.
./dir1
./dir1/sub1
./dir1/sub1/subsub1
./dir1/sub1/subsub1/subsubsub1
./dir2
./dir2/sub2
Option 2 : Using Bash globstar option
shopt -s globstar
for subd in **/ ; do
    #Do your stuff here with $subd directories
done
In this case, the for loop will match all subdirectories under the current working directory (the **/ pattern).
You can also ask bash to return both files and folders:
for sub in ** ; do
    if [[ -d "$sub" ]]; then
        : #actions for folders
    elif [[ -e "$sub" ]]; then
        : #actions for files
    else
        : #do something else
    fi
done
Folders Test:
$ shopt -s globstar
$ for i in **/ ;do echo "$i";done
dir1/
dir1/sub1/
dir1/sub1/subsub1/
dir1/sub1/subsub1/subsubsub1/
dir2/
dir2/sub2/
In your small script, just by enabling shopt -s globstar and by changing your for to for dir in **/; do, it seems to work as you expect.
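For concreteness, the question's loop adapted that way might look like this (a sketch: the echo keeps it a dry run, and since renaming a parent directory invalidates the already-expanded paths of its children, check the ordering before dropping the echo):
shopt -s globstar
i=1
for dir in **/; do
    printf -v dest '%s_%02d' "${dir%/}" "$((i++))"
    echo mv -- "$dir" "$dest"
done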

check if find command returns something (in bash script)

I have the following bash script on my server:
today=$(date +"%Y-%m-%d")
find /backups/www -type f -mtime -1|xargs tar uf /daily/backup-$today.tar
as you can see it creates backups of files modified/created in the last 24h. However, if no files are found, it creates a corrupted tar file. I would like to wrap it in an if..fi statement so it doesn't create empty/corrupted tar files.
Can someone help me modify this script?
Thanks
You can check whether the command succeeded, then check whether the result is empty:
today=$(date +"%Y-%m-%d")
results=$(find /backups/www -type f -mtime -1)
if [[ 0 == $? ]] ; then
    if [[ -z $results ]] ; then
        echo "No files found"
    else
        tar uf "/daily/backup-$today.tar" $results
    fi
else
    echo "Search failed"
fi
find /backups/www -type f -mtime -1 -exec tar uf /daily/backup-$today.tar {} +
Using -exec is preferable to xargs: there's no pipeline needed, and it will handle file names with spaces, newlines, and other unusual characters without extra work. The {} at the end is a placeholder for the file names, and + marks the end of the -exec command (needed in case there were more arguments to find).
As a bonus it won't execute the command if no files are found.
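So the original script, rewritten with this approach, might look like the following (a sketch; the quotes around the archive path are only added for safety):
today=$(date +"%Y-%m-%d")
find /backups/www -type f -mtime -1 -exec tar uf "/daily/backup-$today.tar" {} +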
One relatively simple trick would be this:
today=$(date +"%Y-%m-%d")
touch /backups/www/.timestamp
find /backups/www -type f -mtime -1|xargs tar uf /daily/backup-$today.tar
That way you're guaranteed to always find at least one file (and it's minimal in size).
xargs -r does nothing if there is no input.
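Applied to the original pipeline (a sketch; -r, i.e. --no-run-if-empty, is a GNU xargs extension, and like the original this still assumes file names without spaces):
today=$(date +"%Y-%m-%d")
find /backups/www -type f -mtime -1 | xargs -r tar uf "/daily/backup-$today.tar"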

How to delete every other file in a directory from a shell command?

I have extracted frames from a video in png format:
00000032.png
00000033.png
00000034.png
00000035.png
00000036.png
00000037.png
and so on...
I would like to delete every other frame from the dir using a shell command, how to do this?
EDIT
I think I wasn't clear in my question. I know I can delete each file manually like:
rm filename.png
rm filename2.png
etc...
I need to do all this in one command dynamically because there are thousands of images in the folder.
This should do the trick:
rm -f *[13579].png
which would exterminate every file whose name ends with "1", "3", "5", "7" or "9" followed by ".png".
Note: * in a pattern stands for 0 or more characters, so 1.png will match, but so would foo1.png
delete=yes
for file in *.png
do
    if [ $delete = yes ]
    then rm -f "$file"; delete=no
    else delete=yes
    fi
done
This forces strict alternation even if the numbers on the files are not consecutive. You might choose to speed things up with xargs by using:
delete=yes
for file in *.png
do
    if [ $delete = yes ]
    then echo "$file"; delete=no
    else delete=yes
    fi
done |
xargs rm -f
Your names look like they're sane (no spaces or other weird characters to deal with), so you don't have to worry about some of the minutiae that a truly general purpose tool would have to deal with. You might even use:
ls *.png |
awk 'NR % 2 == 1 { print }' |
xargs rm -f
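If the names did contain spaces or other oddities, a pure-glob version avoids parsing ls output entirely (a sketch; globs expand in sorted order, and removing the even-indexed entries deletes every other file):
files=(*.png)
for ((i = 0; i < ${#files[@]}; i += 2)); do
    rm -f -- "${files[i]}"
done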
There are lots of ways to achieve your desired result.
rm ???????1.png
rm ???????3.png
rm ???????5.png
rm ???????7.png
rm ???????9.png
(but make a backup before you try it!). Replace "rm" with "erase" for dos/windows.
Suppose "every other" means the files whose names end in digit 1, 3, 5, 7 or 9; then this solves your problem:
find . -regex '.*[13579]\.png' -exec rm {} \;
Other than what?
You can use * to delete multiple frames. For example, rm -f *.png deletes them all.
This small script removes all png files:
$ find . -name "*.png" -exec /bin/rm {} \;
Pay attention to the dot; it means the current directory.
This is the same, but more secure:
$ find . -name "*.png" -delete
Now, to remove all files that do not have the png extension:
$ find . ! -name "*.png" -type f -delete

A bash script to run a program for directories that do not have a certain file

I need a Bash script to execute a program for all directories that do not have a specific file, and to create the output file in the same directory. This program needs an input file, which exists in every directory with the name *.DNA.fasta. Suppose I have the following directories, which may also contain subdirectories:
dir1/a.protein.fasta
dir2/b.protein.fasta
dir3/anyfile
dir4/x.orf.fasta
I have started by finding the directories that don't have that specific file, whose name is *.protein.fasta.
In this case I want dir3 and dir4 to be listed (since they do not contain *.protein.fasta).
I have tried this code:
find . -maxdepth 1 -type d \! -exec test -e '{}/*protein.fasta' \; -print
but it seems I missed something; it does not work.
Also, I do not know how to proceed with the rest of the task.
This is a tricky one.
I can't think of a good solution. But here's a solution, nevertheless. Note that this is guaranteed not to work if your directory or file names contain newlines, and it's not guaranteed to work if they contain other special characters. (I've only tested with the samples in your question.)
Also, I haven't included a -maxdepth because you said you need to search subdirectories too.
#!/bin/bash
# Create an associative array
declare -A excludes
# Build an associative array of directories containing the file
while read -r line; do
    excludes[$(dirname "$line")]=1
    echo "excluded: $(dirname "$line")" >&2
done <<EOT
$(find . -name "*protein.fasta" -print)
EOT
# Walk through all directories, print only those not in the array
find . -type d \
| while read -r line ; do
    if [[ ! ${excludes[$line]} ]]; then
        echo "$line"
    fi
done
For me, this returns:
.
./dir3
./dir4
All of which are directories that do not contain a file matching *.protein.fasta. Of course, you can replace the last echo "$line" with whatever you need to do with these directories.
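For instance, replacing the echo with an actual program call might look like this (a sketch; Programfoo and output.txt are placeholders, and the glob picks up the *.DNA.fasta input the question says every directory has):
for fasta in "$line"/*.DNA.fasta; do
    [ -e "$fasta" ] && Programfoo "$fasta" > "$line/output.txt"
done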
Alternately:
If what you're really looking for is just the list of top-level directories that do not contain the matching file in any subdirectory, the following bash one-liner may be sufficient:
for i in *; do test -d "$i" && ( find "$i" -name '*protein.fasta' | grep -q . || echo "$i" ); done
#!/bin/bash
for dir in *; do
    test -d "$dir" && ( find "$dir" -name '*protein.fasta' | grep -q . || Programfoo "$dir/$dir.DNA.fasta" )
done
