Using bash, how can I remove the extensions of all files in a specific directory?

I want to keep the files but remove their extensions. The files do not all have the same extension. My end goal is to remove all the extensions and then change them to one single extension of my choice; I have the second part down.
My code so far:
#!/bin/bash
echo -n "Enter the directory: "
read path
#Remove all extensions
find "$path" -type f -exec mv '{}' '{}'.extension \; # add the desired extension

You don't need an external command like find for this; you can do it in bash alone. The loop below removes the extension from all the files in the directory stored in path.
for file in "$path"/*; do
[ -f "$file" ] || continue
mv "$file" "${file%.*}"
done
The [ -f "$file" ] test is only a safety check. The glob expression "$path"/* might match no files at all; in that case bash leaves the pattern unexpanded, $file holds the literal string $path/*, and the mv command would fail. Since no file by that name exists, [ -f "$file" ] returns a non-zero exit code, and || (which runs its right-hand command only when the command before it fails) executes continue, skipping straight to the next iteration of the loop. The same test also skips anything that is not a regular file, such as directories.
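As a side note, bash's nullglob option gives a similar safety net: with it enabled, an unmatched glob expands to nothing, so the loop body simply never runs on a literal pattern. A minimal sketch (the -f test is kept only to skip directories):
shopt -s nullglob              # unmatched globs now expand to nothing
for file in "$path"/*; do
[ -f "$file" ] || continue     # still skip anything that is not a regular file
mv -- "$file" "${file%.*}"
done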
If you also want to add a new extension, just do:
mv "$file" "${file%.*}.extension"

This could also be a way
find . -type f | while read -r i; do mv "$i" "${i%.*}.ext"; done

You might want to try the below. It uses find and awk with system() to remove the extension:
find . -name "*" -type f|awk 'BEGIN{FS="/"}{print $2}'|awk 'BEGIN{FS="."}{system("mv "$0" ./"$1"")}'
example:
[root@puppet:0 check]# ls -lrt
total 0
-rw-r--r--. 1 root root 0 Oct 5 13:49 abc.ext
-rw-r--r--. 1 root root 0 Oct 5 13:49 something.gz
[root@puppet:0 check]# find . -name "*" -type f|awk 'BEGIN{FS="/"}{print $2}'|awk 'BEGIN{FS="."}{system("mv "$0" ./"$1"")}'
[root@puppet:0 check]# ls -lrt
total 0
-rw-r--r--. 1 root root 0 Oct 5 13:49 abc
-rw-r--r--. 1 root root 0 Oct 5 13:49 something
also if you have a specific extension that you want to add to all the files, you may modify the command as below:
find . -name "*" -type f|awk 'BEGIN{FS="/"}{print $2}'|awk 'BEGIN{FS=".";ext=".png"}{system("mv "$0" ./"$1ext"")}'

Related

How to get echo to print only deleted file paths?

I'm trying to write a script, to be run from cron, that creates mysqldumps daily in a directory and also checks all the backups in that directory, removing any older than 7 days.
My functions work correctly; it's just my last echo command that is not doing what I want it to. This is what I have so far:
DBNAME=database
DATE=`date +\%Y-\%m-\%d_\%H\%M`
SQLFILE=$DBNAME-${DATE}.sql
curr_dir=$1
#MAIN
mysqldump -u root -ppassword --databases $DBNAME > $SQLFILE
echo "$SQLFILE has been successfully created."
#Remove files older than 7 days
for filepath in "$curr_dir"*
do
find "$filepath" -mtime +7 -type f -delete
echo "$filepath has been deleted."
done
exit
So the backup creations and removal of old files both work. But, my problem is that echo "$filepath has been deleted." is printing all files in the directory instead of just the files older than 7 days that were deleted. Where am I going wrong here?
EDIT (Full solution):
This is the full solution that wound up working for me using everyone's advice from the answers and comments. This works for cron jobs. I had to specify the main function's output filepath because the files were being created in the root directory instead of the path specified in Argument $1.
Thank you everyone for the help! The if statement also checks whether or not $1 is the specified directory I want files to be deleted in.
#Variables
DBNAME=database
DATE=`date +\%Y-\%m-\%d_\%H\%M`
SQLFILE=$DBNAME-${DATE}.sql
curr_dir=$1
#MAIN
mysqldump -u root -ppassword --databases $DBNAME > /path/to/db-backups/directory/$SQLFILE
echo "$SQLFILE has been successfully created."
#Remove files older than 7 days
for filepath in "$curr_dir"*
do
if [[ $1 = "/path/to/db-backups/directory" ]]; then
find "$filepath" -mtime +7 -type f -delete -exec sh -c 'printf "%s has been deleted.\n" "$#"' _ {} +
fi
done
exit
You can merge the echo into the find:
find "$filepath" -mtime +7 -type f -delete -exec echo '{}' "has been deleted." \;
The -delete option is effectively a shortcut for -exec rm '{}' \; (it also implies -depth), and all the actions are run, per file, in the sequence you specify them in.
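If the bare paths are enough (without the "has been deleted." text), you could also let find itself do the reporting by putting -print before -delete, since the actions are evaluated left to right for each file:
find "$filepath" -mtime +7 -type f -print -delete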

How to prefix folders and files within?

I'm stuck looking for a one-liner to add a prefix to all subfolder names and file names in a directory,
e.g. "AAA" in the examples below:
/folder/AAAfile.txt
/folder/AAAread/AAAdoc.txt
/folder/AAAread/AAAfinished/AAAread.txt
I've tried using xargs and find, but can't get them to go recursively through the subdirectories and their contents. Any suggestions?
James
You could use something like this:
find . -mindepth 1 | sort -r | xargs -l -I {} bash -c 'mv "$1" "${1%/*}/AAA${1##*/}"' _ {}
Tested with your folder structure, executed from the root folder (the one containing AAAfile.txt).
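For reference, a variant of the same idea that uses find -depth (so children are renamed before their parent directories) and null-delimited reads to cope with spaces in names could look like this sketch:
find . -depth -mindepth 1 -print0 | while IFS= read -r -d '' p; do
mv -- "$p" "${p%/*}/AAA${p##*/}"    # prepend AAA to the last path component
done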
The following script should meet your need (run it from inside your folder directory):
for i in $(find . -depth -mindepth 1); do
dname=$(dirname "$i")
fname=AAA$(basename "$i")
if [ -f "$i" ]
then
mv "$i" "$dname/$fname"
fi
#this could be merged with the previous condition but is kept separate to avoid renaming anything that is neither a regular file nor a directory
if [ -d "$i" ]
then
mv "$i" "$dname/$fname"
fi
done

Get files in bash script that contain an underscore

In a directory with a bunch of files that look like this:
./test1_November 08, 2014 AM.flv
./test2.flv
./script1.sh
./script2.sh
I want to process only files that have an .flv extension and no underscore. I'm trying to eliminate the files with underscores without much luck.
bash --version
GNU bash, version 4.2.37(1)-release (x86_64-pc-linux-gnu)
script:
#!/bin/bash
FILES=$(find . -mtime 0)
for f in "${FILES}"
do
if [[ "$f" != *_* ]]; then
echo "$f"
fi
done
This gives me no files. Changing the != to == gives me all files instead of just those with an underscore. Other answers on SO indicate this should work.
Am I missing something simple here?
You can use this extended glob pattern:
shopt -s extglob
echo +([^_]).flv
+([^_]) will match 1 or more of any non underscore character.
Testing:
ls -l *flv
-rw-r--r-- 1 user1 staff 0 Nov 8 12:44 test1_November 08, 2014 AM.flv
-rw-r--r-- 1 user1 staff 0 Nov 8 12:44 test2.flv
echo +([^_]).flv
test2.flv
To process these files in a loop use:
for f in +([^_]).flv; do
echo "Processing: $f"
done
PS: I'm not sure why you're using -mtime 0 in your find, as my answer is for the requirement:
I want to process only files that have an .flv extension and no underscore
You can pass multiple name patterns to find and negate one with -not:
find . -name "*.flv" -not -name "*_*"
You can loop over the results of find by piping it into a while loop:
find -name "*.flv" -not -name "*_*" -print0 | while IFS= read -r -d '' filename; do
echo "$filename"
done
or you can forgo the loop completely and use xargs
find -name "*.flv" -not -name "*_*" -print0 | xargs -0 -n1 echo
The problem is this line:
for f in "${FILES}"
The quotes are preventing word splitting, so the entire list of filenames is being processed as a single item. What you want is:
IFS=$'\n'
for f in $FILES
The IFS setting makes it use newlines as the word delimiters, so you can have filenames with spaces in them.
A better way to write loops like this is to avoid using the variable:
find ... | while read -r f
do
...
done
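Applied to the question, that pattern could look like the following sketch (keeping the original -mtime 0 filter):
find . -mtime 0 -name "*.flv" -not -name "*_*" | while read -r f
do
echo "Processing: $f"    # replace with the real work
done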

How to perform an action on a list of directories using a for loop?

I am doing a check to see whether a directory is present inside each of the directories listed below. This table is the only information available about these directories.
user@root> cat u
Directory owner value
-------- ---- -----
0-0-1-0 Aleks 10
0-0-2-0 Ram 23
0-0-3-0 mark 43
0-0-4-0 Sam 22
0-0-5-0 wood 21
0-0-6-0 peter 34
0-0-7-0 ron 45
0-0-8-0 Alic 44
0-0-9-0 amber 56
0-0-10-0 janny 34
user@root> cat u |grep -Ev "owner|--"|awk '{print $1 }'
0-0-1-0
0-0-2-0
0-0-3-0
0-0-4-0
0-0-5-0
0-0-6-0
0-0-7-0
0-0-8-0
0-0-9-0
0-0-10-0
Query:
I want to go into each of the directories from 0-0-1-0 to 0-0-10-0 and perform some action. How can I do that?
For example, I want to validate whether an XYZ directory is present inside each of the directories.
user@root> test -d 0-0-1-0/XYZ; if [ "$?" != "0" ]; then echo "directory is missing"; fi
I think if I can store the value of each row in a variable, one row at a time, then the issue will be resolved.
You can process your list of directories like this:
#!/bin/bash
for dir in $(awk 'NR>2 {print $1}' "$1")
do
if [[ -d "$dir" ]]
then
# run in a subshell so the cd does not affect the next iteration
(
cd "$dir" || exit
pwd
# Do random stuff
)
fi
done
Run the script like this:
./script.sh my_list_of_files
If the directory exists it will cd to that directory and run pwd.
One warning though, this script will get a bit confused if any of your directories have a space in them.
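To tie this back to the original question (checking whether an XYZ directory exists inside each listed directory), a sketch along the same lines, assuming the table is in the file u as shown above, might be:
#!/bin/bash
# take the first column of the table, skipping the two header lines
for dir in $(awk 'NR>2 {print $1}' u)
do
if [ ! -d "$dir/XYZ" ]
then
echo "$dir: XYZ directory is missing"
fi
done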
If you know the DIR_NAME_TO_BE_SEARCHED, then you can use the following command:
find YOUR_STARTING_DIRECTORY -type d -name DIR_NAME_TO_BE_SEARCHED -print
example:
find . -type d -name test -print
explanation:
This will find all directories (-type d), starting from your current directory, whose name is test (-name test), and print them (-print).
And if you don't know the exact DIR_NAME_TO_BE_SEARCHED, you can use a pattern as well:
find YOUR_STARTING_DIRECTORY -type d -name "DIR_NAME_TO_BE_SEARCHED_PATTERN" -print
example:
find . -type d -name "*test*" -print

How can I add text to the same line?

I used this command to find mp3 files and write their names to log.txt:
find -name *.mp3 >> log.txt
I want to move the files using the mv command, and I would like the log file to also show the path each file has been moved to.
For example if the mp3 files are 1.mp3 and 2.mp3 then the log.txt should look like
1.mp3 >>>> /newfolder/1.mp3
2.mp3 >>>> /newfolder/2.mp3
How can I do that using unix commands? Thank you!
Using only mv:
mv -v *.mp3 tmp/ > log.txt
or using find:
find -name '*.mp3' -exec mv -v {} test/ \; >> log.txt
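If you want the log to use exactly the "1.mp3 >>>> /newfolder/1.mp3" format from the question, a small loop gives you full control over the output; this is a sketch assuming the mp3 files are in the current directory and /newfolder is the destination:
for f in *.mp3; do
mv -- "$f" /newfolder/ && echo "$f >>>> /newfolder/$f" >> log.txt
done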
You should probably use some scripting language like Perl or Python; text processing is rather awkward in the shell.
E.g. in Perl you can just postprocess the output of find, and print out what you did.
#!/usr/bin/perl -w
use strict;
use File::Find;
my @directories_to_search = ("/tmp/");
sub wanted {
print "$File::Find::name >>> newdir/$_\n";
# do what you want with the file, e.g. invoke commands on it using system()
}
find(\&wanted, @directories_to_search);
Doing it in Perl or similar makes some things easier than in the shell; in particular, handling of funny filenames (embedded spaces, special chars) is easier. Be careful when invoking system() commands though.
For docs on the File::Find module see http://perldoc.perl.org/File/Find.html .
GNU find
find /path -type f -iname "*.mp3" -printf "%f/%p\n" | while IFS="/" read -r filename path
do
mv "$path" "$destination"
echo "$filename >>> $destination/$filename " > newfile.txt
done
output
$ touch 'test"quotes.txt'
$ ls -ltr
total 0
-rw-r--r-- 1 root root 0 2009-11-20 10:30 test"quotes.txt
$ mkdir temp
$ ls -l temp
total 0
$ find . -type f -iname "*\"*" -printf "%f:%p\n" | while IFS=":" read filename path; do mv "$filename" temp ; done
$ ls -l temp
total 0
-rw-r--r-- 1 root root 0 2009-11-20 11:53 test"quotes.txt
