Rename files in multiple directories to the name of the directory - Linux

I have something like this:
v_1/file.txt
v_2/file.txt
v_3/file.txt
...
and I want to rename those files to something like this:
v_1.txt
v_2.txt
v_3.txt
...
in the same directory.
I guess I can use rename, but I can't figure out how to use it to rename the folder and the file at the same time.

The result can be achieved with a bash for loop and mv:
for subdir in *; do mv $subdir/file.txt $subdir.txt; done
Note that the solution above will not work if a directory name contains spaces, because the unquoted $subdir undergoes word splitting.
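A space-safe variant of the same loop (a sketch: the */ glob matches only directories, and the quotes protect names containing spaces):
for subdir in */; do
    mv "${subdir}file.txt" "${subdir%/}.txt"
done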
Another solution based on the comments (one that also works for directories with spaces in their names):
find . -mindepth 1 -type d -not -empty -exec mv {}/file.txt {}.txt \;
(-mindepth 1 skips the current directory itself; prefix mv with echo first for a dry run that only prints the commands.)

You can use rnm. The command would be:
rnm -fo -dp -1 -ns '/pd0/.txt' -ss '\.txt$' /path/to/the/directory
-fo implies file-only mode.
-dp sets the directory depth; -1 makes it recurse into all subdirectories.
-ns implies the name string, i.e., the new name of the file.
/pd0/ expands to the immediate parent directory of the file being renamed.
-ss is a search string (regex); '\.txt$' matches files whose names end in .txt.
/path/to/the/directory is the path where the v_1, v_2, ... directories reside. You can also pass the directories (v_1, v_2, ...) themselves in place of the parent directory path. For example:
#from inside the parent directory
rnm -fo -dp -1 -ns '/pd0/.txt' -ss '\.txt$' v_*

Seems pretty straightforward to me:
$ mkdir /tmp/sandbox
$ cd /tmp/sandbox
$ mkdir v_{1,2,3}
$ touch v_{1,2,3}/file.txt
$ rename -v 's#/file##' v_{1,2,3}/file.txt
rename v_1/file.txt v_1.txt
rename v_2/file.txt v_2.txt
rename v_3/file.txt v_3.txt
$ ls -F
v_1/ v_1.txt v_2/ v_2.txt v_3/ v_3.txt

Related

Bash script to sort files into sub folders based on extension

I have the following structure:
FolderA
    Sub1
    Sub2
    filexx.csv
    filexx.doc
FolderB
    Sub1
    Sub2
    fileyy.csv
    fileyy.doc
I want to write a script that will move the .csv files into the folder Sub1 for each parent directory (FolderA, FolderB, and so on), giving me the following structure:
FolderA
    Sub1
        filexx.csv
    Sub2
        filexx.doc
FolderB
    Sub1
        fileyy.csv
    Sub2
        fileyy.doc
This is what I have so far, but I get the error mv: cannot stat '*.csv': No such file or directory:
for f in */*/*.csv; do
    mv -v "$f" */*/Sub1;
done
for f in */*/*.doc; do
    mv -v "$f" */*/Sub2;
done
I am new to bash scripting so please forgive me if I have made a very obvious mistake. I know I can do this in Python as well but it will be lengthier which is why I would like a solution using linux commands.
find . -name "*.csv" -type f -execdir mv '{}' Sub1/ \;
Using find, search for all files with the extension .csv and then when we find them, execute a move command from within the directory containing the files, moving the files to directory Sub1
find . -name "*.doc" -type f -execdir mv '{}' Sub2/ \;
Follow the same principle for files with the extension .doc but this time, move the files to Sub2.
I believe you are getting this error because no file matched your wildcard. When that happens, the for loop gives $f the literal value of the wildcard itself, so you are effectively trying to move a file named *.csv, which does not exist.
To prevent this behavior, you can add shopt -s nullglob at the top of your script. With this option set, if no file matches, your script won't enter the loop at all.
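For example, a minimal sketch:
shopt -s nullglob
for f in */*/*.csv; do
    echo "found: $f"    # with nullglob, this body never runs when nothing matches
done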
My advice is to make sure you run your script from the correct location when using wildcards like this. But perhaps what you meant by writing */*/*.csv was to recursively match all the csv files; if so, this is not the right way to do it.
To recursively match all csv/doc/etc. files in native bash, add shopt -s globstar to the top of your script and use **/*.csv as the wildcard:
#!/bin/bash
shopt -s globstar nullglob
for f in **/*.csv; do
mv "$f" Destination/ # Note that $f is surrounded by "" to handle whitespaces in filenames
done
You could also use the find(1) utility to achieve this. But if you plan to do more processing on the files than just moving them, a for loop may be cleaner, since you won't have to inline everything into the same command.
Side note : "Linux commands" as you say are actually not Linux commands, they are part of the GNU utilities (https://www.gnu.org/gnu/linux-and-gnu.en.html)
If the csv files you want to move are in the top-level directories (from the point of view of the current directory) but not in their subdirectories, then simply:
#!/bin/bash
for dir in */; do
mv -v "$dir"*.csv "${dir}Sub1/"
mv -v "$dir"*.doc "${dir}Sub2/"
done
If you want the files in all subdirectories to be moved similarly, then:
shopt -s globstar
for file in **/*.csv; do
mv -v "$file" "${file%/*}/Sub1/"
done
for file in **/*.doc; do
mv -v "$file" "${file%/*}/Sub2/"
done
Note that the directories Sub1 and Sub2 are relative to the directory where the csv and doc files reside.

Linux recursive copy files to its parent folder

I want to recursively copy files with a specific extension to their parent folders. For example:
./folderA/folder1/*.txt to ./folderA/*.txt
./folderB/folder2/*.txt to ./folderB/*.txt
etc.
I checked cp and find commands but couldn't get it working.
I suspect that while you say copy, you actually mean to move the files up to their respective parent directories. It can be done easily using find:
$ find . -name '*.txt' -type f -execdir mv -n '{}' ../ \;
The above command recurses into the current directory . and then applies the following cascade of conditionals to each item found:
-name '*.txt' will filter out only files that have the .txt extension
-type f will filter out only regular files (e.g., not directories that, for whatever reason, happen to have a name ending in .txt)
-execdir mv -n '{}' ../ \; executes the command mv -n '{}' ../ in the containing directory where the {} is a placeholder for the matched file's name and the single quotes are needed to stop the shell from interpreting the curly braces. The ; terminates the command and again has to be escaped from the shell interpreting it.
I have passed the -n flag to the mv program to avoid accidentally overwriting an existing file.
The above command will transform the following file system tree
dir1/
    dir11/
        file3.txt
        file4.txt
    dir12/
    file2.txt
dir2/
    dir21/
        file6.dat
    dir22/
        dir221/
            file8.txt
        file7.txt
    file5.txt
dir3/
    file9.dat
file1.txt
into this one:
dir1/
    dir11/
    dir12/
    file3.txt
    file4.txt
dir2/
    dir21/
        file6.dat
    dir22/
        dir221/
        file8.txt
    file7.txt
dir3/
    file9.dat
file2.txt
file5.txt
(file1.txt, which was already at the top level, gets moved up into the parent of the current directory.)
To get rid of the empty directories, run
$ find . -type d -empty -delete
Again, this command will traverse the current directory . and then apply the following:
-type d this time filters out only directories
-empty filters out only those that are empty
-delete deletes them.
Fine print: -execdir is not specified by POSIX, though major implementations (at least the GNU and BSD one) support it. If you need strict POSIX compliance, you'll have to make do with the less safe -exec which would need additional thought to be applied correctly in this case.
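For example, a POSIX-only sketch using -exec with an inline shell script (note that mv's -n flag is itself non-POSIX, so this variant can overwrite existing files):
find . -name '*.txt' -type f -exec sh -c '
    for f do
        mv -- "$f" "$(dirname -- "$f")/../"
    done
' sh {} +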
Finally, please try your commands in a test directory with dummy files, not on your actual data. Especially with the -delete option of find, you can lose all your data quicker than you might imagine. Read the man page and, if that is not enough, the reference manual of find. Never blindly copy shell commands posted on the internet by random strangers if you don't understand them.
Try this command:
cp ./folderA/folder1/*.txt ./folderA
Run something like this from the root(ish) directory:
#!/bin/bash
new_dir() {
    local i
    for i in "$PWD"/*; do
        [[ -f "$i" ]] && cp "$i" ../                   # copy each regular file up one level
        [[ -d "$i" ]] && cd "$i" && new_dir && cd ..   # recurse into each subdirectory, then come back
    done
    return 0
}
new_dir
This will search each directory. When a file is encountered, it copies the file up a directory. When a directory is found, it will move down into the directory and start the process over again. I think it'll work for you.
Good luck.

copy entire directory excluding a file

As we know, cp -r source_dir intended_new_directory creates a copy of source directory with a new name. Now I want to do the same but want to exclude a particular file. I have found some related answers here, using tar and rsync, but in those solutions I need to create the destination directory first (using mkdir).
I honestly searched a lot, but didn't find exactly what I want.
So far the best I got is this:
tar -c --exclude=\*.dll --exclude=\*.exe sourceDir | tar -x -C destDir
(from http://www.linuxquestions.org/questions/programming-9/how-to-copy-an-entire-directory-structure-except-certain-files-385321/)
If you have cpio, you can use find to filter and then cpio to copy (and create directories):
find <sourceDir> \( ! -name '*.dll' \) -a \( ! -name '*.exe' \) | cpio -dumpv <destDir>
Try this, excluding the file with grep -v:
cp `ls | grep -v <exclude-file>` <dest-dir>
If the directory is not very large, I tend to write something like this:
src=path/to/source/directory
dst=path/to/destination/directory
find "$src" -type f | while IFS= read -r f; do mkdir -p "$dst/$(dirname "$f")"; cp "$f" "$dst/$f"; done
Here we list all regular files in $src, iterate over the list, and for each file create the corresponding directory under $dst if it does not exist yet (the -p option of mkdir), then copy the file into that directory.
The above command will copy all the files. Finally, just use
find $src -type f | grep -v whatever | while ...... # same as above
to filter out the files you don't need (e.g. \.bak$, \.orig$, or whatever files you don't want to copy).
Move the files you want to exclude into your home (or any other) directory, copy the directory containing the remaining files to the destination folder, then move the excluded files back:
cd mydirectory
mv exclude1 exclude2 /home/
cd ..
cp -r mydirectory destination_folder/
mv /home/exclude1 /home/exclude2 mydirectory/

Copying files in multiple subdirectories in the Linux command line

Let's say I have the following subdirectories
./a/, ./b/, ./c/, ...
That is, in my current working directory are these subdirectories a/, b/ and c/, and in each of these subdirectories are files. In directory a/ is the file a.in, in directory b/ is the file b.in and so forth.
I now want to copy each .in file to a .out file, that is, a.in to a.out and b.in to b.out, and I want them to reside in the directories they were copied from. So a.out will be found in directory a/.
I've tried various different approaches, such as
find ./ -name '*.in'|cp * *.out
which doesn't work because it thinks *.out is a directory. Also tried
ls -d */ | cd; cp *.in *.out
but that would only list the subdirectories and go into each one of them; it won't let cp do its work (which still doesn't work)
The
find ./ -name '*.in'
command works fine. Is there a way to pipe arguments to an assignment operator? E.g.
find ./ -name '*.in'| assign filename=|cp filename filename.out
where assign filename= gives filename the value of each .in file. In fact, it would be even better if the assignment could get rid of the .in extension; then, instead of getting a.in.out, we would get the preferred a.out.
Thank you for your time.
Let the shell help you out:
find . -name '*.in' | while IFS= read -r old; do
    new=${old%.in}.out    # strips the .in and adds .out
    cp "$old" "$new"
done
I just took the find command you said works and let bash read its output one filename at a time. So the bash while loop gets the filenames one at a time, does a little substitution, and a straight copy. Nice and easy (but not tested!).
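If filenames might contain newlines, a NUL-delimited variant is safer (a sketch, assuming find supports -print0 and bash's read supports -d):
find . -name '*.in' -print0 | while IFS= read -r -d '' old; do
    cp "$old" "${old%.in}.out"
done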
Try a for loop:
for f in */*.in; do
    cp "$f" "${f%.in}.out"
done
The glob should catch all the files one directory down that have a .in extension. In the cp command, it strips off the .in suffix and then appends a .out (see Variable Mangling in Bash with String Operators)
Alternatively, if you want to recurse into every subdirectory (not just 1 level deep) replace the glob with a find:
for f in $(find . -name '*.in'); do
    cp "$f" "${f%.in}.out"
done
(Note that this form breaks on filenames containing whitespace.)
This should do the trick!
for f in `find . -type f -name "*.in"`; do cp "$f" "`echo "$f" | sed 's/\.in$/.out/'`"; done

How can I generate a list of files with their absolute path in Linux?

I am writing a shell script that takes file paths as input.
For this reason, I need to generate recursive file listings with full paths. For example, the file bar has the path:
/home/ken/foo/bar
but, as far as I can see, both ls and find only give relative path listings:
./foo/bar (from the folder ken)
It seems like an obvious requirement, but I can't see anything in the find or ls man pages.
How can I generate a list of files in the shell including their absolute paths?
If you give find an absolute path to start with, it will print absolute paths. For instance, to find all .htaccess files in the current directory:
find "$(pwd)" -name .htaccess
or if your shell expands $PWD to the current directory:
find "$PWD" -name .htaccess
find simply prepends the path it was given to a relative path to the file from that path.
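For example, with the file layout from the question:
cd /home/ken
find "$PWD" -name bar    # prints /home/ken/foo/bar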
Greg Hewgill also suggested using pwd -P if you want to resolve symlinks in your current directory.
readlink -f filename
gives the full absolute path. But if the file is a symlink, you'll get the final resolved name.
Use this for dirs (the / after ** is needed in bash to limit it to directories):
ls -d -1 "$PWD/"**/
this for files and directories directly under the current directory, whose names contain a .:
ls -d -1 "$PWD/"*.*
this for everything:
ls -d -1 "$PWD/"**/*
Taken from here
http://www.zsh.org/mla/users/2002/msg00033.html
In bash, ** is recursive if you enable shopt -s globstar.
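For example:
shopt -s globstar
ls -d -1 "$PWD/"**/*    # now recurses through all subdirectories in bash too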
You can use
find $PWD
in bash
ls -d "$PWD/"*
This looks only in the current directory. It quotes "$PWD" in case it contains spaces.
Command: ls -1 -d "$PWD/"*
This will give the absolute paths of the files, like below:
[root@kubenode1 ssl]# ls -1 -d "$PWD/"*
/etc/kubernetes/folder/file-test-config.txt
/etc/kubernetes/folder/file-test.txt
/etc/kubernetes/folder/file-client.txt
Try this:
find "$PWD"/
You get a list of the absolute paths of everything in the working directory.
You can do
ls -1 |xargs realpath
If you need to specify an absolute or relative path, you can do that as well:
ls -1 $FILEPATH |xargs realpath
The $PWD suggestion by Matthew above is a good option. If you want find to print only files, you can also add the -type f option to match only normal files; among the other options, d matches directories only, and so on. So in your case it would be (if I want to search only for files with the .c extension):
find $PWD -type f -name "*.c"
or if you want all files:
find $PWD -type f
Note: You can't make an alias for the above command with double quotes, because $PWD is expanded to your current directory at the moment bash defines the alias, not when the alias is run.
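Single quotes defer the expansion until the alias is actually used, so something like this works from any directory (lsabs is just a hypothetical name):
alias lsabs='find "$PWD" -type f'    # $PWD expands when the alias runs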
If you give the find command an absolute path, it will spit the results out with an absolute path. So, from the Ken directory if you were to type:
find /home/ken/foo/ -name bar -print
(instead of the relative path find . -name bar -print)
You should get:
/home/ken/foo/bar
Therefore, if you want an ls -l and have it return the absolute path, you can just tell the find command to execute an ls -l on whatever it finds:
find /home/ken/foo -name bar -exec ls -l {} \;
NOTE: There is a space between {} and \;
You'll get something like this:
-rw-r--r-- 1 ken admin 181 Jan 27 15:49 /home/ken/foo/bar
If you aren't sure where the file is, you can always change the search location. As long as the search path starts with "/", you will get an absolute path in return. If you are searching a location (like /) where you are going to get a lot of permission denied errors, then I would recommend redirecting standard error so you can actually see the find results:
find / -name bar -exec ls -l {} \; 2> /dev/null
(2> is the syntax for the Bourne and Bash shells, but it will not work with the C shell. It may work in other shells too, but I only know for sure that it works in Bourne and Bash.)
Just an alternative to
ls -d "$PWD/"*
to point out that * is shell expansion, so
echo "$PWD/"*
would do the same (the drawback being that you cannot use -1 to separate the results with newlines instead of spaces).
fd
Using fd (alternative to find), use the following syntax:
fd . foo -a
Where . is the search pattern and foo is the root directory.
E.g., to list all files in /etc recursively, run: fd . /etc -a.
-a, --absolute-path Show absolute instead of relative paths
If you need list of all files in current as well as sub-directories
find $PWD -type f
If you need list of all files only in current directory
find $PWD -maxdepth 1 -type f
You might want to try this.
for name in /home/ken/foo/bar/*
do
    echo "$name"
done
You can get the absolute paths simply with a for loop and echo, without using find.
Find jar files recursively and print their absolute paths:
ls -R | grep "\.jar$" | xargs readlink -f
/opt/tool/dev/maven_repo/com/oracle/ojdbc/ojdbc8-19.3.0.0.jar
/opt/tool/dev/maven_repo/com/oracle/ojdbc/ons-19.3.0.0.jar
/opt/tool/dev/maven_repo/com/oracle/ojdbc/oraclepki-19.3.0.0.jar
/opt/tool/dev/maven_repo/com/oracle/ojdbc/osdt_cert-19.3.0.0.jar
/opt/tool/dev/maven_repo/com/oracle/ojdbc/osdt_core-19.3.0.0.jar
/opt/tool/dev/maven_repo/com/oracle/ojdbc/simplefan-19.3.0.0.jar
/opt/tool/dev/maven_repo/com/oracle/ojdbc/ucp-19.3.0.0.jar
This works best if you want a dynamic solution that works well in a function
lfp ()
{
    ls -1 "$1" | xargs -I{} echo "$(realpath "$1")"/{}
}
lspwd() { for i in "$@"; do ls -d -1 "$PWD/$i"; done; }
Here's an example that prints out a list without an extra period and that also demonstrates how to search for a file match. Hope this helps:
find . -type f -name "extr*" -exec echo `pwd`/{} \; | sed "s|\./||"
This worked for me, but it didn't list in alphabetical order:
find "$(pwd)" -maxdepth 1
The following lists alphabetically and includes hidden files as well:
ls -d -1 "$PWD/".*; ls -d -1 "$PWD/"*;
stat
Absolute path of a single file:
stat -c %n "$PWD"/foo/bar
This will give the canonical path (will resolve symlinks): realpath FILENAME
If you want canonical path to the symlink itself, then: realpath -s FILENAME
Most if not all of the suggested methods produce paths that cannot be used directly in another terminal command if the path contains spaces. Ideally the results will have the spaces escaped with backslashes.
This works for me on macOS:
find / -iname "*SEARCH TERM spaces are okay*" -print 2>&1 | grep -v denied |grep -v permitted |sed -E 's/\ /\\ /g'
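Alternatively, a NUL-delimited pipeline avoids the quoting problem altogether (a sketch, assuming your find and xargs support -print0 and -0):
find / -iname "*search term*" -type f -print0 2>/dev/null | xargs -0 ls -l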
for p in <either relative or absolute path of the directory>/*; do
    echo "$(realpath -s "$p")"
done
Recursive files can be listed in many ways on Linux. Here I am sharing a one-liner to clear all log files (only files) under the /var/log/ directory, plus ways to check which log files have been written to recently.
First, list the files recursively:
find /var/log/ -type f
Second, empty the files recursively:
for i in $(find $PWD -type f); do cat /dev/null > "$i"; done
Third, list the most recently used files:
ls -ltr $(find /var/log/ -type f)
Note: for the directory location you can also pass $PWD instead of /var/log/.
If you don't have symbolic links, you could try
tree -ifL 1 [DIR]
-i makes tree print filenames on each line, without the tree structure.
-f makes tree print the full path of each file.
-L 1 keeps tree from recursing into subdirectories.
Write a small function:
lsf() {
    ls "$(pwd)/$1"
}
Then you can use like
lsf test.sh
it gives full path like
/home/testuser/Downloads/test.sh
I used the following to list absolute path of files in a directory in a txt file:
find "$PWD" -wholename '*.JPG' >test.txt
find / -print will do this
ls -1 | awk -v path="$PWD/" '{print path $0}'
