Move files in bulk and create links in their place in Linux

I am trying to move hundreds of files from one directory to another but create a softlink in the old directory while doing that. Is there a single line command that can do that?
/dir1
    file1.txt
    file2.txt
    ...
    file100.txt
Move them to /dir2 and create soft links to them in /dir1.
I am currently doing that separately but was hoping to find a single-line command if possible.
cd dir1
mv *.txt /dir2
ln -s /dir2/*.txt .
I tried using find but that didn't work either.

There's no single-line command. It's quite trivial to do with shell scripting. For example, in tcsh:
% cd dir1
% foreach FILETOMOVE ( file*.txt )
echo mv -iv $FILETOMOVE /dir2
echo ln -s /dir2/$FILETOMOVE .
end
(Remove the echos once you're sure you've got it right.)
Bash is similar, with slightly different syntax.
This is slightly more complicated if the filenames or paths include spaces, but still quite simple. (:q in tcsh, using "", etc.)
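For reference, a minimal bash sketch of the same idea (assuming the same /dir2 destination as the question; remove the echos here too once the output looks right):
cd dir1
for f in file*.txt; do
    echo mv -iv "$f" /dir2/
    echo ln -s "/dir2/$f" .
done
Quoting "$f" is what takes care of names containing spaces.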

Related

Bash script to sort files into sub folders based on extension

I have the following structure:
FolderA
    Sub1
    Sub2
    filexx.csv
    filexx.doc
FolderB
    Sub1
    Sub2
    fileyy.csv
    fileyy.doc
I want to write a script that will move the .csv files into the folder sub1 for each parent directory (Folder A, Folder B and so on) giving me the following structure:
FolderA
    Sub1
        filexx.csv
    Sub2
        filexx.doc
FolderB
    Sub1
        fileyy.csv
    Sub2
        fileyy.doc
This is what I have so far, but I get the error mv: cannot stat *.csv: No such file or directory
for f in */*/*.csv; do
mv -v "$f" */*/Sub1;
done
for f in */*/*.doc; do
mv -v "$f" */*/Sub2;
done
I am new to bash scripting so please forgive me if I have made a very obvious mistake. I know I can do this in Python as well but it will be lengthier which is why I would like a solution using linux commands.
find . -name "*.csv" -type f -execdir mv '{}' Sub1/ \;
Using find, search for all files with the extension .csv and then when we find them, execute a move command from within the directory containing the files, moving the files to directory Sub1
find . -name "*.doc" -type f -execdir mv '{}' Sub2/ \;
Follow the same principle for files with the extension .doc but this time, move the files to Sub2.
I believe you are getting this error because no file matched your wildcard. When that happens, the for loop gives $f the value of the wildcard pattern itself. You are basically trying to move the file *.csv, which does not exist.
To prevent this behavior, you can add shopt -s nullglob at the top of your script. When using this, if no file is found, your script won't enter the loop.
My advice is to make sure you run your script from the correct location when using wildcards like this. But maybe what you meant by writing */*/*.csv was to recursively match all the csv files. If that's what you intended to do, this is not the right way to do it.
To recursively match all csv/doc/etc. files using native bash, you can add shopt -s globstar to the top of your script and use **/*.csv as the wildcard:
#!/bin/bash
shopt -s globstar nullglob
for f in **/*.csv; do
mv "$f" Destination/ # Note that $f is surrounded by "" to handle whitespaces in filenames
done
You could also use the find(1) utility to achieve that. But if you're planning to do more processing on the files than just moving them, a for loop might be cleaner, as you won't have to inline everything in the same command.
Side note: "Linux commands", as you say, are actually not Linux commands; they are part of the GNU utilities (https://www.gnu.org/gnu/linux-and-gnu.en.html).
If the csv files you want to move are in the top-level directories (from the point of view of the current directory), but not in their subdirectories, then simply:
#!/bin/bash
for dir in */; do
mv -v "$dir"*.csv "${dir}Sub1/"
mv -v "$dir"*.doc "${dir}Sub2/"
done
If you want to move the files in all subdirectories in the same way, then:
shopt -s globstar
for file in **/*.csv; do
mv -v "$file" "${file%/*}/Sub1/"
done
for file in **/*.doc; do
mv -v "$file" "${file%/*}/Sub2/"
done
Note that the directories Sub1 and Sub2 are relative to the directory where the csv and doc files reside.

How do you move files from one folder to the other?

I am trying to move specific files from one folder to another. Would the below work?
mkdir test
touch test1.sh
touch test2.sh
touch test3.sh
mkdir test2
find test/ | xargs -I% mv % test2
I think this can work:
find ./ -name "test*.sh" | xargs -I% mv % test2
There is something odd in your example:
If test does not contain any subdirectories, or if you want to move the subdirectories as they are, you could simply do a
mv test/* test2
(Note that this would, by default, not move entries whose names start with a period. If this is a problem, you should either consider using bash or zsh rather than a plain POSIX shell, or indeed use find, to be on the safe side, with the -prune option.)
The problem starts with subdirectories. The output of find contains all directories along with the files inside them. The mv inside the xargs would then move, say, a directory test/foo, and when it later wants to process a file test/foo/bar/baz.txt, that file is not there anymore. The overall effect would be that you would have moved all the subdirectories (as in my first solution, which does not need find), but you would also get plenty of error messages.
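If you do want to use find here, one way around that problem is to let find hand over only the top-level entries and move them in a single batch. A minimal sketch, assuming GNU find and a mv that supports the -t option:
# Move everything directly under test/ (files, hidden entries and whole
# subdirectories) into test2, without listing the tree recursively.
find test -mindepth 1 -maxdepth 1 -exec mv -t test2 {} +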

Linux Bash: Move multiple different files into same directory

As a rather novice Linux user, I can't seem to find how to do this.
I am trying to move unique files all in one directory into another directory.
Example:
$ ls
vehicle car.txt bicycle.txt airplane.html train.docx (more files)
I want car.txt, bicycle.txt, airplane.html, and train.docx inside vehicle.
Right now I do this by moving the files individually:
$ mv car.txt vehicle
$ mv bicycle.txt vehicle
...
How can I do this in one line?
You can do
mv car.txt bicycle.txt vehicle/
(Note that the / above is unnecessary; I include it merely to ensure that vehicle is a directory.)
You can test this as follows:
cd #Move to home directory
mkdir temp #Make a temporary directory
touch a b c d #Make test (empty) files ('touch' also updates the modification date of an existing file to the current time)
ls #Verify everything is there
mv a b c d temp/ #Move files into temp
ls #See? They are gone.
ls temp/ #Oh, there they are!
rm -rf temp/ #DESTROY (Be very, very careful with this command)
Shorthand command to move all .txt and .docx files
You can try using a wildcard. In the code below, * will match all the files which have any name ending with .txt or .docx, and move them to the vehicle folder.
mv *.txt *.docx vehicle/
If you want to move specific files to a directory
mv car.txt bicycle.txt vehicle/
Edit: As mentioned in a comment, if you are moving files by hand, I suggest using mv -i ..., which will warn you in case the destination file already exists, giving you the choice of not overwriting it. Other 'file destroyer' commands like cp and rm also have a -i option.
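A few hypothetical invocations of the interactive variants (backup/ is just an illustrative directory name):
mv -i car.txt vehicle/   # asks before overwriting vehicle/car.txt
cp -i car.txt backup/    # same idea for cp
rm -i car.txt            # asks before deleting car.txt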
The mv command in Linux allows us to move more than one file into another directory. All you have to do is write the name of each file you want to move, separated by spaces.
The following command will help you:
mv car.txt bicycle.txt airplane.html train.docx vehicle
or
mv car.txt bicycle.txt airplane.html train.docx vehicle/
both of them will work.
You can move multiple files to a specific directory by using mv command.
In your scenario it can be done with:
mv car.txt bicycle.txt airplane.html train.docx vehicle/
The point to note is that the last argument is the destination, and everything else except mv is a source.
Another scenario is that the destination is not present in the current directory; then we must use an absolute path in place of vehicle/.
Note: An absolute path always starts with /, which means we are traversing from the root directory.
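For instance, if vehicle lived somewhere else entirely, you would spell out the full path (the path below is only an illustration):
mv car.txt bicycle.txt airplane.html train.docx /home/user/vehicle/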
I have written a small bash script that will move multiple files (matched using a pattern) present in multiple directories (matched using a pattern) to a single location, using the mv and find commands in bash.
#!/bin/bash
for i in $(find /path/info/*/*.fna -type f) # find files and return their paths
do
    mv -iv "$i" -t ~/path/to/destination/directory # move files
done
$() is for command substitution (in other words, it expands the expression inside it)
/*/ is a wildcard matching any directory; you can replace this with any wildcard expression
*.fna is for finding any file with the .fna extension
-type f restricts the matches to regular files
-i in mv prompts before overwriting (extra caution in case the wildcard expression was wrong)
-v for verbose
-t for destination
NOTE: the above flags are not mandatory
Hope this helps

Copy all files in a directory to a local subdirectory in linux

I have a directory with the following structure:
file_1
file_2
dir_1
dir_2
# etc.
new_subdir
I'd like to make a copy of all the existing files and directories located in this directory in new_subdir. How can I accomplish this via the linux terminal?
This is an old question, but none of the answers seem to work (they cause the destination folder to be copied recursively into itself), so I figured I'd offer up some working examples:
Copy via find -exec:
find . ! -regex '.*/new_subdir' ! -regex '.' -exec cp -r '{}' new_subdir \;
This code uses regex to find all files and directories (in the current directory) which are not new_subdir and copies them into new_subdir. The ! -regex '.' bit is in there to keep the current directory itself from being included. Using find is the most powerful technique I know, but it's long-winded and a bit confusing at times.
Copy with extglob:
cp -r !(new_subdir) new_subdir
If you have extglob enabled for your bash terminal (which is probably the case), then you can use ! to copy all things in the current directory which are not new_subdir into new_subdir.
Copy without extglob:
mv * new_subdir ; cp -r new_subdir/* .
If you don't have extglob and find doesn't appeal to you and you really want to do something hacky, you can move all of the files into the subdirectory, then recursively copy them back to the original directory. Unlike cp which copies the destination folder into itself, mv just throws an error when it tries to move the destination folder inside of itself. (But it successfully moves every other file and folder.)
You mean like
cp -R * new_subdir
?
cp takes -R as an argument, which means recursive (so it also copies directories); * means all files (and directories).
Although * includes new_subdir itself, cp detects this case and ignores new_subdir (so it doesn't copy it into itself!).
Try something like:
cp -R * /path_to_new_dir/

Unix: traverse a directory

I need to traverse a directory, starting in one directory and going deeper into different subdirectories. However, I also need to be able to access each individual file to modify it. Is there already a command to do this or will I have to write a script? Could someone provide some code to help me with this task? Thanks.
The find command is just the tool for that. Its -exec flag or -print0 in combination with xargs -0 allows fine-grained control over what to do with each file.
Example: Replace all foo's by bar's in all files in /tmp and subdirectories.
find /tmp -type f -exec sed -i -e 's/foo/bar/' '{}' ';'
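The same replacement written with the -print0/xargs -0 pair mentioned above, which keeps filenames containing spaces or newlines intact (a sketch, assuming GNU sed for -i):
find /tmp -type f -print0 | xargs -0 sed -i -e 's/foo/bar/'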
for i in $(find .) ; do
    if [ -d "$i" ] ; then : ; fi # do something with a directory
    if [ -f "$i" ] ; then : ; fi # do something with a file, etc.
done
This will return the whole tree (recursively) in the current directory in a list that the loop will go through.
This can be easily achieved by mixing find, xargs, sed (or other file modification command).
For example:
$ find /path/to/base/dir -type f -name '*.properties' | xargs sed -i -e '/^#/d'
This will select all files with the .properties file extension.
The xargs command will feed the file paths generated by the find command into the sed command.
The sed command will delete all lines starting with # in those files (fed to it by xargs).
Command combination in this way is very flexible.
For example, the find command has different parameters, so you can filter by user name, file size, file path (e.g. under a /test/ subfolder), or file modification time.
Another dimension of flexibility is how and what to change in your files. For example, the sed command lets you modify a file by applying substitutions (specified via regular expressions). Similarly, you can use gzip to compress files. And so on...
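For example, a sketch of the gzip case; the *.log pattern and the 30-day cutoff are placeholders, not part of the original question:
# Compress regular .log files not modified in the last 30 days.
find . -type f -name '*.log' -mtime +30 -exec gzip {} +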
You would usually use the find command. On Linux, you have the GNU version, of course. It has many extra (and useful) options. Both will allow you to execute a command (e.g. a shell script) on the files as they are found.
The exact details of how to make changes to the file depend on the change you want to make to the file. That is probably best scripted, with find running the script:
POSIX or GNU:
find . -type f -exec your_script '{}' +
This will run your script once for a group of files with those names provided as arguments. If you want to do it one file at a time, replace the + with ';' (or \;).
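What your_script contains is up to you; as a minimal hypothetical sketch, it only needs to loop over the filenames it receives as arguments, so it works with both the + and ';' forms:
#!/bin/sh
# your_script (hypothetical): process every file passed on the command line
for f in "$@"; do
    printf 'processing %s\n' "$f"
    # modify "$f" here, e.g. with sed -i or a similar tool
done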
I am assuming SearchMe is the example directory name you need to traverse completely.
I am also assuming, since it was not specified, that the files you want to modify are all text files. Is this correct?
In such scenario I would suggest using the command:
find SearchMe -type f -exec vi {} \;
If you are not familiar with vi editor, just use another one (nano, emacs, kate, kwrite, gedit, etc.) and it should work as well.
Bash 4+
shopt -s globstar
for file in **
do
if [ -f "$file" ];then
# do some processing to your file here
# where the find command can't do conveniently
fi
done
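As a hypothetical example of processing that find can't do conveniently, the body of the loop could combine several per-file steps (the 100-line threshold and the backup naming are made up):
shopt -s globstar
for file in **
do
    if [ -f "$file" ]; then
        lines=$(wc -l < "$file")      # count lines in the file
        if [ "$lines" -gt 100 ]; then
            cp "$file" "$file.bak"    # keep a backup copy
            gzip "$file"              # then compress the original
        fi
    fi
done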
