mkdir command for a list of file paths - linux

I have a txt file with content like this:
/home/username/Desktop/folder/folder3333/IMAGw488.jpg
/home/username/Desktop/folder/folder3333/IMAG04f88.jpg
/home/username/Desktop/folder/folder3333/IMAGe0488.jpg
/home/username/Desktop/folder/folder3333/IMAG0r88.jpg
/home/username/Desktop/folder/folder3333/
/home/username/Desktop/folder/
/home/username/Desktop/folder/IMAG0488.jpg
/home/username/Desktop/folder/fff/fff/feqw/123.jpg
/home/username/Desktop/folder/fffa/asd.png
....
These are paths of files, but also paths of folders.
The problem I want to solve is to create all the folders that don't exist.
I want to call the mkdir command for every folder that does not exist.
How can I do this in an easy way?
Thanks

This can be done in native bash syntax without any calls to external binaries:
while read line; do mkdir -p "${line%/*}"; done < infile
Or perhaps with just a single call to mkdir if you have bash 4.x:
mapfile -t arr < infile; mkdir -p "${arr[@]%/*}"
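The `${line%/*}` expansion is what strips the file name: it removes the shortest suffix matching `/*`, i.e. everything from the last slash onward. A quick illustration:

```shell
# ${var%pattern} deletes the shortest match of pattern from the end of $var,
# so ${line%/*} yields the parent directory of a file path.
line=/home/username/Desktop/folder/IMAG0488.jpg
echo "${line%/*}"    # prints /home/username/Desktop/folder
```

For lines that already end in a slash, `${line%/*}` merely drops the trailing slash, which is still safe to hand to mkdir -p.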

How about...
for p in $(xargs < somefile.txt); do
    mkdir -p "$(dirname "$p")"
done

xargs -n 1 dirname <somefile.txt | xargs mkdir -p

It can be done without a loop as well (provided the input file is not huge):
mkdir -p $(perl -pe 's#/(?!.*/).*$##' file.txt)

If you have a file "file1" with the filenames, you could try this one-liner:
cat file1 |xargs -I {} dirname "{}"| sort -u | xargs -I{} mkdir -p "{}"
Use of:
xargs -I{} mkdir -p "{}"
ensures that even path names with spaces will be created

Using a perl one-liner and File::Path qw(make_path):
perl -MFile::Path=make_path -lne 'make_path $_' dirlist.txt

Related

How to pipe output from grep to cp?

I have a working grep command that selects files meeting a certain condition. How can I take the selected files from the grep command and pipe them into a cp command?
The following attempts have failed on the cp end:
grep -r "TWL" --exclude=*.csv* | cp ~/data/lidar/tmp-ajp2/
cp: missing destination file operand after ‘/home/ubuntu/data/lidar/tmp-ajp2/’
Try 'cp --help' for more information.
cp `grep -r "TWL" --exclude=*.csv*` ~/data/lidar/tmp-ajp2/
cp: invalid option -- '7'
grep -l -r "TWL" --exclude=*.csv* | xargs cp -t ~/data/lidar/tmp-ajp2/
Explanation:
grep -l option to output file names only
xargs to convert file list from the standard input to command line arguments
cp -t option to specify target directory (and avoid using placeholders)
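One caveat: the plain pipe still splits on whitespace, so file names containing spaces break. A NUL-delimited variant (a sketch assuming GNU grep and GNU xargs, which support -Z and -0) avoids that:

```shell
# grep -Z terminates each matched file name with a NUL byte, and xargs -0
# splits on NUL, so names with spaces or newlines pass through intact.
grep -rlZ "TWL" --exclude='*.csv*' | xargs -0 cp -t ~/data/lidar/tmp-ajp2/
```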
You need xargs with the placeholder option:
grep -r "TWL" --exclude=*.csv* | xargs -I '{}' cp '{}' ~/data/lidar/tmp-ajp2/
Normally xargs puts the input at the end of the command; with the placeholder ('{}' in this case) you can choose where it is inserted, even multiple times.
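For instance, reusing the placeholder twice in one command (purely illustrative file names):

```shell
# -I '{}' replaces every occurrence of {} in the command line, so the same
# file name can appear twice: once as source, once in the destination.
echo report.txt | xargs -I '{}' echo cp '{}' '{}.bak'
# prints: cp report.txt report.txt.bak
```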
This worked for me when searching for files with a specific date:
ls | grep '2018-08-22' | xargs -I '{}' cp '{}' ~/data/lidar/tmp-ajp2/
To copy a file into the directories that contain a match, use find's -printf to output each matched file's directory and xargs -i to place the argument:
find ./ -name 'filename.*' -printf '%h\n' | xargs -i cp copyFile.txt {}
This copies copyFile.txt to all directories (in ./) containing a file matching "filename.*".
grep -rl '/directory/' -e 'pattern' | xargs cp -t /directory

Copy files from a list to a folder

I have a text file abc.txt and its contents are:
/lag/cnn/org/one.txt
/lag/cnn/org/two.txt
/lag/cnn/org/three.txt
If I use
tar -cvf allfiles.tar -T abc.txt
I get a tar of the files in the list. Similarly, is it possible to copy the files listed in abc.txt to a folder?
I tried
cp --files-from test1.txt ./Folder
but it is not working. Please help.
You could use xargs:
cat test1.txt | xargs -I{} cp {} ./Folder
In order to avoid Useless use of cat, you could say:
xargs -a test1.txt -I{} cp {} ./Folder
You can use xargs for example, or do it in a loop:
while read FILE; do cp "$FILE" ./Folder; done <test1.txt
You can write:
while IFS= read -r FILENAME ; do
    cp -- "$FILENAME" ./Folder || break
done < test1.txt
which loops over the lines of the file, reading each line into the variable FILENAME, and running cp for each one.
You could use tar cvf - to write the tar to stdout and pipe it right into a tar xvfC - Folder.
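Concretely, with the question's abc.txt that would be (a sketch; Folder must already exist):

```shell
# The first tar streams the listed files to stdout; the second extracts
# the stream into ./Folder (-C switches directory before extracting).
tar cf - -T abc.txt | tar xf - -C ./Folder
```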

copy a few files from a directory that are specified in a text file (linux)

I have a directory called images
and it contains many images.
For example:
images/
imag001.png
imag002.png
imag003.png
imag004.png
And I have a text file that has the files that I want to copy somewhere else. Say the test.txt file has
img001.png
img003.png
How do I copy the files specified in test.txt from the images folder to some other place?
try this one-liner under your images directory:
awk '{print "cp "$0" /target/path"}' test.txt|sh
There are probably many solutions to this problem. I would do it by using xargs:
cd images/
cat path/to/test.txt | xargs -I FILES cp FILES path/to/dest/
I think in the bash shell you can do:
for image in $(cat copylist.txt); do
    cp "$image" destination
done
You can write:
while IFS= read -r image_filename ; do
    cp images/"$image_filename" some_other_place/
done < test.txt
cat copylist.txt | xargs -n1 -I % echo cp % destination/.
# remove echo after testing

xargs copy if file exists

I got a string with filenames I want to copy. However, only some of these files exist. My current script looks like this:
echo $x | xargs -n 1 test -f {} && cp --target-directory=../folder/ --parents
However, I always get a test: {}: binary operator expected error.
How can I do that?
You need to supply the -i flag to xargs for it to substitute {} for the filename.
However, you seem to expect xargs to feed into the cp, which it does not do. Maybe try something like
echo "$x" |
xargs -i sh -c 'test -f {} && cp --target-directory=../folder/ --parents {}'
(Notice also the use of double quotes with echo. There are very few situations where you want a bare unquoted variable interpolation.)
To pass in many files at once, you can use a for loop in the sh -c:
echo "$x" |
xargs sh -c 'for f; do
    test -f "$f" || continue
    echo "$f"
done' _ |
xargs cp --parents --target-directory=../folder/
The _ argument is because the first argument to sh -c is used to populate $0, not "$@".
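A quick way to see what the `_` does:

```shell
# The word right after the sh -c script body becomes $0; only the
# following words become the positional parameters "$@".
sh -c 'echo "0=$0 args=$*"' _ a.txt b.txt
# prints: 0=_ args=a.txt b.txt
```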
xargs can only run a simple command. The && part gets interpreted by the outer shell, which is not what you want. Just create a temporary script with the commands you want to run:
cat > script.sh
test -f "$1" && cp "$1" --target-directory=../folder/ --parents
Control-D
chmod u+x ./script.sh
echo "$x" | xargs -n1 ./script.sh
Also note that {} is not needed with -n1 because the parameter is used as the last word on a line.

delete files other than a particular extension

I have a lot of different types of files in one folder. I need to delete the files, except the PDF files.
I tried to display only the PDF files, but now I need to delete the files other than the PDFs:
ls -1 | xargs file | grep 'PDF document,' | sed 's/:.*//'
You could do the following - I've used echo rm instead of rm for safety:
for i in *
do
    [ x"$(file --mime-type -b "$i")" != xapplication/pdf ] && echo rm "$i"
done
The --mime-type -b options to file make the output of file easier to deal with in a script.
$ ls
aa.txt a.pdf bb.cpp b.pdf
$ ls | grep -v '\.pdf$' | xargs rm -rf
$ ls
a.pdf b.pdf
:) !
ls |xargs file|awk -F":" '!($2~/PDF document/){print $1}'|xargs rm -rf
Try inverting the grep match:
ls -1 | xargs file | grep -v 'PDF document,' | sed 's/:.*//'
It's rare in my experience to encounter PDF files which don't have a .pdf extension. You don't state why "file" is necessary in the example, but I'd write this as:
# find . -not -name '*.pdf' -delete
Note that this will recurse into subdirectories; use "-maxdepth 1" to limit to the current directory only.
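For example, restricted to regular files in the current directory only (a sketch; GNU find wants -maxdepth before the other tests):

```shell
# -maxdepth 1 stops find from recursing; -type f leaves directories alone;
# anything whose name does not end in .pdf is deleted.
find . -maxdepth 1 -type f -not -name '*.pdf' -delete
```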
