xargs copy if file exists - linux

I got a string with filenames I want to copy. However, only some of these files exist. My current script looks like this:
echo $x | xargs -n 1 test -f {} && cp --target-directory=../folder/ --parents
However, I always get a test: {}: binary operator expected error.
How can I do that?

You need to supply the -i flag to xargs for it to substitute {} for the filename.
However, you seem to expect xargs to feed into the cp, which it does not do. Maybe try something like
echo "$x" |
xargs -i sh -c 'test -f {} && cp --target-directory=../folder/ --parents {}'
(Notice also the use of double quotes with echo. There are very few situations where you want a bare unquoted variable interpolation.)
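For example, a quick sketch of the difference, assuming the current directory contains a.txt and b.txt:
x='*.txt'
echo $x     # unquoted: the shell glob-expands the value, printing: a.txt b.txt
echo "$x"   # quoted: prints the literal string: *.txt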
To pass in many files at once, you can use a for loop in the sh -c:
echo "$x" |
xargs sh -c 'for f; do
test -f "$f" && continue
echo "$f"
done' _ |
xargs cp --parents --target-directory="../folder/"
The _ argument is there because the first argument after the sh -c script is used to populate $0, not $@ (which is what the for loop iterates over).
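You can see this assignment for yourself with a throwaway command:
sh -c 'echo "zero=$0 args=$@"' _ a b c
# prints: zero=_ args=a b c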

xargs can only run a simple command; the && part gets interpreted by the outer shell, which is not what you want. Just create a temporary script with the commands you want to run:
cat > script.sh
#!/bin/sh
test -f "$1" && cp "$1" --target-directory=../folder/ --parents
Control-D
chmod u+x ./script.sh
echo "$x" | xargs -n1 ./script.sh
Also note that {} is not needed with -n1, because xargs appends the argument as the last word of the command line.
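You can verify where xargs puts the argument with a harmless stand-in command:
printf '%s\n' one two three | xargs -n1 echo copying
# prints:
# copying one
# copying two
# copying three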

Related

Recursively get all the files and dirs they are located in

Trying to run a script that will fetch all directories, and the files contained in those directories, and log the data to a .CSV file.
So, if I were to have structure like:
mainDir.dir -> [sub1.dir -> file01.png, sub2.dir -> file02.png]
, I would get a CSV of
dir; file
sub1; file01.png
sub2; file02.png
This is the script I currently have
for dir in */ .*/ ;
do
for entry in $dir
do
path="$entry"
empty=""
file="${$dir/$empty}"
echo -e "$dir;$file;" >> file.csv
done
done
find is useful for processing many files recursively.
Command
find . -type f -execdir sh -c "pwd | tr -d '\n' >> ~/my_file.csv; echo -n ';' >> ~/my_file.csv; echo {} | sed -e 's/^\.\///' >> ~/my_file.csv" \;
Note: make sure you do not give a relative path to the output CSV file. execdir changes the working directory (and that is what makes pwd work).
Breakdown
find . -type f: find all files recursively, starting here.
-execdir sh -c "pwd | tr -d '\n' >> ~/my_file.csv; echo -n ';' >> ~/my_file.csv;: for each file, run pwd in the file's directory, strip the newline, and append the directory name to the output, then a semicolon, again with no newline.
echo {} | sed -e 's/^\.\///' >> ~/my_file.csv" \;: append the filename to the output, this time keeping the newline. By default find places ./ in front of the filename; the sed removes it.
If you don't need to go more than one level deep, this seems to work:
for i in */*; do echo "$i" | tr / \; ; done >> file.csv
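If you have GNU find, its -printf directives can also produce the two columns directly, without a shell loop; a sketch, assuming GNU findutils (%h is the directory part, %f the filename):
find . -type f -printf '%h;%f\n' | sed 's|^\./||' >> file.csv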

Grep and delete file

I run the following command to find malware; I would like to extend it with a pipe so that it deletes the files found to contain the string below (i.e., delete the results returned by grep).
grep -rnw . -e "ALREADY_RUN_1bc29b36f342a82aaf6658785356718"
It returns a list of files:
./gallery.php:2:if (!defined('ALREADY_RUN_1bc29b36f342a82aaf6658785356718'))
./gallery.php:4:define('ALREADY_RUN_1bc29b36f342a82aaf6658785356718', 1);
./wp-includes/SimplePie/HTTP/db.php:2:if (!defined('ALREADY_RUN_1bc29b36f342a82aaf6658785356718'))
./wp-includes/SimplePie/HTTP/db.php:4:define('ALREADY_RUN_1bc29b36f342a82aaf6658785356718', 1);
./wp-includes/SimplePie/Parse/template.php:2:if (!defined('ALREADY_RUN_1bc29b36f342a82aaf6658785356718'))
./wp-includes/SimplePie/Parse/template.php:4:define('ALREADY_RUN_1bc29b36f342a82aaf6658785356718', 1);
./wp-includes/SimplePie/XML/file.php:2:if (!defined('ALREADY_RUN_1bc29b36f342a82aaf6658785356718'))
Here's a solution using xargs to process the files as they are listed on stdin. grep searches the contents of . recursively for the pattern; you don't seem to be using any regex features, so I changed the flag to -F (fixed string), and -l lists only the names of matching files. Note that this simple pipeline splits on all newlines, including newlines in file names:
$ grep -rl -F "ALREADY_RUN_1bc29b36f342a82aaf6658785356718" . | \
xargs -I'{}' rm '{}'
For the sake of completeness, here's a command that will work regardless of file name (using rm is safe because we know the path MUST begin with ./)
$ find . -type f -exec \
/bin/sh -c 'grep -q -F "$0" "$1" && rm "$1"' 'ALREADY_RUN_1bc29b36f342a82aaf6658785356718' '{}' \;
and a variant that processes multiple files per shell invocation; each file is tested individually so that only matching files are removed:
$ find . -type f -exec \
/bin/sh -c 'for f; do grep -q -F "$0" "$f" && rm "$f"; done' 'ALREADY_RUN_1bc29b36f342a82aaf6658785356718' '{}' +
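If your grep supports -Z/--null (GNU grep does), another newline-safe option is to have grep emit NUL-terminated file names and let xargs split on those; a sketch:
$ grep -rlZ -F "ALREADY_RUN_1bc29b36f342a82aaf6658785356718" . | xargs -0 rm --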

shell script or linux command to recursively find all js/css file under public folder

I need a shell script or Linux command that recursively finds all JS/CSS files under the public folder, then creates filename.min.jsm in the same directory as each file found, and also puts the file's name (filename.js) inside that filename.min.jsm.
For example
public/test/a.js
public/b.js
public/test2/test3/c.js
output:
public/test/a.js
public/test/a.min.jsm -> a.js is written inside of this file
public/b.js
public/b.min.jsm -> b.js is written inside of this file
public/test2/test3/c.js
public/test2/test3/c.min.jsm -> c.js is written inside of this file
Here is a simple refactoring of @choroba's answer which inlines the shell script, so you don't need a separate file.
find \( -name '*.js' -o -name '*.css' \) -exec sh -c '
filename=$1
path=${filename%/*}
basename=${filename##*/}
prefix=${basename%%.*}
echo "$basename" > "$path/$prefix".min.jsm
' _ {} \;
I added -o -name '*.css' and a set of parentheses for grouping the conditions, too. If you don't want to run this on CSS files, revert that change, or add a conditional to the embedded shell script snippet.
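For example, a possible sketch of such a conditional (here JS files get their name written into the .min.jsm while CSS files just get an empty one, which may or may not be what you want):
find \( -name '*.js' -o -name '*.css' \) -exec sh -c '
filename=$1
path=${filename%/*}
basename=${filename##*/}
prefix=${basename%%.*}
case $filename in
*.js) echo "$basename" > "$path/$prefix".min.jsm ;;  # write the name for JS files
*.css) : > "$path/$prefix".min.jsm ;;                # create an empty file for CSS
esac
' _ {} \;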
Create the following script:
#!/bin/bash
filename=$1
path=${filename%/*}
basename=${filename##*/}
prefix=${basename%%.*}
echo "$basename" > "$path/$prefix".min.jsm
Then, run
find -name '*.js' -exec /path/to/script.sh {} \;
It's not clear what you want to do with the css files, but that's left as an exercise for the reader.
You can try the following script:
#!/bin/bash
export MY_TMPDIR=$(mktemp -d)
trap 'rm -rf ${MY_TMPDIR}' EXIT
export fstyle_tmp="${MY_TMPDIR}/fstyle"
find . -regextype posix-extended -regex '.+\.(js|css)' > "${fstyle_tmp}"
while IFS= read -r line; do
line2=$(echo "${line}" | sed -e 's/\.js$//' -e 's/\.css$//')
cp "${line}" "${line2}.min.jsm"
done < "${fstyle_tmp}"
exit 0
#EOF
If you don't want to copy the CSS content, change:
line2=$(echo "${line}" | sed -e 's/\.js$//' -e 's/\.css$//')
cp "${line}" "${line2}.min.jsm"
to:
line2=$(echo "${line}" | sed -e 's/\.js$//' -e 's/\.css$//')
if [[ "${line}" =~ \.js$ ]]; then
cp "${line}" "${line2}.min.jsm"
else
touch "${line2}.min.jsm"
fi

mkdir command for a list of filenames paths

I have a txt file with content like this:
/home/username/Desktop/folder/folder3333/IMAGw488.jpg
/home/username/Desktop/folder/folder3333/IMAG04f88.jpg
/home/username/Desktop/folder/folder3333/IMAGe0488.jpg
/home/username/Desktop/folder/folder3333/IMAG0r88.jpg
/home/username/Desktop/folder/folder3333/
/home/username/Desktop/folder/
/home/username/Desktop/folder/IMAG0488.jpg
/home/username/Desktop/folder/fff/fff/feqw/123.jpg
/home/username/Desktop/folder/fffa/asd.png
....
These are file paths, but also paths of folders.
The problem I want to solve is to create all the folders that don't exist, i.e. to call mkdir for every folder that is missing.
What is an easy way to do this?
Thanks
This can be done with bash's native string manipulation, so mkdir is the only external command needed:
while IFS= read -r line; do mkdir -p "${line%/*}"; done < infile
Or perhaps with just a single call to mkdir, if you have bash 4.x:
mapfile -t arr < infile; mkdir -p "${arr[@]%/*}"
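To see what that expansion does, ${arr[@]%/*} applies the suffix-strip to every array element:
arr=(/home/u/Desktop/folder/a.jpg /home/u/Desktop/b.png)
printf '%s\n' "${arr[@]%/*}"
# prints:
# /home/u/Desktop/folder
# /home/u/Desktop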
How about...
for p in $(xargs < somefile.txt);
do
mkdir -p "$(dirname "${p}")"
done
xargs -n 1 dirname <somefile.txt | xargs mkdir -p
It can also be done without a loop (provided the input file is not huge):
mkdir -p $(perl -pe 's#/(?!.*/).*$##' file.txt)
(The perl expression strips everything from the last / onward, leaving just the directory part of each line.)
If you have a file "file1" with the filenames, you could try this one-liner:
cat file1 | xargs -I{} dirname "{}" | sort -u | xargs -I{} mkdir -p "{}"
Use of:
xargs -I{} mkdir -p "{}"
ensures that even path names containing spaces are created correctly.
Using a perl one-liner and File::Path qw(make_path):
perl -MFile::Path=make_path -lne 's{[^/]+$}{}; make_path $_ if length' dirlist.txt
(The substitution strips a trailing file name component first; since the list mixes files and directories, a plain make_path $_ would otherwise create directories named after the files.)

How do I send multiple results from one command to another in bash?

I'm not sure if this is possible in one line (i.e., without writing a script), but I want to run an ls | grep command and then for each result, pipe it to another command.
To be specific, I've got a directory full of images and I only want to view certain ones. I can filter the images I'm interested in with ls | grep -i <something>, which will return a list of matching files. Then for each file, I want to view it by passing it in to eog.
I've tried simply passing the results in to eog like so:
eog $(ls | grep -i <something>)
This doesn't quite work as it will only open the first entry in the result list.
So, how can I execute eog FILENAME for each entry in the result list without having to bundle this operation into a script?
Edit: As suggested in the answers, I can use a for loop like so:
for i in `ls | grep -i ...`; do eog $i; done
This works, but the loop waits to iterate until I close the currently opened eog instance.
Ideally I'd like for n instances of eog to open all at once, where n is the number of results returned from my ls | grep command. Is this possible?
Thanks everybody!
I would use xargs:
$ ls | grep -i <something> | xargs -n 1 eog
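Since you want all the viewers open at once (as in your edit), GNU xargs can also launch them in parallel; a sketch, assuming GNU xargs:
$ ls | grep -i <something> | xargs -n 1 -P 0 eog
With -P 0, xargs runs as many eog processes simultaneously as possible.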
A bare ls piped into grep is sort of redundant given arbitrary?sh*ll-glo[bB] patterns (unless there are too many matches to fit on a command line, in which case the find | xargs combinations in other answers should be used).
eog is happy to take multiple file names so
eog pr0n*really-dirty.series?????.jpg
is fine and simpler.
Use find:
find . -mindepth 1 -maxdepth 1 -regex '...' -exec eog '{}' ';'
or
find . -mindepth 1 -maxdepth 1 -regex '...' -print0 | xargs -0 -n 1 eog
If the pattern is not too complex, then globbing is possible, making the call much easier:
for file in *.png
do
eog -- "$file"
done
Bash also has builtin regex support:
pattern='.+\.png'
for file in *
do
[[ $file =~ $pattern ]] && eog -- "$file"
done
Never use ls in scripts, and never use grep to filter file names.
#!/bin/bash
shopt -s nullglob
for image in *pattern*
do
eog "$image"
done
Bash 4
#!/bin/bash
shopt -s nullglob
shopt -s globstar
for image in **/*pattern*
do
eog "$image"
done
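For reference, what those shopt settings change:
shopt -s nullglob   # a pattern with no matches expands to nothing instead of itself
shopt -s globstar   # bash 4+: ** matches recursively, so **/*pattern* descends into subdirectories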
Try looping over the results:
for i in `ls | grep -i <something>`; do
eog $i
done
Or you can one-line it:
for i in `ls | grep -i <something>`; do eog $i; done
Edit: If you want the eog instances to open in parallel, launch each in a new process with eog $i &. The updated one-liner would then read:
for i in `ls | grep -i <something>`; do (eog $i &); done
If you want more control over the number of arguments passed on to eog, you may use "xargs -L" in combination with "bash -c":
printf "%s\n" {1..10} | xargs -L 5 bash -c 'echo "$@"' arg0
ls | grep -i <something> | xargs -L 5 bash -c 'eog "$@"' arg0
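With the printf example above, each bash -c invocation receives five lines as arguments, so it prints:
1 2 3 4 5
6 7 8 9 10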
