Multiple commands with xargs using sh -c not working - Linux

I am trying to run this command:
find . -name "new_impl.jar" | xargs -I '{}' sh -c 'java -jar jd-cli.jar --skipResources -n -g ALL '{}';'
It is not working, and the error suggests that the value of '{}' is not being picked up. When I remove the sh -c part, which is what lets me run multiple commands:
find . -name "new_impl.jar" | xargs -I '{}' java -jar jd-cli.jar --skipResources -n -g ALL '{}'
the command works fine. I am using Oracle Linux 7. Can someone tell me the reason behind this, and whether there is another way to run multiple commands?

I'd recommend passing the file name to sh as a positional argument instead of splicing it into the script text. In your original command the unquoted ; and stray quote after '{}' break the command line apart, and in general embedding {} inside an sh -c script is fragile (and a command-injection risk with unusual file names).
find . -name "new_impl.jar" |
xargs -I '{}' \
sh -c 'java -jar jd-cli.jar --skipResources -n -g ALL "$1"; rm "$1"; mv *.jar "$1"; unzip "$1" -d "$1".bk/; rm "$1"' _ {}
The _ placeholder becomes $0 inside the script, and the file name from xargs lands in $1. Note this will also work using -exec from find instead:
find -name "new_impl.jar" \
-exec sh -c 'java -jar jd-cli.jar --skipResources -n -g ALL "$1"; rm "$1"; mv *.jar "$1"; unzip "$1" -d "$1".bk/; rm "$1"' _ {} \;
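To see how the placeholder mapping works, here is a minimal sketch (the sample file names and the echo are purely illustrative):
# '_' becomes $0 inside the sh -c script; each file name arrives as $1
printf '%s\n' ./a.jar ./b.jar |
xargs -I '{}' sh -c 'echo "got file: $1 (script name: $0)"' _ {}
# got file: ./a.jar (script name: _)
# got file: ./b.jar (script name: _)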

alias for a bash cmd with xargs for directories fails

I use a command to collect all my projects by searching for a common file (Jenkinsfile) because I want to execute a command in every project directory:
find . -name 'Jenkinsfile' | sed s/Jenkinsfile// | xargs -L 1 bash -c '(cd $0 && git branch)'
To shorten this for future usage I tried to create an alias for it like this:
alias fgb="find . -name 'Jenkinsfile' | sed s/Jenkinsfile// | xargs -L 1 bash -c 'cd $0 && git branch'"
But now I only get error messages like this:
./shared/authorization-provider/: line 0: cd: /usr/bin/bash: No such file or directory
What is wrong?
EDIT:
I found a solution:
alias fgb="find . -name 'Jenkinsfile' | sed s/Jenkinsfile// | xargs -I {} bash -c 'cd {} && pwd && git branch' "
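For reference, here is a variant of that alias that passes the directory as a positional argument instead of splicing {} into the bash -c script, the same trick as in the first question above (a sketch; it behaves the same on ordinary paths but is safer with unusual directory names):
alias fgb='find . -name "Jenkinsfile" | sed s/Jenkinsfile// | xargs -I {} bash -c '\''cd "$1" && pwd && git branch'\'' _ {}'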
As suggested by Kamil, you should use functions instead of the more or less obsolete aliases. And find alone is enough for what you want:
fgb() {
    find . -name 'Jenkinsfile' -execdir git branch \;
}
-execdir runs its command argument from the directory in which a matching file was found.
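A quick way to see -execdir in action (a minimal sketch, assuming two project directories that each hold a Jenkinsfile):
mkdir -p proj1 proj2 && touch proj1/Jenkinsfile proj2/Jenkinsfile
find . -name 'Jenkinsfile' -execdir pwd \;
# prints the directory containing each match, e.g.:
# /tmp/demo/proj1
# /tmp/demo/proj2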
What is wrong?
$0 is inside the outer " quotes, so the current shell expands it while defining the alias: cd $0 becomes cd /usr/bin/bash, and that is not a directory.
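You can reproduce the early expansion directly (what $0 prints depends on how your shell was started):
# inside double quotes, $0 expands immediately in the current shell:
echo "cd $0 && git branch"
# prints e.g.: cd /usr/bin/bash && git branch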
The age-old advice is: use a function, not an alias.
fgb() {
    find . -name 'Jenkinsfile' | sed s/Jenkinsfile// | xargs -L 1 bash -c 'cd "$0" && git branch'
}

curl request with arguments from "find" command

I have a bash script which runs pg_dumpall and uploads the dump to my FTP server.
#!/bin/bash
timeslot=`date +\%Y-\%m-\%d-\%H:\%M:\%S`
backup_dir="/var/lib/backup"
name=recycle_backup_$timeslot.sql.gz
set -e
find /var/lib/backup/*.sql.gz -mtime +7 -type f -exec rm -rvf {} \; -exec curl -p --insecure ftp://aaa.bbb/backup/{} --user user:password -Q 'DELE backup/'{} \;
echo $timeslot "Running docker container to backup Postgresql"
sudo docker run --rm \
-t \
--name postgres-backup \
-v "/var/lib/backup":"/mnt" \
-w "/mnt" \
-e PGPASSWORD="password" \
--entrypoint=/usr/bin/pg_dumpall \
postgres:9 \
-h 1.2.3.4 -U postgres | gzip -9 > $backup_dir/$name
echo "Your back is successfully stored in " $backup_dir
curl -T /var/lib/backup/$name ftp://aaa.bbb/backup/ --user user:password
It cleans up all old .sql.gz files (older than 7 days).
After these steps, the script runs a docker container with Postgres, performs the backup, and saves it locally to /var/lib/backup/filename.sql.gz.
The problem is that I can't clean up the files on my FTP server using the arguments that find returns.
find /var/lib/backup/*.sql.gz -mtime +7 -type f -exec rm -rvf {} \; -exec curl -p --insecure ftp://aaa.bbb/backup/{} --user user:password -Q 'DELE backup/'{} \;
How do I get the {} argument from find into this curl request: -Q 'DELE backup/'{} \; ?
Thanks for the support.
Solved the problem with a second script that connects to the FTP server and deletes outdated files in a loop, based on this topic: Linux shell script for delete old files from ftp
That's not going to work: you can't reliably reuse {} like that with find, and even where it is substituted it expands to the whole path (/var/lib/backup/...), not just the file name your FTP URL needs.
Maybe try this approach? Remove the echo if this looks sane ...
find /var/lib/backup/*.sql.gz -mtime +7 -type f -exec rm -rvf {} + |
awk -F"'" '{print $2}' | while read -r path
do
    file=${path##*/}
    echo curl -p --insecure "ftp://aaa.bbb/backup/${file}" --user user:password -Q "DELE backup/${file}"
done
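An alternative sketch that avoids parsing rm's output altogether, using the same sh -c positional-argument pattern as the first question above (the host, credentials, and FTP layout are the question's placeholders):
find /var/lib/backup/*.sql.gz -mtime +7 -type f -exec sh -c '
    for f; do
        rm -v "$f"
        # the FTP server only needs the file name, so strip the directory part
        curl -p --insecure "ftp://aaa.bbb/backup/${f##*/}" --user user:password -Q "DELE backup/${f##*/}"
    done' _ {} +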

Grep and delete file

I run the following command to find malware; I would like to extend it with a pipe so that it deletes the files found to contain the string below (i.e. delete the results returned by grep).
grep -rnw . -e "ALREADY_RUN_1bc29b36f342a82aaf6658785356718"
It returns a list of files:
./gallery.php:2:if (!defined('ALREADY_RUN_1bc29b36f342a82aaf6658785356718'))
./gallery.php:4:define('ALREADY_RUN_1bc29b36f342a82aaf6658785356718', 1);
./wp-includes/SimplePie/HTTP/db.php:2:if (!defined('ALREADY_RUN_1bc29b36f342a82aaf6658785356718'))
./wp-includes/SimplePie/HTTP/db.php:4:define('ALREADY_RUN_1bc29b36f342a82aaf6658785356718', 1);
./wp-includes/SimplePie/Parse/template.php:2:if (!defined('ALREADY_RUN_1bc29b36f342a82aaf6658785356718'))
./wp-includes/SimplePie/Parse/template.php:4:define('ALREADY_RUN_1bc29b36f342a82aaf6658785356718', 1);
./wp-includes/SimplePie/XML/file.php:2:if (!defined('ALREADY_RUN_1bc29b36f342a82aaf6658785356718'))
Here's a solution using xargs to process files as they are listed from stdin.
Grep recursively searches the contents of . for the pattern, and -l makes it print only the names of matching files (you don't seem to be using any regex features, so I changed the flag to -F for a fixed string).
Here's a simple pipeline that will delete the files; note that it splits on newlines, so it will misbehave on file names that contain newlines.
$ grep -rl -F "ALREADY_RUN_1bc29b36f342a82aaf6658785356718" . | \
xargs -I'{}' rm '{}'
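With GNU grep and xargs, a NUL-delimited variant sidesteps the newline caveat entirely (a sketch assuming GNU tools; -Z terminates each file name with a NUL byte and -0 reads them back):
$ grep -rlZ -F "ALREADY_RUN_1bc29b36f342a82aaf6658785356718" . | xargs -0 -r rm --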
For the sake of completeness, here's a command that will work regardless of file name (using rm is safe because we know the path MUST begin with ./)
$ find . -type f -exec \
/bin/sh -c 'grep -q -F "$0" "$1" && rm "$1"' 'ALREADY_RUN_1bc29b36f342a82aaf6658785356718' '{}' \;
and deleting multiple files at once; here grep -l lists only the files that actually match (NUL-delimited via GNU grep's -Z), so only those are removed:
$ find . -type f -exec \
/bin/sh -c 'grep -lZ -F "$0" "$@" | xargs -0 -r rm --' 'ALREADY_RUN_1bc29b36f342a82aaf6658785356718' '{}' +
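Before deleting anything for real, it may be worth dry-running the same pipeline with echo in front of rm (output paths will vary):
$ find . -type f -exec \
/bin/sh -c 'grep -lZ -F "$0" "$@" | xargs -0 -r echo rm --' 'ALREADY_RUN_1bc29b36f342a82aaf6658785356718' '{}' +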

Any way to show the commands generated by a command containing 'find', 'xargs'?

For example, if I have a command:
find . -name "*.png" |xargs -I{} sh -c "mycommand {}"
If I have a.png and b.png in the current folder, I want a way to show the following, but not execute them:
mycommand a.png
mycommand b.png
Is there any way to achieve this?
You could change that command to
find . -name "*.png" |xargs -I{} echo "mycommand {}"
If you just want to show each command before executing it, you could use the -t option of xargs, which prints the command to stderr and then runs it:
find . -name "*.png" |xargs -t -I{} sh -c "mycommand {}"

How to avoid 'are the same file' warning message when using cp in Linux?

I'm trying to copy certain files from one directory to another, using this command:
find "$HOME" -name '*.txt' -type f -print0 | xargs -0 cp -t $HOME/newdir
I get a warning message saying
cp: '/home/me/newdir/logfile.txt' and '/home/me/newdir/logfile.txt'
are the same file
How can I avoid this warning message?
The problem is that you are trying to copy a file onto itself: newdir lives under $HOME, so files already sitting in it are found too and then copied onto themselves. You can avoid it by excluding the destination directory from the results of the find command, like this:
find "$HOME" -name '*.txt' -type f -not -path "$HOME/newdir/*" -print0 | xargs -0 cp -t "$HOME/newdir"
Try using install instead; it replaces the target by removing the file first:
install -v target/release/dynnsd-client target/
removed 'target/dynnsd-client'
'target/release/dynnsd-client' -> 'target/dynnsd-client'
and then remove the source files
Make the list unique in the process; this requires sorting:
find "$HOME" -name '*.txt' -type f -print0 | sort -zu | xargs -0 cp -t "$HOME/newdir"
Or, if it's fine to skip files that haven't changed, use the -u option of cp:
find "$HOME" -name '*.txt' -type f -print0 | xargs -0 cp -ut "$HOME/newdir"
-u copy only when the SOURCE file is newer than the destination file or when
the destination file is missing
install worked perfectly in a Makefile context with docker - thanks!
copy:
	#echo ''
	bash -c 'install -v ./docker/shell .'
	bash -c 'install -v ./docker/docker-compose.yml .'
	bash -c 'install -v ./docker/statoshi .'
	bash -c 'install -v ./docker/gui .'
	bash -c 'install -v ./docker/$(DOCKERFILE) .'
	bash -c 'install -v ./docker/$(DOCKERFILE_SLIM) .'
	bash -c 'install -v ./docker/$(DOCKERFILE_GUI) .'
	bash -c 'install -v ./docker/$(DOCKERFILE_EXTRACT) .'
	#echo ''
build-shell: copy
	docker-compose build shell
Try using rsync instead of cp:
find "$HOME" -name "*.txt" -exec rsync {} "$HOME"/newdir \;
