Perform grep on individual results of a command spawned by xargs - linux

Is it possible to grep the result of a command spawned by xargs?
As an example I am trying the following command
findbranch prj-xyz -latest|sed 's/^\(.*\/.*\)##.*$/\1/'|xargs -I {} cleartool lsh {}|grep -m 1 'user'
but it seems like grep is executed on the entire result set returned by findbranch, rather than on the individual results of lsh.
As an example of what I want from the above: for every file returned by findbranch and sed combined, I would like to find the version that was last modified by a certain user.
Note: in case it is of concern, findbranch is an internal utility.

How about this approach?
.... | xargs -I {} bash -c "cleartool lsh {}|grep -m 1 'user'"
I guess this is self-explanatory: each bash -c invocation runs the whole lsh-plus-grep pipeline on a single input, so grep -m 1 applies per file rather than to the combined output.
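Here is a self-contained sketch of the behavior, with printf standing in for cleartool lsh (the item names a, b, c are just placeholders):
# each bash -c run pipes one item's listing into its own grep,
# so -m 1 yields at most one match per item, not one match overall
printf '%s\n' a b c | xargs -I {} bash -c "printf '%s\n' '{} v2 user' '{} v1 user' | grep -m 1 'user'"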

Why not use a two-phase command? Something like:
findbranch prj-xyz -latest|sed 's/^\(.*\/.*\)##.*$/\1/' > /tmp/x ; for x in `cat /tmp/x`; do echo $x; done
Once you see that $x is the input you need, you can replace the echo with further manipulation (xargs, or the real command).
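For example, a sketch of the finished form, assuming the same cleartool/grep step as in the question (a while read loop also avoids the word-splitting pitfalls of for x in `cat ...`):
findbranch prj-xyz -latest|sed 's/^\(.*\/.*\)##.*$/\1/' > /tmp/x
# the loop body runs once per line of /tmp/x
while read -r x; do
    cleartool lsh "$x" | grep -m 1 'user'
done < /tmp/x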

If you have GNU Parallel this ought to work:
findbranch prj-xyz -latest|sed 's/^\(.*\/.*\)##.*$/\1/'|parallel cleartool lsh {}'|'grep -m 1 'user'
It will still spawn multiple shells, but at least you can use more CPUs to process it.
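Equivalently, you can quote the whole pipeline so each input gets its own shell command; here is a runnable sketch with echo standing in for cleartool lsh:
# one quoted pipeline per input; grep -m 1 applies per item
parallel "echo 'version of {} by user' | grep -m 1 'user'" ::: file1 file2 file3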

Related

Pass each file obtained from a command to another command as a parameter

I am using the following line to take a pdf and split it:
pdfseparate -f 14 -l 23 ALF.SS.0.pdf "${FILE}"-%d.pdf
Now I want for each file produced, to run several commands like this:
pdfcrop --margins '-30 0 -385 0' outputOfpdfSeparate outputOfpdfSeparate-1stCol.pdf
I am trying to figure out the best way to do this:
With a single loop: for each file created by pdfseparate, if I could "know" the name of the file, I could pass it to pdfcrop and be done. But since pdfseparate uses %d, I do not know how to handle this "new name" in which each file gets a new number. I know how to do this in Java, but here I do not see it so clearly.
Using pipes. I think I have the same issue since if I do
pdfseparate [options] | pdfcrop inputfile outputfile,
I do not know how to "use" the name of inputfile. I am sure it is easy, but I don't see it.
Using xargs. I am studying this command since it is new to me.
Using exec. I am under the impression this is not necessary but maybe I am wrong since it's been a long while since I last used exec.
Thanks in advance.
You can use xargs; combined with its parallel option it is also fast.
I usually use it for converting a lot of .mp4 files to .mp3.
Doing this conversion one by one is not only tedious but also takes a long time, so you can use the automatic parallel mechanism of xargs via its -P 0 option.
For example, if I had 10 .mp4 files I would do this:
ls *.mp4 | xargs -I xxx -P 0 ffmpeg -i xxx xxx.mp3
After running this line, 10 ffmpeg commands run simultaneously.
The other way to do this is to store the list of .mp4 files in a text file, like this:
ls *.mp4 > list-mp4
then:
xargs -I xxx -P 0 ffmpeg -i xxx xxx.mp3 < list-mp4
Or you may have access to GNU parallel, in which case you can do:
parallel ffmpeg -i {} {}.mp3 ::: *.mp4
Now for your case: whether you use these (xargs or parallel) or your own command, note that the first command must send its output to stdout, because the second command reads its stdin from the first command's stdout; the pipe (|) is what connects the two, and the shell sets this up for you.
So pdfseparate can only feed a pipeline if it actually writes something to stdout; if it does not, the right-hand side of the pipe (the second command) has nothing to work with.
For example
ls *.txt | echo {}
Here echo does not read the incoming stdout from the ls command at all; it just prints {} literally.
So pdfseparate would have to send the names to stdout; xargs would then store each one in the -I placeholder (any name you like) and call your second command with it.
Therefore:
pdfseparate options... | xargs -I ABC -P 0 your-second-command+its-options ABC
NOTE 1: xargs stores the incoming stdout line by line in ABC and passes it to your second command as its argument.
NOTE 2: you do not have to use -P 0 at all; it only speeds up execution. If you omit it, your second command runs sequentially, one invocation per incoming line.
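A quick way to see what -P 0 buys you (sleep here is just a hypothetical stand-in for a slow command such as ffmpeg):
# with -P 0 the three jobs run concurrently and finish in about 1 second;
# drop -P 0 and the same line takes about 3 seconds
seq 3 | xargs -I N -P 0 sh -c 'sleep 1; echo "job N done"'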
pdfseparate does not print the names of the files it creates, so you have to use the ls command (or a glob) to get the list of files you want to operate on.
# separate the pdfs
pdfseparate -f 14 -l 23 ALF.SS.0.pdf "${FILE}"-%d.pdf
# operate on the just created files; assumes that a "FILE" variable is set, which might not be the case
# (the glob must stay outside the quotes, or it will never expand)
for i in "${FILE}"-*.pdf; do pdfcrop --margins '-30 0 -385 0' "$i"; done
# assuming that FILE in your case would match ALF.SS.0-[0-9]*.pdf, you'd use this
# (letting the shell expand the glob is safer than parsing ls output):
for i in ALF.SS.0-[0-9]*.pdf; do pdfcrop --margins '-30 0 -385 0' "$i"; done
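Putting both steps together with xargs (a sketch; the ALF.SS.0-N.pdf names and the -1stCol.pdf suffix follow the question's hypothetical naming):
pdfseparate -f 14 -l 23 ALF.SS.0.pdf ALF.SS.0-%d.pdf
# crop each page that pdfseparate just wrote, one pdfcrop call per file;
# output names end up as <page>.pdf-1stCol.pdf, so adjust to taste
printf '%s\n' ALF.SS.0-[0-9]*.pdf | xargs -I F pdfcrop --margins '-30 0 -385 0' F F-1stCol.pdf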

understanding linux arguments and piping

So I'm trying to use sh (the Bourne shell) to write some scripts, and I keep running into this confusion. For the following:
1. rm `echo test`
2. echo test | rm
I know backticks are used to run the command first, okay.
But for piping in #2, why doesn't rm take test as an argument? Is there something about piping I don't understand? I thought it simply sent the output of one command as the input to another.
And... related to my piping confusion maybe.
dir=/blah/blar/blar
files=`ls ${dir} -rt`
count=`wc -l $files` # doesn't work; in fact it runs wc against each file that exists
count2=`$files | wc -l` # doesn't work
How come I can't store the ls into "files" and use that?
You would need to use xargs there, as rm takes the things to delete as arguments; it doesn't read from STDIN (which is what a pipe feeds).
echo test | xargs rm
The first one works because backticks are for substitutions, much like ${} but not as easy. :)
Alternatively, you could use find.
find . -name test -exec rm -f '{}' \;
In the first case the results of echo test (the string test) are being provided as a command-line argument to rm. In the second, the string test is being piped to the stdin file descriptor of the rm process. These are two very different things. Since rm doesn't read from stdin, it never sees test.
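You can watch the difference directly (a minimal demo; the error text on the second line is what GNU rm prints when given no operands):
# argv: the substitution becomes the argument, so this is just: rm test
rm `echo test`
# stdin: rm never reads it, so there is no file name to act on
echo test | rm        # rm: missing operand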

Bash command substitution with a variable

I'm new to bash scripting and I've been learning as I go with a small project I'm taking on. However, I've run into a problem that I cannot seem to get past.
I have a variable that I need to include in a command. When run directly in the shell (with the variable manually typed), the command returns the expected result. However, I can't get it to work when using a variable.
So, if I manually run this, it correctly returns 0 or 1, depending if it is running or not.
ps -ef | grep -v grep | grep -c ProcessName
However, when I try to embed that into this while clause, it always evaluates to 0 because it's not searching for the correct text.
while [ `ps -ef | grep -v grep | grep -c {$1}` -ne 0 ]
do
sleep 5
done
Is there a way I can accomplish this? I've tried a myriad of different things to no avail. I also tried using the $() syntax for command substitution, but I had no luck with that either.
Thanks!
I think that instead of {$1} you mean "$1". Also, you can just do pgrep -c "$1" instead of the two pipes.
In addition, there's also no need to compare the output of grep -c with 0, since you can just see if the command failed or not. So, a much simplified version might be:
while pgrep "$1" > /dev/null
do
sleep 4
done
You should really use -C with ps rather than the messy pipes if you're using the full process name. If you're interested in substring matching, then your way is the only thing I can think of.
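For completeness, a sketch of the same wait loop using ps -C, assuming $1 is the exact command name (with procps ps the exit status should be non-zero once nothing matches):
# ps -C selects by exact command name; the loop ends when none is left
while ps -C "$1" > /dev/null
do
    sleep 5
done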

Linux command to list all available commands and aliases

Is there a Linux command that will list all available commands and aliases for this terminal session?
As if you typed 'a' and pressed tab, but for every letter of the alphabet.
Or running 'alias' but also returning commands.
Why? I'd like to run the following and see if a command is available:
ListAllCommands | grep searchstr
You can use the bash(1) built-in compgen
compgen -c will list all the commands you could run.
compgen -a will list all the aliases you could run.
compgen -b will list all the built-ins you could run.
compgen -k will list all the keywords you could run.
compgen -A function will list all the functions you could run.
compgen -A function -abck will list all the above in one go.
Check the man page for other completions you can generate.
To directly answer your question:
compgen -ac | grep searchstr
should do what you want.
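If you only need a yes/no availability test, a hedged variation (grep -qx matches the whole line exactly and prints nothing):
if compgen -ac | grep -qx searchstr; then
    echo "searchstr is available"
fi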
Add to .bashrc
function ListAllCommands
{
echo -n $PATH | xargs -d : -I {} find {} -maxdepth 1 \
-executable -type f -printf '%P\n' | sort -u
}
If you also want aliases, then:
function ListAllCommands
{
COMMANDS=`echo -n $PATH | xargs -d : -I {} find {} -maxdepth 1 \
-executable -type f -printf '%P\n'`
ALIASES=`alias | cut -d '=' -f 1`
echo "$COMMANDS"$'\n'"$ALIASES" | sort -u
}
There is the
type -a mycommand
command which lists all aliases and commands in $PATH where mycommand is used. Can be used to check if the command exists in several variants. Other than that... There's probably some script around that parses $PATH and all aliases, but don't know about any such script.
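For example (illustrative output; what you see depends on your own aliases and PATH):
$ type -a ls
ls is aliased to `ls --color=auto'
ls is /usr/bin/ls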
The other answers' commands didn't work for me on embedded systems, because they require bash or a more complete version of xargs (busybox is limited).
The following commands should work on any Unix-like system.
List by folder:
ls $(echo $PATH | tr ':' ' ')
List all commands by name
ls $(echo $PATH | tr ':' ' ') | grep -v '/' | grep . | sort
Use "which searchstr". Returns either the path of the binary or the alias setup if it's an alias
Edit:
If you're looking for a list of aliases, you can use:
alias -p | cut -d= -f1 | cut -d' ' -f2
Add that to whichever PATH-searching answer you like. This assumes you're using bash.
Try this script:
#!/bin/bash
echo $PATH | tr : '\n' |
while read e; do
for i in $e/*; do
if [[ -x "$i" && -f "$i" ]]; then
echo "$i"
fi
done
done
For Mac users (find doesn't have -executable and xargs doesn't have -d):
echo $PATH | tr ':' '\n' | xargs -I {} find {} -maxdepth 1 -type f -perm '++x'
Alternatively, you can get a convenient list of commands coupled with quick descriptions (as long as the command has a man page, which most do):
apropos -s 1 ''
-s 1 returns only "section 1" manpages which are entries for executable programs.
'' is a search for anything. (If you use an unquoted asterisk, the shell first expands it to all the files and folders in your current working directory.)
Then you just grep it like you want.
apropos -s 1 '' | grep xdg
yields:
xdg-desktop-icon (1) - command line tool for (un)installing icons to the desktop
xdg-desktop-menu (1) - command line tool for (un)installing desktop menu items
xdg-email (1) - command line tool for sending mail using the user's preferred e-mail composer
xdg-icon-resource (1) - command line tool for (un)installing icon resources
xdg-mime (1) - command line tool for querying information about file type handling and adding descriptions for new file types
xdg-open (1) - opens a file or URL in the user's preferred application
xdg-screensaver (1) - command line tool for controlling the screensaver
xdg-settings (1) - get various settings from the desktop environment
xdg-user-dir (1) - Find an XDG user dir
xdg-user-dirs-update (1) - Update XDG user dir configuration
The results don't appear to be sorted, so if you're looking at a long list, you can throw a | sort | into the middle and then pipe that to a pager like less/more/most, like so:
apropos -s 1 '' | sort | grep zip | less
This returns a sorted list of all commands that have "zip" in their name or their short description, and pumps it into the less pager. (You could also replace less with $PAGER to use the default pager.)
Try pressing ALT-? (Alt and the question mark at the same time). Give it a second or two to build the list. It should work in bash.
Here's a solution that gives you a list of all executables and aliases. It's also portable to systems without xargs -d (e.g. Mac OS X), and properly handles paths with spaces in them.
#!/bin/bash
(echo -n "$PATH" | tr : '\0' | xargs -0 -n 1 ls; alias | sed 's/alias \([^=]*\)=.*/\1/') | sort -u | grep "$@"
Usage: myscript.sh [grep-options] pattern, e.g. to find all commands that begin with ls, case-insensitive, do:
myscript -i ^ls
It's useful to list the commands based on the keywords associated with the command.
Use: man -k "your keyword"
Feel free to combine it with: | grep "another word"
for example, to find a text editor:
man -k editor | grep text
A shortcut method to list all commands:
Open a terminal and press the Tab key twice.
That shows all the commands available in the terminal.
You can always do the following:
1. Take the value of the $PATH environment variable.
2. Split it on ":".
3. For each entry, run ls on that directory.
4. grep for your command in that output.
The shell will only execute commands that are listed in the PATH directories anyway.
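Spelled out as a small sketch (your-command is a placeholder):
# split $PATH on ':', list every entry, then grep the combined output
echo "$PATH" | tr ':' '\n' | while read -r entry; do
    ls "$entry"
done | grep your-command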
It depends; by that I mean it depends on what shell you are using. Here are the constraints I see:
It must run in the same process as your shell, to catch aliases, functions, and variables that affect which commands you can find; think PATH, or EDITOR, although EDITOR might be out of scope. You can have unexported variables that affect things.
It is shell-specific, or you're going off into the kernel; /proc/pid/environ and friends do not have enough information.
I use zsh, so here is a zsh answer. It does the following:
dumps path
dumps alias names
dumps functions that are in the env
sorts them
here it is:
feed_me() {
(alias | cut -f1 -d= ; hash -f; hash -v | cut -f 1 -d= ; typeset +f) | sort
}
If you use zsh this should do it.
The problem is that tab completion searches your path, but not all commands are in your path.
To find the commands in your path using bash, you could do something like:
for x in $(echo "$PATH" | tr ':' ' '); do ls "$x"; done
Here's a function you can put in your bashrc file:
function command-search
{
oldIFS=${IFS}
IFS=":"
for p in ${PATH}
do
ls "$p" | grep "$1"
done
IFS=${oldIFS}
}
Example usage:
$ command-search gnome
gnome-audio-profiles-properties*
gnome-eject#
gnome-keyring*
gnome-keyring-daemon*
gnome-mount*
gnome-open*
gnome-sound-recorder*
gnome-text-editor#
gnome-umount#
gnome-volume-control*
polkit-gnome-authorization*
vim.gnome*
$
FYI: IFS is a variable that bash uses to split strings.
Certainly there could be some better ways to do this.
Maybe I'm misunderstanding, but what if you press Escape until you get the "Display all X possibilities" prompt?
compgen -c > list.txt && wc list.txt
Why don't you just type:
searchstr
In the terminal.
The shell will say something like
searchstr: command not found
EDIT:
OK, I'll take the downvote, because the answer is stupid; I just want to know: what's wrong with this answer? The asker said:
and see if a command is available.
Typing the command will tell you if it is available!.
Probably he/she meant "without executing the command" or "to include it in a script", but I cannot read minds (it's not that I can't normally, it's just that he's wearing a mind-reading deflector).
In Debian: ls /bin/ | grep "whatImSearchingFor"

How to execute a command with one parameter at a time in the *nix shell?

Some commands, like svn log for example, will only take one input from the command line, so I can't say grep 'pattern' | svn log; it will only return the information for the first file, so I need to execute svn log against each one independently.
I can do this with find using its -exec option: find -name '*.jsp' -exec svn log {} \;. However, grep and find provide different functionality, and the -exec option isn't available for grep or a lot of other tools.
So is there a generalized way to take output from a unix command line tool and have it execute an arbitrary command against each individual output independent of each other like find does?
The answer is xargs -n 1.
echo moo cow boo | xargs -n 1 echo
outputs
moo
cow
boo
try xargs:
grep 'pattern' | xargs svn log
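Note that plain xargs batches many arguments onto a single svn log call; to match the question's one-at-a-time requirement you can add -n 1, and use grep -l so that file names (not matching lines) are what gets passed along. A sketch, assuming the *.jsp files from the question:
# grep -l prints only the names of matching files; -n 1 runs svn log once per name
grep -l 'pattern' *.jsp | xargs -n 1 svn log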
A little one-off shell script (though using xargs is much better for a one-off; that's why it exists):
#!/bin/sh
# loop over every file name passed to the script; "$@" preserves each argument
for file in "$@"
do
svn log "$file"
done
You could name it 'multilog' or something like that. Call it like this:
./multilog.sh foo.c abc.php bar.h Makefile
It allows for a little more sanity when being called by automated build scripts, e.g. to test the existence of each file before talking to SVN, redirect each output to a separate file, insert it into an SQLite database, etc.
That may or may not be what you are looking for.
