Shell script: how to read the arguments after '|' (pipe) - linux

Hi, I was wondering how to read the arguments that come after the "|" pipe from a shell script.
For example, when I run ./tmp.sh ls -la | sort
my script only receives two arguments, "ls" and "-la".
Is there any way to read "| sort" without modifying the command, using only a shell script?
Thanks a lot!!

One way would be to pass the entire command as a string to your script.
./tmp.sh -c "ls -la | sort"
...or without a flag...
./tmp.sh "ls -la | sort"
Afterward, you can split the string into an array in your script.
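Note that splitting on whitespace alone will not turn the | back into a real pipe. A minimal sketch of the string-passing approach, using eval to re-parse the string (run_cmd is an illustrative name; the usual caution applies that eval must never see untrusted input):

```shell
# run_cmd: execute an entire command line passed as a single string.
# eval re-parses the string, so the "|" becomes a real pipe again.
run_cmd() {
  eval "$1"
}

# Usage: run_cmd "ls -la | sort"
```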

I guess you could check ps axf, but part of the beauty of pipes is the loose coupling they give: bash knows what is in your pipeline, while the individual pieces of the pipeline do not. This is what makes writing filters simple.

Related

How can I use xargs to run a function in a command substitution for each match?

While writing Bash functions for string replacements I have encountered a strange behaviour when using xargs. It is currently driving me mad, as I cannot get it to work.
Fortunately I have been able to nail it down to the following simple example:
Define a simple function which doubles every character of the given parameter:
function subs { echo $1 | sed -E "s/(.)/\1\1/g"; }
Call the function:
echo $(subs "ABC")
As expected the output is:
AABBCC
Now call the function using xargs:
echo "ABC" | xargs -I % echo $(subs "%")
Surprisingly the result now is:
ABCABC
It seems as if the sed command inside the function treats the whole string now as a single character.
Why does this happen and how can it be prevented?
You might ask, why I use xargs at all. Of course, this is a simplified example and the actual use case is much more complex.
In the original use case, I have a program which produces lots of output. I pipe the output through several greps to get the lines of interest. Afterwards, I pipe the lines to sed to extract the data I need from them. Because some transformations I need to do on the data are too complex to do with regular expressions alone, I'd like to use a function for these. So, my original idea was to simply pipe into the function, but I couldn't get that to work and ended up with the xargs solution. My original idea was something like this:
command | grep ... | grep ... | grep ... | sed ... | subs
BTW: I do not do this from the command line but from within a script. The function is defined in the very same script in which it is used.
I'm using Bash 3.2 (Mac OS X default), so fancy Bash 4.x stuff won't help me, sorry.
I'll be happy about everything which might shed some light on this topic.
Best regards
Frank
If you really need to do this (and you probably don't, but we can't help without a more representative sample), a better-practice approach might look like:
subs() { sed -E "s/(.)/\1\1/g" <<<"$1"; }
export -f subs
echo "ABC" | xargs bash -c 'for arg; do subs "$arg"; done' _
The use of echo "$(subs "$arg")" instead of just subs "$arg" adds nothing but bugs (consider what happens if one of your arguments is -n -- and that's assuming a relatively tame echo; they're allowed to consume backslashes even without a -e argument and to do all manner of other surprising things). You could do it above, but it slows your program down and makes it more prone to surprising behaviors; there's no point.
Running export -f subs exports your function to the environment, so it can be run by other instances of bash invoked as child processes (all programs invoked by xargs run outside your shell, so they can't see non-exported shell variables or functions).
Without -I -- which is to say, in its default mode of operation -- xargs appends arguments to the end of the command it's given. This permits a much more efficient usage mode, where instead of invoking one command per line of input, it passes as many arguments as possible to the shortest possible number of subprocesses.
This also avoids major security bugs that can happen when using xargs -I in conjunction with bash -c '...' or sh -c '...'. (If you ever use -I% sh -c '...%...', then your filenames become part of your code, and are able to be used in injection attacks on your system).
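A sketch of that injection risk, using a harmless $(whoami) as a stand-in for attacker-controlled input such as a filename:

```shell
# UNSAFE: with -I, the input is spliced into the code string,
# so the command substitution executes inside sh.
printf '%s\n' '$(whoami)' | xargs -I% sh -c 'echo %'

# SAFE: the input arrives as a positional parameter and stays plain data.
printf '%s\n' '$(whoami)' | xargs sh -c 'echo "$1"' _
```

The unsafe variant prints your username (the substitution ran); the safe variant prints the literal string $(whoami).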
That's because the construct $(subs "%") gets expanded by the shell while it parses the pipeline, before xargs ever runs: subs "%" turns the literal % into %%, so xargs actually executes echo %% and then substitutes ABC for each %, producing ABCABC.

How to use output directly as a command

With this grep I can show a command I used earlier:
echo `history | grep "ssh root" | head -1| cut -c6-`
with this output:
ssh root@107.170.70.100
I want the output to directly execute as the command instead of printed.
How can I do it?
In principle, this can be done by using the $() format, so
$(history | grep "ssh root" | head -1| cut -c6-)
should do what you ask for. However, I don't think that it is advisable, as this will automatically execute whatever command results from your grep; if you made a mistake, a lot of bad things can happen. Instead I suggest reviewing your result before re-executing. Bash history has a lot of nice shortcuts for this kind of thing. As an example, imagine:
> history | grep "ssh root"
756 ssh root@107.170.70.100
you can call this command on line 756 easily by typing
!756
It's definitely much safer. Hope this helps.
Ideally you'd be using the $(cmd) syntax rather than the `cmd` syntax. This makes it easier to nest subshells as well as keep track of what's going on.
That aside, if you remove the echo statement it will run the command:
# Prints out ls
echo $( echo ls )
# Runs the ls command
$( echo ls )
Use eval.
$ eval `history | grep "ssh root" | head -1| cut -c6-`
From eval command in Bash and its typical uses:
eval takes a string as its argument, and evaluates it as if you'd typed that string on a command line.
And from the Bash manual (https://www.gnu.org/software/bash/manual/html_node/Bourne-Shell-Builtins.html#Bourne-Shell-Builtins):
eval
eval [arguments]
The arguments are concatenated together into a single command, which is then read and executed, and its exit status returned as the exit status of eval. If there are no arguments or only empty arguments, the return status is zero.
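A self-contained sketch of the review-then-execute pattern (a harmless stand-in command replaces the history lookup, which would be empty in a non-interactive script):

```shell
# Capture the command text first, show it for review, then execute it.
cmd='echo hello | tr a-z A-Z'   # stand-in for: history | grep "ssh root" | head -1 | cut -c6-
printf 'About to run: %s\n' "$cmd"
eval "$cmd"
```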

Linux command line, reverse polish notation

ls /tmp
How can I run the same command but using reverse polish notation?
Is there a mode that would allow me to do this or something similar to that?
I could use xargs but that's a lot more typing:
echo /tmp | xargs ls
This would be ideal:
/tmp ls
or
/tmp | ls
Bash (I assume you are using it) is a shell for Unix-like systems.
As far as I know, bash doesn't provide such a mode. You could use a different shell that provides this feature. Searching in the web, this was my first result: https://github.com/iconmaster5326/RPOS, but maybe it is far from stable ;)
Alternatively, you can make a command that reverses its argument list and executes the result.
The usage would be like this:
reversex /tmp ls
reversex A.txt B.txt cp
Here is an example of such a command:
#!/bin/bash
CMDLINE=""
for i in "$@"
do
CMDLINE="$i $CMDLINE"
done
$CMDLINE
If you name it /usr/local/bin/reversex and make it executable, you should be able to run simple reversed commands with the prefix reversex. I cannot guarantee that it works. Note that the arguments are parsed twice and therefore have to be escaped twice, too.
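A variant that sidesteps the double parsing by building an array instead of a string, so each argument survives intact (a sketch, keeping the same hypothetical reversex name, written as a function for illustration):

```shell
# reversex: run the given arguments in reverse order.
# Using an array keeps each argument as one word (no re-splitting).
reversex() {
  local -a args=()
  local a
  for a in "$@"; do
    args=("$a" "${args[@]}")
  done
  "${args[@]}"
}

# reversex /tmp ls        -> runs: ls /tmp
# reversex A.txt B.txt cp -> runs: cp A.txt B.txt
```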

How to take advantage of filters

I've read here that
To make a pipe, put a vertical bar (|) on the command line between two commands.
then
When a program takes its input from another program, performs some operation on that input, and writes the result to the standard output, it is referred to as a filter.
So I've first tried the ls command whose output is:
Desktop HelloWord.java Templates glassfish-4.0
Documents Music Videos hs_err_pid26742.log
Downloads NetBeansProjects apache-tomcat-8.0.3 mozilla.pdf
HelloWord Pictures examples.desktop netbeans-8.0
Then ls | echo which outputs absolutely nothing.
I'm looking for a way to take advantages of pipelines and filters in my bash script. Please help.
echo doesn't read from standard input. It only writes its command-line arguments to standard output. The cat command is what you want: it copies what it reads from standard input to standard output.
ls | cat
(Note that the pipeline above is a little pointless, but it does demonstrate the idea of a pipe. For a pipe to be useful, the command on the right-hand side must read from standard input.)
Don't confuse command-line arguments with standard input.
echo doesn't read standard input. To try something more useful, try
ls | sort -r
to get the output sorted in reverse,
or
ls | grep '[0-9]'
to only keep the lines containing digits.
In addition to what others have said: if your command (echo in this example) does not read from standard input, you can use xargs to "feed" it arguments taken from standard input. So
ls | echo
doesn't work, but
ls | xargs echo
works fine.
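To take advantage of filters in your own script, you can also write one yourself: a filter is just code that reads standard input and writes standard output. A minimal sketch (shout is an illustrative name, not a standard command):

```shell
# shout: a tiny filter that reads stdin line by line and upper-cases it.
shout() {
  local line
  while IFS= read -r line; do
    printf '%s\n' "$line" | tr '[:lower:]' '[:upper:]'
  done
}

# ls | shout
```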

How to make bash know that | is a pipe and not a string

Hi my question is simple. I want to do this in a command prompt.
var="ls | cat"
$var
Now I know that when I try to do this manually
ls | cat
Bash treats | as something special: I know it's called a pipe, and bash actually creates a pipe between the commands. I also figured out that when I run $var, bash treats | as a plain string and not as a pipe. Well, my question is: how can I make bash realize that | is a pipe and not a string? Thanks, I hope my point is clear.
Simple solution: use eval:
var="ls | cat"
eval $var
bash interprets the arguments to eval as if you had typed that on the command line.
Of course, keep in mind the security risks to using eval with user input, in case that's an issue for your program.
This may or may not apply - but it sounds like you may be looking for the alias command. You can do alias var="ls | cat" and then in your command prompt you can do var and it treats it as if you wrote ls | cat
Rather than trying to embed executable code into a variable (which should be used to hold data, not code), use a shell function, which is intended to hold code:
my_func () {
ls | cat
}
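Functions can also take arguments, which a string variable cannot safely do; a minimal sketch (upper_word is an illustrative name, not from the question):

```shell
# upper_word: a function holding a pipeline; the | inside it is parsed
# as a real pipe every time the function runs.
upper_word() {
  echo "$1" | tr '[:lower:]' '[:upper:]'
}

# upper_word hello   -> HELLO
```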
| is called a pipe; I haven't heard any other name for it. Basically the stream output by the command on its left goes as the input to the command on its right. In your case, the output of ls goes into a stream (a buffer managed by the kernel, not a temporary file), and that stream is fed to cat. cat prints the content of a file, and the ls stream behaves very much like a file.
Now, you are trying to make bash interpret your variable var. To do this, try:
var=`ls | cat`
$var
On my computer I get this:
-bash: Applications: command not found
Because in my case, $var is expanded to Applications Documents Downloads, the output of my ls.
Given crudely as is, bash believes this is a command I want it to execute.
If your intention is not to execute the content of $var but to print it, try:
var=`ls | cat`
echo $var
The cat is not needed here; just use ls -1, and, as other answers say, you can alias it or put it in a function.
For example, if you want to override ls to print each file on a new line do something like
> alias ls='command ls -1'
> ls
file1
file2
etc...
And put it in a bash init file like ~/.bashrc if you want to make the change permanent
1) Functions are suitable for such tasks:
func (){
ls | cat
}
Invoke it by saying func
2) Also another suitable solution could be eval:
eval takes a string as its argument, and evaluates it as if you'd typed that string on a command line. (If you pass several arguments, they are first joined with spaces between them.)
var="ls | cat"
eval $var
