How to make bash know that | is a pipe and not a string - linux

Hi, my question is simple. I want to do this at a command prompt:
var="ls | cat"
$var
Now, I know that when I type this manually,
ls | cat
bash treats | as something special. I don't know what that's called exactly; I know | is called a pipe, but I mean that bash interprets | specially and actually creates a pipe. I also figured out that when I run $var, bash treats the | as a plain string and not as a pipe. So my question is: how can I make bash realize that | is actually a pipe and not a string? Thanks, I hope my point is clear.

Simple solution: use eval:
var="ls | cat"
eval $var
bash interprets the arguments to eval as if you had typed them on the command line.
Of course, keep in mind the security risks of using eval with user input, in case that's an issue for your program.
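For example, a minimal sketch of both the fix and the risk (the malicious string is purely hypothetical):
var="ls | cat"
eval "$var"                # runs: ls | cat

# If var came from untrusted input, anything could run:
var='ls; rm -rf "$HOME"'   # hypothetical malicious input
# eval "$var"              # would happily execute the rm as well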

This may or may not apply, but it sounds like you may be looking for the alias command. You can do alias var="ls | cat", and then at your command prompt you can type var and it is treated as if you had written ls | cat.
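For instance (using the name var from the question):
alias var="ls | cat"
var        # now behaves as if you typed: ls | cat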

Rather than trying to embed executable code into a variable (which should be used to hold data, not code), use a shell function, which is intended to hold code:
my_func () {
    ls | cat
}
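Then invoke it like any other command, for example:
my_func             # runs ls | cat
my_func | wc -l     # functions compose with pipes like ordinary commands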

| is called a pipe; I haven't heard any other name for it. Basically, the stream output by the command on its left becomes the input of the command on its right. In your case, the output of ls goes into the pipe (an in-memory buffer managed by the kernel, not a temporary file), and that stream is fed to cat. cat prints the content of a file, and the ls stream behaves very much like a file.
Now, you are trying to make bash interpret your variable var. To do this, try:
var=`ls | cat`
$var
On my computer I get this:
-bash: Applications: command not found
Because in my case, $var is expanded to Applications Documents Downloads, the output of my ls.
Given crudely as is, bash believes this is a command I want it to execute.
If your intention is not to execute the content of $var but to print it, try:
var=`ls | cat`
echo $var
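Note that unquoted, $var is word-split, so the newlines from ls collapse into spaces; quoting preserves them. A small sketch (using the same example output as above):
var=`ls | cat`
echo $var      # one line: Applications Documents Downloads
echo "$var"    # one entry per line, exactly as ls printed them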

The cat is not needed here; just use ls -1, and as other answers say, you can alias it or put it in a function.
For example, if you want to override ls to print each file on a new line, do something like:
> alias ls='command ls -1'
> ls
file1
file2
etc...
And put it in a bash init file like ~/.bashrc if you want to make the change permanent

1) Functions are suitable for such tasks:
func () {
    ls | cat
}
Invoke it by saying func
2) Another suitable solution is eval:
eval takes a string as its argument, and evaluates it as if you'd typed that string on a command line. (If you pass several arguments, they are first joined with spaces between them.)
var="ls | cat"
eval $var

Related

How can I use xargs to run a function in a command substitution for each match?

While writing Bash functions for string replacements, I have encountered strange behaviour when using xargs. This is actually driving me mad at the moment, as I cannot get it to work.
Fortunately I have been able to nail it down to the following simple example:
Define a simple function which doubles every character of the given parameter:
function subs { echo $1 | sed -E "s/(.)/\1\1/g"; }
Call the function:
echo $(subs "ABC")
As expected the output is:
AABBCC
Now call the function using xargs:
echo "ABC" | xargs -I % echo $(subs "%")
Surprisingly the result now is:
ABCABC
It seems as if the sed command inside the function now treats the whole string as a single character.
Why does this happen, and how can it be prevented?
You might ask why I use xargs at all. Of course, this is a simplified example, and the actual use case is much more complex.
In the original use case, I have a program which produces lots of output. I pipe the output through several greps to get the lines of interest. Afterwards, I pipe those lines to sed to extract the data I need from them. Because some of the transformations I need to apply are too complex for regular expressions alone, I'd like to use a function for them. So my original idea was to simply pipe into the function, but I couldn't get that to work and ended up with the xargs solution. My original idea was something like this:
command | grep ... | grep ... | grep ... | sed ... | subs
BTW: I do not do this from the command line but from within a script. The function is defined in the very same script in which it is used.
I'm using Bash 3.2 (Mac OS X default), so fancy Bash 4.x stuff won't help me, sorry.
I'd be happy about anything that might shed some light on this topic.
Best regards
Frank
If you really need to do this (and you probably don't, but we can't help without a more representative sample), a better-practice approach might look like:
subs() { sed -E "s/(.)/\1\1/g" <<<"$1"; }
export -f subs
echo "ABC" | xargs bash -c 'for arg; do subs "$arg"; done' _
The use of echo "$(subs "$arg")" instead of just subs "$arg" adds nothing but bugs (consider what happens if one of your arguments is -n -- and that's assuming a relatively tame echo; implementations are allowed to consume backslashes even without a -e argument and to do all manner of other surprising things). You could do it in the code above, but it slows your program down and makes it more prone to surprising behaviour; there's no point.
Running export -f subs exports your function to the environment, so it can be run by other instances of bash invoked as child processes (all programs invoked by xargs run outside your shell, so they can't see shell-local variables or functions).
Without -I -- which is to say, in its default mode of operation -- xargs appends arguments to the end of the command it's given. This permits a much more efficient usage mode: instead of invoking one command per line of input, it passes as many arguments as possible to the smallest possible number of subprocesses.
This also avoids major security bugs that can happen when using xargs -I in conjunction with bash -c '...' or sh -c '...'. (If you ever use -I% sh -c '...%...', then your filenames become part of your code and can be used in injection attacks on your system.)
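A quick illustration of that default mode:
printf '%s\n' one two three | xargs echo
# runs a single process: echo one two three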
That's because the construct $(subs "%") gets expanded by the shell when it parses the pipeline, so xargs actually runs echo %%.
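You can see this by doing the expansion yourself: subs "%" doubles the literal % character, so the command xargs actually receives is echo %%:
subs "%"                            # prints: %%
echo "ABC" | xargs -I % echo %%     # -I replaces each % with ABC -> ABCABC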

How to use output directly as a command

With this grep I can find a command I used earlier:
echo `history | grep "ssh root" | head -1| cut -c6-`
with this output:
ssh root@107.170.70.100
I want the output to directly execute as the command instead of printed.
How can I do it?
In principle, this can be done by using the $() format, so
$(history | grep "ssh root" | head -1| cut -c6-)
should do what you ask for. However, I don't think it is advisable to do so, as it will automatically execute whatever command your grep happens to return; if you made a mistake, a lot of bad things can happen. Instead, I suggest reviewing the result before re-executing it. Bash history has a lot of nice shortcuts for this kind of thing. As an example, imagine:
> history | grep "ssh root"
756 ssh root@107.170.70.100
you can re-run the command on line 756 easily by typing
!756
It's definitely much safer. Hope this helps.
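A few related history shortcuts, for reference:
!756       # re-run history entry 756
!756:p     # print entry 756 without running it (then !! to run it)
!ssh       # re-run the most recent command starting with "ssh"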
Ideally you'd be using the $(cmd) syntax rather than the `cmd` syntax. This makes it easier to nest command substitutions, as well as to keep track of what's going on.
That aside, if you remove the echo statement it will run the command:
# Prints out ls
echo $( echo ls )
# Runs the ls command
$( echo ls )
Use eval.
$ eval `history | grep "ssh root" | head -1| cut -c6-`
From eval command in Bash and its typical uses:
eval takes a string as its argument, and evaluates it as if you'd typed that string on a command line.
And from the Bash Manual (https://www.gnu.org/software/bash/manual/html_node/Bourne-Shell-Builtins.html#Bourne-Shell-Builtins):
eval
eval [arguments]
The arguments are concatenated together into a single command, which is then read and executed, and its exit status returned as the exit status of eval. If there are no arguments or only empty arguments, the return status is zero.
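A small sketch of that concatenation behaviour (the variable names are just illustrative):
cmd='ls'
args='-l /tmp'
eval "$cmd" "$args"    # the two strings are joined with a space and run as: ls -l /tmp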

How to take advantage of filters

I've read here that
To make a pipe, put a vertical bar (|) on the command line between two commands.
then
When a program takes its input from another program, performs some operation on that input, and writes the result to the standard output, it is referred to as a filter.
So I first tried the ls command, whose output is:
Desktop HelloWord.java Templates glassfish-4.0
Documents Music Videos hs_err_pid26742.log
Downloads NetBeansProjects apache-tomcat-8.0.3 mozilla.pdf
HelloWord Pictures examples.desktop netbeans-8.0
Then I tried ls | echo, which outputs absolutely nothing.
I'm looking for a way to take advantage of pipelines and filters in my bash script. Please help.
echo doesn't read from standard input; it only writes its command-line arguments to standard output. The cat command is what you want: it copies what it reads from standard input to standard output.
ls | cat
(Note that the pipeline above is a little pointless, but it does demonstrate the idea of a pipe. The command on the right-hand side must read from standard input.)
Don't confuse command-line arguments with standard input.
echo doesn't read standard input. For something more useful, try
ls | sort -r
to get the output sorted in reverse,
or
ls | grep '[0-9]'
to only keep the lines containing digits.
In addition to what others have said: if your command (echo in this example) does not read from standard input, you can use xargs to "feed" it arguments from standard input, so
ls | echo
doesn't work, but
ls | xargs echo
works fine.
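And if none of the standard filters fit, you can write your own as a shell function; a minimal sketch (the function name is just illustrative):
shout() { tr '[:lower:]' '[:upper:]'; }    # reads stdin, writes stdout
ls | shout | sort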

Grep filtering output from a process after it has already started?

Normally when one wants to look at specific output lines from running something, one can do something like:
./a.out | grep IHaveThisString
but what if IHaveThisString is something which changes every time, so you need to first run it, watch the output to catch what IHaveThisString is on that particular run, and then grep for it? I could just dump to a file and grep later, but is it possible to do something like backgrounding the process and then bringing it back to the foreground, now piped to some grep? Something akin to:
./a.out
Ctrl-Z
fg | grep NowIKnowThisString
just wondering..
No; once printed, the output exists only in your screen buffer unless you saved it in some other way.
Short form: You can do this, but you need to know that you need to do it ahead-of-time; it's not something that can be put into place interactively after-the-fact.
Write your script to determine what the string is. We'd need a more detailed example of the output format to give a better example of usage, but here's one for the trivial case where the entire first line is the filter target:
run_my_command | { read string_to_filter_for; fgrep -e "$string_to_filter_for"; }
Replace the read string_to_filter_for with as many commands as necessary to read enough input to determine what the target string is; this could be a loop if necessary.
For instance, let's say that the output contains the following:
Session id: foobar
and thereafter, you want to grep for lines containing foobar.
...then you can pipe through the following script:
re='Session id: (.*)'
while read; do
  if [[ $REPLY =~ $re ]]; then
    target=${BASH_REMATCH[1]}
    break
  else
    # if you want to print the preamble; leave this out otherwise
    printf '%s\n' "$REPLY"
  fi
done
[[ $target ]] && grep -F -e "$target"
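Assuming the snippet above is saved as filter.sh (a hypothetical name), you would use it like this:
./a.out | bash filter.sh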
If you want to manually specify the filter target, this can be done by having the loop check for a file being created with filter contents, and using that when starting up grep afterwards.
What you need is a little bit strange, but you can do it this way:
start a script session first;
then use the shell as usual;
then start and interrupt your program;
then run grep over the typescript file.
Example:
$ script
$ ./a.out
Ctrl-Z
$ fg
$ grep NowIKnowThisString typescript
You could use a stream editor such as sed instead of grep. Here's an example of what I mean:
$ cat list
Name to look for: Mike
Dora 1
John 2
Mike 3
Helen 4
Here we find the name to look for in the first line and want to grep for it. Now, piping the command to sed:
$ cat list | sed -ne '1{s/Name to look for: //;h}' \
> -e ':r;n;G;/^.*\(.\+\).*\n\1$/P;s/\n.*//;br'
Mike 3
Note: sed itself can take a file as a parameter, but since you're working with a stream here rather than a plain text file, this is how you'd use it.
Of course, you'd need to modify the command for your case.

Do a complete flow of work in a bash script

I'm trying to automate a process which I have to do over and over again, in which I have to parse the output of a shell function, look for 5 different things, and then put them in a file.
I know I can match patterns with grep, but I don't know how to store the result in a variable so I can use it afterwards :(
I also have to parse this very same output to get the other 5 values.
I have no idea how to use the same output for the 5 greps I need to do and then store the results in 5 different variables for later use.
I know I have to create a nice and tidy .sh, but I don't know how to do this.
Currently I'm trying this:
#!/bin/bash
data=$(cat file)
lol=$(echo data|grep red)
echo $lol
It's not working; any ideas?
You should show some examples of what you want to do next time.
Assuming your shell function is called func1:
func1() {
    echo "things i want to get are here"
}
func1 | grep -E "things|want|are|here|get" > outputfile.txt
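If you need each match in its own variable rather than a file, capture the output once and grep it several times; a sketch (the pattern names are just illustrative):
out=$(func1)                       # run the function once, keep its output
red=$(grep red <<<"$out")
blue=$(grep blue <<<"$out")
# ...one command substitution per value you need
echo "$red"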
Update:
your code
#!/bin/bash
data=$(cat file)
lol=$(echo data|grep red)
echo $lol
practically just means this
lol=$(grep "red" file)
or
lol=$(awk '/red/' file)
Also, if you want to stick with plain bash, this is one way you can do it:
while read -r myline
do
    case "$myline" in
        *"red"* ) echo "$myline" >> output.txt ;;
    esac
done <file
You can use the following syntax:
VAR=$(grep foo bar)
or alternatively:
VAR=`grep foo bar`
The easiest thing to do would be to redirect the output of the function to a file. You can then run multiple greps on it and delete the file only once you are done with it.
To save the output, you want to use command substitution. This runs a command and captures its output so that it can be used in an assignment or as command-line arguments. Combined with variable assignment, you get:
variable=$(grep expression file)
Your second line is wrong. Change it to this:
lol=$(echo "$data"|grep red)
Use egrep (or grep -E) instead of grep:
variable=$(egrep "exp1|exp2|exp3|exp4|exp5" file)
