How to sed the output of a command and not suppress output - linux

I have this example script stored in test.sh:
echo 'Hello world 123'
echo 'some other text'
With the following command in a bash script:
word123=$(./test.sh |sed -nr 's/Hello world (.*)/\1/p' )
This works correctly and outputs:
123
However, it does not output:
Hello world 123
some other text
Is there a way to capture the text 123 and also output everything else?

With Linux, bash and tee:
word123=$( ./test.sh | tee >&255 >(sed -nr 's/Hello world (.*)/\1/p') )
File descriptor 255 is a non-redirected copy of stdout.
See: What is the use of file descriptor 255 in bash process
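If you would rather not depend on bash's fd 255 (it is an implementation detail), here is a minimal sketch of the same idea that duplicates stdout onto an explicitly chosen descriptor instead. fd 3 is an arbitrary choice, and /dev/fd/3 is Linux-specific; test.sh and the sed pattern are taken from the question:
#!/bin/bash
# Sketch: same idea without relying on fd 255 -- duplicate stdout onto fd 3 first.
exec 3>&1                     # fd 3 is now a copy of the original stdout
word123=$( ./test.sh | tee /dev/fd/3 | sed -nr 's/Hello world (.*)/\1/p' )
exec 3>&-                     # close the duplicate when done
echo "captured: $word123"     # prints: captured: 123
The full output of test.sh still reaches the terminal through fd 3, while only the captured group lands in $word123.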

Related

Shell script - Output to both the terminal and a log file in a sub-shell

I have a few shell scripts that are intended to work together. The first script (script1.sh) calls the next script in a sub-shell. The second script (script2.sh) needs to "return" something for the first script to use. I need the last line that is echoed in the second script. However, when I use it this way, any output via echo in the second script does not get output to the terminal. I want all output from the first and second (and third, and fourth, ...) to be output to the terminal, but also written to a log file.
script1.sh:
#!/bin/sh
func_one() {
    RESULT=$(./script2.sh | tail -1)
    echo "RESULT: $RESULT"
}
func_one | tee log_file.log
script2.sh:
#!/bin/sh
echo "Hello"
echo "World!"
Attempt 1 output:
$ ./script1.sh
RESULT: World!
$
log_file.log contents:
RESULT: World!
If I try to redirect output in the second script, then it outputs to the terminal, but not to the log file:
script2.sh:
#!/bin/sh
echo "Hello" >&2
echo "World!" >&2
Attempt 2 output:
$ ./script1.sh
Hello
World!
RESULT:
log_file.log contents:
RESULT:
I also tried outputting to terminal and tee on the same line in script 1:
func_one >&2 | tee log_file.log
But that gives the same result as the first attempt.
What I would like is for the output to go both to the terminal AND to the .log file (if it were working correctly):
$ ./script1.sh
Hello
World!
RESULT: World!
$
log_file.log contents:
Hello
World!
RESULT: World!
How can I go about getting this result? Also, it would be preferred NOT to use bash, as a few of the machines we are going to be running this on do not have it.
I've looked here:
How do I get both STDOUT and STDERR to go to the terminal and a log file?
but that didn't help in my case.
To get all the output of script2.sh sent to the terminal without interfering with the work of script1.sh, try this modification of script1.sh:
$ cat script1.sh
#!/bin/bash
func_one() {
    RESULT=$(./script2.sh | tee >(cat >&2) | tail -1)
    echo "RESULT: $RESULT"
}
func_one | tee log_file.log
Here, the first tee command makes sure that all script2.sh output appears, via stderr, on the terminal. To do this, process substitution is needed (and this, in turn, requires an upgrade from sh to bash).
The output is:
$ ./script1.sh
Hello
World!
RESULT: World!
Variation
This is the same as the above except that we don't touch stderr (you may want to reserve that for errors). Here, we create an additional file descriptor, 3, to duplicate stdout:
#!/bin/bash
exec 3>&1
func_one() {
    RESULT=$(./script2.sh | tee >(cat >&3) | tail -1)
    echo "RESULT: $RESULT"
}
func_one | tee log_file.log
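Since the question mentions that bash may not be available on every machine, here is a rough POSIX sh alternative with no process substitution. It assumes script2.sh's output is small enough to hold in a variable: capture the output once, then re-emit it so it still flows through the outer tee:
#!/bin/sh
func_one() {
    out=$(./script2.sh)                       # capture everything once
    printf '%s\n' "$out"                      # re-emit it so it reaches the outer tee
    RESULT=$(printf '%s\n' "$out" | tail -1)  # keep only the last line
    echo "RESULT: $RESULT"
}
func_one | tee log_file.log
With this, both the terminal and log_file.log show Hello, World! and RESULT: World!.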

How can bash be used to pipe stdout to a script AND write to terminal?

I would like to pipe the output of a job to a script that reads those stdout lines, performs actions on them, and also displays the output on the terminal.
Right now, I have this..
ls | ./script.sh
This allows my script to be run on the output, but does not display the result of ls on the terminal.
I have tried this:
ls | tee ./script.sh
but this overwrites the contents of script.sh with the output from ls.
How can I show the output of "ls" on my terminal, and also run script.sh over that input? Here is an example of what my script.sh looks like:
#!/bin/bash
while read line
do
    echo line input
done
You can do:
ls | tee /dev/tty | ./script.sh
or, if you want to use exactly what stdout was before the piping, you can do something like:
{ ls | tee /dev/fd/3 | ./script.sh ; } 3>&1  # (3 is a semi-arbitrary choice of fd)
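For completeness, here is a self-contained sketch of the first variant. The process function below is only a stand-in for the question's script.sh (it just tags each line so the effect is visible):
#!/bin/bash
# Stand-in for script.sh: read stdin line by line and do something with it.
process() {
    while IFS= read -r line
    do
        echo "processed: $line"
    done
}

# ls output appears on the terminal via /dev/tty and is also fed to the processor.
ls | tee /dev/tty | process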

Bash output to screen and logfile differently

I have been trying to get a bash script to output different things on the terminal and logfile but am unsure of what command to use.
For example,
#!/bin/bash
freespace=$(df -h / | grep -E "/" | awk '{print $4}')
greentext="\033[32m"
bold="\033[1m"
normal="\033[0m"
logdate=$(date +"%Y%m%d")
logfile="$logdate"_report.log
exec > >(tee -i $logfile)
echo -e $bold"Quick system report for "$greentext"$HOSTNAME"$normal
printf "\tSystem type:\t%s\n" $MACHTYPE
printf "\tBash Version:\t%s\n" $BASH_VERSION
printf "\tFree Space:\t%s\n" $freespace
printf "\tFiles in dir:\t%s\n" $(ls | wc -l)
printf "\tGenerated on:\t%s\n" $(date +"%m/%d/%y") # US date format
echo -e $greentext"A summary of this info has been saved to $logfile"$normal
I want to omit the last output (echo "A summary...") in the logfile while displaying it in the terminal. Is there a command to do so? It would be great if a general solution can be provided instead of a specific one because I want to apply this to other scripts.
EDIT 1 (after applying >&6):
Files in dir: 7
A summary of this info has been saved to 20160915_report.log
Generated on: 09/15/16
One option:
exec 6>&1 # save the existing stdout
exec > >(tee -i $logfile) # like you had it
#... all your outputs
echo -e $greentext"A summary of this info has been saved to $logfile"$normal >&6
# writes to the original stdout, saved in file descriptor 6 ------------^^^
The >&6 sends echo's output to the saved file descriptor 6 (the terminal, if you're running this from an interactive shell) rather than to the output path set up by tee (which is on file descriptor 1). Tested on bash 4.3.46.
References: "Using exec" and "I/O Redirection"
Edit: As the OP found, the >&6 message is not guaranteed to appear after the lines printed by tee off stdout. One option is to use script, e.g. as in the answers to this question, instead of tee, and then print the final message outside of it. Per the docs, the stdbuf answers to that question won't work with tee.
Try a dirty hack:
#... all your outputs
echo >&6 # <-- New line
echo -e $greentext ... >&6
Or, equally hackish (note that, per the OP, this worked):
#... all your outputs
sleep 0.25s # or whatever time you want <-- New line
echo -e ... >&6
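A more deterministic alternative, not from the original answer: a sketch that replaces the process substitution with an explicit FIFO so the script can wait for tee to finish before printing the terminal-only line. The variable names follow the question; the FIFO path is made up here:
#!/bin/bash
logfile=$(date +"%Y%m%d")_report.log
fifo=$(mktemp -u)              # path for a temporary named pipe
mkfifo "$fifo"

exec 6>&1                      # save the original stdout (the terminal)
tee -i "$logfile" < "$fifo" &  # tee reads from the FIFO in the background
teepid=$!
exec > "$fifo"                 # from here on, stdout goes through tee

echo "This line goes to both the terminal and $logfile"
# ... all your other outputs ...

exec >&6                       # stop writing to the FIFO; tee sees EOF
wait "$teepid"                 # make sure tee has flushed everything
rm -f "$fifo"
echo "A summary of this info has been saved to $logfile"   # terminal only
Because the script explicitly waits for tee to exit, the final message cannot race ahead of the logged output.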

cat file_name | grep "something" results in "cat: grep: No such file or directory" in shell scripting

I have written a shell script which reads commands from an input file and executes them. I have a command like:
cat linux_unit_test_commands | grep "dmesg"
in the input file. I am getting below error message while executing shell script:
cat: |: No such file or directory
cat: grep: No such file or directory
cat: "dmesg": No such file or directory
Script:
#!/bin/bash
while read line
do
    output=`$line`
    echo $output >> logs
done < $1
Below is input file(example_commands):
ls
date
cat linux_unit_test_commands | grep "dmesg"
Execute: ./linux_unit_tests.sh example_commands
Please help me to resolve this issue.
Special characters like | and " are not parsed after expanding variables; the only processing done after variable expansion is word splitting and wildcard expansions. If you want the line to be parsed fully, you need to use eval:
while read line
do
    output=`eval "$line"`
    echo "$output" >> logs
done < $1
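If eval in the current shell feels too risky, a variant of the same loop (a sketch, not from the original answer) hands each line to a child shell instead, which parses pipes and quotes the same way:
#!/bin/bash
while IFS= read -r line
do
    output=$(bash -c "$line")        # the child shell parses |, quotes, etc.
    printf '%s\n' "$output" >> logs
done < "$1"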
You might be wondering why it's not working with the cat command. Here is the answer.
output=`$line` i.e. output=`cat linux_unit_test_commands | grep "dmesg"`: because the expanded line is not re-parsed, cat receives linux_unit_test_commands, |, grep and "dmesg" all as arguments, i.e. as file names.
From the man page:
SYNTAX: cat [OPTION]... [FILE]...
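A quick way to see this for yourself (a small demo, assuming you run it in the directory containing linux_unit_test_commands; set -x just traces the words the shell actually executes):
#!/bin/bash
line='cat linux_unit_test_commands | grep "dmesg"'
set -x
$line    # traced as: cat linux_unit_test_commands '|' grep '"dmesg"' -- all arguments to cat
set +x
The resulting cat errors are the same ones shown in the question.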
Script is OK!
#!/bin/bash
while read line;
do
    output=`$line`
    echo $output >> logs
done < $1
To make it work you need to change the line cat linux_unit_test_commands | grep "dmesg" in the input file to grep "dmesg" linux_unit_test_commands. Then it will work:
$ cat linux_unit_test_commands
ls
date
grep "dmesg" linux_unit_test_commands

Print on terminal and into file simultaneously?

I have a shell script that greps some data. I want to print the result into a file, but doing that prevents the result from being displayed on the terminal. Is there a way to both print the result on the screen and also write it to a file?
Thanks in advance.
Pipe your output to the tee command.
Example:
[me#home]$ echo hello | tee out.txt
hello
[me#home]$ cat out.txt
hello
Note that the stdout of echo is printed out as well as written to the file specified to the tee command.
Note that you can add the -a flag to tee to append to the output file:
[me#home]$ echo hello | tee out.txt
hello
[me#home]$ echo hello again | tee -a out.txt
hello again
[me#home]$ cat out.txt
hello
hello again
tee does exactly what you need: http://linux.die.net/man/1/tee
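If the script also writes diagnostics to stderr and you want those in the file as well, a common extra step (the script name below is just a placeholder) is to merge stderr into stdout before the pipe:
[me#home]$ ./myscript.sh 2>&1 | tee out.txt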
