I have a shell script that greps some data. I want to write the result to a file, but doing that stops the result from being displayed on the terminal. Is there a way to both print the result on the screen and also write it to a file?
Thanks in advance.
Pipe your output to the tee command.
Example:
[me@home]$ echo hello | tee out.txt
hello
[me@home]$ cat out.txt
hello
Note that the stdout of echo is printed to the terminal as well as written to the file given to the tee command.
Note that you can add the -a flag to tee to append to the output file:
[me@home]$ echo hello | tee out.txt
hello
[me@home]$ echo hello again | tee -a out.txt
hello again
[me@home]$ cat out.txt
hello
hello again
It does exactly what you want:
http://linux.die.net/man/1/tee
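tee also accepts several file arguments at once, and -a applies to all of them; a quick sketch (the file names here are arbitrary):

```shell
#!/bin/sh
# tee duplicates its stdin to every file argument as well as to stdout
printf 'one\n' | tee file1.txt file2.txt   # both files receive "one"
printf 'two\n' | tee -a file1.txt          # -a appends instead of truncating
```

After this, file2.txt holds only "one", while file1.txt holds "one" followed by "two".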
Related
I have the example text stored in test.sh:
echo 'Hello world 123'
echo 'some other text'
With the following command in a bash script:
word123=$(./test.sh |sed -nr 's/Hello world (.*)/\1/p' )
This works correctly and outputs:
123
However, this does not output:
Hello world 123
some other text
Is there a way to capture the 123 and also output everything else the script prints?
With Linux, bash and tee:
word123=$( ./test.sh | tee >&255 >(sed -nr 's/Hello world (.*)/\1/p') )
File descriptor 255 is a non-redirected copy of stdout.
See: What is the use of file descriptor 255 in bash process
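If you'd rather not rely on bash's internal file descriptor 255, the same effect can be had with an explicitly duplicated descriptor; a sketch, where test_sh is an inline stand-in for ./test.sh:

```shell
#!/bin/bash
# stand-in for ./test.sh
test_sh() { echo 'Hello world 123'; echo 'some other text'; }

exec 3>&1    # fd 3 is now an explicit copy of stdout
# tee's own stdout goes to fd 3 (the terminal); sed's output, running in the
# process substitution, is what the command substitution captures
word123=$( test_sh | tee >(sed -nr 's/Hello world (.*)/\1/p') >&3 )
exec 3>&-    # close the extra descriptor
echo "captured: $word123"
```

Both script lines still reach the terminal, while word123 ends up holding 123.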
I have a few shell scripts that are intended to work together. The first script (script1.sh) calls the next script in a sub-shell. The second script (script2.sh) needs to "return" something for the first script to use. I need the last line that is echoed in the second script. However, when I use it this way, any output via echo in the second script does not get output to the terminal. I want all output from the first and second (and third, and fourth, ...) to be output to the terminal, but also written to a log file.
script1.sh:
#!/bin/sh
func_one() {
RESULT=$(./script2.sh | tail -1)
echo "RESULT: $RESULT"
}
func_one | tee log_file.log
script2.sh:
#!/bin/sh
echo "Hello"
echo "World!"
Attempt 1 output:
$ ./script1.sh
RESULT: World!
$
log_file.log contents:
RESULT: World!
If I try to redirect output in the second script, then it outputs to the terminal, but not to the log file:
script2.sh:
#!/bin/sh
echo "Hello" >&2
echo "World!" >&2
Attempt 2 output:
$ ./script1.sh
Hello
World!
RESULT:
log_file.log contents:
RESULT:
I also tried outputting to terminal and tee on the same line in script 1:
func_one >&2 | tee log_file.log
But that gives the same result as the first attempt.
What I would like is for everything to be both output to the terminal AND written to the .log file (if it were working correctly):
$ ./script1.sh
Hello
World!
RESULT: World!
$
log_file.log contents:
Hello
World!
RESULT: World!
How can I go about getting this result? Also, it would be preferred to NOT use bash, as a few of our machines we are going to be running this on do not have bash.
I've looked here:
How do I get both STDOUT and STDERR to go to the terminal and a log file?
but that didn't help in my case.
To get all the output of script2.sh sent to the terminal without interfering with the work of script1.sh, try this modification of script1.sh:
$ cat script1.sh
#!/bin/bash
func_one() {
RESULT=$(./script2.sh | tee >(cat >&2) | tail -1)
echo "RESULT: $RESULT"
}
func_one | tee log_file.log
Here, the first tee command makes sure that all script2.sh output appears, via stderr, on the terminal. To do this, process substitution is needed (and this, in turn, requires an upgrade from sh to bash).
The output is:
$ ./script1.sh
Hello
World!
RESULT: World!
Variation
This is the same as the above except that we don't touch stderr (you may want to reserve that for errors). Here, we create an additional file descriptor, 3, to duplicate stdout:
#!/bin/bash
exec 3>&1
func_one() {
RESULT=$(./script2.sh | tee >(cat >&3) | tail -1)
echo "RESULT: $RESULT"
}
func_one | tee log_file.log
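Since you'd prefer to avoid bash, note that the process substitution can be replaced by pointing tee at /dev/fd/3 directly. /dev/fd is not mandated by POSIX, but it is available on Linux and most BSDs. A sketch, with script2 as an inline stand-in for ./script2.sh:

```shell
#!/bin/sh
# stand-in for ./script2.sh
script2() { echo "Hello"; echo "World!"; }

exec 3>&1          # fd 3 duplicates the original stdout (the terminal)
func_one() {
    # copy script2's output to fd 3 while tail extracts the last line
    RESULT=$(script2 | tee /dev/fd/3 | tail -1)
    echo "RESULT: $RESULT"
}
func_one | tee log_file.log
```

As with the bash version, the terminal shows all three lines while log_file.log receives the RESULT line.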
I want to grep the output of my script - which itself contains call to different binaries...
Since the script invokes multiple binaries, I can't simply put exec at the top and dump the output to a file (that does not capture the output from the binaries)...
And to let you know, I am monitoring the script output to determine if the system has got stuck!
Why don't you append instead?
mybin1 | grep '...' >> mylog.txt
mybin2 | grep '...' >> mylog.txt
mybin3 | grep '...' >> mylog.txt
Does this not work?
#!/bin/bash
exec 11>&1 12>&2 > >(exec tee /var/log/somewhere) 2>&1 ## Or add -a option to tee to append.
# call your binaries here
exec >&- 2>&- >&11 2>&12 11>&- 12>&-
I have tried redirecting the terminal output to a file using tee and > as in the examples here and in the question. It worked for echo test | tee log.txt and ls -l | tee log.txt.
But it does not work (does not add anything to log.txt) when I run a command like divine verify file.dve | tee log.txt,
where divine is an installed tool. Any ideas or alternatives?
Try divine verify file.dve 2>&1 | tee log.txt. If the program is outputting to stderr instead of stdout, this redirects stderr to stdout.
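The effect is easy to reproduce with a stand-in command that writes only to stderr, as many tools do for progress and diagnostic messages:

```shell
#!/bin/sh
# noisy stands in for a tool (like divine or ffmpeg) that writes to stderr
noisy() { echo "status: ok" >&2; }

noisy | tee plain.txt       # message shows on the terminal, but plain.txt stays empty
noisy 2>&1 | tee both.txt   # 2>&1 merges stderr into stdout, so tee captures it
```

Only both.txt ends up containing the message, because tee can only see what arrives on its stdin.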
This works on ffmpeg output too:
{ echo ffmpeg -i [rest of command]; ffmpeg -i [rest of command]; } 2>&1 | tee ffmpeg.txt
and use tee -a to append if the file already exists.
======
Also, if you want to see mediainfo on all files in a folder and make sure the command itself is visible in mediainfo.txt:
{ echo mediainfo *; mediainfo *; } 2>&1 | tee mediainfo.txt
NB: { echo cmd; cmd; } means the command line itself is kept in the txt file; without this it is not printed.
I'm making a script, and every time something is done I would like to write to my custom .log file. How do I do that?
And in the end, I'd just like to read it with Bash. Do I just use cat?
Thanks.
The simplest syntax I always use is 2>&1 | tee -a file_name.log.
The syntax can be used after a command or execution of a file. e.g.
find . -type f 2>&1 | tee -a file_name.log
or
./test.sh 2>&1 | tee -a file_name.log
Just echo "<log message here>" >> custom.log.
The >> means append to the bottom of the file, whereas > would delete the contents of the file and then write the message.
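A minimal illustration of the difference (custom.log is just an example name):

```shell
#!/bin/sh
echo "first"  > custom.log    # > truncates the file before writing
echo "second" >> custom.log   # >> appends to whatever is already there
echo "third"  > custom.log    # truncating again leaves only "third"
cat custom.log                # reading the log back with cat
```

After the third echo, cat prints only "third", since each > starts the file over.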