Display the output on the terminal and tail the last 10 lines to log file - linux

I need to run a script and output the full contents to the terminal.
I then want to get the last 10 lines from the terminal output and put them in a log file.
I have tried using ./script.sh 2>&1 | tail -10 > log.log
but this stops the output to the terminal.

Use bash's process substitution together with tee:
./script.sh |& tee >(tail -10 >file.txt)
|& is a shortcut for sending both STDOUT and STDERR over the pipe.
tee copies its STDIN both to its STDOUT and to the file(s) given as argument(s) -- here we use process substitution to get a file descriptor, and run tail -10 >file.txt inside the process substitution to save only the desired lines.
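A self-contained way to try it, using seq in place of the script (last10.txt is a hypothetical file name):
seq 1 100 | tee >(tail -10 > last10.txt)
# the full sequence prints to the terminal; the >(...) process saves the
# last 10 lines, though it may finish a moment after the pipeline returns
cat last10.txt   # 91 .. 100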

For that you can use the tee command, which writes its input to a file and to your console at the same time:
ls -a | tee output.file

Linux: tee usage misunderstanding

When I use
$ ls | tee log.txt
I get correct, expected result: log.txt keeps 'ls' output.
I need to save output of the command:
$ svnadmin dump MyRepo -r1:r2 > dumpfile
* Dumped revision 1.
* Dumped revision 2.
Where "$ svnadmin .." is command and
"* Dumped .." are outputs
So, the question itself. When I run the command:
$ svnadmin dump MyRepo -r1:r2 > dumpfile | tee log.txt
* Dumped revision 1.
* Dumped revision 2.
log.txt has 0 size
TL;DR -- Try this one:
svnadmin dump MyRepo -r1:r2 2>&1 > dumpfile | tee log.txt
Explanation: Each command has three connected "standard streams": Standard Input (STDIN, filehandle 0), Standard Output (STDOUT, 1) and Standard Error (STDERR, 2). Usually commands output their "data" on STDOUT and errors on STDERR.
A simple command like ls has all three of them connected to the console. Because both STDOUT and STDERR are connected to the console, the output of the command is interleaved.
A "pipe" like ls | tee log.txt redirects STDOUT of the first command to STDIN of the second command. No more - no less. Therefore all other streams are still connected to the console. Should the ls part produce error messages they will be written to the console, not to the file! But your ls example did not output any errors so you didn't notice.
After the pipe is set up, the shell evaluates the remaining redirection operators of the first command -- from left to right.
Therefore svnadmin dump > dumpfile | tee log.txt will redirect STDOUT of svnadmin to dumpfile leaving effectively no data for the tee command because that's a redirection, not a copy.
svnadmin dump MyRepo 2>&1 > dumpfile | tee log.txt adds another redirection step. It reads "make filehandle 2 (STDERR) a copy of the filehandle 1 (STDOUT)" which is the pipe at this point. Writing to either STDOUT or STDERR will write to the pipe. But after that the > dumpfile redirection is applied and STDOUT is redirected to the file.
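A small sketch of the difference, using a hypothetical two-stream command in place of svnadmin:
{ echo data; echo progress >&2; } > dumpfile | tee log.txt
# log.txt stays empty: STDOUT went to dumpfile, so nothing reached the pipe
{ echo data; echo progress >&2; } 2>&1 > dumpfile | tee log.txt
# now "data" lands in dumpfile while "progress" travels over the pipe into log.txt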
You can read all this (and much more) in the shell's manual. For bash it is in the section REDIRECTION.

How can bash be used to pipe stdout to a script AND write to terminal?

I would like to pipe the output of a job into a script that reads those stdout lines and performs actions on them, while also displaying the output on the terminal.
Right now, I have this..
ls | ./script.sh
This allows my script to be run on the output, but does not display the result of ls on the terminal.
I have tried this:
ls | tee ./script.sh
but this overwrites the contents of script.sh with the output from ls.
How can I show the output of "ls" on my terminal and also run script.sh over that input? Here is an example of what my script.sh looks like:
#!/bin/bash
while read -r line
do
    echo "$line"   # act on each input line (here it is just echoed back)
done
You can do:
ls | tee /dev/tty | ./script.sh
or, if you want to use exactly what stdout was before the piping, you can do something like:
{ ls | tee /dev/fd/3 | ./script.sh ; } 3>&1 # (3 is a semi-arbitrary choice of fd)
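The difference matters once the group's stdout is redirected somewhere else: /dev/fd/3 still points at whatever stdout was when the group started, while /dev/tty always means the controlling terminal. A sketch (processed.txt is a hypothetical file name):
{ ls | tee /dev/fd/3 | ./script.sh ; } 3>&1 > processed.txt
# the raw listing still appears on the original stdout (the terminal here),
# while script.sh's output goes to processed.txt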

Write stderr and stdout to one file, but also write stderr to a separate file

I have a shell script whose stdout and stderr I want to write to a logfile. I know that this can be achieved via
sh script.sh >> both.log 2>&1
However, I also want to simultaneously write the stderr to a separate file, "error.log". Is this achievable?
You can use tee to duplicate output to two locations. Combine that with some tricky redirections, and...
script.sh 2>&1 >> both.log | tee -a both.log >> error.log
This first makes stderr a copy of stdout (the pipe), then redirects stdout to both.log. Only stderr travels down the pipe to tee, which appends it to both.log and passes it on into error.log, so it ends up in both log files.
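A quick way to convince yourself, using a hypothetical test.sh that writes one line to each stream:
printf '#!/bin/sh\necho out\necho err >&2\n' > test.sh && chmod +x test.sh
./test.sh 2>&1 >> both.log | tee -a both.log >> error.log
# both.log ends up with "out" and "err"; error.log gets only "err"
# (the two appenders are independent, so line order in both.log is not guaranteed)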
For this you need to first switch stdout and stderr, which requires an additional file descriptor:
sh script.sh 3>&2 2>&1 1>&3 3>&-
The last operator closes the auxiliary file descriptor.
After that you can use tee to duplicate the error stream (which is now on stdin) and append it to your error log:
sh script.sh 3>&2 2>&1 1>&3 3>&- | tee -a error.log
And after that you can redirect both stdout and stderr to your combined log:
(sh script.sh 3>&2 2>&1 1>&3 3>&- | tee -a error.log) >> both.log 2>&1
The parentheses around the command are important to capture the error stream of the whole command. Without them only the (empty) error stream of the tee command would be captured and the rest would still go to the terminal.
Note: this does not check whether file descriptor 3 was in use (open) before. In bash you can let the shell choose a previously unused file descriptor and close it on the last redirection:
sh script.sh {tmpfd}>&2 2>&1 1>&${tmpfd}-
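In bash you can also reach the same result with process substitution instead of descriptor swapping (a sketch, assuming appending to both logs is acceptable):
sh script.sh >> both.log 2> >(tee -a error.log >> both.log)
# stdout is appended straight to both.log; stderr goes through tee, which
# appends it to error.log and forwards a copy into both.log
# (both writers append independently, so ordering in both.log may interleave)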

What is meant by 'output to stdout'

New to bash programming. I am not sure what is meant by 'output to stdout'. Does it mean print out to the command line?
If I have a simple bash script:
#!/bin/bash
wget -q http://192.168.0.1/test -O - | grep -m 1 'Hello'
it outputs a string to the terminal. Does this mean it's 'outputting to stdout' ?
Thanks
Yes, stdout is the terminal (unless it's redirected to a file using the > operator or into the stdin of another process using |).
In your specific example, wget's stdout is piped into grep, and grep's output then goes to the terminal.
Every process on a Linux system (and most others) has at least 3 open file descriptors:
stdin (0)
stdout (1)
stderr (2)
Normally each of these file descriptors points to the terminal from which the process was started. Like this:
cat file.txt # all file descriptors are pointing to the terminal where you type the command
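On Linux you can see this directly by listing the descriptors of your current shell (this assumes a /proc filesystem):
ls -l /proc/$$/fd
# 0, 1 and 2 usually all point at the same pseudo-terminal, e.g. /dev/pts/0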
However, bash allows you to modify this behaviour using input/output redirection:
cat < file.txt # will use file.txt as stdin
cat file.txt > output.txt # redirects stdout to a file (will not appear on terminal anymore)
cat file.txt 2> /dev/null # redirects stderr to /dev/null (will not appear on terminal anymore)
The same thing happens when you use the pipe symbol, like:
wget -q http://192.168.0.1/test -O - | grep -m 1 'Hello'
What actually happens is that the stdout of the wget process (the process before the |) is redirected to the stdin of the grep process. So wget's stdout is no longer the terminal, while grep's stdout still is. If you want to redirect grep's output to a file, for example, then use this:
wget -q http://192.168.0.1/test -O - | grep -m 1 'Hello' > output.txt
Unless redirected, standard output is the text terminal which initiated the program.
Here's a wikipedia article: http://en.wikipedia.org/wiki/Standard_streams#Standard_output_.28stdout.29

How to log output in bash and see it in the terminal at the same time?

I have some scripts where I need to see the output and log the result to a file, with the simplest example being:
$ update-client > my.log
I want to be able to see the output of the command while it's running, but also have it logged to the file. I also log stderr, so I would want to be able to log the error stream while seeing it as well.
update-client 2>&1 | tee my.log
2>&1 redirects standard error to standard output, and tee sends its standard input to standard output and the file.
Just use tail to watch the file as it's updated. Background your original process by adding & to the command above, then run:
$ tail -f my.log
It will continuously update. (Note that tail won't tell you when the command has finished, so you can have the script write a marker line to the log when it's done. Press Ctrl-C to exit tail.)
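A sketch of the two steps together, assuming you also want stderr in the log as mentioned above:
update-client > my.log 2>&1 &   # run in the background, capturing both streams
tail -f my.log                  # watch the log grow; Ctrl-C stops tail, not the job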
You can use the tee command for that:
command | tee /path/to/logfile
The equivalent without writing to the shell would be:
command > /path/to/logfile
If you want to append (>>) and show the output in the shell, use the -a option:
command | tee -a /path/to/logfile
Please note that the pipe catches stdout only; errors sent to stderr are not passed through the pipe to tee. If you want to log errors (from stderr), use:
command 2>&1 | tee /path/to/logfile
This means: run command and redirect the stderr stream (2) to stdout (1). That will be passed to the pipe with the tee application.
You can learn more about this on the Ask Ubuntu site.
Another option is to use block-based output capture from within the script (not sure if that is the correct technical term).
Example
#!/bin/bash
{
    echo "I will be sent to screen and file"
    ls ~
} 2>&1 | tee -a /tmp/logfile.log
echo "I will be sent to just terminal"
I like to have more control and flexibility - so I prefer this way.
