How to pipe the output of a command to a file on Linux

I am running a task on the CLI, which prompts me for a yes/no input.
After selecting a choice, a large amount of output scrolls by on the screen, including several errors. I want to pipe this output to a file so I can review the errors. A simple '>' does not work, since the command expects keyboard input.
I am running on Ubuntu 9.1.

command &> output.txt
You can use &> to redirect both stdout and stderr to a file. This is shorthand for command > output.txt 2>&1 where the 2>&1 means "send stderr to the same place as stdout" (stdout is file descriptor 1, stderr is 2).
For interactive commands I usually don't bother saving to a file if I can use less and read the results right away:
command 2>&1 | less
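One detail worth knowing, since it trips people up: redirections are applied left to right, so the order of > and 2>&1 matters. Compare:
command > output.txt 2>&1   # stdout goes to the file, then stderr is pointed at the same place
command 2>&1 > output.txt   # stderr is pointed at the old stdout (the terminal), then only stdout moves to the file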

echo yes | command > output.txt
Depending on how the command reads its input (some programs discard whatever was on stdin before displaying the prompt, but most don't), this should work in any sane CLI environment.
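If the command prompts more than once, the coreutils yes utility will keep repeating the answer for you:
yes | command > output.txt 2>&1      # answers "y" to every prompt
yes no | command > output.txt 2>&1   # answers "no" instead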

Use 2> rather than just >.

If the program was written by a sane person, what you probably want is stderr, not stdout. You would achieve this with something like
foo 2> errors.txt

You can use 2> to send errors to a file.
Example:
command 2> error.txt
If any errors occur while the command executes, they are written to error.txt.

Related

Linux command to run a shell script upon receipt of any input

I need to monitor stderr for a certain error message, and when it is detected, run a shell script.
My current idea is to pipe stderr to grep and filter for the message; I would then pipe grep's output to some program that runs my script upon receipt of any input.
Alternatively, if I could make grep output a specific command-line option or parameter, I could presumably pipe that directly to my shell script.
Is there a better way of doing this?
I would recommend inotifywait:
Monitor stderr-log.txt for changes.
Read the file and grep for the stderr message.
Call another script if the error message is found.
while :; do
    inotifywait -q -e modify ./stderr-log.txt > /dev/null
    if grep -q '<error message>' stderr-log.txt; then
        <do something here..>
    fi
done
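If you'd rather avoid the intermediate log file, here is a sketch of the pipe-based approach from the question; some_command and my_script.sh are placeholders, and --line-buffered (GNU grep) keeps matches from being delayed by output buffering:
some_command 2> >(grep --line-buffered '<error message>' | while read -r line; do
    ./my_script.sh "$line"
done)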

Linux writing console output to a file

I have a large amount of output printed to the console and I want to store it in a file. Can anyone suggest a way to do this on Linux?
your_print_command > filename.txt
Or
your_print_command >> filename.txt
The latter appends data to the file instead of overwriting it.
To make sure both stderr and stdout go to the file instead of the console:
command_generating_text &> /path/to/file
To send stderr and stdout to different files:
command_generating_text 1> /path/to/file.stdout 2> /path/to/file.stderr
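And if you want to capture both streams while appending rather than overwriting, bash 4 and later also supports &>>:
command_generating_text &>> /path/to/file   # append stdout and stderr to the file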

Linux All Output to a File

Is there any way to tell a Linux system to put all output (stdout, stderr) in a file,
without using redirection or pipes, and without modifying how the scripts get called?
Just tell Linux to use a file for output.
for example:
script test1.sh:
#!/bin/bash
echo "Testing 123 "
If I run it as ./test1.sh (without redirection or pipes),
I'd like to see "Testing 123" in a file (/tmp/linux_output).
Problem: in the system, a binary calls a script, and this script calls many other scripts. It is not possible to modify each call, so if I can make Linux put the output in a file, I can review the logs.
#!/bin/bash
exec >file 2>&1
echo "Testing 123 "
You can read more about the exec builtin in the bash documentation.
If you are running the program from a terminal, you can use the command script.
It will open up a sub-shell; do what you need to do.
It will copy everything written to the terminal into a file. When you are done, exit the shell with ^D or exit.
This does not use redirection or pipes.
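A minimal session might look like this; /tmp/linux_output is just an example filename:
script /tmp/linux_output   # start recording everything written to the terminal
./test1.sh                 # run your commands; output shows on screen as usual
exit                       # stop recording; the transcript is now in /tmp/linux_output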
You could set your terminal's scrollback buffer to a large number of lines and then see all the output from your commands in the buffer - depending on your terminal window and the options in its menus, there may be an option in there to capture terminal I/O to a file.
Your requirement, taken literally, is impractical, because it is based on a slight misunderstanding. Fundamentally, to get the output to go in a file, you will have to change something to direct it there, which would violate your literal constraint.
But the practical problem is solvable, because unless explicitly counteracted in the child, the output directions configured in a parent process are inherited. So you only have to set up the redirection once, using either a shell, or a custom launcher program or intermediary. After that it will be inherited.
So, for example:
cat > test.sh
#!/bin/sh
echo "hello on stdout"
rm nosuchfile
./test2.sh
And a child script for it to call
cat > test2.sh
#!/bin/sh
echo "hello on stdout from script 2"
rm thisfileisnteither
./nonexistantscript.sh
Run the first script redirecting both stdout and stderr (bash version - and you can do this in many ways such as by writing a C program that redirects its outputs then exec()'s your real program)
./test.sh &> logfile
Now examine the file and see results from stdout and stderr of both parent and child.
cat logfile
hello on stdout
rm: nosuchfile: No such file or directory
hello on stdout from script 2
rm: thisfileisnteither: No such file or directory
./test2.sh: line 4: ./nonexistantscript.sh: No such file or directory
Of course if you really dislike this, you can always modify the kernel, but again, that is changing something (and a very ungainly solution too).
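To make the inheritance approach concrete, here is a sketch of a tiny wrapper you could drop in where the binary expects the top-level script; /path/to/real_script.sh is a placeholder for the original entry point:
#!/bin/bash
# Hypothetical wrapper: set up the redirection once, then hand over to the real script.
exec >>/tmp/linux_output 2>&1
exec /path/to/real_script.sh "$@"
Every script the real entry point calls then inherits the same redirection, so nothing else needs to change.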

shell prompt seemingly does not reappear after running a script that uses exec with tee to send stdout output to both the terminal and a file

I have a shell script which writes all output to a logfile
and the terminal; this part works fine. But if I execute the script,
a new shell prompt only appears if I press Enter. Why is that, and how do I fix it?
#!/bin/bash
exec > >(tee logfile)
echo "output"
First, when I'm testing this, there always is a new shell prompt, it's just that sometimes the string output comes after it, so the prompt isn't last. Did you happen to overlook it? If so, there seems to be a race where the shell prints the prompt before the tee in the background completes.
Unfortunately, that cannot be fixed by waiting in the shell for tee; see this question on unix.stackexchange. Fragile workarounds aside, the easiest way to solve this that I see is to put your whole script inside a group command:
{
your-code-here
} | tee logfile
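Filled in, a minimal version of that skeleton might look like this, with echo standing in for the real script body:
#!/bin/bash
{
    echo "output"
    # ...the rest of the script body goes here...
} | tee logfile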
If I run the following script (suppressing the newline from the echo), I see the prompt, but not "output". The string is still written to the file.
#!/bin/bash
exec > >(tee logfile)
echo -n "output"
What I suspect is this: you have three different file descriptors trying to write to the same file (that is, the terminal): standard output of the shell, standard error of the shell, and the standard output of tee. The shell writes synchronously: first the echo to standard output, then the prompt to standard error, so the terminal is able to sequence them correctly. However, the third file descriptor is written to asynchronously by tee, so there is a race condition. I don't quite understand how my modification affects the race, but it appears to upset some balance, allowing the prompt to be written at a different time and appear on the screen. (I expect output buffering to play a part in this).
You might also try running your script after running the script command, which will log everything written to the terminal; if you wade through all the control characters in the file, you may notice the prompt in the file just prior to the output written by tee. In support of my race condition theory, I'll note that after running the script a few times, it was no longer displaying "abnormal" behavior; my shell prompt was displayed as expected after the string "output", so there is definitely some non-deterministic element to this situation.
@chepner's answer provides great background information.
Here's a workaround that works on Ubuntu 12.04 (Linux 3.2.0) and on OS X 10.9.1:
#!/bin/bash
exec > >(tee logfile)
echo "output"
# WORKAROUND - place LAST in your script.
# Execute an executable (as opposed to a builtin) that outputs *something*
# to make the prompt reappear normally.
# In this case we use the printf *executable* to output an *empty string*.
# Use of `$ec` is to ensure that the script's actual exit code is passed through.
ec=$?; $(which printf) ''; exit $ec
Alternatives:
@user2719058's answer shows a simple alternative: wrapping the entire script body in a group command ({ ... }) and piping it to tee logfile.
An external solution, as @chepner has already hinted at, is to use the script utility to create a "transcript" of your script's output in addition to displaying it:
script -qc yourScript /dev/null > logfile # Linux syntax
This, however, will also capture stderr output; if you wanted to avoid that, use:
script -qc 'yourScript 2>/dev/null' /dev/null > logfile
Note, however, that this will suppress stderr output altogether.
As others have noted, it's not that there's no prompt printed -- it's that the last of the output written by tee can come after the prompt, making the prompt no longer visible.
If you have bash 4.4 or newer, you can wait for your tee process to exit, like so:
#!/usr/bin/env bash
case $BASH_VERSION in ''|[0-3].*|4.[0-3]) echo "ERROR: Bash 4.4+ needed" >&2; exit 1;; esac
exec {orig_stdout}>&1 {orig_stderr}>&2       # back up the original stdout and stderr
exec > >(tee -a "_install_log"); tee_pid=$!  # track PID of tee after starting it
cleanup() {                                  # define a function we'll call during shutdown
    retval=$?
    exec >&$orig_stdout                      # copy the original stdout back to FD 1, overwriting the pipe to tee
    exec 2>&$orig_stderr                     # if something redirected stderr through tee as well, fix that too
    wait "$tee_pid"                          # now, wait until tee exits
    exit "$retval"                           # and complete the exit with our original exit status
}
trap cleanup EXIT # configure the function above to be called during cleanup
echo "Writing something to stdout here"

Redirecting standard error to file and leaving standard output to screen when launching makefile

I am aware that for redirecting standard error and output to file I have to do:
make >&! output.txt
Note that I use ! to overwrite the file. But how can I redirect standard error to a file and leave standard output on the screen? Or, even better, have both error and output in a file but also output on the screen, so I can see how my compile is progressing?
I tried:
make 2>! output.txt
but it gives me an error.
Note that > is enough to overwrite the file. You can use the tail -f command to watch the output on screen while it is redirected to a file:
(make 1>output.txt 2>error.txt &) && tail -f output.txt error.txt
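For the first part of the question, note that in a Bourne-style shell (sh, bash, zsh), redirecting only stderr while leaving stdout on the screen is simply:
make 2> error.txt   # stderr to the file, stdout still on the terminal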
You can do this simply by piping into the tee command. The following will put both stdout and stderr into a file and also send them to the terminal:
make |& tee output.txt
Edit
Explanation from GNU Bash manual, section 3.2.2 Pipelines:
If ‘|&’ is used, command1's standard error, in addition to its standard output, is connected to command2's standard input through the pipe; it is shorthand for 2>&1 |. This implicit redirection of the standard error to the standard output is performed after any redirections specified by the command.
You are reading bash/sh documentation and using tcsh. tcsh doesn't have any way to redirect just stderr. You might want to switch to one of the non-csh shells.
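That said, there is a classic csh/tcsh workaround: run the command in a subshell whose stdout goes to the terminal, then redirect the subshell's remaining output (which is now only stderr) to the file:
(make > /dev/tty) >& error.txt   # stdout to the terminal, stderr to error.txt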
