How to redirect standard error to a file - Linux

In Linux, if I want to redirect standard error to a file, I can do this:
$ ls -l /bin/usr 2> ls-error.txt
But when I try:
$ foo=
$ echo ${foo:?"parameter is empty"} 2> ls-error.txt
The result in terminal is:
bash: foo: parameter is empty
It doesn't work!
Can somebody explain why?
I thought ${parameter:?word} would send the value of word to standard error.

echo ${foo:?"parameter is empty"} 2>ls-error.txt redirects the stderr of echo, but the error message is produced by the shell while expanding
${foo:?"parameter is empty"}.
You can get the result you want by redirecting a block (or a subshell) instead so that the shell's stderr is included in the redirection:
{ echo "${foo:?"parameter is empty"}"; } 2>ls-error.txt
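If you then inspect the file, the shell's message is there (the text is the same bash error shown in the question):
$ cat ls-error.txt
bash: foo: parameter is empty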

Try this command:
(echo ${foo:?"parameter is empty"}) 2> ls-error.txt

In case you would like to redirect both standard and error output, AND still see these messages when executing your command, you can use the tee command:
$ echo ${foo:?"parameter is empty"} |& tee -a ls-error.txt

Redirect both standard output and standard error to different files in the same command

I know this much:
$ command 2>> error
$ command 1>> output
Is there any way I can output the stderr to the error file and output stdout to the output file in the same line of bash?
Just add them in one line: command 2>> error 1>> output
However, note that >> appends if the file already has data, whereas > will overwrite any existing data in the file.
So use command 2> error 1> output if you do not want to append.
For completeness' sake, you can write 1> as just >, since the default file descriptor is the output, so 1> and > are the same thing.
Thus command 2> error 1> output becomes command 2> error > output.
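A quick way to see the split for yourself, using ls with one path that exists and one that does not (the paths are just examples):
ls /bin /nonexistent > output 2> error
cat output    # the /bin listing, which went to stdout
cat error     # the ls error message, which went to stderr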
Try this:
your_command 2>stderr.log 1>stdout.log
More information
The numerals 0 through 9 are file descriptors in bash.
0 stands for standard input, 1 stands for standard output, 2 stands for standard error. 3 through 9 are spare for any other temporary usage.
Any file descriptor can be redirected to a file or to another file descriptor using the operator >. Use the operator >> instead to append to a file rather than truncating it.
Usage:
file_descriptor > filename
file_descriptor>&file_descriptor (e.g. 2>&1, with no space before the &)
Please refer to Advanced Bash-Scripting Guide: Chapter 20. I/O Redirection.
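As a small sketch of putting one of the spare descriptors (3 through 9) to use, here opening fd 3 with exec, writing through it, and closing it again (the filename is just an example):
exec 3> spare.log      # open fd 3, pointing at spare.log
echo "via fd 3" >&3    # write to fd 3 instead of stdout
exec 3>&-              # close fd 3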
Like that:
$ command >>output 2>>error
Or if you would like to mix the outputs (stdout & stderr) in one single file, you may want to use:
command > merged-output.txt 2>&1
Multiple commands' output can be redirected. This works on the command line or, most usefully, in a bash script. The -s option makes sudo run a shell, which executes the commands in the here-document; any password prompt still goes to the terminal.
Here-document commands' stdout/stderr are sent to separate files, and nothing is displayed:
sudo -s -u username <<'EOF' 2>err 1>out
ls; pwd;
EOF
Here-document commands' stdout/stderr are sent to a single file and to the display:
sudo -s -u username <<'EOF' 2>&1 | tee out
ls; pwd;
EOF
Here-document commands' stdout/stderr are sent to separate files, and stdout also goes to the display:
sudo -s -u username <<'EOF' 2>err | tee out
ls; pwd;
EOF
Depending on who you are (whoami) and the target username, a password may or may not be required.

How can I retain stderr from nohup?

Nohup redirects stderr to stdout if it points to a terminal, but I want to retain the stderr output on the terminal.
Is there a way to accomplish that? Is there an alternative?
I don't know if I understood correctly or not.
Do you mean that you don't want to see the errors in the terminal?
If yes:
if you want to save the errors in a file:
nohup command 2> file.txt
if you don't need the errors:
nohup command 2> /dev/null
2 means the error output of command.
2> file.txt means write the error output to file.txt.
Just redirect it somewhere else, so it's not the terminal:
nohup bash -c 'echo OUT ; echo ERR >& 2' 2> err
You can redirect the stderr back to stdout instead of to a file to keep the output in the terminal, but it doesn't make much sense: nohup is for situations where the terminal might get lost, in which case you'll lose the stderr.
nohup bash -c 'echo OUT ; echo ERR >& 2' 2> >(cat)
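Combining that trick with a log file, a sketch that sends stdout to a file while stderr still reaches the terminal (out.log is just an example name):
nohup bash -c 'echo OUT ; echo ERR >& 2' > out.log 2> >(cat >&2) &
Because stderr is now a pipe rather than a terminal, nohup leaves it alone, and cat relays it back to the screen.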

How to redirect command line outputs to a file, but still show them in the command line?

In tcsh I want to redirect command line outputs to a file, but I still want to show them in the command line.
Did a little bit search that
./MyCommand.sh 2>&1 | tee /tmp/Output.txt
should do the job. But I got an error like:
Ambiguous output redirect
Use of 2>&1 to combine stderr and stdout works only in bash and sh. It does not work in csh or tcsh. A workaround is suggested at Redirect stdout to stderr in tcsh.
In bash, instead of 2>&1, I use |&.
Not sure how this plays out for tcsh, but this question isn't currently tagged for it, and I hope this helps someone else.
According to this answer (redirect stderr to stdout in c shell), you can't do this in csh, which tcsh extends, so that could be related.
It isn't clear from the question if you want to redirect stdout only, or stdout and stderr.
Using | will redirect stdout to tee (which outputs it to a file and to terminal), leaving stderr untouched (so it only goes to terminal):
./MyCommand.sh | tee /tmp/Output.txt
Using |& will "merge" stdout and stderr, and tee will redirect both to file and to terminal:
./MyCommand.sh |& tee /tmp/Output.txt
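For completeness: if you only need both streams in a file and not on the terminal, tcsh's own >& operator redirects stdout and stderr together without involving tee:
./MyCommand.sh >& /tmp/Output.txt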

How do I pipe or redirect the output of curl -v?

For some reason the output always gets printed to the terminal, regardless of whether I redirect it via 2> or > or |. Is there a way to get around this? Why is this happening?
Add the -s (silent) option to remove the progress meter, then redirect stderr to stdout to get the verbose output on the same fd as the response body:
curl -vs google.com 2>&1 | less
Your URL probably has ampersands in it. I had this problem, too, and I realized that my URL was full of ampersands (from CGI variables being passed) and so everything was getting sent to background in a weird way and thus not redirecting properly. If you put quotes around the URL it will fix it.
The answer above didn't work for me; what eventually did was this syntax:
curl https://${URL} &> /dev/stdout | tee -a ${LOG}
tee puts the output on the screen, but also appends it to my log.
If you need the output in a file you can use a redirect:
curl https://vi.stackexchange.com/ -vs >curl-output.txt 2>&1
Please be sure not to flip >curl-output.txt and 2>&1: redirections are processed left to right, so in 2>&1 >curl-output.txt, stderr is duplicated onto the terminal (where stdout still points) before stdout is moved to the file.
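A minimal demonstration of the ordering rule, using ls as a stand-in since it also complains on stderr:
ls /nonexistent > /dev/null 2>&1    # the error message follows stdout into /dev/null
ls /nonexistent 2>&1 > /dev/null    # the error message still reaches the terminal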
Just my 2 cents.
The below command should do the trick, as answered earlier:
curl -vs google.com 2>&1
However, if you need to get the output to a file,
curl -vs google.com > out.txt 2>&1
should work.
I found the same thing: curl by itself would print to STDOUT, but could not be piped into another program.
At first, I thought I had solved it by using xargs to echo the output first:
curl -s ... <url> | xargs -0 echo | ...
But then, as pointed out in the comments, it also works without the xargs part, so -s (silent mode) is the key to preventing extraneous progress output to STDOUT:
curl -s ... <url> | perl -ne 'print $1 if /<sometag>([^<]+)/'
The above example grabs the simple <sometag> content (containing no embedded tags) from the XML output of the curl statement.
The following worked for me:
Put your curl statement in a script named abc.sh
Now run:
sh abc.sh 1>stdout_output 2>stderr_output
You will get your curl's results in stdout_output and the progress info in stderr_output.
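A minimal abc.sh might contain nothing more than the curl call itself (the URL here is a placeholder):
#!/bin/sh
curl -v https://example.com/
With -v, the response body then lands in stdout_output and the verbose/progress information in stderr_output.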
This simple example shows how to capture curl output and use it in a bash script.
test.sh
function main
{
    # the leading backslash bypasses any shell alias for curl
    \curl -vs 'http://google.com' 2>&1
    # note: add -o /tmp/ignore.png if you want to ignore binary output, by saving it to a file.
}
# capture output of curl to a variable
OUT=$(main)
# search output for something using grep.
echo
echo "$OUT" | grep 302
echo
echo "$OUT" | grep title
Solution: curl -vs google.com 2>&1 | less
BUT if you redirect the output to a file and it still shows up on the screen, the URL response probably contains a newline character \n, which messed up your shell.
To avoid this, put everything in a variable:
result=$(curl -v . . . . )

How to log output in bash and see it in the terminal at the same time?

I have some scripts where I need to see the output and log the result to a file, with the simplest example being:
$ update-client > my.log
I want to be able to see the output of the command while it's running, but also have it logged to the file. I also log stderr, so I would want to be able to log the error stream while seeing it as well.
update-client 2>&1 | tee my.log
2>&1 redirects standard error to standard output, and tee sends its standard input to standard output and the file.
Just use tail to watch the file as it's updated. Background your original process by adding & after the command above, then run:
$ tail -f my.log
It will continuously update. (Note that tail won't tell you when the command has finished, so you can output something to the log to signal that it finished. Press Ctrl-C to exit tail.)
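Put together, the whole approach looks like this (update-client stands in for your own command, with 2>&1 added since the question wants stderr logged too):
$ update-client > my.log 2>&1 &
$ tail -f my.log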
You can use the tee command for that:
command | tee /path/to/logfile
The equivalent without writing to the shell would be:
command > /path/to/logfile
If you want to append (>>) and show the output in the shell, use the -a option:
command | tee -a /path/to/logfile
Please note that the pipe will catch stdout only; errors going to stderr are not processed by the pipe with tee. If you want to log errors (from stderr), use:
command 2>&1 | tee /path/to/logfile
This means: run command and redirect the stderr stream (2) to stdout (1). That will be passed to the pipe with the tee application.
Learn more about this on the Ask Ubuntu site.
Another option is to use block-based output capture from within the script (not sure if that is the correct technical term).
Example
#!/bin/bash
{
echo "I will be sent to screen and file"
ls ~
} 2>&1 | tee -a /tmp/logfile.log
echo "I will be sent to just terminal"
I like to have more control and flexibility, so I prefer this way.
