write stderr and stdout to file without wrapper script - linux

I want to redirect stderr and stdout to a file inside my bash script myscript.sh where some sub-processes are run.
The easiest way is to write a wrapper script start_myscript.sh that calls myscript.sh > logfile 2>&1 or myscript.sh &> logfile. (Note the order: 2>&1 must come after the file redirection, otherwise stderr still goes to the terminal.)
I wonder if it is possible to redirect both stderr and stdout inside myscript.sh to a logfile without using a wrapper script?
For this I opened a new file descriptor 3 and redirected stderr and stdout to fd 3, which I in turn redirect to /tmp/logfile.
#!/bin/bash
exec 3> /tmp/logfile
exec 1>&3
exec 2>&3
doing_something &
doing_other_things &
Both functions do something and produce output on stderr or stdout. The problem is that they run simultaneously and sometimes write to stdout/stderr at the same time.
In that case a sentence from the first function is interrupted, output from the second one follows, and finally the last part of the first function's sentence arrives. That breaks any log file...
So I thought of FIFO named pipes as it should be blocking and did a short test:
#!/bin/bash
PIPE=/tmp/mypipe
mkfifo $PIPE
cat LONGTEXTFILE > $PIPE &
ls -la /usr/bin > $PIPE &
ls -la /tmp > $PIPE &
The test is simple: I create a named pipe $PIPE and cat a long text file (taking a few seconds) to stdout, which is redirected to the named pipe. I do this in the background while short calls of ls redirect their output to the same pipe. This should simulate output from different functions running at once in the background.
In another terminal I opened the named pipe with
cat /tmp/mypipe
My assumption was that the FIFO named pipe would block while it receives input from cat, so the output of ls would wait until the output of cat ends.
Unfortunately the result is disappointing: the output from ls is placed somewhere in the middle of the output from cat, not after it.
Have I made a mistake somewhere?
Is there any other way to redirect stderr/stdout from inside a script to a logfile without using a wrapper script?
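For reference, a minimal sketch of the usual in-script idiom: the three exec lines can be collapsed into a single redirection at the top of the script. Paths and echo lines are placeholders, and note that this alone does not fix the interleaving of concurrent background writers, which is a buffering issue:

```shell
# Write a small demo script that redirects its own stdout and stderr,
# then run it. /tmp/demo_logfile is a placeholder path.
cat > /tmp/inscript_demo.sh <<'EOF'
#!/bin/bash
# One exec replaces the fd-3 dance: stdout goes to the logfile,
# then stderr is pointed at wherever stdout now goes.
exec >/tmp/demo_logfile 2>&1

echo "a line on stdout"
echo "a line on stderr" >&2
EOF
bash /tmp/inscript_demo.sh
```

After the exec line, everything the script and its child processes write to stdout or stderr lands in the logfile.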


How to add nohup? - Redirect stdin to program and background

I have a program prog that takes stdin input like this:
prog < test.txt
But the processing takes quite a lot of time, so once the input is read, the process should go to the background.
From this answer https://unix.stackexchange.com/a/71218/201221 I have a working solution, but without nohup. How do I modify it to use nohup too?
#!/bin/sh
{ prog <&3 3<&- & } 3<&0
disown is a shell builtin which tells bash to remove a process from its recordkeeping -- including the recordkeeping that forwards HUP signals. Consequently, if stdin, stdout and stderr are all redirected or closed before the terminal disappears, there's absolutely no need for nohup so long as you use disown.
#!/bin/bash
logfile=nohup.out # change this to something that makes more sense.
[ -t 1 ] && exec >"$logfile" # do like nohup does: redirect stdout to logfile if TTY
[ -t 2 ] && exec 2>&1 # likewise, redirect stderr away from TTY
{ prog <&3 3<&- & } 3<&0
disown
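A runnable sketch of the same pattern, with tr standing in for prog and hypothetical file paths (the explicit output redirection on tr is only there so the demo behaves the same off a TTY):

```shell
# tr a-z A-Z plays the role of the long-running prog; paths are hypothetical.
cat > /tmp/bg_demo.sh <<'EOF'
#!/bin/bash
logfile=/tmp/bg_demo.log
[ -t 1 ] && exec >"$logfile"   # only redirects when run on a terminal
[ -t 2 ] && exec 2>&1
# Hand the script's original stdin to the background job via fd 3.
{ tr a-z A-Z <&3 3<&- >/tmp/bg_demo.out & } 3<&0
disown
EOF
echo "hello background" | bash /tmp/bg_demo.sh
```

The script returns immediately while the disowned tr keeps reading its inherited stdin in the background.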
If you really need compatibility with POSIX sh, then you'll want to capture stdin to a file (at a potentially very large cost to efficiency):
#!/bin/sh
# create a temporary file
tempfile=$(mktemp "${TMPDIR:-/tmp}/input.XXXXXX") || exit
# capture all of stdin to that temporary file
cat >"$tempfile"
# nohup a process that reads from that temporary file
tempfile="$tempfile" nohup sh -c 'prog <"$tempfile"; rm -f "$tempfile"' &
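A runnable sketch of the tempfile approach, again with tr as a hypothetical stand-in for prog and a fabricated input in place of the real stdin capture:

```shell
# Stand-in demo: tr a-z A-Z plays the role of prog; output path is hypothetical.
tempfile=$(mktemp "${TMPDIR:-/tmp}/input.XXXXXX") || exit
# In the real script this would be `cat >"$tempfile"` capturing stdin;
# here we fake the captured input.
printf 'hello tempfile\n' >"$tempfile"
# Pass the filename through the environment so the sh -c body can use it.
tempfile="$tempfile" nohup sh -c \
  'tr a-z A-Z <"$tempfile" >/tmp/tempfile_demo.out; rm -f "$tempfile"' &
```

The tempfile="$tempfile" prefix is what makes $tempfile visible inside the single-quoted sh -c body.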
From what I see the following code is contained in a separate shell file:
#!/bin/sh
{ prog <&3 3<&- & } 3<&0
So, why not try just:
nohup the_file.sh &

Write stderr and stdout to one file, but also write stderr to a separate file

I have a shell script whose stdout and stderr I want to write to a logfile. I know that this can be achieved via
sh script.sh >> both.log 2>&1
However, I also want to simultaneously write the stderr to a separate file, "error.log". Is this achievable?
You can use tee to duplicate output to two locations. Combine that with some tricky redirections, and...
script.sh 2>&1 >> both.log | tee -a both.log >> error.log
The 2>&1 first points stderr at the current stdout (the pipe); >> both.log then redirects stdout to the log file. Only stderr therefore flows down the pipe to tee, which appends it to both.log while tee's own output is appended to error.log.
For this you need to first switch stdout and stderr, which requires an additional file descriptor:
sh script.sh 3>&2 2>&1 1>&3 3>&-
The last operator closes the auxiliary file descriptor.
After that you can use tee to duplicate the error stream (which is now on stdin) and append it to your error log:
sh script.sh 3>&2 2>&1 1>&3 3>&- | tee -a error.log
And after that you can then direct both stdin and stderr to your combined log:
(sh script.sh 3>&2 2>&1 1>&3 3>&- | tee -a error.log) >> both.log 2>&1
The parentheses around the command are important to capture the error stream of the whole command. Without them only the (empty) error stream of the tee command would be captured and the rest would still go to the terminal.
Note: this does not check whether file descriptor 3 was already in use (open). In bash you can let the shell pick a previously unused file descriptor and close it on the last redirection:
sh script.sh {tmpfd}>&2 2>&1 1>&${tmpfd}-
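A quick check of the whole pipeline with a stand-in script (file names are hypothetical):

```shell
# demo script writes one line to each stream; log paths are hypothetical.
cat > /tmp/swap_demo.sh <<'EOF'
echo "out line"
echo "err line" >&2
EOF
rm -f /tmp/both.log /tmp/error.log
# Swap stdout/stderr, tee the (swapped-in) error stream to error.log,
# then send everything to both.log.
(sh /tmp/swap_demo.sh 3>&2 2>&1 1>&3 3>&- | tee -a /tmp/error.log) >> /tmp/both.log 2>&1
```

Afterwards both.log contains both lines, while error.log contains only the stderr line.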

How can I send error output to both stdout and file in bash

If I use this
cmd 2>/var/error.log
Then my error goes to that file but then I can't see on screen.
Is there any way I can simultaneously show it on screen as well as send to file?
This will display both stdout and stderr on the terminal while only sending stderr to err.log:
cmd 2> >(tee err.log >&2)
>(...) is process substitution. (The space between the two consecutive > is essential.) This sends stderr and only stderr to the tee command.
The >&2 causes the error messages to remain on stderr. This would be important, for example, if this line occurs inside some script whose stdout or stderr is being redirected. (Hat tip: Chepner.)
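A quick check with a stand-in for cmd (the { ...; } group and the log path are hypothetical):

```shell
# Write one line to each stream; only stderr should land in /tmp/ps_err.log,
# while both lines remain visible on the terminal.
rm -f /tmp/ps_err.log
{ echo "normal output"; echo "oops, an error" >&2; } 2> >(tee /tmp/ps_err.log >&2)
# The >(...) process runs asynchronously; give it a moment to finish writing.
sleep 1
```
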
Alternatively, if mixing stdout and stderr together both on screen and in the file is acceptable:
cmd 2>&1 | tee /tmp/error.log

How redirect nohup stdout to stdin

Is there a way to redirect the nohup output to the stdin instead of nohup.out ?
I've tried:
nohup echo Hello > /dev/stdin 2>&1 &
But it does not do the trick.
The nohup command purposefully detaches itself from the stdin, so there is nothing it expects to read in itself, and thus I think what you are really after in this question, is redirecting the output of nohup as the stdin for the next command. (Well somebody has to read the stdin, and it ain't nohup.)
Further, POSIX mandates that the output goes to the nohup.out file in the working directory, if the file can be successfully opened. So what you can do is to wire the stdin of the following commands from the nohup.out file. For instance:
$ nohup echo Hello 2>/dev/null; read VAR 0<nohup.out; echo "VAR=$VAR"
VAR=Hello

echo stderr and stdout to file from bash script variable

I have a bash script, and inside this bash script I have a JAVARESULT variable like this:
JAVARESULT=`java -cp ... parser_file $file $someextravar`
and what I want is to catch the stderr and stdout of this command in a log file. With
echo "$JAVARESULT" > $LOG_FILE
I get only the stdout, not the stderr. I tried
echo "$JAVARESULT" &> $LOG_FILE
but I don't get the java errors in the log file.
On every Unix-based system, every process has at least three file descriptors open. As you know, file descriptors are identified by numbers. These three standard file descriptors are:
0 for stdin
1 for stdout
2 for stderr
What you want to do is redirect stderr to stdout inside the command substitution, so that both streams end up in the variable. To the java command that fills your JAVARESULT variable, you'll just have to append:
2>&1
What you're saying here is: redirect stderr (file descriptor 2) to stdout (file descriptor 1).
Try this:
JAVARESULT=`java -cp ... parser_file $file $someextravar 2>&1`
echo "$JAVARESULT" > $LOG_FILE
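To see the effect with a stand-in command (the { ...; } group plays the role of the java call, and the log path is hypothetical):

```shell
# The 2>&1 inside the command substitution is what captures stderr;
# without it, only stdout would end up in the variable.
RESULT=$( { echo "to stdout"; echo "to stderr" >&2; } 2>&1 )
echo "$RESULT" > /tmp/javaresult_demo.log
```
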
