How to redirect STDOUT and STDERR in a file for every command? [duplicate] - linux

This question already has answers here:
How to redirect output of an entire shell script within the script itself?
(6 answers)
Closed 3 years ago.
I'm trying to store both STDOUT and STDERR from the terminal (and, if possible, the STDIN given by the user) in a file, for every command.
So I started by creating a trap function that intercepts and re-executes every command:
shopt -s extdebug
preexec_invoke_exec () {
    [ -n "$COMP_LINE" ] && return                      # do nothing if completing
    [ "$BASH_COMMAND" = "$PROMPT_COMMAND" ] && return  # don't trigger for $PROMPT_COMMAND
    eval "$(history 1 | sed -e 's/^[ ]*[0-9]*[ ]*//')" |& tee ~/recent_output.txt
    return 1  # prevent execution of the original command
}
trap 'preexec_invoke_exec' DEBUG
After saving the above to a file, I run:
source file.sh
This did what I wanted, but it stopped some commands from executing, like:
cd ..
The reason is that the pipe creates a sub-shell and executes every command in it, so the main shell remains unaffected.
Even the script utility, i.e.
script ~/recent_output.txt
worked, but it only produces the output after you run exit in the terminal.
So, basically, I want to store/get the output of the previous command executed in the bash terminal. You can help me with any language (golang, python...).

It is possible to capture the commands, stderr, and stdout of a bash script (say x.sh) using:
bash -x x.sh 2> >(tee err.txt) | tee out.txt
err.txt will capture the executed commands (prefixed with '+') along with the stderr of each command; out.txt will capture the stdout.
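A minimal runnable sketch of this technique; the inline x.sh here is an illustrative stand-in for whatever script you want to capture:

```shell
#!/usr/bin/env bash
# Minimal sketch: x.sh stands in for the script whose output we want to
# capture; here we create a tiny one inline for demonstration.
cat > x.sh <<'EOF'
echo "to stdout"
echo "to stderr" >&2
EOF

# Trace lines (prefixed '+') and stderr go to err.txt; stdout goes to out.txt.
bash -x x.sh 2> >(tee err.txt) | tee out.txt
```

Note that because tee inside the process substitution also writes to the terminal, trace lines may interleave with the stdout shown on screen; the two files themselves stay cleanly separated.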

Related

nohup append the executed command at the top of the output file

Let's say that we invoke the nohup in the following way:
nohup foo.py -n 20 2>&1 &
This writes the output to nohup.out.
How could we get the whole command nohup foo.py -n 20 2>&1 & sitting at the top of nohup.out (or any other specified output file), after which the regular output of the executed command is written to that file?
The reason for this is purely for debugging: there will be thousands of commands like this executed, and very often some of them will crash for various reasons. It's like a basic report kept in a file, with the executed command written at the top followed by the output of the executed command.
A straightforward alternative would be something like:
myNohup() {
  (
    set +m                        # disable job control
    [[ -t 0 ]] && exec </dev/null # redirect stdin away from the tty
    [[ -t 1 ]] && exec >nohup.out # redirect stdout away from the tty
    [[ -t 2 ]] && exec 2>&1       # redirect stderr away from the tty
    set -x                        # enable trace logging of all commands run
    "$@"                          # run our arguments as a command
  ) & disown -h "$!"              # do not forward any HUP signal to the child process
}
To test this, we can define a command:
waitAndWrite() { sleep 5; echo "finished"; }
...and run:
myNohup waitAndWrite
...this will return immediately and, after five seconds, leave the following in nohup.out:
+ waitAndWrite
+ sleep 5
+ echo finished
finished
If you only want to write the exact command run, without the side effects of xtrace, replace the set -x with (assuming bash 4.4 or newer) printf '%s\n' "${*@Q}".
For older versions of bash, you might instead consider printf '%q ' "$#"; printf '\n'.
This does differ a little from what the question proposes:
Redirections and other shell directives are not logged by set -x. When you run nohup foo 2>&1 &, the 2>&1 is not passed as an argument to nohup; instead, it's something the shell does before nohup is started. Similarly, the & is not an argument but an instruction to the shell not to wait() for the subprocess to finish before going on to future commands.
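Combining these ideas, a hedged sketch of a wrapper that writes the quoted command line as a header instead of using set -x; the names myNohup2 and mylog.out are illustrative, not from the answer:

```shell
#!/usr/bin/env bash
# Sketch of a nohup-like wrapper: the quoted command line goes at the
# top of the log, followed by the command's combined stdout and stderr.
# myNohup2 and mylog.out are illustrative names.
myNohup2() {
  {
    printf '%q ' "$@"; printf '\n'  # header: the exact command, shell-quoted
    "$@" 2>&1                       # then the command's stdout and stderr
  } >mylog.out 2>&1 &
  disown -h "$!"                    # do not forward HUP to the child
}

myNohup2 echo hello world
wait  # only so this demo can inspect mylog.out afterwards
```

After this runs, mylog.out starts with the line `echo hello world` followed by the command's own output.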

Command to redirect output to console and to a file at the same time works fine in bash; how do I make it work in Korn shell (ksh)?

The command to redirect output to the console and to a file at the same time works fine in bash, but how do I make it work in the Korn shell (ksh)?
All my scripts run on the Korn shell, so I can't change them to bash just for this particular command to work:
exec > >(tee -a $LOGFILE) 2>&1
In the code below I use the variable logfile; lowercase names are better for user variables.
You can try something like
touch "${logfile}"
tail -f "${logfile}"&
tailpid=$!
trap 'kill -9 ${tailpid}' EXIT INT TERM
exec 1>"${logfile}" 2>&1
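A self-contained sketch of this technique, wrapped in a subshell so the exec redirection does not leak into the calling shell; run.log is an illustrative name for ${logfile}:

```shell
#!/bin/sh
# A background tail mirrors the log file to the console while exec
# sends all of the script's own output into the file. The subshell
# keeps the redirection (and the EXIT trap) local to this block.
(
    logfile=./run.log
    touch "$logfile"
    tail -f "$logfile" &
    tailpid=$!
    trap 'kill "$tailpid"' EXIT INT TERM
    exec 1>"$logfile" 2>&1
    echo "hello from the script"
)
```

This is POSIX-sh compatible, so it works in ksh as well as bash.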
A not too unreasonable technique is to re-exec the shell with output to tee. That is, at the top of the script, do something like:
#!/bin/sh
test -z "$REXEC" && { REXEC=1 exec "$0" "$@" | tee -a "$LOGFILE"; exit; }
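A runnable demonstration of the re-exec pattern; the names demo.sh and script.log are illustrative:

```shell
#!/bin/sh
# Write a guarded script to a file, then run it. On the first pass REXEC
# is unset, so the script re-executes itself with its output piped
# through tee; on the second pass the guard is skipped and the body runs.
cat > demo.sh <<'EOF'
#!/bin/sh
LOGFILE=./script.log
test -z "$REXEC" && { REXEC=1 exec "$0" "$@" | tee -a "$LOGFILE"; exit; }
echo "this line goes to both the console and the log"
EOF
chmod +x demo.sh
./demo.sh
```

Everything the script prints after the guard appears both on the console and in script.log.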

How to execute commands read from the txt file using shell? [duplicate]

This question already has answers here:
Run bash commands from txt file
(4 answers)
Closed 4 years ago.
I tried to execute commands read from a txt file, but only the first command executes; after that the script terminates. My script, shellEx.sh, is as follows:
echo "pwd" > temp.txt
echo "ls" >> temp.txt
exec < temp.txt
while read line
do
exec $line
done
echo "printed"
If I put echo in place of exec, it just prints both pwd and ls. But I want to execute pwd and ls one by one.
The output I am getting is:
$ bash shellEx.sh
/c/Users/Aditya Gudipati/Desktop
But after pwd, ls also needs to execute.
Can anyone give a better solution for this?
exec in bash is meant in the Unix sense where it means "stop running this program and start running another instead". This is why your script exits.
If you want to execute line as a shell command, you can use:
line="find . | wc -l"
eval "$line"
($line by itself will not allow using pipes, quotes, expansions or other shell syntax)
To execute the entire file including multiline commands, use one of:
source ./myfile # keep variables, allow exiting script
bash myfile # discard variables, limit exit to myfile
A file with one valid command per line is itself a shell script. Just use the . command to execute it in the current shell.
$ echo "pwd" > temp.txt
$ echo "ls" >> temp.txt
$ . temp.txt
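Putting the eval fix into the asker's script, a minimal sketch:

```shell
#!/usr/bin/env bash
# The asker's loop, with `exec $line` replaced by `eval "$line"` so each
# line runs as a shell command without replacing the script's process.
echo "pwd" > temp.txt
echo "ls" >> temp.txt

while read -r line; do
    eval "$line"
done < temp.txt
echo "printed"
```

Unlike the original, this reaches the final echo because eval runs each command inside the script's own shell.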

How do I redirect the output from a bash script to stderr

I've written a bash script. It has a few echo statements. It invokes a few 3rd party tools. I would like to redirect everything that goes to stdout (from my echo statements and all the tools output) to stderr. How do I do that?
You need to redirect the stdout of the command to stderr.
your_command.sh 1>&2
If you want to do this from within the script, you can wrap your whole script into one function and redirect its output to stderr:
main() {
echo hello
echo world
some_script.sh
}
main 1>&2
exec >&2
Put that at the top of your script to redirect all future output to stderr.
$ help exec
exec: exec [-cl] [-a name] [command [arguments ...]] [redirection ...]
Replace the shell with the given command.
Execute COMMAND, replacing this shell with the specified program.
ARGUMENTS become the arguments to COMMAND. If COMMAND is not specified,
any redirections take effect in the current shell.
The solution that worked for me was to enclose the body of the script in parentheses and redirect stdout to stderr, like so:
(
echo 1
echo 2
tool1
) 1>&2
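As a minimal runnable sketch of the grouping approach:

```shell
#!/usr/bin/env bash
# Everything inside the parentheses runs in a subshell whose stdout is
# redirected to stderr, so both echo lines land on stderr.
(
    echo "step 1"
    echo "step 2"
) 1>&2
```

You can verify this by capturing only stderr, e.g. `2>captured.txt >/dev/null`, and checking that both lines end up in captured.txt.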

How to turn off echo while executing a shell script Linux [duplicate]

This question already has answers here:
How to silence output in a Bash script?
(9 answers)
Closed 6 years ago.
Here is a simple thing I was working on:
echo "please enter a command"
read x
$x
checkexitstatus()
{...}
checkexitstatus is a function defined elsewhere just to check the exit status.
What I want to know is: is there any way that, when I run $x, its output won't be displayed on the screen?
And is it possible without redirecting the output to a file?
Not without some form of redirection. But the target doesn't have to be a regular file; you can discard the output entirely:
$x &> /dev/null
You could use Bash redirection:
command 1> /.../path_to_file redirects stdout into path_to_file.
command > /.../path_to_file is a shortcut for the previous command.
command 2> /.../path_to_file redirects stderr into path_to_file.
To do both at the same time to the same file: command >/.../path_to_file 2>&1.
2>&1 means redirect 2 (stderr) to 1 (stdout, which now points to path_to_file).
You can replace path_to_file with /dev/null if you don't want to keep the output of your command.
Otherwise, you could also store the output of a command :
$ var=$(command) # POSIX compliant; works in Bash, KSH, and other modern shells
$ var=`command`  # legacy Bourne-shell syntax
In this example, the output of command will be stored in $var.
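A small sketch of capturing output in a variable; the braced group here is an illustrative stand-in for any command:

```shell
#!/usr/bin/env bash
# Capture both stdout and stderr of a command into a variable instead
# of showing them on the terminal. The { ...; } group stands in for
# whatever command you actually want to run.
var=$( { echo "to stdout"; echo "to stderr" >&2; } 2>&1 )
printf 'captured: %s\n' "$var"
```

Without the 2>&1, only "to stdout" would be captured and "to stderr" would still reach the terminal.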
If you want to turn off only the echo command, and not other commands that send output to stdout, as the title suggests, you can possibly (though it may break the code) create an alias for echo:
alias echo=':'
Now echo is an alias for : (a no-op). You can undo this with unalias echo.
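A minimal sketch of the alias approach; note that non-interactive bash ignores aliases unless expand_aliases is enabled, and that an alias only affects lines read after it is defined:

```shell
#!/usr/bin/env bash
# In a script, alias expansion is off by default, so enable it first.
shopt -s expand_aliases
alias echo=':'              # every subsequent echo becomes a no-op
echo "this prints nothing"
unalias echo
echo "echo works again"
```

Running this prints only "echo works again".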
