I have a sophisticated bash script that uses "read -p" (whose prompt goes to stderr) very often, and now I need to duplicate all of the script's input from the terminal into a log file.
tee file.log | script.sh
This command doesn't work correctly because the output meant for the user (the prompts) is lost.
Example:
#!/bin/sh
echo "start"
read -p "input value: " val
echo $val
echo "finish"
Terminal run:
start
input value: 3
3
finish
Tee run:
# tee file.log | ./script.sh
start
3
3
finish
No idea why you're using tee here. What I suspect is happening is that it needs input, so it waits for it, and then pipes the 3 to stdout.
-p prompt
Display prompt, without a trailing newline, before attempting to read any input. The prompt is displayed only if input is coming from a terminal.
However, input isn't coming from a tty here, so the prompt is never printed. It still feels very weird to me to use tee here, but you can just use echo -n instead of read's -p flag and it should work.
#!/bin/sh
echo "start"
echo -n "input value: "
read val
echo $val
echo "finish"
e.g.
> tee file.log | ./abovescript
start
input value: 3
3
finish
> cat file.log
3
Also, I'm not sure how to get tee to terminate properly from within the script here, so you need to press the return key at the end, which of course adds a newline.
That said, since it costs an extra line each time anyway, this seems worse than just doing echo "$val" >> file.log after each read, though a better option would be to use a function:
#!/bin/bash
r() {
    read -p "input value: " val
    echo "$val" >> file.log
    echo "$val"
}
echo "start"
val=$(r)
echo "$val"
echo "finish"
Related
Is there a way for a script to log both the command line being run (including piped ones) and its output, without duplicating the line for the command?
The intention is that the script should have clean output, but should log verbosely into a log file (so no set -x). Apart from the output, it shall also log the command line causing that output, which could be a piped one-liner.
The most basic approach is to duplicate the command line in the script and then dump it into the log followed by the captured output of the actual command being run:
echo "command argument1 \"quoted argument2\" | grep -oE \"some output\"" >> file.log
output="$(command argument1 "quoted argument2" 2>&1 | grep -oE "some output")"
echo "${output}" >> file.log
This has the side effect that quoted sections would need to be escaped for the log, which can lead to errors resulting in confusion.
If none of the commands were piped, one could store the command line in an array and then "run" the array.
command=(command argument1 "quoted argument2")
echo "${command[#]}" >> file.log
output="$("${command[#]}" 2>&1)"
echo "${output}" >> file.log
Though with this approach "quoted argument2" would become quoted argument2 in the log.
Is there a way (in bash) to realize this without having to duplicate the commands?
You could play with redirections, switch the x option on and off on demand, unset PS4 to get rid of the leading + , and define log_on and log_off functions for easier coding. Something like this:
$ cat script.sh
#!/usr/bin/env bash
function log_on {
    exec 3>&1 4>&2
    exec &> >( sed -E '/^(set \+x|log_off)$/d' >> file.log )
    ps4=$PS4
    PS4=
    set -x
}
function log_off {
    set +x
    exec 1>&3 2>&4
    PS4=$ps4
}
echo something not logged
log_on
echo something logged
log_off
echo something else not logged
$ rm -f file.log
$ ./script.sh
something not logged
something else not logged
$ cat file.log
echo something logged
something logged
The exec <redirection> commands look a bit cryptic (as most redirections) but they are rather simple:
exec 3>&1 4>&2 makes copies of file descriptors fd1 and fd2 (stdout and stderr by default) to be able to restore these in log_off. After this, fd3 and fd4 are copies of fd1 and fd2, respectively. Pick file descriptors other than 3 or 4 if you already use them.
exec &> >( sed ... ) redirects fd1 and fd2 to the standard input of a sed command.
The sed command sed -E '/^(set \+x|log_off)$/d' >> file.log deletes lines containing only set +x or log_off and appends its output to file.log. Without this sed command you would always see the two following lines:
log_off
set +x
in your logs, after a group of logged commands.
exec 1>&3 2>&4 restores fd1 and fd2 from their copies in fd3 and fd4.
The rest is straightforward: save PS4 in ps4 such that it can be restored, enable/disable the x option. This should be easy to adapt or extend if needed.
The x option displays the simple commands separately; it splits pipelines into their individual commands, for instance. If you prefer a command log that looks more like the commands you wrote, you can replace set -/+x with set -/+v.
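To illustrate that difference (a small sketch of my own, not part of the original answer): -x traces each expanded simple command on its own line, while -v echoes the source line exactly as written.

#!/usr/bin/env bash
# Sketch: compare the traces produced by set -x and set -v on the same pipeline.
set -x
printf '%s\n' one two | wc -l    # -x trace shows "printf '%s\n' one two" and "wc -l" separately
set +x
set -v
printf '%s\n' one two | wc -l    # -v trace shows this source line verbatim
set +v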
IMHO this has already been answered here:
For simplicity, the set shell builtin is what you need:
set -x or set -v
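A rough sketch of pointing that trace at a log file (my addition; BASH_XTRACEFD needs bash 4.1 or newer, and fd 9 is an arbitrary unused descriptor):

#!/usr/bin/env bash
# Sketch only: send the set -x trace to file.log while normal output stays on the terminal.
exec 9>> file.log     # open fd 9 on the log file
BASH_XTRACEFD=9       # xtrace output now goes to fd 9 instead of stderr
set -x
echo "this command line is logged, but its output still goes to the terminal"
set +x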
Can someone explain why the very last echo statement is blank? I expect XCODE to have been incremented in the while loop to a value of 1:
#!/bin/bash
OUTPUT="name1 ip ip status" # normally output of another command with multi line output
if [ -z "$OUTPUT" ]
then
    echo "Status WARN: No messages from SMcli"
    exit $STATE_WARNING
else
    echo "$OUTPUT"|while read NAME IP1 IP2 STATUS
    do
        if [ "$STATUS" != "Optimal" ]
        then
            echo "CRIT: $NAME - $STATUS"
            echo $((++XCODE))
        else
            echo "OK: $NAME - $STATUS"
        fi
    done
fi
echo $XCODE
I've tried using the following statement instead of the ++XCODE method
XCODE=`expr $XCODE + 1`
and it too won't print outside of the while statement. I think I'm missing something about variable scope here, but the ol' man page isn't showing it to me.
Because you're piping into the while loop, a sub-shell is created to run it.
This child process has its own copy of the environment and can't pass any variables back to its parent (as with any Unix process).
Therefore you'll need to restructure so that you're not piping into the loop.
Alternatively you could run the loop in a function, for example, and echo the value you want returned from the sub-process (a sketch follows after the link below).
http://tldp.org/LDP/abs/html/subshells.html#SUBSHELL
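A sketch of that function idea, adapted by me to the script from the question: the counting still happens on the right-hand side of a pipe, but the result is echoed and captured with command substitution instead of being lost.

#!/bin/bash
# Sketch: count the non-Optimal lines in a function and capture the count.
count_crit() {
    local xcode=0
    while read -r NAME IP1 IP2 STATUS; do
        if [ "$STATUS" != "Optimal" ]; then
            echo "CRIT: $NAME - $STATUS" >&2    # report on stderr so it is not captured
            xcode=$((xcode + 1))
        fi
    done
    echo "$xcode"                               # only the count goes to stdout
}
OUTPUT="name1 ip ip status"
XCODE=$(echo "$OUTPUT" | count_crit)
echo "$XCODE"                                   # prints 1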
The problem is that processes put together with a pipe are executed in subshells (and therefore have their own environment). Whatever happens within the while does not affect anything outside of the pipe.
Your specific example can be solved by rewriting the pipe to
while ... do ... done <<< "$OUTPUT"
or perhaps
while ... do ... done < <(echo "$OUTPUT")
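For completeness, here is the script from the question rewritten with the here-string (my adaptation), so the loop runs in the current shell and XCODE survives:

#!/bin/bash
OUTPUT="name1 ip ip status"
XCODE=0
while read -r NAME IP1 IP2 STATUS
do
    if [ "$STATUS" != "Optimal" ]
    then
        echo "CRIT: $NAME - $STATUS"
        ((++XCODE))
    else
        echo "OK: $NAME - $STATUS"
    fi
done <<< "$OUTPUT"
echo $XCODE    # now prints 1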
This should work as well (because the while loop and the echo run in the same subshell):
#!/bin/bash
cat /tmp/randomFile | (while read line
do
    LINE="$LINE $line"
done && echo $LINE )
One more option:
#!/bin/bash
cat /some/file | while read line
do
    var="abc"
    echo $var | xsel -i -p # store the value in the X primary selection
done
var=$(xsel -o -p) # read the value back from the primary selection
echo $var
EDIT:
Here, xsel is a requirement (install it).
Alternatively, you can use xclip:
xclip -i -selection clipboard
instead of
xsel -i -p
I got around this when I was making my own little du:
ls -l | sed '/total/d ; s/ */\t/g' | cut -f 5 |
( SUM=0; while read SIZE; do SUM=$(($SUM+$SIZE)); done; echo "$(($SUM/1024/1024/1024))GB" )
The point is that I make a subshell with ( ) containing my SUM variable and the while, but I pipe into the whole ( ) instead of into the while itself, which avoids the gotcha.
#!/bin/bash
OUTPUT="name1 ip ip status"
+export XCODE=0;
if [ -z "$OUTPUT" ]
----
echo "CRIT: $NAME - $STATUS"
- echo $((++XCODE))
+ export XCODE=$(( $XCODE + 1 ))
else
echo $XCODE
see if those changes help
Another option is to output the results into a file from the subshell and then read it back in the parent shell. Something like:
#!/bin/bash
EXPORTFILE=/tmp/exportfile${RANDOM}
cat /tmp/randomFile | while read line
do
    LINE="$LINE $line"
    echo $LINE > $EXPORTFILE
done
LINE=$(cat $EXPORTFILE)
I have been trying to get a bash script to output different things on the terminal and logfile but am unsure of what command to use.
For example,
#!/bin/bash
freespace=$(df -h / | grep -E "/" | awk '{print $4}')
greentext="\033[32m"
bold="\033[1m"
normal="\033[0m"
logdate=$(date +"%Y%m%d")
logfile="$logdate"_report.log
exec > >(tee -i $logfile)
echo -e $bold"Quick system report for "$greentext"$HOSTNAME"$normal
printf "\tSystem type:\t%s\n" $MACHTYPE
printf "\tBash Version:\t%s\n" $BASH_VERSION
printf "\tFree Space:\t%s\n" $freespace
printf "\tFiles in dir:\t%s\n" $(ls | wc -l)
printf "\tGenerated on:\t%s\n" $(date +"%m/%d/%y") # US date format
echo -e $greentext"A summary of this info has been saved to $logfile"$normal
I want to omit the last output (the echo "A summary...") from the logfile while still displaying it in the terminal. Is there a way to do this? It would be great if a general solution could be provided rather than a specific one, because I want to apply this to other scripts.
EDIT 1 (after applying >&6):
Files in dir: 7
A summary of this info has been saved to 20160915_report.log
Generated on: 09/15/16
One option:
exec 6>&1 # save the existing stdout
exec > >(tee -i $logfile) # like you had it
#... all your outputs
echo -e $greentext"A summary of this info has been saved to $logfile"$normal >&6
# writes to the original stdout, saved in file descriptor 6 ------------^^^
The >&6 sends echo's output to the saved file descriptor 6 (the terminal, if you're running this from an interactive shell) rather than to the output path set up by tee (which is on file descriptor 1). Tested on bash 4.3.46.
References: "Using exec" and "I/O Redirection"
Edit: As the OP found, the >&6 message is not guaranteed to appear after the lines printed by tee off stdout. One option is to use script, e.g. as in the answers to this question, instead of tee, and then print the final message outside of the script (a rough sketch follows). Per the docs, the stdbuf answers to that question won't work with tee.
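A hedged sketch of that script-based approach (my own; it assumes the util-linux version of script, whose -q/-c flags differ from BSD/macOS, and report.sh is a hypothetical name for the report script with the exec/tee line and the final summary echo removed):

#!/usr/bin/env bash
# Sketch only: capture everything the report prints with script(1), then
# print the summary afterwards so it reaches the terminal but not the log.
logfile="$(date +%Y%m%d)"_report.log
script -q -c ./report.sh "$logfile"    # util-linux syntax; adjust for BSD/macOS
echo -e "\033[32mA summary of this info has been saved to $logfile\033[0m"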
Try a dirty hack:
#... all your outputs
echo >&6 # <-- New line
echo -e $greentext ... >&6
Or, equally hackish (note that, per the OP, this one worked):
#... all your outputs
sleep 0.25s # or whatever time you want <-- New line
echo -e ... >&6
Here is a script I tried to write:
#!/bin/bash
cat <&3 & # runs in background, takes input from file desc 3
echo "To Terminal"
...
echo "To cat" 1>&3
echo "to cat again" 1>&3
Essentially I want my script to spawn a program (in this case, cat) and be able to send input to it through a file descriptor.
This doesn't work ("bad file descriptor"), I think because file descriptors must be associated with a real file. What I need then is to be able to create a permanent pipe with an associated descriptor (such as 3) that I can use to write to cat throughout the program. How can I do it?
Try:
#!/bin/bash
exec 3> >(cat)
echo "To Terminal"
echo "To cat" 1>&3
echo "To cat again" 1>&3
exec 3>&-
cat, of course, does nothing interesting. For an example that is still simple but produces slightly more interesting output, replace cat with awk:
exec 3> >(awk '{print NR,length($0),$0}')
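If you literally want a longer-lived named pipe rather than process substitution, here is a rough sketch with mkfifo (my addition, not part of the answer above):

#!/bin/bash
# Sketch: the same idea with a named pipe instead of >(...).
fifo=$(mktemp -u)        # just generates an unused path to use for the fifo
mkfifo "$fifo"
cat < "$fifo" &          # background reader attached to the fifo
exec 3> "$fifo"          # fd 3 now writes into the fifo
echo "To Terminal"
echo "To cat" >&3
echo "To cat again" >&3
exec 3>&-                # closing fd 3 gives cat EOF so it can exit
wait
rm -f "$fifo"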
I have a simple function to open an editor:
open_an_editor()
{
    nano "$1"
}
If called like open_an_editor file.ext, it works. But if I need to get some output from the function — smth=$(open_an_editor file.ext) — I cannot see the editor; the script just hangs. What am I missing here?
Update: I am trying to write a function that asks the user to enter a value in an editor if it wasn't given in the script arguments.
#!/bin/bash
open_an_editor()
{
    if [ "$1" ]
    then
        echo "$1"
        return 0
    fi
    tmpf=$(mktemp -t pref)
    echo "default value, please edit" > "$tmpf"
    # and here the editor should show up,
    # allowing the user to edit the value and save it
    # this will get stuck without showing the editor:
    #nano "$tmpf"
    # but this, with the help of Kimvais, works perfectly:
    nano "$tmpf" 3>&1 1>&2 2>&3
    cat "$tmpf"
    rm "$tmpf"
}
something=$(open_an_editor "$1")
# and then I can do something useful with that value,
# for example count chars in it
echo -n "$something" | wc -c
So, if the script was called with an argument ./script.sh "A value", the function would just use that and immediately echo 7 bytes. But if called without arguments ./script.sh — nano should pop up.
If the input you need is the edited file, then you obviously need to cat the file after you call open_an_editor filename.
If you actually need the output of the editor itself, then you need to swap stdout and stderr, i.e.:
nano "$1" 3>&1 1>&2 2>&3
If you need 'friendly' user input, see this question on how to use whiptail; a tiny sketch of that pattern follows.
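A small sketch of the whiptail route (whiptail must be installed; the dialog size 8 40 is arbitrary): whiptail draws its dialog on the terminal and writes the typed value to stderr, so the same descriptor swap captures the value while the UI still shows up.

value=$(whiptail --inputbox "Enter a value:" 8 40 "default value" 3>&1 1>&2 2>&3)
echo "You entered: $value"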
If you need to get output from the function and store it in a variable, you can just display what's in the file:
open_an_editor()
{
    cat "$1"
}
smth=$(open_an_editor file.txt)
If all you want is for a user to enter a value then read is enough:
OLDIFS="$IFS"
IFS=$'\n'
read -p "Enter a value: " -e somevar
IFS="$OLDIFS"
echo "$somevar"