read not prompting when input is redirected from a file - linux

I have this:
while read -r line; do echo "hello $line"; read -p "Press any key" -n 1; done < file
hello This is line 1
hello his is line 2
hello his is line 3
hello his is line 4
hello his is line 5
hello his is line 6
hello his is line 7
Why do I not see the prompt "Press any key"?

Quote from man bash:
-p prompt
Display prompt on standard error, without a trailing newline, before attempting to read any input. The prompt is displayed only if input is coming from a terminal.
So, because you are reading lines from the file and not from the terminal, the prompt is not displayed.
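A quick way to see the difference for yourself (a sketch, assuming a file named file exists in the current directory):
# prompt is shown: stdin is the terminal
read -p "interactive: " -n 1
# no prompt: stdin is the file
read -p "redirected: " -n 1 < file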

As others mentioned, you don't see the prompt because bash only prints the prompt when stdin is a terminal. In your case, stdin is a file.
But there's a bigger bug here: It seems to me that you want to read from two places: a file and the user. You'll have to do some redirection magic to accomplish this:
# back up stdin
exec 3<&0
# read each line of a file. the IFS="" prevents read from
# stripping leading and trailing whitespace in the line
while IFS="" read -r line; do
# use printf instead of echo because ${line} might have
# backslashes in it which some versions of echo treat
# specially
printf '%s\n' "hello ${line}"
# prompt the user by reading from the original stdin
read -p "Press any key" -n 1 <&3
done <file
# done with the stdin backup, so close the file descriptor
exec 3<&-
Note that the above code won't work with /bin/sh because read -p and read -n are bash extensions rather than POSIX features. You'll have to use bash, or make it POSIX compliant by changing the line that prompts the user:
printf 'Press enter to continue' >&2
read <&3
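Putting it together, a POSIX-compliant version of the whole loop might look like this (a sketch; note that POSIX read also expects a variable name, so the prompt line reads into a throwaway variable):
#!/bin/sh
# back up stdin on fd 3 so prompts can read from the original input
exec 3<&0
while IFS= read -r line; do
    printf '%s\n' "hello ${line}"
    # prompt on stderr and read the reply from the backed-up stdin
    printf 'Press enter to continue' >&2
    read dummy <&3
done <file
exec 3<&-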

You may explicitly read from the controlling terminal /dev/tty:
while IFS="" read -r line; do
echo "hello $line"
read -p "Press any key" -n 1 </dev/tty
done < file

Related

How can I keep a FIFO open for reading?

I'm trying to redirect a program's stdin and stdout. I'm currently experimenting with a bash mockup of this, but I'm getting some odd behavior.
I have the following:
mkfifo in
mkfifo out
I also have the following script, test.sh
#!/bin/bash
while read line; do
echo "I read ${line}"
done < /dev/stdin
In terminal 1, I do the following:
tail -f out
In terminal 2, I do the following:
./test.sh < in > out
In terminal 3, I do the following:
echo "foo" > in
echo "bar > in
However, instead of seeing "I read foo" followed by "I read bar" in terminal 1, I get nothing after the first echo, both lines after the second echo, and then the test.sh program in terminal 2 exits. How can I prevent the exit so I can keep sending test.sh input? Also, instead of buffering and then dumping when the program terminates, how can I get the output from test.sh to flush to the tail -f in terminal 1?
Use the redirection on a single compound command that contains your two echo commands.
{
echo "foo"
echo "bar"
} > in
If, as seems likely on a closer reading, you want in to stay open while you are executing commands interactively, use exec to open in on another file descriptor:
exec 3> in # Open in on file descriptor 3
echo "foo" >&3 # Write to file descriptor 3 instead of standard output
echo "bar" >&3 # "
exec 3>&- # Close file descriptor 3
Note that exec 3> in will block until something (test.sh in your case) opens in for reading, and due to buffering, you may not see any output from tail -f out until you close file descriptor 3.
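If the blocking open is a problem, one workaround on Linux is to open the FIFO read-write; that open succeeds immediately, and because the shell itself then holds a write end, readers of the FIFO won't see EOF between your writes (a sketch):
# open read-write: doesn't block waiting for a reader,
# and keeps the FIFO from hitting EOF between writers
exec 3<> in
echo "foo" >&3
echo "bar" >&3
exec 3>&-   # close when finished; readers then see EOF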

Shell Script Looping not working while on remote SSH

I have been trying to make a loop work while running it through a remote SSH command, but it doesn't work. The command works when run directly on the host.
scp file root@${SERVER}:/tmp
ssh ${SERVER} "
while IFS= read -r line; do
echo "$line"
done </tmp/file
"
I have tried using the single quotes in main script but it causing errors.
bash: line n: warning: here-document at line n delimited by end-of-file
Any advice will be much appreciated.
UPDATE
testfile
1
2
3
4
5
6
Script test
SERVER='client'
ssh ${SERVER} '
echo "inside remote ClientServer"
echo "cat testfile"
cat /tmp/testfile
while read line; do
echo "${line}"
done <</tmp/testfile
'
echo "Back to MasterServer..."
Terminal Result
[root@server]# ./test
S
Kernel 4.14.35-1902.10.7.el7uek.x86_64 on an x86_64
inside remote ClientServer
cat testfile
1
2
3
4
5
6
bash: line 8: warning: here-document at line 8 delimited by end-of-file (wanted `/tmp/testfile')
Back to MasterServer...
Thank you.
You will probably want to use single quotes to pass the remote commands verbatim:
scp file root@${SERVER}:/tmp
ssh ${SERVER} '
while IFS= read -r line; do
echo "$line"
done </tmp/file
'
Ensure you're using </tmp/file, not <</tmp/file. The sequence << is used to start a here-document which is not what you want in this case.
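To see why the double-quoted version misbehaves: the local shell expands $line before ssh ever runs, so the remote loop body ends up echoing a (probably empty) local variable. A quick check (using the host name client from the update above):
# expanded locally: $line is empty when ssh builds the command
ssh client "echo \"got: $line\""
# sent verbatim: the remote shell does the expansion
ssh client 'line=remote; echo "got: $line"'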

linux shell script read from stdin if no file

I am trying to set my Linux shell script to read from a file (which I have working) but if there isn't any file then I need to read from stdin.
The command for reading a file looks like this:
./stats -row test_file
How would I be able to read what the user enters with something like this:
./stats -row 4 2 3 5 3 4 5 3 6 5 6 3 4
When I enter a command like this I get 'no such file or directory'
I broke my script down to the problem I need help with.
#!/bin/sh
INPUT_FILE=$2 #Argument 2 from command line is the input file
exec 5< $INPUT_FILE #assign input file to file descriptor #5
while read -u 5 line #read from file descriptor 5 (input file)
do
echo "$line"
done
exec 5<&- #close file descriptor #5
This also won't work for the input I need.
while read line
do
echo "$line"
done <$2
Inartful Solution
A very inartful if statement will do the trick:
INPUT_FILE=$2 #Argument 2 from command line is the input file
if [ -f "$INPUT_FILE" ]; then
while read -r line
do
echo "$line"
done <"$INPUT_FILE"
else
while read -r line
do
echo "$line"
done
fi
Note: this presumes you are still looking for the filename as the 2nd argument.
Artful Solution
I cannot take credit, but the artful solution was already answered here: How to read from file or stdin in bash?
INPUT_FILE=${2:-/dev/stdin} #Argument 2 from command line is the input file
while read -r line
do
echo "$line"
done <"$INPUT_FILE"
exit 0
I was poking around with a solution like this but missed using the stdin device /dev/stdin as the default for INPUT_FILE. Note this solution is limited to OSes with a proc filesystem.
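With the fallback in place, both styles of invocation from the question behave the same way (a sketch; stats is the asker's script name):
./stats -row test_file              # $2 is set: reads from the named file
printf '4 2 3 5\n' | ./stats -row   # $2 is unset: falls back to /dev/stdin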
In bash scripts, I usually put code that reads from a file (or a pipe) in a function, where the redirection can be separated from the logic.
Also, when reading from a file or from STDIN, it's a good idea for the logic to not care which is which. So, it's best to capture STDIN into a temp file and then the rest of the file reading code is the same.
Here's an example script that reads from ARG 1 or from STDIN, and just counts the lines in the file. It also invokes wc -l on the same input and shows the results from both methods.
#!/bin/bash
# default input is this script
input=$0
# If arg given, read from it
if (( $# > 0 )); then
input=$1
echo 1>&2 "Reading from $input"
else
# otherwise, read from STDIN
# since we're reading twice, need to capture it into
# a temp file
input=/tmp/$$.tmp
cat >$input
trap "rm -f $input" EXIT ERR HUP INT QUIT
echo 1>&2 "Reading from STDIN (saved to $input)"
fi
count_lines() {
local count=0
while read line ; do
let count+=1
done
echo $count
}
lines1=`count_lines <$input`
lines2=`wc -l <$input`
fmt="%15s: %d\n"
printf "$fmt" 'count_lines' $lines1
printf "$fmt" 'wc -l' $lines2
exit
Here are two invocations: one with a file on arg 1, and one with no argument, reading from STDIN:
$ ./t2.sh t2.sh
Reading from t2.sh
count_lines: 35
wc -l: 35
$ ./t2.sh <t2.sh
Reading from STDIN (saved to /tmp/8757.tmp)
count_lines: 35
wc -l: 35

How to read each line of a file 1 at a time in BASH [duplicate]

This question already has answers here:
Looping through the content of a file in Bash
I have the following .txt file:
Marco
Paolo
Antonio
I want to read it line-by-line, and for each line I want to assign a .txt line value to a variable. Supposing my variable is $name, the flow is:
Read first line from file
Assign $name = "Marco"
Do some tasks with $name
Read second line from file
Assign $name = "Paolo"
The following reads a file passed as an argument line by line:
while IFS= read -r line; do
echo "Text read from file: $line"
done < my_filename.txt
This is the standard form for reading lines from a file in a loop. Explanation:
IFS= (or IFS='') prevents leading/trailing whitespace from being trimmed.
-r prevents backslash escapes from being interpreted.
Or you can put it in a bash helper script; example contents:
#!/bin/bash
while IFS= read -r line; do
echo "Text read from file: $line"
done < "$1"
If the above is saved to a script with filename readfile, it can be run as follows:
chmod +x readfile
./readfile filename.txt
If the file isn’t a standard POSIX text file (= not terminated by a newline character), the loop can be modified to handle trailing partial lines:
while IFS= read -r line || [[ -n "$line" ]]; do
echo "Text read from file: $line"
done < "$1"
Here, || [[ -n $line ]] prevents the last line from being ignored if it doesn't end with a \n (since read returns a non-zero exit code when it encounters EOF).
If the commands inside the loop also read from standard input, the file descriptor used by read can be changed to something else (avoiding the standard file descriptors), e.g.:
while IFS= read -r -u3 line; do
echo "Text read from file: $line"
done 3< "$1"
(Non-Bash shells might not know read -u3; use read <&3 instead.)
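A sketch of that portable form:
# feed fd 3 to read via redirection instead of the bash-only -u option
while IFS= read -r line <&3; do
    echo "Text read from file: $line"
done 3< "$1"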
I encourage you to use the -r flag for read which stands for:
-r Do not treat a backslash character in any special way. Consider each
backslash to be part of the input line.
I am citing from man 1 read.
Another thing is to take a filename as an argument.
Here is updated code:
#!/usr/bin/bash
filename="$1"
while read -r line; do
name="$line"
echo "Name read from file - $name"
done < "$filename"
Using the following Bash template should allow you to read one value at a time from a file and process it.
while read name; do
# Do what you want to $name
done < filename
#! /bin/bash
cat filename | while read -r LINE; do
echo "$LINE"
done
Use:
filename=$1
IFS=$'\n'
for next in `cat $filename`; do
echo "$next read from $filename"
done
exit 0
If you have set IFS differently you will get odd results.
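If the loop has to coexist with other code that depends on the default IFS, one option is to save and restore it around the loop (a sketch):
old_IFS=$IFS
IFS=$'\n'                    # split on newlines only
for next in `cat $filename`; do
    echo "$next read from $filename"
done
IFS=$old_IFS                 # restore the original separator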
Many people have posted a solution that's over-optimized. I don't think it is incorrect, but I humbly think that a less optimized solution will be desirable so that everyone can easily understand how it works. Here is my proposal:
#!/bin/bash
#
# This program reads lines from a file.
#
end_of_file=0
while [[ $end_of_file == 0 ]]; do
read -r line
# the last exit status is the
# flag of the end of file
end_of_file=$?
echo "$line"
done < "$1"
If you need to process both the input file and user input (or anything else from stdin), then use the following solution:
#!/bin/bash
exec 3<"$1"
while IFS='' read -r -u 3 line || [[ -n "$line" ]]; do
read -p "> $line (Press Enter to continue)"
done
Based on the accepted answer and on the bash-hackers redirection tutorial.
Here, we open the file descriptor 3 for the file passed as the script argument and tell read to use this descriptor as input (-u 3). Thus, we leave the default input descriptor (0) attached to a terminal or another input source, able to read user input.
For proper error handling:
#!/bin/bash
set -Ee
trap "echo error" EXIT
test -e ${FILENAME} || exit
while read -r line
do
echo ${line}
done < ${FILENAME}
IFS (the internal field separator) is a shell variable in bash that defines the characters used to split input into tokens; by default it contains <space>, <tab>, and <newline>.
Step 1: Load the file data into an array:
# declaring array list and index iterator
declare -a array=()
i=0
# reading file in row mode, insert each line into array
while IFS= read -r line; do
array[i]=$line
let "i++"
# reading from file path
done < "<yourFullFilePath>"
Step 2: Iterate over the array and print the output:
for line in "${array[@]}"
do
echo "$line"
done
Accessing a specific index in the array:
echo "${array[0]}"
The following will just print out the content of the file:
cat "$Path/FileName.txt" | while read -r line; do
echo "$line"
done

How to duplicate stdin into file

I have a sophisticated bash script that uses read -p (which prompts on stderr) very often. And now I need to duplicate all of the script's terminal input into a log file.
tee file.log | script.sh
This command doesn't work properly, because the prompts are no longer shown to the user.
Example:
#!/bin/sh
echo "start"
read -p "input value: " val
echo $val
echo "finish"
Terminal run:
start
input value: 3
3
finish
Tee run:
# tee file.log | ./script.sh
start
3
3
finish
No idea why you're using tee here. What I suspect is happening is that read needs input, so it waits for it, and tee then pipes the 3 through to stdout.
-p prompt
Display prompt, without a trailing newline, before attempting
to read any input. The prompt is displayed only if input is coming from a
terminal.
However, input isn't coming from a tty here, so the prompt is never printed. It still feels very weird to me to use tee here, but you can just use echo -n instead of the -p flag for read and it should work.
#!/bin/sh
echo "start"
echo -n "input value: "
read val
echo $val
echo "finish"
e.g.
> tee file.log | ./abovescript
start
input value: 3
3
finish
> cat file.log
3
Also, I'm not sure how to get tee to terminate properly from inside the script here, so you need to press the Return key at the end, which of course causes a newline.
That said, since it's an extra line each time anyway, that seems worse than just doing echo "$val" >> file.log each time, though a better option would be to use a function:
#!/bin/bash
r() {
read -p "input value: " val
echo "$val" >> file.log
echo "$val"
}
echo "start"
val=$(r)
echo "$val"
echo "finish"
