Linux shell script: read from stdin if no file

I am trying to set my Linux shell script to read from a file (which I have working) but if there isn't any file then I need to read from stdin.
The command for reading a file looks like this:
./stats -row test_file
How would I be able to read what the user enters with something like this:
./stats -row 4 2 3 5 3 4 5 3 6 5 6 3 4
When I enter a command like this I get 'no such file or directory'
I broke my script down to the problem I need help with.
#!/bin/sh
INPUT_FILE=$2            # argument 2 from the command line is the input file
exec 5< "$INPUT_FILE"    # assign the input file to file descriptor 5
while read -u 5 line     # read from file descriptor 5 (the input file)
do
    echo "$line"
done
exec 5<&-                # close file descriptor 5
This also won't work for the input I need.
while read line
do
    echo "$line"
done <"$2"

Inartful Solution
A very in-artful if statement will do the trick:
INPUT_FILE=$2    # argument 2 from the command line is the input file
if [ -f "$INPUT_FILE" ]; then
    while read -r line
    do
        echo "$line"
    done <"$INPUT_FILE"
else
    while read -r line
    do
        echo "$line"
    done
fi
Note: this presumes you are still looking for the filename as the 2nd argument.
Artful Solution
I cannot take credit, but the artful solution was already answered here: How to read from file or stdin in bash?
INPUT_FILE=${2:-/dev/stdin}    # argument 2 from the command line is the input file
while read -r line
do
    echo "$line"
done <"$INPUT_FILE"
exit 0
I was picking around with a solution like this but missed using the stdin device /dev/stdin as the default for INPUT_FILE. Note that this solution is limited to operating systems with a proc filesystem.
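For the original ./stats script, that means both invocation styles work (a usage sketch, assuming the loop above is the body of stats):
./stats -row test_file                           # $2 is the file, read from it
echo "4 2 3 5 3 4 5 3 6 5 6 3 4" | ./stats -row  # $2 is empty, falls back to /dev/stdin
./stats -row                                     # type values, then press Ctrl-D to finish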

In bash scripts, I usually put code that reads from a file (or a pipe) in a function, where the redirection can be separated from the logic.
Also, when reading from a file or from STDIN, it's a good idea for the logic not to care which is which. So it's best to capture STDIN into a temp file, and then the rest of the file-reading code is the same.
Here's an example script that reads from ARG 1 or from STDIN, and just counts the lines in the file. It also invokes wc -l on the same input and shows the results from both methods.
#!/bin/bash
# default input is this script
input=$0
# If an arg is given, read from it
if (( $# > 0 )); then
    input=$1
    echo 1>&2 "Reading from $input"
else
    # otherwise, read from STDIN
    # since we're reading twice, need to capture it into
    # a temp file
    input=/tmp/$$.tmp
    cat >$input
    trap "rm -f $input" EXIT ERR HUP INT QUIT
    echo 1>&2 "Reading from STDIN (saved to $input)"
fi
count_lines() {
    local count=0
    while read line ; do
        let count+=1
    done
    echo $count
}
lines1=`count_lines <$input`
lines2=`wc -l <$input`
fmt="%15s: %d\n"
printf "$fmt" 'count_lines' $lines1
printf "$fmt" 'wc -l' $lines2
exit
Here are two invocations: one with a file on arg 1, and one with no argument, reading from STDIN:
$ ./t2.sh t2.sh
Reading from t2.sh
count_lines: 35
wc -l: 35
$ ./t2.sh <t2.sh
Reading from STDIN (saved to /tmp/8757.tmp)
count_lines: 35
wc -l: 35

Related

Tee command basic behavior

I want to simulate the behavior of the tee command in a shell script by using a while loop and read, to see how the command might work internally.
Not sure what you're asking, but for a simplistic example, try this -
file=$1                # take an argument that specifies the file to write into
exec 3>&1              # create a dup of stdout
while read line        # now for each line of input
do  echo "$line" >&3   # send a copy to the dup of stdout
    echo "$line"       # and a copy into the specified file
done > "$file"         # this is the file redirection for the loop
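To try it (assuming the snippet above is saved as mytee.sh, a name chosen here for illustration):
$ printf 'one\ntwo\n' | sh mytee.sh copy.txt
one
two
$ cat copy.txt
one
two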

How to portably read a text file line by line in bash

For processing a text file in bash line by line, I usually implement a while loop like this:
function doSomething() {
    local inputFile="$1"
    local fd=""
    local line=""
    exec {fd}<"$inputFile"    # open file; bash picks a free descriptor
    echo "Opened ${inputFile} for read using descriptor ${fd}"
    while IFS='' read -r -u $fd line || [[ -n "$line" ]]; do
        echo "read = \"$line\""
    done
    exec {fd}<&-              # close file
    return 0
}
This works on my Linux but unfortunately not in OSX. For OSX I currently have to change the code to something like this:
exec 3<"$inputFile"    # open file
while IFS='' read -r -u 3 line || [[ -n "$line" ]]; do
    echo "read = \"$line\""
done
exec 3<&-              # close file
But this has the disadvantage that I have to manage the file descriptor numbers myself (in the first script, I let bash choose an available file descriptor number).
Did someone have a solution for this which works for both Linux and OSX?
Note that for some reason, I don't want to use piping or I/O redirection to the complete loop like this (because I don't want to execute the loop in a different process):
while IFS='' read -r line || [[ -n "$line" ]]; do
    echo "read = \"$line\""
done < "$inputFile"
The last loop will not fork a new process. You can verify that by printing "$BASHPID" inside and outside of the loop.
New processes are only created for pipelines. Simple redirections are handled by temporary dups within the bash process.
Feel free to use standard stdin/stdout redirection. It's no more expensive than redirection done with the exec builtin.
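You can see this for yourself (a quick test; note that $BASHPID requires bash 4+, so it won't be set in the stock OSX bash 3.2):
echo "outside: $BASHPID"
while read -r line; do
    echo "redirection: $BASHPID"   # same PID as outside: no new process
done <<< "x"
echo "x" | while read -r line; do
    echo "pipeline: $BASHPID"      # different PID: the loop runs in a subshell
done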

How to read each line of a file 1 at a time in BASH [duplicate]

This question already has answers here:
Looping through the content of a file in Bash
(16 answers)
Closed 2 years ago.
I have the following .txt file:
Marco
Paolo
Antonio
I want to read it line-by-line, and for each line I want to assign a .txt line value to a variable. Supposing my variable is $name, the flow is:
Read first line from file
Assign $name = "Marco"
Do some tasks with $name
Read second line from file
Assign $name = "Paolo"
The following reads a file passed as an argument line by line:
while IFS= read -r line; do
    echo "Text read from file: $line"
done < my_filename.txt
This is the standard form for reading lines from a file in a loop. Explanation:
IFS= (or IFS='') prevents leading/trailing whitespace from being trimmed.
-r prevents backslash escapes from being interpreted.
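A quick demonstration of what each one guards against (an illustrative test, not part of the original answer):
$ printf '   indented \\here\n' | while read line; do echo "[$line]"; done
[indented here]
$ printf '   indented \\here\n' | while IFS= read -r line; do echo "[$line]"; done
[   indented \here]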
Or you can put it in a bash file helper script, example contents:
#!/bin/bash
while IFS= read -r line; do
    echo "Text read from file: $line"
done < "$1"
If the above is saved to a script with filename readfile, it can be run as follows:
chmod +x readfile
./readfile filename.txt
If the file isn’t a standard POSIX text file (= not terminated by a newline character), the loop can be modified to handle trailing partial lines:
while IFS= read -r line || [[ -n "$line" ]]; do
    echo "Text read from file: $line"
done < "$1"
Here, || [[ -n $line ]] prevents the last line from being ignored if it doesn't end with a \n (since read returns a non-zero exit code when it encounters EOF).
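For example, with a file whose last line lacks a newline (a scratch file created here for illustration):
$ printf 'first\nsecond (no newline)' > /tmp/partial.txt
$ while IFS= read -r line; do echo "got: $line"; done < /tmp/partial.txt
got: first
$ while IFS= read -r line || [[ -n "$line" ]]; do echo "got: $line"; done < /tmp/partial.txt
got: first
got: second (no newline)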
If the commands inside the loop also read from standard input, the file descriptor used by read can be changed to something else (avoiding the standard file descriptors), e.g.:
while IFS= read -r -u3 line; do
    echo "Text read from file: $line"
done 3< "$1"
(Non-Bash shells might not know read -u3; use read <&3 instead.)
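Spelled out, that portable variant looks like this (a sketch of the suggested substitution):
while IFS= read -r line <&3; do
    echo "Text read from file: $line"
done 3< "$1"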
I encourage you to use the -r flag for read, which means:
-r Do not treat a backslash character in any special way. Consider each
backslash to be part of the input line.
I am citing from man 1 read.
Another thing is to take a filename as an argument.
Here is updated code:
#!/usr/bin/bash
filename="$1"
while read -r line; do
    name="$line"
    echo "Name read from file - $name"
done < "$filename"
Using the following Bash template should allow you to read one value at a time from a file and process it.
while read name; do
    # Do what you want to $name
done < filename
#!/bin/bash
cat filename | while read LINE; do
    echo "$LINE"
done
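One caveat with this cat-pipeline form (an added note, in line with the $BASHPID discussion earlier on this page): the loop body runs in a subshell, so variable assignments made inside it do not survive the loop:
count=0
cat filename | while read -r LINE; do
    count=$((count+1))    # increments a copy in the subshell
done
echo "$count"             # still prints 0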
Use:
filename=$1
IFS=$'\n'
for next in `cat $filename`; do
    echo "$next read from $filename"
done
exit 0
If you have set IFS differently you will get odd results.
Many people have posted a solution that's over-optimized. I don't think it is incorrect, but I humbly think that a less-optimized solution may be desirable so that everyone can easily understand how it works. Here is my proposal:
#!/bin/bash
#
# This program reads lines from a file.
#
end_of_file=0
while [[ $end_of_file == 0 ]]; do
    read -r line
    # the last exit status is the
    # flag for the end of file
    end_of_file=$?
    echo "$line"
done < "$1"
If you need to process both the input file and user input (or anything else from stdin), then use the following solution:
#!/bin/bash
exec 3<"$1"
while IFS='' read -r -u 3 line || [[ -n "$line" ]]; do
    read -p "> $line (Press Enter to continue)"
done
Based on the accepted answer and on the bash-hackers redirection tutorial.
Here, we open file descriptor 3 for the file passed as the script argument and tell read to use that descriptor as input (-u 3). The default input descriptor (0) thus stays attached to the terminal (or whatever the input source is) and remains able to read user input.
For proper error handling:
#!/bin/bash
set -Ee
trap "echo error" ERR
test -e "${FILENAME}" || exit
while read -r line
do
    echo "${line}"
done < "${FILENAME}"
Use the IFS (internal field separator) variable in bash; it defines the characters used to separate input into tokens, and by default includes <space>, <tab>, and <newline>.
Step 1: load the file data into an array:
# declare the array and an index iterator
declare -a array=()
i=0
# read the file in row mode, inserting each line into the array
while IFS= read -r line; do
    array[i]=$line
    let "i++"
# reading from the file path
done < "<yourFullFilePath>"
Step 2: iterate over the array and print the output:
for line in "${array[@]}"
do
    echo "$line"
done
To access a specific index in the array:
echo "${array[0]}"
The following will just print out the content of the file:
cat "$Path/FileName.txt" | while read line; do
    echo "$line"
done

read not prompting when i/p redirected from a file

I have this:
while read -r line; do echo "hello $line"; read -p "Press any key" -n 1; done < file
hello This is line 1
hello his is line 2
hello his is line 3
hello his is line 4
hello his is line 5
hello his is line 6
hello his is line 7
Why do I not see the prompt "Press any key" ?
Quote from man bash:
-p prompt
    Display prompt on standard error, without a trailing newline,
    before attempting to read any input. The prompt is displayed
    only if input is coming from a terminal.
So, because you read lines from a file and not from a terminal, the prompt is not displayed.
As others mentioned, you don't see the prompt because bash only prints the prompt when stdin is a terminal. In your case, stdin is a file.
But there's a bigger bug here: It seems to me that you want to read from two places: a file and the user. You'll have to do some redirection magic to accomplish this:
# back up stdin
exec 3<&0
# read each line of a file. the IFS="" prevents read from
# stripping leading and trailing whitespace in the line
while IFS="" read -r line; do
    # use printf instead of echo because ${line} might have
    # backslashes in it which some versions of echo treat
    # specially
    printf '%s\n' "hello ${line}"
    # prompt the user by reading from the original stdin
    read -p "Press any key" -n 1 <&3
done <file
# done with the stdin backup, so close the file descriptor
exec 3<&-
Note that the above code won't work with /bin/sh because it's not POSIX compliant. You'll have to use bash. I'd recommend making it POSIX compliant by changing the line that prompts the user:
printf 'Press enter to continue' >&2
read <&3
You may explicitly read from the controlling terminal /dev/tty:
while IFS="" read -r line; do
    echo "hello $line"
    read -p "Press any key" -n 1 </dev/tty
done < file

Bash script does not continue to read the next line of file

I have a shell script that saves the output of a command that is executed to a CSV file. It reads the commands it has to execute from a file of commands in this format:
ffmpeg -i /home/test/videos/avi/418kb.avi /home/test/videos/done/418kb.flv
ffmpeg -i /home/test/videos/avi/1253kb.avi /home/test/videos/done/1253kb.flv
ffmpeg -i /home/test/videos/avi/2093kb.avi /home/test/videos/done/2093kb.flv
You can see each line is an ffmpeg command. However, the script just executes the first line. Just a minute ago it was doing nearly all of the commands. It was missing half for some reason. I edited the text file that contained the commands and now it will only do the first line. Here is my bash script:
#!/bin/bash
# Shell script utility to read a file line by line.
# Once a line is read it will be run through the processLine() function
# Function processLine
processLine(){
    line="$@"
    START=$(date +%s.%N)
    eval $line > /dev/null 2>&1
    END=$(date +%s.%N)
    DIFF=$(echo "$END - $START" | bc)
    echo "$line, $START, $END, $DIFF" >> file.csv 2>&1
    echo "It took $DIFF seconds"
    echo $line
}
# Store file name
FILE=""
# Get file name as command line argument,
# else read it from the standard input device
if [ "$1" == "" ]; then
    FILE="/dev/stdin"
else
    FILE="$1"
    # make sure the file exists and is readable
    if [ ! -f "$FILE" ]; then
        echo "$FILE : does not exist"
        exit 1
    elif [ ! -r "$FILE" ]; then
        echo "$FILE: cannot be read"
        exit 2
    fi
fi
# read $FILE using the file descriptors
# Set loop separator to end of line
BAKIFS=$IFS
IFS=$(echo -en "\n\b")
exec 3<&0
exec 0<"$FILE"
while read line
do
    # use the $line variable to process the line in the processLine() function
    processLine $line
done
exec 0<&3
# restore $IFS which was used to determine what the field separators are
IFS=$BAKIFS
exit 0
Thank you for any help.
UPDATE 2
It's the ffmpeg commands rather than the shell script that weren't working. But I should have been using just "\b", as Paul pointed out. I am also making use of Johannes's shorter script.
I think that should do the same and seems to be correct:
#!/bin/bash
CSVFILE=/tmp/file.csv
cat "$@" | while read line; do
    echo "Executing '$line'"
    START=$(date +%s)
    eval $line &> /dev/null
    END=$(date +%s)
    let DIFF=$END-$START
    echo "$line, $START, $END, $DIFF" >> "$CSVFILE"
    echo "It took ${DIFF}s"
done
no?
ffmpeg reads STDIN and exhausts it. The solution is to call ffmpeg with:
ffmpeg </dev/null ...
See the detailed explanation here: http://mywiki.wooledge.org/BashFAQ/089
Update:
Since ffmpeg version 1.0, there is also the -nostdin option, so this can be used instead:
ffmpeg -nostdin ...
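Applied to a command-running loop like the one in the question, either form keeps ffmpeg from swallowing the rest of the command list (a sketch; commands.txt stands in for your command file):
while read -r line; do
    eval "$line" </dev/null    # stdin is /dev/null, so ffmpeg cannot consume the loop's input
done < commands.txt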
I just had the same problem.
I believe ffmpeg is responsible for this behaviour.
My solution for this problem:
1) Call ffmpeg with an "&" at the end of your ffmpeg command line.
2) Since the script will now not wait for the ffmpeg process to complete, we have to prevent the script from starting several ffmpeg processes at once. We achieve this by delaying each loop pass while there is at least one running ffmpeg process.
#!/bin/bash
cat FileList.txt |
while read VideoFile; do
    <place your ffmpeg command line here> &
    FFMPEGStillRunning="true"
    while [ "$FFMPEGStillRunning" = "true" ]; do
        Process=$(ps -C ffmpeg | grep -o -e "ffmpeg")
        if [ -n "$Process" ]; then
            FFMPEGStillRunning="true"
        else
            FFMPEGStillRunning="false"
        fi
        sleep 2s
    done
done
I would add an echo before the eval to see what it's about to run (in case it's treating the whole file as one long line), and one after (in case one of the ffmpeg commands is taking forever).
Unless you are planning to read something from standard input after the loop, you don't need to preserve and restore the original standard input (though it is good to see you know how).
Similarly, I don't see a reason for dinking with IFS at all. There is certainly no need to restore the value of IFS before exit - this is a real shell you are using, not a DOS BAT file.
When you do:
read var1 var2 var3
the shell assigns the first field to $var1, the second to $var2, and the rest of the line to $var3. In the case where there's just one variable - your script, for example - the whole line goes into the variable, just as you want it to.
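For example (a quick illustration):
$ read var1 var2 var3 <<< "alpha beta gamma delta"
$ printf '%s\n' "$var1" "$var2" "$var3"
alpha
beta
gamma delta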
Inside the process line function, you probably don't want to throw away error output from the executed command. You probably do want to think about checking the exit status of the command. The echo with error redirection is ... unusual, and overkill. If you're sufficiently sure that the commands can't fail, then go ahead with ignoring the error. Is the command 'chatty'; if so, throw away the chat by all means. If not, maybe you don't need to throw away standard output, either.
The script as a whole should probably diagnose when it is given multiple files to process since it ignores the extraneous ones.
You could simplify your file handling by using just:
cat "$#" |
while read line
do
processline "$line"
done
The cat command automatically reports errors (and continues after them) and processes all the input files, or reads standard input if there are no arguments left. The use of double quotes around the variable means that it is passed as a single unit (and therefore unparsed into separate words).
The use of date and bc is interesting - I'd not seen that before.
All in all, I'd be looking at something like:
#!/bin/bash
# Time execution of commands read from a file, line by line.
# Log commands and times to CSV logfile "file.csv"
processLine(){
    START=$(date +%s.%N)
    eval "$@" > /dev/null
    STATUS=$?
    END=$(date +%s.%N)
    DIFF=$(echo "$END - $START" | bc)
    echo "$line, $START, $END, $DIFF, $STATUS" >> file.csv
    echo "${DIFF}s: $STATUS: $line"
}
cat "$@" |
while read line
do
    processLine "$line"
done
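Assuming the script above is saved as time-commands.sh (an illustrative name), both input styles work, since cat with no arguments reads standard input:
$ ./time-commands.sh commands.txt
$ ./time-commands.sh < commands.txt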
