I want to write a shell script that runs until something is written to a file (by another process). I have written this:
PID_FILE=log.txt
DONE=0
while [$DONE -eq 0]
do
cat $PID_FILE | while read LINE
do
if [$LINE -neq ""]; then
echo "Do stuff here"
$DONE=1
fi
done
done
echo "DONE"
echo "">$PID_FILE
but I get
test.sh: 3: test.sh: [0: not found
DONE
This line:
while [$DONE -eq 0]
Needs spaces around the square brackets:
while [ $DONE -eq 0 ]
As does this one:
if [$LINE -neq ""]; then
Like this:
if [ "$LINE" != "" ]; then
(Note that -neq is not a valid test operator in any case: use != to compare strings, or -ne to compare integers.)
It helps to know that [ is a command. See Why should there be a space after '[' and before ']' in a Bash script for an explanation.
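You can see for yourself that [ is a command of its own, which is why it needs whitespace separating it from its arguments, just like any other command:
$ type [
[ is a shell builtin
$ DONE=0
$ [ "$DONE" -eq 0 ] && echo "still waiting"
still waiting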
I am looking to create a shell script that reads command line arguments, then concatenates the contents of those files and prints them to stdout. I need to verify that the files passed on the command line exist.
I have written some code so far, but the script only works if a single command line argument is passed. If more than one argument is passed, the error checking I have tried does not work.
#!/bin/bash
if [ $# -eq 0 ]; then
echo -e "Usage: concat FILE ... \nDescription: Concatenates FILE(s)
to standard output separating them with divider -----."
exit 1
fi
for var in "$#"
do
if [[ ! -e $# ]]; then
echo "One or more files does not exist"
exit 1
fi
done
for var in "$#"
do
if [ -f $var ]; then
cat $var
echo "-----"
exit 0
fi
done
I need to fix the error checking on this so that every command line argument is checked to be an existing file. If a file does not exist, the error must be printed to stderr and nothing should be printed to stdout.
You have a bug in this line:
if [[ ! -e $# ]]; then
You need to check the given file here, using $var like this:
if [[ ! -e "$var" ]]; then
And you exit prematurely (the exit 0 inside the second loop) - you will always print only a
single file. And remember to always quote your variables, because
otherwise your script will not run correctly on files that have whitespace in their names, for example:
$ echo a line > 'a b'
$ cat 'a b'
a line
$ ./concat.sh 'a b'
cat: a: No such file or directory
cat: b: No such file or directory
-----
You said:
if a file does not exist, the error must be printed to stderr and
nothing should be printed to stdout.
You aren't printing anything to stderr at the moment; if you want to,
you should do:
echo ... >&2
And you should use printf instead of echo as it's more portable
even though you're using Bash.
All in all, your script could look like this:
#!/bin/bash
if [ $# -eq 0 ]; then
printf "Usage: concat FILE ... \nDescription: Concatenates FILE(s) to standard output separating them with divider -----.\n" >&2
exit 1
fi
for var in "$#"
do
if [[ ! -e "$var" ]]; then
printf "One or more files does not exist\n" >&2
exit 1
fi
done
for var in "$#"
do
if [ -f "$var" ]; then
cat "$var"
printf -- "-----\n"
fi
done
exit 0
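If the script above is saved as concat.sh and made executable (the name is just for this example), a quick test could look like this:
$ printf 'first file\n' > one.txt
$ printf 'second file\n' > two.txt
$ ./concat.sh one.txt two.txt
first file
-----
second file
-----
$ ./concat.sh one.txt missing.txt; echo "exit status: $?"
One or more files does not exist
exit status: 1
The error message goes to stderr, so redirecting stdout (./concat.sh one.txt missing.txt > out.txt) leaves the message visible and out.txt empty.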
I can't get my bash script (a command-logging script) to detect any exit code other than 0, so the count of failed commands is never incremented, and the success count is incremented regardless of whether the command failed or succeeded.
Here is the code:
#!/bin/bash
#Script for Homework 8
#Created by Greg Kendall on 5/10/2016
file=$$.cmd
signal() {
rm -f $file
echo
echo "User Aborted by Control-C"
exit
}
trap signal 2
i=0
success=0
fail=0
commands=0
read -p "$(pwd)$" "command"
while [ "$command" != 'exit' ]
do
$command
((i++))
echo $i: "$command" >> $file
if [ "$?" -eq 0 ]
then
((success++))
((commands++))
else
((fail++))
((commands++))
fi
read -p "$(pwd)" "command"
done
if [ "$command" == 'exit' ]
then
rm -f $file
echo commands:$commands "(successes:$success, failures:$fail)"
fi
Any help would be greatly appreciated. Thanks!
That's because echo $i: "$command" always succeeds.
The exit status $? in if [ "$?" -eq 0 ] is actually the exit status of echo, the command that runs immediately before the check.
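You can reproduce this at the prompt: even after a failing command, a successful echo resets $? to 0:
$ false
$ echo "1: false" >> cmdlog.txt
$ echo $?
0
Here 0 is the status of the echo, not of false (cmdlog.txt is just a stand-in for your $file).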
So do the test immediately after the command:
$command
if [ "$?" -eq 0 ]
and do the echo elsewhere.
Or, if you prefer, you don't need the $? check at all; you can run the command and check its status within if directly:
if $command; then .....; else ....; fi
If you do not want to see the command's stdout and stderr:
if $command &>/dev/null; then .....; else ....; fi
Note that, as Charles Duffy mentioned in the comments, you should not run commands from variables like this.
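A quick illustration of why running commands from a variable is fragile: quotes in the stored command are not honored, they are passed through literally:
$ command='echo "hello world"'
$ $command
"hello world"
Anything involving quoting, pipes, or redirections in the typed command will misbehave in the same way.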
Your code is correctly counting the number of times that the echo $i: "$command" command fails. I presume that you would prefer to count the number of times that $command fails. In that case, replace:
$command
((i++))
echo $i: "$command" >> $file
if [ "$?" -eq 0 ]
With:
$command
code=$?
((i++))
echo $i: "$command" >> $file
if [ "$code" -eq 0 ]
Since $? captures the exit code of the previous command, it should be placed immediately after the command whose code we want to capture.
Improvement
To make sure that the value of $? is captured before any other command is run, Charles Duffy suggests placing the assignment on the same line as the command like so:
$command; code=$?
((i++))
echo $i: "$command" >> $file
if [ "$code" -eq 0 ]
This should make it less likely that any future changes to the code would separate the command from the capture of the value of $?.
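Putting it together, the loop body from the original script might then look like this (just a sketch of the same code with the fix applied; commands is incremented once, outside the if, since it happens in both branches anyway):
$command; code=$?
((i++))
echo $i: "$command" >> $file
if [ "$code" -eq 0 ]
then
    ((success++))
else
    ((fail++))
fi
((commands++))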
I have a file with the following format:
123 2 3 48 85.64 85.95
Park ParkName Location
12 2.2 3.2 48 5.4 8.9
Now I would like to write a shell script to extract lines from this file.
The first item on each line is a kind of flag, and for different flags I want to do different processing.
See my code below:
head=`echo "$line" | grep -P "^\d+"`
if [ "$head" != "" ]; then
(do something...)
fi
head=`echo "$line" | grep -P "^[A-Z]+"`
if [ "$head" != "" ]; then
(do something...)
fi
The code works, but I dislike the clumsy way of writing two separate "if" blocks.
I would like to have something simple like:
if [ "$head" != "" ]; then
(do something...)
elif [ "$head" != "" ]; then
(do something...)
fi
Any thoughts?
How about a pure bash solution? Bash has built-in regex matching, which you trigger with the =~ operator.
Be aware, though, that processing huge files line by line with bash's read will not give optimal performance.
#!/bin/bash
file=$1
while read -r line
do
echo "read [$line]"
if [[ $line =~ ^[0-9] ]]; then
echo ' Processing numeric line'
elif [[ $line =~ ^[A-Za-z] ]]; then
echo ' Processing a text line'
fi
done < "$file"
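Assuming the three sample lines are saved in data.txt and the script in script.sh (names chosen just for this example), a run would look like:
$ bash script.sh data.txt
read [123 2 3 48 85.64 85.95]
 Processing numeric line
read [Park ParkName Location]
 Processing a text line
read [12 2.2 3.2 48 5.4 8.9]
 Processing numeric line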
How about this? I guess it would fulfill your requirement:
file
123 2 3 48 85.64 85.95
Park ParkName Location
12 2.2 3.2 48 5.4 8.9
script.sh
while read line
do
echo "$line" | grep -qP "^\d+" && echo "Line starts with Numbers"
echo "$line" | grep -qP "^[a-zA-Z]+" && echo "Line Starts with String"
done < file
Output:
bash script.sh
Line starts with Numbers
Line Starts with String
Line starts with Numbers
My Code:
#!/bin/bash
rm screenlog.0
screen -X stuff 'X21'$(printf \\r)
while :
do
grep -i "T" $screenlog.0
if [ $? -eq 0 ];
then
FILE=/etc/passwd
VAR=`head -n 1 $FILE`
echo $VAR
rm screenlog.0
break
done
This script is supposed to delete the file "screenlog.0" and send a command (X21) to a screen interface.
That's the first part, and it works.
The second part is the problem:
It should test the content of "screenlog.0": if there is something with a "T" in it, save the content into a variable.
The error:
line 11: syntax error near unexpected token `done'
line 11: `done'
To the "screen": Its an screen of an usb device that recive radio messages like this:
T350B00A66E2
H34D04DE4254
The script has to scan for incoming messages that start with "T" (the first letter is a type field, followed by a hex code).
Any ideas for corrections, or other solutions?
I corrected my code a bit:
#!/bin/bash
>screenlog.0
screen -X stuff 'X21'$(printf \\r)
while :
do
sleep 2
grep -i "T" $screenlog.0
if [ $? -eq 0 ];
then
screenlog.0=/etc/passwd
VAR=`head -n 1 $screenlog.0`
echo $VAR
break
fi
done
The new error is:
grep: .0: No such file or directory
Every 5 seconds....
The file screenlog.0 exists... :(
Oh... you missed a fi in your script :). The syntax is: if [ condition ]; then ... fi
For your script:
if [ $? -eq 0 ];then
FILE=/etc/passwd
VAR=`head -n 1 $FILE`
echo $VAR
rm screenlog.0
break
fi
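As for the later error, grep: .0: No such file or directory: that almost certainly comes from $screenlog.0 in grep -i "T" $screenlog.0 (and in head -n 1 $screenlog.0). There is no shell variable named screenlog, so $screenlog expands to nothing and grep is left looking for a file literally called .0. A minimal sketch of the loop with the fi and the file name both fixed could look like this:
#!/bin/bash
>screenlog.0
screen -X stuff 'X21'$(printf \\r)
while :
do
    sleep 2
    if grep -qi "T" screenlog.0; then   # use the literal file name, not $screenlog.0
        VAR=$(head -n 1 screenlog.0)    # grab the first line of the log
        echo "$VAR"
        break
    fi
done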
I need to teach myself bash scripting. I'm reading this ebook and it has the following code:
#!/bin/bash
# hello.sh
# This is my first shell script!
declare -rx SCRIPT="hello.sh"
declare -rx who="/usr/bin/who"
declare -rx sync="/bin/sync"
declare -rx wc="/usr/bin/wc"
# sanity checks
if test -z "$BASH" ; then
printf "$SCRIPT:$LINENO: please run this script with the BASH shell\n" >&2
exit 192
fi
if test ! -x "$who" ; then
printf "$SCRIPT:$LINENO: The command $who is not available - aborting\n" >&2
exit 192
fi
if test ! -x "$sync" ; then
printf "$SCRIPT:$LINENO: the command $sync is not available - aborting\n">&2
exit 192
fi
if test ! -x "$wc" ; then
printf "$SCRIPT:$LINENO: the command $wc is not available - aborting\n" >&2
exit 192
fi
USERS = `$who | $wc -l`
if [ $USERS -eq 0 ] ; then
$sync
fi
exit 0
When I run it, I get the following error:
hello.sh: line 32: USERS: command not found
hello.sh: line 33: [: -eq: unary operator expected
I don't really know what I'm doing wrong. Am I not allowed to assign USERS the output of a command in that fashion? If I run that line at the command line, it doesn't work either. Any ideas?
Thanks
remove the spaces around the assignment =:
USERS=`$who | $wc -l`
Otherwise it will be interpreted as a command USERS with the arguments = and the output of `$who | $wc -l`.
Replace
USERS = `$who | $wc -l`
with
USERS=`$who | $wc -l`
In Bash (in fact, in many shells) you can't have spaces around the = in an assignment,
so in this case you need to write
USERS=`command`
or
USERS=$(command)
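You can reproduce the difference at an interactive prompt:
$ USERS = $(who | wc -l)
bash: USERS: command not found
$ USERS=$(who | wc -l)    # no spaces around =
$ echo "$USERS"           # prints the number of logged-in users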
A variable sometimes acts like a C++ macro. If the variable USERS is empty and you write this:
if [ $USERS -eq 0 ] ; then
it will be interpreted like
if [ -eq 0 ] ; then
and -eq is not a unary operator. To make it right you need to write:
if [ "$USERS" -eq 0 ] ; then
so that it is interpreted as
if [ "" -eq 0 ] ; then
(which still fails, but with a clearer "integer expression expected" error instead of the confusing unary-operator one).