Linux device which doesn't allow writing to it

I have a script which writes warnings to a separate file (its name is passed as an argument). I want to make this script fail if there is a warning.
So, I need to pass some file name which will raise an error if anyone tries to write to it.
I want to use a more or less idiomatic name (to include it in the man page of my script).
So, let's say my script is
# myScript.sh
echo "Hello" > $1
If I call it with
./myScript.sh /dev/stdin
it does not fail, because /dev/stdin is not a read-only device (surprisingly!)
After
./myScript.sh /
it fails, as I want it to (because / is a directory; you can't write there). But it is not idiomatic.
Is there some pretty way to do it?

if [ -w "$1" ]
then
echo "$Hello" > "$1" # Mind the double-quotes
fi
is what you're looking for. The version below would be even better if you only ever pass one argument.
if [ -w "$*" ]
then
echo "$Hello" > "$*" # Mind the double-quotes
fi
"$*" is used to accommodate nonstandard file names: it combines multiple arguments into a single word.
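For illustration, a minimal sketch of how that check could be wired into the original script so that an unwritable target makes it exit non-zero (the error message and the exit code are my additions, not part of the answer above):
# myScript.sh (sketch)
if [ -w "$1" ]
then
    echo "Hello" > "$1"
else
    echo "cannot write to $1" >&2
    exit 1
fi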

Related

Am I setting this script up correctly to run specific commands based on user input?

I have a small script that I am working on. This is only the second script that I have written using bash.
Basically, what I want this script to do is take the user's input and fire a command based on that choice.
As you can see, the user first enters the host address of the instance they are going to ssh into and ultimately tail logs on. There are a couple of things that I am not understanding.
If / Then / Else / Elif - The concept seems simple enough but perhaps how these should be used eludes me.
When I run my script through a bash parser, the parser comes back with the following message:
Line 2:
if [ "$mainmenuinput" = "1" ]; then
^-- SC2154: mainmenuinput is referenced but not assigned.
mainmenu() {
if [ "$mainmenuinput" = "1" ]; then
ssh "$customerurl" tail -f /data/jirastudio/jira/j2ee_*/log/main/current
elif [ "$mainmenuinput" = "2" ]; then
ssh "$customerurl" tail -f /data/jirastudio/confluence/j2ee_*/log/main/current
elif [ "$mainmenuinput" = "3" ]; then
ssh "$customerurl" tail -f /data/jirastudio/horde/service/log/main/current
elif [ "$mainmenuinput" = "4" ]; then
ssh "$customerurl" tail -f /data/jirastudio/apache/logs/access_log
fi
}
printf "\nEnter the customers host URL:\n"
read -r customerurl
printf "Press 1 for JIRA\n"
printf "Press 2 for Confluence\n"
printf "Press 3 for Horde\n"
printf "Press 4 for Apache Access\n"
printf "Press 5 for Apache Error\n"
read -p -r "Make your choice:" "$mainmenuinput"
Looking up the SC2154 entry I found that it means this:
ShellCheck has noticed that you reference a variable that is not assigned. Double check that the variable is indeed assigned, and that the name is not misspelled.
I am a little confused on what that means. If someone can explain that, I would greatly appreciate it.
As it stands, when I run the script, it pauses to wait for the user to enter the host address. The user hits ENTER and the script then presents them with the menu to have them choose which log they want to tail. The menu looks a little odd:
Press 1 for JIRA
Press 2 for Confluence
Press 3 for Horde
Press 4 for Apache Access
Press 5 for Apache Error
-r
I'm not sure why the -r is showing up at the end of the menu. When a selection is made, the script ends and outputs this:
./tail_logs.sh: line 23: read: `Make your choice:': not a valid identifier
Any help with this would be appreciated, or at least a push in the right direction. I love figuring this stuff out, but sometimes it's helpful to get shoved at least in the general direction of the error/resolution.
Thanks
EDIT 1
OK, I updated my script with your suggestions. It still seemed to balk at a few things. For example:
(mainmenu "$customerurl" "$mainmenuinput")
Using ShellCheck I got back this:
Line 1:
(mainmenu "$customerurl" "$mainmenuinput") {
^-- SC2154: customerurl is referenced but not assigned.
^-- SC2154: mainmenuinput is referenced but not assigned.
^-- SC1070: Parsing stopped here. Mismatched keywords or invalid parentheses?
If I write this out like:
mainmenu() { then it does not complain. Also, if I run the script with it typed out as per the suggested way, I get an error about `syntax error near unexpected token '{'`.
The current code looks like this:
#!/bin/sh
mainmenu() {
echo "$1"
echo "$2"
if [ "$2" = "1" ]; then
ssh "$1" tail -f "/data/jirastudio/jira/j2ee_*/log/main/current"
elif [ "$2" = "2" ]; then
ssh "$1" tail -f "/data/jirastudio/confluence/j2ee_*/log/main/current"
elif [ "$2" = "3" ]; then
ssh "$1" tail -f "/data/jirastudio/horde/service/log/main/current"
elif [ "$2" = "4" ]; then
ssh "$1" tail -f "/data/jirastudio/apache/logs/access_log"
elif [ "$2" > 4 || < 1 ]; then
echo "Uh uh uh, you didnt say the magic word! The number you picked isnt in the list. Pick again."
fi
}
echo
echo "Enter the customers host address:"
read -r customerurl
echo "Press 1 for JIRA"
echo "Press 2 for Confluence"
echo "Press 3 for Horde"
echo "Press 4 for Apache Access"
echo "Press 5 for Apache Error"
read -r -p "Pick a number: " mainmenuinput
I get no errors when running this. But when I make a selection, the script ends and does not output the tail command at all. Also, I'm not sure if I am validating user input outside of 1-4 correctly with the last elif statement, although if I change this to else I get an error when I run the script.
I think my issue is in the first part of the function?
mainmenu() {
echo "$1"
echo "$2"
Without having $hostAddress and $mainMenuInput, does the script not know what should be assigned to $1 and $2, or does it automatically assign the first thing typed in to these variables?
The main problems are with the read command at the end. First, whatever immediately follows the -p option is used as a prompt string; in this case, the next argument is "-r", so it prints that as a prompt. You clearly want "Make your choice:" to be the prompt, so that must go immediately after -p (i.e. use either read -r -p "Make your choice:" ... or read -p "Make your choice:" -r ...). Second, when you use $mainmenuinput, it replaces that with the current value of mainmenuinput. In the shell, you use $variable to get the value of a variable, not to set it. With both of these problems corrected, the last command becomes:
read -p "Make your choice:" -r mainmenuinput
There's also another important thing: after reading the user's input, you need to actually call the mainmenu function. So just add mainmenu as the last line.
As for the if ... then ... elif ... structure, yours looks fine; I'm not sure what the question is. Although personally I'd add an else clause that printed an error that the option was not valid.
I do have some stylistic/best practice recommendations, though:
It's best to pass information to functions in the form of arguments, rather than global variables. That is, rather than using customerurl and mainmenuinput directly in the function, pass them as arguments (mainmenu "$customerurl" "$mainmenuinput"), then reference those arguments ("$1" and "$2") inside the functions. This doesn't matter much in a small script like this, but having clear distinctions between the variables used by different parts of a program makes things much easier to keep straight in larger programs.
In shell scripts, printf is the best way to do complex things like printing lines without a linefeed at the end, or translating escape characters... but if you're just doing a standard print-a-line-with-a-linefeed-at-the-end, echo is simpler. Thus, I'd replace the various printf "something\n" commands with echo "something", and printf "\nEnter the customers host URL:\n" with:
echo
echo "Enter the customers host URL:"
In the command
ssh "$customerurl" tail -f /data/jirastudio/jira/j2ee_*/log/main/current
(or ssh "$1" ... if you follow my recommendation about arguments instead of global variables), the wildcard (*) will be expanded on the local computer before being handed to ssh and passed to the remote computer to be executed. It'd be best to quote that argument to prevent that:
ssh "$customerurl" tail -f "/data/jirastudio/jira/j2ee_*/log/main/current"
Note that the quotes will be removed before it's passed to ssh and then to the remote computer, so they will not prevent the wildcard from being expanded on the remote computer.
The thing you're calling a URL isn't actually a URL; URLs are things like "https://stackoverflow.com/questions". They start with a protocol (or "scheme") like "http" or "ftp", then "://", then a server name, then "/", etc. ssh just takes a raw server name (optionally with a username, in the form user@server).
Update, based on EDIT 1: I wasn't clear on how to call the function; your definition (using mainmenu() { ...) is correct, but having defined the function you then need to actually run it. To do this, change the end of the script to something like this:
...
echo "Press 5 for Apache Error"
read -r -p "Pick a number: " mainmenuinput
mainmenu "$customerurl" "$mainmenuinput"
This will run the function, with the first argument ($1) set to "$customerurl", and second argument ($2) set to "$mainmenuinput".
There's also a problem with the elif clause you added in the function. The shell's syntax for test expressions is really, really weird (mostly for historical reasons). Also, there are three common variants: the original [ ... ] (which is actually a command), which has the weirdest syntax; bash's [[ ... ]] variant (much cleaner syntax, but not available in generic POSIX shells); and (( ... )) (cleaner syntax, math- rather than string-oriented, not portable). See BashFAQ #31 for details.
For what you're trying to do, any of these would work:
elif [ "$2" -gt 4 -o "$2" -lt 1 ]; then
# [ ... ] doesn't use || or &&, and uses -lt etc for numeric comparisons.
# < and > do string comparisons, which are ... different. And you'd
# need to quote them to keep them from being mistaken for redirects.
# Also, you need to specify the "$2" explicitly for each comparison.
elif [[ "$2" -gt 4 || "$2" -lt 1 ]]; then
# [[ ... ]] uses || and &&, but still uses -lt etc for numeric comparisons.
# < and > still do string comparisons, but don't need to be quoted
elif (( $2 > 4 || $2 < 1 )); then
# All numeric here, so < and > work
But there's still a problem, since the user might have entered something that isn't a number at all (just pressed return, typed "wibble", etc.), and in all of these cases numeric comparison will fail. Solution: skip the test, and use else instead of elif:
...
elif [ "$2" = "4" ]; then
ssh "$1" tail -f "/data/jirastudio/apache/logs/access_log"
else
echo "Uh uh uh, you didnt say the magic word! The number you picked isnt in the list. Pick again."
fi
}
... that way, if any of the previous conditions aren't met for any reason at all, it'll print the error message.
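Putting the suggestions together, a minimal sketch of the corrected script could look like this (the paths, menu text, and prompt are the OP's; I've used #!/bin/bash since read -p isn't guaranteed under a plain /bin/sh, and option 5 simply falls through to the else clause because the OP's code has no handler for it):
#!/bin/bash
mainmenu() {
    if [ "$2" = "1" ]; then
        ssh "$1" tail -f "/data/jirastudio/jira/j2ee_*/log/main/current"
    elif [ "$2" = "2" ]; then
        ssh "$1" tail -f "/data/jirastudio/confluence/j2ee_*/log/main/current"
    elif [ "$2" = "3" ]; then
        ssh "$1" tail -f "/data/jirastudio/horde/service/log/main/current"
    elif [ "$2" = "4" ]; then
        ssh "$1" tail -f "/data/jirastudio/apache/logs/access_log"
    else
        echo "The number you picked isn't in the list. Pick again."
    fi
}

echo
echo "Enter the customer's host address:"
read -r customerurl
echo "Press 1 for JIRA"
echo "Press 2 for Confluence"
echo "Press 3 for Horde"
echo "Press 4 for Apache Access"
echo "Press 5 for Apache Error"
read -r -p "Pick a number: " mainmenuinput
mainmenu "$customerurl" "$mainmenuinput"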

Confused about use of return status code in shell script?

In a book I'm reading the below line
ls "$1" 2>/dev/null | grep "$1" 2>/dev/null 1>&2
when written in a script. The book says: "The command is executed to check whether the file passed as the command line argument exists. The standard error is redirected to /dev/null (the unix black hole), and standard output is redirected to standard error by using 1>&2. Thus, the command does not produce any output or error message; its only purpose is to set the command return status value $?."
But running the code:
if [ $? -eq 0 ]
would I not know it otherwise? I have tried it without the command at the beginning and with it as well, with no impact on the results. I'm sure the author would have written it for some purpose; I just cannot figure out what.
This looks like a very bad book, giving code that no one sane would ever write, to poorly illustrate concepts that are generally used in completely different ways in shell scripts.
The line:
ls "$1" 2>/dev/null | grep "$1" 2>/dev/null 1>&2
is as described -- it has no visible effect other than setting the return code. Is your question about what this does in detail to get a return code or something else?
The line:
if [ $? -eq 0 ]
is an incomplete fragment that checks the return code of the previous command. It's incomplete as there is no then or fi, without which the shell will reject it as a syntax error and not do anything (if you type the above at a prompt, you'll get the secondary prompt, telling you the shell is waiting for more input to get a complete command). So without more code there's no apparent effect. Something more complete like:
if [ $? -eq 0 ]; then echo YES; else echo NO; fi
would output YES or NO based on that return code.
A more sensible way of doing the 6 lines starting with the ls would be:
if [ ! -e "$1" ]; then
echo "$1: not found"
exit 1
fi
As to what the ls line actually does, it runs ls (list files) with the name in $1 as an argument, then uses grep to search that listing for the same filename.
So if the file does not exist, ls gives an error and outputs nothing, so the grep fails (setting $? to 1). If the filename exists and is not a directory, the grep will succeed (setting $? to 0). Finally, if the filename exists and is a directory, it will search the contents of that directory, looking for any file or subdirectory with the same name as a substring -- which is probably just a bug. In addition, if $1 is a string beginning with -, it will do something fairly useless and unpredictable.
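To see that concretely, here is a quick demonstration of the same pipeline; the paths are only examples (/etc/passwd stands in for an existing regular file, /no/such/file for a missing one):
ls /no/such/file 2>/dev/null | grep /no/such/file 2>/dev/null 1>&2
echo $?    # 1 -- ls printed nothing, so grep matched nothing
ls /etc/passwd 2>/dev/null | grep /etc/passwd 2>/dev/null 1>&2
echo $?    # 0 -- ls listed the file and grep matched the name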
Overall, a prime example of a shell script that should never be written -- any student that turned in such a monstrosity should get an immediate F.

zsh script [process complete] not returning back to shell

I wrote a zsh function to help me do some grepping at my job.
function rgrep (){
if [ -n "$1" ] && [ -n "$2" ]
then
exec grep -rnw $1 -r $2
elif [ -n "$1" ]
then
exec grep -rnw $1 -r "./"
else
echo "please enter one or two args"
fi
}
Works great; however, after grep finishes executing I don't get thrown back into the shell. It just hangs at [process complete]. Any ideas?
I have the function in my .zshrc
In addition to getting rid of the unnecessary exec, you can remove the if statement as well.
function rgrep (){
grep -rwn "${1:?please enter one or two args}" -r "${2:-./}"
}
If $1 is not set (or null valued), an error will be raised and the given message displayed. If $2 is not set, a default value of ./ will be used in its place.
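A quick usage sketch of that version (the pattern TODO and the directory src/ are just hypothetical examples):
rgrep TODO          # searches ./ recursively for the whole word TODO
rgrep TODO src/     # searches src/ instead
rgrep               # aborts with "please enter one or two args"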
Do not use exec, as it replaces the existing shell.
exec [-cl] [-a name] [command [arguments]]
If command is supplied, it replaces the shell without creating a new process. If the -l option is supplied, the shell places a dash at the beginning of the zeroth argument passed to command. This is what the login program does. The -c option causes command to be executed with an empty environment. If -a is supplied, the shell passes name as the zeroth argument to command. If no command is specified, redirections may be used to affect the current shell environment. If there are no redirection errors, the return status is zero; otherwise the return status is non-zero.
Try this instead:
rgrep ()
{
if [ -n "$1" ] && [ -n "$2" ]
then
grep -rnw "$1" -r "$2"
elif [ -n "$1" ]
then
grep -rnw "$1" -r "./"
else
echo "please enter one or two args"
fi
}
As a completely different approach, I like to build command shortcuts like this as minimal shell scripts, rather than functions (or aliases):
% echo 'grep -rwn "$@"' >rgrep
% chmod +x rgrep
% ./rgrep
Usage: grep [OPTION]... PATTERN [FILE]...
Try `grep --help' for more information.
%
(This relies on a traditional behavior of Unix: executable text files without #! lines are considered shell scripts and are executed by /bin/sh. If that doesn't work on your system, or you need to run specifically under zsh, use an appropriate #! line.)
One of the main benefits of this approach is that shell scripts in a directory in your PATH are full citizens of the environment, not local to the current shell like functions and aliases. This means they can be used in situations where only executable files are viable commands, such as xargs, sudo, or remote invocation via ssh.
This doesn't provide the ability to give default arguments (or not easily, anyway), but IMAO the benefits outweigh the drawbacks. (And in the specific case of defaulting grep to search PWD recursively, the real solution is to install ack.)
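If you do need an explicit interpreter line (as mentioned above), the script could simply be the following; the comment and where you put the file are up to you, ideally a directory on your PATH:
#!/bin/sh
# rgrep: recursive, whole-word, line-numbered grep over the given paths
grep -rwn "$@"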

Script parameters in Bash

I'm trying to make a shell script which should be used like this:
ocrscript.sh -from /home/kristoffer/test.png -to /home/kristoffer/test.txt
The script will then ocr convert the image file to a text file. Here is what I have come up with so far:
#!/bin/bash
export HOME=/home/kristoffer
/usr/local/bin/abbyyocr9 -rl Swedish -if ???fromvalue??? -of ???tovalue??? 2>&1
But I don't know how to get the -from and -to values. Any ideas on how to do it?
The arguments that you provide to a bash script will appear in the variables $1, $2, $3, and so on, where the number refers to the position of the argument. $0 is the command itself.
The arguments are separated by spaces, so if you provide the -from and -to in the command, they will end up in these variables too. So for this:
./ocrscript.sh -from /home/kristoffer/test.png -to /home/kristoffer/test.txt
You'll get:
$0 # ocrscript.sh
$1 # -from
$2 # /home/kristoffer/test.png
$3 # -to
$4 # /home/kristoffer/test.txt
It might be easier to omit the -from and the -to, like:
ocrscript.sh /home/kristoffer/test.png /home/kristoffer/test.txt
Then you'll have:
$1 # /home/kristoffer/test.png
$2 # /home/kristoffer/test.txt
The downside is that you'll have to supply them in the right order. There are libraries that can make it easier to parse named arguments on the command line, but usually for simple shell scripts you should just use the easy way, if it's no problem.
Then you can do:
/usr/local/bin/abbyyocr9 -rl Swedish -if "$1" -of "$2" 2>&1
The double quotes around the $1 and the $2 are not always necessary but are advised, because some strings (for example, ones containing spaces) won't work if you don't put them between double quotes.
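For instance (hypothetical file name), the quotes matter as soon as an argument contains a space:
./ocrscript.sh "my scan.png" output.txt
# With the quotes in the script, "$1" expands to the single word: my scan.png
# An unquoted $1 would instead be split into two words: my and scan.png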
If you're not completely attached to using "from" and "to" as your option names, it's fairly easy to implement this using getopts:
while getopts f:t: opts; do
case ${opts} in
f) FROM_VAL=${OPTARG} ;;
t) TO_VAL=${OPTARG} ;;
esac
done
getopts is a shell builtin that processes command line arguments and conveniently parses them for you.
f:t: specifies that you're expecting 2 parameters that contain values (indicated by the colon). Something like f:t:v says that -v will only be interpreted as a flag.
opts is where the current parameter is stored. The case statement is where you will process this.
${OPTARG} contains the value following the parameter. ${FROM_VAL} for example will get the value /home/kristoffer/test.png if you ran your script like:
ocrscript.sh -f /home/kristoffer/test.png -t /home/kristoffer/test.txt
As the others are suggesting, if this is your first time writing bash scripts you should really read up on some basics. This was just a quick tutorial on how getopts works.
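For completeness, a sketch of how that loop could plug into the OP's command; the abbyyocr9 path, options, and HOME export come from the question, everything else follows the snippet above:
#!/bin/bash
export HOME=/home/kristoffer   # from the OP's original script

# Parse -f <from> and -t <to> into variables.
while getopts f:t: opts; do
  case ${opts} in
    f) FROM_VAL=${OPTARG} ;;
    t) TO_VAL=${OPTARG} ;;
  esac
done

/usr/local/bin/abbyyocr9 -rl Swedish -if "$FROM_VAL" -of "$TO_VAL" 2>&1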
Use the variables "$1", "$2", "$3" and so on to access arguments. To access all of them you can use "$#", or to get the count of arguments $# (might be useful to check for too few or too many arguments).
I needed to make sure that my scripts are entirely portable between various machines, shells, and even cygwin versions. Further, my colleagues, who were the ones I had to write the scripts for, are programmers, so I ended up using this:
for ((i=1;i<=$#;i++));
do
if [ ${!i} = "-s" ]
then ((i++))
var1=${!i};
elif [ ${!i} = "-log" ];
then ((i++))
logFile=${!i};
elif [ ${!i} = "-x" ];
then ((i++))
var2=${!i};
elif [ ${!i} = "-p" ];
then ((i++))
var3=${!i};
elif [ ${!i} = "-b" ];
then ((i++))
var4=${!i};
elif [ ${!i} = "-l" ];
then ((i++))
var5=${!i};
elif [ ${!i} = "-a" ];
then ((i++))
var6=${!i};
fi
done;
Rationale: I included a launcher.sh script as well, since the whole operation had several steps which were quasi-independent of each other (I say "quasi" because, even though each script could be run on its own, they were usually all run together). Within two days I found out that about half of my colleagues, being programmers and all, were too good to use the launcher file, follow the "usage", or read the HELP that was displayed every time they did something wrong. They were making a mess of the whole thing, running scripts with arguments in the wrong order and complaining that the scripts didn't work properly. Being the choleric I am, I decided to overhaul all my scripts to make sure they are colleague-proof. The code segment above was the first thing.
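A brief usage illustration of the loop above (the script name and values are hypothetical); since each flag is matched by name, the argument order no longer matters:
./myscript.sh -s input.dat -log run.log -x 3
./myscript.sh -x 3 -log run.log -s input.dat    # same result, different order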
In bash, $1 is the first argument passed to the script, $2 the second, and so on.
/usr/local/bin/abbyyocr9 -rl Swedish -if "$1" -of "$2" 2>&1
So you can use:
./your_script.sh some_source_file.png destination_file.txt
An explanation of the double quotes:
consider three scripts:
# foo.sh
bash bar.sh $1
# foo2.sh
bash bar.sh "$1"
# bar.sh
echo "1-$1" "2-$2"
Now invoke:
$ bash foo.sh "a b"
1-a 2-b
$ bash foo2.sh "a b"
1-a b 2-
When you invoke foo.sh "a b" then it invokes bar.sh a b (two arguments), and with foo2.sh "a b" it invokes bar.sh "a b" (1 argument). Always have in mind how parameters are passed and expaned in bash, it will save you a lot of headache.

Is the directory NOT writable

Can anyone tell me why this is always saying that the directory is not writable, when it absolutely is?
$dnam="/home/bryan/renametest/C D"
# Is the directory writable
err=0
if [ ! -w $dnam ]
then
# Not writable. Pop the error and exit.
echo "Directory $dnam is not writable"
err=1
fi
You need double-quotes around $dnam -- without them, it's interpreted as two separate shell words, "/home/bryan/renametest/C" and "D", which makes an invalid test expression and hence fails. This should work:
if [ ! -w "$dnam" ]
@tink's suggestion of [[ ]] is a cleaner way of doing tests like this, but is only available in bash (and some other shells with extended syntax). The fact that you get [[: not found means you're using a fairly basic shell, not bash.
I see multiple problems:
You are using a space inside your variable. This is not illegal, but in combination with the line where you use the variable unquoted, it generates the following command:
if [ ! -w /home/bryan/renametest/C D ]
This is not valid syntax. The simplest way to fix this is to change the line to
if [ ! -w "$dnam" ]
The next problem is worse: On my system, help test returns the text:
-w FILE True if the file is writable by you.
Which means the command doesn't support directories, only files. If you want to check whether a directory is writable, you will have to use a different command.
As everyone else said, the $dnam variable needs double quotes. Here's why:
The [ ... ] is an alias for the test command. If you look in your system, you will see a file called /bin/[ or maybe /usr/bin/[. On some systems, this is a hard link to /bin/test or /usr/bin/test. The if statement executes what comes after the if, and if that command returns a zero exit status, the if statement will execute the then clause. Otherwise, if there is an else clause, that will execute instead.
To allow for boolean testing, Unix included the test command, so you could do this:
if test -d "$directory"
then
echo "Directory $directory exists!"
fi
Later on, the /bin/[ was added as syntactic sugar. This is identical to the above:
if [ -d "$directory" ]
then
echo "Directory $directory exists!"
fi
Now, both [ and test are builtin commands, but they are still commands. This means that the shell interpolates the command and then executes it.
Try executing the following:
$ set -xv # Turns on shell debugging
$ dnam="/home/bryan/renametest/C D"
dnam="/home/bryan/renametest/C D"
+ dnam='/home/bryan/renametest/C D'
$ test -d $dnam
test -d $dnam
+ test -d /home/bryan/renametest/C D
$ echo $?
echo $?
+ echo 1
1
$ test -d "$dnam" # Now with quotes
test -d "$dnam" # Now with quotes
+ test -d "/home/bryan/renametest/C D"
$ echo $?
echo $?
+ echo 0
0
$ set +xv # Turn off the debugging
Each command is echoed twice: the first time as written, and the second time after the line is interpolated. As part of the interpolation, the shell splits parameters on white space. As you can see, the test command is testing the presence of /home/bryan/renametest/C, which doesn't exist and thus is not writable. I'm actually surprised that the test command didn't print an error message, because you passed it an extra parameter.
In the second attempt, you added quotes. These quotes prevented the shell from splitting your parameter on the space and kept the directory name as a single parameter.
Since [ ... ] is a command, you have to take into account the shell's interpolation of variables and other issues. And, if you're not absolutely careful, you can end up with errors.
Even worse, sometimes the [ ... ] might work and sometimes it might not. If your directory name doesn't contain spaces, it will work as expected. Imagine you're writing a program, and you test it and everything works because all the directories you've tried don't have spaces. Then someone uses your program but has a space in the directory name. A substantial number of shell script bugs are due to this type of issue in if statements.
This is why Bash introduced the [[ ... ]] tests. The [[ isn't a command but part of the shell's own syntax. This means the shell doesn't word-split the variables it expands inside it, so the directory name stays a single parameter. Thus, this would have worked:
dnam="/home/bryan/renametest/C D" # No "$" in front of the variable!
# Is the directory writable
if [[ ! -w $dnam ]] # No quotation marks needed!
then
# Not writable. Pop the error and exit.
echo "Directory $dnam is not writable"
err=1
fi
It's almost always better to use the [[ ... ]] test rather than the [ ... ] test, so go ahead and get into the habit.
One more minor error, you had:
$dnam="/home/bryan/renametest/C D"
This gets interpolated by the shell, so the variable being set is whatever the value of $dnam just happens to be. If $dnam happened to equal "foo", you would have been doing this:
foo="/home/bryan/renametest/C D"
Not what you want.
You want to leave the $ off when you set variables:
dnam="/home/bryan/renametest/C D"
