Why can't I run my shell script to list users? - linux

users='awk '{print $1}' /etc/passwd | sort -u'
for user in $users
do
echo " - $user"
done
This is my shell script. The problem is that it shows an error.
The error is ---> users: command not found
Please give me the solution, friends.

With the code the way it is now I see that you're not assigning the output of the awk|sort command to the variable (maybe you wanted to use ` instead of ' ?)
This works:
#!/bin/bash
users=$(awk '{print $1}' /etc/passwd | sort -u)
for user in $users
do
echo " - $user"
done
Although you should be aware that /etc/passwd is separated by colons, not spaces, so awk '{print $1}' won't give you just the user's name (which is probably what you wanted); you'd need awk -F':' for that.
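For completeness, here is a minimal sketch of the same loop with the colon separator, assuming the goal is the login names:
#!/bin/bash
# /etc/passwd is colon-separated, so tell awk to split on ':'
users=$(awk -F':' '{print $1}' /etc/passwd | sort -u)
for user in $users
do
echo " - $user"
done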
Edit:
As per @Andy Lester's comment on your question: if you save this code in a file (let's say /tmp/myscript.bash), then to run it you have to type in a terminal:
/bin/bash /tmp/myscript.bash
or, since it starts with #!/bin/bash (read here), you could make it executable (using chmod u+x /tmp/myscript.bash) and then call it by just typing /tmp/myscript.bash. You can also save it in one of the PATH directories (type echo $PATH to see which they are), make it executable, and then you'll be able to call it from anywhere, but I don't really recommend doing that, because you may end up shadowing useful system commands if you're not careful. For instance, let's say you give your script the unfortunate name ls and save it in the first directory of the $PATH (in my case, /usr/local/sbin). Every time you type ls you won't be listing directories but calling your script... which is bad.
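Put as plain commands (using the same example path /tmp/myscript.bash as above), that is:
# run it through bash explicitly
/bin/bash /tmp/myscript.bash
# or make it executable once, then call it directly
chmod u+x /tmp/myscript.bash
/tmp/myscript.bash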

Related

Escaping quotes in bash (Embedded awk)

I have a complex command I am passing via ssh to a remote server. I am trying to unzip a file and then change its naming structure and extension in a second ssh command. The command I have is:
ssh root#server1 "gzip -d /tmp/file.out-20171119.gz; echo file* | awk -F'[.-]' '{print $1$3".log"}'"
Obviously the " around the .log portion of the print statement are failing me. The idea is that I would strip the .out portion from the filename and end up with file20171119.log as an ending result. I am just a bit confused on the syntax or on how to escape that properly so bash interprets the .log appropriately.
The easiest way to deal with this problem is to avoid it. Don't bother trying to escape your script to go on a command line: Pass it on stdin instead.
ssh root@server1 bash -s <<'EOF'
gzip -d /tmp/file.out-20171119.gz
# note that (particularly w/o a cd /tmp) this doesn't do anything at all related to the
# line above; thus, probably buggy as given in the original question.
echo file* | awk -F'[.-]' '{print $1$3".log"}'
EOF
A quoted heredoc -- one with <<'EOF' or <<\EOF instead of <<EOF -- is passed literally, without any shell expansions; thus, $1 or $3 will not be replaced by the calling shell as they would with an unquoted heredoc.
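You can see the difference locally with a variable such as $HOME (a small illustration, not part of the original answer):
# unquoted delimiter: the calling shell expands $HOME before cat sees the text
cat <<EOF
$HOME
EOF
# quoted delimiter: the text is passed through literally, $HOME stays as-is
cat <<'EOF'
$HOME
EOF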
If you don't want to go the avoidance route, you can have the shell do the quoting for you itself. For example:
external_function() {
gzip -d /tmp/file.out-20171119.gz
echo file* | awk -F'[.-]' '{print $1$3".log"}'
}
ssh root#server1 "$(declare -f external_function); external_function"
declare -f prints a definition of a function. Putting that function literally into your SSH command ensures that it's run remotely.
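To see what actually gets sent, you can print the definition yourself after defining the function; the exact formatting may differ slightly between bash versions:
declare -f external_function
# prints something like:
# external_function ()
# {
#     gzip -d /tmp/file.out-20171119.gz;
#     echo file* | awk -F'[.-]' '{print $1$3".log"}'
# }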
You need to escape the " to prevent them from closing your quoted string early, and you need to escape the $ in the awk script to prevent local parameter expansion.
ssh root#server1 "gzip -d /tmp/file.out-20171119.gz; echo file* | awk -F'[.-]' '{print \$1\$3\".log\"}'"
The most probable reason (since you don't show the contents of root's home directory on the server) is that you are uncompressing the file in the /tmp directory, but feeding awk filenames that would have to exist in root's home directory.
" allows escaping sequences with \. so the correct way to do is
ssh root#server1 "gzip -d /tmp/file.out-20171119.gz; echo file* | awk -F'[.-]' '{print \$1\$3\".log\"}'"
(like you wrote in your question) this means the following command is executed with a shell in the server machine.
gzip -d /tmp/file.out-20171119.gz; echo file* | awk - F'[.-]' '{print $1$3".log"}'
You are executing two commands, the first to gunzip /tmp/file.out-2017119.gz (beware, as it will be gunzipped in /tmp). And the second can be the source for the problem. It is echoing all the files in the local directory (this is, the root user home directory, probably /root in the server) that begin with file in the name (probably none), and feeding that to the next awk command.
As a general rule.... test your command locally, and when it works locally, just escape all special characters that will go unescaped, after being parsed by the first shell.
Another way to solve the problem is to use gzip(1) as a filter, so you can decide the name of the output file:
ssh root@server1 "gzip -d </tmp/file.out-20171119.gz >file20171119.log"
This way you save an awk(1) execution just to format the output file name. Or, if you take the date from an environment variable:
DATE=`date +%Y%m%d`
ssh root#server1 "gzip -d </tmp/file.out-${DATE}.gz >file${DATE}.log"
Finally, some advice: don't use /tmp to uncompress files. Several distributions use /tmp as a high-speed temporary directory. It is often RAM-based: fast, but limited in space, so uncompressing a log file there can fill up the memory the kernel devotes to the RAM-based filesystem, which is not a good idea. A log file normally expands a lot, and /tmp is a system-wide shared directory where other users can store files named file<something> that could clash with yours (especially when you use wildcard patterns, as you do in your command). Also, once you know the name of the file, it is common to assign it to an environment variable and use that variable, so that if you need to change the filename format you only do it in one place.
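A sketch of that last point, combined with the gzip-as-a-filter approach above (the /var/tmp path here is only an example of a location with more room than /tmp):
infile=/var/tmp/file.out-20171119.gz
outfile=file20171119.log
# the remote shell only ever sees the already-expanded variable values
ssh root@server1 "gzip -d <$infile >$outfile"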

referencing stdout in a command that has been piped into

I want to make a simple dmenu command that reads a file of commands and names. Then takes the names and displays them using dmenu then takes dmenu's output and runs the associated command using the file again.
I got to the point where dmenu displays the names, but I don't really know where to go from there. Learning bash is a really daunting task to me and I don't really know where to start with this seemingly simple script/command.
here is the file:
Pushbullet
google-chrome-stable --app=https://www.pushbullet.com
Steam
steam
Chrome
google-chrome-stable
Libre Office
libreoffice
Transmission
transmission-qt
Audio Control Panel
sudo pavucontrol & bluberry
and here is what I have so far for my command:
awk 'NR % 2 != 0' /home/rocco/programlist | dmenu | ??(grep -l "stdout" /home/rocco/programlist....)
It was my thinking that I could somehow pipe into grep or awk with the name of the application then get the line number then add one and pipe that into sh.
Thanks
I have no experience with dmenu, but if I understand how it works correctly, this should do what you want. Wrapping a command in $(…) captures its output, which we can store in a variable and pass on to another command.
#!/bin/bash
plist="/home/rocco/programlist"
# pipe every second line to dmenu
selected=$(awk 'NR % 2 != 0' "$plist" | dmenu)
# search for the selected item, get the command after it
cmd=$(grep -A1 "$selected" "$plist" | tail -n 1)
# run the command
$cmd
Worth mentioning a mistake in your question: dmenu writes to stdout (standard output), but the next program in the pipeline reads stdin (standard input). In any case, grep doesn't take its search pattern on standard input, which is why I've saved the selection in a variable instead of trying to pipe it somewhere.
Assuming you have programlist.txt in the working directory you can use:
awk 'NR%2 !=0' programlist.txt |dmenu |awk '{system("grep --no-group-separator -A 1 '"'"'"$0"'"'"' programlist.txt");}' |awk '{if(NR==2){system($0);}}'
Note the quoting of the $0 in the middle awk invocation. This is necessary to handle names with spaces in them, like "Libre Office".
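If the nested quoting is hard to read: once the outer shell strips its own quotes, the program that the middle awk receives looks like this (shown only for illustration):
{ system("grep --no-group-separator -A 1 '" $0 "' programlist.txt"); }
That is, the dmenu selection ($0) ends up wrapped in single quotes inside the grep command line, which is what keeps a name like Libre Office together as one pattern.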

how to pass asterisk into ls command inside bash script

Hi… Need a little help here…
I tried to emulate the DOS' dir command in Linux using bash script. Basically it's just a wrapped ls command with some parameters plus summary info. Here's the script:
#!/bin/bash
# default to current folder
if [ -z "$1" ]; then var=.;
else var="$1"; fi
# check file existence
if [ -a "$var" ]; then
# list contents with color, folder first
CMD="ls -lgG $var --color --group-directories-first"; $CMD;
# sum all files size
size=$(ls -lgGp "$var" | grep -v / | awk '{ sum += $3 }; END { print sum }')
if [ "$size" == "" ]; then size="0"; fi
# create summary
if [ -d "$var" ]; then
folder=$(find $var/* -maxdepth 0 -type d | wc -l)
file=$(find $var/* -maxdepth 0 -type f | wc -l)
echo "Found: $folder folders "
echo " $file files $size bytes"
fi
# error message
else
echo "dir: Error \"$var\": No such file or directory"
fi
The problem is that when the argument contains an asterisk (*), the ls within the script acts differently compared to a direct ls command given at the prompt. Instead of returning the whole file list, the script only returns the first file. See the video below to see the comparison in action. I don't know why it behaves like that.
Anyone knows how to fix it? Thank you.
Video: problem in action
UPDATE:
The problem has been solved. Thank you all for the answers. Now my script works as expected. See the video here: http://i.giphy.com/3o8dp1YLz4fIyCbOAU.gif
The asterisk * is expanded by the shell when it parses the command line. In other words, your script doesn't get a parameter containing an asterisk, it gets a list of files as arguments. Your script only works with $1, the first argument. It should work with "$@" instead.
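For example, a minimal sketch of what the listing part of the script could look like when it loops over all of the expanded arguments (only an illustration, not a rewrite of the full dir script):
#!/bin/bash
# "$@" keeps every argument intact, even names containing spaces
for var in "$@"
do
ls -lgG --color --group-directories-first -- "$var"
done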
This is because when you retrieve $1 you assume the shell does NOT expand *.
In fact, when * (or another glob) matches, it is expanded, and each match is passed as a separate word: $1, $2, etc.
You're lucky if you simply retrieved the first file. When your first file's path contains spaces, you'll get an error, because you only get the first segment before the space.
Seriously, read this and especially this. Really.
And please don't do things like
CMD=whatever you get from user input; $CMD;
You are begging for trouble. Don't execute arbitrary strings built from user input.
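A commonly recommended alternative (not from the original answer) is to build the command as a bash array, so each part stays a separate word:
# each array element is passed to ls as its own argument, with no re-splitting
cmd=(ls -lgG --color --group-directories-first -- "$var")
"${cmd[@]}"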
Both of the answers above already answered your question, so I'm going to be a bit more verbose.
In your terminal the bash interpreter is (probably) running. This is the program that parses your input line(s) and does "things" based on your input.
When you enter some line, bash starts the following workflow:
parsing and lexical analysis
expansion
brace expansion
tilde expansion
variable expansion
arithmetic and other substitutions
command substitution
word splitting
filename generation (globbing)
removing quotes
Only after all of the above will bash
execute some external command, like ls or dir.sh... etc.,
or do some "internal" actions for the known keywords and builtins like echo, for, if, etc...
As you can see, the second to last step is filename generation (globbing). So, in your case, if test* matches some files, bash expands the wildcard characters (i.e. does the globbing).
So,
when you enter dir.sh test*,
and the test* matches some files,
bash does the expansion first,
and only afterwards executes the command dir.sh with the already-expanded filenames,
e.g. the script gets executed (in your case) as: dir.sh test.pas test.swift
BTW, it works in exactly the same way for your ls example:
bash expands ls test* to ls test.pas test.swift,
then executes ls with the above two arguments,
and ls prints the result for the two arguments it got.
In other words, ls doesn't even see the test* argument; whenever possible, bash expands the wildcard characters (* and ?).
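You can watch this happen with echo (the file names here are just the ones from the example above, so the output depends on what actually exists in the directory):
echo test*
# prints: test.pas test.swift   (when those are the matching files)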
Now back to your script: add the following line after the shebang:
echo "the $0 got these arguments: $@"
and you will immediately see the real arguments your script was executed with.
Also, in such cases it is good practice to try running the script in debug mode, e.g.
bash -x dir.sh test*
and you will see exactly what the script does.
You can do the same for your current interpreter, too; just enter into the terminal
set -x
and try running dir.sh test* again, and you will see how bash executes the dir.sh command. (To stop the debug mode, just enter set +x.)
Everybody is giving you valuable advice which you definitely should follow!
But here is the real answer to your question.
To pass unexpanded arguments to any executable you need to single quote them:
./your_script '*'
The best solution I have is to use the eval command, in this way:
#!/bin/bash
cmd="some command \"with_quetes_and_asterisk_in_it*\""
echo "$cmd"
eval $cmd
The eval command takes its arguments and evaluates them as a command line, exactly as the shell itself would.
This solves my problem when I need to call a command with asterisk '*' in it from a script.

How to use select with awk in bash script?

I have to write a bash script for university, the text says:
Write a bash script that allows the root user to get a list of all users
of the machine. Selecting a user, using select, you will be required to
indicate a directory (giving the absolute path). At this point the output
will show a list of all the files/folders owned by the selected
user, ranked in ascending order according to file size.
To check if the user is root I used:
if [ "$(id -u)" = 0 ]; then
To get the list of users of the machine I was thinking of using awk:
awk -F':' '{ print$1}' /etc/passwd
How can I use select with awk?
Is there another way without using awk?
Thank you so much in advance
Here is one way to use awk in a select statement; you need to finish the rest for your homework (for example, sorting the result):
#!/usr/bin/env bash
select user in $(awk -F ":" '{print $1}' /etc/passwd )
do
read -p "input the absolute directory: " path
find "$path" -type f -user "$user" -ls
done
Another, arguably smarter, way to test the UID is with an arithmetic expression:
if((UID==0)); then
...
else
...
fi
Check http://wiki.bash-hackers.org/syntax/arith_expr

using grep in a If statement to get all items, ignoring spaces

This is part of a homework problem in a beginning bash class.
I need to bring in the passwd file, which I have done with my passfile variable, then I need to be able to extract certain pieces of it and display the different fields. When I manually grep from the CLI using the statement below, it works fine. I want all the fields, and I get them all.
grep 1000 passfile | cut -c1-
However, when I do this from the script it stops, breaks, or starts over at the first blank space in the user's full name. John D. Doe will return 3 lines when I only want one. I see this by echoing the value of i and the following.
for i in `grep 1000 ${passfile} | cut -c1-`
do
user=`echo $i | cut -d : -f1`
userID=`echo $i | cut -d : -f3`
For example, if the line reads
jdoe:x:123:1000:John D Doe:/home/jdoe:/bin/bash
I get the following:
i = jdoe:x:123:1000:John
which gives me:
User is jdoe, UID is 509
but then in the next line i starts at R.
i = R. so User is R., UID is R.
next line
i = Johnson:/home/jjohnson:/bin/bash
which returns User is Johnson, UID is /bin/bash
The passwd file holds many users, so I need to use the for loop to process them all. I think if I can get it to ignore the space I can get it working. But not knowing a whole lot about Linux, I'm not sure if I'm even going down the right path. Thanks in advance for guidance/help.
The splitting isn't coming from cut (cut -c1- just passes each line through whole); it's the unquoted for loop that splits its input on whitespace. If you do use cut to pick fields out of the passwd file, specify the colon separator with -d:.
You probably want to use IFS=: and a read statement in a while loop to get the values in:
while IFS=: read user password uid gid comment home shell
do
...whatever...
done < /etc/passwd
Or you can pipe the output of grep into the while loop.
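For example, a sketch along those lines, keeping the grep filter from the question:
# grep filters the lines; the while loop splits each remaining line on ':'
grep 1000 /etc/passwd | while IFS=: read -r user password uid gid comment home shell
do
echo "User is $user, UID is $uid"
done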
Are you allowed to use any external program? If so, I'd recommend awk
uid=1000    # lowercase name: UID (all caps) is a readonly variable in bash
awkcmd="\$4==\"$uid\" {print \"user:\",\$1}"
cat "$PASSWORDFILE" | awk -F ":" "$awkcmd"
When parsing structured files with specific field delimiters, such as the passwd file, the appropriate tool for the job is awk.
uid=1000    # again avoiding the readonly bash variable UID
awk -v uid="$uid" '$4==uid{print "user: "$1}' /etc/passwd
you do not have to use grep or cut or anything else. (Of course, you can also use pure bash while read loops, as demonstrated.)
