Use text file as input? What am I doing wrong? - linux

Hello, I am trying to create multiple users from a text file.
Here it is:
tony romo
drew brees
laura smith
bob jones
joe hones
kelly clarkson
hank hill
michael scott
and to do that I am using this script:
while read first last; do
name="$first $last"
username="$last${first:0:1}"
n=‘egrep –c $username /etc/passwd‘
n=$((n + 1))
username=$username$n
password=‘tr –cd '[:alpha:]' < /dev/urandom | head –c8‘
echo $password | passwd --stdin $username
echo “$username $password” >> /root/tempPasswords
useradd –c “$name” –m $username
done
I then execute the command
./name_of_script accounts.txt
Yet nothing happens. Any idea why? Thanks for any help!
EDIT
I made the following changes, but now I get errors from the useradd command. Any idea why?
#!/bin/bash
while read first last; do
name="$first $last"
username="$last${first:0:1}"
n=`egrep –c $username /etc/passwd`
n=$((n + 1))
username=$username$n
password=`tr –cd `[:alpha:]` < /dev/urandom | head –c8`
echo $password | passwd --stdin $username
echo “$username $password” >> /root/tempPasswords
useradd –c "$name" –m $username
done < "$1"

Your quotes are hosed. Where you have a curly quote, you should have a backtick (ASCII 96), which in this millennium is better written with the modern syntax $(command).
With that out of the way, just add < "$1" after the done to read the first argument as the input for the while read loop.

The script you wrote reads from stdin. Nothing happens because it's waiting for you to type the names out. You can either run your script with a redirection:
./name_of_script < accounts.txt
Or change the script to read from the file on its command-line, $1. To do that, add the redirection to the loop like so:
while read first last; do
...
done < "$1"
You'll also want to get rid of the smart quotes and dashes: ‘ should be ` (backtick), “...” should be "..." (plain double quotes), and the long dashes in options like –c should be plain hyphens (-c). Avoid editing code in a fancy word processor like MS Word. Stick to plain text editors like Vim or Notepad++, ones designed to edit code rather than documents.
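Putting both fixes together, a cleaned-up version of the script might look like this. This is only a sketch: passwd --stdin is not available on every distribution (Debian-based systems lack it, for instance), and the user has to exist before its password can be set, so useradd is moved before passwd here:
#!/bin/bash
# Read "first last" pairs, one per line, from the file named in $1.
while read -r first last; do
name="$first $last"
username="$last${first:0:1}"
# Add a numeric suffix so repeated usernames stay unique.
n=$(grep -c "^$username" /etc/passwd)
n=$((n + 1))
username=$username$n
# Random 8-letter password.
password=$(tr -cd '[:alpha:]' < /dev/urandom | head -c8)
useradd -c "$name" -m "$username"
echo "$password" | passwd --stdin "$username"
echo "$username $password" >> /root/tempPasswords
done < "$1"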

Related

Creating Users from a text file using Bash

I've been looking at some of the other answers to this question, but they haven't solved my issue.
Basically, I need to create a script that reads from a text file, that is given by a user input, and creates users from the contents of the file. I've managed to get it to create the first user but it doesn't seem to be creating the other users in the file.
My text document is literally:
user1
user2
user3
And here's the code I have:
echo -n "Enter name of text file "; read text
while read USER;
do
USERNAME=$(cut -d$'\n' -f $text)
echo $USERNAME
useradd -m "${USERNAME}"
done < $text
It seems to only be reading the very first entry in the text file, but I thought using \n would mean it cuts the other lines and uses them next. I tried using the cat command instead but wasn't having much luck with it. This is the furthest I've managed to get, so I was hoping someone could help me find where I've gone wrong.
Thanks :)
First, you should start your script with
#!/usr/bin/env bash
to ensure that it is interpreted as bash.
Second, the value you need is already available in the USER variable. There is no need to use cut.
#!/usr/bin/env bash
echo -n "Enter name of text file: "; read FILENAME
while read USER; do
echo "$USER"
useradd -m "${USER}"
done < "${FILENAME}"
You can use awk to print the commands and pipe them to a shell:
$ awk '{print "useradd -m "$1}' <file> | sh
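For the three-line file shown above (assuming it's called users.txt), the awk program simply emits one useradd command per line, and the trailing | sh executes them. You can preview what would run by dropping the pipe:
$ awk '{print "useradd -m "$1}' users.txt
useradd -m user1
useradd -m user2
useradd -m user3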

TCSH error in foreach loop: files.list: Command not found

I am trying to pass a user-input variable (a file name) into a foreach loop in tcsh. The user-entered variable is, for example, "files.list" (saved in the same folder the shell script is saved in and run from).
Here is my code:
#! /usr/bin/tcsh -f
echo please enter files list
set x = $<
foreach i ('$x')
echo $i
end
What I want is for each of the words in "files.list" to be output to the screen. files.list contains six lines, each with a file name.
myScript22.sh
Mad45.sh
Number32.sh
killBill.sh
gotMilk.sh
bugslife.sh
I get an error - "foreach: Words not parenthesized."
Could it be that 'cat $x' isn't calling the x variable correctly? If so, how do I get the file set up so its contents can be looped through?
Any help is appreciated!
If you really really really really have to use tcsh, then the following is the proper way to do it:
#!/usr/bin/tcsh -f
echo please enter files list
set x = $<
foreach line (" `cat $x` ")
echo "$line"
end
It is important that the cat command is inside the double quotes. The difference is that otherwise the foreach statement reads word by word, while the double-quoted version reads line by line. The logic of this is ... questionable. Also, I quoted the variable line in the echo statement, because otherwise it will actually complain when it hits an empty line.
In bash, you would just do the following nice thing:
#!/usr/bin/env bash
echo "Please enter files list"
file=""
while [[ ! -e $file ]]; do read -r file; done
while read -r line; do
echo "$line"
done < "$file"
Very important reads:
CSH PROGRAMMING CONSIDERED HARMFUL
Top Ten Reasons not to use the C shell
For reference purposes: I was missing the cat command before the $x.
So the code should read:
#! /usr/bin/tcsh -f
echo please enter files list
set x = $<
foreach i ('cat $x')
echo $i
end
You have two mistakes in your tcsh script:
Missing cat command in front of the filename in your foreach condition.
Use of straight single quotes (' ') instead of backquotes (` `) in your foreach condition.
The following script should work for you.
#!/usr/bin/tcsh -f
echo please enter files list
set x = $<
foreach i (`cat $x`)
echo $i
end

How to execute Linux shell variables within double quotes?

I have the following hacking challenge, where we don't know if there is a valid solution.
We have the following server script:
read s # read user input into var s
echo "$s"
# tests if it starts with 'a-f'
echo "$s" > "/home/user/${s}.txt"
We only control the input "$s". Is there a possibility to send OS commands like uname, or do you think there is no way?
I don't see any avenue for executing arbitrary commands. The script quotes $s every time it is referenced, so that limits what you can do.
The only serious attack vector I see is that the echo statement writes to a file name based on $s. Since you control $s, you can cause the script to write to some unexpected locations.
$s could contain a string like bob/important.txt. This script would then overwrite /home/user/bob/important.txt if executed with sufficient permissions. Sorry, Bob!
Or, worse, $s could be bob/../../../etc/passwd. The script would try to write to /home/user/bob/../../../etc/passwd. If the script is running as root... uh oh!
It's important to note that the script can only write to these places if it has the right permissions.
You could embed unusual characters in $s that would cause irregular file names to be created. Careless scripts could be taken advantage of. For example, if $s were foo -rf . bar, then the file /home/user/foo -rf . bar.txt would be created.
If someone ran for file in /home/user/*; do rm $file; done they'd have a surprise on their hands. They would end up running rm /home/user/foo -rf . bar.txt, which is a disaster. Take out /home/user/foo and bar.txt and you're left with rm -rf ., so everything in the current directory is deleted. Oops!
(They should have quoted "$file"!)
And there are two other minor things which, while I don't know how to take advantage of them maliciously, do cause the script to behave slightly differently than intended.
read allows backslashes to escape characters like space and newline. You can type a backslash before a space to embed spaces, and a backslash before Enter to have read continue parsing across multiple lines of input.
echo accepts a couple of flags. If $s is -n or -e then it won't actually echo $s; rather, it will interpret $s as a command-line flag.
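A quick illustration of the echo quirk; the flag is honored even though the variable is quoted:
$ s=-n
$ echo "$s"            # prints nothing at all: echo treats the argument -n as a flag
$ printf '%s\n' "$s"   # printf prints it literally
-n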
Use read -r s or any \ will be lost/misinterpreted by your command.
read -r s?"Your input: "
if [ -n "${s}" ]
then
# "filter" file name from command
echo "${s##*/}" | sed 's|^ *\([[:alnum:]_]\{1,\}\)[[:blank:]].*|/home/user/\1.txt|' | read Output
(
# put any limitation on user here
ulimit -t 5 1>/dev/null 2>&1
${s}  # run the user's command under the limit above
) > ${Output}
else
echo "Bad command" > /home/user/Error.txt
fi
Sure:
read s
$s > /home/user/"$s".txt
If I enter uname, this writes Linux to the file. But beware: this is a security nightmare. What if someone enters rm -rf $HOME? You'd also have issues with commands containing a slash.

Looping through lines in a file in bash, without using stdin

I am foxed by the following situation.
I have a file list.txt that I want to run through line by line, in a loop, in bash. A typical line in list.txt has spaces in it. The problem is that the loop contains a "read" command. I want to write this loop in bash rather than something like Perl. I can't do it :-(
Here's how I would usually write a loop to read from a file line by line:
while read p; do
echo $p
echo "Hit enter for the next one."
read x
done < list.txt
This doesn't work though, because of course "read x" will be reading from list.txt rather than the keyboard.
And this doesn't work either:
for i in `cat list.txt`; do
echo $i
echo "Hit enter for the next one."
read x
done
because the lines in list.txt have spaces in.
I have two proposed solutions, both of which stink:
1) I could edit list.txt, and globally replace all spaces with "THERE_SHOULD_BE_A_SPACE_HERE" . I could then use something like sed, within my loop, to replace THERE_SHOULD_BE_A_SPACE_HERE with a space and I'd be all set. I don't like this for the stupid reason that it will fail if any of the lines in list.txt contain the phrase THERE_SHOULD_BE_A_SPACE_HERE (so malicious users can mess me up).
2) I could use the while loop with stdin and then in each loop I could actually launch e.g. a new terminal, which would be unaffected by the goings-on involving stdin in the original shell. I tried this and I did get it to work, but it was ugly: I want to wrap all this up in a shell script and I don't want that shell script to be randomly opening new windows. What would be nice, and what might somehow be the answer to this question, would be if I could figure out how to somehow invoke a new shell in the command and feed commands to it without feeding stdin to it, but I can't get it to work. For example this doesn't work and I don't really know why:
while read p; do
bash -c "echo $p; echo ""Press enter for the next one.""; read x;";
done < list.txt
This attempt seems to fail because "read x", despite being in a different shell somehow, is still seemingly reading from list.txt. But I feel like I might be close with this one -- who knows.
Help!
You must open the file on a different file descriptor, so that stdin stays free for the interactive read:
while read p <&3; do
echo "$p"
echo 'Hit enter for the next one'
read x
done 3< list.txt
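Equivalently, bash's read -u option names the descriptor directly; same behavior, arguably easier to read:
while read -u 3 p; do
echo "$p"
echo 'Hit enter for the next one'
read x
done 3< list.txt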
I would probably count the lines in the file and iterate over them using e.g. sed. It is also possible to read from stdin indefinitely by changing the while condition to while true; you can then stop reading with Ctrl+C.
line=0 lines=$(sed -n '$=' in.file)
while [ $line -lt $lines ]
do
let line++
sed -n "${line}p" in.file
echo "Hit enter for the next ${line} of ${lines}."
read -s x
done
AWK is also a great tool for this. A simple way to iterate through the input would be:
awk '{ print $0; printf "%s", "Hit enter for the next"; getline < "-" }' file
As an alternative, you can read from stderr, which by default is connected to the tty as well. The following then also includes a test for that assumption:
(
tty -s <&2 || exit 1
while read -r line; do
echo "$line"
echo 'Hit enter'
read x <& 2
done < file
)

Read values into a shell variable from a pipe

I am trying to get bash to process data from stdin that gets piped into it, but with no luck. What I mean is that none of the following work:
echo "hello world" | test=($(< /dev/stdin)); echo test=$test
test=
echo "hello world" | read test; echo test=$test
test=
echo "hello world" | test=`cat`; echo test=$test
test=
where I want the output to be test=hello world. I've tried putting quotes around "$test", but that doesn't work either.
Use
IFS= read var << EOF
$(foo)
EOF
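Applied to the question's example, that looks like this (illustrative):
IFS= read test << EOF
$(echo "hello world")
EOF
echo "test=$test"   # prints test=hello world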
You can trick read into accepting from a pipe like this:
echo "hello world" | { read test; echo test=$test; }
or even write a function like this:
read_from_pipe() { read "$@" <&0; }
But there's no point - your variable assignments may not last! A pipeline may spawn a subshell, where the environment is inherited by value, not by reference. This is why read doesn't bother with input from a pipe - it's undefined.
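A quick demonstration of the subshell problem, run in a fresh shell where x is unset (and lastpipe off, the default):
$ echo foo | { read x; echo "inside: x=$x"; }
inside: x=foo
$ echo "outside: x=$x"
outside: x=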
FYI, http://www.etalabs.net/sh_tricks.html is a nifty collection of the cruft necessary to fight the oddities and incompatibilities of Bourne shells.
If you want to read in lots of data and work on each line separately, you could use something like this:
cat myFile | while read x ; do echo $x ; done
If you want to split the lines up into multiple words, you can use multiple variables in place of x, like this:
cat myFile | while read x y ; do echo $y $x ; done
alternatively:
while read x y ; do echo $y $x ; done < myFile
But as soon as you start wanting to do anything really clever with this sort of thing, you're better off going for a scripting language like Perl, where you could try something like this:
perl -ane 'print "$F[0]\n"' < myFile
There's a fairly steep learning curve with Perl (or I guess any of these languages), but you'll find it a lot easier in the long run if you want to do anything but the simplest of scripts. I'd recommend the Perl Cookbook and, of course, Programming Perl by Larry Wall et al.
This is another option
$ read test < <(echo hello world)
$ echo $test
hello world
read won't read from a pipe (or possibly the result is lost because the pipe creates a subshell). You can, however, use a here string in Bash:
$ read a b c <<< $(echo 1 2 3)
$ echo $a $b $c
1 2 3
But see @chepner's answer for information about lastpipe.
I'm no expert in Bash, but I wonder why this hasn't been proposed:
stdin=$(cat)
echo "$stdin"
One-liner proof that it works for me:
$ fortune | eval 'stdin=$(cat); echo "$stdin"'
bash 4.2 introduces the lastpipe option, which allows your code to work as written, by executing the last command in a pipeline in the current shell, rather than a subshell.
shopt -s lastpipe
echo "hello world" | read test; echo test=$test
A script that can read data both from a pipe and from command-line arguments:
#!/bin/bash
if [[ -p /dev/stdin ]]
then
PIPE=$(cat -)
echo "PIPE=$PIPE"
fi
echo "ARGS=$#"
Output:
$ bash test arg1 arg2
ARGS=arg1 arg2
$ echo pipe_data1 | bash test arg1 arg2
PIPE=pipe_data1
ARGS=arg1 arg2
Explanation: when a script receives data via a pipe, /dev/stdin (or /proc/self/fd/0) will be a symlink to a pipe.
/proc/self/fd/0 -> pipe:[155938]
If not, it will point to the current terminal:
/proc/self/fd/0 -> /dev/pts/5
The bash [[ -p test can check whether it is a pipe or not.
cat - reads from stdin.
If we used cat - when there is no stdin, it would wait forever; that is why we put it inside the if condition.
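A related but not identical test is -t, which checks whether a descriptor is attached to a terminal; unlike the pipe test, it also treats input redirected from a regular file as "not a terminal":
if [ -t 0 ]; then
echo "stdin is a terminal"
else
echo "stdin is a pipe or a file"
fi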
The syntax for an implicit pipe from a shell command into a bash variable is
var=$(command)
or
var=`command`
In your examples, you are piping data to an assignment statement, which does not expect any input.
In my eyes the best way to read from stdin in bash is the following, which also lets you work on the lines before the input ends:
while read LINE; do
echo $LINE
done < /dev/stdin
The first attempt was pretty close. This variation should work:
echo "hello world" | { test=$(< /dev/stdin); echo "test=$test"; };
and the output is:
test=hello world
You need braces after the pipe to enclose the assignment to test and the echo.
Without the braces, the assignment to test (after the pipe) is in one shell, and the echo "test=$test" is in a separate shell which doesn't know about that assignment. That's why you were getting "test=" in the output instead of "test=hello world".
Because I fell for it, I would like to drop a note.
I found this thread because I have to rewrite an old sh script to be POSIX compatible.
This basically means to circumvent the pipe/subshell problem introduced by POSIX by rewriting code like this:
some_command | read a b c
into:
read a b c << EOF
$(some_command)
EOF
And code like this:
some_command |
while read a b c; do
# something
done
into:
while read a b c; do
# something
done << EOF
$(some_command)
EOF
But the latter does not behave the same on empty input.
With the old notation the while loop is not entered on empty input, but in POSIX notation it is!
I think it's due to the newline before EOF, which cannot be omitted.
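You can see the difference with a command that produces no output (illustrative):
# Pipe version: the loop body never runs.
printf '' | while read a; do echo "got: '$a'"; done
# Here-document version: the body runs once, with a empty,
# because the here-document always contains at least one newline.
while read a; do echo "got: '$a'"; done << EOF
$(printf '')
EOF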
The POSIX code which behaves more like the old notation
looks like this:
while read a b c; do
case $a in ("") break; esac
# something
done << EOF
$(some_command)
EOF
In most cases this should be good enough.
But unfortunately this still does not behave exactly like the old notation
if some_command prints an empty line.
With the old notation the while body is executed,
whereas in the POSIX notation we break before reaching the body.
An approach to fix this might look like this:
while read a b c; do
case $a in ("something_guaranteed_not_to_be_printed_by_some_command") break; esac
# something
done << EOF
$(some_command)
echo "something_guaranteed_not_to_be_printed_by_some_command"
EOF
Piping something into an expression involving an assignment doesn't behave like that.
Instead, try:
test=$(echo "hello world"); echo test=$test
The following code:
echo "hello world" | ( test=($(< /dev/stdin)); echo test=$test )
will work too, but it will open another new sub-shell after the pipe, where
echo "hello world" | { test=($(< /dev/stdin)); echo test=$test; }
won't.
I had to disable job control to make use of chepner's method (I was running this command from a terminal):
set +m;shopt -s lastpipe
echo "hello world" | read test; echo test=$test
echo "hello world" | test="$(</dev/stdin)"; echo test=$test
The Bash manual says:
lastpipe
If set, and job control is not active, the shell runs the last command
of a pipeline not executed in the background in the current shell
environment.
Note: job control is turned off by default in a non-interactive shell and thus you don't need the set +m inside a script.
I think you were trying to write a shell script which could take input from stdin, but while you were trying to do it inline, you got lost trying to create that test= variable.
I think it does not make much sense to do it inline, and that's why it does not work the way you expect.
I was trying to reduce
$( ... | head -n $X | tail -n 1 )
to get a specific line from various inputs,
so I could type...
cat program_file.c | line 34
So I needed a small shell program able to read from stdin, like you do.
22:14 ~ $ cat ~/bin/line
#!/bin/sh
if [ $# -ne 1 ]; then echo enter a line number to display; exit; fi
cat | head -n $1 | tail -n 1
22:16 ~ $
there you go.
The question is how to catch output from a command to save in variable(s) for use later in a script. I might repeat some earlier answers, but I'll try to line up all the answers I can think of to compare and comment on, so bear with me.
The intuitive construct
echo test | read x
echo x=$x
is valid in the Korn shell because ksh runs the last command of a pipeline in the current shell, i.e. only the preceding pipe commands are subshells. In contrast, other shells run all piped commands, including the last, in subshells.
This is the exact reason I prefer ksh.
But to cope with other shells, bash for example, another construct must be used.
To catch 1 value this construct is viable:
x=$(echo test)
echo x=$x
But that only caters for 1 value to be collected for later use.
To catch more values this construct is useful and works in bash and ksh:
read x y <<< $(echo test again)
echo x=$x y=$y
There is a variant which I have noticed works in bash but not in ksh:
read x y < <(echo test again)
echo x=$x y=$y
The <<< $(...) is a here-string, which gives all the meta handling of a standard command line. < <(...) is an input redirection from a process substitution.
I use "<<< $(" in all my scripts now because it seems the most portable construct between shell variants. I have a tools set I carry around on jobs in any Unix flavor.
Of course there is the universally viable but crude solution:
command-1 | { command-2; echo "x=test; y=again" > file.tmp; chmod 700 file.tmp; }
. ./file.tmp
rm file.tmp
echo x=$x y=$y
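A slightly safer spin on the same idea uses mktemp, so the temporary file gets a unique name and private permissions (a sketch; command-1 and command-2 are the placeholders from above):
tmp=$(mktemp)
command-1 | { command-2; echo "x=test; y=again" > "$tmp"; }
. "$tmp"
rm -f "$tmp"
echo x=$x y=$y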
I wanted something similar - a function that parses a string that can be passed as a parameter or piped.
I came up with a solution as below (works as #!/bin/sh and as #!/bin/bash)
#!/bin/sh
set -eu
my_func() {
local content=""
# if the first param is not set (an empty string still counts as set)
if [ -z ${1+x} ]; then
# read content from a pipe if passed or from a user input if not passed
while read line; do content="${content}$line"; done < /dev/stdin
# first param was set (it may be an empty string)
else
content="$1"
fi
echo "Content: '$content'";
}
printf "0. $(my_func "")\n"
printf "1. $(my_func "one")\n"
printf "2. $(echo "two" | my_func)\n"
printf "3. $(my_func)\n"
printf "End\n"
Outputs:
0. Content: ''
1. Content: 'one'
2. Content: 'two'
typed text
3. Content: 'typed text'
End
For the last case (3.), you need to type your input, hit Enter, and then press Ctrl+D to end the input.
How about this:
echo "hello world" | echo test=$(cat)
