I want to add users who are listed in a file like this:
a b
c d
e f
It is always firstname lastname.
#!/bin/bash
Lines=$(cat newusers.txt | wc -l)
first=$(cat newusers.txt | awk '{print $1}')
last=$(cat newusers.txt | awk '{print $2}')
#test
echo $Lines;
echo $first;
echo $last;
until [ -z $1]; then
useradd - m -d /home/$1 -c "$1 + $2" $1
fi
Before the loop it works fine, but I can't get the newlines in. The echo shows "a c e" for the first names and "b d f" for the last names. I tried to add a newline but it doesn't work. What can I use for this? I suspect the users aren't being added because of this newline problem.
I also searched on Stack Overflow for a way to check whether a user already exists by redirecting output to /dev/null, but which variable do I have to use for that?
It's easier to process the file line by line:
while read -r first last; do
useradd -m -d /home/"$first" -c "$first + $last" "$first"
done < newusers.txt
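To also skip users that already exist (the /dev/null check you mention), here is a minimal sketch, assuming the same newusers.txt format; id exits non-zero when the account does not exist, and its output is silenced with /dev/null:
while read -r first last; do
    # id succeeds for an existing user and fails otherwise; silence its output.
    if id "$first" > /dev/null 2>&1; then
        echo "user $first already exists, skipping"
        continue
    fi
    useradd -m -d /home/"$first" -c "$first $last" "$first"
done < newusers.txt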
I do not fully understand what your code is meant to do, but if you want to read the file line by line and get the values of the different fields, you can use the following snippet:
#!/bin/bash
filename="newusers.txt"
while read -r line
do
fn=$( echo "$line" |cut -d" " -f1 )
ln=$( echo "$line" |cut -d" " -f2 )
echo "$fn $ln"
done < "$filename"
Note: you cannot add users the way you want from a bash script alone, since you will be prompted for a password, which has to be supplied on a tty; you can use expect to script that, or use system calls.
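As an alternative to expect, one hedged sketch is to set the password non-interactively with chpasswd, which reads user:password pairs from stdin (assumes root privileges; "jdoe" and "s3cret" are placeholder values):
#!/bin/bash
# Create the user, then set a password without a tty prompt.
useradd -m jdoe
echo 'jdoe:s3cret' | chpasswd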
Related
I have been busting my head all day long without coming up with a successful solution.
Setup:
We have Linux RHEL 8.3 and a file, script.sh.
There is an environment variable, set by an application, that holds a dynamic string:
export PROGRAM_VAR="abc10,def20,ghi30"
The delimiter is always "," and the number of values varies from 1 to 20.
Inside the script I have defined 20 variables to hold the values.
Using the "cut" command I take each value and assign it to a variable:
var1=$(echo $PROGRAM_VAR | cut -f1 -d,)
var2=$(echo $PROGRAM_VAR | cut -f2 -d,)
var3=$(echo $PROGRAM_VAR | cut -f3 -d,)
var4=$(echo $PROGRAM_VAR | cut -f4 -d,)
etc
In our case we will have:
var1="abc10" var2="def20" var3="ghi30" and var4="" which is empty
The loop must take each variable, test that it is not empty, and execute 10 pages of code using the tested variable. When it reaches an empty variable it should break.
Could you give me a hand please?
Thank you
Just split it on the commas. There are endless possibilities. You could:
10_pages_of_code() { echo "$1"; }
IFS=, read -r -a vars <<<"abc10,def20,ghi30"
for i in "${vars[@]}"; do 10_pages_of_code "$i"; done
or:
printf "%s" "abc10,def20,ghi30" | xargs -n1 -d, bash -c 'echo 10_pages_of_code "$1"' _
Safer code would use readarray instead of read to properly handle newlines in values, but I doubt that matters here:
IFS= readarray -d , -t vars < <(printf "%s" "abc10,def20,ghi30")
You could also read it as a stream:
while IFS= read -r -d, var || [[ -n "$var" ]]; do
10_pages_of_code "$var"
done < <(printf "%s" "abc10,def20,ghi30")
But you could still do it with cut... just actually write a loop and use an iterator:
i=1
while var=$(printf "%s\n" "$PROGRAM_VAR" | cut -f"$i" -d,) && [[ -n "$var" ]]; do
10_pages_of_code "$var"
((i++))
done
or
echo "$PROGRAM_VAR" | tr , \\n | while read var; do
: something with $var
done
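If you really want to keep the var1..var20 variables from your question, here is a sketch using bash indirect expansion to test each one and break at the first empty value (10_pages_of_code stands in for your real processing):
for i in {1..20}; do
    name="var$i"
    [[ -z "${!name}" ]] && break    # ${!name} expands to the value of the variable whose name is in $name
    10_pages_of_code "${!name}"
done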
Suppose I have a text file something.txt containing:
4062,2016-12-31
I want to send "4062" in one command and "2016-12-31" as a string in another command in a script. Can it be done with BASH scripting?
IFS="," read -r a b < file
echo "$a"
echo "$b"
Output:
4062
2016-12-31
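To actually pass the two fields to your two commands, a minimal sketch (command_one and command_two are placeholders for whatever you need to run):
IFS="," read -r a b < something.txt
command_one "$a"    # receives 4062
command_two "$b"    # receives 2016-12-31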
This would work in most cases, unless you are handling a very large file.
for line in $(cat yourfile.txt)
do
field1=$(echo $line | cut -f1 -d,)
field2=$(echo $line | cut -f2 -d,)
your_command_1 $field1
your_command_2 $field2
done
a=$(cat file.txt); IFS=',' list=($a) ; echo ${list[0]}; echo ${list[1]}
In the same way that it's possible to write a file that extracts itself, I'm looking for a way to autorun a program embedded within a script (or whatever it takes). I want the program to be part of the script, because I just want one file. It's actually a challenge: I have an xz-compressed program, and I want to be able to run it without the user ever invoking xz (just a ./theprogram).
Any idea?
Autorun after doing what? Login? Call it in ~/.bashrc. During boot? Write an appropriate /etc/init.d/yourprog and link it to the desired runlevel. Selfextract? Make it a shell archive (shar file). See the shar utility, http://linux.die.net/man/1/shar
Sorry, but I was just thinking... wouldn't something like this work?
(I am assuming it is a script...)
#!/bin/bash
cat << 'EOF' > yourfile
yourscript
EOF
chmod +x yourfile
./yourfile
Still, it's pretty hard to understand exactly what you are trying to do... it seems to me that the "autorun" is pretty much the same as "calling the program from within the script".
I had written a script for this. This should help:
#!/bin/bash
set -e
payload=$(cat $0 | grep --binary-files=text -n ^PAYLOAD: | cut -d: -f1 )
filename=`head $0 -n $payload | tail -n 1 | cut -d: -f2-`
tail -n +$(( $payload + 1 )) $0 > /tmp/$filename
set +e
#Do whatever with the payload
exit 0
#Command to add payload:
#read x; ls $x && ( cp 'binary_script.sh' ${x}_binary_script.sh; echo $x >> ${x}_binary_script.sh; cat $x >> ${x}_binary_script.sh )
#Note: strictly no characters after "PAYLOAD:", not even a newline...
PAYLOAD:
Sample usage:
Suppose myNestedScript.sh contains the following:
#!/bin/bash
echo hello world
Then run
x=myNestedScript.sh; ls $x && ( cp 'binary_script.sh' ${x}_binary_script.sh; echo $x >> ${x}_binary_script.sh; cat $x >> ${x}_binary_script.sh )
It will generate the file below, which you can execute directly. When run, it extracts myNestedScript.sh to /tmp and runs that script.
#!/bin/bash
set -e
payload=$(cat $0 | grep --binary-files=text -n ^PAYLOAD: | cut -d: -f1 )
filename=`head $0 -n $payload | tail -n 1 | cut -d: -f2-`
tail -n +$(( $payload + 1 )) $0 > /tmp/$filename
set +e
chmod 755 /tmp/$filename
/tmp/$filename
exit 0
PAYLOAD:myNestedScript.sh
#!/bin/bash
echo hello world
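Since the original question mentions an xz-compressed program, the same payload idea can be adapted by piping the extracted data through xz. A rough sketch, assuming the compressed binary is appended right after the PAYLOAD: line (e.g. with cat theprogram.xz >> wrapper.sh) and that /tmp/theprogram is an acceptable location:
#!/bin/bash
set -e
# Line number of the PAYLOAD: marker in this script.
payload=$(grep --binary-files=text -n '^PAYLOAD:' "$0" | cut -d: -f1)
# Everything after the marker is the xz-compressed program: decompress and run it.
tail -n +$((payload + 1)) "$0" | xz -dc > /tmp/theprogram
chmod 755 /tmp/theprogram
/tmp/theprogram
exit 0
PAYLOAD: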
I've tried for hours to create bulk users from a text file but I can't make it work. The text file has the following format:
John Smith:Student
The code:
#!/bin/bash
FILE=/home/knoppix/users.txt
GECOS=$(cat $FILE | cut -d: -f1)
USRGRP=$(cat $FILE | cut -d: -f2)
groupadd -f $USRGRP
echo "type username"
read USERNM
USERPW="123456"
useradd $USERNM -p $USERPW -g $USRGRP -c "$GECOS,$USRGRP" -d $HOME/$USERNM -s $USERSH -m
It is not working; when I tried to debug the bash script I saw that it is not reading line by line, it is grabbing all the content of the fields at once.
I'm missing a while loop with IFS, or a for line in users.txt, something like that, but I'm not expert enough. Also, should I use newusers or just useradd?
I need something like "while read" for each line.
> outFile
while read var1 var2 var3 var4 ; do
echo $var1 $var2 $var3 >> outFile
echo $var4 >> outFile
done < /home/knoppix/users.txt
This assumes the values in users.txt are separated by space characters and that there are 4 words per line that you want to process.
If your values are separated by something other than spaces, for example the ':' character, then you have to tell the read command what to use as IFS (the Internal Field Separator), so:
savIFS=${IFS}
> outFile
while IFS=":" read var1 var2 var3 var4 ; do
echo $var1 $var2 $var3 >> outFile
echo $var4 >> outFile
done < /home/knoppix/users.txt
IFS=${savIFS}
Also,
Code that is formed like
GECOS=$(cat $FILE | cut -d: -f1)
can be reduced to
GECOS=$( cut -d: -f1 $FILE )
Also note that you get one value for each line in $FILE, so if there are 3 lines of text, GECOS will end up holding the values from all three lines.
I hope this helps.
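Putting this together for the John Smith:Student format from the question, a rough sketch (it assumes root privileges, the fixed password from the question, and that the login name is derived from the full name, e.g. "John Smith" becomes "jsmith"; it also uses chpasswd instead of useradd -p, since -p expects an already-encrypted password):
#!/bin/bash
FILE=/home/knoppix/users.txt
while IFS=":" read -r GECOS USRGRP; do
    [ -z "$GECOS" ] && continue                 # skip empty lines
    # Build a login name: first initial + lowercase last name (an assumption).
    USERNM=$(echo "$GECOS" | awk '{print tolower(substr($1,1,1)) tolower($2)}')
    groupadd -f "$USRGRP"
    useradd -m -g "$USRGRP" -c "$GECOS,$USRGRP" -d "/home/$USERNM" "$USERNM"
    echo "$USERNM:123456" | chpasswd            # placeholder password from the question
done < "$FILE"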
I am not a Linux scripting expert and I have exhausted my knowledge on this matter. Here is my situation.
I have a list of states passed as a command-line argument to a shell script (e.g. "AL,AK,AS,AZ,AR,CA..."). The shell script needs to extract each state code and write it to a file (states.txt), one state per line. See below:
AL
AK
AS
AZ
AR
CA
..
..
How can this be achieved using a Linux shell script?
Thanks in advance.
Use tr:
echo "AL,AK,AS,AZ,AR,CA" | tr ',' '\n' > states.txt
echo "AL,AK,AS,AZ,AR,CA" | awk -F, '{for (i = 1; i <= NF; i++) print $i}';
Naive solution:
echo "AL,AK,AS,AZ,AR,CA" | sed 's/,/\n/g'
I think awk is the simplest solution, but you could try using cut in a loop.
Sample script (outputs to stdout, but you can just redirect it):
#!/bin/bash
# Check for input
if (( ${#1} == 0 )); then
echo No input data supplied
exit
fi
# Initialise first input
i=$1
# While $i still contains commas
while { echo $i | grep , > /dev/null; }; do
# Get first item of $i
j=`echo $i | cut -d ',' -f '1'`
# Shift off the first item of $i
i=`echo $i | cut --complement -d ',' -f '1'`
echo $j
done
# Display the last item
echo $i
Then you can just run it as ./script.sh "AL,AK,AS,AZ,AR,CA" > states.txt (assuming you save it as script.sh in the local directory and give it execute permission)