I want to ask the user to enter a few lines of text, it can be anything, and I want to store it in a variable that I can call later on. I don't want to create multiple read commands, just one that can hold multiple paragraphs if needed.
I tried this:
echo "Enter your your paragraph:"
read -d '' -n 1 message
while read -d '' -n 1 -t 2 c
do
message+=$c
done
echo ""
echo "$message"
The output is always put onto one line of text without spaces or anything. It would look like this when I run the code and enter a few lines of text:
Enter your broadcast message (When done, wait 2 seconds):
This is supposed to be a sentence.
And so is this.
Thisissupposedtobeasentence.Andsoisthis.
It should output the two sentences on separate lines and with the spaces included.
Don't use read for this; requiring all typing to be done without any two-second pauses (and conversely, forcing a wait of two seconds to complete the input) is not very user-friendly. Instead, just read input directly from standard input, which for interactive use simply requires an EOF (Control-d) to finish the input.
c=$(</dev/stdin)
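For the use case in the question, that could look roughly like this (a minimal sketch; the prompt wording is just illustrative, and message is the variable name from the question):
echo "Enter your paragraph (press Ctrl-D when done):"
message=$(</dev/stdin)
echo "$message"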
read uses the characters in $IFS as word delimiters. Change your read statement to:
IFS= read -r -d '' -n 1 -t 2 c
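Applied to the loop from the question (making the same change to the first read as well, and keeping the two-second timeout), a sketch of the result:
IFS= read -r -d '' -n 1 message
while IFS= read -r -d '' -n 1 -t 2 c
do
    message+=$c
done
echo ""
echo "$message"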
I have a JSON file that is downloaded using curl. It has some information about a Confluence page. I want to extract only 3 parts of that downloaded information: the page id, status and title.
I have written a bash script for this, and my constraint is that I am not sure how to pass multiple variables to the grep command.
id=id #hardcoded
status=status #hardcoded
echo Enter title you are looking for:
# taking input from user here
read title_name
echo
echo
echo Here are details
curl -u username:password -sX GET "http://X.X.X.X:8090/rest/api/content?type=page&start=0&limit=200" | python -mjson.tool | grep -Eai "$title_name"|$id|$status"
Aside from a typo (you have an unbalanced quote - please always check the syntax for correctness before posting something), the basic idea of your approach would work, in that
grep -Eai "$title_name|$id|$status"
would select those text lines which contain the content of one of the variables title_name, id or status.
However, it is a pretty fragile solution. I don't know what the actual content of those variables can be, but, for instance, if title_name were set to X.Z, it would also match lines containing the string XYZ, since the dot matches any character. Similarly, if title_name contained, say, a lone [ or (, grep would complain about an unmatched bracket or parenthesis.
If you want the strings to be matched literally and not taken as regular expressions, it is better to write those patterns into a file (one pattern per line) and use
grep -F -f patternfile
for searching. Of course, since you are using bash, you can also use process substitution if you prefer not to use an explicit temporary file.
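With process substitution, that could look roughly like this (a sketch that reuses the curl command and variable names from the question):
curl -u username:password -sX GET "http://X.X.X.X:8090/rest/api/content?type=page&start=0&limit=200" |
    python -mjson.tool |
    grep -Fai -f <(printf '%s\n' "$title_name" "$id" "$status")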
What I am trying to do is encode each character by shifting it 13 places from its current position.
For example, if I input welcome, it should echo jrypbzr.
This is what I wrote:
read words
echo $words | tr '[A-Za-z]' '[????]' (Please ignore the ???? part.)
This successfully solves the encoding problem; however, I need to enter input multiple times, and the code I wrote reads only once. Can someone tell me how to read input multiple times?
Thanks!
First, have your input in a text file. Then
while read words
do
# here, do whatever you want with words
done < your-input-file.txt
Explanation: you feed the contents of the input file to the while loop, which reads it line by line and stores each line in words.
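Applied to the tr example from the question, that would look something like this (the ???? placeholder is kept as in the question):
while read -r words
do
    echo "$words" | tr '[A-Za-z]' '[????]'   # ???? is the placeholder from the question
done < your-input-file.txt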
If the fields on each line are separated by something other than whitespace, you can set IFS just for the read:
while IFS=";" read words
and place whatever separator you like inside the quotes after IFS= (note that this controls how read splits a line into fields, not how the lines themselves are delimited).
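For IFS to actually split anything, read needs more than one variable name. A small sketch (the file name and field names here are made up for illustration):
# hypothetical fields.txt with lines like "alice;42;london"
while IFS=";" read -r name age city
do
    echo "name=$name age=$age city=$city"
done < fields.txt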
I'm using the read builtin to read a variable, but I'd like the input to appear on the next line, i.e. have the prompt output a newline. Neither of the two works:
$ read -p "Please input:\n" name
Please input:\n
$ read -p 'Please input:\n' name
Please input:\n
As you can see, the newline escape sequence is not interpreted even in the double-quote case. So is there any way to do that?
You can separate the prompt from the actual read:
echo "Please input:"
read name
You can put both on a single line:
echo "Please input:" ; read name
You can also use a different form of quoting :
read -p $'Please input\n' name
This is barely shorter, and many would probably find it a bit less readable, but that is a matter of taste.
I'm new to Linux and have been writing some beginner-level shell scripts.
What I want to do is write 2 scripts. The first script will read input from the user, and the 2nd script will display this input in a loop till it detects an "exit" from the user.
This is how I've coded the 2 shell scripts.
File1.sh:
read var1
echo $var1
File2.sh:
while [ "$var2" != "exit" ]
do
echo $1
read var2
done
Now, I want to use a named pipe to pass the output of File1.sh as input to var1 of File2.sh. I will probably have to modify the code in File2.sh so that it accepts its argument from the named pipe (i.e. instead of $1, the input will come from the named pipe), but I'm not at all sure how to go about it.
Giving the output of File1.sh as input to the named pipe can be given as follows:
mkfifo pipe
./File1.sh > pipe
This command keeps asking for input until I break out using Ctrl+C. I don't know why that is.
Also how do I make the File2.sh read from this pipe?
Will this be correct?
pipe|./File2.sh
I'm very new to Linux, but I've searched quite a lot online and there isn't even one example of doing this in a shell script.
As for your original question, the syntax to read from a named pipe (or any other object in the file system) is
./File2.sh <pipe
Also, your script needs to echo "$var2" with the correct variable name, and double quotes to guard the value against wildcard expansion, word splitting, etc. See also When to wrap quotes around a shell variable?
The code in your own answer has several new problems.
In File1.sh, you are apparently attempting to declare a variable pipe1, but the assignment syntax is wrong: You cannot have whitespace around the equals sign. Because you never use this variable for anything, this is by and large harmless (but will result in pipe1: command not found which is annoying, of course).
In File2.sh, the while loop's syntax is hopelessly screwed; you dropped the read; the echo still lacks quotes around the variable; and you repeatedly reopen the pipe.
while [ "$input" != "exit" ]
do
read -r input
echo "$input"
done <pipe1
Redirecting the entire loop once is going to be significantly more efficient.
Notice also the option -r to prevent read from performing any parsing of the values it reads. (The ugly default behavior is legacy from the olden days, and cannot be fixed without breaking existing scripts, unfortunately.)
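Putting the pieces together with the mkfifo and redirection commands already shown, the usage would look roughly like this (a reader blocks on the FIFO until a writer opens it, so each script goes in its own terminal):
# terminal 1: create the FIFO and start the reader
mkfifo pipe
./File2.sh <pipe

# terminal 2: feed it from the first script
./File1.sh >pipe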
First, in File1.sh, echo var1 should be echo $var1.
In order to get input from the pipe, try:
./File2.sh < pipe
This is how I solved it.
The first mistake I made was to declare the pipe outside the programs. What I was expecting was that there is a special way for a program to accept an input parameter of the type "pipe", which, as far as I can tell, is wrong.
What you need to do is declare the pipe inside the program. So in the program that reads the user's input, you do the following.
For File1.sh:
pipe1=/Documents
mkfifo pipe1
cat > pipe1
This will send the read input from the user to the pipe.
Now, when the pipe is open, it will keep accepting input. You can read from the pipe only while it's open. So you need to open a 2nd terminal window to run the 2nd program.
For File2.sh:
while("$input" != "exit")
do
read -r input < pipe1
echo "$input"
done
So whenever you input some string in the first terminal window, it will be reflected in the 2nd terminal window until "exit" is detected.
I am running a find command to get a list of files having a particular size, and then I save the output in a file. I then traverse through this file one entry at a time and ask the user which one they want to delete. I wanted to do something like adding a number next to each file in the list, so that the user can directly enter the number associated with a file to delete it, instead of having to go over the whole file. Please help.
select f in $(find . -name '*.txt'); do
if [ -n "$f" ]; then
# put your command here
echo "rm $f"
fi
done
find . -size 5k -okdir rm {} ";"
asks you for each file, whether to perform the action or not, without an intermediate file.
The -okdir is a GNU extension to find, and not available in all implementations.
Another, lean approach is to use select:
select fno in $(find . -size 5k);
do
echo rm $fno
done
which is a bashism, maybe not present in your shell.
help select shows its usage. Unfortunately, like the find solution, it doesn't allow selecting multiple entries at once, but you get repeated opportunities to select something until you hit Ctrl+D, which is quite comfortable.
select: select NAME [in WORDS ... ;] do COMMANDS; done
Select words from a list and execute commands.
The WORDS are expanded, generating a list of words. The
set of expanded words is printed on the standard error, each
preceded by a number. If `in WORDS' is not present, `in "$@"'
is assumed. The PS3 prompt is then displayed and a line read
from the standard input. If the line consists of the number
corresponding to one of the displayed words, then NAME is set
to that word. If the line is empty, WORDS and the prompt are
redisplayed. If EOF is read, the command completes. Any other
value read causes NAME to be set to null. The line read is saved
in the variable REPLY. COMMANDS are executed after each selection
until a break command is executed.
Exit Status:
Returns the status of the last command executed.
This is what it looks like:
select fno in *scala ; do echo "fno: " $fno; done
1) Cartesian.scala 6) MWzufall.scala
2) Haeufigkeit.scala 7) Shuffle.scala
3) HelloChain.scala 8) eHWCChain.scala
4) Lychrel.scala 9) equilibrum.scala
5) M.scala 10) scala
#? 3
fno: HelloChain.scala
#? 3 4
fno:
#?
Note that the words are separated by spaces, so you have to take care with the second example if you have to work with whitespace in filenames.
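One way to handle such filenames (a sketch, assuming bash 4.4 or newer for mapfile -d '' and GNU find for -print0) is to collect the results into an array first and let select number the array entries:
# collect NUL-delimited results into an array, then offer a numbered menu
mapfile -d '' -t files < <(find . -size 5k -print0)
select f in "${files[@]}"
do
    # echo first; replace with the real rm once the selection looks right
    [ -n "$f" ] && echo rm -- "$f"
done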