How to read a file line by line, pass each line as an argument to a .exe file, and capture the output to another file in Linux

I am trying to write a shell script that reads a .txt file line by line; each line should be used as the input to my .exe file, and I also want to capture the output of the .exe file and write it to another .txt file. My code looks like the following, but it doesn't work. Even when I provide the input manually, as in ./calculate.exe "1", the program does not take 1 as input and still asks me to enter it again.
#!/bin/bash
while IFS= read -r LINE; do
./calculate.exe "$LINE"
done < data.txt > f.txt

If calculate.exe normally gets its input from standard input, you need to pipe the variable to it, not use an argument.
#!/bin/bash
while IFS= read -r LINE
do
printf "%s\n" "$LINE" | ./calculate.exe
done < data.txt > f.txt
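If calculate.exe keeps reading numbers from standard input until end-of-file (an assumption worth checking against the real binary), the loop is unnecessary and the whole file can be redirected in one go. Here a small shell function stands in for the binary:

```shell
#!/bin/bash
# Stand-in for calculate.exe: reads numbers from stdin, prints their doubles.
calculate() { while IFS= read -r n; do echo $((n * 2)); done; }

printf '%s\n' 1 2 3 > data.txt
calculate < data.txt > f.txt   # one redirection instead of a per-line loop
cat f.txt                      # 2, 4 and 6 on separate lines
```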

Related

Redirecting from a .txt file to an executable file in Linux

I have a text file with all the numbers from 1 to 1000 (numbers.txt) and an executable file (ex2-1) that reads the numbers from 1 to 1000 one by one as input; when it has received all of them it prints "Done!".
When you run the file you see:
please insert 1:
If you enter 1, it shows the same prompt but with 2; if you enter anything else, it prints "wrong input" and exits.
I know how to read from a text file line by line:
#!/bin/bash
filename='numbers.txt'
while read line; do
echo "$line" #echo is just to show where the number is being saved
done < "$filename"
But is there any way to redirect so that instead of being printed to the screen it will go to the executable file?
You can run all these commands in a subshell and then redirect its output through a pipe to a process corresponding to a running instance of the executable file ex2-1:
(filename='numbers.txt'; while read line; do echo "$line"; done < "$filename") | ex2-1
However, as you read the file line by line, you could simply run cat on the file numbers.txt instead:
cat numbers.txt | ex2-1
or, even more concisely, with just a single process:
ex2-1 < numbers.txt
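Since the file just holds the numbers 1 to 1000, it can even be generated on the fly with seq, assuming ex2-1 only cares about its standard input. Here wc -l stands in for ex2-1 to show the stream is identical:

```shell
seq 1 1000 > numbers.txt      # one number per line, same as the original file
seq 1 1000 | wc -l            # pipes the identical 1000-line stream
# with the real binary:  seq 1 1000 | ./ex2-1
```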

For loop in command line runs bash script reading from text file line by line

I have a bash script which asks for two arguments with a space between them. Now I would like to automate filling in those arguments on the command line by reading them from a text file. The text file contains a list with the argument combinations.
So something like this in the command line I think;
for line in 'cat text.file' ; do script.sh ; done
Can this be done? What am I missing/doing wrong?
Thanks for the help.
A while loop is probably what you need. Put the space-separated strings in the file text.file:
cat text.file
bingo yankee
bravo delta
Then write the script in question like below.
#!/bin/bash
while read -r arg1 arg2
do
/path/to/your/script.sh "$arg1" "$arg2"
done < text.file
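An equivalent without an explicit loop is xargs, which chops its input into groups of tokens; -n 2 hands two whitespace-separated words to each invocation. This assumes the arguments themselves never contain spaces, and echo stands in for script.sh here:

```shell
printf 'bingo yankee\nbravo delta\n' > text.file
xargs -n 2 echo RUN < text.file
# RUN bingo yankee
# RUN bravo delta
```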
Don't use for to read files line by line
Try something like this:
#!/bin/bash
ARGS=
while IFS= read -r line; do
ARGS="${ARGS} ${line}"
done < ./text.file
script.sh $ARGS   # unquoted on purpose so the collected words split into separate arguments
This would add each line to a variable which then is used as the arguments of your script.
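Whether $ARGS is quoted in that final call matters: quoted, the whole accumulated string is a single argument; unquoted, the shell splits it at whitespace into separate arguments. A quick way to see the difference (count_args is a throwaway helper, not part of the original script):

```shell
count_args() { echo $#; }      # prints how many arguments it received
ARGS=" bingo yankee"
count_args "$ARGS"             # 1 : quoted, the whole string is one argument
count_args $ARGS               # 2 : unquoted, split at whitespace
```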
'cat text.file' is a string literal; $(cat text.file) would expand to the output of the command. However, cat is useless here because bash can read a file using redirection. Also, with quotes the expansion would be treated as a single argument; without them it is split at spaces, tabs and newlines.
Bash syntax to read a file line by line (it will be slow for big files):
while IFS= read -r line; do ... "$line"; done < text.file
Setting IFS to empty for the read command preserves leading and trailing whitespace.
The -r option preserves backslashes.
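The effect of both options can be checked directly; the file contents here are just illustration:

```shell
printf '  a\\b\n' > demo.txt      # file contains: two spaces, a, backslash, b
IFS= read -r line < demo.txt
echo "[$line]"                    # [  a\b] : whitespace and backslash kept
read line < demo.txt
echo "[$line]"                    # [ab]    : spaces stripped, backslash eaten
```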
Another way to read a whole file is content=$(<file); note the < inside the command substitution. So a creative way to read a file into an array, with each element a non-empty line:
read_to_array () {
local oldsetf=${-//[^f]} oldifs=$IFS
set -f
IFS=$'\n' array_content=($(<"$1")) IFS=$oldifs
[[ $oldsetf ]]||set +f
}
read_to_array "file"
for element in "${array_content[@]}"; do ...; done
oldsetf stores the current set -f / set +f setting
oldifs stores the current IFS
IFS=$'\n' splits on newlines (consecutive newlines are treated as one, so empty lines are dropped)
set -f avoids glob expansion, for example in case a line contains a single *
note the () around $() that store the result of the splitting in an array
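On bash 4 and later, the builtin mapfile (also spelled readarray) does the same job without juggling IFS or set -f, though unlike the function above it keeps empty lines as empty elements:

```shell
#!/bin/bash
printf 'one\n\ntwo three\n' > file
mapfile -t array_content < file     # -t strips each line's trailing newline
echo "${#array_content[@]}"         # 3  (the empty line is element 1)
echo "${array_content[2]}"          # two three
```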
If I were to create a solution determined by the literal of what you ask for (using a for loop and parsing lines from a file) I would use iterations determined by the number of lines in the file (if it isn't too large).
Assuming each line has two strings separated by a single space (to be used as positional parameters in your script):
file="$1"
f_count="$(wc -l < "$file")"
for line in $(seq 1 "$f_count")
do
script.sh $(head -n "$line" "$file" | tail -n 1) && wait
done
You may have a much better time using sjsam's solution however.

Save Bash Shell Script Output To a File with a String

I have an executable that takes a file and outputs a line.
I am running a loop over a directory:
for file in $DIRECTORY/*.png
do
./eval $file >> out.txt
done
The output of the executable does not contain the name of the file.
I want to append the file name with each output.
EDIT1
Perhaps I could not explain it correctly.
I want the name of the file as well as the output of the program that processes that same file. Now I am doing the following:
for file in $DIRECTORY/*.png
do
echo -n $file >> out.txt
or
printf "%s" "$file" >> out.txt
./eval $file >> out.txt
done
In both cases a newline is inserted.
If I understood your question, what you want is:
get the name of the file,
...and the output of the program processing the file (in your case, eval),
...on the same line. And this last part is your problem.
Then I'd suggest composing a single line of text (using echo), comprising:
the name of the file, this is the $file part,
...followed by a separator, you may not need that but it may help further processing of the result. I used ":". You can skip this part if this is not interesting for you,
...followed by the output of the program processing the file: this is the $(...) construct
echo "$file" ":" "$(./eval "$file")" >> out.txt
...and finally appending this line of text to a file, you got that part right.
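printf builds the same line with an explicit format string and survives file names containing spaces; describe is a stand-in for ./eval here:

```shell
describe() { echo "classified as png"; }     # stand-in for ./eval
file="some image.png"
printf '%s: %s\n' "$file" "$(describe "$file")" >> out.txt
cat out.txt                                  # some image.png: classified as png
```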
Try it like this:
echo -n "$(echo "$file" | tr -d '\n')" >> out.txt
OR
newname=$(echo "$file" | tr -d '\n')
echo -n "$newname" >> out.txt

While loop in bash using variable from txt file

I am new to bash and writing a script to read variables stored one per line in a text file (there are thousands of them). So I tried to write a script that reads the lines, outputs the solution to the screen, and saves it into another text file.
./reader.sh > solution.text
The problem I encounter: currently I have only one variable stored in Sheetone.txt for testing purposes, which should take about 2 seconds to process, but the script gets stuck in the while loop and outputs nothing.
#!/bin/bash
file=Sheetone.txt
while IFS= read -r line
do
echo sh /usr/local/test/bin/test -ID $line -I
done
As indicated in the comments, you need to provide "something" to your while loop. The while construct runs as long as its condition (here, read) succeeds; if a file is redirected into it, it proceeds until read has exhausted the file. Without any input, read just waits.
#!/bin/bash
file=Sheetone.txt
while IFS= read -r line
do
echo sh /usr/local/test/bin/test -ID $line -I
done < "$file"
# -----^^^^^^^ a file!
Otherwise, it was like cycling without wheels...

linux reading file line by line and passing to another program

I have an input file of this form:
Some text here
Another text here
Something else here
Now I want to write a Linux script that picks one line at a time from the input file and creates a separate file storing just that line. After this I want to pass that file to a program (for which I only have a binary). Is it possible to write such a script? I am used to programming in C++ and know it is possible there, but I want to know whether something like this can be done with a shell script. Basically I intend to do the following:
read inputfile.txt line by line
store line read in inputFileInside.txt
./myprogram parameter1 inputFileInside.txt # run my C++ binary, which I usually run as: root$ ./myprogram parameter1 inputFileInside.txt
sudo sh -c "sync; echo 3 > /proc/sys/vm/drop_caches"
exit when the input file has been read
You can read the file line by line like this using a while loop:
while read x
do
echo "$x" > inputFileInside.txt
# do whatever you want
done < inputfile.txt
This may help you to loop; $x is the line, read one at a time until the end of the file is reached.
while read x
do
echo "$x" > "$2"
./myprogram parameter1 "$2"
#your other command
done < $1;
Save the above as a file with any name, e.g. prog.sh, then give it execute permission and run your program with arguments:
chmod u+x prog.sh
./prog.sh inputfile.txt inputFileInside.txt
Here $1 is inputfile.txt and $2 is inputFileInside.txt.
while read line; do
echo "$line" > inputFileInside.txt
./myprogram parameter1 inputFileInside.txt \
&& sudo sh -c "sync; echo 3 > /proc/sys/vm/drop_caches"
done < inputfile.txt
This will:
input the file "inputfile.txt" line-by-line
each line will be assigned to $line as it is read
echo the line into "inputFileInside.txt" overwriting any previous information in it
run "./myprogram ..."
then only run "sudo ..." if "./myprogram ..." was successful
go to next line.
Make sure there is no white space (spaces) behind the end-of-line backslash (\).
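The success-only chaining can be verified in isolation; true and false stand in for ./myprogram succeeding or failing:

```shell
true  && echo "ran after success"     # prints: ran after success
false && echo "ran after failure"     # prints nothing; false short-circuits &&
```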
