Reading a file line by line in Linux and passing each line to another program

I have an input file of this form:
Some text here
Another text here
Something else here
Now I want to write a Linux script that picks one line at a time from the input file and creates a separate file storing just that line. After this I want to pass the file to a program (for which I only have the binary). Is it possible to write such a script? I am used to programming in C++ and I know it is possible there, but I want to know if something like this can be done with a shell script. Basically I intend to do the following:
read inputfile.txt line by line
store line read in inputFileInside.txt
./myprogram parameter1 inputFileInside.txt   # run my C++ binary, which I usually run as: ./myprogram parameter1 inputFileInside.txt
sudo sh -c "sync; echo 3 > /proc/sys/vm/drop_caches"
exit when the input file has been read

You can read line by line like this using a while loop:
while read x
do
    echo "$x" > inputFileInside.txt
    # do whatever you want
done < inputfile.txt
This may help you loop; $x is the line read, one at a time, until the end of the file is reached.
while read x
do
    echo "$x" > "$2"
    ./myprogram parameter1 "$2"
    # your other commands
done < "$1"
Save the above in a file with any name, e.g. prog.sh, then give it execute permission and run your program with arguments:
chmod u+x prog.sh
./prog.sh inputfile.txt inputFileInside.txt
here $1 is inputfile.txt and $2 is inputFileInside.txt
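Putting the pieces together, the whole flow can be tried out end to end. In the sketch below, cat stands in for the asker's ./myprogram binary (which is not available here), and the sample input file is created inline:

```shell
#!/bin/bash
# Sketch of prog.sh: write each line of inputfile.txt into inputFileInside.txt
# (overwriting it each time), then hand that file to a program.
# "cat" is a stand-in for: ./myprogram parameter1 inputFileInside.txt

printf 'Some text here\nAnother text here\n' > inputfile.txt   # sample input

while IFS= read -r line; do
    printf '%s\n' "$line" > inputFileInside.txt   # file now holds only the current line
    cat inputFileInside.txt                       # stand-in for the real binary
done < inputfile.txt
```

IFS= and read -r keep leading whitespace and backslashes in each line intact, which plain `read x` does not.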

while read line; do
    echo "$line" > inputFileInside.txt
    ./myprogram parameter1 inputFileInside.txt \
        && sudo sh -c "sync; echo 3 > /proc/sys/vm/drop_caches"
done < inputfile.txt
This will:
input the file "inputfile.txt" line-by-line
each line will be assigned to $line as it is read
echo the line into "inputFileInside.txt" overwriting any previous information in it
run "./myprogram ..."
then only run "sudo ..." if "./myprogram ..." was successful
go to next line.
Make sure there is no whitespace (spaces) after the end-of-line backslash (\).

Related

Redirecting from a .txt file to an executable file in Linux

I have a text file (numbers.txt) with all the numbers from 1 to 1000, and an executable file (ex2-1) that reads the numbers from 1 to 1000 one by one as input and then prints "Done!".
When you run the file you see:
please insert 1:
If you enter 1 it shows the same prompt with 2; otherwise it prints "wrong input" and exits.
I know how to read from a text file line by line:
#!/bin/bash
filename='numbers.txt'
while read line; do
    echo "$line" # echo is just to show where the number is being saved
done < "$filename"
But is there any way to redirect so that instead of being printed to the screen it will go to the executable file?
You can run all these commands in a subshell and then redirect its output through a pipe to a process corresponding to a running instance of the executable file ex2-1:
(filename='numbers.txt'; while read line; do echo "$line"; done < "$filename") | ex2-1
However, as you read the file line by line, you could simply run cat on the file numbers.txt instead:
cat numbers.txt | ex2-1
or even more concise and with just a single process:
ex2-1 < numbers.txt
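To see that all three forms deliver the same bytes to the program's standard input, here is a small demonstration; wc -l stands in for ex2-1, and seq generates the numbers file:

```shell
# Generate the numbers file once.
seq 1 1000 > numbers.txt

# All three forms feed the same data to the program's stdin:
(while read -r line; do echo "$line"; done < numbers.txt) | wc -l   # prints 1000
cat numbers.txt | wc -l                                             # prints 1000
wc -l < numbers.txt                                                 # prints 1000
```

The last form is preferred: the shell opens the file and attaches it directly to the program's stdin, with no extra process.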

how to read a file line by line and each line as an argument input to a .exe file and capture the output to another file in linux

I am trying to write a shell script that reads a .txt file line by line; each line will be the input to my .exe file, and I also want to capture the output of the .exe file and write it to another .txt file. My code looks like this, but it doesn't work. When I try input manually, like ./caculate.exe "1", the program does not take 1 as input and still asks me to input it manually.
#!/bin/bash
while IFS= read -r LINE; do
./caculate.exe "$LINE"
done < data.txt > f.txt
If calculate.exe normally gets its input from standard input, you need to pipe the variable to it rather than pass it as an argument:
#!/bin/bash
while IFS= read -r LINE
do
    printf "%s\n" "$LINE" | ./calculate.exe
done < data.txt > f.txt
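The same pattern can be checked with a stand-in: here rev (which reverses each line) plays the role of ./calculate.exe, and the loop's combined output is redirected once to f.txt:

```shell
# Build a small sample input file.
printf 'abc\ndef\n' > data.txt

# Pipe each line to the command's stdin; redirect the whole loop's output.
while IFS= read -r LINE; do
    printf '%s\n' "$LINE" | rev     # stand-in for ./calculate.exe
done < data.txt > f.txt

cat f.txt   # shows: cba, then fed
```

Note that the `> f.txt` on the `done` line opens the file once for the whole loop, so every iteration's output is collected in order.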

While loop in bash using variable from txt file

I am new to bash and am writing a script to read variables stored one per line in a text file (there are thousands of them). The script should read each line, output the solution to the screen, and save it into another text file:
./reader.sh > solution.text
The problem I encounter: currently only one variable is stored in Sheetone.txt for testing purposes, so the script should take about 2 seconds to output everything, but it gets stuck in the while loop and does not output the solution.
#!/bin/bash
file=Sheetone.txt
while IFS= read -r line
do
echo sh /usr/local/test/bin/test -ID $line -I
done
As indicated in the comments, you need to provide "something" to your while loop. The while construct executes as long as its condition succeeds; if a file is redirected into it, it proceeds until read exhausts the input.
#!/bin/bash
file=Sheetone.txt
while IFS= read -r line
do
    echo sh /usr/local/test/bin/test -ID "$line" -I
done < "$file"
# -----^^^^^^^ a file!
Otherwise, it was like cycling without wheels...

Using while/read/do to pass the content of file as the argument of a command

I'm really new to Linux scripting. I am sure this is simple, but I cannot figure it out.
As part of a script, I am trying to pass the content of a file as arguments of a command in a script:
while read i
do $COMMAND $i
done < file.lst
I want to pass every line of file.lst as the argument of the command except the very first line of the file. How do I do this?
EDIT:
Here is the section of the script:
while read i
do cp --recursive --preserve=all $i $DIR
done < $DIR/file.lst
while read -r i
do
"$COMMAND" "$i"
done < <(sed -n '2,$p' file.lst)
This solution does not use a while loop, so I am not entirely sure it solves your problem, but based on your code sample you can do the following:
tail -n +2 input | xargs -n 1 echo
This reads all lines from input starting at line 2 and runs echo once for each line, with the line's value as the argument.
the file input contains:
skip
1
2
3
executing that command gives
1
2
3
Just substitute input for the file you want and echo for the command you want
Add an extra read to consume the first line before the while loop begins.
{
read -r;
while read -r i; do
"$COMMAND" "$i"
done
} < file.lst
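Both approaches can be checked against a small sample file; echo stands in for "$COMMAND" here:

```shell
# Sample input: a header line to skip, then three values.
printf 'skip\n1\n2\n3\n' > file.lst

# Approach 1: tail drops the first line, xargs runs the command per line.
tail -n +2 file.lst | xargs -n 1 echo

# Approach 2: a throwaway read consumes the header, then the loop runs.
{
    read -r _            # discard the first line
    while IFS= read -r i; do
        echo "$i"        # stand-in for: "$COMMAND" "$i"
    done
} < file.lst
```

Both print 1, 2, and 3, one per line. The read-based variant avoids the extra tail process and keeps everything inside one redirection.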

Read filenames from a text file and then make those files?

My code is given below. echo works fine, but the moment I redirect the output of echo to touch, I get the error "no such file or directory". Why? How do I fix it?
If I copy-paste the output of echo by hand, the file is created, but not with touch.
while read line
do
#touch < echo -e "$correctFilePathAndName"
echo -e "$correctFilePathAndName"
done < $file.txt
If you have one file name per line in your input file file.txt, then you don't need a loop at all. You can just do:
touch $(<file.txt)
to create all the files in one single touch command.
You need to provide the file name as argument and not via standard input. You can use command substitution via $(…) or `…`:
while read line
do
touch "$(echo -e "$correctFilePathAndName")"
done < $file.txt
Ehm, lose the echo part... and use the correct variable name.
while read line; do
touch "$line"
done < $file.txt
Try:
echo -e "$correctFilePathAndName" | touch
EDIT: Sorry, the correct piping is:
echo -e "$correctFilePathAndName" | xargs touch
The '<' redirects via stdin whereas touch needs the filename as an argument. xargs transforms stdin in an argument for touch.
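A quick way to verify the xargs variant (with hypothetical file names; this simple form assumes names without spaces):

```shell
# file.txt lists one file name per line; xargs turns stdin lines into
# arguments for touch, which creates the files.
printf 'alpha.txt\nbeta.txt\n' > file.txt
xargs touch < file.txt
ls alpha.txt beta.txt

# For names containing spaces, use NUL-delimited input instead:
#   tr '\n' '\0' < file.txt | xargs -0 touch
```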
