Bash script that loops while writing to a file (Linux)

I would like to create a shell script that writes some configuration settings to an XML configuration file on Ubuntu. However, the settings are for an MQ cluster, and I need the script to loop a varying number of times (set by an input parameter), once for each of the nodes being established.
The XML I would like to write to the file is:
<listeners>
<tcp-listener>
<port>1883</port>
<bind-address>10.0.0.4</bind-address>
</tcp-listener>
</listeners>
<mqtt>
<max-client-id-length>65535</max-client-id-length>
<retry-interval>10</retry-interval>
<max-queued-messages>1000</max-queued-messages>
</mqtt>
<cluster>
<enabled>true</enabled>
<transport>
<tcp>
<bind-address>10.0.0.4</bind-address>
<bind-port>7800</bind-port>
</tcp>
</transport>
<discovery>
<static>
<node>
<host>10.0.0.5</host>
<port>7800</port>
</node>
<node>
<host>10.0.0.6</host>
<port>7800</port>
</node>
<node> n times </node>
</static>
</discovery>
<failure-detection>
<heartbeat>
<enabled>true</enabled>
<interval>5000</interval>
<timeout>15000</timeout>
</heartbeat>
</failure-detection>
</cluster>
So basically, the number of <node> elements needs to reflect the variable the script takes in.
But I am not sure how to loop while writing to the file. I was looking into using the tee command, but that doesn't let me loop. I guess I could write the file up to the node elements, then loop through doing a write for each node.
Here is what I have so far that just writes static text:
tee /opt/hivemq/conf/config.xml > /dev/null <<'EOF'
<the xml goes here>
EOF
exit 0
Is there a way to loop during the write? Or do I need to write up to the looped element, stop writing, then have a loop that does multiple writes based on the loop counter, and finally write the last part?
Any help would be greatly appreciated.

There's a command called seq that you could use to help with iterating, so try something like:
#!/bin/bash
end=$1
(
  echo start
  for num in $(seq 1 "$end")
  do
    node="10.0.0.$num"
    echo "$node"
  done
  echo end
) > out.xml
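Applying the same idea to the original problem, here is a minimal sketch that writes the fixed parts of the config with heredocs and loops only for the <node> entries. The XML is abbreviated, and the 10.0.0.$((i + 4)) addressing (nodes at 10.0.0.5, 10.0.0.6, ...) is an assumption taken from the example addresses; adjust it to your own layout:
#!/bin/bash
node_count=$1    # number of discovery nodes, passed as the first argument

{
cat <<'XML'
<listeners>
    <tcp-listener>
        <port>1883</port>
        <bind-address>10.0.0.4</bind-address>
    </tcp-listener>
</listeners>
<cluster>
    <enabled>true</enabled>
    <discovery>
        <static>
XML

# one <node> block per requested node
for i in $(seq 1 "$node_count"); do
cat <<XML
            <node>
                <host>10.0.0.$((i + 4))</host>
                <port>7800</port>
            </node>
XML
done

cat <<'XML'
        </static>
    </discovery>
</cluster>
XML
} > /opt/hivemq/conf/config.xml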

Related

Redirect parallel process bash script output to individual log file

I have a requirement where I need to pass multiple arguments to the script to trigger a parallel process for each argument. Now I need to capture each process's output in a separate log file.
for arg in test_{01..05} ; do bash test.sh "$arg" & done
The piece of code above only gives parallel processing for the input arguments. I tried exec > >(tee "/path/of/log/$arg_$(date +%Y%m%d%H).log") 2>&1, but it only created a single log file named with just the date, and the file was empty. Can someone suggest what's going wrong here, or whether there is a better way, other than using the parallel package?
Try:
data_part=$(date +%Y%m%d%H)
for arg in test_{01..05} ; do bash test.sh "$arg" > "/path/to/log/${arg}_${data_part}.log" & done
If I use "$arg_$(date +%Y%m%d%H).log", it creates a file with just the date and without the arg.
Yes, because $arg_ is parsed as a variable name:
arg_=blabla
echo "$arg_" # will print blabla
echo "${arg_}" # equal to the above
To separate _ from arg, use braces: "${arg}_" expands the variable arg and then appends the literal string _.
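A quick way to see the difference (illustrative values only):
arg=test_01
echo "$arg_suffix.log"    # expands the unset variable arg_suffix, so this prints just ".log"
echo "${arg}_suffix.log"  # expands arg and appends _suffix.log, printing "test_01_suffix.log"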

balancing the bash calculations

We have a tool for cutting adaptors (https://github.com/vsbuffalo/scythe/blob/master/README.md), and we want to run it on all the files in the raw folder, writing the output for each file separately as OUT+filename.
Something is wrong with the script I wrote, because it doesn't process each file separately and doesn't work properly: it just generates an empty file named OUT+files.
The expected operation looks like this:
take file1, use scythe on it, write output as OUTfile1
take file2 etc.
#!/bin/bash
FILES=/home/dave/raw/*
for f in $FILES
do
  echo "Processing the $f file..."
  /home/deve/scythe/scythe -a /home/dev/scythe/illumina_adapters.fa -o "OUT"+$f $f
done
Additionally, I noticed (testing for a single file) that the script uses only one core out of 130 available. Is there any way to improve it?
There is no string concatenation operator in shell. Use juxtaposition instead; it's "OUT$f", not "OUT"+$f.
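Putting that together, a sketch of the corrected loop (the scythe and adapter paths are copied verbatim from the question and may themselves need fixing; prefixing the basename avoids producing an invalid path like OUT/home/dave/raw/file1, and backgrounding each run is one simple way to use more than one core):
#!/bin/bash
for f in /home/dave/raw/*
do
  echo "Processing the $f file..."
  # prefix the file's basename rather than its full path, and run each job in the background
  /home/deve/scythe/scythe -a /home/dev/scythe/illumina_adapters.fa \
      -o "OUT$(basename "$f")" "$f" &
done
wait   # wait for all background scythe jobs to finish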

In bash, Loop and send commands to execute

I'm trying to execute a program and send it commands based on a file that I'm reading, but as soon as I send the first command the loop stops. It must be related to pipes, but I would appreciate it if someone could explain how they work.
Code:
function setValue {
  echo "$1" >&9
}

mkfifo program.fifo
./program
exec 9> program.fifo

while read value
do
  setValue "$value"
done < file.csv
file.csv has 8000 rows, but the program stops after the first line and finishes execution without looping over any more lines of the CSV.
By opening the pipe for reading as well as writing, the loop is not finished after reading the first line and can continue to the end of the file:
exec 9<> program.fifo
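Putting the pieces together, a minimal sketch (it assumes ./program reads its commands from standard input; adjust to however your program actually consumes the fifo):
#!/bin/bash
mkfifo program.fifo
./program < program.fifo &   # assumption: the program reads its commands from the fifo

exec 9<> program.fifo        # open read-write so the shell neither blocks here nor delivers EOF early

setValue() {
  echo "$1" >&9
}

while read -r value
do
  setValue "$value"
done < file.csv

exec 9>&-                    # close the last write end so the program finally sees end of input
wait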
What is reading from program.fifo? If there is no process that is reading it, then your script is blocked trying to write. That write will not complete until some process is reading the fifo.
My guess is that the code you aren't showing us in the while loop is consuming all of its stdin, which is the same as the input of the while loop. The cleanest way to avoid that is to ensure that the commands in the loop are not reading from the same file as the loop. There are many ways to do that; I like:
while read value; do {
commands
} < /dev/null
done < file.csv
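A classic example of such a command is ssh, which reads the rest of the loop's input unless its stdin is redirected (hosts.txt is an illustrative file name):
while read -r host; do
  ssh "$host" uptime < /dev/null   # without < /dev/null, ssh would swallow the remaining lines of hosts.txt
done < hosts.txt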

Need help to make my program automated

I just wanted to get some ideas on how I should approach this. I am trying to automate reporting back to a database with a bunch of commands like this: java -jar snet_client.jar -mode report -id 13528 -props /int2/contact/client0.properties &. Let's say I have hundreds of these commands, each with a unique number like the one in the example (13528). I need to put that in a loop so that I don't need to write or copy and paste those hundreds of commands over and over to execute them. Any suggestion would be helpful. It has to be on Unix.
This first bash script iterates over each line in the file textfile, assuming that each of the id values is on its own line, starts the java process, and waits for it to complete before starting the next one.
# Queueing
# This one will only start the next process when the previous one completes.
OLD_IFS=$IFS
while IFS=$'\n' read -r line_data; do
  java -jar snet_client.jar -mode report -id "${line_data}" -props /int2/contact/client0.properties &
  wait
done < /path/to/textfile
IFS=$OLD_IFS
Alternatively, this script does the same as far as getting the id values from a text file, but doesn't wait for one run to complete before the next is started. This will likely cause problems if the snet_client.jar program is very resource-intensive:
# Non-queueing
# This starts and runs all the processes at once
OLD_IFS=$IFS
while IFS=$'\n' read -r line_data; do
  java -jar snet_client.jar -mode report -id "${line_data}" -props /int2/contact/client0.properties &
done < /path/to/textfile
IFS=$OLD_IFS
On both, we store the current IFS value before we begin so we can reset it after the process runs, just in case we need it set back for something later in the script file.
I have not tested these (since I don't have the dependencies available), so you might have to make adjustments for your own environment.
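If running everything at once is too heavy but one-at-a-time is too slow, a middle-ground sketch (this relies on wait -n, which needs bash 4.3 or newer, and the limit of 4 concurrent jobs is an arbitrary example value):
# Limited concurrency: keep at most 4 java processes running at a time.
max_jobs=4
while IFS= read -r line_data; do
  java -jar snet_client.jar -mode report -id "${line_data}" -props /int2/contact/client0.properties &
  while (( $(jobs -rp | wc -l) >= max_jobs )); do
    wait -n   # wait for any one job to finish before starting another
  done
done < /path/to/textfile
wait          # wait for whatever is still running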

Reading the path of files as string in shell script

My Aim -->
The file listing from a command has to be read line by line and used as part of another command.
Description -->
A command in Linux returns
archive/Crow.java
archive/Kaka.java
mypmdhook.sh
which is stored in the changed_files variable. I use the following while loop to read the files line by line and use each one as part of a pmd command:
while read each_file
do
echo "Inside Loop -- $each_file"
done<$changed_files
I am new to writing shell scripts, but my assumption was that the lines would be separated in the loop and printed in each iteration. Instead I get the following error:
mypmdhook.sh: 7: mypmdhook.sh: cannot open archive/Crow.java
archive/Kaka.java
mypmdhook.sh: No such file
Can you tell me how I can just get the value as a string (and later use it inside a command), rather than as a file to be opened? By the way, the file does exist, which made me feel even more confused. I'd be happy with any kind of answer that helps me understand and resolve this issue.
Since you have data stored in a variable, use a "here string" instead of file redirection:
changed_files="archive/Crow.java
archive/Kaka.java
mypmdhook.sh"
while read each_file
do
echo "Inside Loop -- $each_file"
done <<< "$changed_files"
Inside Loop -- archive/Crow.java
Inside Loop -- archive/Kaka.java
Inside Loop -- mypmdhook.sh
It is extremely important to quote "$changed_files" in order to preserve the newlines so the while-read loop works as you expect. A rule of thumb: always quote variables, unless you know exactly why you want to leave the quotes off.
What happens here is that the value of your variable $changed_files is substituted into your command, and you get something like
while read each_file
do
echo "Inside Loop -- $each_file"
done < archive/Crow.java
archive/Kaka.java
mypmdhook.sh
then the shell tries to open the file for redirecting the input and obviously fails.
The point is that redirections (e.g. <, >, >>) in most cases expect filenames, but what you really need is to feed the contents of the variable to stdin. The most obvious way to do that is:
echo "$changed_files" | while read each_file; do echo "Inside Loop -- $each_file"; done
You can also use the for loop instead of while read:
for each_file in $changed_files; do echo "inside Loop -- $each_file"; done
I prefer using while read ... if there is a chance that some filename may contain spaces, but in most cases for ... in will work for you.
Rather than storing the command's output in a variable, use a while loop like this:
mycommand | while read -r each_file; do echo "Inside Loop -- $each_file"; done
If you're using BASH you can use process substitution:
while read -r each_file; do echo "Inside Loop -- $each_file"; done < <(mycommand)
By the way, your attempt done<$changed_files assumes that $changed_files holds a filename.
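One difference between the two forms worth knowing: the pipe runs the loop in a subshell, so variables set inside it are lost afterwards, while the process-substitution form keeps the loop in the current shell (mycommand is the same placeholder used above):
count=0
mycommand | while read -r each_file; do count=$((count + 1)); done
echo "$count"   # still 0: the loop ran in a subshell

count=0
while read -r each_file; do count=$((count + 1)); done < <(mycommand)
echo "$count"   # now reflects the number of lines mycommand produced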
