Execute n lines of shell script - linux

Is there a way to execute only a specified number of lines from a shell script? I could copy them with head and put them in a separate .sh, but I wonder if there's a shortcut...

Reorganize the shell script and create functions.
Seriously, put every line of code into a function.
Then (using ksh as an example), source the script with "." into an interactive shell.
You can now run any of the functions by name, and only the code within that function will run.
The following trivial example illustrates this. You can use this in two ways:
1) Link the script so you can call it by the name of one of the functions.
2) Source the script (with . script.sh) and you can then reuse the functions elsewhere.
function one {
    print one
}
function two {
    print two
}
(
    progname=${0##*/}    # basename of the name the script was invoked as
    case $progname in
    (one|two)
        $progname "$@"   # run only the matching function, passing the arguments through
    esac
)
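To try it, a minimal usage sketch (assuming the file above is saved as script.sh):
. ./script.sh    # source the definitions into the current shell
one              # runs only the code inside function one
two              # runs only the code inside function two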

Write your own script, /tmp/headexecute for example:
#!/bin/ksh
trap 'rm -f /tmp/somefile' 0
head -n "$2" "$1" > /tmp/somefile   # copy the requested number of lines
chmod 755 /tmp/somefile
/tmp/somefile                       # run just those lines
Call it with the name of the file and the number of lines to execute:
/tmp/headexecute /tmp/originalscript 10
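A slightly safer variant of the same idea (a sketch; mktemp avoids collisions between concurrent runs):
#!/bin/ksh
tmp=$(mktemp) || exit 1
trap 'rm -f "$tmp"' 0
head -n "$2" "$1" > "$tmp"   # copy the first $2 lines of script $1
chmod 755 "$tmp"
"$tmp"                       # run just those lines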

Most shells have no such facility. You will have to do it the hard way.

This might work for you (GNU sed):
sed -n '1{h;d};H;2{x;s/.*/&/ep;q}' script
This executes the first two lines of a script.
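How it works: lines are accumulated in the hold space, and on the address line GNU sed's e flag executes the whole pattern space as a shell command. To run a different number of lines, change the second address; for example, for the first five lines (a sketch, GNU sed only):
sed -n '1{h;d};H;5{x;s/.*/&/ep;q}' script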

# x = starting line, y = number of lines to execute
eval "$(tail -n +"$x" script | head -n "$y")"

Related

File redirection fails in Bash script, but not Bash terminal

I am having a problem where cmd1 works, but not cmd2 in my Bash script ending in .sh. I have made the Bash script executable.
Additionally, I can execute cmd2 just fine from my Bash terminal. I have tried to make a minimally reproducible example, but my larger goal is to run a complicated executable with command line arguments and pass output to a file that may or may not exist (rather than displaying the output in the terminal).
Replacing > with >> also gives the same error in the script, but not the terminal.
My Bash script:
#!/bin/bash
cmd1="cat test.txt"
cmd2="cat test.txt > a"
echo $cmd1
$cmd1
echo $cmd2
$cmd2
test.txt has the words "dog" and "cat" on two separate lines without quotes.
Short answer: see BashFAQ #50: I'm trying to put a command in a variable, but the complex cases always fail!.
Long answer: the shell expands variable references (like $cmd1) toward the end of the process of parsing a command line, after it's done parsing redirects (like > a is supposed to be) and quotes and escapes and... In fact, the only thing it does with the expanded value is word splitting (e.g. treating cat test.txt > a as "cat" followed by "test.txt", ">", and finally "a", rather than a single string) and wildcard expansion (e.g. if $cmd expanded to cat *.txt, it'd replace the *.txt part with a list of matching files). (And it skips word splitting and wildcard expansion if the variable is in double-quotes.)
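A quick demonstration of the consequence (a sketch):
cmd2="cat test.txt > a"
$cmd2    # runs cat with the three arguments "test.txt", ">", and "a";
         # the ">" is passed literally, so cat typically fails with
         # something like: cat: '>': No such file or directory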
Partly as a result of this, the best way to store commands in variables is: don't. That's not what they're for; variables are for data, not commands. What you should do instead, though, depends on why you were storing the command in a variable.
If there's no real reason to store the command in a variable, then just use the command directly. For conditional redirects, just use a standard if statement:
if [ -f a ]; then
cat test.txt > a
else
cat test.txt
fi
If you need to define the command at one point, and use it later; or want to use the same command over and over without having to write it out in full each time, use a function:
cmd2() {
cat test.txt > a
}
cmd2
It sounds like you may need to define the command differently depending on some condition; you can actually do that with a function as well:
if [ -f a ]; then
cmd() {
cat test.txt > a
}
else
cmd() {
cat test.txt
}
fi
cmd
Alternately, you can wrap the command (without redirect) in a function, then use a conditional to control whether it redirects:
cmd() {
cat test.txt
}
if [ -f a ]; then
cmd > a
else
cmd
fi
It's also possible to wrap a conditional redirect into a function itself, then pipe output to it:
maybe_redirect_to() {
if [ -f "$1" ]; then
cat > "$1"
else
cat
fi
}
cat test.txt | maybe_redirect_to a
(This creates an extra cat process that isn't really doing anything useful, but if it makes the script cleaner, I'd consider that worth it. In this particular case, you could minimize the stray cats by using maybe_redirect_to a < test.txt.)
As a last resort, you can store the command string in a variable, and use eval to parse it. eval basically re-runs the shell parsing process from the beginning, meaning that it'll recognize things like redirects in the string. But eval has a well-deserved reputation as a bug magnet, because it's easy for it to treat parts of the string you thought were just data as command syntax, which can cause some really weird (& dangerous) bugs.
If you must use eval, at least double-quote the variable reference, so it runs through the parsing process just once, rather than sort-of-once-and-a-half as it would unquoted. Here's an example of what I mean:
cmd3="echo '5 * 3 = 15'"
eval "$cmd3"
# prints: 5 * 3 = 15
eval $cmd3
# prints: 5 [list of files in the current directory] 3 = 15
# ...unless there are any files with shell metacharacters in their names, in
# which case something more complicated might happen.
BashFAQ #50 discusses some other possible reasons and solutions. Note that the array approach will not work here, since arrays also get expanded after redirects are parsed.
If you pop an eval in front of $cmd2 it should work as expected:
#!/bin/bash
cmd2="cat test.txt > a"
eval "$cmd2"
If you're not sure about the operation of a script, you can always use debug mode to see if you can determine the error:
bash -x scriptname
This will run the script and print each command, with its variables expanded, as it executes. Hopefully this will reveal any issues with the syntax.

Loop for reading more than 1 query in a bash file

I need a loop in a Bash script (analysis-run.sh) for running many queries. As I have many queries I can't run them manually so I need a way to automate them. So far, I created a file inputs.txt with all my queries and at the end of the bash script file I added the following:
while read f ; do
./analysis-run.sh $f
done < inputs.txt
With that loop, analysis-run is only running the first query of inputs.txt over and over again. I am really new to this, so any help would be appreciated.
The content of inputs.txt is:
bones
muscles
blood
saliva
and so on..
The content of analysis-run.sh is:
# Execute this script as: ./analysis-run.sh [query] [group]
query=$1
group=$2
if [ $group = "clean" ]; then
cluster=A
else
cluster=B
fi
adamo-obtain_bundance.py -query $query -ref combined_$cluster.$group.align -splits 1 -group $group
adamo-obtain_structure.py -i $query.combined_$query$group.csv -o $query.$group -cutoff 0.5 -group $group
With that loop, analysis-run is only running the first query of inputs.txt over and over again.
The problem (probably) is that you need to quote $f:
while read -r f ; do
./analysis-run.sh "$f"
done < inputs.txt
Without the quotes, the line read from inputs.txt will be subject to word splitting and glob expansion.
Read http://tldp.org/LDP/abs/html/quotingvar.html and run your scripts through ShellCheck.
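To see why the quotes matter, here is a sketch with a hypothetical query containing a space and a glob:
f='blood *'
./analysis-run.sh $f     # word-split and globbed: "blood" plus every file in the current directory
./analysis-run.sh "$f"   # passed as exactly one argument: blood *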
Using loops in Bash can sometimes work, but it is loaded with perils.
Using xargs is usually the cleanest, most robust approach...
<inputs.txt xargs --max-args=1 do_something
The command to execute could be provided as a Bash function...
function do_something
{
echo value=${1}
}
The call to xargs is somewhat more involved when taking that approach, though. See: Calling functions with xargs within a bash script.
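For example, one way to make a shell function visible to xargs (a sketch, bash-specific):
do_something() { echo "value=$1"; }
export -f do_something                               # make the function visible to child shells
<inputs.txt xargs --max-args=1 bash -c 'do_something "$@"' _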
About xargs
xargs takes a list of arguments (usually file names), which are provided as an input file or stream, and it places those arguments on the command-line for another specified command or function. If the command can handle multiple input arguments, you can drop the --max-args=1 option.

how to pass asterisk into ls command inside bash script

Hi… Need a little help here…
I tried to emulate the DOS' dir command in Linux using bash script. Basically it's just a wrapped ls command with some parameters plus summary info. Here's the script:
#!/bin/bash
# default to current folder
if [ -z "$1" ]; then var=.;
else var="$1"; fi
# check file existence
if [ -a "$var" ]; then
# list contents with color, folder first
CMD="ls -lgG $var --color --group-directories-first"; $CMD;
# sum all files size
size=$(ls -lgGp "$var" | grep -v / | awk '{ sum += $3 }; END { print sum }')
if [ "$size" == "" ]; then size="0"; fi
# create summary
if [ -d "$var" ]; then
folder=$(find $var/* -maxdepth 0 -type d | wc -l)
file=$(find $var/* -maxdepth 0 -type f | wc -l)
echo "Found: $folder folders "
echo " $file files $size bytes"
fi
# error message
else
echo "dir: Error \"$var\": No such file or directory"
fi
The problem is that when the argument contains an asterisk (*), the ls within the script acts differently compared with the direct ls command given at the prompt. Instead of returning the whole file list, the script returns only the first file. See the video below for a comparison of the two in action. I don't know why it behaves like that.
Does anyone know how to fix it? Thank you.
Video: problem in action
UPDATE:
The problem has been solved. Thank you all for the answers. Now my script works as expected. See the video here: http://i.giphy.com/3o8dp1YLz4fIyCbOAU.gif
The asterisk * is expanded by the shell when it parses the command line. In other words, your script doesn't get a parameter containing an asterisk, it gets a list of files as arguments. Your script only works with $1, the first argument. It should work with "$@" instead.
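You can see the difference with a short sketch (using the two file names from the video):
set -- test.pas test.swift   # simulate what the script receives after the shell expands test*
echo "$1"    # prints only: test.pas
echo "$@"    # prints: test.pas test.swift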
This is because when you retrieve $1 you assume the shell does NOT expand *.
In fact, when * (or another glob) matches, it is expanded into one word per matching filename, and those words are passed as $1, $2, etc.
You're lucky that you simply got the first file. If that filename contains spaces, you'll get an error, because your script's own unquoted expansions then split it at the space.
Seriously, read this and especially this. Really.
And please don't do things like
CMD=whatever you get from user input; $CMD;
You are begging for trouble. Don't execute arbitrary strings from user input.
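If you really want to build the command up before running it, a bash array is the usual safer alternative (a sketch):
cmd=(ls -lgG "$var" --color --group-directories-first)
"${cmd[@]}"    # each array element stays exactly one argument; nothing is re-split or re-parsed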
Both of the answers above already answered your question, so I'm going to be a bit more verbose.
Your terminal is (probably) running the bash interpreter. This is the program which parses your input line(s) and does things based on that input.
When you enter a line, bash starts the following workflow:
parsing and lexical analysis
expansion:
brace expansion
tilde expansion
variable expansion
arithmetic and other substitutions
command substitution
word splitting
filename generation (globbing)
quote removal
Only after all of the above does bash
execute external commands, like ls or dir.sh, etc.,
or perform "internal" actions for the known keywords and builtins like echo, for, if, etc.
As you can see, the second-to-last step is filename generation (globbing). So, in your case, if test* matches some files, bash expands the wildcard characters (i.e. does the globbing).
So,
when you enter dir.sh test*,
and test* matches some files,
bash does the expansion first,
and then executes the command dir.sh with the already-expanded filenames,
e.g. the script gets executed (in your case) as: dir.sh test.pas test.swift
BTW, it works in exactly the same way for your ls example:
bash expands ls test* to ls test.pas test.swift,
then executes ls with the above two arguments,
and ls prints the result for the two arguments it got.
In other words, ls never even sees the test* argument: whenever possible, bash expands the wildcard characters (* and ?).
Now back to your script: add the following line after the shebang:
echo "$0 got these arguments: $@"
and you will immediately see the real arguments your script was executed with.
In such cases it is also good practice to try running the script in debug mode, e.g.
bash -x dir.sh test*
and you will see exactly what the script does.
You can do the same for your current interpreter: just enter
set -x
into the terminal and run dir.sh test* again, and you will see how bash executes the dir.sh command. (To stop the debug mode, just enter set +x.)
Everybody is giving you valuable advice which you should definitely follow!
But here is the direct answer to your question.
To pass unexpanded arguments to any executable you need to single quote them:
./your_script '*'
The best solution I have is to use the eval command, in this way:
#!/bin/bash
cmd="some command \"with_quotes_and_asterisk_in_it*\""
echo "$cmd"
eval "$cmd"
The eval command takes its arguments and evaluates them as the shell would evaluate a command line.
This solves my problem when I need to call a command with an asterisk * in it from a script.

Bash: execute a multi-command line string in a script

There is, in a file, some multi-command line like this:
cd /home/user; ls
In a bash script, I would like to execute these commands, adding some arguments to the last one. For example:
cd /home/user; ls -l *.png
I thought it would be enough to do something like this:
#!/bin/bash
commandLine="$(cat theFileWithCommandInside) -l *.png"
$commandLine
exit 0
But it says:
/home/user;: No such file or directory
In other words, the ";" character no longer means "end of command": the shell is trying to find a directory called "user;" in the home folder...
I tried replacing ";" with "&&", but the result is the same.
The point of your question is to execute a command stored in a string. There are thousands of ways to execute it indirectly, but eventually bash has to be involved.
So why not explicitly invoke bash to do the job?
bash -c "$commandLine"
From the documentation:
-c string
If the -c option is present, then commands are read from string. If there are arguments after the string, they are assigned to the positional parameters, starting with $0.
http://linux.die.net/man/1/bash
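For instance, a sketch of the positional-parameter behaviour the manual describes:
bash -c 'echo "first=$0 second=$1"' foo bar    # prints: first=foo second=bar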
Why don't you execute the commands themselves in the script, instead of "importing" them?
#!/bin/bash
cd /home/user; ls -l *.png
exit 0
Wrap the command into a function:
function doLS() {
cd /home/user; ls "$@"
}
"$@" expands to all arguments passed to the function. If you (or the snippet authors) add functions expecting a predefined number of arguments, you may find the positional parameters $1, $2, ... useful instead.
As the maintainer of the main script, you will have to make sure that everyone providing such a snippet provides that "interface" your code uses (i.e. their code defines the functions your program calls and their functions process the arguments your program passes).
Use source or . to import the function into your running shell:
#!/bin/bash
source theFileWithCommandInside
doLS -l *.png
exit 0
I'd like to add a few thoughts on the ; topic:
In other words, the ";" character doesn't mean anymore "end of the
command": The shell is trying to find a directory called "user;" in
the home folder...
; is not used to terminate a statement as in C-style languages. Instead it is used to separate commands that should be executed sequentially inside a list. Example executing two commands in a subshell:
( command1 ; command2 )
If the list is part of a group, it must be succeeded by a ;:
{ command1 ; command2 ; }
In your example, the expanded string is never re-parsed as shell syntax, so the ; is not recognized as a command separator (it stays a literal character inside the word /home/user;), and that is why your code does not run successfully.
The key is: eval.
Here is the fixed script (look at the third line):
#!/bin/bash
commandLine="$(cat theFileWithCommandInside) -l *.png"
eval "$commandLine"
exit 0
Using the <(...) process-substitution form (sed appends the extra arguments to the command line read from the file):
sh <(sed 's/$/ *.png/' theFileWithCommandInside)

Do a complete flux of work on bash script

I'm trying to automate a process which I have to do over and over again, in which I have to parse the output of a shell function, look for 5 different things, and then put them in a file.
I know I can match patterns with grep, but I don't know how to store the result in a variable so I can use it afterwards :(
I also have to parse this very same output to get the other 5 values.
I have no idea how to use the same output for the 5 greps I need to do and then store the results in 5 different variables for later use.
I know I have to create a nice and tidy .sh, but I don't know how to do this.
Currently I'm trying this:
#!/bin/bash
data=$(cat file)
lol=$(echo data|grep red)
echo $lol
Not working, any ideas?
You should show some examples of what you want to do next time.
Assuming your shell function is called func1:
func1(){
echo "things i want to get are here"
}
func1 | grep -E "things|want|are|here|get" > outputfile.txt
Update:
your code
#!/bin/bash
data=$(cat file)
lol=$(echo data|grep red)
echo $lol
practically just means this
lol=$(grep "red" file)
or
lol=$(awk '/red/' file)
Also, if you are considering using bash, this is one way you can do it:
while read -r myline
do
case "$myline" in
*"red"* ) echo "$myline" >> output.txt
esac
done <file
You can use the following syntax:
VAR=$(grep foo bar)
or alternatively:
VAR=`grep foo bar`
The easiest thing to do would be to redirect the output of the function to a file. You can then run multiple greps on it and only delete the file once you are done with it.
To save the output, you want to use command substitution. This runs a command and then substitutes its output into the command line. Combined with variable assignment you get:
variable=$(grep expression file)
The line lol=$(echo data|grep red) is wrong: data is missing its $. Change it to this:
lol=$(echo "$data"|grep red)
Use egrep (i.e. grep -E) instead of grep, so you can match all five patterns in one pass:
variable=$(egrep "exp1|exp2|exp3|exp4|exp5" file)
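Putting the pieces together for the original goal, a sketch that captures the output once and greps it as many times as needed (the expN patterns are placeholders):
data=$(cat file)               # capture the output once
v1=$(grep exp1 <<<"$data")     # then grep the saved copy for each value
v2=$(grep exp2 <<<"$data")
v3=$(grep exp3 <<<"$data")
echo "$v1"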
