I am trying to write a small shell script that makes multiple copies of a file. I am able to take the file name as input, but not the number of copies. Here is what I wrote, but I am unable to pass the NUMBER variable to the for loop.
echo -n "Enter filename: "
read FILENAME
echo -n "Number of copies to be made: "
read NUMBER
for i in {2..$NUMBER}
do
cp -f $FILENAME ${FILENAME%%.*}"_"$i.csv
done
Unfortunately it doesn't work like that. Bash performs brace expansion before parameter expansion, so your brace expression is expanded before $NUMBER is evaluated. See also Bash Pitfall #33, which explains the issue.
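You can see the ordering for yourself in an interactive shell (a quick check, assuming NUMBER is set to 5):
NUMBER=5
echo {2..$NUMBER}      # prints {2..5}: brace expansion ran first and only saw the literal text $NUMBER
echo $(seq 2 $NUMBER)  # prints 2 3 4 5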
One way to do this, using your code, would be:
for i in $(eval echo {2..$NUMBER})
do
# ...
done
Or, even shorter:
for i in $(seq 2 $NUMBER)
# ...
(thanks, Glenn Jackman!)
Note that typically, variables should be quoted. This is especially important for file names. What if your file is called foo bar? Then your cp -f would copy foo and bar since the arguments are split by whitespace.
So, do something like this:
cp -f "$FILENAME" "${FILENAME%%.*}_${i}.csv"
While it might not matter if your files don't contain whitespace, quoting variables is something you should do automatically to prevent any surprises in the future.
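For example, with a hypothetical file name containing a space:
FILENAME="foo bar.csv"
cp -f $FILENAME copy.csv     # word splitting: cp receives foo, bar.csv and copy.csv as three arguments
cp -f "$FILENAME" copy.csv   # quoted: cp receives "foo bar.csv" and copy.csv, as intended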
I am having a problem where cmd1 works, but not cmd2 in my Bash script ending in .sh. I have made the Bash script executable.
Additionally, I can execute cmd2 just fine from my Bash terminal. I have tried to make a minimal reproducible example, but my larger goal is to run a complicated executable with command line arguments and pass output to a file that may or may not exist (rather than displaying the output in the terminal).
Replacing > with >> also gives the same error in the script, but not the terminal.
My Bash script:
#!/bin/bash
cmd1="cat test.txt"
cmd2="cat test.txt > a"
echo $cmd1
$cmd1
echo $cmd2
$cmd2
test.txt has the words "dog" and "cat" on two separate lines without quotes.
Short answer: see BashFAQ #50: I'm trying to put a command in a variable, but the complex cases always fail!.
Long answer: the shell expands variable references (like $cmd1) toward the end of the process of parsing a command line, after it's done parsing redirects (like > a is supposed to be) and quotes and escapes and... In fact, the only thing it does with the expanded value is word splitting (e.g. treating cat test.txt > a as "cat" followed by "test.txt", ">", and finally "a", rather than a single string) and wildcard expansion (e.g. if $cmd expanded to cat *.txt, it'd replace the *.txt part with a list of matching files). (And it skips word splitting and wildcard expansion if the variable is in double-quotes.)
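To see this in a minimal form (using the test.txt from your example):
cmd2="cat test.txt > a"
$cmd2
# cat is run with the three arguments test.txt, > and a; it prints test.txt,
# then complains that it cannot open '>' (and 'a', if that file doesn't exist).
# Nothing is redirected.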
Partly as a result of this, the best way to store commands in variables is: don't. That's not what they're for; variables are for data, not commands. What you should do instead, though, depends on why you were storing the command in a variable.
If there's no real reason to store the command in a variable, then just use the command directly. For conditional redirects, just use a standard if statement:
if [ -f a ]; then
cat test.txt > a
else
cat test.txt
fi
If you need to define the command at one point, and use it later; or want to use the same command over and over without having to write it out in full each time, use a function:
cmd2() {
cat test.txt > a
}
cmd2
If you need to define the command differently depending on some condition, you can actually do that with a function as well:
if [ -f a ]; then
cmd() {
cat test.txt > a
}
else
cmd() {
cat test.txt
}
fi
cmd
Alternately, you can wrap the command (without redirect) in a function, then use a conditional to control whether it redirects:
cmd() {
cat test.txt
}
if [ -f a ]; then
cmd > a
else
cmd
fi
It's also possible to wrap a conditional redirect into a function itself, then pipe output to it:
maybe_redirect_to() {
if [ -f "$1" ]; then
cat > "$1"
else
cat
fi
}
cat test.txt | maybe_redirect_to a
(This creates an extra cat process that isn't really doing anything useful, but if it makes the script cleaner, I'd consider that worth it. In this particular case, you could minimize the stray cats by using maybe_redirect_to a < test.txt.)
As a last resort, you can store the command string in a variable, and use eval to parse it. eval basically re-runs the shell parsing process from the beginning, meaning that it'll recognize things like redirects in the string. But eval has a well-deserved reputation as a bug magnet, because it's easy for it to treat parts of the string you thought were just data as command syntax, which can cause some really weird (& dangerous) bugs.
If you must use eval, at least double-quote the variable reference, so it runs through the parsing process just once, rather than sort-of-once-and-a-half as it would unquoted. Here's an example of what I mean:
cmd3="echo '5 * 3 = 15'"
eval "$cmd3"
# prints: 5 * 3 = 15
eval $cmd3
# prints: 5 [list of files in the current directory] 3 = 15
# ...unless there are any files with shell metacharacters in their names, in
# which case something more complicated might happen.
BashFAQ #50 discusses some other possible reasons and solutions. Note that the array approach will not work here, since arrays also get expanded after redirects are parsed.
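For example, an array version of cmd2 runs into the same wall, because the redirect was already parsed (or rather, not parsed) before the array expands:
cmd=(cat test.txt ">" a)
"${cmd[@]}"   # runs cat with the literal arguments test.txt, > and a; no redirection happens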
If you pop an 'eval' in front of $cmd2 it should work as expected:
#!/bin/bash
cmd2="cat test.txt > a"
eval "$cmd2"
If you're not sure about the operation of a script you could always use the debug mode to see if you can determine the error.
bash -x scriptname
This will run the script and display each command after variable expansion. Hopefully this will reveal any issues with syntax.
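For the script above, the trace would look roughly like this for the cmd2 part (exact quoting and error messages may vary):
+ cmd2='cat test.txt > a'
+ echo cat test.txt '>' a
cat test.txt > a
+ cat test.txt '>' a
dog
cat
cat: '>': No such file or directory
cat: a: No such file or directory
The quoted '>' in the trace shows that it is being passed to cat as an ordinary argument rather than being treated as a redirect.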
Hi… Need a little help here…
I tried to emulate the DOS' dir command in Linux using bash script. Basically it's just a wrapped ls command with some parameters plus summary info. Here's the script:
#!/bin/bash
# default to current folder
if [ -z "$1" ]; then var=.;
else var="$1"; fi
# check file existence
if [ -a "$var" ]; then
# list contents with color, folder first
CMD="ls -lgG $var --color --group-directories-first"; $CMD;
# sum all files size
size=$(ls -lgGp "$var" | grep -v / | awk '{ sum += $3 }; END { print sum }')
if [ "$size" == "" ]; then size="0"; fi
# create summary
if [ -d "$var" ]; then
folder=$(find $var/* -maxdepth 0 -type d | wc -l)
file=$(find $var/* -maxdepth 0 -type f | wc -l)
echo "Found: $folder folders "
echo " $file files $size bytes"
fi
# error message
else
echo "dir: Error \"$var\": No such file or directory"
fi
The problem is that when the argument contains an asterisk (*), the ls within the script acts differently compared to the ls command given directly at the prompt. Instead of returning the whole file list, the script only returns the first file. See the video below to see the comparison in action. I don't know why it behaves like that.
Anyone knows how to fix it? Thank you.
Video: problem in action
UPDATE:
The problem has been solved. Thank you all for the answers. Now my script works as expected. See the video here: http://i.giphy.com/3o8dp1YLz4fIyCbOAU.gif
The asterisk * is expanded by the shell when it parses the command line. In other words, your script doesn't get a parameter containing an asterisk; it gets a list of files as arguments. Your script only works with $1, the first argument. It should work with "$@" instead.
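A minimal sketch of that change, keeping only the listing part of your script (the summary logic would need the same treatment):
#!/bin/bash
# hand every argument the shell passes in straight to ls, instead of using only "$1"
if [ $# -eq 0 ]; then set -- .; fi   # default to the current folder
ls -lgG --color --group-directories-first -- "$@"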
This is because when you retrieve $1 you assume the shell does NOT expand *.
In fact, when * (or another glob) matches, it is expanded into one word per matching file, and those words are passed as $1, $2, etc.
You're lucky you simply got the first file. If that file's path contains spaces, the unquoted $var inside your script gets split at the space, and you'll get an error because only the first piece is used.
Seriously, read this and especially this. Really.
And please don't do things like
CMD=whatever you get from user input; $CMD;
You are begging for trouble. Don't execute arbitrary strings from the user.
Both answers above already answered your question, so I'm going to be a bit more verbose.
Your terminal is (probably) running the bash interpreter. This is the program which parses your input lines and does things based on what you type.
When you enter a line, bash goes through the following workflow:
parsing and lexical analysis
expansion
brace expansion
tilde expansion
variable expansion
arithmetic and other substitutions
command substitution
word splitting
filename generation (globbing)
removing quotes
Only after all of the above will bash
execute some external command, like ls or dir.sh, etc.,
or perform some "internal" action for the known keywords and builtins like echo, for, if, etc.
As you can see, the second-to-last step is filename generation (globbing). So, in your case, if test* matches some files, bash expands the wildcard characters (aka does the globbing).
So,
when you enter dir.sh test*,
and test* matches some files,
bash does the expansion first,
and only afterwards executes the command dir.sh with the already expanded filenames,
e.g. the script gets executed (in your case) as: dir.sh test.pas test.swift
BTW, it works exactly the same way for your ls example:
bash expands ls test* to ls test.pas test.swift,
then executes ls with those two arguments,
and ls prints the result for the two arguments it got.
In other words, ls never even sees the test* argument; whenever possible, bash expands the wildcard characters (* and ?).
Now back to your script: add the following line after the shebang:
echo "the $0 got these arguments: $@"
and you will immediately see the real arguments your script got executed with.
Also, in such cases it is good practice to try executing the script in debug mode, e.g.
bash -x dir.sh test*
and you will see exactly what the script does.
You can do the same for your current interpreter: just enter into the terminal
set -x
and try running dir.sh test*, and you will see how bash executes the dir.sh command. (To stop the debug mode, just enter set +x.)
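For example, using the two files mentioned above, you would see something like:
set -x
dir.sh test*
# the trace line printed before the script runs shows the expanded command:
# + dir.sh test.pas test.swift
set +x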
Everybody is giving you valuable advice which you should definitely follow!
But here is the real answer to your question.
To pass unexpanded arguments to any executable you need to single quote them:
./your_script '*'
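For example, with the dir.sh script and the files from the question:
dir.sh 'test*'   # the script receives the single literal argument test* in $1
dir.sh test*     # the shell expands the glob first; the script receives test.pas, test.swift, ...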
The best solution I have is to use the eval command, in this way:
#!/bin/bash
cmd="some command \"with_quetes_and_asterisk_in_it*\""
echo "$cmd"
eval $cmd
The eval command takes its arguments and evaluates them into the command as the shell does.
This solves my problem when I need to call a command with asterisk '*' in it from a script.
I have the following hacking challenge, where we don't know if there is a valid solution.
We have the following server script:
read s # read user input into var s
echo "$s"
# tests if it starts with 'a-f'
echo "$s" > "/home/user/${s}.txt"
We only control the input "$s". Is there a possibility to send OS commands like uname, or do you think "no way"?
I don't see any avenue for executing arbitrary commands. The script quotes $s every time it is referenced, so that limits what you can do.
The only serious attack vector I see is that the echo statement writes to a file name based on $s. Since you control $s, you can cause the script to write to some unexpected locations.
$s could contain a string like bob/important.txt. This script would then overwrite /home/user/bob/important.txt if executed with sufficient permissions. Sorry, Bob!
Or, worse, $s could be bob/../../../etc/passwd. The script would try to write to /home/user/bob/../../../etc/passwd. If the script is running as root... uh oh!
It's important to note that the script can only write to these places if it has the right permissions.
You could embed unusual characters in $s that would cause irregular file names to be created. Careless scripts could be taken advantage of. For example, if $s were foo -rf . bar, then the file /home/user/foo -rf . bar.txt would be created.
If someone ran for file in /home/user/*; do rm $file; done they'd have a surprise on their hands. They would end up running rm /home/user/foo -rf . bar.txt, which is a disaster. If you take out /home/user/foo and bar.txt you're left with rm -rf ., which deletes everything in the current directory. Oops!
(They should have quoted "$file"!)
And there are two other minor things which, while I don't know how to take advantage of them maliciously, do cause the script to behave slightly differently than intended.
read allows backslashes to escape characters like space and newline. You can enter \space to embed spaces and \enter to have read parse multiple lines of input.
echo accepts a couple of flags. If $s is -n or -e then it won't actually echo $s; rather, it will interpret $s as a command-line flag.
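A quick illustration of both quirks, run in an interactive bash:
s='-n'
echo "$s"            # prints nothing: echo treats the value -n as its own option
printf '%s\n' "$s"   # prints -n literally
printf 'foo\\ bar\n' | { read s;    echo "$s"; }   # prints foo bar (the backslash is eaten)
printf 'foo\\ bar\n' | { read -r s; echo "$s"; }   # prints foo\ bar (the backslash is kept)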
Use read -r s, or any \ will be lost/misinterpreted by your command.
read -r s?"Your input: "
if [ -n "${s}" ]
then
# "filter" file name from command
echo "${s##*/}" | sed 's|^ *\([[:alnum:]_]\{1,\}\)[[:blank:]].*|/home/user/\1.txt|' | read Output
(
# put any limitation on user here
ulimit -t 5 1>/dev/null 2>&1
`${read}`
) > ${OutPut}
else
echo "Bad command" > /home/user/Error.txt
fi
Sure:
read s
$s > /home/user/"$s".txt
If I enter uname, it gets executed and Linux ends up in /home/user/uname.txt. But beware: this is a security nightmare. What if someone enters rm -rf $HOME? You'd also have issues with commands containing a slash.
I have two folders full of trainings and corresponding testfiles and I'd like to run the fitting pairs against each other using a shell script.
This is what I have so far:
for x in SpanishLS.train/*.train
do
timbl -f $x -t SpanishLS.test/$x.test
done
This is supposed to take file1(-n).train in one directory, look for file1(-n).test in the other, and run them through a tool called timbl.
What it does instead is look for a file called SpanishLS.train/file1(-n).train.test which of course doesn't exist.
What I tried to do, to no avail, is truncate $x in a way that lets the script find the correct file, but whenever I do this, $x is truncated way too early, resulting in the script not even finding the .train file.
How should I code this?
If I got you right, this will do the job:
for x in SpanishLS.train/*.train
do
y=${x##*/} # strip basepath
y=${y%.*} # strip extension
timbl -f $x -t SpanishLS.test/$y.test
done
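To see what the two expansions produce with one of the question's file names:
x=SpanishLS.train/file1.train
y=${x##*/}   # file1.train  (longest */ prefix removed)
y=${y%.*}    # file1        (shortest .* suffix removed)
echo "$y"    # prints file1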
Use basename:
for x in SpanishLS.train/*.train
do
timbl -f $x -t SpanishLS.test/$(basename "$x" .train).test
done
That removes the directory prefix and the .train suffix from $x, and builds up the name you want.
In bash (and other POSIX-compliant shells), you can do the basename operation with two shell parameter expansions without invoking an external program. (I don't think there's a way to combine the two expansions into one.)
for x in SpanishLS.train/*.train
do
y=${x##*/} # Remove path prefix
timbl -f $x -t SpanishLS.test/${y%.train}.test # Remove .train suffix
done
Beware: bash supports quite a number of (useful) expansions that are not defined by POSIX. For example, ${y//.train/.test} is a bash-only notation (or rather, a notation for bash and compatible shells).
Replace all occurrences of .train in the filename with .test:
timbl -f $x -t $(echo $x | sed 's/\.train/.test/g')
I would like to copy the contents of a variable (here called var) into a file.
The name of the file is stored in another variable destfile.
I'm having problems doing this. Here's what I've tried:
cp $var $destfile
I've also tried the same thing with the dd command... Obviously the shell thought that $var was referring to a directory and so told me that the directory could not be found.
How do I get around this?
Use the echo command:
var="text to append";
destdir=/some/directory/path/filename
if [ -f "$destdir" ]
then
echo "$var" > "$destdir"
fi
The if tests that $destdir represents an existing file.
The > writes the text after truncating the file, overwriting any previous contents. If you only want to append the text in $var to the file's existing contents, then use >> instead:
echo "$var" >> "$destdir"
The cp command is used for copying files (to files), not for writing text to a file.
echo has the problem that if var contains something like -e, it will be interpreted as a flag. Another option is printf, but printf "$var" > "$destdir" will expand any escaped characters in the variable, so if the variable contains backslashes the file contents won't match. However, because printf only interprets backslashes as escapes in the format string, you can use the %s format specifier to store the exact variable contents to the destination file:
printf "%s" "$var" > "$destdir"
None of the answers above work if your variable:
starts with -e
starts with -n
starts with -E
contains a \ followed by an n
should not have an extra newline appended after it
and so they cannot be relied upon for arbitrary string contents.
In bash, you can use "here strings" as:
cat <<< "$var" > "$destdir"
As noted in the comment by Ash below, @Trebawa's answer (formulated in the same room as mine!) using printf is a better approach than cat.
All of the above work, but also have to work around a problem (escapes and special characters) that doesn't need to occur in the first place: Special characters when the variable is expanded by the shell. Just don't do that (variable expansion) in the first place. Use the variable directly, without expansion.
Also, if your variable contains a secret and you want to copy that secret into a file, you might want to avoid expansion on the command line, as tracing/command echo of the shell commands might reveal the secret. This means all answers which use $var on the command line carry a potential security risk by exposing the variable contents to tracing and logging of the shell.
For variables that are already exported, use this:
printenv var >file
That means, in case of the OP question:
printenv var >"$destfile"
Note: variable names are case sensitive.
Warning: It is not a good idea to export a variable just for the sake of printing it with printenv. If you have a non-exported script variable that contains a secret, exporting it will expose it to all future child processes (unless unexported, for example using export -n).
If I understood you right, you want to copy $var into a file (if it's a string).
echo $var > $destdir
When you say "copy the contents of a variable", does that variable contain a file name, or does it contain a name of a file?
I'm assuming by your question that $var contains the contents you want to copy into the file:
$ echo "$var" > "$destdir"
This will echo the value of $var into a file called $destdir. Note the quotes. Very important to have "$var" enclosed in quotes. Also for "$destdir" if there's a space in the name. To append it:
$ echo "$var" >> "$destdir"
You may need to edit a conf file in a build process:
echo "db-url-host=$POSTGRESQL_HOST" >> my-service.conf
You can test this solution by first running export POSTGRESQL_HOST="localhost".