How to make a variable perform a redirection rather than act as a command parameter in bash - linux

I want to perform an output redirection in bash by passing the redirection in a shell variable.
For example:
OPTION=">> /tmp/log1 2>&1"
ADD_OPTION=($OPTION)
ls ${ADD_OPTION[@]}
I want to record the output of the ls command in /tmp/log1; that is, I want ls >> /tmp/log1 2>&1 to be executed.
But unfortunately, ${ADD_OPTION[@]} was treated as parameters of ls.
The actual result is:
ls: >>: No such file or directory
ls: /tmp/log1: No such file or directory
ls: 2>&1: No such file or directory
What should I do?

You can use eval:
eval ls ${OPTION}
As an obvious note, be careful with eval.
Example:
$ OPTION=">> /tmp/log1 2>&1"
$ eval ls ${OPTION}
$ cat /tmp/log1
1
2
3
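If all you need is a configurable log destination, an eval-free alternative is to keep only the file name in a variable and write the redirection itself literally (LOG is a hypothetical variable name):
LOG=/tmp/log1
ls >> "$LOG" 2>&1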

eval $(echo "ls ${ADD_OPTION[@]}")
is probably what you want to do
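(The echo appears redundant here: eval concatenates its arguments before evaluating them, so eval ls ${ADD_OPTION[@]} should behave the same way.)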

Related

What kind of command is "sudo", "su", or "torify"

I know what they do. I was just wondering what kind of commands they are, and how you can make one using shell scripting.
For example, a command like:
ignoreError ls /Home/
ignoreError mkdir /Home/
ignoreError cat
ignoreError randomcommand
Hope you get the idea.
The way to do it in a shell script is with the "$@" construct.
"$@" expands to a quoted list of all of the arguments you passed to your shell script. $1 would be the command you want your shell script to run, and $2, $3, etc. are the arguments to that command.
The only example I have is from Cygwin. Cygwin does not have sudo, but I have this script that emulates it:
#!/usr/bin/bash
cygstart --action=runas "$@"
So when I run a command like
$ sudo ls -l
my sudo script does whatever it needs to do (cygstart --action=runas) and calls the ls command with the -l argument.
Try this script:
#!/bin/sh
"$#"
Call it, for example, run, make it executable with chmod u+x run, and try it:
$ run ls -l #or ./run ls -l
...
output of ls
...
The idea is that the script takes the parameters specified on the command line and uses them as a (sub)command. Modify the script this way:
#!/bin/sh
echo "Trying to run $*"
"$#"
and you will see each command echoed before it runs.
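Applied to the ignoreError example from the question, a minimal sketch might look like this (treating "ignore errors" as discarding the exit status; whether stderr should be hidden too is an assumption):
#!/bin/sh
# ignoreError: run the given command; always exit with status 0.
# Add 2>/dev/null after "$@" if error messages should be hidden too.
"$@" || true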

Bash: Running one command after another using string variable

I understand that running one command after another in bash is done using one of the following forms:
command1 && command2
or
command1; command2
or even
command1 & command2
I also understand that a command stored in a bash variable can be run by simply expanding the variable:
TestCommand="ls"
$TestCommand
Doing the above lists all the files in the directory; I have tested that it does.
But doing the same with multiple commands generates an error. Sample below:
TestCommand="ls && ls -l"
$TestCommand
ls: cannot access &&: No such file or directory
ls: cannot access ls: No such file or directory
My question is why is this happening and is there any workaround?
And before you bash me for doing something so stupid: the preceding is just to present the problem. I have a list of files in my directory, and I am using sed to convert the list into a single executable string. Storing that string in a bash variable, I am trying to run it but failing.
When you put two commands in a single string variable, the string is executed as a single command. So when you use $TestCommand to execute two ls commands, only the first ls runs; && and the second ls are treated as arguments of the first ls.
Since your current working directory has no files named && or ls, it returns the errors:
ls: cannot access &&: No such file or directory
ls: cannot access ls: No such file or directory
So basically, your command behaves like this:
ls file1 file2 -l
and it will give you output like this if file1 and file2 exist:
HuntM#~/scripts$ ls file1 file2 -l
-rw-r--r-- 1 girishp staff 0 Dec 8 12:44 file1
-rw-r--r-- 1 girishp staff 0 Dec 8 12:44 file2
Now, your solution: you can create a function or another script to execute the two commands, as below:
caller.sh
#!/bin/bash
myLs=`./myls.sh`
echo "$myLs"
myls.sh
#!/bin/bash
ls && ls -l
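Alternatively, since the question already has the whole command list in a string, eval (with the usual caution about untrusted input) or a child shell via bash -c will run the string as shell code rather than as arguments:
TestCommand="ls && ls -l"
eval "$TestCommand"      # the current shell parses && and runs both commands
bash -c "$TestCommand"   # a child shell does the same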

Redirecting stdout in shell script

Either it is late in the day for me or I am missing something naive here.
Here is a contrived example:
#!/bin/bash
command="ls -al > check.txt"
$command
When I run this script in a shell, it gives me an error, I guess due to the ">" operator. Is there any way I can redirect the output from inside a shell script? I thought this was very straightforward:
ls -la > temp.txt
ls: cannot access >: No such file or directory
ls: cannot access temp.txt: No such file or directory
#!/bin/bash
command="ls -al"
$command > check.txt
> is a special character in Bash (and most shells); the shell processes it while parsing the command line, before variables are expanded. A > that only appears after expansion is therefore passed to the command as a literal argument instead of being treated as a redirection.
Here is another way to do it using eval,
#!/bin/bash
command="ls -al > check.txt"
eval $command
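Another eval-free variant is to keep the command in a bash array and write the redirection literally, outside the variable; a minimal sketch:
#!/bin/bash
cmd=(ls -al)              # command and its arguments, one element each
"${cmd[@]}" > check.txt   # the shell sees > at parse time, as it must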

Redirecting the output of program which itself is an argument

Let me first present the scenario, with the command that is not working in a Linux bash environment.
$ timed-run prog1 1>/dev/null 2>out.tmp
In the above case I want to redirect the output of the program prog1 to /dev/null and out.tmp. But this command is redirecting the output (if any) of timed-run itself to out.tmp.
Any help will be appreciated.
From a simple example, I experience exactly the opposite.
$ time ls 1> foo 2> bar
real 0m0.002s
user 0m0.004s
sys 0m0.000s
$ more foo
<show files>
$ more bar
<empty>
$
The output of ls is redirected, and the output of time is not!
The problem here is in timed-run, not in bash. If you run the same command replacing timed-run with the standard time command, it works as you expect. Essentially, timed-run needs to push its arguments (the prog1 command line, including the redirections) through the shell again. If timed-run is a shell script, you can do this with the eval command. For example:
#!/bin/sh
echo here is some output
echo $*
eval $*
now run (the quotes keep the calling shell from performing the redirections itself):
timed-run prog1 '1>/dev/null' '2>output.tmp'
How about using sh -c 'cmd' like so:
time -p sh -c 'ls -l xcvb 1>/dev/null 2>out.tmp'
time -p sh -c 'exec 0</dev/null 1>/dev/null 2>out.tmp; ls -l xcvb'
# in out.tmp:
# ls: xcvb: No such file or directory
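Assuming timed-run, like time, simply executes the rest of its command line, the same wrapper should solve the original problem:
timed-run sh -c 'prog1 1>/dev/null 2>out.tmp'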

How to process file names with variables from a list in a file in Bash

I have a file "FileList.txt" with this text:
/home/myusername/file1.txt
~/file2.txt
${HOME}/file3.txt
All 3 files exist in my home directory. I want to process each file in the list from a bash script. Here is a simplified example:
LIST=`cat FileList.txt`
for file in $LIST
do
echo $file
ls $file
done
When I run the script, I get this output:
/home/myusername/file1.txt
/home/myusername/file1.txt
~/file2.txt
ls: ~/file2.txt: No such file or directory
${HOME}/file3.txt
ls: ${HOME}/file3.txt: No such file or directory
As you can see, file1.txt works fine, but the other 2 files do not. I think it is because the "${HOME}" variable does not get resolved to "/home/myusername/". I have tried lots of things with no success; does anyone know how to fix this?
Thanks,
-Ben
Use eval:
while read file ; do
eval echo $file
eval ls $file
done < FileList.txt
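With the FileList.txt shown in the question, each path should now print twice (once from echo, once from ls), fully resolved, something like:
/home/myusername/file1.txt
/home/myusername/file1.txt
/home/myusername/file2.txt
/home/myusername/file2.txt
/home/myusername/file3.txt
/home/myusername/file3.txt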
From the bash manpage regarding the eval command:
The args are read and concatenated together into a single command. This command is then read and executed by the shell, and its exit status is returned as the value of eval. If there are no args, or only null arguments, eval returns 0.
You will hit the "spaces problem" using the for loop with cat: file names containing spaces get split into multiple words. Manipulate IFS, or use a while read loop instead:
while read -r line; do eval ls "$line"; done < file
Change "ls $file" to "eval ls $file" to get the shell to do its expansion.
