How should I redirect stderr on commands that take arguments? - linux

I have a shell script and I want to run:
./xxx.sh -a 1 -b 2 -c 3 2>/dev/null
However, the 2 is treated as an argument.
Similarly, with the following command
echo test 2>aaa.txt
my intention is to redirect stderr to aaa.txt; however, I got:
cat aaa.txt
test 2
As you can see, the 2 is again treated as an argument. How should I redirect stderr on commands that take arguments?

I think you are using csh or tcsh,
because the 2> redirect does not work in csh or tcsh.
Use the chsh command to change your shell to /bin/sh or /usr/local/bin/bash in order to use the 2> style of redirection. Note: do not change root's shell to /usr/local/bin/bash.
csh and tcsh cannot redirect standard output and standard error separately, but >& will redirect the combined output to a file.
Once you are in a Bourne-style shell you can use the same command, optionally grouped with "()",
like ( your command ) 2> /dev/null
or you can also do it with two shells.
Example: csh -c 'SOME_COMMAND >/dev/null' |& tee file.txt
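For completeness, if you have to stay in csh or tcsh, the usual workaround (a sketch, assuming you are at a terminal so /dev/tty exists) is to redirect stdout inside a subshell and let >& catch what is left, which is only stderr:

# keep stdout on the terminal, send stderr to a file (csh/tcsh)
( ./xxx.sh -a 1 -b 2 -c 3 > /dev/tty ) >& err.txt
# or discard stderr entirely
( ./xxx.sh -a 1 -b 2 -c 3 > /dev/tty ) >& /dev/null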

Related

Command to redirect output to console and to a file at the same time works fine in bash. How do I make it work in Korn shell (ksh)?

This command to redirect output to the console and to a file at the same time works fine in bash, but how do I make it work in Korn shell (ksh)?
All my scripts run under ksh, so I can't switch them to bash just to make this one command work. The command is:
exec > >(tee -a $LOGFILE) 2>&1
In the code below I use the variable logfile; lowercase names are preferable for your own shell variables.
You can try something like
touch "${logfile}"
tail -f "${logfile}" &                  # echo the log file to the console in the background
tailpid=$!
trap 'kill -9 ${tailpid}' EXIT INT TERM # stop the background tail when the script ends
exec 1>"${logfile}" 2>&1                # from here on, stdout and stderr go to the log file
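Put together, a minimal ksh sketch of the idea (the log file name is just illustrative) might look like:

#!/bin/ksh
logfile=./script.log

touch "${logfile}"
tail -f "${logfile}" &
tailpid=$!
trap 'kill ${tailpid}' EXIT INT TERM

exec 1>"${logfile}" 2>&1

echo "this line ends up in the log file and on the console"
ls /nonexistent   # stderr is captured as well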
A not-too-unreasonable technique is to re-exec the shell with its output piped to tee. That is, at the top of the script, do something like:
#!/bin/sh
test -z "$REXEC" && { REXEC=1 exec "$0" "$@" | tee -a "$LOGFILE"; exit; }
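As a rough sketch of how that looks in context (REXEC and LOGFILE are illustrative names; the 2>&1 is an addition so that stderr is logged too):

#!/bin/sh
LOGFILE=./run.log

# On the first invocation REXEC is unset, so the script re-runs itself with
# its output piped through tee, then exits the original instance.
test -z "$REXEC" && { REXEC=1 exec "$0" "$@" 2>&1 | tee -a "$LOGFILE"; exit; }

echo "this goes to the console and to $LOGFILE"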

How do I redirect the output from a bash script to stderr

I've written a bash script. It has a few echo statements. It invokes a few 3rd party tools. I would like to redirect everything that goes to stdout (from my echo statements and all the tools output) to stderr. How do I do that?
You need to redirect the stdout of the command to stderr.
your_command.sh 1>&2
If you want to do this from within the script, you can wrap your whole script into one function and redirect its output to stderr:
main() {
    echo hello
    echo world
    some_script.sh
}
main 1>&2
exec >&2
Put that at the top of your script to redirect all future output to stderr.
$ help exec
exec: exec [-cl] [-a name] [command [arguments ...]] [redirection ...]
Replace the shell with the given command.
Execute COMMAND, replacing this shell with the specified program.
ARGUMENTS become the arguments to COMMAND. If COMMAND is not specified,
any redirections take effect in the current shell.
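For instance, a minimal sketch of that approach (the tool name below is just a placeholder):

#!/bin/bash
exec >&2                 # from here on, everything written to stdout goes to stderr

echo "this now appears on stderr"
some_third_party_tool    # child processes inherit the redirection, so their stdout goes to stderr too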
The solution that worked for me was to enclose the text of the script within ()'s and redirect stdout to stderr like so:
(
    echo 1
    echo 2
    tool1
) 1>&2
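A closely related variant (not from the answer above) is a brace group, which runs in the current shell instead of a subshell, so any variable assignments made inside it survive:

{
    echo 1
    echo 2
    tool1
} 1>&2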

Shell scripting: shell inside shell

I would like to switch to different shells (csh, ksh, etc.) and execute commands inside each switched shell.
The following sample program reflects my intention:
#!/bin/bash
echo $SHELL
csh
echo $SHELL
exit
ksh
echo $SHELL
exit
Since I am not well versed in shell scripting, I need a pointer on how to achieve this. Any help would be much appreciated.
If you want to execute only a single command, you can use the -c option:
csh -c 'echo $SHELL'
ksh -c 'echo $SHELL'
If you want to execute several commands, or even a whole script, in a child shell, you can use the here-document feature of bash together with the -s option (read commands from stdin) of the child shells:
#!/bin/bash
echo "this is bash"
csh -s <<- EOF
echo "here go the commands for csh"
echo "and another one..."
EOF
echo "this is bash again"
ksh -s <<- EOF
echo "and now, we're in ksh"
EOF
Note that you can't easily check which shell you are in with echo $SHELL, because the parent bash expands this variable before the child shell ever sees it, so you only get the parent's value (e.g. /bin/bash). If you want to be sure that the child shell works, you should check whether some shell-specific syntax works or not.
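One way around that expansion problem (a sketch, not part of the original answer) is to quote the here-document delimiter, so the parent shell leaves $-expressions alone and the child shell expands them itself:

#!/bin/bash
# quoting 'EOF' stops the parent bash from expanding $greeting
csh -s << 'EOF'
set greeting = "hello from csh"
echo "$greeting"
EOF
ksh -s << 'EOF'
greeting="hello from ksh"
echo "$greeting"
EOF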
It is possible to use the command line options provided by each shell to run a snippet of code.
For example, for bash use the -c option:
bash -c "$code"
bash -c 'echo hello'
zsh and fish also use the -c option.
Other shells will state the options they use in their man pages.
You need to use the -c command line option if you want to pass commands to a shell at startup:
#!/bin/bash
# We are in bash already ...
echo $SHELL
csh -c 'echo $SHELL'
ksh -c 'echo $SHELL'
You can pass arbitrarily complex scripts to a shell using the -c option, as in
sh -c 'echo This is the Bourne shell.'
You will save yourself a lot of headaches related to quoting and variable expansion if you wrap the call in a function that reads the script on stdin, as in:
execute_with_ksh()
{
    local script
    script=$(cat)
    ksh -c "${script}"
}

prepare_complicated_script()
{
    # Write shell script on stdout,
    # for instance by cat-ting a here-document.
    cat <<'EOF'
echo ${SHELL}
EOF
}

prepare_complicated_script | execute_with_ksh
The advantage of this method is that it is easy to insert a tee in the pipe, or to break the pipe, to check the script being passed to the shell.
If you want to execute the script on a remote host through ssh, you should consider encoding your script in base64 to transmit it safely to the remote shell.
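A rough sketch of that last idea (user@host is a placeholder, and the -d decode flag assumes GNU coreutils base64 on the remote side):

# encode locally, decode and run remotely; avoids quoting problems over ssh
prepare_complicated_script | base64 | ssh user@host 'base64 -d | ksh -s'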

How to turn off echo while executing a shell script Linux [duplicate]

This question already has answers here:
How to silence output in a Bash script?
(9 answers)
Closed 6 years ago.
Here is a simple thing I was working on:
echo "please enter a command"
read x
$x
checkexitstatus()
{...}
checkexitstatus is an if block defined somewhere else, just to check the exit status.
What I want to know is:
Is there any way that, when I run $x, its output won't be displayed on the screen?
I want to know if it is possible without redirecting the output to a file.
No, it isn't possible without redirecting the output somewhere; but you can simply throw it away by redirecting to /dev/null:
$x &> /dev/null
You could use Bash redirection:
command 1> /.../path_to_file => redirects stdout into path_to_file.
command > /.../path_to_file is a shortcut for the previous command.
command 2> /.../path_to_file => redirects stderr into path_to_file.
To do both at the same time into the same file: command >/.../path_to_file 2>&1.
2>&1 means redirect 2 (stderr) to 1 (stdout, which now points to path_to_file).
You can replace path_to_file with /dev/null if you don't want to keep the output of your command.
Otherwise, you can also store the output of a command:
$ var=$(command) # POSIX command substitution, preferred in bash or ksh
$ var=`command`  # legacy backquote syntax, works in old Bourne shells
In this example, the output of command will be stored in $var.
If you want to turn off only the echo command, and not other commands that send their output to stdout, as the title suggests, you can (at the risk of breaking other code) create an alias for echo:
alias echo=':'
Now echo is an alias for the no-op command. You can undo it with unalias echo.
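One caveat: bash does not expand aliases in non-interactive scripts unless you turn that on, so inside a script the trick would look roughly like this sketch:

#!/bin/bash
shopt -s expand_aliases   # alias expansion is off by default in non-interactive bash
alias echo=':'
echo "this produces no output"
unalias echo
echo "output is back"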

Redirecting the output of a program which is itself an argument

Let me first present the scenario, with a command which is not working in a Linux bash environment.
$ timed-run prog1 1>/dev/null 2>out.tmp
In the above case I want to redirect the stdout of the program 'prog1' to /dev/null and its stderr to out.tmp. But this command redirects the output (if any) of timed-run itself to out.tmp.
Any help will be appreciated.
With a simple example, I experience exactly the opposite.
$ time ls 1> foo 2> bar
real 0m0.002s
user 0m0.004s
sys 0m0.000s
$ more foo
<show files>
$ more bar
<empty>
$
The output of ls is redirected, and the output of time is not!
The problem here is in timed-run, not in bash. If you run the same command replacing timed-run with the standard time command, it works as you expect. Essentially, timed-run needs to run its arguments through the shell again. If it is a shell script, you can do this with the eval command. For example:
#!/bin/sh
# a mock timed-run: print some output of its own, then run its arguments
echo here is some output
echo "$@"
eval "$@"
now run
timed-run prog1 '1>/dev/null' '2>output.tmp'
How about using sh -c 'cmd' like so:
time -p sh -c 'ls -l xcvb 1>/dev/null 2>out.tmp'
time -p sh -c 'exec 0</dev/null 1>/dev/null 2>out.tmp; ls -l xcvb'
# in out.tmp:
# ls: xcvb: No such file or directory
