An error occurred while processing STDERR using the pipe - linux

My working directory is as follows:
MyWorkDirectory
└── test
The file "test" contains a line of information:
1:2:3:4:5
When I use the following command:
cat test foo
the output is as follows:
1:2:3:4:5
cat: foo: No such file or directory
I want to ignore STDOUT and deal only with STDERR, and I want to use the cut command to get the third STDERR field separated by ":". So I tried the following command:
cat test foo 2>&1 1>/dev/null | cut -d ':' -f 3
I think the output here should be as follows:
No such file or directory
However, the output varies from run to run:
Run 1:
1
No such file or directory
Run 2:
No such file or directory
3
Run 3:
2
Run 4:
...
Why are these outputs generated? What commands should I use if I want to achieve my goal?

The issue here is Zsh's "multios" (multiple redirection) feature.
The reason this cannot be reproduced on Ubuntu is that Ubuntu (like many Debian-based Linux distros) uses Dash or Bash by default, while recent macOS versions switched to Zsh.
Zsh on Ubuntu will give a similar result to what you are getting.
Apparently, using both a redirect > and a pipe | causes Zsh to duplicate the stream to both destinations, much like the tee command.
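For illustration, a minimal sketch assuming a Zsh session with the default multios option enabled: redirecting the same descriptor to two targets makes Zsh write to both, which is exactly the tee-like behavior at play here:
echo hello > file1 > file2
cat file1 file2
# prints "hello" twice: the output was duplicated into both files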
This post shows a solution to the problem:
https://unix.stackexchange.com/questions/265061/how-can-i-pipe-only-stderr-in-zsh
Specifically, you need to close stdout before redirecting it to /dev/null to prevent the split:
cat test foo 2>&1 >&- > /dev/null | cut -d ':' -f 3
This works with Zsh on Ubuntu, and should work on macOS.
If it does not, check the linked post for details on the multios option (it can be disabled with unsetopt multios).
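Alternatively, a sketch assuming an interactive Zsh session: disable the multios feature entirely, after which the original redirection order behaves as it does in Bash:
unsetopt multios
cat test foo 2>&1 1>/dev/null | cut -d ':' -f 3
With multios off, 2>&1 duplicates stderr onto the pipe and 1>/dev/null then diverts only stdout, so cut sees nothing but the error line.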

Related

Unix Script Looping cat file while read line "No such file or directory" error

I have a script that reads a parameter file and is going to perform some actions with the values on each line. My input file has spaces as separators.
The weird thing is, it works on an old version of Linux but not on a newer version.
#! /bin/ksh
su root "cat /var/opt/OV/tmp/HPOV_gg.log" | while read Line
do
    echo "${Line}"
done
Error: bash: cat /var/opt/OV/tmp/HPOV_gg.log: No such file or directory
The error obviously has something to do with the new Linux version parsing the cat command in a different way.
How can I fix this? Or can I rewrite my script so it works on this new Linux version?
It's better to use sudo to execute commands as root. No quotes are needed, and sudo access can be controlled in a fine-grained manner via its configuration file.
sudo cat /var/opt/OV/tmp/HPOV_gg.log | while ...
Just so you know, you could fix your su command by writing su root -c "cat file". Commands need to be passed via the -c option; without it, the shell started by su treats the argument as the name of a script file to run, which is why bash complains that "cat /var/opt/OV/tmp/HPOV_gg.log" does not exist. But still, sudo is better.
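A minimal corrected version of the loop, assuming sudo is configured for your user (the log path is the one from the question):
#! /bin/ksh
# Read the log as root; -r stops read from mangling backslashes
sudo cat /var/opt/OV/tmp/HPOV_gg.log | while read -r Line
do
    echo "${Line}"
done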

bash output always in console AND file

On Linux, is there a way to both get the output of all commands in a bash session and simultaneously store it to a file WITHOUT having to pipe anything? I know I could do something like
ls -al | tee output.log
but I just want all output always to be stored in a log so I can look into it even after a few days. I don't want to have to add the pipe with each command.
You might want the script command. When you run it, a new shell session is started and both input and output are recorded to the file you specify.
Example:
script my_log.txt
# run your commands
exit
A record of your session is stored in my_log.txt.
Alternatively, redirect per command, appending both stdout and stderr to a log file:
ls -al >> your_file.log 2>&1
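If you want everything in the current session logged without starting a new one, here is a sketch assuming Bash with process substitution (session.log is a hypothetical filename): redirect the shell's own stdout and stderr once, and every later command is captured automatically:
exec > >(tee -a session.log) 2>&1
# from here on, all output goes to both the terminal and session.log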

Redirecting standard error to file and leaving standard output to screen when launching makefile

I am aware that for redirecting standard error and output to a file I have to do:
make >&! output.txt
Note I use ! to overwrite the file. But how can I redirect standard error to a file and leave standard output on the screen? Or, even better, have both error and output in a file but also output on the screen, so I can see how my compilation is progressing?
I tried:
make 2>! output.txt
but it gives me an error.
Note that > is enough to overwrite the file. You can use the tail -f command to see the output on screen while it is redirected to a file:
$(make 1>output.txt 2>error.txt &) && tail -f output.txt error.txt
You can do this simply by piping into the tee command. The following will put both stdout and stderr into a file and also send them to the terminal:
make |& tee output.txt
Edit
Explanation from GNU Bash manual, section 3.2.2 Pipelines:
If ‘|&’ is used, command1’s standard error, in addition to its standard output, is connected to command2’s standard input through the pipe; it is shorthand for 2>&1 |. This implicit redirection of the standard error to the standard output is performed after any redirections specified by the command.
You are reading bash/sh documentation and using tcsh. tcsh doesn't have any way to redirect just stderr. You might want to switch to one of the non-csh shells.
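That said, a commonly cited tcsh idiom (a sketch, not a documented feature) is to run make in a subshell, send the subshell's stdout back to the terminal, and let >& catch what remains, which is only stderr:
(make > /dev/tty) >& errors.txt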

Filter All Output in a Bash Session

Just got the filters package and am loving the fact that I can run, for example
ls | pirates
and have a funny (if impractical) output.
For International Talk Like A Pirate Day I would like all my output to automatically be filtered by pirate! If I run exec bash -i | pirate, this works, but some strange things occur: not every line ends with a newline, and the fresh bash prompt character $ does not always appear.
I've tried playing with redirecting stdout, and scripts but all I have tried seems to fail. Ideas?
try exec'ing a script that pipes the shell's output to pirates:
# create shell.sh containing the single line:
bash | pirates
# make it executable, then replace the current shell with it:
chmod +x shell.sh
exec ./shell.sh

How to pipe the output of a command to file on Linux

I am running a task on the CLI, which prompts me for a yes/no input.
After selecting a choice, a large amount of info scrolls by on the screen - including several errors. I want to pipe this output to a file so I can see the errors. A simple '>' is not working since the command expects keyboard input.
I am running on Ubuntu 9.1.
command &> output.txt
You can use &> to redirect both stdout and stderr to a file. This is shorthand for command > output.txt 2>&1 where the 2>&1 means "send stderr to the same place as stdout" (stdout is file descriptor 1, stderr is 2).
For interactive commands I usually don't bother saving to a file if I can use less and read the results right away:
command 2>&1 | less
echo yes | command > output.txt
Depending on how the command reads its input (some programs discard whatever was on stdin before displaying their prompt, but most don't), this should work in any sane CLI environment.
Use 2> rather than just >.
If the program was written by a sane person, what you probably want is stderr, not stdout. You would achieve this by using something like
foo 2> errors.txt
You can use 2> to send errors to a file.
Example:
command 2> error.txt
If any errors occur while the command executes, they are sent to the file error.txt.
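Combining the two ideas above, a sketch where output.txt and errors.txt are hypothetical filenames: feed the prompt its answer non-interactively and split stdout and stderr into separate files so the errors are easy to review afterwards:
echo yes | command > output.txt 2> errors.txt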
