Trouble redirecting an error in a pipeline using Bash? - linux

ls -lhR /etc/ | egrep '\.conf$' >/home/student/total_size.txt 2>/home/student/error.txt
So I used this command to get all the .conf files from /etc/. I want the output in total_size.txt and my errors in error.txt. The output looks good, but the errors won't redirect to error.txt; they still appear in my terminal:
ls: cannot open directory '/etc/cups/ssl': Permission denied
I don't know what to do; I tried 2>> instead of 2>, but that doesn't work either.

This happens because the redirections apply only to the last command in the pipeline (egrep), so ls's stderr still points to the terminal. You need to wrap the pipeline in curly braces and do the redirection outside, e.g.:
{ ls -lhR /etc/ | egrep '\.conf$'; } >/home/student/total_size.txt 2>/home/student/error.txt

Try this; it should do the trick:
ls -lhR /etc/ 2>>/home/student/error.txt | egrep '\.conf$' >/home/student/total_size.txt
The errors are generated by ls, not egrep.
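A minimal sketch to see which command's stderr a redirection captures (/nonexistent is an illustrative path assumed not to exist):
# Redirection after the pipe attaches to sort, so the ls error
# still hits the terminal and err.txt stays empty:
ls /nonexistent | sort 2>err.txt
# Redirecting ls's own stderr before the pipe captures the error:
ls /nonexistent 2>err.txt | sort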

Related

How to show the full path of files in ls-piped-into-grep output?

Does anyone know how to show the containing directory of a file found in ls output?
For example, in / I issue ls -R | grep something; the output finds the file that matches something without telling me where the file is located, or its full path.
ls -R prints bare filenames under each directory heading, so the grepped lines carry no path information. To search by name and get full paths, use the find command instead.
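For example, something like this find invocation (the name pattern is illustrative):
find / -name '*something*' 2>/dev/null
find prints each match with its full path; the 2>/dev/null part discards "Permission denied" noise from unreadable directories.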

redirect stderr of ls

I'm trying to redirect the ls command's errors, but I found that my redirected file doesn't match the terminal. For example, if I run this ls command,
$ ls ;;;
the terminal says:
bash: syntax error near unexpected token `;;'
But my redirected file contains this:
ls: cannot access ;;;: No such file or directory
Why is there a difference between the redirected file and the terminal?
Put the ;;; in quotes; bash will then pass that argument through to the ls command. Without quotes, bash itself tries to parse the ;;; as control operators, hence the syntax error.
ls ';;;' 2> stderr.txt
< no output >
cat stderr.txt
ls: ;;;: No such file or directory
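Note that the bash: syntax error message comes from the shell itself while parsing the command line, before ls ever runs, so a 2> on that command line can never capture it. If you need to capture the shell's own error too, one option is to run the command in a child shell and redirect that:
bash -c 'ls ;;;' 2> stderr.txt
Here the inner shell's syntax error lands in stderr.txt.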

GNU grep on FreeBSD not working properly

I have a weird problem on FreeBSD 8.4-STABLE with grep (GNU grep) 2.5.1-FreeBSD.
If I run grep -Hnr searchstring I don't get any output, but according to ps aux grep is running, and it keeps running until I kill the process.
If I copy a test file into an empty directory and do
cat testfile | grep searchstring
it works.
But if I try
grep -Hnr searchstring
in that directory, I also get no output; grep keeps running and running but never produces any matches.
Does anybody know how to solve this?
Even though you gave -r, you still have to give grep a file argument. Otherwise, as you've discovered, it just sits there waiting for input on stdin.
You want
grep -Hnr searchstring .
# ....................^^
That will recursively search the files under the current directory.
Though it doesn't seem to be documented, if you invoke grep with the -r option and no file or directory name arguments, it defaults to the current directory, almost as if you had typed grep -r pattern ., except that ./ does not appear in the output.
Apparently this is a fairly new feature.
If you do a recursive grep in a directory with a lot of contents, it could simply take a long time, perhaps forever if there are device files such as /dev/zero that can produce infinite output.
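If device files are the problem, newer GNU grep can be told to skip devices, FIFOs and sockets during the search; a sketch, assuming your grep supports the GNU -D option (the grep 2.5.1 mentioned above may not):
grep -rHn -D skip searchstring .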

Redirect argument from a file to a Linux command

I searched the Internet, but maybe I used the wrong keywords; I couldn't find the syntax for my very simple problem below:
How do I redirect the contents of a file as command-line arguments to the Linux command "touch"? I want to create a file as with "touch abc.txt", but the filename should come from the file "file.txt", which contains "abc.txt", rather than being typed in manually.
[root@machine ~]# touch < file.txt
touch: missing file operand
Try `touch --help' for more information.
[root@machine ~]# cat file.txt
abc.txt
Try
$ touch $(< file.txt)
to expand the contents of file.txt and pass them as arguments to touch.
Alternatively, if you have multiple filenames stored in a file, you could use xargs, e.g.,
xargs touch <file.txt
(It would work for just one filename, but it is more flexible than the command substitution above.)
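One caveat: by default xargs splits its input on any whitespace and interprets quotes, so filenames containing spaces will break. A sketch, assuming GNU xargs (-d is a GNU extension), that treats each line as one argument:
xargs -d '\n' touch < file.txt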

Is it possible to redirect the output of a command to one file, but still keep the output echoing on the terminal?

I want the output to go to a file, but I still want to see it on the terminal (just so I know something is going on).
Yes, tee does what you want:
command | tee output.log
You can use "tee" i.e.
ls -ltr | tee -a mylog.log
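If you also want the command's errors in the file and on the terminal, merge stderr into stdout before the pipe:
command 2>&1 | tee output.log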
