redirect stderr of ls - linux

I'm trying to redirect the ls command's errors, but I found my redirection is wrong. For example, if I run this ls command,
$ ls ;;;
the terminal says:
bash: syntax error near unexpected token `;;'
But my redirected file contains this:
ls: cannot access ;;;: No such file or directory
Why do the redirected file and the terminal show different errors, and how can I capture what the terminal shows?

Put the ;;; in quotes and bash will then always pass that argument through to the ls command. Without quotes, bash tries to parse the ;;; itself, hence the error.
ls ';;;' 2> stderr.txt
< no output >
cat stderr.txt
ls: ;;;: No such file or directory
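Since the syntax error is produced by bash and not by ls, a redirection on the ls command can never capture it; to capture the shell's own parse errors, you have to redirect the stderr of the shell doing the parsing. A minimal sketch:
bash -c 'ls ;;;' 2> stderr.txt    # run the line in a child shell and capture its stderr
cat stderr.txt
bash: -c: line 1: syntax error near unexpected token `;;'
(The exact wording of the message varies between bash versions.)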

Related

Trouble redirecting an error in pipeline using Bash?

ls -lhR /etc/ | egrep *.conf$ >/home/student/total_size.txt 2>/home/student/error.txt
So I used this command to get all .conf files from /etc/. I want the output in total_size.txt and my errors in error.txt. My output looks good, but the errors won't redirect to error.txt; they appear in my terminal:
ls: cannot open directory '/etc/cups/ssl': Permission denied
I don't know what to do; I tried 2>> instead of 2> but it won't work either.
This happens because ls's stderr still points to the terminal: the 2> at the end of the pipeline applies only to the last command, egrep. You need to wrap the pipeline in curly braces and do the redirection outside, e.g.:
{ ls -lhR /etc/ | egrep '*.conf$'; } >/home/student/total_size.txt 2>/home/student/error.txt
(The pattern is quoted so the shell does not try to glob-expand it.)
Try this; it should do the trick:
ls -lhR /etc/ 2>>/home/student/error.txt | egrep '*.conf$' >/home/student/total_size.txt
The errors are generated by ls, not egrep.
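A quick way to see that each redirection binds to a single command in the pipeline, not to the pipeline as a whole; a sketch assuming a directory you cannot read, such as /root when run as a normal user:
ls /root /etc | grep conf              # ls's "Permission denied" still reaches the terminal
ls /root /etc 2>err.txt | grep conf    # now the error is captured; stdout still flows to grep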

Redirecting to /dev/null

Why can't I redirect standard error to /dev/null?
xxx:xxx 84> find / -name trans.log 2> /dev/null
Outputs
find: paths must precede expression: 2
I believe what you are looking for is:
xxx:xxx 84> find / -name trans.log >/dev/null 2>&1
The greater-thans (>) in commands like these redirect the program's output somewhere. Here stdout is redirected into /dev/null, and stderr (2) is redirected to wherever stdout currently points (&1), so both streams are discarded.
If you don't specify a number, the standard output stream is assumed, but you can also redirect errors:
> file redirects stdout to file
1> file redirects stdout to file
2> file redirects stderr to file
&> file redirects stdout and stderr to file
/dev/null is the null device; it takes any input you want and throws it away. It can be used to suppress any output.
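A quick way to see these forms in action is a command that writes to both streams, for example ls given one path that exists and one that does not (the paths here are just an illustration):
ls /etc/hostname /nonexistent > out.txt     # the error still reaches the terminal
ls /etc/hostname /nonexistent 2> err.txt    # the listing still reaches the terminal
ls /etc/hostname /nonexistent &> both.txt   # nothing reaches the terminal (&> is bash-specific)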
It seems your shell doesn't interpret 2> as a redirection operator; instead it passes 2 as an argument to the find command:
find: paths must precede expression: 2
The 2 here is apparently taken by find as part of its expression. I cannot reproduce this with my shell (the 84> prompt suggests csh/tcsh, which has no 2> operator), but try removing the space between 2> and /dev/null, so it is 2>/dev/null.
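For what it's worth, if the shell really is csh/tcsh, a rough equivalent (a sketch, untested on the asker's system) is:
( find / -name trans.log > /dev/tty ) >& /dev/null   # csh idiom: results to the terminal, errors discarded
sh -c 'find / -name trans.log 2> /dev/null'          # or run the command under a POSIX shell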

Redirect argument from a file to a Linux command

I searched the Internet, but maybe I used the wrong keywords; I couldn't find the syntax for my very simple problem below:
How do I redirect the contents of a file as command-line arguments to the Linux command "touch"? I want to create a file with "touch abc.txt", but the filename should come from the file "file.txt", which contains "abc.txt", rather than being typed in manually.
[root@machine ~]# touch < file.txt
touch: missing file operand
Try `touch --help' for more information.
[root@machine ~]# cat file.txt
abc.txt
Try
$ touch $(< file.txt)
to expand the contents of file.txt and pass them as arguments to touch.
Alternatively, if you have multiple filenames stored in a file, you could use xargs, e.g.,
xargs touch <file.txt
(It works for just one filename too, but is more flexible than a simple command substitution.)
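One caveat worth noting: both $(< file.txt) and a bare xargs split on whitespace, so a filename containing spaces would be broken into pieces. GNU xargs can split on newlines instead; a sketch:
xargs -d '\n' touch < file.txt   # GNU xargs only: one filename per line, spaces survive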

"ls" works in cmd line but not in script for directories with space in their names

I am a beginner in bash scripting and I am trying to write a script that stores directory names in variables and then uses those values to run simple commands such as "ls" and "cd". It works perfectly fine when the directory has a "normal" name, for example
testfolder/folder01
But it fails miserably when the directory name contains spaces and parentheses, which happens, for example, when you copy a subdirectory and paste it into the same directory that contains it. The problem can be seen in this script:
[boblacerda@localhost MyScripts]$ cat test.sh
#!/bin/bash
VARDIR="testfolder/folder01"
ls $VARDIR
VARDIR="testfolder/folder01\ \(copy\)"
ls $VARDIR
[boblacerda@localhost MyScripts]$
This is the output of the script in debugging mode:
[boblacerda@localhost MyScripts]$ bash -x test.sh
+ VARDIR=testfolder/folder01
+ ls testfolder/folder01
testefile01 testefile02
+ VARDIR='testfolder/folder01\ \(copy\)'
+ ls 'testfolder/folder01\' '\(copy\)'
ls: cannot access testfolder/folder01\: No such file or directory
ls: cannot access \(copy\): No such file or directory
+ exit
[boblacerda@localhost MyScripts]$
As you can see, the first part, which uses a directory with a "normal" name, works, but the second part, which uses a directory with spaces and parentheses in its name, fails. The problem persists if I quote VARDIR in the ls command, i.e., if I use ls like this:
ls "$VARDIR"
The output in this case is like this:
[boblacerda@localhost MyScripts]$ bash -x test.sh
+ VARDIR=testfolder/folder01
+ ls testfolder/folder01
testefile01 testefile02
+ VARDIR='testfolder/folder01\ \(copy\)'
+ ls 'testfolder/folder01\ \(copy\)'
ls: cannot access testfolder/folder01\ \(copy\): No such file or directory
+ exit
[boblacerda@localhost MyScripts]$
A final remark: the command
ls testfolder/folder01\ \(copy\)
works fine at the command line, as shown below:
[boblacerda@localhost MyScripts]$ ls testfolder/folder01\ \(copy\)
testefile01 testefile02
[boblacerda@localhost MyScripts]$
Thank you all for the attention.
There are two problems with your script. First, you are not setting VARDIR correctly: inside double quotes the backslashes are kept as literal characters, so you end up with too many backslashes in the value. Second, you should put quotes around every use of the variable.
$ cat test.sh
#!/bin/bash
VARDIR="testfolder/folder01"
ls "$VARDIR"
VARDIR="testfolder/folder01 (copy)"
ls "$VARDIR"
When setting VARDIR, you can either use backslashes, or quotes, but not both:
VARDIR="testfolder/folder01 (copy)"
or
VARDIR=testfolder/folder01\ \(copy\)
Try:
ls "$VARDIR"
The double quotes preserve the space; there is no need for backslashes.
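To see both rules together, here is a minimal self-contained version of the script; the mkdir line is only a test fixture so the example runs anywhere:
#!/bin/bash
mkdir -p "testfolder/folder01 (copy)"   # create the test directory (assumed layout)
VARDIR="testfolder/folder01 (copy)"     # quote once at assignment: no backslashes needed
ls "$VARDIR"                            # quote again at every use: no word splitting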

Redirect all output to file in Bash [duplicate]

This question already has answers here:
How to redirect and append both standard output and standard error to a file with Bash
I know that in Linux, to redirect output from the screen to a file, I can use either > or tee. However, I'm not sure why part of the output is still printed to the screen and not written to the file.
Is there a way to redirect all output to file?
That part is written to stderr; use 2> to redirect it. For example:
foo > stdout.txt 2> stderr.txt
or if you want both in the same file:
foo > allout.txt 2>&1
Note: this works in (ba)sh; check your shell for the proper syntax.
All POSIX operating systems have 3 streams: stdin, stdout, and stderr. stdin is the input stream; it can be fed from a file or from another program's output. stdout is the primary output, which is redirected with >, >>, or |. stderr is the error output; it is kept separate so that error messages are not piped into the next command or mixed into a data file they could corrupt. Normally stderr goes to a log of some kind, or straight to the terminal, even when stdout is redirected. To redirect both to the same place, use:
$command &> /some/file
EDIT: thanks to Zack for pointing out that the above solution is not portable; use instead:
$command > file 2>&1
If you want to silence the error, do:
$command 2> /dev/null
To get the output on the console AND in a file, file.txt for example:
make 2>&1 | tee file.txt
Note: & (in 2>&1) specifies that 1 is not a file name but a file descriptor.
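As an aside, in bash 4 and later |& is shorthand for 2>&1 |, so the same thing can be written:
make |& tee file.txt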
Use this - "require command here" > log_file_name 2>&1
Detailed description of the redirection operators in Unix/Linux:
The > operator redirects output, usually to a file, but it can be to a device. You can also use >> to append. The numbered and combined forms (1>, 2>, &>) work as in the list shown earlier on this page.
Credits to osexp2003 and j.a. …
Instead of putting:
&>> your_file.log
behind a line in:
crontab -e
I use:
#!/bin/bash
exec &>> your_file.log
…
at the beginning of a BASH script.
Advantage: You have the log definitions within your script. Good for Git etc.
You can use the exec command to redirect all stdout/stderr output of any commands that follow.
Sample script:
exec 2> your_file2 > your_file1
# your other commands...
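If you also want to switch back later in the script, park the original descriptors on spare numbers first; a sketch (3 and 4 are arbitrary unused descriptors):
#!/bin/bash
exec 3>&1 4>&2                    # save the original stdout and stderr
exec > your_file1 2> your_file2   # everything from here on is redirected
echo "this goes to your_file1"
exec 1>&3 2>&4 3>&- 4>&-          # restore the terminal and close the spares
echo "this is back on the terminal"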
It might be the standard error. You can redirect it:
... > out.txt 2>&1
The command
foo >> output.txt 2>&1
appends to output.txt without replacing its existing content.
Use >> to append:
command >> file
