Unable to redirect error to /dev/null in GNU make - Linux

In a GNU make makefile, I want to check whether an Intel C compiler is in the path.
To do this, I run the following command:
COMPILER_IN_PATH := $(shell icc -dumpversion | grep 2021.3.0)
Later I test whether COMPILER_IN_PATH is set.
If an Intel compiler is in the path, the above command works fine.
However, if an Intel compiler is not in the path, the line still behaves correctly (COMPILER_IN_PATH is simply left unset), but I see the following error message in the output when it runs:
/bin/sh: icc: command not found
I would like to get rid of that error message. How do I redirect stderr somewhere (to /dev/null, I suppose) while reading the stdout into the COMPILER_IN_PATH variable?
Here are some of my failed attempts:
icc -dumpversion | grep 2021 > /dev/null 2>&1
Ambiguous output redirect.
icc -dumpversion | grep 2021 2> /dev/null
icc: Command not found.
grep: 2: No such file or directory

You are redirecting the output of the grep command. You want to redirect the error output of the icc command, which is where the "command not found" message is printed.
COMPILER_IN_PATH := $(shell icc -dumpversion 2>/dev/null | grep 2021.3.0)
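For completeness, here is a minimal sketch of how the variable could then be used in the makefile. The fallback to gcc and the exact layout are illustrative assumptions, not something stated in the question:

COMPILER_IN_PATH := $(shell icc -dumpversion 2>/dev/null | grep 2021.3.0)

# If the grep found nothing (icc missing or a different version), the variable is empty
ifeq ($(COMPILER_IN_PATH),)
  # illustrative fallback; pick whatever suits your build
  CC := gcc
else
  CC := icc
endif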

Related

How can I use WGET to get only status info and save it somewhere?

Can I use wget to get, let's say, a status of 200 OK and save that status somewhere? If not, how can I do that using Ubuntu Linux?
Thanks!
With curl you can
curl -L -o /dev/null -s -w "%{http_code}\n" http://google.com >> status.txt
You use --save-headers to add the HTTP headers to the output, send the output to the console with -O -, discard the error stream with 2>/dev/null, and keep only the status line with grep HTTP/.
You can then output that into a file using >status_file
$ wget --save-headers -O - http://google.com/ 2>/dev/null | grep HTTP/ > status_file
The question asks for the status of the wget command to be stored somewhere. As another alternative, the following example shows how to store that status in a shell variable (wget_status): after wget runs, the HTTP status code ends up in wget_status, and echo displays it in the console.
$ wget_status=$(wget --server-response ${URL} 2>&1 | awk '/^ HTTP/{print $2}')
$ echo $wget_status
200
After the wget command has run, the status can then be acted on through the value of the wget_status variable.
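For instance, a hedged follow-up sketch (wget_status comes from the command above; the messages are placeholders):

if [ "${wget_status}" = "200" ]; then
    echo "OK"
else
    # any non-200 code, or an empty string if wget failed outright, lands here
    echo "request failed, HTTP status: ${wget_status}" >&2
fi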
For more information consult the following link as a reference:
https://www.unix.com/shell-programming-and-scripting/148595-capture-http-response-code-wget.html
The tests were executed using Cloud Shell on a Linux system.
Linux cs-335831867014-default 5.10.90+ #1 SMP Wed Mar 23 09:10:07 UTC 2022 x86_64 GNU/Linux

Where does g++ look for libraries to link?

So I'm currently using Lubuntu 18.4 32-bit and was trying to get the GLFW library up and running. I noticed that when you compile a program using GLFW you need to link many libraries, and was wondering: where exactly does g++ look in the filesystem when you type g++ main.cpp -lglfw?
For compiler:
echo | g++ -x c++ -E -Wp,-v - >/dev/null
echo | feeds an empty "source file" to stdin and then closes it
-x c++ specifies the language explicitly (so the C++ include directories are listed)
-E tells g++ to stop after the preprocessing stage
- at the end means "read the code from stdin"
-Wp,-v passes -v straight to the preprocessor, which makes it print its include search directories
>/dev/null discards the preprocessed output itself, so only the diagnostics on stderr remain visible
Example output:
ignoring nonexistent directory "/usr/lib/gcc/x86_64-pc-linux-gnu/10.2.0/../../../../x86_64-pc-linux-gnu/include"
#include "..." search starts here:
#include <...> search starts here:
/usr/lib/gcc/x86_64-pc-linux-gnu/10.2.0/../../../../include/c++/10.2.0
/usr/lib/gcc/x86_64-pc-linux-gnu/10.2.0/../../../../include/c++/10.2.0/x86_64-pc-linux-gnu
/usr/lib/gcc/x86_64-pc-linux-gnu/10.2.0/../../../../include/c++/10.2.0/backward
/usr/lib/gcc/x86_64-pc-linux-gnu/10.2.0/include
/usr/local/include
/usr/lib/gcc/x86_64-pc-linux-gnu/10.2.0/include-fixed
/usr/include
End of search list.
For linker:
ld --verbose | grep SEARCH_DIR | tr -s ' ;' \\012
--verbose tells ld to print its internal settings, including the default search directories
| grep SEARCH_DIR selects the library-directory lines
| tr -s ' ;' \\012 makes the output readable (replaces the spaces and semicolons with newlines)
Example output:
SEARCH_DIR("/usr/x86_64-pc-linux-gnu/lib64")
SEARCH_DIR("/usr/lib")
SEARCH_DIR("/usr/local/lib")
SEARCH_DIR("/usr/x86_64-pc-linux-gnu/lib")
So the GLFW library should be in one of those directories.
Source: https://transang.me/library-path-in-gcc/
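A complementary quick check, not taken from the linked article: the driver can be asked directly which file it would use for a given library. The path below is only an illustration for a 64-bit multiarch system; if the library is not found in the search path, g++ just echoes the bare name back:

$ g++ -print-file-name=libglfw.so
/usr/lib/x86_64-linux-gnu/libglfw.so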

How can I exclude all “permission denied” result lines from “grep”?

So, the thing is, I'm in a Linux terminal using the grep command and I want the output without all the lines that start with "grep:" (or, put another way, only the lines that start with "./"), because right now I'm getting something like this:
grep: ./users/blabla1: Permission denied
grep: ./users/blabla2: Permission denied
grep: ./users/blabla3: Permission denied
grep: ./users/blabla4: Permission denied
grep: ./users/blabla5: Permission denied
grep: ./users/blabla6: Permission denied
grep: ./users/blabla7: Permission denied
grep: ./users/blabla8: Permission denied
./foo/bar/log.log
./foo/bar/xml.xml
I have tried this:
grep -irl "foo" . | grep -v "Permission denied"
I have also tried this one:
grep -irl "foo" . | grep -v "^grep:"
And finally this one:
grep -irl "foo" . | grep "^./"
But I keep getting the same results as if I hadn't put anything after the |. Any ideas? What am I missing?
The messages you are receiving are due to a lack of permission on those files, i.e., they are error messages.
All you have to do is redirect stderr (the standard error output) to /dev/null, like this:
grep -irl "foo" . 2> /dev/null
To learn more about redirection (in bash), read this article:
Bash Reference Manual - Redirections
Edit: You can also just suppress error messages by using:
grep -irl "foo" 2>&-
I prefer to use the -s 'suppress' flag:
grep -irls "foo"
Note the "Portability note" from the grep man page:
-s, --no-messages
Suppress error messages about nonexistent or unreadable files. Portability note: unlike GNU grep, 7th Edition Unix grep did not conform to POSIX, because it lacked -q and its -s option behaved like GNU grep's -q option. USG-style grep also lacked -q but its -s option behaved like GNU grep. Portable shell scripts should avoid both -q and -s and should redirect standard and error output to /dev/null instead. (-s is specified by POSIX.)
Going off of your first try:
grep -irl "foo" . | grep -v "Permission denied"
You're just missing the operator that also pipes standard error. '|' only pipes standard output; in bash, '|&' pipes standard error along with it, and standard error is where the "Permission denied" messages are coming from. Try:
grep -irl "foo" . |& grep -v "Permission denied"
This works for me, because for some reason my machine doesn't like the "2> /dev/null" option. (|& is just bash shorthand for 2>&1 |.)
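To put the working variants side by side (a sketch reusing the question's pattern and directory):

grep -irl "foo" . 2>/dev/null                        # throw the error messages away
grep -irls "foo" .                                   # let grep itself suppress them (-s)
grep -irl "foo" . |& grep -v "Permission denied"     # bash only: pipe stderr too, then filter it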

bzip command not working with "tee -a"

I want to redirect the stdout of the bzip2 command to a logfile using the tee command, but it's not working and gives an error about the '-a' flag meant for tee. Please see the error below:
> bzip2 file -c 1> tee -a logfile
bzip2: Bad flag `-a'
bzip2, a block-sorting file compressor. Version 1.0.5, 10-Dec-2007.
usage: bzip2 [flags and input files in any order]
-h --help print this message
-d --decompress force decompression
-z --compress force compression
-k --keep keep (don't delete) input files
-f --force overwrite existing output files
-t --test test compressed file integrity
-c --stdout output to standard out
-q --quiet suppress noncritical error messages
-v --verbose be verbose (a 2nd -v gives more)
-L --license display software version & license
-V --version display software version & license
-s --small use less memory (at most 2500k)
-1 .. -9 set block size to 100k .. 900k
--fast alias for -1
--best alias for -9
If invoked as `bzip2', default action is to compress.
as `bunzip2', default action is to decompress.
as `bzcat', default action is to decompress to stdout.
If no file names are given, bzip2 compresses or decompresses
from standard input to standard output. You can combine
short flags, so `-v -4' means the same as -v4 or -4v, &c.
What is the issue? Why is bzip2 seeing the '-a' flag that was meant for tee?
Try:
bzip2 -c file | tee -a logfile
The | (pipe) is redirecting the stdout of the left command to the stdin of the right command.
-c is a bzip2 option that means "compress or decompress to standard output"; see man bzip2.
Your problem is that 1> does not pipe the output of the bzip2 command to the tee command; it redirects the output to a file which will be named tee. Furthermore you probably don't want to use -c. You should be using the pipe | instead, as follows:
bzip2 file | tee -a logfile
Also, the reason bzip2 is complaining is that the command you typed is interpreted exactly like this one:
bzip2 file -a logfile 1> tee
And hence all the arguments after the word tee are actually handed to the bzip2 command.
As others have pointed out, you want a pipe, not output redirection:
bzip2 file | tee -a logfile
However, bzip2 doesn't produce any output; it simply replaces the given file with a compressed version of the file. You might want to pipe standard error to the log file:
bzip2 file 2>&1 | tee -a logfile
(2>&1 copies standard error to standard output, which can then be piped.)
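Putting the pieces together, a hedged sketch that actually gives tee something to log: bzip2's -v flag makes it report the compression ratio, and it writes that report to standard error, so it has to be merged into stdout before the pipe:

bzip2 -v file 2>&1 | tee -a logfile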

Limitation for piped redirection to file on shell?

I'm trying to do the following:
uname>>1.txt | echo #####>>1.txt | echo uname>>1.txt &
to get the following output:
uname
## ## ## ## ##
Linux (or whatever the uname is)
But instead all I get as output is:
uname
However if I try just:
uname>>1.txt | echo uname>>1.txt &
Then I do get the following output:
uname
Linux
Wondering if there is some limitation to this sort of piped redirection?
=======================================================================
I'll be calling this shell command from within a tcl script. Well, actually, there is a list of commands being executed from within the tcl script, and the outputs need to be formatted in the following way <------->
I wanted to run them in background to decrease the execution time, as the outputs of these commands are not related to each other.
I thought the commands in () would output the formatted output to 1.txt as a background process.
Would you suggest another way of doing this?
There are a number of problems here.
In general it's a bad idea to combine output redirection and pipes. Once redirected, there's nothing left to pipe.
Piping to echo doesn't make a bit of sense.
Use parentheses to put a suite of commands in the background.
You shouldn't be putting this in the background.
In general commands run from left to right, not right to left.
What you want is
(echo uname > 1.txt; echo ------ >>1.txt; uname >>1.txt)
Update (per comments and changes to the question)
You are continuing to invoke what is essentially undefined behavior with this command:
uname>>1.txt | echo uname>>1.txt &
The pipe from uname is invalid because there's nothing to pipe once you have redirected output. The pipe to echo is invalid because echo doesn't read from standard input. Which of the uname or echo commands prints its output first to the file 1.txt is up for grabs here. This is apparently what you want:
bash -c 'echo uname >> 1.txt; echo ------ >> 1.txt; uname >> 1.txt'
Note the -c option to bash. This tells bash that the argument following -c is a string that contains shell commands.
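A related sketch for the same goal: if the point of chaining was just to avoid repeating the file name, a brace group can share a single redirection (the commands and 1.txt are taken from the answer above):

{ echo uname; echo ------; uname; } >> 1.txt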
