What is the meaning of this line in a Linux script? - linux

What is the meaning of the following lines, which appear at the top of a Linux script?
#!/bin/bash
return &> /dev/null

The line protects users who source the script instead of running it.
When the script is run (not sourced), the script proceeds past this line to the end of the file.
When the script is sourced, it just returns immediately and does nothing.

return is meant to return a value from a bash function.
When the script is executed (rather than sourced) and the &> /dev/null part is removed, this use is invalid, because the return is not in the body of a bash function, and bash would print:
line 2: return: can only `return' from a function or sourced script
However, somebody decided to hide that error message by redirecting the output to /dev/null.
In other words, this all does not make much sense.
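The behavior described above can be checked with a minimal sketch; the /tmp/guard.sh path is just an illustrative choice:

```shell
# Write a small demo script that starts with the guard line.
cat > /tmp/guard.sh <<'EOF'
#!/bin/bash
return &> /dev/null
echo "script body ran"
EOF
chmod +x /tmp/guard.sh

# Executed normally: `return` fails outside a function, the error
# message is hidden by &> /dev/null, and execution continues.
run_out=$(/tmp/guard.sh)

# Sourced: `return` is legal in a sourced file, so it returns
# immediately and the body is never reached.
source_out=$(bash -c 'source /tmp/guard.sh')
```

When run, run_out contains the body's output, while source_out stays empty.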

This discards the unwanted output of your program. Please check out this link; the differences between >, 2> and &> are very well explained there.
Quoting:
command &>/dev/null
This syntax redirects both standard output and error output messages to /dev/null where it is ignored by the shell.
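A minimal sketch of the difference between the three operators, using a hypothetical helper function that writes one line to each stream:

```shell
# Hypothetical helper: writes one line to stdout and one to stderr.
both() {
    echo "to stdout"
    echo "to stderr" >&2
}

both >  /tmp/out_only.txt 2>/dev/null   # >  redirects stdout only
both 2> /tmp/err_only.txt  >/dev/null   # 2> redirects stderr only
both &> /tmp/both.txt                   # &> redirects both streams
```

Each output file ends up with exactly the stream(s) its operator names.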

Related

wget's console output to variable

I have a shell script with which I want to automate downloading tasks. I would like to capture the command's output in a variable; the command is as follows.
var=`wget --ftp-user=MyName --ftp-password=MyPassword --directory-prefix=/home/pi/Desktop/FTP_File/ ftp://202.xx.xx.xx/VideoFiles/Video_1.mp4 2>&1`
echo check "$var"
I achieved this after adding 2>&1 at the end of the line and wrapping the command in backticks. I would like to know what 2>&1 means, and whether there is any other way to achieve this.
Have a look at this: http://www.learnlinux.org.za/courses/build/shell-scripting/ch01s04.html
Any program that is written should include some error checking and output a message if an error occurs.
It is standard practice to write error messages to stderr and informative messages to stdout.
2>&1 redirects stderr (file descriptor 2) to wherever stdout (file descriptor 1) currently points, so both the error and normal output of a command end up in the same place and can be captured together.
Hope I have answered your question.
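The effect of 2>&1 inside a command substitution can be sketched like this; fetch is a hypothetical stand-in for the wget call, which, like wget, writes its messages to stderr:

```shell
# Hypothetical stand-in for the wget call: like wget, it writes its
# progress/error messages to stderr, not stdout.
fetch() { echo "downloading..." >&2; }

# Without 2>&1 only stdout is captured, so the variable stays empty
# and the stderr messages still go to the terminal.
var_without=$(fetch)

# With 2>&1 inside the substitution, stderr is redirected to stdout,
# so both streams end up in the variable.
var_with=$(fetch 2>&1)
```

Note that the 2>&1 must appear inside the substitution for the variable to see the stderr text.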

How to stop writing to log file in BASH?

I'm using this code to write all of the terminal's output to a file:
exec > >(tee -a "${LOG_FILE}" )
exec 2> >(tee -a "${LOG_FILE}" >&2)
How do I tell it to stop? For example, if I don't want some output to get into the log.
Thank you!
It's not completely clear what your goal is here, but here is something to try.
Do you know about the script utility? You can run script myTerminalOutputFile, and any commands and output after that will be captured to myTerminalOutputFile. The downside is that it captures everything, including funny screen-control characters like ^[[5m;27j. So do a quick test to see if that works for you.
To finish capturing output, just type exit and you are returned to your parent shell's command line.
Warning: check man script for details about the inherent funkiness of your particular OS's version of script. Some are better than others. script -k myTerminalOutputFile may work better in this case.
IHTH
You may pipe through sed to filter out the unwanted lines, and then tee to the log file.
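Another option, sketched below, is to save the original stdout on a spare file descriptor before starting tee, and restore it when you want logging to stop (the log path is hypothetical):

```shell
LOG_FILE=/tmp/demo.log      # hypothetical log path
: > "$LOG_FILE"             # start with an empty log

exec 3>&1                   # save the original stdout on spare fd 3
exec > >(tee -a "$LOG_FILE")

echo "logged line"          # goes to the terminal and the log

exec >&3 3>&-               # restore stdout; the pipe to tee closes
sleep 1                     # give the background tee time to flush
echo "unlogged line"        # goes to the terminal only
```

After the restore, tee sees end-of-file on its pipe and exits, so nothing further is written to the log.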

Get output from executing an applescript within a bash script

I tried this technique for storing the output of a command in a BASH variable. It works with "ls -l", but it doesn't work when I run an AppleScript. For example, below is my BASH script calling an AppleScript.
I tried this:
OUTPUT="$(osascript myAppleScript.scpt)"
echo "Error is ${OUTPUT}"
I can see my AppleScript running on the command line, and I can see the error output on the command line, but when it prints "Error is " it prints a blank, as if the AppleScript's output isn't getting stored.
Note: My AppleScript errors out on purpose so I can test this. I'm trying to handle errors correctly by collecting the AppleScript's output.
Try this to redirect stderr to stdout:
OUTPUT="$(osascript myAppleScript.scpt 2>&1)"
echo "$OUTPUT"
On success, the script's output is written to STDOUT. On failure, the error message is written to STDERR, and a non-zero return code is set. You want to check the return code first, e.g. if [ $? -ne 0 ]; then..., and if you need the details then you'll need to capture osascript's STDERR.
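The return-code check can be sketched as follows; flaky is a hypothetical stand-in for the failing osascript call, since osascript itself only exists on macOS:

```shell
# Hypothetical stand-in for `osascript myAppleScript.scpt`: fails with
# status 1 and reports the problem on stderr, as osascript does.
flaky() { echo "execution error: something broke" >&2; return 1; }

OUTPUT=$(flaky 2>&1)   # capture stderr along with stdout
status=$?              # exit status of the captured command

if [ "$status" -ne 0 ]; then
    msg="failed with status $status: $OUTPUT"
else
    msg="succeeded: $OUTPUT"
fi
```

For a plain variable assignment, $? reflects the exit status of the command inside the substitution, which is why the check works.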
Or, depending what you're doing, it may just be simplest to put set -e at the top of your shell script so that it terminates as soon as any error occurs anywhere in it.
Frankly, bash and its ilk really are a POS. The only half-decent *nix shell I've ever seen is fish, but it isn't standard on anything (natch). For complex scripting, you'd probably be better using Perl/Python/Ruby instead.
You can also use the clipboard as a data bridge. For example, if you wanted to get the stdout into the clipboard you could use:
osascript myAppleScript.scpt | pbcopy
In fact, you can copy to the clipboard directly from your AppleScript, e.g. with:
set the clipboard to "precious data"
-- or to set the clipboard from a variable
set the clipboard to myPreciousVar
To get the data inside a bash script you can read the clipboard to a variable with:
data="$(pbpaste)"
See also man pbpaste.

shell prompt seemingly does not reappear after running a script that uses exec with tee to send stdout output to both the terminal and a file

I have a shell script which writes all output to a logfile
and the terminal; this part works fine. But if I execute the script,
a new shell prompt only appears if I press Enter. Why is that, and how do I fix it?
#!/bin/bash
exec > >(tee logfile)
echo "output"
First, when I'm testing this, there always is a new shell prompt, it's just that sometimes the string output comes after it, so the prompt isn't last. Did you happen to overlook it? If so, there seems to be a race where the shell prints the prompt before the tee in the background completes.
Unfortunately, that cannot be fixed by waiting in the shell for tee; see this question on unix.stackexchange. Fragile workarounds aside, the easiest way I see to solve this is to put your whole script inside a list:
{
your-code-here
} | tee logfile
If I run the following script (suppressing the newline from the echo), I see the prompt, but not "output". The string is still written to the file.
#!/bin/bash
exec > >(tee logfile)
echo -n "output"
What I suspect is this: you have three different file descriptors trying to write to the same file (that is, the terminal): standard output of the shell, standard error of the shell, and the standard output of tee. The shell writes synchronously: first the echo to standard output, then the prompt to standard error, so the terminal is able to sequence them correctly. However, the third file descriptor is written to asynchronously by tee, so there is a race condition. I don't quite understand how my modification affects the race, but it appears to upset some balance, allowing the prompt to be written at a different time and appear on the screen. (I expect output buffering to play a part in this).
You might also try running your script after running the script command, which will log everything written to the terminal; if you wade through all the control characters in the file, you may notice the prompt in the file just prior to the output written by tee. In support of my race condition theory, I'll note that after running the script a few times, it was no longer displaying "abnormal" behavior; my shell prompt was displayed as expected after the string "output", so there is definitely some non-deterministic element to this situation.
@chepner's answer provides great background information.
Here's a workaround - works on Ubuntu 12.04 (Linux 3.2.0) and on OS X 10.9.1:
#!/bin/bash
exec > >(tee logfile)
echo "output"
# WORKAROUND - place LAST in your script.
# Execute an executable (as opposed to a builtin) that outputs *something*
# to make the prompt reappear normally.
# In this case we use the printf *executable* to output an *empty string*.
# Use of `$ec` is to ensure that the script's actual exit code is passed through.
ec=$?; $(which printf) ''; exit $ec
Alternatives:
@user2719058's answer shows a simple alternative: wrapping the entire script body in a group command ({ ... }) and piping it to tee logfile.
An external solution, as @chepner has already hinted at, is to use the script utility to create a "transcript" of your script's output in addition to displaying it:
script -qc yourScript /dev/null > logfile # Linux syntax
This, however, will also capture stderr output; if you wanted to avoid that, use:
script -qc 'yourScript 2>/dev/null' /dev/null > logfile
Note, however, that this will suppress stderr output altogether.
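A minimal sketch of the Linux syntax above, with a trivial stand-in for yourScript (the paths and contents are illustrative):

```shell
# Trivial stand-in for `yourScript`: writes to both stdout and stderr.
cat > /tmp/yourScript <<'EOF'
#!/bin/sh
echo "normal output"
echo "error output" >&2
EOF
chmod +x /tmp/yourScript

# Linux (util-linux) syntax: the raw typescript is discarded to
# /dev/null, while the terminal output is redirected into the logfile.
script -qc /tmp/yourScript /dev/null > /tmp/logfile
```

Because script runs the command on a pseudo-terminal, stderr is captured as well, and lines in the logfile end with CRLF.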
As others have noted, it's not that there's no prompt printed -- it's that the last of the output written by tee can come after the prompt, making the prompt no longer visible.
If you have bash 4.4 or newer, you can wait for your tee process to exit, like so:
#!/usr/bin/env bash
case $BASH_VERSION in ''|[0-3].*|4.[0-3]) echo "ERROR: Bash 4.4+ needed" >&2; exit 1;; esac
exec {orig_stdout}>&1 {orig_stderr}>&2 # make a backup of the original stdout and stderr
exec > >(tee -a "_install_log"); tee_pid=$! # track PID of tee after starting it
cleanup() { # define a function we'll call during shutdown
retval=$?
exec >&$orig_stdout # Copy your original stdout back to FD 1, overwriting the pipe to tee
exec 2>&$orig_stderr # If something overwrites stderr to also go through tee, fix that too
wait "$tee_pid" # Now, wait until tee exits
exit "$retval" # and complete exit with our original exit status
}
trap cleanup EXIT # configure the function above to be called during cleanup
echo "Writing something to stdout here"

Cannot redirect command output to a file (for postfix status)

I am trying to run the following command:
postfix status > tmp
however the resulting file never has any content written to it; the output is still sent to the terminal instead.
I have tried adding the following into the mix, and even piping to echo before redirecting the output, but nothing seems to have any effect:
postfix status 2>&1 > tmp
Other commands work no problem.
script -c 'postfix status' -q tmp
It looks like postfix writes to the terminal directly instead of to stdout. I don't understand piping to echo; did you mean piping to cat?
I think you can always use the script command, which logs everything that you see on the terminal. You would run script, then your command, then exit.
Thanks to another SO user, who has since deleted their answer so I can't thank them, I was put on the right track. I found the answer here:
http://irbs.net/internet/postfix/0211/2756.html
So for those who want to be able to capture the output of postfix, I used the following method.
Create a script which causes the output to go where you wish. I did that like this:
#!/bin/sh
cat <<EOF | expect 2>&1
set timeout -1
spawn postfix status
expect eof
EOF
Then I ran the script (say script.sh) and could pipe/redirect from there, i.e. script.sh > file.txt
I needed this for PHP so I could use exec and actually get a response.
