Is there any bash symbol to represent the redirection operand in Linux interactive use?

I am looking for a shell parameter to represent the redirection operand.
e.g. data/temp.txt in this command:
cat file.txt > data/temp.txt
Is there any bash special parameter that will let me open the file I am redirecting to, in interactive use, after the command exits?
$ cat file.txt > data/temp.txt
$ vim data/temp.txt

I don't think so. But since the redirection operand usually comes last, you can use Bash's history expansion:
echo hello > testfile
cat !$ # cat testfile
Or, in an interactive session, Alt+. or Esc+_ can be used to insert the last word of the previous command, as a comment pointed out.
Bash documentation for history expansion.
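Applied to the question's example, that might look like this (!$ expands to the last word of the previous command, i.e. the redirection target):
$ cat file.txt > data/temp.txt
$ vim !$    # expands to: vim data/temp.txt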

Another possibility here is bash's process substitution -- if you're most interested in opening an editor on the output of cat, but you're not as interested in writing the output to a file, you can use
vim <(cat file.txt)
This writes stdout to a named pipe and opens the named pipe (probably named something like /dev/fd/NN) in a text editor. You make your edits, and you can then save your edits using <esc>:w data/temp.txt if you wish.

How can I give special command (insert) to vi?

I remember doing magic with vi by "programming" it with input commands but I don't remember exactly how.
My specific requirements are:
launch vi in a script with commands to be executed.
do an insert in one file.
search for a string in the file.
use $VARIABLE in the vi command line to replace something in the command.
finish with :wq.
If I remember correctly, I sent the commands exactly as in vi, and the ESC key was emulated by '[' or something similar.
I used this in scripts to edit and change files.
I'm looking at the -c option, but for now I cannot use $VARIABLE, and insert was impossible (with 'i' or 'o').
#!/bin/sh
#
cp ../data/* .
# Retrieve filename.
MODNAME=$(pwd | awk -F'-' '{ print $2 }')
mv minimod.c $MODNAME.c
# Edit and change the name inside the file (from minimod to the content of $MODNAME).
vi $MODENAME.c -c ":1,$s/minimod/$MODNAME/" -c ':wq!'
This example is not functioning (it seems $VARIABLE does not work in the -c command).
Could you help refresh my memory ;) ?
Thanks a lot.
Joe.
You should not use vi for non-interactive editing. There's already another tool for that: sed, the stream editor.
What you want to do is
sed -i "s/minimod/$MODNAME/g" "$MODNAME.c"
to replace your vi command.
Maybe you are looking for the ex command, which can be run non-interactively:
ex "$MODNAME.c" <<< "1,\$s/minimod/$MODNAME/
wq!
"
Or if you use an old shell that does not implement here strings:
ex "$MODNAME.c" << EOF
1,\$s/minimod/$MODNAME/
wq!
EOF
Or if you do not want to use here documents either:
echo "1,\$s/minimod/$MODNAME/" > cmds.txt
echo "wq!" >> cmds.txt
ex "$MODNAME.c" < cmds.txt
rm cmds.txt
One command per line in the standard input. Just remember not to write the leading :. Take a look at this page for a quick review of ex commands.
Granted, it would be better to use the proper tool for the job; that would be sed, as IsaA's answer suggests, or awk for more complex commands.
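Putting it together, a minimal sketch of the original script with the vi call replaced by sed (assuming GNU sed for -i and the same directory-name convention as in the question):
#!/bin/sh
# Copy the template sources and derive the module name from the directory name.
cp ../data/* .
MODNAME=$(pwd | awk -F'-' '{ print $2 }')
mv minimod.c "$MODNAME.c"
# Replace every occurrence of "minimod" with the module name, in place.
sed -i "s/minimod/$MODNAME/g" "$MODNAME.c"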

Linux save string to file without ECHO command

I want to save a command to a file (for example, I want to save the string "cat /etc/passwd" to a file) but I can't use the echo command.
How can I create and save a string to a file directly without using the echo command?
You can redirect cat to a file, type the text, and press Control-D when you're done, like this:
cat > file.txt
some text
some more text
^D
By ^D I mean press Control-D at the end, on an empty line.
It will not be part of the file; it just terminates the input.
Are you avoiding echo for security purposes (e.g. you're using a shared terminal and you don't want to leave a trace in the shell history of what you've written to your files), or are you just curious about an alternative method?
Simple alternative to echo:
As someone said, redirecting cat is probably the simpler way to go.
I'd suggest typing your end-of-file marker manually, like this:
cat <<EOF > outputfile
> type here
> your
> text
> and finish it with
> EOF
Here's the string you're asking for, as an example:
cat <<EOF > myscript.sh
cat /etc/passwd
EOF
You probably don't want everyone to know you've peeked into that file, but if that's your purpose, please note that wrapping it inside an executable file won't make it more private, as those lines will be logged anyway...
Security: avoiding history logs, etc.
In a modern shell, just try adding a space at the beginning of every command, and then use whatever you want freely.
BTW, my best hint is to avoid using that terminal at all, if you can. If you have two shells (another machine, or even just another secure user on the same machine), I'd recommend using netcat. See here: http://www.thegeekstuff.com/2012/04/nc-command-examples/
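Note that whether the leading space actually keeps a command out of the history depends on bash's HISTCONTROL variable; a quick sketch, assuming bash:
# Tell bash not to record commands that start with a space
# (ignoreboth additionally skips consecutive duplicates).
HISTCONTROL=ignorespace
 cat /etc/passwd > target.file   # leading space: this line is not saved to history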
{ { command ls $(dirname $(which cat)) |
grep ^ca't$'; ls /etc/passwd; } |
tr \\n ' '; printf '\n'; } > output-file
But it's probably a lot simpler to just do: printf 'cat /etc/passwd\n' > output-file
To be clear, this is a tongue-in-cheek solution. The initial command is an extraordinarily convoluted way to get what you want, and this is intended to be a humorous answer. Perhaps instructive to understand.
I am not sure I understood you correctly, but
cat /etc/passwd > target.file
uses the > operator to write it to a file without echoing.
If you need to do it inside a program:
cat <<EOF >file.txt
some text
some more text
EOF
I would imagine that you are probably trying to print the content of a string to a file, hence you mentioned echo.
You are avoiding this:
echo "cat /etc/passwd" > target.file
You can use a here string combined with cat.
cat > target.file <<< "cat /etc/passwd"
Now the file target.file will contain a string cat /etc/passwd.
$ cat target.file
cat /etc/passwd
$
To create the string:
var1='cat /etc/passwd'
To save a file or a variable to a file without echo, use:
cat "$FILE" > /new/file/path       # for a file
cat <<< "$var1" > /new/file/path   # for a variable

Bash printf %q invalid directive

I want to change my PS1 in my .bashrc file.
I've found a script using printf with the %q directive to escape characters:
#!/bin/bash
STR=$(printf "%q" "PS1=\u#\h:\w\$ ")
sed -i '/PS1/c\'"$STR" ~/.bashrc
The problem is that I get this error:
script.sh: 2: printf: %q: invalid directive
Any idea? Maybe another way to escape the characters?
The printf command is built into bash. It's also an external command, typically installed in /usr/bin/printf. On most Linux systems, /usr/bin/printf is the GNU coreutils implementation.
Older releases of the GNU coreutils printf command do not support the %q format specifier; it was introduced in coreutils 8.25, released in 2016. bash's built-in printf command does -- and has for as long as bash has had a built-in printf command.
The error message implies that you're running script.sh using something other than bash.
Since the #!/bin/bash line appears to be correct, you're probably doing one of the following:
sh script.sh
. script.sh
source script.sh
Instead, just execute it directly (after making sure it has execute permission, using chmod +x if needed):
./script.sh
Or you could just edit your .bashrc file manually. The script, if executed correctly, will add this line to your .bashrc:
PS1=\\u#\\h:\\w\$\
(The space at the end of that line is significant.) Or you can do it more simply like this:
PS1='\u#\h:\w\$ '
One problem with the script is that it will replace every line that mentions PS1. If you just set it once and otherwise don't refer to it, that's fine, but if you have something like:
if [ ... ] ; then
    PS1=this
else
    PS1=that
fi
then the script will thoroughly mess that up. It's just a bit too clever.
Keith Thompson has given good advice in his answer. But FWIW, you can force bash to use a builtin command by preceding the command name with builtin, e.g.
builtin printf "%q" "PS1=\u#\h:\w\$ "
Conversely,
command printf "%s\n" some stuff
forces bash to use the external command (if it can find one).
command can be used to invoke commands on disk when a function with the same name exists. However, command does not invoke a command on disk in lieu of a Bash built-in with the same name, it only works to suppress invocation of a shell function. (Thanks to Rockallite for bringing this error to my attention).
It's possible to enable or disable specific bash builtins (maybe your .bashrc is doing that to printf). See help enable for details. And I guess I should mention that you can use
type printf
to find out what kind of entity (shell function, builtin, or external command) bash will run when you give it a naked printf. You can get a list of all commands with a given name by passing type the -a option, eg
type -a printf
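On a typical Linux system with GNU coreutils installed, the output looks something like this (exact paths may vary):
$ type -a printf
printf is a shell builtin
printf is /usr/bin/printf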
You can use grep to see the lines in your .bashrc file that contain PS1:
grep 'PS1' ~/.bashrc
or
grep -n --color=auto 'PS1=' ~/.bashrc
which gives you line numbers and fancy coloured output. And then you can use the line number to force sed to just modify the line you want changed.
Eg, if grep tells you that the line you want to change is line 7, you can do
sed -i '7c\'"$STR" ~/.bashrc
to edit it. Or even better,
sed -i~ '7c\'"$STR" ~/.bashrc
which backs up the original version of the file in case you make a mistake.
When using sed -i I generally do a test run first without the -i so that the output goes to the shell, to let me see what the modifications do before I write them to the file.
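For example, one way to preview the change is to run sed without -i and diff the result against the original (line 7 here is just the illustration from above):
sed '7c\'"$STR" ~/.bashrc | diff ~/.bashrc -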

What is EOF!! in the bash script?

There is a command I don't understand:
custom_command << EOF!!
I want to ask what EOF!! is in a bash script. I did find EOF with Google, but Google ignores the "!!" automatically, so I could not find anything about EOF!!.
I know the end-of-file token, but I don't know exactly what the "!!" means in the script. Is this a mark to force something, like vim's wq!?
Plus, why and when should we use EOF!! instead of EOF?
On the command line, !! would be expanded to the last command executed. Bash will print the line for you:
$ ls
a.txt b.txt
$ cat <<EOF!!
cat <<EOFls
>
In a script, though, history expansion is disabled by default, so the exclamation marks are part of the word.
#! /bin/bash
ls
cat <<EOF!!
echo 1
EOFls
echo 2
Produces:
a.txt b.txt
script.sh: line 7: warning: here-document at line 3 delimited by end-of-file (wanted `EOF!!')
echo 1
EOFls
echo 2
To enable history and history expansion in a script, add the following lines:
set -o history
set -H
You can use whatever string you like as the here-document terminator.
EOF!! is just what the person writing the script decided to use.
It's probably just a weird heredoc.
Example:
cat << EOF!!
blabla
EOF!!
Note: this only works in script files. On the command line, the parser interprets !! as history expansion.
As others already wrote, this is a here-document.
The delimiter used for it should be chosen carefully; since the probability that the here-document contains EOF!! is lower than for EOF itself, they chose that.
I suppose they checked that it does no harm before using it; !! in a script does NOT refer to the history, it stays as it is.
The bash manual lists this under "Event designators", saying:
!!
Refer to the previous command. This is a synonym for !-1.
I simply searched for "bash manual double exclamation".

Colour highlighting output based on regex in shell

I'd like to know if I can colour highlight the output of a shell command that matches certain strings.
For example, if I run myCommand, with the output below:
> myCommand
DEBUG foo bar
INFO bla bla
ERROR yak yak
I'd like all lines matching ^ERROR\s.* to be highlighted red.
Similarly, I'd like the same highlighting to be applied to the output of grep, less etc...
EDIT: I probably should mention that ideally I'd like to enable this feature globally via a 'profile' option in my .bashrc.
There is an answer in superuser.com:
your-command | grep -E --color 'pattern|$'
or
your-command | grep --color 'pattern\|$'
This will "match your pattern or the end-of-line on each line. Only the pattern is highlighted..."
You can use programs such as:
spc (Supercat)
grc (Generic Colouriser)
highlight
histring
pygmentize
grep --color
You can do something like this, but the commands won't see a tty (some will refuse to run or behave differently or do weird things):
exec > >(histring -fEi error) # Bash
If you want to enable this globally, you'll want a terminal feature, not a process that you pipe output into, because a pipe would be disruptive to some commands (two problems are that stdout and stderr would appear out-of-order and buffered, and that some commands just behave differently when outputting to a terminal).
I don't know of any “conventional” terminal with this feature. It's easily done in Emacs, in a term buffer: configure font-lock-keywords for term-mode.
However, you should think carefully whether you really want that feature all the time. What if the command has its own colors (e.g. grep --color, ls --color)? Maybe it would be better to define a short alias to a colorizer command and run myCommand 2>&1|c when you want to colorize myCommand's output. You could also alias some specific always-colorize commands.
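For instance, a minimal colorizer alias named c (the name is just a placeholder used in this answer), built on the grep trick above:
# Highlight ERROR; every line still matches because of the |$ alternative
alias c='grep -E --color=always "ERROR|$"'
myCommand 2>&1 | c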
Note that the return status of a pipeline is that of its last command, so if you run myCommand | c, you'll get the status of c, not myCommand. Here's a bash wrapper that avoids this problem, which you can use as w myCommand:
w () {
    "$@" | c
    return "${PIPESTATUS[0]}"
}
You could try (maybe needs a bit more escaping):
BLUE="$(tput setaf 4)"
BLACK="$(tput sgr0)"
command | sed "s/^ERROR /${BLUE}ERROR ${BLACK}/g"
Try
tail -f yourfile.log | egrep --color 'DEBUG|'
where DEBUG is the text you want to highlight.
You can use the hl command, available on GitHub:
git clone http://github.com/mbornet-hl/hl
Then:
myCommand | hl -r '^ERROR.*'
You can use the $HOME/.hl.cfg configuration file to simplify the command line.
hl is written in C (source is available).
You can use up to 42 different colors of text.
Use awk.
COLORIZE_AWK_COMMAND='{ print $0 }'
if [ -n "$COLORIZE" ]; then
    COLORIZE_AWK_COMMAND='
        /pattern1/ { printf "\033[1;30m" }
        /pattern2/ { printf "\033[1;31m" }
        // { print $0 "\033[0m"; }'
fi
then later you can pipe your output
... | awk "$COLORIZE_AWK_COMMAND"
printf is used in the pattern actions so we don't print a newline, just set the color.
You could probably enable it for specific commands using aliases and user-defined shell functions without too much trouble. If you're coloring errors, I assume you want to process stderr. Since stderr is unbuffered, you would probably want to line-buffer it by sending it through a fifo.
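A rough sketch of that idea, using bash process substitution instead of an explicit fifo (assumes GNU grep for --line-buffered):
# Highlight ERROR lines arriving on stderr, leaving stdout untouched.
myCommand 2> >(grep --line-buffered -E --color=always '^ERROR|$' >&2)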
