What's a simple method to dump pipe input to a file? (Linux)

I'm looking for a little shell script that will take anything piped into it and dump it to a file, for email debugging purposes. Any ideas?

The unix command tee does this.
man tee

cat > FILENAME

You're not alone in needing something similar... in fact, someone wanted that functionality decades ago and developed tee :-)
Of course, you can redirect stdout directly to a file in any shell using the > character:
echo "hello, world!" > the-file.txt

The standard unix tool tee can do this. It copies input to output, while also logging it to a file.
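For example (the command name and log path here are only placeholders):
some_command | tee /tmp/debug.log    # writes the output to the file and still prints it to the terminal
some_command | tee -a /tmp/debug.log # -a appends instead of overwriting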

Use Procmail. Procmail is your friend. Procmail is made for this sort of thing.
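A minimal ~/.procmailrc sketch along those lines (the mailbox path is just an example): the :0: recipe appends every incoming message to a single mbox file, using a local lock file.
:0:
/home/you/mail-debug.mbox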

To pipe the output of a command into a file, use:
command | tee file
This will also show the output.

If you want to analyze it in the script:
while IFS= read -r LINE; do
  echo "$LINE" >> "$OUTPUT"   # append; a plain > would overwrite the file on every line
done
But you can simply use cat. If cat gets something on stdin, it copies it to stdout, so you can just pipe into cat > "$OUTPUT". Both do the same thing; the cat version also works for binary data.

If you want a shell script, try this:
#!/bin/sh
exec cat >/path/to/file
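A possible way to use it (the script name and install path are made up; the script itself writes to /path/to/file as above):
chmod +x /usr/local/bin/dumpmail
echo "test message" | /usr/local/bin/dumpmail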

If exim or sendmail is what's writing into the pipe, then procmail is a good answer because it'll give you file locking/serialization and you can put it all in the same file.
If you just want to write into a file, then
- tee > /tmp/log.$$
or
- cat > /tmp/log.$$
might be good enough.

Huh? I guess I don't get the question.
Can't you just end your pipe with >> file?
For example
echo "Foobar" >> /home/mo/dumpfile
will append Foobar to the dumpfile (and create dumpfile if necessary). No need for a shell script... Is that what you were looking for?

If you don't also need to see the output on the terminal:
cat - > filename
or
cat > filename

Related

Linux save string to file without ECHO command

I want to save a command to a file (for example, I want to save the string "cat /etc/passwd" to a file), but I can't use the echo command.
How can I create and save a string to a file directly without using the echo command?
You can redirect cat to a file, type the text, and press Control-D when you're done, like this:
cat > file.txt
some text
some more text
^D
By ^D I mean press Control-D at the end, on a line by itself.
That line will not be part of the file; it just terminates the input.
Are you avoiding echo for security purposes (e.g. you're using a shared terminal and don't want to leave a trace in the shell history of what you've written into your files), or are you just curious about an alternative method?
Simple alternative to echo:
As someone said, redirecting cat is probably the simplest way to go.
I'd suggest manually typing your end-of-file marker, like this:
cat <<EOF > outputfile
> type here
> your
> text
> and finish it with
> EOF
Here's the string you're asking for, as an example:
cat <<EOF > myscript.sh
cat /etc/passwd
EOF
You probably don't want everyone to know you've peeked into that file, but if that's your purpose, please note that wrapping it inside an executable file won't make it any more private, as those lines will be logged anyway...
Security: avoiding history logs, etc.
In a modern shell, just add a space at the beginning of every command and use whatever you want freely.
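Note that this trick relies on bash's HISTCONTROL setting; a minimal sketch:
export HISTCONTROL=ignoreboth   # ignore commands starting with a space (and duplicates)
 cat > notes.txt                # the leading space keeps this command out of the history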
BTW, my best hint is to avoid using that terminal at all, if you can. If you have two shells (another machine, or even just another secure user on the same machine), I'd recommend using netcat. See here: http://www.thegeekstuff.com/2012/04/nc-command-examples/?utm_source=feedburner
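A rough netcat sketch of that idea (host name, port and file names are made up; older netcat builds may need -l -p 1234 instead of -l 1234):
nc -l 1234 > dumped.txt                  # on the receiving machine
nc receiving-host 1234 < file-to-send    # on the sending machine
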
{ { command ls $(dirname $(which cat)) |
grep ^ca't$'; ls /etc/passwd; } |
tr \\n ' '; printf '\n'; } > output-file
But it's probably a lot simpler to just do : printf 'cat /etc/passwd\n'
To be clear, this is a tongue-in-cheek solution. The initial command is an extraordinarily convoluted way to get what you want, and this is intended to be a humorous answer. Perhaps instructive to understand.
I am not sure I understood you correctly, but:
cat /etc/passwd > target.file
This uses the > operator to write the output to a file without echoing it.
If you need to use it inside a program:
cat <<EOF >file.txt
some text
some more text
EOF
I would imagine that you are probably trying to print the content of a string to a file, hence you mentioned echo.
You are avoiding this:
echo "cat /etc/passwd" > target.file
You can use a here string combined with cat.
cat > target.file <<< "cat /etc/passwd"
Now the file target.file will contain a string cat /etc/passwd.
$ cat target.file
cat /etc/passwd
$
To create the string:
var1="your command"
To save a file's contents to another file without echo, use:
cat "$FILE" > /new/file/path
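If what you actually want to write out is a variable's contents rather than a file, printf is a safer bet than cat, since cat treats the value as a file name; a small sketch (variable name and path are only examples):
var1='cat /etc/passwd'
printf '%s\n' "$var1" > /new/file/path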

Pass sed output as if it's file

I have a Ruby script that receives the name of a config file as an argument.
I need to run it in a loop, changing some parameter inside the config on each iteration.
Everything is fine with sed; however, I have no idea how to pass sed's output to Ruby so that Ruby thinks it's a file. Is that possible?
It might be clearer with code:
That is how it's usually launched:
ruby script.rb config.conf
What I want is:
sed 's/one_param/another_param/' config.conf | ruby script.rb ???????
What should I put there so that the Ruby script thinks it received a file whose content is sed's output?
I thought about a workaround of saving sed's output to a temporary file and then passing that file to script.rb, but I'm sure there is a better way to achieve this.
See this answer on how to use process substitution.
In short:
cat <( echo "yo")
Or in your case:
ruby script.rb <(sed 's/one_param/another_param/' config.conf)
To create a process substitution you enclose the command with <(...) like: <(COMMAND)
Check http://mywiki.wooledge.org/ProcessSubstitution
Conventionally, UNIX programs accept - as a filename to mean "read from standard input":
echo foo | wc -
This is a convention that works basically everywhere.
However, script writers who don't know Unix may not think to implement this. This is a bug that should be fixed by them, but you can work around it using /dev/stdin instead:
echo foo | wc /dev/stdin
In your example, this would be one of
sed 's/one_param/another_param/' config.conf | ruby script.rb -
sed 's/one_param/another_param/' config.conf | ruby script.rb /dev/stdin

Bash standard output display and redirection at the same time

In the terminal, sometimes I would like to display standard output and also save it as a backup. But if I use redirection (>, &>, etc.), it no longer displays the output in the terminal.
I think I could do, for example, ls > localbackup.txt; cat localbackup.txt, but it just doesn't feel right. Is there any shortcut to achieve this?
Thank you!
tee is the command you are looking for:
ls | tee localbackup.txt
In addition to using tee to duplicate the output (and it's worth mentioning that tee is able to append to the file instead of overwriting it, by using tee -a, so that you can run several commands in sequence and retain all of the output), you can also use tail -f to "follow" the output file from a parallel process (e.g. a separate terminal):
command1 >localbackup.txt # create output file
command2 >>localbackup.txt # append to output
and from a separate terminal, at the same time:
tail -f localbackup.txt # this will keep outputting as text is appended to the file

How to append several lines of text in a file using a shell script

I want to write several lines (5 or more) to a file I'm going to create in a script. I can do this with echo >> filename, but I would like to know the best way to do it.
You can use a here document:
cat <<EOF >> outputfile
some lines
of text
EOF
I usually use the so-called "here-document" Dennis suggested. An alternative is:
(echo first line; echo second line) >> outputfile
This should have comparable performance in bash: (....) starts a subshell, but echo is a builtin, so bash does not run /bin/echo, it does the echo itself.
It might even be faster because it involves no exec().
This style is even more useful if you want to use output from another command somewhere in the text.
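For instance, a sketch of that style with command output embedded in the appended text (the file name is made up):
(echo "Report generated on $(date)"; echo "Disk usage:"; df -h) >> report.txt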

How to redirect output to a file and stdout

In bash, calling foo would display any output from that command on the stdout.
Calling foo > output would redirect any output from that command to the file specified (in this case 'output').
Is there a way to redirect output to a file and have it display on stdout?
The command you want is named tee:
foo | tee output.file
For example, if you only care about stdout:
ls -a | tee output.file
If you want to include stderr, do:
program [arguments...] 2>&1 | tee outfile
2>&1 redirects file descriptor 2 (stderr) into file descriptor 1 (stdout), so that both are written to stdout and therefore both reach the output file given to tee.
Furthermore, if you want to append to the log file, use tee -a as:
program [arguments...] 2>&1 | tee -a outfile
$ program [arguments...] 2>&1 | tee outfile
2>&1 merges the stderr stream into the stdout stream.
tee outfile takes the stream it gets and writes it to the screen and to the file "outfile".
This is probably what most people are looking for. The likely situation is some program or script is working hard for a long time and producing a lot of output. The user wants to check it periodically for progress, but also wants the output written to a file.
The problem (especially when mixing stdout and stderr streams) is that there is reliance on the streams being flushed by the program. If, for example, all the writes to stdout are not flushed, but all the writes to stderr are flushed, then they'll end up out of chronological order in the output file and on the screen.
It's also bad if the program only outputs 1 or 2 lines every few minutes to report progress. In such a case, if the output was not flushed by the program, the user wouldn't even see any output on the screen for hours, because none of it would get pushed through the pipe for hours.
Update: The program unbuffer, part of the expect package, will solve the buffering problem. This will cause stdout and stderr to write to the screen and file immediately and keep them in sync when being combined and redirected to tee. E.g.:
$ unbuffer program [arguments...] 2>&1 | tee outfile
Another way that works for me is,
<command> |& tee <outputFile>
as described in the GNU Bash manual.
Example:
ls |& tee files.txt
If ‘|&’ is used, command1’s standard error, in addition to its standard output, is connected to command2’s standard input through the pipe; it is shorthand for 2>&1 |. This implicit redirection of the standard error to the standard output is performed after any redirections specified by the command.
For more information, see the Bash manual's section on redirection.
You can primarily use Zoredache's solution, but if you don't want to overwrite the output file, use tee with the -a option, as follows:
ls -lR / | tee -a output.file
Something to add...
The unbuffer package has support issues with some packages under Fedora and Red Hat releases. Setting those troubles aside, the following worked for me:
bash myscript.sh 2>&1 | tee output.log
Thank you ScDF and matthew, your inputs saved me a lot of time.
Using tail -f output should work.
In my case I had a Java process with output logs. The simplest solution to display the output logs and also redirect them into a file (named logfile here) was:
my_java_process_run_script.sh |& tee logfile
The result was the Java process running with its output logs displayed and written into the file named logfile.
You can do that for your entire script by using something like this at the beginning of it:
#!/usr/bin/env bash
test x$1 = x$'\x00' && shift || { set -o pipefail ; ( exec 2>&1 ; $0 $'\x00' "$@" ) | tee mylogfile ; exit $? ; }
# do whatever you want
This redirects both stderr and stdout to the file called mylogfile and lets everything go to stdout at the same time.
It uses some stupid tricks:
use exec without a command to set up redirections,
use tee to duplicate the outputs,
restart the script with the wanted redirections,
use a special first parameter (a NUL character, written with bash's $'string' notation) to flag that the script has been restarted (no equivalent parameter may be used by your original script),
try to preserve the original exit status when restarting the script, using the pipefail option.
Ugly but useful for me in certain situations.
Bonus answer since this use-case brought me here:
In the case where you need to do this as some other user:
echo "some output" | sudo -u some_user tee /some/path/some_file
Note that the echo will happen as you, and the file write will happen as "some_user". What will NOT work is running the echo as "some_user" and redirecting the output with >> "some_file", because then the file redirect would happen as you.
Hint: tee also supports appending with the -a flag. If you need to replace a line in a file as another user, you could execute sed as the desired user.
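For example, a sketch of that last hint (assuming GNU sed's -i option and the same made-up names as above):
sudo -u some_user sed -i 's/^old line.*/new line/' /some/path/some_file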
<command> |& tee filename # this creates the file "filename" with the command's output (stdout and stderr) as its content; if the file already exists, its previous content is overwritten.
<command> | tee >> filename # this appends the output to the file, but it doesn't print it on standard output (the screen).
I want to print something with echo on the screen and append that echoed data to a file:
echo "hi there, Have to print this on screen and append to a file"
tee is perfect for this, but this will also do the job:
ls -lr / > output; cat output
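For the echo case above, tee -a does it in one step (the file name is arbitrary):
echo "hi there, Have to print this on screen and append to a file" | tee -a output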
