Send multiple outputs to sed - linux

When a program prints several lines to stdout as it runs, how can I feed all those lines to sed and perform some operations on them while they are being generated?
For example:
7zip a -t7z output_folder input_folder -mx9 > sed 's/.*[ \t][ \t]*\([0-9][0-9]*\)%.*/\1/'
7zip generates a series of lines as output, each including a percentage value, and I would like sed to display these values only, while they are being generated. The above script unfortunately does not work...
What is the best way to do this?

You should use the pipe | instead of the redirection > so that sed uses the first command's output as its input.
The script line above will instead have created a file named sed in the current directory.
Furthermore, 7zip may write these lines to stderr rather than stdout. If that is the case, redirect standard error to standard output before piping: 2>&1 |
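Putting both fixes together (assuming 7zip does write its progress to stderr):
7zip a -t7z output_folder input_folder -mx9 2>&1 | sed 's/.*[ \t][ \t]*\([0-9][0-9]*\)%.*/\1/'
If the percentages must appear as they are generated rather than in buffered chunks, GNU sed's -u (unbuffered) flag can also help.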

Related

use sed to delete patterns on a range of lines

I have a large text file that I would like to divide into segments and use sed to delete certain patterns in place. I would like to do this in a single command line using a pipe. For example:
sed -n 1,10p <text file> | sed -i 's/<pattern to remove>//'
The code above attempts to take the first 10 lines of the text file and remove the patterns from those 10 lines in place, so the resulting text file should have its first 10 lines modified. It doesn't work because the second command after the pipe requires an input file. Please help!
Something like
sed -i '1,10s/pattern//' foo.txt
though for in-place editing of a file I prefer ed or perl rather than relying on a non-standard sed extension like -i.
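As an illustration, a perl sketch of the same edit (the 1..10 flip-flop is tested against perl's line counter $., so only the first ten lines are touched):
perl -pi -e 's/pattern// if 1..10' foo.txt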
This seems close to what you're asking:
sed -ni '1,10s/pattern//p' file
but beware: with -n and -i together, only the lines where the substitution matched are written back to the file, and every other line is discarded. Drop -n and the p flag (as in the answer above) to keep the rest of the file intact.

Get the last line from a grep search on multiple files, and write them to an output file

I have multiple files located in multiple directories, and I search them for the keyword 'ENERGY' with grep. Each file yields multiple matches. I want to take the last matching line from each file and save the results in the output.txt file. I wrote the following code:
labl=SubDir
ENERGY=`grep 'ENERGY' MyDir*${labl}*/*.txt`
cat > output.txt << EOF
${ENERGY}
EOF
This code saves every match from each file, but as mentioned, I need only the last match from each file. For that I modified the grep command to:
ENERGY=`grep 'ENERGY' MyDir*${labl}*/*.txt|taile -l`
Unfortunately this doesn't do the job either. Instead, it saves all the match cases from the last file only.
How to solve it?
There is no need to run multiple processes/pipes to achieve this; gawk can do it in a single pass:
gawk '/ENERGY/{last=$0} ENDFILE{if(last!="") print last; last=""}' MyDir*"$labl"*/*.txt
/ENERGY/{last=$0}: on lines matching the regex ENERGY, set the variable last to the contents of the entire line, $0
ENDFILE{...}: run this {action} at the end of every input file supplied by the glob
if(last!="") print last: print last if it is not null
last="": reset the variable to null, so a file with no match doesn't reprint the previous file's line
MyDir*"${labl}"*/*.txt: quoting the variable inside the glob lets it match directory names that include spaces
Use a for loop:
for f in MyDir*"$labl"*/*.txt; do
grep ENERGY "$f" | tail -1 >> output.txt
done
Yet another possible approach is to use parallel, like this. You could probably achieve the same with xargs, but I personally prefer parallel as it is simpler and gives you the option of scaling the process later:
parallel -j1 "grep ENERGY {} | tail -n 1" ::: MyDir*"$labl"*/*.txt > output.txt

How to redirect one of several inputs?

In Linux/Unix command line, when using a command with multiple inputs, how can I redirect one of them?
For example, say I'm using cat to concatenate multiple files, but I only want the last few lines of one file, so my inputs are testinput1, testinput2, and tail -n 4 testinput3.
How can I do this in one line without any temporary files?
I tried tail -n 4 testinput3 | cat testinput1 testinput2, but this seems to just take in inputs 1 and 2.
Sorry for the bad title, I wasn't sure how to phrase it exactly.
Rather than trying to pipe the output of tail to cat, use bash's process substitution: the substituted process runs with its input or output connected to a FIFO or a file under /dev/fd, which allows you to treat the output of a process as if it were a file.
Normally you would redirect the output of the process substitution into a loop, e.g. while read -r line; do ## stuff; done < <(process). In your case, however, cat takes the file itself as an argument rather than reading from stdin, so you omit the initial redirection, e.g.
cat file1 file2 <(tail -n4 file3)
So be familiar with both forms: < <(process) when you need to redirect a process's output into stdin, and simply <(process) when you need the output of the process to be treated as a file.
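For completeness, here is the redirected form applied to this example (a sketch using the question's file names):
while read -r line; do
    printf '%s\n' "$line"    # process each of the last 4 lines of testinput3
done < <(tail -n 4 testinput3)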

Paste a chunk of text from stdin to a specific location in a file

I'm trying to figure out how to efficiently copy-paste from an X application to the terminal. Specifically, I want to highlight a text section in my web browser, then paste it, commented out, into a file after the shebang line.
the code I have so far is this:
xclip -o | sed 's/^/#/' | sed '2n' myscript.pl
the first command takes the text that I have highlighted in my browser
the second command comments the lines by adding #
the last bit does not work...
What I am trying to do here is append the text after line number 2 to my script, but obviously I am doing it wrong. Does anyone have a helpful suggestion?
You can use sed's r (read file) command, which safely handles all types of input, including input with special characters and multiple lines. This requires an intermediate file:
xclip -o | sed -e 's/^/#/' -e '$s/$/\n/' > TMP && sed -i '1r TMP' myscript.pl && rm TMP
sed only operates on one input stream at a time (either a pipe or a file); if you are using the output of xclip as the data stream, then you can't also tell sed to read from a file. Instead you could use command substitution to capture the modified output and use that in a separate command. How about:
sed "2i$(xclip -o | sed 's/^/#/')" myscript.pl
This will print the amended file to stdout; if you want to edit the file itself, add the -i flag.
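For in-place editing that would be:
sed -i "2i$(xclip -o | sed 's/^/#/')" myscript.pl
Be aware that this one-liner only handles a single-line selection cleanly; an embedded newline can terminate sed's i text mid-script, which is why the first answer goes through a temporary file instead.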

Bash standard output display and redirection at the same time

In the terminal, sometimes I would like to display the standard output and also save it as a backup, but if I use redirection (>, &>, etc.) the output is no longer displayed in the terminal.
I think I could do, for example, ls > localbackup.txt | cat localbackup.txt, but it just doesn't feel right. Is there a shortcut to achieve this?
Thank you!
tee is the command you are looking for:
ls | tee localbackup.txt
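If you also want standard error in the backup (the question mentions &>), merge it into stdout before the pipe; somecommand here is just a placeholder:
somecommand 2>&1 | tee localbackup.txt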
In addition to using tee to duplicate the output, it's worth mentioning that tee can append to the file instead of overwriting it (tee -a), so you can run several commands in sequence and retain all of the output. You can also use tail -f to "follow" the output file from a parallel process (e.g. a separate terminal):
command1 >localbackup.txt # create output file
command2 >>localbackup.txt # append to output
and from a separate terminal, at the same time:
tail -f localbackup.txt # this will keep outputting as text is appended to the file
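And a concrete sketch of the tee -a variant mentioned above, which keeps the live display as well:
command1 | tee localbackup.txt      # create the backup, still printing to the terminal
command2 | tee -a localbackup.txt   # append to it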
