Use of "{}" and "\;" in sed command [duplicate] - linux

This question already has answers here: Unix find command, what are the {} and \; for? (5 answers). Closed 6 years ago.
I've been googling this, but although I see people using them, I find no explanation of what they do. I've also tried "find and sed" searches, but again no explanation. The sed man page and other sed guides out there don't cover them either.
Now, I'm using the find command to find several files and then using sed to replace strings within those files. I believe that the "{}" and "\;" at the end of the sed command are what allow sed to take each file name from find and search through its text. But I'd rather not guess and know for sure what they are and why they're there. Here's my current command:
output=$(find . -type f -name '*.h' -o -name '*.C' -o -name '*.cpp' -o -name "*.cc" -exec sed -n -i "s/ARG2/ARG3/g" {} \;)
I'm also concerned that the ";" at the end may not be necessary, since I'm also wrapping the whole command and capturing its output in a variable. Could someone clarify what the curly braces and backslash are doing and whether I need them?
EDIT: It turns out that they're properties of find -exec, not sed. So I've been looking in the wrong place. Thanks!

find has two versions of the -exec action:
-exec ... {} +
...runs the command ..., with as many filenames from the results as possible added at the end.
-exec ... {} ';'
...runs the command ... once per file found, with the filename substituted in place of the {} sigil; it's not mandatory that the sigil be at the end of the line in this usage.
Thus, either a + or ; needs to be passed as a literal argument to find for it to accept the action as valid.
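A quick way to see the difference (a sketch; echo stands in for a real command, and the file names are hypothetical):
find . -name '*.txt' -exec echo {} \;   # one echo per file: ./a.txt, then ./b.txt, ...
find . -name '*.txt' -exec echo {} +    # one echo for the whole batch: ./a.txt ./b.txt ...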

Related

find -exec and run command on {} [duplicate]

Is there any way to apply Bash variable substitution on find's output? I know I've seen someone do it on Stack Overflow but I can't seem to find that particular post anymore.
As an example, let's say I want to rename files ending in *.png to *_copy.png. I know I can do this using rename but it's just a thought experiment.
Now I'd like to be able to do something like this:
find . -name "*png" -exec mv "{}" "${{}%.*}_copy.png" \;
Which results in an invalid substitution. Of course, I could first assign the output to a variable and then apply substitution in a sub-shell, but is this really the only way?
find . -name "*.png" -exec bash -c 'var="{}"; mv "{}" "${var%.*}_copy.png"' \;
Or is there any way this can be achieved directly from {}?
Consensus
As Etan Reisner remarked, a better and safer way to handle the output of find would be to pass it as a positional argument:
find . -name "*.png" -exec bash -c 'mv "$0" "${0%.*}_copy.png"' "{}" \;
It took me a while to get the question. Basically you are asking if something like:
echo "${test.png%.png}"
could be used to get the word test.
The answer is no. The string manipulation operators starting with ${ are a subset of the parameter substitution operators. They work only on variables, not on string literals, meaning you need to store the string in a variable first. Like this:
img="test.png"
echo "${img%.png}"
Just for travellers from Google, I would use rename for this particular task (as the OP already mentioned in the question). The command could look like this:
find -name '*png' -execdir rename 's/\.png/_copy.png/' {} +

Replace all \\ with / in files and subdirs [duplicate]

This question already has answers here: How to insert strings containing slashes with sed? (11 answers). Closed 5 years ago.
I need a quick command (Linux or Windows) to replace every \\ with a /, and all my attempts with sed failed because of the /.
(I already tried find . -name '*.*' -exec sed -i 's/\\///g' {} \;, but I think it failed because of the "/".)
find . -name '*.*' -type f -exec sed -i 's:\\\\:/:g' {} \;
You need to escape each backslash, and using a colon or comma as the separator is generally recommended when the replacement involves forward slashes. However, escaping the forward slash works too:
find . -name '*.*' -type f -exec sed -i 's/\\\\/\//g' {} \;
As pointed out in the comments, the OS module is probably what you really need to look at.
Edit: thanks to @tripleee for reminding me of the -type f test, which limits the match to files rather than including the current directory.
Also, I copied the *.* syntax from the OP, but in general it isn't helpful. * alone is usually what you want, since files aren't guaranteed to have a dot in their names. Assuming you are happy to include files not containing a dot, the simplest thing to do here is to have no -name at all:
find . -type f -exec sed -i 's:\\\\:/:g' {} \;
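Before running either command over a whole tree, it can be worth sanity-checking the substitution on a throwaway string first (the Windows-style path below is purely illustrative):
printf '%s\n' 'C:\\Users\\me\\file.txt' | sed 's:\\\\:/:g'   # prints C:/Users/me/file.txt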

How to grep contents from list of files from Linux ls or find command

I am running the "find . -name '*.txt'" command and getting a list of files.
I am getting the output below:
./bsd/contrib/amd/ldap-id.txt
./bsd/contrib/expat/tests/benchmark/README.txt
./bsd/contrib/expat/tests/README.txt
./bsd/lib/libc/softfloat/README.txt
and so on.
From these files, how can I run a grep command to read the contents and keep only those files which contain a certain keyword, e.g. "version"?
xargs is a great way to accomplish this, and it's already been covered.
The -exec option of find is also useful for this. It will perform a command over all files returned from find.
To invoke grep as few times as possible, passing multiple filenames to each call:
find . -name '*.txt' -exec grep -H 'foo' {} +
Alternately, to invoke grep exactly once for each file found:
find . -name '*.txt' -exec grep -H 'foo' {} ';'
In either case, {} is like a placeholder for the values from find; if your shell is zsh, it may be necessary to escape it, as in '{}'.
There are several ways to accomplish this.
If there are non-.txt files which might usefully contain the keyword:
grep -r KEYWORD *
This uses the recursive directory search option of grep.
To search only .txt files:
find . -name '*.txt' -exec grep KEYWORD {} \;
or
find . -name '*.txt' -exec grep KEYWORD {} +
or
find . -name '*.txt' -execdir grep KEYWORD {} +
The first runs grep once for each matching file. The second runs grep far fewer times, accumulating many matched files before each invocation. The third form runs grep once in every directory.
There is usually a function built into find for that, but to be portable across platforms, I typically use xargs. Say you want to find all the xml files in or below the current directory and get a list of each occurrence of 'foo'; you can do this:
find ./ -type f -name '*.xml' -print0 | xargs -0 -n 1 grep -H foo
It should be self-explanatory except for the -print0, which separates filenames with NULs rather than newlines, and the -0, which tells xargs to use those NULs rather than interpreting spaces and quotes as syntax (which can confuse it if filenames contain either).
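If what you actually want is only the names of the files containing the keyword, rather than the matching lines themselves, grep's -l option may be closer to the goal; a sketch using the keyword from the question:
find . -name '*.txt' -exec grep -l 'version' {} +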

Shell notation: find . -type f -exec file '{}' \; [duplicate]

This question already has answers here: Why are the backslash and semicolon required with the find command's -exec option? (2 answers). Closed 9 years ago.
This is a relatively simple command, so if a duplicate exists and someone could refer me, I'm sorry and I'll delete/close this question.
Man page for find
find . -type f -exec file '{}' \;
Runs 'file' on every file in or below the current directory. Notice that the braces are enclosed in single quote marks to protect them from interpretation
as shell script punctuation. The semicolon is similarly protected by the use of a backslash, though ';' could have been used in that case also.
I do not understand the notation \;. What in the world is that?
In the find command, the action -exec is followed by a command and that command's arguments. Because there can be any number of arguments, find needs some way of knowing when it ends. The semicolon is what tells find that it has reached the end of the command's arguments.
Left to their own devices, most shells would eat the semicolon. We want that semicolon to be passed to the find command. Therefore, we escape it with a backslash. This tells the shell to treat the semicolon as just one of the arguments to the find command.
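As a quick illustration (the exact error wording varies between find implementations):
find . -type f -exec file {} ;    # the shell eats the ';', so find complains about a missing argument to -exec
find . -type f -exec file {} \;   # the escaped ';' reaches find, and the command runs once per file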
MORE: Why not, one may ask, just assume that the exec command's arguments simply run to the end of the line? Why do we need to signal an end to the exec command's arguments at all? The reason is that find has advanced features. Just for example, consider:
find . -name '*.pdf' -exec echo Yes, we have a pdf: {} \; -o -exec echo No, not a pdf: {} \;

Insert line into multi specified files

I want to insert a line at the start of multiple files of a specified type, located in the current directory or its subdirectories.
I know that using
find . -name "*.csv"
can help me list the files I want to insert into,
and using
sed -i '1icolumn1,column2,column3' test.csv
can insert one line at the start of a file,
but now I do NOT know how to pipe the filenames from the "find" command to the "sed" command.
Could anybody give me any suggestion?
Or is there any better solution to do this?
BTW, is it possible to do this with a one-line command?
Try using xargs to pass the output of find as command-line arguments to the next command, here sed:
find . -type f -name '*.csv' -print0 | xargs -0 sed -i '1icolumn1,column2,column3'
Another option would be to use the -exec option of find.
find . -type f -name '*.csv' -exec sed -i '1icolumn1,column2,column3' {} \;
Note: It has been observed that xargs is a more efficient way and can handle multiple processes using the -P option.
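For anyone unsure what sed's 1i command actually does, here is a tiny standalone check (demo.csv is a hypothetical sample file):
printf 'a,b,c\n1,2,3\n' > demo.csv
sed -i '1icolumn1,column2,column3' demo.csv
cat demo.csv
# column1,column2,column3
# a,b,c
# 1,2,3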
This way:
find . -type f -name "*.csv" -exec sed -i '1icolumn1,column2,column3' {} +
-exec does all the magic here. The relevant part of man find:
-exec command ;
Execute command; true if 0 status is returned. All following arguments
to find are taken to be arguments to the command until an argument consisting
of `;' is encountered. The string `{}' is replaced by the current file name
being processed everywhere it occurs in the arguments to the command, not just
in arguments where it is alone, as in some versions of find. Both of
these constructions might need to be escaped (with a `\') or quoted to protect
them from expansion by the shell. See the EXAMPLES section for examples of
the use of the -exec option. The specified command is run once for each
matched file. The command is executed in the starting directory. There
are unavoidable security problems surrounding use of the -exec action;
you should use the -execdir option instead
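Following that last piece of advice, the -execdir form of the same command would look like this (a sketch; -execdir is available in GNU and BSD find but is not required by POSIX):
find . -type f -name '*.csv' -execdir sed -i '1icolumn1,column2,column3' {} +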
