find and sed don't work in CentOS 7 - Linux

I'm trying to find and replace a word across my entire project. I tried the versions of find and sed below on CentOS 7, but nothing works:
find ./ -name "*.php" -exec sed -i '' s/mysql_/mysqli_/g \;
find ./ -name "*.php" -exec sed -i '' s/mysql_/mysqli_/g {} \;
find ./ -name "*.php" -exec sed -i '' 's/mysql_/mysqli_/g' {} \;
find ./ -name "*.php" -ls | xargs sed -i '' 's/mysql_/mysqli_/g'
sed: can't read s/mysql_/mysqli_/g: No such file or directory
All of the above commands give me this error in a loop, even though I'm running them from the root of my project. Permissions are all correct. If I simply use the find command alone, it works:
find ./ -name "*.php" -ls (This Works)
I tried solutions available on Stack Overflow, but nothing worked.

The first pair of quotes after -i isn't necessary; try:
find ./ -name "*.php" -exec sed -i 's/mysql_/mysqli_/g' {} \;
The GNU sed syntax is either -i'suffix' or --in-place='suffix', not -i 'suffix'. Since you added a space between -i and the suffix, sed takes the empty string argument as its script and treats your actual script as a filename argument, which obviously won't be found.
That's why you are getting the can't read s/mysql_/mysqli_/g: No such file or directory error.
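For context, this is a GNU vs BSD sed portability gotcha; a minimal sketch of both forms (the .bak suffix is just an example):
# GNU sed (CentOS 7): the backup suffix must be attached to -i, or omitted entirely
find ./ -name "*.php" -exec sed -i.bak 's/mysql_/mysqli_/g' {} +
# BSD/macOS sed is the opposite: -i takes a separate (possibly empty) argument
# find ./ -name "*.php" -exec sed -i '' 's/mysql_/mysqli_/g' {} +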

Related

Sed and grep in multiple files

I want to use "sed" and "grep" to search and replace in multiple files, excluding some directories.
I run this command:
$ grep -RnI --exclude-dir={node_modules,install,build} 'chaine1' /projets/ | sed -i 's/chaine1/chaine2/'
I get this message:
sed: pas de fichier d'entrée ("no input file")
I also tried with these two commands:
$ grep -RnI --exclude-dir={node_modules,install,build} 'chaine1' . | xargs -0 sed -i 's/chaine2/chaine2/'
$ grep -RnI --exclude-dir={node_modules,install,build} 'chaine2' . -exec sed -i 's/chaine1/chaine2/g' {} \;
But it doesn't work!
Could you help me please?
Thanks in advance.
You want find with -exec. Don't bother running grep; sed will only change lines containing your pattern anyway.
find \( -name node_modules -o -name install -o -name build \) -prune \
-o -type f -exec sed -i 's/chaine1/chaine2/' {} +
First, the direct output of the grep command is not a list of file paths; each line looks like {file_path}:{line_no}:{content}. So the first thing you need to do is extract the file paths. We can do this with the cut command, or with grep's -l option.
# This will print {file_path}
$ echo {file_path}:{line_no}:{content} | cut -f 1 -d ":"
# This is a better solution, because it prints each file only once, even when
# the grep pattern appears on many lines of a file.
$ grep -RlI --exclude-dir={node_modules,install,build} "chaine1" /projets/
Second, sed -i does not read from stdin. We can use xargs to read each file path from stdin and then pass it to sed as its argument. You have already done this.
The complete command looks like this:
$ grep -RlI --exclude-dir={node_modules,install,build} "chaine1" /projets/ | xargs -i sed -i 's/chaine1/chaine2/' {}
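As an aside, a hedged variant that survives filenames with spaces or newlines, assuming GNU grep (whose -Z option NUL-terminates the names printed by -l) and GNU xargs:
grep -RlIZ --exclude-dir={node_modules,install,build} "chaine1" /projets/ | xargs -0 sed -i 's/chaine1/chaine2/'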
Edit: Thanks to @EdMorton's comment, I dug into find. My previous solution walks the files outside the excluded directories once with grep, and then processes the files containing the pattern a second time with sed. With find, however, we can filter files by their path names first and then have sed process each file only once.
My find solution is almost the same as @knittl's, but with a bug fixed. Besides, I try to explain why it gets similar results to grep. (I still haven't found how to skip binary files the way grep's -I option does.)
$ find \( \( -name node_modules -o -name install -o -name build \) -prune -type f \
-o -type f \) -exec echo {} +
or
find \( \( -name node_modules -o -name install -o -name build \) -prune \
-o -type f \) -type f -exec echo {} +
\( -name pat1 -o -name pat2 \) matches paths whose name matches pat1 or pat2 (both files and directories), where -o means logical OR. -prune skips a directory and the files under it. Together they achieve an effect similar to --exclude-dir in grep.
-type f matches regular files only.
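Putting the pieces together, a sketch of the whole job in one pass (the path and substitution come from the question; adjust as needed):
find /projets/ \( -name node_modules -o -name install -o -name build \) -prune \
    -o -type f -exec sed -i 's/chaine1/chaine2/' {} +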

sed ack search / replace line break with string

I have looked into a few SO threads, none of which have helped my specific situation.
I am trying to update a PHP app that I took over from PHP 5.6 to PHP 8.0.
With that said, there are MANY instances that look like:
<?
echo ...
function
I need to find all cases where <? is followed directly by a newline and replace it with <?php(newline)
Per the SO posts I've read, I think I am coming close with the following:
find ./ -type f -readable -writable -exec sed -i ":a;N;$!ba;s/\<\?\n/\<\?php\n/g" {} \;
I think I am close, but I can't figure out why it won't replace <?\n with <?php\n, since the sed statement works without the newline. Per THIS POST it looks like I am doing it correctly.
What am I doing wrong?
Iterations I've tried:
$ find ./ -type f -readable -writable -exec sed -i ":a;N;$!ba;s/\<\?\n/\<\?php\n/g" {} \;
$ find ./ -type f -readable -writable -exec sed -i ":a;N;$!ba;s/<\?\n/<\?php\n/g" {} \;
$ find ./ -type f -readable -writable -exec sed -i ":a;N;$!ba;s/<?\n/<?php\n/g" {} \;
$ find ./ -type f -readable -writable -exec sed -i ":a;N;$!ba;s/<?\n\r/<?php\n/g" {} \;
$ find ./ -type f -readable -writable -exec sed -i ":a;N;$!ba;s/<?\r\n/<?php\n/g" {} \;
The sed command itself could be something as simple as:
sed -i 's/<?$/<?php/'
Glue that together with find and it might work for you.
$ is an anchor matching the end of a line, you might consider using ^ to anchor the match to the beginning as well:
s/^<?$/<?php/
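Glued together, a hedged sketch (restricting the search to .php files is my assumption; the question matched all readable, writable files):
find ./ -type f -name '*.php' -exec sed -i 's/^<?$/<?php/' {} +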

Missing Syntax of moving file from one folder to another [duplicate]

I was helped out today with a command, but it doesn't seem to be working. This is the command:
find /home/me/download/ -type f -name "*.rm" -exec ffmpeg -i {} -sameq {}.mp3 && rm {}\;
The shell returns
find: missing argument to `-exec'
What I am basically trying to do is go through a directory recursively (if it has other directories) and run the ffmpeg command on the .rm file types and convert them to .mp3 file types. Once this is done, remove the .rm file that has just been converted.
An -exec command must be terminated with a ; (so you usually need to type \; or ';' to avoid interpretation by the shell) or a +. The difference is that with ;, the command is called once per file, while with +, it is called as few times as possible (usually once, but there is a maximum length for a command line, so it might be split up) with all the filenames. See this example:
$ cat /tmp/echoargs
#!/bin/sh
echo $1 - $2 - $3
$ find /tmp/foo -exec /tmp/echoargs {} \;
/tmp/foo - -
/tmp/foo/one - -
/tmp/foo/two - -
$ find /tmp/foo -exec /tmp/echoargs {} +
/tmp/foo - /tmp/foo/one - /tmp/foo/two
Your command has two errors:
First, you use {};, but the ; must be an argument of its own.
Second, the command ends at the &&. You specified "run find, and if that was successful, remove the file named {};". If you want to use shell constructs in the -exec command, you need to run them explicitly in a shell, such as -exec sh -c 'ffmpeg ... && rm'.
However, you should not embed the {} inside the shell command string; that causes problems when filenames contain special characters. Instead, you can pass additional parameters to the shell after -c command_string (see man sh):
$ ls
$(echo damn.)
$ find * -exec sh -c 'echo "{}"' \;
damn.
$ find * -exec sh -c 'echo "$1"' - {} \;
$(echo damn.)
You see that the $(...) is evaluated by the shell in the first example. Imagine there was a file called $(rm -rf /) :-)
(Side note: the - is not needed, but the first argument after the command string is assigned to $0, a special variable that normally contains the name of the program being run, and setting that to a real parameter is a little unclean, though it probably won't cause any harm here; so we set it to just - and start with $1.)
So your command could be something like
find -exec bash -c 'ffmpeg -i "$1" -sameq "$1".mp3 && rm "$1"' - {} \;
But there is a better way: find supports AND and OR, so you may do things like find -name foo -or -name bar. That also works with -exec, which evaluates to true if the command exits successfully, and to false if not. See this example:
$ ls
false true
$ find * -exec {} \; -and -print
true
It only runs -print if the command was successful, which it was for true but not for false.
So you can use two -exec statements chained with -and, and the second will only execute if the first one ran successfully.
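Applied to the question, a hedged sketch (it keeps the question's -sameq flag, which newer ffmpeg releases have removed, and relies on GNU find substituting {} even inside a larger argument):
find /home/me/download/ -type f -name "*.rm" \
    -exec ffmpeg -i {} -sameq {}.mp3 \; -and -exec rm {} \;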
Try putting a space before each \;
Works:
find . -name "*.log" -exec echo {} \;
Doesn't Work:
find . -name "*.log" -exec echo {}\;
I figured it out now. When you need to run two commands with -exec in a find, you actually need two separate -execs. This finally worked for me:
find . -type f -name "*.rm" -exec ffmpeg -i {} -sameq {}.mp3 \; -exec rm {} \;
You have to put a space between {} and \;
So the command will be like:
find /home/me/download/ -type f -name "*.rm" -exec ffmpeg -i {} -sameq {}.mp3 && rm {} \;
(Note that the && here still splits the command at the shell, as explained above, so the space alone won't fix this particular command.)
Just for your information:
I have just tried using the find -exec command on a Cygwin system (UNIX emulated on Windows), and there it seems that the backslash before the semicolon must be removed:
find ./ -name "blabla" -exec wc -l {} ;
For anyone else having issues when using GNU find binary in a Windows command prompt. The semicolon needs to be escaped with ^
find.exe . -name "*.rm" -exec ffmpeg -i {} -sameq {}.mp3 ^;
You need to do some escaping I think.
find /home/me/download/ -type f -name "*.rm" -exec ffmpeg -i {} \-sameq {}.mp3 \&\& rm {}\;
Just in case anyone sees a similar "missing -exec args" in Amazon Opsworks Chef bash scripts, I needed to add another backslash to escape the \;
bash 'remove_wars' do
user 'ubuntu'
cwd '/'
code <<-EOH
find /home/ubuntu/wars -type f -name "*.war" -exec rm {} \\;
EOH
ignore_failure true
end
Also, if anyone else sees "find: missing argument to -exec", this might help:
In some shells you don't need to do the escaping, i.e. you don't need the "\" in front of the ";":
find <file path> -name "myFile.*" -exec rm -f {} ;
Both {} and && will cause problems due to being expanded by the shell. I would suggest trying:
find /home/me/download/ -type f -name "*.rm" -exec ffmpeg -i \{} -sameq \{}.mp3 \; -exec rm \{} \;
In my case I needed to execute functions from my bash script, which does not work when using -exec bash -c, so I add another solution I found here as well:
UploadFile() {
curl ... -F "file=$1"
}
find . | while IFS= read -r file
do
  UploadFile "$file"
done
This thread pops up first when searching for solutions to execute commands for each file from find, so I hope it's okay that this solution does not use the -exec argument
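A hedged hardening of the same loop (assumes GNU find and bash): NUL delimiters make it safe for filenames containing spaces or even newlines.
find . -type f -print0 | while IFS= read -r -d '' file
do
  UploadFile "$file"
done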
I got the same error when I left a blank space after the ending \; of an -exec command. So: remove any blank space after \;
If you are still getting "find: missing argument to -exec" try wrapping the execute argument in quotes.
find <file path> -type f -exec "chmod 664 {} \;"

How to pipe the results of 'find' to mv in Linux

How do I pipe the results of a 'find' (in Linux) to be moved to a different directory? This is what I have so far.
find ./ -name '*article*' | mv ../backup
but it's not right yet (I get a "missing file argument" error, because I didn't specify a file; I was trying to get it from the pipe).
find ./ -name '*article*' -exec mv {} ../backup \;
OR
find ./ -name '*article*' | xargs -I '{}' mv {} ../backup
xargs is commonly used for this, and mv on Linux has a -t option to facilitate that.
find ./ -name '*article*' | xargs mv -t ../backup
If your find supports -exec ... \+ you could equivalently do
find ./ -name '*article*' -exec mv -t ../backup {} \+
The -t option is a GNU extension, so it is not portable to systems which do not have GNU coreutils (though every proper Linux I have seen has that, with the possible exception of Busybox). For complete POSIX portability, it's of course possible to roll your own replacement, maybe something like
find ./ -name '*article*' -exec sh -c 'mv "$@" "$0"' ../backup {} \+
where we shamelessly abuse the convenient fact that the first argument after sh -c 'commands' ends up as the "script name" parameter in $0, so that we don't even need to shift it.
See also https://mywiki.wooledge.org/BashFAQ/020
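For completeness, a null-delimited sketch that is robust for any filename (assumes GNU find and GNU coreutils mv, for -print0/-0 and -t):
find ./ -name '*article*' -print0 | xargs -0 mv -t ../backup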
I found this really useful when having thousands of files in one folder:
ls -U | head -10000 | egrep '\.png$' | xargs -I '{}' mv {} ./png
to move all PNGs among the first 10000 directory entries into the png subfolder.
mv $(find . -name '*article*') ../backup
Here are a few solutions.
find . -type f -newermt "2019-01-01" ! -newermt "2019-05-01" \
-exec mv {} path \;
or
find path -type f -newermt "2019-01-01" ! -newermt "2019-05-01" \
-exec mv {} path \;
or
find /Directory/filebox/ -type f -newermt "2019-01-01" \
! -newermt "2019-05-01" -exec mv {} ../filemove/ \;
The backslash + newline is just for legibility; you can equivalently use a single long line.
xargs is your buddy here (when you have multiple actions to take)!
And using it the way I have shown will give you good control as well.
find ./ -name '*article*' | xargs -n1 -I {} sh -c "mv {} <path/to/target/directory>"
Explanation:
-n1
Number of input lines to consume per invocation
-I {}
Substitute each occurrence of {} in the command with the incoming line (without -I, xargs does not replace {})
sh -c
The shell command to execute, fed the lines as per the previous options
"mv {} <path/to/target/directory>"
The move command takes two arguments:
1) the line from the input, substituted for {}
2) the target path for the move command, as specified
Note: the double quotes let the shell command contain spaces and multiple arguments while receiving input from xargs

find & sed: remove lines

I am trying to delete some lines in PHP files. I tried to use a find/-exec combination:
find . -name '*.php' -exec sed '/#category/d' {} \;
but it only prints out the files' contents. Is there anything wrong with the syntax? Or what is the problem?
Could you try this command:
find . -name '*.php' -exec sed -i '/#category/d' {} \;
I think you've missed the -i option.
It works, but probably not how you expect.
find . -name '*.php' -exec sed -i '/#category/d' {} \;
Will kill the lines in question.
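If you want to be careful, a hedged two-step sketch (GNU sed assumed): preview the matching lines first, then delete in place while keeping backups.
# Preview the lines that would be deleted
find . -name '*.php' -exec sed -n '/#category/p' {} +
# Delete them in place, writing a .bak backup next to each file
find . -name '*.php' -exec sed -i.bak '/#category/d' {} +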
This should be the command for sed, so try adding -i. Note that with GNU sed the backup suffix must be attached to -i:
sed -i.bak '/culpa/d' test.txt
find . -name '*.php' -exec sed -i '/#category/d' {} \;
Source of the answer:
Bash - find a keyword in a file and delete its line
