SSH command for search and replace in directories and subdirectories

These commands, run over SSH, work for changing text in several files in a directory:
replace "old-string" "new-String" -- *.ext
replace "old-string" "new-String" -- *
replace "old-string" "new-String" -- filename
However, these won't touch subdirectories... does anybody know the command to include ALL subdirectories?

I think sed is better for this. Your first two examples can be rewritten:
find . -type f | xargs sed -i 's/old-string/new-string/g'
find . -type f -name '*.ext' | xargs sed -i 's/old-string/new-string/g'
You can also pipe the results of find to your replace command, if that is better for you.
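For example, a sketch along those lines (assuming the replace utility shipped with MySQL, as in your examples) should reach subdirectories too:
# hand every matching file in all subdirectories to replace
find . -type f -name '*.ext' | xargs replace "old-string" "new-String" --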

Related

Mass Find/Replace within files having specific filename under command line

I am looking for a quick command to search all .htaccess files for a specific IP address and change it to another IP address, from the command line.
something like
grep -rl '255.255.254.254' ./ | xargs sed -i 's/254/253/g'
I know the above example is a bad way to do it; it's just an example (and shows I did some searching to find a solution).
Search: files with filename .htaccess (within 2 levels deep of current path?)
Find: 255.255.254.254
Replace with: 255.255.253.253
or, is this too much to ask of my server and I would be better off replacing them as I find them?
Try:
find . -type f -name '.htaccess' -execdir sed -i 's/255\.255\.254\.254/255.255.253.253/g' {} +
How it works:
find .
Start looking for files in the current directory.
-type f
Look only for regular files.
-name '.htaccess'
Look only for files named .htaccess.
-execdir sed -i 's/255\.255\.254\.254/255.255.253.253/g' {} +
For any such files found, run this sed command on them.
Because . matches any character in a regular expression and you likely want to match only literal periods, we escape them: \.
We use -execdir rather than the older -exec because it is more secure against race conditions.
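If you'd like to see which files would be affected before editing anything in place, a read-only dry run with grep instead of sed should work (plain -exec is fine here since nothing is modified):
# preview: list the .htaccess files that contain the old address
find . -type f -name '.htaccess' -exec grep -l '255\.255\.254\.254' {} +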

Search&Replace into multiple files with the name of the containing folder

I have multiple folders with names:
1_1, 1_2, ..., 2_1, ...
Each of these folders contains the same file, named file.sh. The file has the following form:
job_name=NAME
Partition = Long
I want to use a search & replace command in the terminal (Linux) for all my folders, for example something like the following:
find . -type f -name "file.sh" -print |xargs sed -i 's/job_name/REPLACED_TEXT/g'
and in place of REPLACED_TEXT I want the name of the containing folder. For example, inside folder 1_1 there should be a file.sh with the modified form:
job_name=1_1
Partition = Long
I haven't found a solution for that yet.
You didn't specify how many subdirectories you might have to traverse, e.g.
./1_1/file.sh
./1_2/file.sh
./a/b/c/1_1/file.sh
So for this I'll just assume one subdirectory like so:
./1_1/file.sh
./1_2/file.sh
Something like the script below should get you started; it's not tested, just written off the top of my head. It's a bash script, but you can turn it into one long command. Make sure to back up your directory first in case it has unpredictable results.
for i in `find . -type f -name "file.sh"`;
do
    subdir=`echo $i | awk -F/ '{print $2}'`
    sed -e "s/job_name=NAME/job_name=$subdir/" "$i" > "$i.bak"
    mv "$i.bak" "$i"
done
You can try this line to print all the sed commands you want to execute:
find . -type f -name 'file.sh' | \
sed 's=^\./==; s=\(.*\)/\([^/]*\)=sed -i "s/NAME/\1/" \"&\"='
For each file we found, it extracts the name of its directory and creates a sed command able to replace NAME with it.
Output should be something like:
sed -i "s/NAME/1_1/" "1_1/file.sh"
sed -i "s/NAME/1_2/" "1_2/file.sh"
Then, if it looks good to you, you can repeat the command with the e flag added to the outer sed expression (a GNU sed extension), which makes sed execute its result (i.e. the inner sed command), like this:
find . -type f -name 'file.sh' | \
sed 's=^\./==; s=\(.*\)/\([^/]*\)=sed -i "s/NAME/\1/" \"&\"=e'
# 'e' command added here -----------------------------------^
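An alternative sketch, assuming each file.sh sits directly under the folder whose name you want (as in the 1_1/file.sh layout above), is a plain shell loop:
# loop over every file.sh one level down and substitute NAME with the folder name
for f in */file.sh; do
    dir=${f%/file.sh}                 # strip the trailing /file.sh, leaving e.g. 1_1
    sed -i "s/NAME/$dir/" "$f"
done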

Convert all EOL (dos->unix) of all files in a directory and sub-directories recursively without dos2unix

How do I convert all EOL (dos->unix) of all files in a directory and sub-directories recursively without dos2unix? (I do not have it and cannot install it.)
Is there a way to do it using tr -d '\r' and pipes? If so, how?
For all files in the current directory you can do it with a Perl one-liner: perl -pi -e 's/\r\n/\n/g' *
EDIT: And with a small modification you can do subdirectory recursion:
find | xargs perl -pi -e 's/\r\n/\n/g'
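A small variation that hands only regular files to perl (directories would otherwise trigger "Can't do inplace edit" warnings):
# recurse, but only over regular files, so perl never tries to edit a directory
find . -type f | xargs perl -pi -e 's/\r\n/\n/g'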
You can use sed's -i flag to change the files in-place:
find . -type f -exec sed -i 's/\x0d//g' {} \+
If I were you, I would keep backup copies of the files around to make sure the operation went okay; you can delete them when you're done. This can be done like so:
find . -type f -exec sed -i'.OLD' 's/\x0d//g' {} \+
find . -type f -name '*.OLD' -delete
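Before deleting the backups you can also confirm that no carriage returns remain in the edited files, for example (a sketch; printf '\r' supplies a literal carriage return for grep to search for):
# list any edited file that still contains a carriage return (should print nothing)
find . -type f ! -name '*.OLD' -exec grep -l "$(printf '\r')" {} +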
Do you have sane file names and directory names without spaces, etc in them?
If so, it is not too hard. If you've got to deal with arbitrary names containing newlines and spaces, etc, then you have to work harder than this.
tmp=${TMPDIR:-/tmp}/crlf.$$
# remove the scratch file and bail out if the script is interrupted
trap "rm -f $tmp.?; exit 1" 0 1 2 3 13 15
find . -type f -print |
while read name
do
    tr -d '\015' < $name > $tmp.1   # strip carriage returns into a scratch copy
    mv $tmp.1 $name                 # replace the original with the cleaned copy
done
rm -f $tmp.?
trap 0
exit 0
The trap stuff ensures you don't get temporary files left around. There are other tricks you can pull, with more random names for your temporary files. You don't normally need them unless you work in a hostile environment.
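One such trick, if mktemp is available, is to let it pick an unpredictable scratch-file name instead of crlf.$$ (a sketch; you would then use $tmp itself as the scratch file inside the loop):
# mktemp creates the scratch file and prints its unpredictable name
tmp=$(mktemp "${TMPDIR:-/tmp}/crlf.XXXXXX")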
You can also use the ex editor in batch mode.
find . -type f -exec bash -c 'echo -ne "%s/\\\r//\nx\n" | ex "{}" ' \;
If \r isn't followed by \n (which may be the case in the files Tim Pote mentioned):
deleting \r (using tr -d) may also remove the line breaks
replacing \r with \n will not produce double or triple newlines
Maybe Tim Pote could verify whether the points above apply to the files he mentioned.
This removes carriage returns from all files in the current directory and all subdirectories. It relies on GNU grep and GNU sed (-r, -I, -U, -i), so it should work on Linux but not necessarily on every Unix-like OS; printf '\r' hands grep a literal carriage return to search for:
grep -rlIU "$(printf '\r')" . | xargs sed -i 's/\r//'
If this is done on Windows:
try running the command in Git Bash:
$ find | xargs perl -pi -e 's/\r\n/\n/g'
It may print some "Can't do inplace edit" messages; you can ignore them.

Call sed in linux

I need to replace one string with another in some files. I know how to do that for a single file: sed -i 's/a/b/'. But what about doing it recursively? I think I have to use find . -name * with xargs somehow.
I need your help :)
You are correct, find and xargs are what you want to use. Here's an example which will find all files with the ".ext" extension in the current folder and all subfolders, and replace the letter a with the letter b in those files.
find . -name "*.ext" | xargs sed -i 's/a/b/g'

What's the best way to find a string/regex match in files recursively? (UNIX)

I have had to do this several times, usually when trying to find in what files a variable or a function is used.
I remember using xargs with grep in the past to do this, but I am wondering if there are any easier ways.
grep -r REGEX .
Replace . with whatever directory you want to search from.
The portable method* of doing this is
find . -type f -print0 | xargs -0 grep pattern
-print0 tells find to use ASCII NUL characters as the separator, and -0 tells xargs the same thing. If you don't use them you will get errors on files and directories that contain spaces in their names.
* as opposed to grep -r, grep -R, or grep --recursive which only work on some machines.
This is one of the cases for which I've started using ack (http://petdance.com/ack/) in lieu of grep. From the site, you can get instructions to install it as a Perl CPAN component, or you can get a self-contained version that can be installed without dealing with dependencies.
Besides the fact that it defaults to recursive searching, it allows you to use Perl-strength regular expressions, use regexes to choose files to search, etc. It has an impressive list of options. I recommend visiting the site and checking it out. I've found it extremely easy to use, and there are tips for integrating it with vi(m), emacs, and even TextMate if you use that.
If you're looking for a string match, use
fgrep -r pattern .
which can be faster than grep, since it searches for fixed strings rather than regular expressions.
More about the subject here: http://www.mkssoftware.com/docs/man1/grep.1.asp
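Fixed-string matching also means you don't have to escape regex metacharacters; for example, searching for a literal bracketed expression (a made-up name) is just:
# fgrep treats the pattern as a literal string, not a regex
fgrep -r 'prices[0]' .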
grep -r if you're using GNU grep, which comes with most Linux distros.
On most UNIXes it's not installed by default so try this instead:
find . -type f | xargs grep regex
If you use the zsh shell you can use
grep REGEX **/*
or
grep REGEX **/*.java
This can run out of steam if there are too many matching files.
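Bash 4 and later can do something similar once recursive globbing is switched on (a sketch):
# enable ** in bash, then search recursively with a glob
shopt -s globstar
grep REGEX **/*.java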
The canonical way though is to use find with exec.
find . -name '*.java' -exec grep REGEX {} \;
or
find . -type f -exec grep REGEX {} \;
The -type f bit restricts the search to regular files (as opposed to directories, links, and so on).
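If your find supports it (POSIX does), ending -exec with + instead of \; passes many files to each grep invocation, which is both faster and makes grep print the file name next to each match:
# batch many files per grep call instead of starting one grep per file
find . -type f -exec grep REGEX {} +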
I suggest changing the answer to:
grep REGEX -r .
The -r switch doesn't indicate regular expression. It tells grep to recurse into the directory provided.
This is a great way to find the exact expression recursively in one or more file types:
find . \( -name '*.java' -o -name '*.xml' \) | xargs egrep
Where
-name '*.<filetype>' -o
is repeated inside the parentheses \( \) for each additional file type you want to add to your recursive search.
As a bash alias it looks like this (the '\'' sequences are needed to embed single quotes inside the single-quoted alias):
alias fnd='find . \( -name '\''*.java'\'' -o -name '\''*.xml'\'' \) | xargs egrep'
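Since anything you type after the alias lands after the expanded text (which ends in egrep), you can then call it as, for example, fnd REGEX to search both file types in one go.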
