Mass find/replace within files with a specific filename, from the command line

I am looking for a quick command to search all .htaccess files for a specific IP address and change it to another IP address, from the command line. Something like:
grep -rl '255.255.254.254' ./ | xargs sed -i 's/254/253/g'
I know the above example is a bad way to do it; it's just an example (and shows I did some searching to find a solution).
Search: files with filename .htaccess (within 2 levels deep of current path?)
Find: 255.255.254.254
Replace with: 255.255.253.253
Or is this too much to ask of my server, and would I be better off replacing them as I find them?

Try:
find . -type f -name '.htaccess' -execdir sed -i 's/255\.255\.254\.254/255.255.253.253/g' {} +
How it works:
find .
Start looking for files in the current directory.
-type f
Look only for regular files.
-name '.htaccess'
Look only for files named .htaccess.
-execdir sed -i 's/255\.255\.254\.254/255.255.253.253/g' {} +
For any such files found, run this sed command on them.
Because . in a regular expression matches any character, and you likely want to match only literal periods, we escape them: \.
We use -execdir rather than the older -exec because it is more secure against race conditions.
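To cover the "(within 2 levels deep)" part of the question, GNU find's -maxdepth can bound the recursion; here is a sketch (the depth value 3 is an assumption about what "2 levels deep of the current path" means):

```shell
# Limit the search to the current directory plus two levels below it;
# -maxdepth counts path components, so 3 matches ./a/b/.htaccess
# but not ./a/b/c/.htaccess.
find . -maxdepth 3 -type f -name '.htaccess' \
    -execdir sed -i 's/255\.255\.254\.254/255.255.253.253/g' {} +
```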

Related

Search and replace URL in all files

I'm looking to run a script or command to mass-change a string that is a URL. I've viewed many examples on this forum; however, none are working.
I have created a .sh file to run the following:
$SRC='$url = "https://www.myurl.com/subdir/process.do"';
$DST='$url="https://api.myurl.com/subdir/process.do"';
find . -type f -name "*.php" -exec sed -i 's/$SRC/$DST/g' {} +;
This is not working. I'm thinking it may be because of special characters in the search content? The search/replace needs to be run across all sub-directories, on .php files.
Any assistance would be greatly appreciated.
Thanks!
First thing: check your variable definitions. In bash, variable assignments do not start with a leading $. I.e., it should be:
SRC='$url = "https://www.myurl.com/subdir/process.do"';
DST='$url="https://api.myurl.com/subdir/process.do"';
Next, use single quotes for the literal parts of the sed expression and double quotes around the variables, as per:
https://askubuntu.com/questions/76808/how-do-i-use-variables-in-a-sed-command
Example that seems to work:
sed -i 's,'"$SRC"','"$DST"','
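To see why that quoting matters, here is a quick echo test (the values are made up):

```shell
SRC='$url = "old"'
echo 's/$SRC/x/'        # single quotes: prints s/$SRC/x/ -- no expansion
echo 's,'"$SRC"',x,'    # prints s,$url = "old",x, -- variable expanded
```

The shell concatenates the single-quoted literal pieces with the double-quoted (expanded) variable into one argument for sed.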
UPDATE: This exact script works perfectly for me on Linux:
#!/bin/bash
SRC='$url = "https://www.myurl.com/subdir/process.do"';
DST='$url="https://api.myurl.com/subdir/process.do"';
find . -type f -name "*.php" -exec sed -i 's,'"$SRC"','"$DST"',' {} \;
Contents of file "asdf.php" created in home directory (before running script):
$url = "https://www.myurl.com/subdir/process.do"
Contents of file "asdf.php" after running script:
$url="https://api.myurl.com/subdir/process.do"

Find text in files in subfolders

So this question might have been asked before, but after some hours of searching (or searching wrongfully) I decided to ask this question.
If it's already been answered before, please link me the question and close this one.
Here's my issue.
I have a folder on my filesystem, e.g. "files". This folder has a lot of subfolders, with their own subfolders. Some levels deep, they all have a file with the same name. That file contains a lot of text, but it's not ALL the same. I need a list of the files that contain a certain string.
I KNOW I can do this with
find ./ -type f -exec grep -H 'text-to-find-here' {} \;
but the main problem is: it will go over every single file on that filesystem. As the filesystem contains MILLIONS of files, this would take a LONG time, especially when I know the exact file this piece of text should be in.
visually it looks like this:
foobar/foo/bar/file.txt
foobar/foobar/bar/file.txt
foobar/barfoo/bar/file.txt
foobar/raboof/bar/file.txt
foobar/oof/bar/file.txt
I need a specific string out of file.txt (if that string exists...)
(And yes: the file in /bar/ is ALWAYS called file.txt...)
Can anyone help me with how to do this? I'm breaking my head over an "easy" solution :o
Thanks,
Daniel
Use the -name option to filter by name:
find . -type f -name file.txt -exec grep -H 'text-to-find-here' {} +
And if it's always in a directory named bar, you can use -path with a wildcard:
find . -type f -path '*/bar/file.txt' -exec grep -H 'text-to-find-here' {} +
With a single GNU grep command:
grep -rl 'pattern' --include='file.txt' .
--include=GLOB
Search only files whose base name matches GLOB (wildcard matching). Quote the glob so the shell doesn't expand it first.
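Here is a quick sketch of both answers on the layout from the question (directory names and file contents are invented for the demo):

```shell
mkdir -p foobar/foo/bar foobar/oof/bar
echo 'text-to-find-here' > foobar/foo/bar/file.txt
echo 'something else'    > foobar/oof/bar/file.txt

# find filters by full path, then greps only those files:
find . -type f -path '*/bar/file.txt' -exec grep -l 'text-to-find-here' {} +
# GNU grep alone, recursion restricted to file.txt:
grep -rl 'text-to-find-here' --include='file.txt' .
# both print ./foobar/foo/bar/file.txt
```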

How to recursively delete all files in a folder that don't match a given pattern

I would like to delete all files in a given folder that don't match the pattern ^transactions_[0-9]+.
Let's say I have these files in the folder
file_list
transactions_010116.csv
transactions_020116.csv
transactions_check_010116.csv
transactions_check_020116.csv
I would like to delete transactions_check_010116.csv and transactions_check_020116.csv and leave the first two as they are, using ^transactions_[0-9]+.
I've been trying to use find, something like below, but this expression deletes everything in the folder, not just the files that don't match the pattern:
find /my_file_location -type f ! -regex '^transactions_[0-9]+' -delete
What I'm trying to do here is use a regex to find all files in the folder that don't start with transactions_[0-9]+ and delete them.
Depending on your find implementation, you may have to use the -E option (BSD find) or -regextype posix-extended (GNU find) to enable extended regexes. Another problem is that -regex matches the whole path, starting with the directory you passed, not just the file name.
So the command should be:
find -E /my_file_location -type f ! -regex '.*/transactions_[0-9]+.*' -delete
(With GNU find: find /my_file_location -regextype posix-extended -type f ! -regex '.*/transactions_[0-9]+.*' -delete)
Note the trailing .*: your sample file names end in .csv, so anchoring the digits at the end of the path with $ would match nothing, and everything would be deleted.
But you should first run the same command with -print in place of -delete, to be sure of what it matches...
grep has a -v option to print everything not matching the provided regex. Note that find prints paths with a leading ./, so anchor on the directory separator rather than ^, and use -E so + works:
find . -type f | grep -vE '/transactions_[0-9]+' | xargs rm -f
(This pipeline breaks on file names containing whitespace; prefer the find -regex approach for those.)
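A dry run on the sample file names, using -print before trusting -delete (GNU find syntax; the file names come from the question):

```shell
touch transactions_010116.csv transactions_020116.csv \
      transactions_check_010116.csv transactions_check_020116.csv

# Lists only the transactions_check_* files; swap -print for -delete
# once the output looks right.
find . -regextype posix-extended -type f \
    ! -regex '.*/transactions_[0-9]+.*' -print
```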

Find and replace text in files recursively in Linux

I need to find and replace part of the text in all the files on my web server. I am aware of the command (from Googling):
find . -type f -exec sed -i 's/foo/bar/g' {} +
The problem, though, is that the text I need to replace contains / in it. For instance, I need to...
Find
/home/this/root/
With
/home/that/root/
Since the command above uses / as the separator between the find and replace parts, how do I include / in my search so the command does not get confused?
Use a different sed delimiter.
find . -type f -exec sed -i 's~foo~bar~g' {} +
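Applied to the paths in the question, any character that appears in neither string can serve as the delimiter (GNU sed's in-place -i assumed; on BSD/macOS it is -i ''):

```shell
# ~ never occurs in either path, so sed parses the s command cleanly.
find . -type f -exec sed -i 's~/home/this/root/~/home/that/root/~g' {} +
```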

SSH command for search and replace in directories and subdirectories

These commands, run over SSH, work for changing text in several files in a directory:
replace "old-string" "new-String" -- *.ext
replace "old-string" "new-String" -- *
replace "old-string" "new-String" -- filename
however, these won't target subdirectories... does anybody know the command to include ALL subdirectories?
I think sed is better for this. Your first two examples can be rewritten:
find . -type f -print0 | xargs -0 sed -i 's/old-string/new-string/g'
find . -type f -name '*.ext' -print0 | xargs -0 sed -i 's/old-string/new-string/g'
(-print0 and xargs -0 keep file names containing spaces intact.)
You can also pipe the results of find to your replace command, if that is better for you.
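find's own -exec avoids the pipe entirely and also handles odd file names safely (GNU sed -i assumed):

```shell
# Only *.ext files are touched; other files are left alone.
find . -type f -name '*.ext' -exec sed -i 's/old-string/new-string/g' {} +
```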
