Find string in file, then delete string

I am trying to find a particular string in files on my server. I have done the following which gives me a list of files, but how do I now delete them?
grep -H -r "example" /home/72754/domains | cut -d: -f1

Try this if you want to delete files:
grep -l -r "example" /home/72754/domains | xargs rm
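If any of the filenames might contain spaces or newlines, a NUL-safe variant is safer (assuming GNU grep and xargs):
grep -lrZ "example" /home/72754/domains | xargs -0 rm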

You can use sed with the same pattern and change whatever you want.
If you want to delete the whole line, then something like this (note that sed operates on files, not directories, so point it at a file rather than at /home/72754/domains):
sed '/example/d' file
And to update the same file, use the -i flag.
If you want to update a certain pattern instead, you can use something like this:
sed 's/password/****/' /file
And again you can use the -i flag to update and overwrite the file.
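To apply the line deletion across the whole tree in one go, you can combine it with the grep above (a sketch, assuming GNU sed for -i):
grep -lr "example" /home/72754/domains | xargs sed -i '/example/d'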

Related

Need to delete first N lines of grep result files

I'm trying to get rid of a hacker issue on some of my wordpress installs.
This guy puts 9 lines of code in the head of multiple files on my server... I'm trying to use grep and sed to solve this.
I'm trying:
grep -r -l "//360cdn.win/c.css" | xargs -0 sed -e '1,9d' < {}
But nothing is happening. If I remove -0 from xargs, the result for the files found is clean, but they are not overwriting the original files with the sed result. Can anyone help me with that?
Many thanks!
You should use the --null option in grep to output a NUL byte (\0) after each filename in the grep output. Also use -i.bak in sed to edit each file in place (keeping a .bak backup):
grep -lR --null '//360cdn.win/c\.css' . | xargs -0 sed -i.bak '1,9d'
What's wrong with iterating over the files directly¹?
And you might want to add the -i flag to sed so that the files are edited in place:
grep -r -l "//360cdn.win/c.css" | while IFS= read -r f
do
sed -e '1,9d' -i "${f}"
done
¹ Well, you might get problems if your file names contain newlines and the like.
But then, if your website contains files with newlines in their names, you probably have other problems anyhow...
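For what it's worth, the newline caveat can be sidestepped by NUL-terminating the loop as well; a sketch combining both answers (assumes GNU grep and sed, and a bash shell for read -d ''):
grep -rlZ '//360cdn.win/c\.css' . | while IFS= read -r -d '' f
do
    sed -i.bak '1,9d' "$f"    # keep a .bak copy of each cleaned file
done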

Recursively grep unique pattern in different files

Sorry title is not very clear.
So let's say I'm grepping recursively for urls like this:
grep -ERo '(http|https)://[^/"]+' /folder
and in the folder there are several files containing the same URL. My goal is to output this URL only once. I tried to pipe the grep to | uniq or sort -u, but that doesn't help.
example result:
/www/tmpl/button.tpl.php:http://www.w3.org
/www/tmpl/header.tpl.php:http://www.w3.org
/www/tmpl/main.tpl.php:http://www.w3.org
/www/tmpl/master.tpl.php:http://www.w3.org
/www/tmpl/progress.tpl.php:http://www.w3.org
If you only want the address and never the file it was found in, grep has the -h option to suppress filename output; the list can then be piped to sort -u to make sure every address appears only once:
$ grep -hERo 'https?://[^/"]+' folder/ | sort -u
http://www.w3.org
If you don't want the https?:// part, you can use Perl regular expressions (-P instead of -E) with variable length look-behind (\K):
$ grep -hPRo 'https?://\K[^/"]+' folder/ | sort -u
www.w3.org
If the structure of the output is always:
/some/path/to/file.php:http://www.someurl.org
you can use the cut command:
cut -d ':' -f 2- should work. Basically, it cuts each line into fields separated by a delimiter (here ":") and you select the 2nd and following fields (-f 2-).
After that, pipe through sort -u to filter out duplicates (uniq alone only removes adjacent duplicates, so sort first).
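Putting it together with the question's grep (a sketch):
grep -ERo 'https?://[^/"]+' /folder | cut -d: -f2- | sort -u
Note that -f 2- (rather than -f 2) matters here, since the URLs themselves contain a colon.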
Pipe to Awk:
grep -ERo 'https?://[^/"]+' /folder |
awk -F: '!a[substr($0,length($1)+2)]++'
The basic Awk idiom !a[key]++ is true the first time we see key, and forever false after that. Extracting the URL (or a reasonable approximation) into the key requires a bit of additional trickery.
This prints the whole input line if the key is one we have not seen before, i.e. it will print the file name and the URL for the first occurrence of each URL from the grep output.
Doing the whole thing in Awk should not be too hard, either.
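For the record, a minimal all-Awk sketch, using find to supply the file list since awk itself does not recurse (match(), RSTART and RLENGTH are POSIX awk):
find /folder -type f -exec awk '
{
    # pull every URL-shaped match out of the line, printing each URL once
    while (match($0, "https?://[^/\"]+")) {
        url = substr($0, RSTART, RLENGTH)
        if (!seen[url]++) print url
        $0 = substr($0, RSTART + RLENGTH)
    }
}' {} +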

Find specific string in subdirectories and order top directories by modification date

I have a directory structure containing some files. I'm trying to find the names of the top directories that contain a file with a specific string in it.
I've got this:
grep -r abcdefg . | grep commit_id | sed -r 's/\.\/(.+)\/.*/\1/';
Which returns something like:
topDir1
topDir2
topDir3
I would like to be able to take this output and somehow feed it into this command:
ls -t | grep -e topDir1 -e topDir2 -e topDir3
which would return the output filtered by the first command and ordered by modification date.
I'm hoping for a one-liner. Or maybe there is a better way of doing it?
This should work as long as none of the directory names contain whitespace or wildcard characters:
ls -td $(grep -r abcdefg . | grep commit_id | cut -d: -f1 | xargs dirname | sort -u)
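If it is specifically the top-level directory you need (as the sed in the question suggests), the second /-separated path component can be cut out instead; a sketch, assuming the ./topDir/... layout from the question:
ls -td $(grep -r abcdefg . | grep commit_id | cut -d: -f1 | cut -d/ -f2 | sort -u)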

Remove lines containing a string from all files in directory

My server has been infected with malware. I have upgraded my Linux server to the latest version and no new files are being infected, but I need to clean up all the files now.
I can locate all the files doing the following:
grep -H "gzinflate(base64_decode" /home/website/data/private/assets/ -R | cut -d: -f1
But, I want to now delete the line containing gzinflate(base64_decode in every single file.
I'd use sed -i '/gzinflate(base64_decode/d' to delete the matching lines in each file:
... | xargs -I'{}' sed -i '/gzinflate(base64_decode/d' '{}'
Note: you really want to be using grep -Rl rather than grep -RH .. | cut -d: -f1, as -l lists only the matching filenames, so you don't need to pipe to cut.
Warning: You should really be concerned about the deeper issue of security here, I wouldn't trust the system at all now, you don't know what backdoors are open or what files may still be infected.
Once you've got these files using your command:
grep -H "gzinflate(base64_decode" /home/website/data/private/assets/ -R | cut -d: -f1
you can loop through the files one by one and use:
grep -v "gzinflate(base64_decode" file > newfile
(then move newfile back over the original file).
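Spelled out as a complete loop (a sketch; grep -Z and bash's read -d '' keep unusual filenames safe):
grep -RlZ "gzinflate(base64_decode" /home/website/data/private/assets/ | while IFS= read -r -d '' f; do
    grep -v "gzinflate(base64_decode" "$f" > "$f.tmp" && mv "$f.tmp" "$f"
done
Note: if a file consists only of matching lines, grep -v exits non-zero and the mv is skipped, leaving the .tmp file behind for inspection.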

Using sed to insert text at the beginning of each line

How would one go about using sed in order to insert
rm -rf
at the start of each line of a file?
sed 's/^/rm -rf /' filename
EDIT
xargs would be a simpler way to delete all of the files listed in another file:
xargs -a filename rm -rf
Sometimes you have to go with the original sed method, especially when you want to do things like
csf -d ipaddr.
xargs doesn't seem to like the output created by some commands and gives up after the first line, e.g.:
sed 's/^/csf -d /' hacks >>hacks.sh
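In that case a plain shell loop avoids generating an intermediate script entirely (a sketch, assuming one IP address per line in hacks):
while IFS= read -r ip; do
    csf -d "$ip"    # deny each address directly
done < hacks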
