Remove lines containing a string from all files in directory - linux

My server has been infected with malware. I have upgraded my Linux server to the latest version and no new files are being infected, but I need to clean up all the files now.
I can locate all the files doing the following:
grep -H "gzinflate(base64_decode" /home/website/data/private/assets/ -R | cut -d: -f1
But, I want to now delete the line containing gzinflate(base64_decode in every single file.

I'd use sed -i '/gzinflate(base64_decode/d' to delete the matching lines in a file:
... | xargs -I'{}' sed -i '/gzinflate(base64_decode/d' '{}'
Note: You really want to be using grep -Rl rather than grep -RH .. | cut -d: -f1, as -l lists only the matching filenames, so you don't need to pipe to cut.
Warning: You should really be concerned about the deeper issue of security here, I wouldn't trust the system at all now, you don't know what backdoors are open or what files may still be infected.
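Put together, the whole pipeline can be checked in a throwaway directory — the temp dir and file contents below are stand-ins for the real infected tree (GNU sed's -i is assumed):

```shell
# Throwaway demo of the grep -Rl | xargs sed -i pipeline
tmp=$(mktemp -d)
printf 'clean line\neval(gzinflate(base64_decode("...")));\nmore clean\n' > "$tmp/index.php"
printf 'nothing bad here\n' > "$tmp/other.php"

# -R recurses, -l prints matching filenames only; sed -i deletes in place
grep -Rl 'gzinflate(base64_decode' "$tmp" \
  | xargs -I'{}' sed -i '/gzinflate(base64_decode/d' '{}'
```

Afterwards only the injected lines are gone; untouched files are left as-is.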

Once you have the list of files from your command
grep -H "gzinflate(base64_decode" /home/website/data/private/assets/ -R | cut -d: -f1
you can loop through the files one by one and use
grep -v "gzinflate(base64_decode" file > newfile
then replace each original file with its filtered copy.
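A sketch of that loop in a scratch directory (the marker string and contents are stand-ins): grep -v writes a filtered copy, so the copy is moved back over the original.

```shell
tmp=$(mktemp -d)
printf 'good\ngzinflate(base64_decode bad\ngood2\n' > "$tmp/f.php"

# filter each matching file into a temp copy, then replace the original
grep -Rl 'gzinflate(base64_decode' "$tmp" | while IFS= read -r file; do
  grep -v 'gzinflate(base64_decode' "$file" > "$file.clean" && mv "$file.clean" "$file"
done
```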

Related

Find string in file then delete

I am trying to find a particular string in files on my server. I have done the following which gives me a list of files, but how do I now delete them?
grep -H -r "example" /home/72754/domains | cut -d: -f1
Try this if you want to delete files:
grep -l -r "example" /home/72754/domains | xargs rm
You can use sed with the same pattern and change what you want.
If you want to delete the whole line, then something like this (note that sed works on files, not directories, so point it at a file under that path):
sed '/example/d' /home/72754/domains/somefile
And to update the same file, use the -i flag
If you want to update a certain pattern, you can use something like this
sed 's/password/****/' /file
And again you can use the -i flag to update and overwrite the file
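Both forms on a scratch file (GNU sed's -i is assumed; the file contents are made up):

```shell
f=$(mktemp)
printf 'user=admin\npassword secret\nexample line\nkeep me\n' > "$f"

sed -i '/example/d' "$f"        # delete every line containing "example"
sed -i 's/password/****/' "$f"  # replace the word "password" with ****
```

The first command drops "example line" entirely; the second rewrites "password secret" to "**** secret" in place.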

Need to delete first N lines of grep result files

I'm trying to get rid of a hacker issue on some of my WordPress installs.
This guy puts 9 lines of code in the head of multiple files on my server... I'm trying to use grep and sed to solve this.
I'm trying:
grep -r -l "//360cdn.win/c.css" | xargs -0 sed -e '1,9d' < {}
But nothing is happening. If I remove -0 from xargs, the output for the files found is clean, but the original files are not overwritten with the `sed` result. Can anyone help me with that?
Many thanks!
You should use the --null option in the grep command to output a NUL byte (\0) after each filename in the grep output. Also use -i.bak in sed for in-place editing of each file, keeping a .bak backup:
grep -lR --null '//360cdn.win/c\.css' . | xargs -0 sed -i.bak '1,9d'
What's wrong with iterating over the files directly¹?
And you might want to add the -i flag to sed so that the files are edited in place:
grep -r -l "//360cdn.win/c.css" | while IFS= read -r f
do
sed -e '1,9d' -i "${f}"
done
¹ well, you might run into problems if your file names contain newlines and the like.
But then... if your website contains file names with newlines, you probably have other problems anyhow...
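The NUL-delimited variant from the answer above handles such names safely. A sketch with a made-up file name containing spaces and a 9-line injected header (GNU grep/sed assumed):

```shell
tmp=$(mktemp -d)
# 9 injected lines followed by the real content, in a file name with spaces
{ for i in 1 2 3 4 5 6 7 8 9; do echo "INJECTED //360cdn.win/c.css $i"; done
  echo 'real content'; } > "$tmp/page with spaces.php"

# --null in grep and -0 in xargs keep the pipeline safe for any file name
grep -lR --null '//360cdn\.win/c\.css' "$tmp" | xargs -0 sed -i '1,9d'
```

Only the tenth line survives, regardless of the spaces in the name.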

Change directory after unziping

I am making a script that allows me to unzip a given file. My problem is that I don't know how to change directory to the directory just created by the unzip process.
I tried with this command, but it's not working: SITE_DIRECTORY="$(ls -dt */ | head -1)"
Any idea on how to get the name of the directory just extracted ?
Edit: Now I got to SITE_DIRECTORY=$(unzip $SITE_NAME | grep 'creating:' | head -1 | cut -d' ' -f5-)
But a new problem arises: the unzip command does not extract all the files.
New ideas ?
If the directory is known, you could
unzip -j yourzip.zip -d /path/to/dir && cd /path/to/dir
Extra info from the man page (-j option):
-j  junk paths. The archive's directory structure is not recreated; all files are deposited in the extraction directory (by default, the current one).
The solution to my problem was the following commands:
unzip $SITE_NAME >output.txt
SITE_DIRECTORY=$(grep -m1 'creating:' output.txt | cut -d' ' -f5-)
rm output.txt
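The parsing step can be checked against simulated unzip output — the Archive/creating lines below are made up, mirroring unzip's usual format:

```shell
# Simulated output of: unzip "$SITE_NAME" > output.txt
output='Archive:  site.zip
   creating: my-site/
  inflating: my-site/index.html'

# "   creating: my-site/" splits on single spaces: field 5 onward is the dir
SITE_DIRECTORY=$(printf '%s\n' "$output" | grep -m1 'creating:' | cut -d' ' -f5-)
```

cut -d' ' treats each single space as a delimiter, so the three leading spaces plus "creating:" occupy fields 1-4 and -f5- yields the directory name.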
Thanks go to Evan (Unzip File which directory was created).

grep lines with X but not with Y

The problem:
I want to get all lines of code in my project folder that contain ".js", in order to check that I don't have un-minimized JavaScript files.
When I try the following: grep -H ".js\"" *
I get the right lines, but I still have a problem: I don't want lines containing ".min.js".
Is it possible, using the grep command, to search my project folder for all files/lines that have ".js" but not ".min.js"?
Thanks.
GalT.
Just pipe the output to another grep, as in
grep -H ".js" * | grep -v ".min.js"
You can do this with awk:
awk '/\.js/ && !/\.min\.js/' *
To print the filename:
awk '/\.js/ && !/\.min\.js/ {print FILENAME}' *
The following works on a folder as well.
For the current directory you can use this:
find . -type f | xargs grep "\.js" | grep -v "\.min\.js"
For any specific folder:
find (folder path) -type f | xargs grep "\.js" | grep -v "\.min\.js"

grep command working in testdir but not in "real" directory

I just thought I had found my solution because the command works in my test directory.
grep -H -e 'author="[^"].*' *.xml | cut -d: -f1 | xargs -I '{}' mv {} mydir/.
But using the command in the non-test directory did not work:
This is the error message:
grep: unknown option -- O
Usage: grep [OPTION]... PATTERN [FILE]...
Try `grep --help' for more information.
Not even this worked:
$ grep -H author *.xml
or this:
$ grep -H 'author' *.xml
(same error message)
I suspect it has some relation to the file names or the number of files.
I have almost 3000 files in the non-test directory and only 20 in my test directory.
In both directories almost all file names contain spaces and " - ".
Some more info:
I'm using Cygwin.
I am not allowed to change the filenames
Try this (updated):
grep -HlZ 'author="[^"].*' -- *.xml | xargs -0 -I {} mv -- {} mydir/
EXPLANATION (updated)
In your "real" directory you have a file with name starting with -O.
Your shell expands the file list *.xml and grep takes your - starting filename as an option (not valid). Same thing happens with mv. As explained in the Common options section of info coreutils, you can use -- to delimit the option list. What comes after -- is considered as an operand, not an option.
Using the -l (lowercase L) option, grep outputs only the filename of matching files, so you don't need to use cut.
To correctly handle every strange filename, you have to use the pair -Z in grep and -0 in xargs.
No need to use -e because your pattern does not begin with -.
Hope this will help!
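The effect of -- can be reproduced with a deliberately dash-named scratch file (the file names and contents here are invented):

```shell
tmp=$(mktemp -d); cd "$tmp"
printf '<x author="me">\n' > 'normal.xml'
printf '<x author="me">\n' > '-Ops.xml'   # name starts with a dash
mkdir mydir

# without the two --, grep and mv would parse -Ops.xml as options
grep -HlZ 'author="[^"].*' -- *.xml | xargs -0 -I {} mv -- {} mydir/
```

Both files end up in mydir/ because everything after -- is treated as an operand, never an option.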
