PowerShell IIS log analysis - iis-7.5

I almost have this PowerShell script completed, but I am stuck on the last part and could really use some help with the final step. Below is the script I have written so far:
$t1 =(get-date).AddMinutes(-10)
$t2 =$t1.ToUniversalTime().ToString("HH:mm:ss")
$IISLogPath = "C:\inetpub\logs\LogFiles\W3SVC1\"+"u_ex"+(get-date).ToString("yyMMdd")+".log"
$IISLogFileRaw = [System.IO.File]::ReadAllLines($IISLogPath)
$headers = $IISLogFileRaw[3].split(" ")
$headers = $headers | where {$_ -ne "#Fields:"}
$IISLogFileCSV = Import-Csv -Delimiter " " -Header $headers -Path $IISLogPath
$IISLogFileCSV = $IISLogFileCSV | where {$_.date -notlike "#*"}
$timeTaken = $IISLogFileCSV | where {$_.'cs-uri-stem' -eq '/Login.aspx' -AND $_.time -gt $t2} | Format-Table time,s-ip
So basically it looks at the current day's IIS log and filters the requests that hit the login page during the past 10 minutes. The part I am stuck on is that I want to be emailed when an IP hits it more than 10 times within those 10 minutes (basically to be alerted when brute-force attacks are happening). I have the email part of the code written; I just need the portion that checks when the s-ip hits /Login.aspx more than 10 times. Also, in my test box I have altered $t2 and $IISLogPath to the following:
$t2 = "20:00:00"
$IISLogPath = "C:\test\log.log"
Below is my example Log file:
#Software: Microsoft Internet Information Services 7.5
#Version: 1.0
#Date: 2012-06-27 15:05:24
#Fields: date time s-ip cs-method cs-uri-stem cs-uri-query s-port cs-username c-ip cs(User-Agent) sc-status sc-substatus sc-win32-status time-taken
2012-06-27 20:32:35 ::1 GET /Login.aspx - 80 - ::1 Mozilla/5.0+(Windows+NT+6.1;+WOW64;+rv:13.0)+Gecko/20100101+Firefox/13.0.1 500 0 0 24240
2012-06-27 20:32:35 ::1 GET /Login.aspx - 80 - ::1 Mozilla/5.0+(Windows+NT+6.1;+WOW64;+rv:13.0)+Gecko/20100101+Firefox/13.0.1 500 0 0 24240
2012-06-27 20:32:35 ::1 GET /Login.aspx - 80 - ::1 Mozilla/5.0+(Windows+NT+6.1;+WOW64;+rv:13.0)+Gecko/20100101+Firefox/13.0.1 500 0 0 24240
2012-06-27 20:32:35 ::1 GET /Login.aspx - 80 - ::1 Mozilla/5.0+(Windows+NT+6.1;+WOW64;+rv:13.0)+Gecko/20100101+Firefox/13.0.1 500 0 0 24240
2012-06-27 21:32:35 ::1 GET /Login.aspx - 80 - ::1 Mozilla/5.0+(Windows+NT+6.1;+WOW64;+rv:13.0)+Gecko/20100101+Firefox/13.0.1 500 0 0 24240
2012-06-27 21:32:35 ::1 GET /Login.aspx - 80 - ::1 Mozilla/5.0+(Windows+NT+6.1;+WOW64;+rv:13.0)+Gecko/20100101+Firefox/13.0.1 500 0 0 24240
2012-06-27 21:32:35 ::1 GET /Login.aspx - 80 - ::1 Mozilla/5.0+(Windows+NT+6.1;+WOW64;+rv:13.0)+Gecko/20100101+Firefox/13.0.1 500 0 0 24240
2012-06-27 21:32:35 ::1 GET /Login.aspx - 80 - ::1 Mozilla/5.0+(Windows+NT+6.1;+WOW64;+rv:13.0)+Gecko/20100101+Firefox/13.0.1 500 0 0 24240
2012-06-27 21:32:35 ::1 GET /Login.aspx - 80 - ::1 Mozilla/5.0+(Windows+NT+6.1;+WOW64;+rv:13.0)+Gecko/20100101+Firefox/13.0.1 500 0 0 24240
2012-06-27 21:32:35 ::1 GET /Login.aspx - 80 - ::1 Mozilla/5.0+(Windows+NT+6.1;+WOW64;+rv:13.0)+Gecko/20100101+Firefox/13.0.1 500 0 0 24240
2012-06-27 21:32:35 ::1 GET /Login.aspx - 80 - ::1 Mozilla/5.0+(Windows+NT+6.1;+WOW64;+rv:13.0)+Gecko/20100101+Firefox/13.0.1 500 0 0 24240
2012-06-27 21:32:35 ::1 GET /Login.aspx - 80 - ::1 Mozilla/5.0+(Windows+NT+6.1;+WOW64;+rv:13.0)+Gecko/20100101+Firefox/13.0.1 500 0 0 24240
2012-06-27 21:32:35 ::1 GET /Login.aspx - 80 - ::1 Mozilla/5.0+(Windows+NT+6.1;+WOW64;+rv:13.0)+Gecko/20100101+Firefox/13.0.1 500 0 0 24240
2012-06-27 21:32:35 ::1 GET /Login.aspx - 80 - ::1 Mozilla/5.0+(Windows+NT+6.1;+WOW64;+rv:13.0)+Gecko/20100101+Firefox/13.0.1 500 0 0 24240
2012-06-27 21:32:35 ::1 GET /Login.aspx - 80 - ::1 Mozilla/5.0+(Windows+NT+6.1;+WOW64;+rv:13.0)+Gecko/20100101+Firefox/13.0.1 500 0 0 24240

After a little tinkering with the script, I have found the solution. Below is the whole script:
$t1 =(get-date).AddMinutes(-10)
$t2 =$t1.ToUniversalTime().ToString("HH:mm:ss")
$IISLogPath = "C:\inetpub\logs\LogFiles\W3SVC1\"+"u_ex"+(get-date).ToString("yyMMdd")+".log"
$IISLogFileRaw = [System.IO.File]::ReadAllLines($IISLogPath)
$headers = $IISLogFileRaw[3].split(" ")
$headers = $headers | where {$_ -ne "#Fields:"}
$IISLogFileCSV = Import-Csv -Delimiter " " -Header $headers -Path $IISLogPath
$IISLogFileCSV = $IISLogFileCSV | where {$_.date -notlike "#*"}
$timeTaken = ($IISLogFileCSV | where {$_.'cs-uri-stem' -eq '/Login.aspx' -AND $_.time -gt $t2 -AND $_.'cs-method' -eq 'GET'}).count
$count = $timeTaken
if($count -ge 8)
{
Send-MailMessage -From "from@domain.com" -To "to@domain.com" -Subject "IIS Alert" -BodyAsHtml "Email body goes here" -Attachments $IISLogPath -SmtpServer ip.add.re.ss
}
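For the per-IP threshold the question actually asks about (alert when a single address hits /Login.aspx more than 10 times in the window), a Group-Object variant along these lines should work; it reuses the variables above, and the addresses and SMTP server are the same placeholders. Note that in W3C logs s-ip is the server address, so grouping on c-ip (the client address) may be what you really want:
$hits = $IISLogFileCSV | where {$_.'cs-uri-stem' -eq '/Login.aspx' -AND $_.time -gt $t2}
$offenders = $hits | Group-Object -Property 's-ip' | where {$_.Count -gt 10}
foreach ($offender in $offenders)
{
    $body = "IP $($offender.Name) hit /Login.aspx $($offender.Count) times in the last 10 minutes."
    Send-MailMessage -From "from@domain.com" -To "to@domain.com" -Subject "IIS Alert" -BodyAsHtml $body -Attachments $IISLogPath -SmtpServer ip.add.re.ss
}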

You ought to be using Microsoft LogParser for most of the heavy lifting in parsing/querying your logfiles. It'll save you a lot of grief, and probably be faster to boot.
You can wrap it with PowerShell to parse the results of your queries.
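As a rough sketch of that approach (assuming Log Parser 2.2 is installed and LogParser.exe is on the PATH; the exact switches may need adjusting), the grouping and thresholding can be pushed into the query itself and the CSV output read back in PowerShell:
# hypothetical query: count /Login.aspx hits per client IP in today's log
$log = "C:\inetpub\logs\LogFiles\W3SVC1\u_ex" + (Get-Date).ToString("yyMMdd") + ".log"
$query = "SELECT c-ip, COUNT(*) AS Hits FROM $log WHERE cs-uri-stem = '/Login.aspx' GROUP BY c-ip HAVING COUNT(*) > 10"
$offenders = LogParser.exe -i:IISW3C -o:CSV -stats:OFF $query | ConvertFrom-Csv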

Related

Save Alternate Output to Variable in Bash instead of Main Output?

Linux novice here, so bear with me.
I am writing a Bash script for school (on a CentOS 8 VM) and I am attempting to save the output of Siege (a load tester) to a variable so I can compare values.
Here is the issue I am running into: the HTTP lines between "The server is now under siege..." and "Lifting the server siege..." are what ends up stored in the variable, not the nice little summary after "Lifting the server siege..."
[root@prodserver siege-4.1.1]# siege -c 1 -t 1s 192.168.1.3
** SIEGE 4.1.1
** Preparing 1 concurrent users for battle.
The server is now under siege...
HTTP/1.1 200 0.00 secs: 6707 bytes ==> GET /
HTTP/1.1 200 0.01 secs: 2008 bytes ==> GET /assets/images/taste_bug.gif
HTTP/1.1 200 0.00 secs: 2579 bytes ==> GET /assets/images/backpack_bug.gif
HTTP/1.1 200 0.00 secs: 2279 bytes ==> GET /assets/images/desert_bug.gif
HTTP/1.1 200 0.00 secs: 1653 bytes ==> GET /assets/images/calm_bug.gif
HTTP/1.1 200 0.00 secs: 1251 bytes ==> GET /assets/javascripts/menus.js
...Shortened for readability...
HTTP/1.1 200 0.00 secs: 1251 bytes ==> GET /assets/javascripts/menus.js
HTTP/1.1 200 0.00 secs: 2579 bytes ==> GET /assets/images/backpack_bug.gif
HTTP/1.1 200 0.00 secs: 2279 bytes ==> GET /assets/images/desert_bug.gif
HTTP/1.1 200 0.00 secs: 1653 bytes ==> GET /assets/images/calm_bug.gif
HTTP/1.1 200 0.00 secs: 1251 bytes ==> GET /assets/javascripts/menus.js
HTTP/1.1 200 0.01 secs: 2008 bytes ==> GET /assets/images/taste_bug.gif
HTTP/1.1 200 0.00 secs: 2579 bytes ==> GET /assets/images/backpack_bug.gif
HTTP/1.1 200 0.00 secs: 2279 bytes ==> GET /assets/images/desert_bug.gif
HTTP/1.1 200 0.00 secs: 1653 bytes ==> GET /assets/images/calm_bug.gif
Lifting the server siege...
Transactions: 149 hits
Availability: 100.00 %
Elapsed time: 0.22 secs
Data transferred: 3.95 MB
Response time: 0.00 secs
Transaction rate: 677.27 trans/sec
Throughput: 17.97 MB/sec
Concurrency: 1.00
Successful transactions: 149
Failed transactions: 0
Longest transaction: 0.01
Shortest transaction: 0.00
Currently this is how I am attempting to store the variable in bash:
SIEGE="$(siege -c $1 -t $2 [ip])"
As mentioned before, when I echo $SIEGE, it turns out the variable stored all the HTTP lines and NOT the summary after "Lifting the server siege..."
My question is: how can I store that summary in a variable?
NOTE: I'm not familiar with siege so I have no idea if all of the output is going to stdout or if some of the output could be going to stderr.
Assuming all siege output is going to stdout ... a couple ideas depending on which lines need to be ignored:
# grab lines from `^Lifting` to end of output:
SIEGE="$(siege -c $1 -t $2 [ip] | sed -n '/^Lifting/,$ p')"
# ignore all lines starting with `^HTTP`
SIEGE="$(siege -c $1 -t $2 [ip] | grep -v '^HTTP')"
If it turns out some of the output is being sent to stderr, change the siege call to redirect stderr to stdout:
# from
siege -c $1 -t $2 [ip]
# to
siege -c $1 -t $2 [ip] 2>&1
Though I'd probably opt for saving all output to a file and then parsing the file as needed, ymmv ...
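For completeness, a sketch of the save-to-file approach, redirecting stderr as well since it is unclear which stream the summary goes to (the file path and the [ip] placeholder are just examples):
# capture everything siege prints, then pull out what is needed
siege -c "$1" -t "$2" [ip] > /tmp/siege.out 2>&1
# the summary block, from "Lifting the server siege..." onward
SUMMARY="$(sed -n '/^Lifting/,$ p' /tmp/siege.out)"
# a single numeric value, e.g. the transaction count, for comparisons
TRANSACTIONS="$(awk -F: '/^Transactions:/ {print $2+0}' /tmp/siege.out)"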

How to filter Apache access logs on the basis of IPs, domain, and URL?

I have to group identical IPs, domains, and URL patterns from my Apache access logs and print the count along with the IP, domain, and URL pattern.
Currently I am using an awk command, but it shows only the count and the IPs, not the domain and URL patterns.
My input is
Feb 2 03:15:01 lb2 haproxy2[30529]: "www.abc.com" 207.46.13.4 02/Feb/2020:03:15:01.668 GET /detail.php?id=166108259 HTTP/1.1 GET 404 123481 "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)" "" ci-https-in~ ciapache ci-web1 0 0 1 71 303 762 263 1 1 -- "" "" "" ""
Feb 2 03:15:02 lb2 haproxy2[30530]: "wap.abc.com" 106.76.245.226 02/Feb/2020:03:15:01.987 GET /listing.php?id=1009 HTTP/1.1 GET 200 182 "Mozilla/5.0 (Linux; Android 5.1.1; LG-K420 Build/LMY47V) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/46.0.2490.76 Mobile Safari/537.36" "https://wap.abc.com/s.php?q=land+buyers" ci-https-in~ ciapache ci-web2 0 0 0 18 18 17813 219 0 0 -- "" "" "" ""
Feb 2 03:15:02 lb2 haproxy2[30531]: "wap.abc.com" 106.76.245.226 02/Feb/2020:03:15:02.067 GET /listing.php?id=6397 HTTP/1.1 GET 200 128116 "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" "" ci-https-in~ varnish ci-van 0 0 0 1 3 470 1001 0 0 -- "" "" "" ""
Feb 2 03:15:02 lb2 haproxy2[30531]: "wap.abc.com" 106.76.245.226 02/Feb/2020:03:15:02.067 GET /listing.php?id=6397 HTTP/1.1 GET 200 128116 "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" "" ci-https-in~ varnish ci-van 0 0 0 1 3 470 1001 0 0 -- "" "" "" ""
Expected output
count ip domain url
2 106.76.245.226 wap.abc.com /listing.php?id=6397
1 106.76.245.226 wap.abc.com /listing.php?id=1009
1 207.46.13.4 www.abc.com /detail.php?id=166108259
Currently I am using this command, but it is not giving the expected output:
cat /var/log/httpd/access_log | grep www.abc.com* | awk '{print $7}' | sort -n | uniq -c | sort -rn | head -n 50
grep www.abc.com* /var/log/httpd/access_log | awk '{print $7,$6,$10}' | sort -n | uniq -c | sort -rn | head -n 50
Use the other columns in awk as well.
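If you want output in exactly the expected layout (count, IP, domain, URL, covering both the www and wap hosts), you could also let awk do the counting itself; this assumes the field positions match the sample lines above (IP in $7, quoted domain in $6, URL in $10):
awk '{ gsub(/"/, "", $6); cnt[$7 " " $6 " " $10]++ }
     END { for (k in cnt) print cnt[k], k }' /var/log/httpd/access_log | sort -rn | head -n 50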

Why can't Scapy ICMP get an answer when "ping" works fine?

From 10.18.90.139 to 10.18.90.254, a normal ICMP request sent with Scapy from Python gets no answer, but ping gets a reply. What could be the reason?
I tried to ping the IP via Scapy:
>>> ip = "10.18.90.254"
>>> from scapy.all import sr1, IP, ICMP
>>> sr1(IP(ip/ICMP()))
Begin emission:
.......Finished sending 1 packets.
................................................................^C
Received 1073 packets, got 0 answers, remaining 1 packets
Checked there is no proxy
[root# run]# env | grep -i pro
[root# run]# env | grep -i ht
But ping works fine
PING 10.18.90.254 (10.18.90.254) 56(84) bytes of data.
64 bytes from 10.18.90.254: icmp_seq=1 ttl=64 time=0.315 ms
64 bytes from 10.18.90.254: icmp_seq=2 ttl=64 time=0.264 ms
^C
--- 10.18.90.254 ping statistics ---
2 packets transmitted, 2 received, 0% packet loss, time 1462ms
rtt min/avg/max/mdev = 0.264/0.289/0.315/0.030 ms
Kernel IP routing table
Destination Gateway Genmask Flags Metric Ref Use Iface
192.168.1.0 0.0.0.0 255.255.255.0 U 0 0 0 eth3
10.18.90.0 0.0.0.0 255.255.255.0 U 0 0 0 eth4
10.9.67.0 0.0.0.0 255.255.255.0 U 0 0 0 eth5
169.254.0.0 0.0.0.0 255.255.0.0 U 1002 0 0 eth5
169.254.0.0 0.0.0.0 255.255.0.0 U 1003 0 0 eth4
135.0.0.0 10.9.67.1 255.0.0.0 UG 0 0 0 eth5
0.0.0.0 10.18.90.254 0.0.0.0 UG 0 0 0 eth4
Try something like this instead; the original call never sets the dst field of the IP header:
sr1(IP(dst="10.18.90.254") / ICMP())
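A minimal sketch of the working call (it needs root privileges, and the timeout keeps sr1 from waiting forever):
from scapy.all import sr1, IP, ICMP

ip = "10.18.90.254"
# dst= sets the destination address; timeout makes sr1 return None if nothing answers
reply = sr1(IP(dst=ip) / ICMP(), timeout=2)
if reply is None:
    print("no reply")
else:
    print(reply.summary())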

Are IIS and a TortoiseSVN working copy compatible?

I have a question about how IIS handles an SVN folder.
I am working with ASP.NET Web Forms and MapGuide. My problem is that when I set the path in IIS to my TortoiseSVN working copy, MapGuide stops working. But when I just copy and paste all the files from my working copy to a standard Windows folder and point the path at that, everything works fine.
So what does TortoiseSVN do?
Edit: here are some logs and errors
2017-11-21 09:36:37 ::1 GET /mapguide/mapviewernet/ajaxviewer.aspx SESSION=78e11ef8-ce9f-11e7-8000-208df200a4f8_en_MTI3LjAuMC4x0AFC0AFB0AFA&WEBLAYOUT=Library://MyProject/Layouts/MyProject.WebLayout 81 - ::1 - - 500 19 5 0
2017-11-21 09:36:37 ::1 GET /xxx/xxx/MapContainerRechtsForm.aspx - 81 - ::1 Mozilla/5.0+(Windows+NT+10.0;+Win64;+x64;+rv:57.0)+Gecko/20100101+Firefox/57.0 http://localhost:81/xxx/xxx/MapContainerForm.aspx 200 0 0 562
2017-11-21 09:36:37 ::1 GET /xxx/javascript/jquery.min.js - 81 - ::1%0 Mozilla/5.0+(Windows+NT+10.0;+Win64;+x64;+rv:57.0)+Gecko/20100101+Firefox/57.0 http://localhost:81/xxx/xxx/MapContainerRechtsForm.aspx 200 0 0 0
2017-11-21 09:36:37 ::1 GET /mapguide/mapviewernet/ajaxviewer.aspx SESSION=78e11ef8-ce9f-11e7-8000-208df200a4f8_en_MTI3LjAuMC4x0AFC0AFB0AFA&WEBLAYOUT=Library://MyProject/Layouts/MyProject.WebLayout 81 - ::1 - - 500 19 5 0
2017-11-21 09:36:37 ::1 GET /xxx/xxx/MapContainerRechtsForm.aspx - 81 - ::1 Mozilla/5.0+(Windows+NT+10.0;+Win64;+x64;+rv:57.0)+Gecko/20100101+Firefox/57.0 http://localhost:81/xxx/xxx/xxx.aspx 200 0 0 31
2017-11-21 09:36:37 ::1 GET /xxx/javascript/jquery.min.js - 81 - ::1%0 Mozilla/5.0+(Windows+NT+10.0;+Win64;+x64;+rv:57.0)+Gecko/20100101+Firefox/57.0 http://localhost:81/xxx/xxx/xxx.aspx 200 0 0 0
2017-11-21 09:36:37 ::1 GET /xxx/xxx/MapContainerForm.aspx - 81 - ::1 Mozilla/5.0+(Windows+NT+10.0;+Win64;+x64;+rv:57.0)+Gecko/20100101+Firefox/57.0 - 200 0 0 515
2017-11-21 09:37:24 ::1 GET /mapguide/mapagent/mapagent.fcgi OPERATION=GETPROVIDERCAPABILITIES&VERSION=2.0.0&SESSION=8d781ed4-ce9a-11e7-8000-208df200a4f8_en_MTI3LjAuMC4x0AFC0AFB0AFA&FORMAT=text%2Fxml&CLIENTAGENT=MapGuide%20Maestro%20v6.0.0.8909&PROVIDER=OSGeo.SDF 81 - ::1 - - 500 19 5 0
2017-11-21 09:38:24 ::1 GET /mapguide/mapagent/mapagent.fcgi OPERATION=GETPROVIDERCAPABILITIES&VERSION=2.0.0&SESSION=8d781ed4-ce9a-11e7-8000-208df200a4f8_en_MTI3LjAuMC4x0AFC0AFB0AFA&FORMAT=text%2Fxml&CLIENTAGENT=MapGuide%20Maestro%20v6.0.0.8909&PROVIDER=OSGeo.SDF 81 - ::1 - - 500 19 5 0
2017-11-21 09:39:24 ::1 GET /mapguide/mapagent/mapagent.fcgi OPERATION=GETPROVIDERCAPABILITIES&VERSION=2.0.0&SESSION=8d781ed4-ce9a-11e7-8000-208df200a4f8_en_MTI3LjAuMC4x0AFC0AFB0AFA&FORMAT=text%2Fxml&CLIENTAGENT=MapGuide%20Maestro%20v6.0.0.8909&PROVIDER=OSGeo.SDF 81 - ::1 - - 500 19 5 0
2017-11-21 09:40:24 ::1 GET /mapguide/mapagent/mapagent.fcgi OPERATION=GETPROVIDERCAPABILITIES&VERSION=2.0.0&SESSION=8d781ed4-ce9a-11e7-8000-208df200a4f8_en_MTI3LjAuMC4x0AFC0AFB0AFA&FORMAT=text%2Fxml&CLIENTAGENT=MapGuide%20Maestro%20v6.0.0.8909&PROVIDER=OSGeo.SDF 81 - ::1 - - 500 19 5 0
Dim Response As Net.WebResponse = Nothing
Dim WebReq As Net.HttpWebRequest = Net.HttpWebRequest.Create(URL)
Response = WebReq.GetResponse <-- exception
StatusCode = InternalServerError {500}
ResponseUri = {http://localhost:81/mapguide/mapviewernet/ajaxviewer.aspx?SESSION=48f61ece-cea8-11e7-8000-208df200a4f8_en_MTI3LjAuMC4x0AFC0AFB0AFA&WEBLAYOUT=Library://myProject/Layouts/myWebLayout.WebLayout}
OK, I got the solution for my problem:
I had to add the "Authenticated Users" group to my project folder, because the web.config in that folder could not be accessed.
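For reference, granting that permission from an elevated command prompt might look like this (the path is just a placeholder for the working-copy folder):
icacls "C:\inetpub\wwwroot\MyWorkingCopy" /grant "Authenticated Users":(OI)(CI)RX /T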

Select lines by condition and count with one line command

I need help with analyzing nginx logs. A sample of the log:
10.10.10.10 - - [21/Mar/2016:00:00:00 +0000] "GET /example?page=&per_page=100&scopes= HTTP/1.1" 200 769 "-" "" "1.1.1.1"
10.10.10.10 - - [21/Mar/2016:00:00:00 +0000] "GET /example?page=&per_page=500&scopes= HTTP/1.1" 200 769 "-" "" "1.1.1.1"
11.11.11.11 - - [21/Mar/2016:00:00:00 +0000] "GET /example?page=&per_page=10&scopes= HTTP/1.1" 200 769 "-" "" "1.1.1.1"
12.12.12.12 - - [21/Mar/2016:00:00:00 +0000] "GET /example?page=&per_page=500&scopes= HTTP/1.1" 200 769 "-" "" "1.1.1.1"
13.13.13.13 - - [21/Mar/2016:00:00:00 +0000] "GET /example HTTP/1.1" 200 769 "-" "" "1.1.1.1"
Is it possible to select, with a count, all unique IP addresses whose requests contain the per_page parameter with a value equal to or greater than 100?
So, the output can be in any format:
10.10.10.10 - 2 # ip 10.10.10.10 was found twice
12.12.12.12 - 1
Is it possible to get this with one command?
$ awk '/per_page=[0-9]{3}/{cnt[$1]++} END{for (ip in cnt) print ip, cnt[ip]}' file
12.12.12.12 1
10.10.10.10 2
This is absolutely basic awk - read the book Effective Awk Programming, 4th Edition, by Arnold Robbins if you're going to be doing any other text file processing in UNIX.
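The [0-9]{3} interval needs an awk that supports interval expressions and simply matches three consecutive digits; if you would rather compare the per_page value numerically, a variant like this should also work:
awk 'match($0, /per_page=[0-9]+/) {
  n = substr($0, RSTART + 9, RLENGTH - 9) + 0   # the digits after "per_page="
  if (n >= 100) cnt[$1]++
}
END { for (ip in cnt) print ip, cnt[ip] }' file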
