I've been working on a web application and finally published it to Azure. The application is not critical, and I currently use only one role to keep costs down.
I would like to start trying to get a feel for who (if anyone) is using my site. Can anyone give me some suggestions on how I could do this? What I would really like is to avoid anything like the Google scripts I see some websites use for monitoring page hits. I would like to do as much as possible on the server.
Any help or advice on where to start and what to look at would be much appreciated.
Katarina
Aside from things like Google Analytics and StatCounter, you'd want to set up some performance counters that you can watch externally. This requires you to use the Diagnostic Monitor:
Set up which performance counters to track, and how often to poll for values
Set up how frequently to upload the collected values to Table Storage
Diagnostic data is aggregated from all your instances, so you can then run queries against the diagnostic tables. Cerebrata has a page that details these table names (you can also use their Diagnostics Manager tool, other 3rd-party tools, or roll your own).
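If you end up rolling your own, a minimal sketch of querying the uploaded counters might look like the following, assuming the classic WAD table name (WADPerformanceCountersTable) and its standard columns, plus the azure-data-tables Python package; the connection string is a placeholder:

```python
# Minimal sketch: read CPU samples that the Diagnostic Monitor uploaded
# to Table Storage. Assumes the classic WAD schema (CounterName,
# CounterValue, RoleInstance, EventTickCount columns).
from azure.data.tables import TableServiceClient  # pip install azure-data-tables

CONNECTION_STRING = "<your-diagnostics-storage-connection-string>"  # placeholder

service = TableServiceClient.from_connection_string(CONNECTION_STRING)
table = service.get_table_client("WADPerformanceCountersTable")

# OData filter on the counter path; the raw string keeps the backslashes intact.
query = r"CounterName eq '\Processor(_Total)\% Processor Time'"
for entity in table.query_entities(query):
    print(entity["RoleInstance"], entity["EventTickCount"], entity["CounterValue"])
```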
Igork posted this StackOverflow answer as well, which references some blog posts by Azure MVP Neil Mackenzie.
To add to Dave's answer, there are three levels of monitoring you can do:
If you want to know who is using your site, Google Analytics is best and free... There are a few others, but all involve injecting a small piece of JavaScript into your pages
If you want to know the load your site is under, inspecting performance counters via Cerebrata's tool (http://www.cerebrata.com) is likely best
If you want to go one step further and be notified when the load on your site is outside your predefined conditions (active monitoring), or have your website automatically scale up when the load is too high, AzureWatch (http://www.paraleap.com) is probably the best option
HTH
Related
I do reporting/analytics for site usage and engagement for a SharePoint Online site at my company. I currently pull the usage logs manually from site audit reports, and the process is very time-consuming and not always accurate. Does anyone know a better way to get these logs? Also, has anyone had success implementing a third-party platform to capture site visits, like Google Analytics? We have tried to implement Matomo, but without much success.
@B1landry,
You may want to try Azure Application Insights, which provides similar functionality to Google Analytics with the advantage of keeping your data in the same ecosystem.
Check the docs below to get started:
https://sharepoint.handsontek.net/2019/02/19/how-to-add-application-insights-to-sharepoint-without-modifying-the-master-page/
https://learn.microsoft.com/en-us/azure/azure-monitor/app/sharepoint
https://learn.microsoft.com/en-us/answers/questions/246834/how-can-i-setup-a-sharepoint-online-site-usage-mon-1.html
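Once Application Insights is collecting page views, you can also pull usage numbers programmatically instead of running reports by hand. A hedged sketch using the azure-monitor-query Python package, assuming a workspace-based Application Insights resource (the workspace ID is a placeholder):

```python
# Minimal sketch: weekly page-view counts from Application Insights,
# queried through the backing Log Analytics workspace.
from datetime import timedelta
from azure.identity import DefaultAzureCredential      # pip install azure-identity
from azure.monitor.query import LogsQueryClient        # pip install azure-monitor-query

client = LogsQueryClient(DefaultAzureCredential())

# AppPageViews is the workspace-schema table for page-view telemetry.
kql = "AppPageViews | summarize Views = count() by Name | order by Views desc"

response = client.query_workspace(
    workspace_id="<your-log-analytics-workspace-id>",  # placeholder
    query=kql,
    timespan=timedelta(days=7),
)
for row in response.tables[0].rows:
    print(row)
```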
BR
I am tasked with the development of a web page.
As part of this, I also need to collect details of the browser used to visit the page (along with its version, a timestamp, and the IP address).
I tried searching the internet, but maybe my search was not properly directed.
I am not sure how to go about this.
Any pointer to get me started would go a long way.
I would also like to know how to store the information thus collected.
Not sure if this is a good idea, but just thinking out loud: are there any online services or channels where the data can be uploaded in real time, like ThingSpeak?
Thanks
Google Analytics is great, but it doesn't give you the raw data (for free). So if you want your data in, e.g., SQL format, I would suggest you use a tool that collects the data for you and then sends it on to Google Analytics.
We use Segment (segment.io, but there are probably other tools out there too) for this; they have a free plan. You will need to create a warehouse in AWS, and all your data will be stored there, incl. details of the browser (version, timestamp, IP).
Disclaimer: I do not work for Segment, but I am a customer.
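For a feel of what this looks like in code, here is a hedged sketch using Segment's Python library; the write key is a placeholder, and the ip/userAgent fields follow Segment's common context spec:

```python
# Minimal sketch: record a server-side page view in Segment, attaching
# the browser and network details the question asks about.
import analytics  # pip install analytics-python

analytics.write_key = "<your-segment-write-key>"  # placeholder

analytics.page(
    user_id="visitor-123",            # or use anonymous_id for unknown users
    name="Home",
    properties={"url": "https://example.com/"},
    context={
        "ip": "203.0.113.7",          # taken from the incoming request
        "userAgent": "Mozilla/5.0 ...",
    },
)
analytics.flush()  # make sure queued events are sent before the process exits
```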
Enable Google Analytics on your website; then, after a week, take a look at the Google Analytics page to see the data that was collected.
Follow the guide here to configure Google Analytics on your website: https://support.google.com/analytics/answer/1008080?hl=en
You should go for alternatives like Segment (https://segment.com/), Mixpanel (https://mixpanel.com/) and similar services. They can guarantee consistency for your data and also integrate with many different analytics tools and platforms.
I just want to try creating a web application to monitor server status. I need some design guidelines.
Should I use a scripting language like Python or Ruby to get the stats? Is polling the only way to do it? If so, how frequently should I poll?
If you don't care about data retention, writing a simple web app in Ruby or Python that polls from the browser would probably be fine (a minimal poller sketch follows this answer). You could alternatively use WebSockets and push data from a CLI-based monitoring agent of some sort that runs in the background on your server.
If you don't care about data fidelity, then you might be able to use something simple like Pingdom.
If you do care about data retention and you need lots of custom monitoring, then it's a much harder problem. There are a number of open-source projects and paid applications that solve this problem in various ways. As mentioned in the comment on your post, Ganglia could work. You might also look into Nagios or Munin. If you need app-level stats, you could check out StatsD/Graphite or InfluxDB/Grafana.
If you want server monitoring but don't want to manage additional infrastructure, there are a lot of solutions in the paid space, including Librato, New Relic, and Instrumental.
Note: I am an owner of Instrumental, so I'm biased toward that, but I think your question needs more details to narrow down any recommendations on infrastructure monitoring.
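To make the polling option concrete, here is a minimal sketch assuming the psutil package; the 10-second interval is just a common starting point, since tighter polling costs more CPU and storage:

```python
# Minimal sketch: poll basic host stats on a fixed interval.
import time

import psutil  # pip install psutil

POLL_INTERVAL_SECONDS = 10  # assumption: tune to how fresh you need the data

def sample() -> dict:
    """Collect a single snapshot of basic server stats."""
    return {
        "cpu_percent": psutil.cpu_percent(interval=None),
        "memory_percent": psutil.virtual_memory().percent,
        "disk_percent": psutil.disk_usage("/").percent,
    }

while True:
    print(sample())  # replace with a push to your web app or a time-series store
    time.sleep(POLL_INTERVAL_SECONDS)
```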
I am getting ready to release a new web site in the coming weeks, and would like the ability to run multivariate or A/B tests between two versions of the site.
The site is hosted on azure, and I am using the Service Gateway to split traffic between the instances of the site, both of which are deployed from Visual Studio Online. One from the main branch and the other from an "experimental" branch.
Can I configure Google Analytics to assist me in tracking the success of my tests? From what I have read, Google Analytics seems to focus on multiple versions of a page within the same site for running its experiments.
I have thought of perhaps using two separate tracking codes, but my customers are not overly technically savvy, so I would like to keep things as simple as possible. I have also considered collecting my own metrics inside the application, but I would prefer to use an existing tool as I don't really have the time to implement something like that.
Can this be done? Are there better options? Is there a good NuGet package that might fulfil my needs? Any advice welcome.
I'd suggest setting a custom dimension that tells you which version of the site the user is on. Then in the reports you can segment and compare the data.
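As a hedged sketch of the same idea in today's GA4 terms: tag every hit with a site_version event parameter (registered as a custom dimension in the GA admin UI) via the Measurement Protocol. The measurement ID, API secret, and parameter name below are placeholders:

```python
# Minimal sketch: report a page view tagged with which deployment served it,
# using the GA4 Measurement Protocol.
import requests  # pip install requests

MEASUREMENT_ID = "G-XXXXXXXXXX"   # placeholder: your GA4 measurement ID
API_SECRET = "<api-secret>"       # placeholder: created under Admin > Data Streams

def report_page_view(client_id: str, page_url: str, site_version: str) -> None:
    requests.post(
        "https://www.google-analytics.com/mp/collect",
        params={"measurement_id": MEASUREMENT_ID, "api_secret": API_SECRET},
        json={
            "client_id": client_id,
            "events": [{
                "name": "page_view",
                "params": {"page_location": page_url, "site_version": site_version},
            }],
        },
        timeout=5,
    )

# One call per hit, with the branch that served the request:
report_page_view("555.1234", "https://example.com/", "experimental")
```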
Note: there are a few similar questions already asked here, but they are from 2009. Maybe something has changed since then.
I'm responsible for a bunch of websites hosted on different servers. I do not do any log analysis right now, but I would like to change this. First question: what is the best tool to view ISSUES with a website based on IIS logs (e.g. 404 and 500 responses, long page processing times, etc.)? Ideally with grouping/sorting options? I do not want to spend a lot of time on this; I just want to periodically check that all is good with the website.
Second question (and I know I'm most likely asking for too much), but is there any way to expose the processed logs to the web, so I can review the things mentioned above without RDPing into the server?
Ideally I'm looking for a free/open-source solution, but I'm ready to pay for good software as well (just not a lot of $$).
Thank you.
You can take a look at our log monitoring solution EventSentry, which can monitor text-based logs like IIS logs. We have standard templates set up for IIS, and we can consolidate the logs in a database with web access, so that you can review the logs without using RDP.
It's a pretty flexible solution that allows you to pick the fields you are interested in, and ignore the ones you are not - and thus save space in your database.
You can also set up real-time alerts, so that you can get an email when a critical error, like a 500 error, is encountered in a log file.
http://www.eventsentry.com/features/log-file-monitoring
Finally, you can also plug-in command line tools which can verify that a given web page is accessible, or get alerted when it changes: http://www.eventsentry.com/features/application-monitoring.
I'm biased of course, but I would say that our solution is pretty affordable. Since it offers additional functionality as well, such as service monitoring (to monitor your IIS services) and event log monitoring (IIS does log critical messages to the event log), you can setup comprehensive monitoring with a single product.
I'd look into @LuckyLuke's solution (or similar) - a classic "build vs. buy" decision. Based on your post, this isn't going to be your "full-time" job, so IMHO it's best to leave it to those who do...
I don't know what "legacy" answers you are referring to, but if you want to tinker you can use Microsoft's own Log Parser, and depending on how far you want to go with it, you can use it (it's a COM DLL) to write your "admin web pages" in .NET/ASP.NET and host them on each of your servers...
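You don't even need COM to start tinkering: here is a hedged sketch that shells out to Log Parser 2.2 from Python to pull error counts out of IIS logs (the install path and log directory are placeholders):

```python
# Minimal sketch: count 4xx/5xx responses per URL from IIS W3C logs
# by driving LogParser.exe from the command line.
import subprocess

LOGPARSER = r"C:\Program Files (x86)\Log Parser 2.2\LogParser.exe"  # placeholder path
QUERY = (
    "SELECT cs-uri-stem, sc-status, COUNT(*) AS hits "
    r"FROM C:\inetpub\logs\LogFiles\W3SVC1\*.log "  # placeholder log directory
    "WHERE sc-status >= 400 "
    "GROUP BY cs-uri-stem, sc-status "
    "ORDER BY hits DESC"
)

result = subprocess.run(
    [LOGPARSER, QUERY, "-i:IISW3C", "-o:CSV"],
    capture_output=True, text=True, check=True,
)
print(result.stdout)  # feed this into your admin page instead of printing
```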
If you're only interested in being alerted about specific errors, another "hacky" way would be to provide your own custom error pages (either replace the default IIS error pages, or configure your ASP.NET apps to use specific error pages).