Difference between the User Flows and Funnels features in Azure Application Insights

Azure Application Insights has two different features called User Flows and Funnels.
The documentation states for User Flows:
The User Flows tool starts from an initial page view, custom event, or exception that you specify. Given this initial event, User Flows shows the events that happened before and afterward during user sessions. Lines of varying thicknesses show how many times each path was followed by users.
And for Funnels:
If your application involves multiple stages, you need to know if most customers are progressing through the entire process, or if they are ending the process at some point. The progression through a series of steps in a web application is known as a funnel. You can use Azure Application Insights Funnels to gain insights into your users and monitor step-by-step conversion rates.
Question:
Both features look similar, especially when I compare these two statements, even though they describe different tools:
...how many times each path was followed by users...
...you need to know if most customers are progressing through the entire process, or if they are ending the process at some point...
Can they be used interchangeably when you want to determine whether users repeat events, such as going through a process on a website with exactly the same steps?
Any clarification is appreciated, thanks!

I would say the two features have similarities, but they differ as well. With User Flows you can analyze individual user journeys on your website at finer granularity, whereas Funnels is meant to give you an aggregated view of metrics such as conversion rate and drop-off percentage, so you can see at which step of the funnel users abandon the process and then improve your website to retain more of them. I would use Funnels in scenarios where your website has a specific workflow you want users to follow (e.g., a form that collects reviews or other data). User Flows is something you can use on any website to understand the different paths your visitors take, so that you can, for example, redirect users to less-visited sections of your site.
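To make the distinction concrete: a funnel is essentially an ordered, aggregated count over sessions, while User Flows keeps each session's individual path. Here's a minimal sketch of what a funnel computes, assuming you've already exported session events somewhere (the data shape and names are illustrative, not the Application Insights API):

```python
def funnel(sessions, steps):
    """Count how many sessions reach each step of `steps`, in order.
    `sessions` maps a session id to its ordered list of event names."""
    counts = [0] * len(steps)
    for events in sessions.values():
        i = 0
        for event in events:
            if i < len(steps) and event == steps[i]:
                counts[i] += 1
                i += 1
    return counts

# Example: step-by-step conversion for a checkout workflow.
sessions = {
    "s1": ["Home", "Product A", "Buy", "Paid"],
    "s2": ["Home", "Product A", "Home"],
    "s3": ["Home", "Search", "Product A", "Buy"],
}
counts = funnel(sessions, ["Home", "Product A", "Buy", "Paid"])
# counts == [3, 3, 2, 1]; conversion at step i is counts[i] / counts[0]
```

User Flows, by contrast, would keep every full path, so sessions that repeat exactly the same steps show up as thicker edges rather than being folded into one conversion rate.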

Related

Reactive, long-running sequences and persistence in the cloud

I am to build a kind of website tracking system.
Think of a website where users click on various links – every page view is tracked with a unique user id and a page identifier.
Now, a single user might view 20 pages – some relevant some not. What I want to track is if a user follows a specific “path”. Example “Home Page” -> “Product A Page” -> “Get more info page” -> “Buy” -> “Paid”. There might be other page views in between each of these steps; the important thing is IF a user follows a given pattern.
In addition, I need to measure time between each step (each page view has a timestamp).
I have been playing around with Reactive Extensions, but I am not an expert in the area, so I would like to hear whether this is a job for the Reactive Framework or whether other technologies are more suitable.
I imagine a server receiving a stream of website page views and then some fancy reactive LINQ queries that capture the events (this is where I need some help).
Next question that comes to my mind is how do you host this behind a load-balancer (on Windows Azure)? If you run two instances and the “Home Page”-page view goes to instance 1 and “Product A Page” goes to instance 2, how do they communicate about this or should some kind of sharding e.g. per userid be enforced?
Lastly, what about persistence? How to store? Should you store data in an Event Queue pattern and then load everything into memory when you “play back” from a restart of the server?
I know that was a lot of questions, but I do love the philosophy behind Reactive Extensions; I just cannot get my head around how to “put it into production in the cloud” :)
Thanks!
Casper
There are lots of solutions out there in this space already that you can integrate into your platform. Are you sure you're not reinventing the wheel? Google Analytics has functionality similar to this. If you need to go your own way, then SQL Server StreamInsight might be a better fit.
For behind-the-firewall solutions, also look at http://piwik.org/ (free, open-source) and http://www.haveamint.com/.
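If you do roll your own, the core matching step is small regardless of the engine you pick. Here's a minimal sketch in Python (Rx would express the same idea as a query over the event stream); it assumes each user's page views arrive as ordered (page, timestamp) pairs, which matches the data described in the question:

```python
def match_path(events, path):
    """Check whether `events` (ordered (page, timestamp) pairs for one
    user) contains `path` as an ordered subsequence -- other page views
    may occur in between. Returns the durations between matched steps,
    or None if the pattern was never completed."""
    step, times = 0, []
    for page, ts in events:
        if step < len(path) and page == path[step]:
            times.append(ts)
            step += 1
    if step < len(path):
        return None
    return [t2 - t1 for t1, t2 in zip(times, times[1:])]

# e.g. match_path(session, ["Home Page", "Product A Page",
#                           "Get more info page", "Buy", "Paid"])
```

As for the load balancer: sharding the stream by user id, as you suggest, is the usual approach, since it keeps this per-user matching state local to a single node.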

Automate the export of Facebook Insights data

I'm looking for a way of programmatically exporting Facebook insights data for my pages, in a way that I can automate it. Specifically, I'd like to create a scheduled task that runs daily, and that can save a CSV or Excel file of a page's insights data using a Facebook API. I would then have an ETL job that puts that data into a database.
I checked out the OData service for Excel, which appears to be broken. Does anyone know of a way to programmatically automate the export of Insights data for Facebook pages?
It's possible and not too complicated once you know how to access the insights.
Here is how I proceed:
Log the user in with the offline_access and read_insights permissions.
read_insights allows me to access the insights for all the pages and applications the user is admin of.
offline_access gives me a permanent token that I can use to update the insights without having to wait for the user to login.
Retrieve the list of pages and applications the user is admin of, and store those in the database.
When I want to get the insights for a page or application, I don't query FQL, I query the Graph API: First I calculate how many queries to graph.facebook.com/[object_id]/insights are necessary, according to the date range chosen. Then I generate a query to use with the Batch API (http://developers.facebook.com/docs/reference/api/batch/). That allows me to get all the data for all the available insights, for all the days in the date range, in only one query.
I parse the rather huge JSON object obtained (which weighs a few MB, be aware of that) and store everything in the database.
Now that you have all the insights parsed and stored in database, you're just a few SQL queries away from manipulating the data the way you want, like displaying charts, or exporting in CSV or Excel format.
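For reference, here's a minimal sketch in Python of the batched insights call described above; the object id, token, and date ranges are placeholders you'd supply:

```python
import json
import requests

GRAPH = "https://graph.facebook.com"

def fetch_insights(object_id, token, date_ranges):
    """Fetch insights for several (since, until) ranges in one round
    trip using the Graph Batch API."""
    batch = [
        {"method": "GET",
         "relative_url": f"{object_id}/insights?since={since}&until={until}"}
        for since, until in date_ranges
    ]
    resp = requests.post(GRAPH, data={"access_token": token,
                                      "batch": json.dumps(batch)})
    resp.raise_for_status()
    # one response object per batched request, in the same order;
    # skip any null entries for requests that failed
    return [json.loads(item["body"]) for item in resp.json() if item]
```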
I have the code already made (and published as a temporarily free tool on www.social-insights.net), so exporting to Excel would be quite fast and easy.
Let me know if I can help you with that.
It can be done before the weekend.
You would need to write something that uses the Insights part of the Facebook Graph API. I haven't seen something already written for this.
Check out http://megalytic.com. This is a service that exports FB Insights (along with Google Analytics, Twitter, and some others) to Excel.
A new tool is available: the Analytics Edge add-ins now have a Facebook connector that makes downloads a snap.
http://www.analyticsedge.com/facebook-connector/
There are a number of ways that you could do this. I would suggest your choice depends on two factors:
What is your level of coding skill?
How much data are you looking to move?
I can't answer 1 for you, but in your case you aren't moving that much data (in relative terms). I will still share three of many options.
HARD CODE IT
This would require a script that accesses Facebook's Graph API and a computer/server to run that request automatically. I primarily use AWS and would suggest launching an EC2 instance scheduled to run your script at set times. I haven't used AWS Data Pipeline, but I do know that it is designed so you can have it run a script automatically as well, supposedly with a little less server know-how.
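As a sketch of what that script might look like (page id, token, and output path are placeholders), here's a daily job in Python that pulls a page's insights and flattens them into the CSV the question asks for; you'd trigger it from cron or your scheduler of choice:

```python
import csv
import datetime
import requests

PAGE_ID = "your_page_id"     # placeholder
TOKEN = "your_access_token"  # placeholder

def export_yesterday(path):
    """Pull yesterday's page insights from the Graph API and flatten
    them into a CSV for a downstream ETL job."""
    until = datetime.date.today()
    since = until - datetime.timedelta(days=1)
    resp = requests.get(
        f"https://graph.facebook.com/{PAGE_ID}/insights",
        params={"access_token": TOKEN,
                "since": since.isoformat(), "until": until.isoformat()},
    )
    resp.raise_for_status()
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["metric", "period", "end_time", "value"])
        for metric in resp.json().get("data", []):
            for point in metric.get("values", []):
                # note: `value` is itself a dict for some metrics
                writer.writerow([metric["name"], metric["period"],
                                 point.get("end_time"), point.get("value")])

export_yesterday("insights.csv")
```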
USE THIRD PARTY ADD-ON
There are a lot of people with similar data needs, which has led to a number of easy-to-use tools. I use Supermetrics Free to run occasional audits and make sure that our tools are running properly. Supermetrics is fast and has a really easy interface for accessing Facebook's APIs and several others. I believe you can also schedule refreshes and updates with it.
USE THIRD PARTY FULL-SERVICE ETL
There are also several services and freelancers that can set this up for you with little to no work on your own, depending on where you want the data to go. Stitch is a service I have worked with on FB ads. There might be better services, but it has fulfilled our needs so far.
MY SUGGESTION
You would probably be best served by using a third-party add-on like Supermetrics. It's fast and easy to use. The other methods might be more worth looking into if you had a lot more data to move, or needed it to be refreshed more often than daily.

Use Google Analytics for data to display on our webpage?

On some of our pages, we display some statistics like number of times that page has been viewed today, number of times it's been viewed the past week, etc. Additionally, we have an overall statistics page where we list the pages, in order, that have been viewed the most.
Today, we just insert these pageviews and event counts into our database as they happen. We also send them to Google Analytics via normal page tracking and their API. Ideally, instead of querying our database for these stats to display on our webpages, we'd just query the Google Analytics API. Google Analytics does a FAR better job figuring out who the real uniques are and avoids counting people who artificially inflate their pageview counts (we allow people to create pages on our site).
So the question is whether it's possible to use the Google Analytics API to update the statistics on our webpages. If I cache the results, is it more feasible? Or should we just occasionally update our stats? I absolutely love Google Analytics for our site metrics, but maybe there's a better solution for this particular need?
So the question is whether it's possible to use the Google Analytics API to update the statistics on our webpages.
Yes, it is. But the authentication process and XML response may slow things down. You can speed it up by limiting the rows/columns returned. Also, authentication for the way you want to display the data (if I understood you correctly) would require the client authentication method: you send the username and password, so security is an issue.
I have done exactly what you described but had to put a loading graphic on the page for the stats.
If I cache the results, is it more feasible? Or should we just occasionally update our stats?
Either one, but caching seems like it would work well, especially since GA data is not real-time anyway. You could make the API call and store (or process and then store) the returned XML for display later.
I haven't done this but I think I might give it a go. Could even run as a scheduled job.
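A minimal sketch of that caching layer in Python; the `ga_query` helper is hypothetical and stands in for whatever Google Analytics API client you use:

```python
import time

_cache = {}         # stat key -> (fetched_at, value)
TTL_SECONDS = 3600  # GA data isn't real-time, so a stale hour is harmless

def cached_stat(key, fetch):
    """Return the cached value for `key`, calling `fetch()` (your GA
    API query) only when the cached entry is missing or expired."""
    entry = _cache.get(key)
    if entry is None or time.time() - entry[0] > TTL_SECONDS:
        _cache[key] = (time.time(), fetch())
    return _cache[key][1]

# usage, with a hypothetical ga_query helper:
# views = cached_stat("home:pageviews:7d", lambda: ga_query("pageviews", days=7))
```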
I absolutely love Google Analytics for our site metrics, but maybe there's a better solution for this particular need?
There are some third-party solutions (googling should root them out) but money and feasibility should be considered.

What is the most efficient way to mock a user flow in a RESTful application?

How can I communicate in a very simple and effective way the path the user takes through my application?
I'm currently working in a Ruby and Rails environment, so I tend to visualize my application in terms of RESTful URIs. So for example, if I want my users to sign up, I could match a new route called /users/new. The thing is, I would like to see beyond that specific action and visualize how many pages or forms it takes to create an account, along with some business logic associated with the process in general. In other words, I'd like to see a mix of a workflow diagram and some implementation details (at an interface level).
I was thinking in showing mockup pictures wrapped in boxes, and relate them through arrows with their corresponding GET, POST, PUT, DELETE methods and URIs attached to them. I think it is a good idea, but I haven't seen examples yet that inspire me.
In your experience, what helps you see the big picture? Balsamiq Mockups lets you define links and navigate through the app, but it doesn't help with conceptualizing.
Have you thought of using a mind map? You could try the free FreeMind.
If you stick with UML, you could consider an Activity diagram.
I think you're on the right path. Showing different screens with the possible transitions users make between them is a good technique. Basically, you would be able to show a user's flow through your application and highlight the decisions a user makes along the way.
A good example of it was presented here: http://vimeo.com/43869717
This technique is called storyboarding. You should be able to find some examples, but the one mentioned above is among the best storyboarding approaches. I use it all the time to show the big picture and present the application workflow from the user's perspective to my team.
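If you'd rather generate the boxes-and-arrows view from the app than draw it by hand, here's a minimal sketch in Python that emits a Graphviz diagram from a transition list; the signup routes shown are hypothetical:

```python
# Hypothetical signup flow as (method, URI) -> (method, URI) transitions.
transitions = [
    (("GET", "/users/new"), ("POST", "/users")),
    (("POST", "/users"), ("GET", "/users/confirm")),
    (("GET", "/users/confirm"), ("PUT", "/users/1/activate")),
]

def to_dot(transitions):
    """Emit Graphviz dot; render with `dot -Tpng flow.dot -o flow.png`,
    and attach mockup images to nodes if you want the visual mix."""
    lines = ["digraph flow {", "  rankdir=LR;"]
    for (m1, u1), (m2, u2) in transitions:
        lines.append(f'  "{m1} {u1}" -> "{m2} {u2}";')
    lines.append("}")
    return "\n".join(lines)

print(to_dot(transitions))
```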

Is there a service that will check redirection for an e-business

Like a lot of businesses my employer is dealing with the new world of PCI compliance by avoiding the hard stuff and redirecting our customers to a third-party payment service. The process will entail the customer entering order details into our system but then being redirected to the merchant bank's payment service for the entry of those all important card details.
We wish to retain the services of some business that periodically fills in stages 1 and 2 of our order form with some dummy data, presses place order and sees that the URL it ends up at is in fact the one we're expecting, a bit like a bot or a web spider.
If it finds we've been clickjacked it would alert us by text message or twitter feed or whatever the cool kids are using these days.
Does anyone know of a service that performs this function?
No, I don't believe that there is a service like this. Usually companies with specific testing needs like this will use QuickTest Pro.
I'm still in the process of going through some suggestions and hammering out what exactly we're going to do, but almost all the info I've gained has come from:
http://www.softwareqatest.com/index.html
A devastatingly useful site that provides far more than answers to this functional-testing scenario. There are a couple of web-based services that execute QA functional-testing scripts against your site and send alerts and reports if the tests fail.
The two I had a quick look at were http://www.dotcom-monitor.com/ and http://www.watchmouse.com/en/
The latter service uses Badboy scripts in its tests, so you can home-brew them and then upload them to their server for regular execution.
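If no hosted service fits, the check itself is only a few lines to home-brew. A minimal sketch in Python (the URLs, form fields, and alert hook are all placeholders): it submits dummy order data, follows the redirects, and verifies the final host is the payment provider you expect:

```python
import requests
from urllib.parse import urlparse

ORDER_URL = "https://example.com/order"                  # placeholder
DUMMY_ORDER = {"name": "Test User", "item": "SKU-0000"}  # placeholder fields
EXPECTED_HOST = "payments.example-bank.com"              # the bank's payment host

def check_redirect():
    """Submit dummy order data, follow all redirects, and verify we
    land on the payment provider we expect."""
    resp = requests.post(ORDER_URL, data=DUMMY_ORDER, allow_redirects=True)
    if urlparse(resp.url).netloc != EXPECTED_HOST:
        alert(f"Unexpected redirect target: {resp.url}")

def alert(message):
    # stand-in for your SMS/Twitter notification of choice
    print("ALERT:", message)

check_redirect()
```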
