
Does Clarity Actually Push Data to Google Analytics?
Context A:
I started a project for our company's internal intranet analytics so they can learn which pages work, which ones don't, which documents are popular, etc. I found Clarity and thought it was great: its documentation says that it pushes information to Google Analytics, but that doesn't seem to be the case.
Additional context at the bottom to ensure question clarity, pun intended.
Confusing / Misleading Documentation:
Microsoft, can you answer this question, or reply to the other emails we were exchanging back and forth? Your documentation says one thing (quoted below), but that doesn't seem to be accurate.
How does Clarity integration work?
Clarity will create a custom dimension called the 'Clarity Playback URL'. Data will start flowing into Google Analytics from the Clarity JavaScript you've added to your site without any more changes.
My Question:
So is this accurate or not? Does Clarity pass information over to Google Analytics, or does the documentation incorrectly describe Clarity's behavior, meaning the two need to gather data independently for the integration to work correctly?
Initial Microsoft Response:
Sometimes it takes about 1 - 2 days to populate the initial data, if you still don’t see the data by then, you can reach out again!
Second Microsoft Response:
The account/web property that is connected has no rows in it for any query. Could you please double check and connect an account which contains data.
Final Thoughts:
So which one is it? There is no documentation on this, and the project I am working on is at a standstill because of it. Please advise so I can either continue forward with this or move on to a different solution, because ms-clarity doesn't seem to be working as documented.
Context B:
I originally started in on this project because the previous Google Analytics setup linked into the intranet stopped working after the modern/online updates (not sure which one did it; that all happened before I got here), and I had to get special approval to use Clarity. We only went with Clarity because of the piece of documentation quoted above, which essentially told us we could re-create the link between the intranet and Google Analytics and get more functionality on top of it through Clarity.
We also did not want to do a weird patch job to get Google Analytics injected into pages, and I told them an SPPKG of Clarity would do the trick... however, here we are now.

Clarity DOES NOT push data to Google Analytics. You have to gather data using Clarity and Google Analytics SEPARATELY; once you integrate them, Clarity will work with Google Analytics to bring the data together.
So you CANNOT install only Clarity and then expect the integration to push Clarity-gathered data to Google Analytics.
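In other words, both tags have to be on every page. A minimal sketch of what "gather separately" looks like, assuming placeholder IDs (GA_MEASUREMENT_ID and CLARITY_PROJECT_ID come from the respective dashboards; copy the exact Clarity snippet from your project rather than this one):

<!-- Google Analytics (gtag.js) gathers its own data -->
<script async src="https://www.googletagmanager.com/gtag/js?id=GA_MEASUREMENT_ID"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'GA_MEASUREMENT_ID'); // placeholder: your GA property ID
</script>
<!-- Microsoft Clarity gathers its own data, independently of GA -->
<script>
  (function(c,l,a,r,i,t,y){
    c[a]=c[a]||function(){(c[a].q=c[a].q||[]).push(arguments)};
    t=l.createElement(r);t.async=1;t.src="https://www.clarity.ms/tag/"+i;
    y=l.getElementsByTagName(r)[0];y.parentNode.insertBefore(t,y);
  })(window, document, "clarity", "script", "CLARITY_PROJECT_ID"); // placeholder
</script>

Only after both are collecting does the integration add the 'Clarity Playback URL' dimension on the Google Analytics side.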

Related

QnA maker - Different Results between REST API and Preview Page

I'm using Azure QnA Maker version 4 and posting via the REST API.
If I post against the live database using the parameter isTest=true, I get an answer score of around 80%, which is very reasonable, as my question almost matches the knowledge base. I get exactly the same result using the web interface on qnamaker.ai.
Using the same POST against the published version (without isTest=true), I get a score of only around 13%, which is very odd for a question that almost matches the knowledge base.
I've found some hints in the FAQs that slight differences are normal, but I don't think a 67-point gap is slight. Is there anything I can do so that the published version gets scores closer to the test version?
Pursang has a good point in his answer.
A good way to work around this problem is adding "isTest": true to the QnA Maker POST request body. It has worked for me.
It's a QnA Maker bug that shows up when you have multiple knowledge bases...
{"question":"your question here", "top":3, "isTest": true }
Good Luck!
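For reference, a minimal sketch of that POST against the v4 runtime (ENDPOINT_HOST, KB_ID, and ENDPOINT_KEY are placeholders; take the real values from the Publish page of your QnA Maker resource):

// Hypothetical values; substitute your own from the QnA Maker portal
fetch(ENDPOINT_HOST + '/qnamaker/knowledgebases/' + KB_ID + '/generateAnswer', {
  method: 'POST',
  headers: {
    'Authorization': 'EndpointKey ' + ENDPOINT_KEY, // key from the Publish page
    'Content-Type': 'application/json'
  },
  body: JSON.stringify({ question: 'your question here', top: 3, isTest: true })
})
  .then(function (r) { return r.json(); })
  .then(function (data) { console.log(data.answers); }); // scored answers come back here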
The test version and the published version are two different knowledge bases. This allows you to make changes and test them without affecting the live knowledge base that your customers are using. If you're getting worse results with your published knowledge base than your test version, that seems to indicate that you've trained your test knowledge base after you've published. Publishing again may fix the issue.
If you publish again and your published version still doesn't seem to behave the same as the test version, consider this entry in the FAQ:
The updates that I made to my knowledge base are not reflected on publish. Why not?
Every edit operation, whether in a table update, test, or settings, needs to be saved before it can be published. Be sure to click the Save and train button after every edit operation.
I had the exact same problem. It was related to something going wrong when I created the QnA service in Azure. The language of your QnA knowledge base is automatically detected. You can see your language in your Azure Search resource => testkb => Fields => question/answer (MSDN).
Mine was set to Standard-Lucene instead of German-Microsoft. I did not find any way to change that, so I had to recreate the QnA service and move all knowledge bases there. (Example pictures: wrong language, correct language.)
I'm using a QnA service created in February this year. There are discrepancies between the test version (QnA portal) and the published version (API): a correct answer's score would drop 10%, while a bad answer's would rise 10%, which ultimately turns good matches in test into bad ones in the bot application. Try explaining that to your customer.
It appears that you can run into this trouble if you use multiple KBs (knowledge bases) on a single search service. The test index is a single index that covers all your KBs for that search service, while production KBs, when published, are indexed separately per KB. The QnA Maker help bot on the QnA portal mentions this:
"The top answer can sometimes vary because of small score variations between the test and production indexes. The test chat in portal hits the test index, and the generateAnswer API hits the production index. This typically happens when you have multiple knowledge bases in the same QnA Maker service. Learn more about confidence score differences.
This happens because all test knowledge bases are combined into a single index, while prod knowledge bases are on separate indexes. We can help you by separating all test and prod into separate indexes for your service."
So do we need to contact Microsoft to split up the test index per KB, and will that rectify the discrepancies between the test and published versions? I haven't tried this yet; has anyone else?
Or do we limit ourselves to a single KB per search service (= multiple search services = expensive)?
Or do we put everything in a single KB, use metadata to logically separate the answers, and pray that this single massive KB produces good enough results?

Gather browser details used to browse webpage

I am tasked with the development of a web page.
As a part of this, I also need to collect the details of browser used by the users to browse the web page (along with version, timestamp & IP).
I tried searching the internet, but maybe my search was not properly directed.
I am not sure how to go about this.
Any pointers to get me started would help me a long way.
I would also like to know how to store the information thus collected.
Not sure if this is a good idea, but just thinking out loud: are there any online services/channels available where the data can be uploaded in real time, like ThingSpeak?
Thanks
Google Analytics is great, but it doesn't give you the raw data (for free). So if you want your data in e.g. SQL format, I suggest you use a tool that collects the data for you and then sends it on to Google Analytics.
We use Segment (segment.io, but there are probably other tools out there too) for this, they have a free plan. You will need to create a Warehouse in AWS and all your data will be stored there, incl. details of the browser (version, timestamp, IP).
Disclaimer: I do not work for Segment, but I am a customer.
Enable Google Analytics on your website; then, after a week, take a look at the Google Analytics page to see the data that was collected.
Follow the guide here to configure Google Analytics on your website: https://support.google.com/analytics/answer/1008080?hl=en
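The guide covers the tag setup. If you also want the raw details in your own store rather than only in GA's reports, a minimal client-side sketch looks like the following; the /collect endpoint is hypothetical and would be a handler on your own server, where you can also log the client IP and a server-side timestamp:

// Collect basic browser details and POST them to your own server.
// '/collect' is a made-up endpoint; implement it in whatever backend
// you use and write the payload to your database there.
var details = {
  userAgent: navigator.userAgent,        // browser name + version string
  language: navigator.language,
  screen: screen.width + 'x' + screen.height,
  page: location.href,
  timestamp: new Date().toISOString()    // client-side timestamp
};
fetch('/collect', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify(details)
});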
You could also go for alternatives like Segment (https://segment.com/) and Mixpanel (https://mixpanel.com/). They can guarantee consistency for your data and also integrate with many different analytics tools and platforms.

Measuring Hits on a Webpage

I just had a brief question and hope someone can guide me in the right direction to understanding how this works and how it's tracked. How do you measure the number of times a link, button, or search is used on a specific website? Can you run reports that give you this information?
Most of that information ends up in your log files. If you have access to your web server's log files, you can run them through any log analyzer to get your answers. This will capture all the data you requested except clicks on links to other websites.
If you want to capture offsite link clicks, or if you lack access to your log files, you'll need to embed javascript into your pages to record user activity. Google Analytics is one popular tool you can use.
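For example, once the standard Google Analytics snippet is on the page, a click on a specific button can be recorded as an event. A minimal sketch with gtag.js (the element ID and event name are made up for illustration):

// Assumes the standard gtag.js snippet is already installed on the page.
document.getElementById('search-button').addEventListener('click', function () {
  gtag('event', 'search_performed', {   // hypothetical event name
    page_path: location.pathname       // which page the click happened on
  });
});

Each such event then shows up in the analytics reports, where you can count occurrences per page, per day, and so on.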
One of the most popular ways to get detailed information is to use an analytics tool, such as Google Analytics. You add a piece of JavaScript code and start gathering various information, which you can review on the analytics tool's site.
Google Analytics is one of the most popular tools, although definitely not the only one; you can find other tools, both free and commercial.

Spotify Google Analytics API

I'm trying to use the Google Analytics API.
But the data never appears in Google Analytics.
Did I forget something?
// Load Spotify's bundled Google Analytics tracker module
var gtrack = sp.require("sp://import/scripts/googletracker");
// Create a tracker with your Google Analytics property ID
var tracker = new gtrack.GoogleTracker('UA-xxxxxxx-1');
// Record a hit for this app/module
tracker.track('APPNAME/modulename');
Thanks a lot for your help,
Three things come to mind:
I am assuming in your code you've replaced 'UA-xxxxxxx-1' with your actual property ID from Google Analytics. This is probably obvious, but it can't hurt to ask. :)
Double check you've added the correct permissions to your manifest, as explained here: http://developer.spotify.com/download/spotify-apps-api/guidelines/#usertrackinganalytics. Changes to the manifest only take effect after you restart your Spotify client.
Even once you've configured everything correctly, it can take several hours (even days if you have a lot of traffic) for data to become available on the traditional Google Analytics dashboards. I highly recommend using the real-time dashboard for debugging.
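On the second point, the permission lives in the app's manifest.json. A rough sketch of the relevant entry (the field name is as I recall it from the linked guidelines; treat it as an assumption and verify against the docs):

// manifest.json excerpt: without this, requests to Google Analytics are blocked
{
  "RequiredPermissions": [
    "http://www.google-analytics.com",
    "https://ssl.google-analytics.com"
  ]
}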

Automate the export of Facebook Insights data

I'm looking for a way of programmatically exporting Facebook insights data for my pages, in a way that I can automate it. Specifically, I'd like to create a scheduled task that runs daily, and that can save a CSV or Excel file of a page's insights data using a Facebook API. I would then have an ETL job that puts that data into a database.
I checked out the oData service for Excel, which appears to be broken. Does anyone know of a way to programmatically automate the export of insights data for Facebook pages?
It's possible and not too complicated once you know how to access the insights.
Here is how I proceed:
Log the user in with the offline_access and read_insights permissions.
read_insights allows me to access the insights for all the pages and applications the user is admin of.
offline_access gives me a permanent token that I can use to update the insights without having to wait for the user to login.
Retrieve the list of pages and applications the user is admin of, and store those in database.
When I want to get the insights for a page or application, I don't query FQL, I query the Graph API: first I calculate how many queries to graph.facebook.com/[object_id]/insights are necessary according to the chosen date range, then I generate a query to use with the Batch API (http://developers.facebook.com/docs/reference/api/batch/). That allows me to get all the data for all the available insights, for all the days in the date range, in only one request (see the sketch after these steps).
I parse the rather huge JSON object obtained (which can weigh a few MB, be aware of that) and store everything in the database.
Now that you have all the insights parsed and stored in the database, you're just a few SQL queries away from manipulating the data the way you want, like displaying charts or exporting in CSV or Excel format.
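A hedged sketch of that batch request in Node (PAGE_ID and ACCESS_TOKEN are placeholders; the since/until values are illustrative Unix timestamps, one batch entry per window your date-range calculation produces; requires Node 18+ or a fetch polyfill):

// Two illustrative windows of insights data for one page
var batch = [
  { method: 'GET', relative_url: PAGE_ID + '/insights?since=1333238400&until=1335830400' },
  { method: 'GET', relative_url: PAGE_ID + '/insights?since=1335830400&until=1338508800' }
];
fetch('https://graph.facebook.com/', {
  method: 'POST',
  headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
  body: 'access_token=' + ACCESS_TOKEN + '&batch=' + encodeURIComponent(JSON.stringify(batch))
})
  .then(function (r) { return r.json(); })
  .then(function (parts) {
    parts.forEach(function (part) {
      var insights = JSON.parse(part.body); // each batch entry returns its own JSON body
      // store insights.data rows in your database here
    });
  });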
I have the code already made (and published as a temporarily free tool on www.social-insights.net), so exporting to excel would be quite fast and easy.
Let me know if I can help you with that.
It can be done before the weekend.
You would need to write something that uses the Insights part of the Facebook Graph API. I haven't seen something already written for this.
Check out http://megalytic.com. This is a service that exports FB Insights (along with Google Analytics, Twitter, and some others) to Excel.
A new tool is available: the Analytics Edge add-ins now have a Facebook connector that makes downloads a snap.
http://www.analyticsedge.com/facebook-connector/
There are a number of ways that you could do this. I would suggest your choice depends on two factors:
What is your level of coding skill?
How much data are you looking to move?
I can't answer 1 for you, but in your case you aren't moving that much data (in relative terms). I will still share three of many options.
HARD CODE IT
This would require a script that accesses Facebook's Graph API, plus a computer/server to run that request automatically. I primarily use AWS and would suggest launching an EC2 instance scheduled to run your script at set times. I haven't used AWS Data Pipeline, but I do know that it is designed so it can run a script automatically as well, supposedly with a little less server know-how. (See the scheduling sketch below.)
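As a sketch of the scheduling piece, using the node-cron package purely as an illustration (plain cron on the EC2 instance works just as well):

var cron = require('node-cron'); // npm install node-cron
// Run the export every day at 02:00 server time
cron.schedule('0 2 * * *', function () {
  // invoke your Graph API export script here (see the batch sketch above)
});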
USE THIRD PARTY ADD-ON
There are a lot of people with similar data needs, which has led to a number of easy-to-use tools. I use Supermetrics Free to run occasional audits and make sure that our tools are running properly. Supermetrics is fast and has a really easy interface for accessing Facebook's APIs and several others. I believe you can also schedule refreshes and updates with it.
USE THIRD PARTY FULL-SERVICE ETL
There are also several services and freelancers that can set this up for you with little to no work on your part, depending on where you want the data. Stitch is a service I have worked with for FB ads. There might be better services, but it has fulfilled our needs for now.
MY SUGGESTION
You would probably be best served by a third-party add-on like Supermetrics. It's fast and easy to use. The other methods might be more worth looking into if you had a lot more data to move or needed it refreshed more often than daily.
