I just have a brief question and hope someone can point me in the right direction to understanding how this works and how it's tracked. How do you measure the number of times a link or button is clicked, or a search is performed, on a specific website? Can you run reports that give you this information?
Most of that information ends up in your log files. If you have access to your web server log files, you can run them through any log analyzer to get your answers. This will capture all the data you requested except links to other websites.
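To illustrate what a log analyzer does under the hood, here is a minimal sketch (Node.js) that counts requests per path. It assumes lines in the common Apache/Nginx log format, which is an assumption you should check against your server's actual configuration:

```javascript
// Minimal sketch of what a log analyzer does: count requests per path.
// Assumes the common Apache/Nginx access-log format (an assumption --
// check your server's actual log format).

function countHitsPerPath(logText) {
  const counts = {};
  for (const line of logText.split("\n")) {
    // Common Log Format: IP - - [date] "METHOD /path HTTP/1.1" status size
    const match = line.match(/"(?:GET|POST|HEAD) (\S+) HTTP/);
    if (match) {
      counts[match[1]] = (counts[match[1]] || 0) + 1;
    }
  }
  return counts;
}

// Example usage with fake log lines:
const sample =
  '1.2.3.4 - - [01/Jan/2024:00:00:00 +0000] "GET /search?q=foo HTTP/1.1" 200 512\n' +
  '1.2.3.4 - - [01/Jan/2024:00:00:01 +0000] "GET /about HTTP/1.1" 200 128\n' +
  '5.6.7.8 - - [01/Jan/2024:00:00:02 +0000] "GET /about HTTP/1.1" 200 128';
console.log(countHitsPerPath(sample)); // { '/search?q=foo': 1, '/about': 2 }
```

Real log analyzers (AWStats, GoAccess, etc.) do the same thing at scale, plus parsing of referrers, user agents, and so on.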
If you want to capture offsite link clicks, or if you lack access to your log files, you'll need to embed javascript into your pages to record user activity. Google Analytics is one popular tool you can use.
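To show the idea behind script-based tracking of offsite clicks, here is a small sketch. The `/track` endpoint is hypothetical; real tools such as Google Analytics provide their own event-reporting calls instead:

```javascript
// Sketch of do-it-yourself outbound-link tracking. The /track endpoint
// is hypothetical; real analytics tools provide their own event APIs.

// Build the URL of a tracking redirect for an outbound destination.
function buildTrackingUrl(destination) {
  return "/track?to=" + encodeURIComponent(destination);
}

// In the browser, intercept outbound clicks and report them.
// Guarded so the snippet also loads outside a browser (e.g. in Node).
if (typeof document !== "undefined") {
  document.addEventListener("click", (event) => {
    const link = event.target.closest("a");
    if (link && link.hostname !== location.hostname) {
      // Fire-and-forget beacon so the navigation isn't delayed.
      navigator.sendBeacon(buildTrackingUrl(link.href));
    }
  });
}

console.log(buildTrackingUrl("https://example.com/page?a=1"));
// -> /track?to=https%3A%2F%2Fexample.com%2Fpage%3Fa%3D1
```

`navigator.sendBeacon` is used rather than a plain request so the hit still gets sent while the browser is navigating away.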
One of the most popular ways to get detailed information is to use an analytics tool such as Google Analytics. You add a piece of JavaScript code and start gathering various information, which you can then review on the analytics tool's site.
Google Analytics is one of the most popular tools, although definitely not the only one; there are other tools available, both free and commercial.
Does Clarity Actually Push Data to Google Analytics?
Context A:
I started a project for our company for internal intranet analytics so they can learn more about which pages work, which ones don't, popular documents, etc. I found Clarity (and thought, this is great!) because it says it pushes information to Google Analytics, but that doesn't seem to be the case.
Additional context at the bottom to ensure question clarity, pun intended.
Confusing / Misleading Documentation:
Microsoft, can you answer this question, or reply to the other emails we were working through back and forth? Your documentation says one thing (the quote below), but that doesn't seem to be accurate.
How does Clarity integration work?
Clarity will create a custom dimension called the 'Clarity Playback URL'. Data will start flowing into Google Analytics from the Clarity JavaScript you've added to your site without any more changes.
My Question:
So is this accurate or not? Does Clarity pass information over, or is this incorrectly describing Clarity's behavior, meaning the two need to gather data independently for the integration to work correctly?
Initial Microsoft Response:
Sometimes it takes about 1 - 2 days to populate the initial data, if you still don’t see the data by then, you can reach out again!
Second Microsoft Response:
The account/web property that is connected has no rows in it for any query. Could you please double check and connect an account which contains data.
Final Thoughts:
So which one is it? There is no documentation on this, and the project I am working on is at a standstill because of it. Please advise so I can either continue forward with this or move on to a different solution, because ms-clarity doesn't seem to be working as documented.
Context B:
I originally started on this project because the previous Google Analytics setup linked into the intranet stopped working after the modern/online updates (not sure which one did it; that all happened before I got here). I had to get special approval to use Clarity, and we only went with Clarity because of this piece of the documentation, which essentially told us we could re-make the link between the intranet and Google Analytics as well as get more functionality coupled with Clarity.
We also did not want to do a weird patch job to get Google Analytics injected into pages, and I told them an SPPKG of Clarity would do the trick... however, here we are now.
Clarity DOES NOT push data to Google Analytics. You have to gather data using Clarity and Google Analytics SEPARATELY, and then when you integrate, Clarity will work with Google Analytics to bring the data together.
So you CANNOT only install Clarity, and then expect to integrate with Google Analytics to push Clarity gathered data to Google Analytics.
How do you scrape OTT streaming platforms' (Netflix, Prime Video, HULU, Hotstar, etc.) catalogue lists with details, like flixjini, justwatchit, and others do?
Some of the above services used to offer APIs to third-party search services to help list their content, but it seems most now do not.
Without this you may find you have to create your own web crawler and also have accounts for every service and region you want to crawl, which may make it unviable commercially. You also probably need to check the applicable laws in any regions you want to do this.
There are some open-source web crawling solutions you could look at, with communities you can engage with - e.g. Scrapy: https://scrapy.org
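As a purely illustrative sketch of the extraction step such a crawler performs, here is a function that pulls titles out of fetched catalogue HTML. The markup pattern is hypothetical (every service structures its pages differently), and a real crawler should use a proper HTML parser, as frameworks like Scrapy do, rather than regular expressions:

```javascript
// Illustrative only: extract candidate titles from catalogue HTML.
// The <h3 class="title"> markup is hypothetical; real pages differ,
// and a production crawler should use a real HTML parser.

function extractTitles(html) {
  const titles = [];
  const re = /<h3 class="title">([^<]+)<\/h3>/g;
  let m;
  while ((m = re.exec(html)) !== null) {
    titles.push(m[1].trim());
  }
  return titles;
}

// Example with a fake page fragment:
const page = '<h3 class="title">Show One</h3><h3 class="title">Show Two</h3>';
console.log(extractTitles(page)); // [ 'Show One', 'Show Two' ]
```

The hard parts in practice are not this step but authentication, region handling, rate limiting, and the legal questions noted above.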
I am tasked with the development of a web page.
As a part of this, I also need to collect the details of browser used by the users to browse the web page (along with version, timestamp & IP).
Tried searching over the internet but maybe my search was not properly directed.
I am not sure how to go about this.
Any pointer to get me started here would help me a long way.
I would also like to know how to store the information thus collected.
Not sure if this is a good idea - but just thinking out loud - is there any online service/channel available where the data can be uploaded in real time, like ThingSpeak?
Thanks
Google Analytics is great, but it doesn't give you the raw data (for free). So if you want your data in e.g. SQL format, I'd suggest you use a tool that collects the data for you and then sends it on to Google Analytics.
We use Segment (segment.io, but there are probably other tools out there too) for this; they have a free plan. You will need to create a warehouse in AWS, and all your data will be stored there, including details of the browser (version, timestamp, IP).
Disclaimer: I do not work for Segment, but I am a customer.
Enable Google Analytics on your website, then after a week take a look at the Google Analytics page to see the data that was collected.
Follow the guide here to configure Google Analytics on your website: https://support.google.com/analytics/answer/1008080?hl=en
You should go for alternatives like Segment (https://segment.com/), Mixpanel (https://mixpanel.com/), and similar services. They can guarantee consistency for your data and also integrate with many different analytics tools and platforms.
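If you do end up collecting these details yourself server-side, the core of it is just reading request headers. A minimal sketch (the headers object matches the lowercase shape Node's `http` module exposes; the storage step is left out and would normally be a database insert):

```javascript
// Sketch of collecting browser details server-side from request headers.
// Storage is omitted; a real app would insert this record into a database.

function extractClientInfo(headers, remoteAddress) {
  return {
    userAgent: headers["user-agent"] || "unknown",
    // Behind a proxy or load balancer the client IP is usually forwarded:
    ip: headers["x-forwarded-for"] || remoteAddress,
    timestamp: new Date().toISOString(),
  };
}

// Example with a fake set of headers, as Node's http module exposes them:
const info = extractClientInfo(
  { "user-agent": "Mozilla/5.0 (X11; Linux x86_64) Firefox/120.0" },
  "203.0.113.7"
);
console.log(info.userAgent); // Mozilla/5.0 (X11; Linux x86_64) Firefox/120.0
console.log(info.ip);        // 203.0.113.7
```

Parsing the user-agent string into a clean browser name and version is deceptively hard; that is a large part of what the hosted tools above do for you.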
I've been working on a web application and finally published it to Azure. The application is not critical and currently I use only one role to keep costs down.
I would like to start trying to get a feel of who (if anyone) is using my site. Can anyone give me some suggestions on how I could do this? What I would really like is to avoid anything like the Google scripts that I see some websites use for monitoring page hits. I would like to do as much as possible on the server.
Any advice on where to start and what to look at would be much appreciated.
Katarina
Aside from things like Google Analytics and StatCounter, you'd want to set up some performance counters that you can watch externally. This requires you to use the Diagnostic Monitor:
Set up performance counters to track, and how often to poll for values
Set up frequency to upload to Table Storage
Diagnostic data is aggregated from all your instances, so then you can run queries against the diagnostic tables. Cerebrata has a page that details these table names (you can also use their Diagnostics Manager tool, other 3rd-party tools, or roll your own).
Igork posted this StackOverflow answer as well, which references some blog posts by Azure MVP Neil Mackenzie.
To add to Dave's answer, there are three levels of monitoring you can do:
If you want to know who is using your site, Google Analytics is best and free... There are a few others, but all involve injecting small javascript on your pages
If you want to know the load your site is under, inspecting performance counters via Cerebrata's tool (http://www.cerebrata.com) is likely best
If you want to go one step further and be notified when the load on your site is outside your predefined conditions (active monitoring), or have your website automatically scale up when the load is too high, AzureWatch (http://www.paraleap.com) is probably the best option
HTH
I usually send out emails with links to document downloads. Is there any chance of registering statistics on those using Google Analytics?
JavaScript and Google Analytics don't work in HTML emails, so you can't measure the clicks on those directly.
If you use an email service provider, like Mailchimp or one of the many others, they will provide click tracking at no additional cost, so you could consider that. Most ESPs have a free or freemium plan you can try.
Otherwise, you can do the following:
Route all clicks through a script that counts the clicks. Pros: accurate count. Cons: custom development, re-inventing the wheel.
Direct all the clicks to a landing page that has Google Analytics implemented. Pros: easy implementation, no custom development. Cons: extra click will lead to some dropoff.
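The first option (routing clicks through a counting script) could be sketched like this. The counter is in memory and the `/click` path is a made-up convention; a real endpoint would persist counts and respond with an HTTP 302 redirect:

```javascript
// Sketch of "route all clicks through a script that counts them".
// In-memory counter and hypothetical /click path; a real endpoint
// would persist counts (database, log) and issue a 302 redirect.

const clickCounts = new Map();

// Given a click-through URL like /click?url=<encoded target>,
// record the click and return the destination to redirect to.
function recordClick(clickUrl) {
  const target = new URL(clickUrl, "https://example.com")
    .searchParams.get("url");
  clickCounts.set(target, (clickCounts.get(target) || 0) + 1);
  return target;
}

recordClick("/click?url=" + encodeURIComponent("https://example.com/file.pdf"));
recordClick("/click?url=" + encodeURIComponent("https://example.com/file.pdf"));
console.log(clickCounts.get("https://example.com/file.pdf")); // 2
```

Email links would then point at `/click?url=...` instead of the file directly, which is exactly what ESPs do behind the scenes with their rewritten tracking links.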
I would recommend an email service provider. Besides the click-tracking, it will give you much better deliverability, testing tools, and reporting options.
You could try using an intermediary page that sends off the GA Tracking Beacon before redirecting to the download. This could either be done using JavaScript that works in a similar way to outbound link tracking or using a meta-refresh. Either way you would want to put a message on the page stating that the download will start shortly and a redirect link in case something goes wrong.
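The meta-refresh variant could be sketched by generating an intermediary page that fires a tracking request and then forwards to the download. The `/ga-event` tracking-pixel URL here is hypothetical; with real Google Analytics you would include the standard GA snippet on this page instead:

```javascript
// Sketch of an intermediary download page using a meta-refresh.
// The /ga-event tracking pixel is hypothetical; with Google Analytics
// you would embed the standard GA snippet on this page instead.

function buildInterstitialPage(downloadUrl) {
  const safeUrl = encodeURIComponent(downloadUrl);
  return [
    "<!doctype html><html><head>",
    // Redirect to the file after 2 seconds, giving the beacon time to fire.
    '<meta http-equiv="refresh" content="2;url=' + downloadUrl + '">',
    "</head><body>",
    "<p>Your download will start shortly.</p>",
    '<p>If it does not, <a href="' + downloadUrl + '">click here</a>.</p>',
    // Hypothetical tracking pixel recording the download event.
    '<img src="/ga-event?file=' + safeUrl + '" alt="" width="1" height="1">',
    "</body></html>",
  ].join("\n");
}

const page = buildInterstitialPage("https://example.com/report.pdf");
console.log(page.includes("click here")); // true
```

The visible message and fallback link matter: if the refresh is blocked or slow, the user still has a way to reach the file.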