SharePoint Web Analytics does not show any data - search

We installed SharePoint Web Analytics a few days ago (in a separate application pool).
Installation completed successfully.
Search works fine and returns expected results.
But the reports still don't show any data:
Data Last Updated: 03.06.2014 02:00:23 There is no data available for
this report. Here are some possible reasons: (1) Web Analytics has not
been enabled long enough to generate data; (2) There is insufficient
data to generate this report; (3) Data logging required for this
report might not be enabled; (4) Data aggregation might not be enabled
at the level required for this report.
What I tried:
connected to the website with different users (including an admin user) and different browsers
checked that the needed services on the server are started (especially the analytics services)
restarted the Web Analytics Data Processing Service and Web Analytics Web Service services (see the PowerShell sketch after this list)
checked that all service applications are started (the WSS_UsageApplication status was Stopped, so I started it using the SharePoint 2010 Management Shell)
checked service application associations (especially if Analytics Service Application Proxy is checked)
manual execution of jobs (Web Analytics Trigger Workflows Timer Job, Microsoft SharePoint Foundation Usage Data Import, Microsoft SharePoint Foundation Usage Data Processing)
manual start of incremental crawling
IIS restart
virtual machine reboot
checked scope of data logging (especially if Enable usage data collection and Enable health data collection are checked)
checked that the .usage files are generated correctly on the disk
checked in the Logging database (WSS_UsageApplication) that the RequestUsage view contains data collected from the .usage files
checked that data is successfully extracted from the logging database into the staging database (LastLoggingExtractionTime)
checked that data was successfully copied from the staging database to the reporting database (LastDataCopyTime)
checked on the website side that Advanced Web Analytics feature is Active
granted ROOT\SPAppPool full control on the Web Analytics Service Application
recreated the Web Analytics service application
cleaned the SharePoint configuration cache (deleted all .xml files from C:\ProgramData\Microsoft\SharePoint\Config\ and reset cache.ini to 1)
cleaned the drive where the .usage files are generated, so the message "Drives are running out of free space. Available drive space is less than twice the value of physical memory." no longer appears in the ULS logs
added ROOT\SPAppPool user to the Performance Log Users group on the local machine
added ROOT\SPAppPool user to the Farm Admin group
added dbowner permission for user ROOT\SPAppPool on WSS_UsageApplication and Sharepoint_Config databases
installed patch 2204024 (it was already installed)
manual execution of procedures in the staging and reporting databases (proc_DefragmentIndices, proc_UpdateStatistics, proc_WA_CleanFloatingFeedbackData, proc_WA_DeleteInvalidAdjacentHierarchyData, proc_WA_DeleteInvalidFactData, proc_WA_DeleteInvalidInventoryData in the reporting database and proc_DefragmentIndices, proc_UpdateStatistics, proc_WA_EnsureServiceBrokerEnabled in the staging database)
checked that the Windows service SharePoint 2010 Timer is started
checked for any message in ULS logs that could help...
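For reference, here is a minimal PowerShell sketch (run from the SharePoint 2010 Management Shell, or load the snap-in first) of the service and timer-job checks from the list above; the wildcard filters on the display names are assumptions and may need adjusting for your farm.

# Load the SharePoint snap-in when running from a plain PowerShell console
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Status of the Web Analytics service instances on each server
Get-SPServiceInstance | Where-Object { $_.TypeName -like "*Web Analytics*" } |
    Select-Object Server, TypeName, Status

# Usage definitions and their current settings
Get-SPUsageDefinition

# Last run time of the analytics and usage related timer jobs
Get-SPTimerJob |
    Where-Object { $_.DisplayName -like "*Web Analytics*" -or $_.DisplayName -like "*Usage Data*" } |
    Select-Object DisplayName, LastRunTime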
Everything appears to work fine, but I can't see any data in the reports.
Anything else I can try?
EDIT 14 Aug 2014:
Inventory data is collected successfully and I can see the related reports. Traffic and Search data are still empty:

Last time I had a problem I wrote down the following steps.
Maybe they can help:
1) create the web analytics service application
http://technet.microsoft.com/en-us/library/gg266382%28v=office.14%29.aspx
http://blogs.technet.com/b/wbaer/archive/2009/11/21/step-by-step-provisioning-the-web-analytics-service-application-on-microsoft-sharepoint-server-2010-beta.aspx
2) enable "web analytics data processing service" and "web analytics web service" services on the application server
where analytics will run
http://sharepoint-community.net/profiles/blogs/how-does-web-analytics-works-under-sharepoint-2010
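A rough PowerShell equivalent, as a sketch only (the server name below is a placeholder for your application server):

# Start the Web Analytics service instances on the application server
Get-SPServiceInstance -Server "APPSERVER01" |
    Where-Object { $_.TypeName -like "*Web Analytics*" -and $_.Status -ne "Online" } |
    Start-SPServiceInstance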
3) make sure the reporting account is a member of the Farm Administrators group
In Central Administration, on the Home page, click Monitoring.
On the Monitoring page, in the Reporting section, click Configure usage and health data collection.
In the Event Selection section, select all the check boxes, and then click OK.
https://sharepoint.stackexchange.com/questions/51621/web-analytics-not-working
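If you prefer to script this step, a hedged sketch (assuming your build of Set-SPUsageDefinition supports the -Enable switch) that turns on logging for every usage event type:

# Enable logging for every usage definition (the scripted equivalent of ticking all Event Selection boxes)
Get-SPUsageDefinition | ForEach-Object {
    Set-SPUsageDefinition -Identity $_ -Enable
}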
4) force population of results
You can run the timer job manually to get the data into the analytics report straight away by going into the 'Monitoring' section in Central Administration.
There you will find a link under the 'Timer Jobs' heading called 'Review job definitions'.
In there will be a whole load of job definitions; look for 'Web Analytics Trigger Workflows Timer Job' with the correct Web Application specified (for me it was on the 2nd page).
Click the name of the job and click the 'Run Now' button.
https://sharepoint.stackexchange.com/questions/51621/web-analytics-not-working
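The same job can also be started from the SharePoint 2010 Management Shell; this is only a sketch, and the display-name filter is an assumption you may need to adjust:

# Run the Web Analytics trigger job for every web application that has one
Get-SPTimerJob |
    Where-Object { $_.DisplayName -like "*Web Analytics Trigger Workflows*" } |
    Start-SPTimerJob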
5) check your reports
http://you.com/_layouts/WebAnalytics/Report.aspx?t=SummaryReport&l=s
* troubleshoot *
http://blogs.msdn.com/b/sharepoint_strategery/archive/2012/03/16/troubleshooting-sharepoint-2010-web-analytics.aspx
http://blogs.technet.com/b/manharsharma/archive/2012/10/13/sharepoint-2010-web-analytics-troubleshooting-reporting-db.aspx

Related

Azure application gateway firewall logs not being populated to log analytics workspace

We have provisioned an instance of the Azure Application Gateway (Standard v2, East AU region) and enabled its diagnostic settings to send all metrics and logs to a Log Analytics workspace, and this seems to be working fine. However, we wanted additional insight into the requests, so we scaled up the tier and enabled WAF v2.
Based on the documentation here https://learn.microsoft.com/en-us/azure/application-gateway/application-gateway-diagnostics#diagnostic-logging, and after waiting for some time, we expected the firewall logs to start populating automatically in the same Log Analytics workspace. However, this does not seem to work and they are simply not populated there.
Note that we can see the "ApplicationGatewayAccessLog" logs; the query AzureDiagnostics | distinct Category shows this, returning only one category, i.e. "ApplicationGatewayAccessLog".
Does anyone know if we are missing something or have any input?
Sometimes the output is not the same when you explore data from Application Gateway > Logs and from your specific Log Analytics workspace > Logs. You can compare these results on your side. See this issue.
In this case, you need to have performed some access actions against your Application Gateway that trigger the firewall log collection before the data can be collected by Azure monitoring. Although the documentation states that firewall logs are collected every 60 seconds, the data is sometimes delayed (even by more than 2 days) before it appears in the logs, and the region you are located in also affects when the data is displayed. From this blog, you can see an hourly log of firewall actions on the WAF.
For more information, you can use Log Analytics to examine Application Gateway Web Application Firewall Logs.
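As a quick way to check from PowerShell whether any WAF entries have reached the workspace at all, here is a minimal sketch (assuming the Az.OperationalInsights module is installed and you are signed in; the workspace ID is a placeholder, and "ApplicationGatewayFirewallLog" is the category the firewall logs are expected to arrive under):

# Query the Log Analytics workspace for Application Gateway firewall log entries
$workspaceId = "00000000-0000-0000-0000-000000000000"   # placeholder: your workspace (customer) ID
$query = 'AzureDiagnostics | where Category == "ApplicationGatewayFirewallLog" | take 10'
$result = Invoke-AzOperationalInsightsQuery -WorkspaceId $workspaceId -Query $query
$result.Results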

How to create a workspace in an A1-core Power BI Embedded service in the Azure portal?

Over months of exploring Power BI, I started by successfully creating a workspace using a Power BI Pro license and ended up hosting a Power BI report embedded in my custom MVC site using the app-owns-data model.
My first experience was the maximum allowed embed tokens running out.
My company decided to create a dedicated A1-core Power BI Embedded service in an Azure account. I have now overcome the token limit issue, but it seems odd that even though my Power BI Embedded service is paused, my embedded site still runs and accesses the Power BI reports without any interruption.
I previously created the Azure AD app using the embedding setup tool provided by Microsoft, and I can see the AD app has been created in the Azure portal too.
How is it possible to view a Power BI report while my Azure Power BI Embedded service is paused?
Am I supposed to be able to use those Power BI reports without getting billed?
Microsoft has limited information in its documentation to clarify my doubts, and while the Power BI community site is somewhat helpful, I am still having trouble getting clarification on this.
Help required.
For your question:
How is it possible to view a Power BI report while my Azure Power BI Embedded service is paused? Am I supposed to be able to use those Power BI reports without getting billed?
If the A1 node is paused, then no, you will not be able to see your report or use the service in your front end. It has to be running to deliver the reports in your custom front end. You can still go into the Power BI service with an assigned Power BI Pro licence and see your report; the workspace the report has been deployed to is flagged as 'embedded capacity', shown as a diamond shape next to it.
You allocate the workspace to a capacity by editing the workspace and selecting the 'Advanced' option, then 'Dedicated Capacity'.
The MS documentation outlines that pausing will not deliver content:
Pausing a capacity may prevent content from being available within
Power BI. Make sure to unassign workspaces from your capacity before
pausing to prevent interruption.
Pausing is designed to allow you to stop delivering content, for example outside business hours. I have a few clients that only run their internal and external reports from 7am to 7pm; for the other 12 hours the service is paused and the A SKU billing costs are reduced by 50%.
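If you want to automate that kind of schedule, here is a minimal sketch using the Az.PowerBIEmbedded cmdlets (the capacity and resource group names are placeholders; you would typically run this from an Azure Automation runbook or another scheduler):

# Placeholders for your own capacity and resource group
$rg   = "my-resource-group"
$name = "mypbiembeddedcapacity"

# Evening: pause the A1 capacity so the A SKU stops billing while the reports are not needed
Suspend-AzPowerBIEmbeddedCapacity -ResourceGroupName $rg -Name $name

# Morning: resume the capacity so the embedded reports load again
Resume-AzPowerBIEmbeddedCapacity -ResourceGroupName $rg -Name $name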
Hope that helps

Monitor when Azure Web App is unloaded?

What would be the best way to monitor when our Azure web app is being unloaded when no requests have been made to the web app for a certain amount of time?
Enabling Logstream for the web server doesn't seem to reveal anything of use.
Any hints much appreciated!
You can use Azure Application Insights to create a web test that will alert you when the site is no longer available. It will ping your site from the data centers you select and perform the action you choose (mail, webhook, etc.).
However, if you want your web app to stay online, you could upgrade its plan to at least Basic and, under Settings, enable Always On.
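If you go the Always On route, a hedged sketch using the Az.Websites cmdlets (the resource group and app names are placeholders; setting the flag through SiteConfig is one way to do it):

# Enable Always On so the app is not unloaded after a period of inactivity (requires Basic tier or above)
$app = Get-AzWebApp -ResourceGroupName "my-resource-group" -Name "my-web-app"   # placeholders
$app.SiteConfig.AlwaysOn = $true
Set-AzWebApp -WebApp $app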
In addition to kim's response:
If you are running your web app in the Standard pricing tier, Web Apps lets you monitor two endpoints from three geographic locations.
Endpoint monitoring configures web tests from geo-distributed locations that test response time and uptime of web URLs. The test performs an HTTP GET operation on the web URL to determine the response time and uptime from each location. Each configured location runs a test every five minutes.
Uptime is monitored using HTTP response codes, and response time is measured in milliseconds. A monitoring test fails if the HTTP response code is greater than or equal to 400 or if the response takes more than 30 seconds. An endpoint is considered available if its monitoring tests succeed from all the specified locations.
Web Apps also provides you with the ability to troubleshoot issues related to your web app by looking at HTTP logs, event logs, process dumps, and more. You can access all this information using the Support portal at http://<your site name>.scm.azurewebsites.net/Support
The Azure App Service support portal provides you with three separate tabs to support the three steps of a common troubleshooting scenario:
-Observe current behavior
-Analyze by collecting diagnostics information and running the built-in analyzers
-Mitigate
If the issue is happening right now, click Analyze > Diagnostics > Diagnose Now to create a diagnostic session for you, which collects HTTP logs, event viewer logs, memory dumps, PHP error logs, and PHP process report.
Once the data is collected, the support portal runs an analysis on the data and provides you with an HTML report.
In case you want to download the data, by default, it would be stored in the D:\home\data\DaaS folder.
Hope this helps.

service fabric, control what events get saved to table storage

I have a service fabric cluster on Azure and it has a very simple app running on it. The app is from this tutorial.
When running the app locally, the Visual Studio Diagnostic Events shows 3 events.
CRM
MasterCRM
ServiceMessage
I believe the CRM and MasterCRM are related to the cluster manager and the ServiceMessage shows events from my app, in this case just a message saying the current value of a counter.
This data is also saved to table storage. I was wondering, is there any way for me to control what gets saved to the table storage? Right now my table consists of pages and pages of CRM and MasterCRM messages and I have yet to see any messages from my app; I'm sure if I keep going I might eventually see them, but so far no luck.
I'd like to just save the events from my app to the table storage and ignore the rest. I've looked around and found no way to do it.
The events you refer to are coming from ETW, from the Service Fabric runtime (CRM, MasterCRM) and from your application (ServiceMessage), as you mentioned. The diagnostics viewer in Visual Studio gets these events directly from ETW, not from Azure Table Storage. If you want to filter the events showing up in the diagnostics viewer, you can click the gear icon and edit the sources listed.
*CRM comes from Microsoft-ServiceFabric:5:0x4000000000000000.
Controlling what events get uploaded to Azure Table Storage in an Azure hosted cluster would require editing the ARM template's diagnostics section similarly.

How to clear the SharePoint Usage Logs and/or Web Analytics Logs

How do I clear the SharePoint Usage Logs and/or Web Analytics Logs?
I've tried deleting the *.usage files found in {SharePoint Hive}\Logs, and deleting the Usage Service Application as well as the Web Analytics Service Application.
The reason why I ask is that I have a web part that determines the most visited sites. In my code I use the
SPWeb.GetUsageData(SPUsageReportType.url, SPUsagePeriodType.lastMonth)
method, and it always returns the same data. I would like to reset the values it returns.
Have you tried looking at the usage database? Usage might be backed by that database.
You can also limit the retention timeframe for specific usage items: Set-SPUsageDefinition -Identity "Page Requests" -DaysRetained 3, or see this blog post: How to reduce the size of logging database OR How to purge the old data from Logging Database.
Another way of clearing the logs, using only Central Admin, is shown here: How to Delete Usage Logs on Sharepoint. Sahil Malik shows yet more ways to refresh the logs: SharePoint 2010 - Drives are running out of free space.
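If the goal is simply to flush out old usage data sooner, a small sketch along the same lines as the retention cmdlet above (the three-day retention is just an example value, and the timer-job name filter is an assumption):

# Shorten retention for every usage definition, then run the usage processing job to purge older data
Get-SPUsageDefinition | ForEach-Object {
    Set-SPUsageDefinition -Identity $_ -DaysRetained 3
}
Get-SPTimerJob |
    Where-Object { $_.DisplayName -like "*Usage Data Processing*" } |
    Start-SPTimerJob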
If you are just trying to restart the logging, try:
◦Open the "SharePoint 2010 Management Shell" (PowerShell for SharePoint).
◦Type "New-SPLogFile"
Done.
The logs will be in "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\LOGS"
You may delete the Web Analytics Service Application along with its associated data:
under Application Management -> Service Applications -> Manage service applications,
select the Web Analytics service application
and delete it along with its associated data.
