Cognos monitoring

Does Cognos 8 have an API which can be queried to find out:
the last time a scheduled report was run
whether it was successful
if not, what caused it to fail?
It would be preferable for the API to be web based. I have read about a JMX interface, but the documentation was lacking.

Cognos 8 has an option to enable logging to an Audit Database. The database can then be queried for details such as when a report ran, what parameters were used, and what errors, if any, occurred.
Link to IBM site for setting up the logging database.
Link to IBM site for setting up the connection to the database.
Basically, you create a compatible database with the proper settings and tell Cognos how to connect to it; the next time the Cognos services start up, they will create the necessary logging tables and begin populating them automatically.
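Once those tables are populated, the original three questions reduce to a plain SQL query. A minimal sketch, assuming a pyodbc DSN pointing at the audit database and the documented COGIPF_* audit schema; table and column names vary by Cognos version, so verify them against your own audit DB:

    import pyodbc

    # Connection details are placeholders; point this at your audit database.
    conn = pyodbc.connect("DSN=CognosAudit;UID=audit_reader;PWD=secret")
    cursor = conn.cursor()

    # COGIPF_RUNREPORT holds one row per report execution. Column names here
    # follow the documented audit schema but should be verified for your version.
    cursor.execute("""
        SELECT COGIPF_REPORTPATH,
               COGIPF_LOCALTIMESTAMP,
               COGIPF_STATUS
        FROM   COGIPF_RUNREPORT
        ORDER  BY COGIPF_LOCALTIMESTAMP DESC
    """)

    # Most recent run first: report path, when it ran, and whether it succeeded.
    for report_path, run_time, status in cursor.fetchmany(20):
        print(report_path, run_time, status)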

Typically, to accomplish this through an API, you would use the Cognos SDK. The SDK allows you to query schedules and the run history associated with reports and to see whether a request completed or failed. If it failed, you can retrieve the history details associated with the failure, much like the failed-run details shown in the Cognos Administration section.
This is a good place to start to look at a sample:
http://www-01.ibm.com/support/docview.wss?uid=swg21343791

Related

Unable to connect to AAS - Resolution of actual cluster endpoint of Azure Analysis Server failed

Can someone help explain the following error message? We get it when connecting from Excel pivot tables to Azure Analysis Services (AAS).
"Resolution of actual cluster endpoint of Azure Analysis Server: ' ' failed. Response from Server:,
Technical Details:
Root Activity:
Date (UTC):"
The error eventually went away after I repeatedly retested my pivot table for about ten minutes.
I'd like to understand the internal components in Azure that generate such a confusing error. Hopefully there is something customers can do to investigate, since Microsoft doesn't advertise their service outages very widely... unless they are "globally impacting" all of their customers (e.g. like when AAD went offline around the world last week).
Is this message indicating an outage in AAS itself, or are there other components involved? Note that I was able to connect from SSMS, but not from Excel pivot tables.
If Microsoft were to decide to post a service health alert about this type of message, what service would it be listed under? AAS? Or some type of DNS?
Any clues would be appreciated. I'd like to understand the underlying issue and be able to predict how long these errors will normally take to clear up once we've encountered them.
There are client libraries that Excel uses to connect to AAS. The libraries are based on OLE DB and are used for connecting to "MSOLAP", a term that covers both multidimensional and tabular cubes. Depending on whether you are using the x86 or x64 version of Excel, you will need to find and install a different package to deploy the latest libraries to your machine.
I was instructed to install the latest libraries that are available here:
https://learn.microsoft.com/en-us/analysis-services/client-libraries
Updates to the libraries become available on a regular basis, and it is important to keep your client libraries up to date if you are connecting to Azure resources (since the services may be changing on a regular basis as well).
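As a quick client-side probe that the installed MSOLAP provider can resolve and reach your server, something along these lines can help. This is only a sketch, assuming pywin32's adodbapi module and an installed MSOLAP OLE DB provider, with placeholder region, server, and model names:

    import adodbapi  # ships with pywin32; uses the installed OLE DB providers

    # All names below are placeholders for your own AAS region/server/model.
    CONN_STR = (
        "Provider=MSOLAP;"  # version-independent ProgID of the installed provider
        "Data Source=asazure://westus.asazure.windows.net/myserver;"
        "Initial Catalog=MyTabularModel;"
    )

    try:
        conn = adodbapi.connect(CONN_STR)  # exercises endpoint resolution + auth
        print("Connected OK")
        conn.close()
    except Exception as exc:
        # With current client libraries, the failure text should be more specific
        # than the generic "Resolution of actual cluster endpoint..." message.
        print("Connection failed:", exc)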
After updating the client libraries, I no longer encounter the obscure message when I run my repro. Moreover, I'm able to simulate connectivity problems and I am now getting more meaningful errors.
As I started working with Microsoft support, they were able to determine that the original message ("Resolution of actual cluster endpoint of Azure Analysis Server failed") was most likely an authentication issue (an expired token or similar). Apparently this error happens for a small handful of reasons.
Internally, a couple of steps happen while Excel pivot tables authenticate to AAS. The first step retrieves a token from an identity service using an ADAL library, and the next uses the token to establish a connection to AAS.
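To isolate that first step, you can try acquiring a token outside of Excel. A minimal sketch using MSAL (the successor to ADAL); the client ID and tenant below are placeholders you must supply:

    import msal

    # Client/tenant values are placeholders; register your own app in AAD.
    app = msal.PublicClientApplication(
        client_id="<your-app-client-id>",
        authority="https://login.microsoftonline.com/<tenant-id>",
    )

    # AAS uses the literal "*.asazure.windows.net" resource; ".default" requests
    # its static permission set. This opens a browser window for sign-in.
    result = app.acquire_token_interactive(
        scopes=["https://*.asazure.windows.net/.default"]
    )

    if "access_token" in result:
        print("Token acquired; expires in", result.get("expires_in"), "seconds")
    else:
        # Failure here points at step one (authentication), not AAS itself.
        print("Auth failed:", result.get("error"), result.get("error_description"))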
I'm grateful for the help from Azure support in clearing this up. Hopefully this information will help others as well. The moral of the story is that Azure services (like AAS) change over time, so you must keep updating your local client libraries so that they don't hit integration failures or generate incoherent error messages.

Dashboard F5 data download

In F5 > Statistics > Dashboard it is possible to download raw data with the 'history' icon.
I need to download this on a regular basis, so automation comes into play.
I can't find such a report in the regular F5 report repository. I tried linking F5 to Zabbix to analyze the data there, but I don't have access to the F5 backend. I set up a UI macro, but I would like to have something more reliable in place.
Any tips are most welcome.
[F5 dashboard screenshot]
Thanks!
Since you're on version 13, you have the option to load the Telemetry Streaming iControl LX RPM onto the BIG-IP and pull the data from any number of preconfigured systems, or set up a generic JSON pull request through the API for the same data.
https://clouddocs.f5.com/products/extensions/f5-telemetry-streaming/latest/
This is the same set of data that they use to populate the BIG-IQ Centralized Management performance and health analytics.
The caveat is that, depending on what and how much you request, it can start to tax the system, so enable what you need in chunks and watch how it affects system performance. I've seen the mightiest systems grind to a halt when asked to provide ALL telemetry data to Splunk or Sumo Logic.
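As a hedged sketch of the pull workflow, assuming you have already POSTed a TS declaration containing a Telemetry_System and a Default_Pull_Consumer named My_Pull_Consumer to /mgmt/shared/telemetry/declare (the address and credentials below are placeholders):

    import requests

    BIGIP = "https://192.0.2.10"        # management address (placeholder)
    AUTH = ("admin", "admin-password")  # basic auth; token auth also works

    # Returns one JSON document of system/virtual/pool stats - the same data
    # BIG-IQ's performance and health analytics are built on.
    resp = requests.get(
        f"{BIGIP}/mgmt/shared/telemetry/pullconsumer/My_Pull_Consumer",
        auth=AUTH,
        verify=False,  # lab only: self-signed management certificate
    )
    resp.raise_for_status()
    print(resp.json())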
Hope this helps.

Replace NLog in Azure

We have a running site using NLog for logs. We are not only logging errors; we also use it to measure things related to business logic.
Now we are moving to Azure, and that's why I'm searching for a better way to log this type of info in Azure. I'm looking for something like Graylog.
Things to keep in mind:
Does Azure provide something for logging info that is easy to read?
Can I make queries to read the data?
Is there an API to log to?
Check out the following services, which are more or less native to Azure. You could probably also use some third-party options, like New Relic.
Log Analytics
Application Insights
Operations Management Suite
Application Insights not only has out-of-the-box monitoring but also provides the ability to write your own queries over the collected data.
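As an illustration of the logging API, here is a minimal sketch using the applicationinsights Python package; the instrumentation key and event names are placeholders, and if you stay on .NET, the equivalent TrackEvent call exists in that SDK:

    from applicationinsights import TelemetryClient

    tc = TelemetryClient("<your-instrumentation-key>")  # from your AI resource

    # Business-logic measurements, similar to what NLog was used for: a named
    # event with custom properties and numeric measurements.
    tc.track_event(
        "OrderPlaced",
        properties={"region": "EU", "channel": "web"},
        measurements={"basketValue": 129.95},
    )
    tc.flush()  # the client batches telemetry; flush before the process exits

Events logged this way appear as custom events in the portal, where they can be searched, queried, and charted.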
P.S. Just my two cents: I'd go for OMS. Microsoft is pushing it hard and it is evolving rapidly; even if some capabilities are missing today, they are likely to arrive soon, and in the long run Microsoft is unlikely to drop OMS anytime soon, since they only started pushing it about a year and a half ago.

Monitor Database Calls with Application Insights

So I've been reading through the Application Insights information published by Microsoft, and in particular this article: https://azure.microsoft.com/en-gb/documentation/articles/app-insights-search-diagnostic-logs/
So what I want to ask is: what's the most logical methodology for logging database calls?
In my head, I want to be able to log into Application Insights, see the most common database calls being made, and see their average call times. That way, I can say, "wow, the lookup to the membership profile table is taking a few seconds today, what's the deal?"
So I have a database name, a stored procedure name, and an execution time; what's the best way for me to take that data and store it in AI? As a metric, an event, something else?
First of all, AI has auto-collection of dependency calls. Please read this. Secondly, SDK 1.1 is planned for release next week. As part of that release, you will have a DependencyTelemetry type, added specifically for monitoring SQL, HTTP, blob, and other external dependencies.
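Until that release is available to you, one stopgap is recording each call as an event carrying a duration measurement, matching the database/procedure/time fields in the question. A hedged sketch using the Python applicationinsights package (the key and names are placeholders; the .NET SDK's forthcoming DependencyTelemetry is the purpose-built equivalent):

    import time
    from applicationinsights import TelemetryClient

    tc = TelemetryClient("<your-instrumentation-key>")  # placeholder key

    def timed_sproc_call(database, procedure, run):
        """Run a stored-procedure call and record its duration as an AI event."""
        start = time.monotonic()
        try:
            return run()  # e.g. a callable that executes the sproc
        finally:
            elapsed_ms = (time.monotonic() - start) * 1000.0
            tc.track_event(
                "SqlCall",
                properties={"database": database, "procedure": procedure},
                measurements={"durationMs": elapsed_ms},
            )
            tc.flush()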

Combining data from Project Server and SharePoint into a single report

I need to combine data from the Project Server reporting database with data from custom lists in SharePoint workspaces. The results need to be displayed within a single report. How should this be done? Options I've thought of:
Extend the reporting database with the custom list data (if this is possible). Use Reporting Services to display the output.
Query the reporting database and the SharePoint workspaces and combine results in memory. Write custom code to display the output.
Any other ideas? I have the skills to develop this but am very open to purchasing a product if it solves the problem.
I've had this sort of problem as well. My approach:
Create a custom reporting DB.
Run regular jobs from SQL Server to query SharePoint (via web services) and store the results in the DB.
I use GetListItemChangesSinceToken in Lists.asmx to improve efficiency (see the sketch further down). I also utilise the SiteDataQuery toolset; I wrote a really simple interface to it so that a site data query can be called remotely, returning a DataTable.
Use Reporting Services / any tool to extract and report on the data.
The reasons I opted for a staging DB were:
Performance - the WS calls are pretty slow.
Service continuity - if SP is down or slow for any reason, queries against it will fail.
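Here is a rough sketch of the web-service call from step 2, invoking GetListItemChangesSinceToken on Lists.asmx with plain HTTP; the site URL, list name, and credentials are placeholders, and your farm's authentication mode (often NTLM) may require requests-ntlm instead of basic auth:

    import requests

    SITE = "http://sharepoint/sites/workspace"  # placeholder site URL
    SOAP_NS = "http://schemas.microsoft.com/sharepoint/soap/"

    # Empty changeToken = full fetch; persist the token returned in the response
    # and pass it on the next call to receive only changed items.
    envelope = f"""<?xml version="1.0" encoding="utf-8"?>
    <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
      <soap:Body>
        <GetListItemChangesSinceToken xmlns="{SOAP_NS}">
          <listName>Custom List</listName>
          <changeToken></changeToken>
        </GetListItemChangesSinceToken>
      </soap:Body>
    </soap:Envelope>"""

    resp = requests.post(
        f"{SITE}/_vti_bin/Lists.asmx",
        data=envelope.encode("utf-8"),
        headers={
            "Content-Type": "text/xml; charset=utf-8",
            "SOAPAction": SOAP_NS + "GetListItemChangesSinceToken",
        },
        auth=("DOMAIN\\user", "password"),  # SP often needs NTLM (requests-ntlm)
    )
    resp.raise_for_status()
    print(resp.text[:500])  # XML with the changed rows plus the new change token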
Hope this helps.
I also found the tool SharePoint Data Miner, which appears to do the same as DJ's answer.
