Dealing with an error message in Cognos

I have installed Cognos 10 Framework Manager on my client computer, and when I tried creating a new project, this pop-up message showed up.
I heard that you need a web service in this context. The question is: how should I solve this problem?

Framework Manager should typically be configured to connect to a running Cognos BI instance. If you don't have one of those, you will get errors trying to perform a variety of activities within an FM project (such as applying row-level security). Even getting started will be difficult, since Framework Manager will reach out to the BI Server to retrieve things like data source information.
If you have a running BI Server somewhere, you need to configure the FM instance (via Cognos Configuration on your client system) to point at the Gateway and Dispatcher URIs for the Cognos BI Server. Here are the relevant docs on IBM's site:
http://publib.boulder.ibm.com/infocenter/cbi/v10r1m1/topic/com.ibm.swg.ba.cognos.inst_cr_winux.10.1.1.doc/t_steps_configure_fm_environment_properties.html#steps_configure_FM_environment_properties
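For illustration, the two environment properties you set in Cognos Configuration typically look like the defaults below (the hostname is a placeholder, and the ports and paths depend on your install; check your own BI Server's values):

```
Gateway URI:               http://bi-server.example.com:80/ibmcognos/cgi-bin/cognos.cgi
Dispatcher URI (external): http://bi-server.example.com:9300/p2pd/servlet/dispatch
```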

Related

Fetching Data from power BI XMLA Endpoints from Linux

Can anyone help me fetch data from a Power BI endpoint without using PowerShell? I want a way of fetching it directly on Linux.
I know PowerShell can be installed on Linux, but is there any way I can skip it and fetch the data directly?
reference - https://learn.microsoft.com/en-us/power-bi/admin/service-premium-connect-tools
Your Power BI XMLA endpoint is accessible through the Azure Analysis Services (AAS) instance tied to the given data source/workspace, which means you should be able to connect to that AAS instance and work with the data there via the web. I am not aware of any currently available Linux-compatible tools that allow this. I did a bit of research and was surprised to find that there was not a VS Code extension that allowed this (might have to get to work on that ;)).
That being said, Microsoft has several client libraries (for both AMO and ADOMD.NET) built on .NET Core that could theoretically be used by a client application built for a supported Linux OS (Microsoft doc here). In other words (again, theoretically), it should be relatively painless to build a simple tool for a supported Linux OS that takes in XMLA commands and executes them over a provided connection.
EDIT: Another good option to consider might be Microsoft's Power BI REST API (documentation here). If the functionality you are looking for is available within the REST API, you should be able to write a client tool targeting Linux (using one of many different options, though .NET Core could still be the choice here) that uses the API for your Power BI instance in place of the XMLA endpoint directly. I would consider this the better alternative: it is a less 'Microsoft-y' way of doing this, and it is going to be much easier to maintain and develop over time. I would start by confirming whether the functionality you want is available in this API first.
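As a minimal sketch of the REST-API route from Linux, the snippet below builds (but does not send) a request to the API's executeQueries endpoint, which at the time of writing accepts a DAX query against a dataset. Stdlib only; the dataset ID and token are placeholders you would obtain via Azure AD:

```python
# Sketch: calling the Power BI REST API from Linux with only the
# standard library. Dataset ID and bearer token below are placeholders;
# nothing is actually sent in this example.
import json
import urllib.request

API_ROOT = "https://api.powerbi.com/v1.0/myorg"

def build_execute_queries_request(dataset_id, dax_query, token):
    """Build (but do not send) a POST for the executeQueries endpoint."""
    url = f"{API_ROOT}/datasets/{dataset_id}/executeQueries"
    body = json.dumps({"queries": [{"query": dax_query}]}).encode("utf-8")
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    return urllib.request.Request(url, data=body, headers=headers, method="POST")

req = build_execute_queries_request(
    "11111111-2222-3333-4444-555555555555",  # hypothetical dataset ID
    "EVALUATE VALUES(MyTable)",              # hypothetical DAX query
    "<access-token>",
)
print(req.full_url)
# To execute for real: urllib.request.urlopen(req)
```

The same request could of course be issued from .NET Core or any other HTTP client; the point is that nothing Windows-specific is required.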
EDIT: After reading further in the above-linked document regarding the AMO and ADOMD.NET client libraries:
TCP based connectivity is supported for Windows computers only.
Interactive login with Azure Active Directory is supported for Windows computers only. The .NET Core Desktop runtime is required.
So it looks like there are currently some limitations to these libraries regarding a Linux runtime. I am not positive that you could use something other than TCP based connectivity to accomplish this, but if I find a way (or someone is able to suggest something), then I will update.

Unable to connect to AAS - Resolution of actual cluster endpoint of Azure Analysis Server failed

Can someone help explain the following error message? We get it when connecting from Excel pivot tables to Azure Analysis Services (AAS).
"Resolution of actual cluster endpoint of Azure Analysis Server: ' ' failed. Response from Server:,
Technical Details:
Root Activity:
Date (UTC):
The error eventually went away after I kept retrying my pivot table for about ten minutes.
I'd like to understand the internal components in Azure that generate such a confusing error. Hopefully there is something customers can do to investigate, since Microsoft doesn't advertise their service outages very widely... unless they are "globally impacting" all of their customers (e.g. when AAD went offline around the world last week).
Is this message indicating an outage in AAS itself, or are there other components involved? Note that I was able to connect from SSMS, but not from Excel pivot tables.
If Microsoft were to decide to post a service health alert about this type of message, what service would it be listed under? AAS? Or some type of DNS?
Any clues would be appreciated. I'd like to understand the underlying issue and be able to predict how long it will normally take for these errors to clear up, once we've encountered them.
There are client libraries that Excel uses to connect to AAS. The libraries are based on OLEDB and they are used for connecting to "MSOLAP" which is a term used for both multidimensional and tabular cubes. Depending on whether you are using the x86 or x64 version of Excel, you will need to find and install a different package to deploy the latest libraries to your machine.
I was instructed to install the latest libraries that are available here:
https://learn.microsoft.com/en-us/analysis-services/client-libraries
Updates to the libraries become available on a regular basis, and it is important to keep your client libraries up to date if you are connecting to Azure resources (since those may be changing on a regular basis as well).
After updating the client libraries, I no longer encounter the obscure message when I run my repro. Moreover, I'm able to simulate connectivity problems and I am now getting more meaningful errors.
As I started working with Microsoft support, they determined that the original message ("Resolution of actual cluster endpoint of Azure Analysis Server failed") was most likely an authentication issue (an expired token or similar). Apparently this error happens for a small handful of reasons.
Internally there are a couple steps that happen while Excel pivot tables are authenticating to AAS. The first step retrieves a token from an identity service using an ADAL library, and the next one uses the token to establish a connection to AAS.
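That two-step flow can be sketched roughly as below. This is a stdlib-only illustration, not working code: tenant, region, server, and token values are all placeholders, nothing is actually sent, and in reality both steps happen inside the ADAL/MSOLAP client libraries:

```python
# Rough sketch of the two-step authentication flow described above.
# All tenant/region/server/token values are hypothetical placeholders.
import urllib.parse

TENANT = "contoso.onmicrosoft.com"                   # hypothetical tenant
AAS_RESOURCE = "https://westus.asazure.windows.net"  # region is a placeholder

# Step 1: the ADAL library asks Azure AD for an access token scoped
# to the Analysis Services resource.
token_endpoint = f"https://login.microsoftonline.com/{TENANT}/oauth2/token"
token_request_body = urllib.parse.urlencode({
    "grant_type": "refresh_token",       # interactive/device flows also exist
    "resource": AAS_RESOURCE,
    "refresh_token": "<cached-refresh-token>",
    "client_id": "<client-id>",
})

# Step 2: the returned access token is presented as the credential when
# the MSOLAP provider opens the connection to the server itself.
data_source = "asazure://westus.asazure.windows.net/myserver"

print(token_endpoint)
```

An expired or unrefreshable token from step 1 would then surface as a connection failure in step 2, which fits the "cluster endpoint resolution" wording of the error.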
I'm grateful to Azure support for helping clear this up. Hopefully this information will help others as well. The moral of the story is that Azure services (like AAS) change over time, so you must keep updating your local client libraries so that they don't hit integration failures and generate incoherent error messages.

bluemix xpages performance and architecture

I have a few questions about bluemix xpages runtime.
As of now (Aug 2016) the XPages NoSQL database is still experimental. Is there an ETA for this NoSQL service to become GA?
As of now, to have better control over performance, a separate Domino server has to be provisioned to host the NSF datastore, as described in https://developer.ibm.com/bluemix/2015/11/10/hybrid-xpages-applications-on-bluemix/
What are the best practices to minimize latency for the traffic between the XPages frontend and the backend server hosting the NSF datastore? Should the Domino server be hosted on IBM SoftLayer?
Does the XPages runtime provide visibility into the network performance between the XPages runtime and the NSF backend?
I presume that the number of XPages runtime instances can be increased to handle increased traffic (horizontal scaling). However, the Domino backend where the NSF is stored would eventually become a bottleneck, and can only be scaled by increasing the power (CPU/RAM) of the machine (vertical scaling). Are there plans to offer an XPages NoSQL backend that can also scale horizontally?
In a hybrid Bluemix XPages setup, the XPages runtime can be stood up using a custom server.id.
When the XPages runtime is scaled up by increasing the number of instances, would all the instances use the same server.id? AFAIK, in a Domino domain, each server would use a unique server.id. Should this be a cause for concern?
Is the XPages buildpack available (under some license) to be run on any other Cloud Foundry instance?
Thank you in advance for responding.
The question on NSF availability on Bluemix is better asked at the Bluemix forum on ibm.com/developerworks, or ask your IBM representative.
So far I have not seen any plan regarding such a service.
You need to look at your use case carefully:
you want to "just go to cloud": move your Domino servers to SoftLayer and you are done
you want to spice up your Domino applications with some Bluemix services (Watson seems popular these days): define those services in Bluemix, unbound to any runtime. They all expose HTTPS APIs; call those from your main Domino server
you want to use Domino data in other Bluemix applications: either call DAS on your main Domino directly or, when it remains behind a (corporate) firewall, use API Management and/or the secure tunnel service in Bluemix
you want the performance monitoring service: if it's mainly about the traffic, use an nginx buildpack (64M will do nicely) and add the service there. It will give you modern protocols and deep control over what to accept/send in HTTP. Use it as a proxy in front of your Domino
need auto scale for your application: use XPages on Bluemix (note: doesn't scale database servers)
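The nginx-as-proxy option in the list above could look roughly like this. A minimal untested sketch, not the actual buildpack's config; hostname and ports are placeholders:

```nginx
events {}
http {
  server {
    listen 8080;
    location / {
      # forward all traffic to the Domino HTTP task behind the proxy
      proxy_pass http://your-domino-server.example.com:80;
      proxy_set_header Host $host;
      proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
  }
}
```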
Hope that helps
As far as I know there are no plans to make the Domino database layer available on Bluemix. As a result, there are also no plans, as far as I know, for a horizontally scaling backend. I think your concern over scalability is valid; I've not heard a reasonable answer to it.
For these very valid reasons, and because the Domino (data) server you use elsewhere will also include an XPages runtime (included in the cost rather than charged in addition), I've not investigated XPages on Bluemix very deeply.
In terms of communication, this is not via HTTP but via NRPC. At IBM Connect earlier this year the server guys outlined steps they were taking, or had taken, to further secure NRPC communication (I believe it's pretty secure already; this was extra encryption, but as a non-admin I didn't fully understand the details). If you look at the URL for attachments or images stored in rich text fields, for example, you'll see it uses XPiNC syntax rather than the syntax you normally get for XPages on the web.
I believe additional instances would have the same server ID. You only upload the NSF once to Bluemix, it handles the deployment to the multiple instances.
I don't think the XPages buildpack is available for other Cloud Foundry instances, but I can't say for certain. I think what you currently see on Bluemix is all there is.

need wcf and wpf load testing tool

I need to find out the performance of my application. This application works as follows:
It's a WPF Windows application, which requires some data to be filled in by the user
On clicking the Submit button, it calls WCF web services
These services save the values in the DB
Which tool would be the best for this scenario?
The best approach would be having several thousand geo-distributed real users with different Internet connection speeds using your WPF application normally. If you are not able to arrange this, I would suggest mimicking those several thousand users at the protocol level. You mentioned WCF web services, so go and find a web service load testing tool. For example, good ones are:
SoapUI - designed for web services testing. Has some load testing capabilities. See Creating and Running LoadTests to get started.
Apache JMeter - multiprotocol load testing tool, supports web services as well. See Testing SOAP/REST Web Services Using JMeter guide.
In fact, any tool capable of sending an HTTP request will fit.
You can use some of the usual load testing tools, for example Apache JMeter (free) or HP LoadRunner (community license for 50 free users).
Just record the communication between your WPF application and WCF (I believe it communicates via HTTP) and add the needed parameters.
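The protocol-level idea behind both answers can be sketched in plain Python: fire many concurrent HTTP POSTs at the service endpoint and collect status codes and latencies. The local stub server below stands in for the real WCF service (whose URL you would substitute), so the sketch is self-contained and runnable:

```python
# Minimal load-generation sketch (stdlib only). A local stub server
# stands in for the real WCF endpoint so this runs self-contained;
# in practice you would point `url` at your actual service.
import http.server
import threading
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

class StubService(http.server.BaseHTTPRequestHandler):
    def do_POST(self):
        # Pretend to be the WCF service: consume the body, reply 200.
        self.rfile.read(int(self.headers.get("Content-Length", 0)))
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"OK")
    def log_message(self, *args):
        pass  # silence per-request logging

server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), StubService)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_address[1]}/submit"

def one_request(_):
    """Send one POST, return (status, elapsed seconds)."""
    start = time.perf_counter()
    req = urllib.request.Request(url, data=b"<Envelope/>", method="POST")
    with urllib.request.urlopen(req) as resp:
        resp.read()
        return resp.status, time.perf_counter() - start

# 10 concurrent "virtual users" issuing 50 requests in total.
with ThreadPoolExecutor(max_workers=10) as pool:
    results = list(pool.map(one_request, range(50)))
server.shutdown()

statuses = [s for s, _ in results]
print(len(results), all(s == 200 for s in statuses))  # → 50 True
```

A dedicated tool like JMeter adds ramp-up schedules, assertions, and reporting on top of this same pattern, which is why it is usually the better choice for real tests.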

Excel recording in Loadrunner

I am trying to do a performance test on an Excel-based application with LoadRunner. I started by running the Protocol Advisor, which is throwing an error.
My main target is to record the Excel-based application. For simulation I created a database and call the database from Excel.
Any suggestions on what protocol to use? Or any other tools for conducting a performance test on an Excel-based application?
Here is where the foundation classes of knowledge on development and architecture come into play for a performance test professional.
Tell us about the next upstream component from Excel. Is it connecting directly to the database, or is it going through a web services or other application-server layer? Do you have SQL queries you are trying to reproduce, or some other mechanism for accessing your data source?
What have you tried? (Other than protocol confuser?)