Unable to connect to AAS - Resolution of actual cluster endpoint of Azure Analysis Server failed

Can someone help explain the following error message? We get it when connecting from Excel pivot tables to Azure Analysis Services (AAS).
"Resolution of actual cluster endpoint of Azure Analysis Server: ' ' failed. Response from Server:,
Technical Details :
Root Activity :
Date (UTC) :
The error eventually went away after I repeatedly retested my pivot table for about ten minutes.
I'd like to understand the internal components in Azure that generate such a confusing error. Hopefully there is something that customers can do to investigate, since Microsoft doesn't advertise their service outages very widely ... unless they are "globally impacting" all of their customers (e.g., like when AAD went offline around the world last week).
Is this message indicating an outage in AAS itself, or are there other components involved? Note that I was able to connect from SSMS, but not from Excel pivot tables.
If Microsoft were to decide to post a service health alert about this type of message, what service would it be listed under? AAS? Or some type of DNS?
Any clues would be appreciated. I'd like to understand the underlying issue and be able to predict how long it will normally take for these errors to clear up, once we've encountered them.

There are client libraries that Excel uses to connect to AAS. The libraries are based on OLE DB and are used to connect to "MSOLAP", a term that covers both multidimensional and tabular cubes. Depending on whether you are using the x86 or x64 version of Excel, you will need to find and install a different package to deploy the latest libraries to your machine.
I was instructed to install the latest libraries that are available here:
https://learn.microsoft.com/en-us/analysis-services/client-libraries
Updates to the libraries become available on a regular basis, and it is important to keep your client libraries up to date if you are connecting to Azure resources (since those resources may be changing on a regular basis as well).
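If you want to confirm which provider versions are actually installed after an update, one option is to check the registered OLE DB ProgIDs. Here is a rough, Windows-only Python sketch; the "MSOLAP.n" ProgID pattern is an assumption based on how versioned OLE DB providers are typically registered:

    # Windows-only sketch: list registered MSOLAP provider versions.
    # Assumes versioned ProgIDs of the form "MSOLAP.<n>" (e.g. MSOLAP.8).
    import winreg

    def registered_msolap_versions():
        versions = []
        for n in range(1, 20):
            progid = "MSOLAP.{}".format(n)
            try:
                winreg.CloseKey(winreg.OpenKey(winreg.HKEY_CLASSES_ROOT, progid))
                versions.append(progid)
            except OSError:
                pass  # that version is not registered
        return versions

    print(registered_msolap_versions())  # e.g. ['MSOLAP.5', 'MSOLAP.8']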
After updating the client libraries, I no longer encounter the obscure message when I run my repro. Moreover, I'm able to simulate connectivity problems and I am now getting more meaningful errors.
As I started working with Microsoft support, they determined that the original message ("Resolution of actual cluster endpoint of Azure Analysis Server failed") was most likely an authentication issue (an expired token or similar). Apparently this error can happen for a small handful of reasons.
Internally there are a couple of steps that happen while Excel pivot tables authenticate to AAS. The first step retrieves a token from an identity service using an ADAL library, and the next uses that token to establish a connection to AAS.
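To make the first of those two steps concrete, here is a rough Python sketch using the adal package this answer alludes to (ADAL has since been superseded by MSAL). The tenant, client ID, and regional AAS resource URI are placeholders, not values from the original report:

    # Step 1: acquire a token for AAS from Azure AD via ADAL.
    # All angle-bracketed values are placeholders for your own tenant/app.
    import adal

    AUTHORITY = "https://login.microsoftonline.com/<tenant-id>"
    RESOURCE = "https://westus.asazure.windows.net"  # assumed: your AAS regional endpoint
    CLIENT_ID = "<app-client-id>"

    context = adal.AuthenticationContext(AUTHORITY)
    token = context.acquire_token_with_username_password(
        RESOURCE, "user@contoso.com", "<password>", CLIENT_ID)

    # Step 2 (performed by the client library): present the bearer token to AAS.
    print(token["accessToken"][:25] + "...")

If step 1 silently hands back an expired or otherwise unusable token, step 2 is where a confusing connection-level error like the one above can surface.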
I'm grateful to Azure support for clearing this up. Hopefully this information will help others as well. The moral of the story is that Azure services (like AAS) change over time, so you must keep your local client libraries updated so that they don't hit integration failures or generate incoherent error messages.

Related

Fetching Data from Power BI XMLA Endpoints from Linux

Can anyone help me fetch data from a Power BI XMLA endpoint without using PowerShell? I want a way to fetch the data directly from Linux.
I know PowerShell can be installed on Linux, but is there any way I can skip it and fetch the data directly?
reference - https://learn.microsoft.com/en-us/power-bi/admin/service-premium-connect-tools
Your Power BI XMLA endpoint is accessible through your Azure Analysis Services (AAS) instance tied to the given datasource/workspace, which means that you should be able to connect to that AAS instance and work with the data there via the web. I am not aware of any currently available Linux compatible tools that allow this. I did a bit of research and was surprised to find that there was not a VS Code extension that allowed this (might have to get to work on that ;)).
That being said, Microsoft has several different client libraries (for both AMO and ADOMD.NET) built on their .NET Core framework that would theoretically be able to be used by a client application built for a supported Linux OS (Microsoft doc here). In other words (again, theoretically), it should be relatively painless to build a simple tool for a supported Linux OS that takes in XMLA commands and executes them over a provided connection.
EDIT: Another good option to consider might be Microsoft's Power BI REST API (documentation here). If the functionality you are looking for is available within their REST API, you should be able to write a client tool (using one of many different options, but .NET Core could still be in the mix) targeting Linux that uses the API for your Power BI instance in place of the XMLA endpoint directly. I would consider this the better alternative. It is a less 'Microsoft-y' way of doing this, and it is going to be much easier to maintain and develop over time. I would start by confirming that the functionality you want is not available in this API first.
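For illustration, here is a minimal Python sketch of the REST approach, using the msal and requests packages with the device-code flow (which suits a headless Linux box); the app and tenant IDs are placeholders you would create via your own Azure AD app registration:

    # List datasets via the Power BI REST API instead of the XMLA endpoint.
    import msal
    import requests

    app = msal.PublicClientApplication(
        "<app-client-id>",  # placeholder: your own Azure AD app registration
        authority="https://login.microsoftonline.com/<tenant-id>")

    # Device-code flow: prints a code to enter at microsoft.com/devicelogin.
    flow = app.initiate_device_flow(
        scopes=["https://analysis.windows.net/powerbi/api/.default"])
    print(flow["message"])
    result = app.acquire_token_by_device_flow(flow)

    resp = requests.get(
        "https://api.powerbi.com/v1.0/myorg/datasets",
        headers={"Authorization": "Bearer " + result["access_token"]})
    resp.raise_for_status()
    for ds in resp.json()["value"]:
        print(ds["id"], ds["name"])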
EDIT: After reading further in the above-linked document regarding the AMO and ADOMD.NET client libraries:
TCP based connectivity is supported for Windows computers only.
Interactive login with Azure Active Directory is supported for Windows computers only. The .NET Core Desktop runtime is required.
So it looks like there are currently some limitations to these libraries regarding a Linux runtime. I am not positive that you could use something other than TCP-based connectivity to accomplish this, but if I find a way (or someone is able to suggest something), I will update.

replace NLog in Azure

We have a running site using NLog for logging. We don't only log errors; we use it to measure things related to business logic.
Now we are moving to Azure, and that's why I'm searching for a better way to log this type of info in Azure. I'm looking for something like Graylog.
Things to keep in mind:
Does Azure provide a way to log info that is easy to read?
Can I query the logged data?
Is there an API for logging?
Check out the following stuff, which is more or less native to Azure. Also, you could probably use some of the third-party options, like New Relic.
Log Analytics
Application Insights
Operations Management Suite
Application Insights not only has out-of-the-box monitoring but also lets you create your own queries over the logged data.
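As a small illustration of logging business events to Application Insights from code (rather than relying only on the out-of-the-box monitoring), here is a sketch using the applicationinsights PyPI package; the instrumentation key is a placeholder from your own App Insights resource:

    # Send custom business events/metrics to Application Insights.
    from applicationinsights import TelemetryClient

    tc = TelemetryClient("<instrumentation-key>")  # placeholder key
    tc.track_event("OrderPlaced",                  # event name
                   {"plan": "premium"},            # string properties
                   {"amountUsd": 99.0})            # numeric measurements
    tc.track_metric("CheckoutDurationMs", 812)
    tc.flush()  # telemetry is batched; flush before the process exits

Events logged this way show up under customEvents and can then be filtered and aggregated with the portal's query language.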
P.S. Just my 2 cents: I'd go for OMS. Microsoft is pushing it very hard and it is evolving rapidly; even if some capabilities are missing today, they are going to be there soon, and in the long run Microsoft is really unlikely to drop OMS anytime soon, since they started pushing it about 1.5 years ago.

Fetching data from cerner's EMR / EHR

I don't have much experience in the medical domain.
We are evaluating a requirement from our client, who is using the Cerner EMR system.
As per the requirement, we need to expose the Cerner EMR or fetch some EMR/EHR data and display it in a SharePoint 2013 portal.
To meet this requirement, what kind of integration options does Cerner propose? Are there any APIs or web services exposed that can be used to build custom solutions?
As far as I know, Cerner does expose EMR/EHR information in HL7 format, but I don't have any idea how to access it.
I have also asked Cerner about this and am awaiting a reply from their end.
If anybody who has worked on a similar kind of job can throw some light on this and provide me with some insights, I would appreciate it.
You will need to request an interface between your organization and the facility with the EMR. An interface in the health-care IT world is not the same as a GUI. It is the mechanism (program/tool) that transfers HL7 data between one entity and the other. There will probably be a cost to have an interface set up. However, that is the traditional way Cerner communicates with third parties. HIPAA laws will require that this connection be very secure.
You might also see if the facility with the EMR has an existing interface that produces the info you are after. You may be able to share that data or have a flat file generated from that interface that you could get access to. Because of HIPAA regulations, your client may be reluctant to share information in that manner.
I would suggest you start with your client's interface/integration team. They would be the ones that manage the information into and out of Cerner. They could also shed some light on how they prefer to see things done.
Good Luck
As far as I know, there are two ways of achieving this. One is direct connectivity to Cerner's Oracle database. This seems unlikely to be possible, as Cerner doesn't allow other vendors direct access to their database.
The other way is to use Cerner's mPage web services, which is how we achieved this. The client needs to host the web services on an IBM WAS or some other container; we used WAS as it was readily available to us. Once the hosting is done, you will get a URL, and using that you can execute any CCL program, which will return the data in JSON/XML format. The mPage web service uses basic HTTP authentication.
Now, the CCL has to be written in a way that returns the data you require.
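As a rough illustration of what the client side of such a call can look like in Python, with an entirely hypothetical WAS URL and CCL program name (your deployment defines the real ones):

    # Call a hosted mPage web service that executes a CCL program.
    # URL, program name, and credentials are hypothetical placeholders.
    import requests

    resp = requests.get(
        "https://was.example.org/mpage/ccl/GET_PATIENT_LIST",
        params={"format": "json"},          # ask for JSON rather than XML
        auth=("svc_user", "svc_password"),  # basic HTTP authentication
        timeout=30)
    resp.raise_for_status()
    print(resp.json())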
We have a successful setup and have been working on this since 2014. For more details you can also try the uCern portal.
Thanks,
Navin

Recurring Timeout on Sql-Azure

On our system, which is implemented as a web role backed by a SQL Azure database, we are experiencing recurring timeouts on a specific query.
These timeouts occur for a few hours during the day and then do not show up anymore.
The query joins two tables on their primary keys, and the row counts are not very high (about 800,000 rows).
The execution plan is OK, the indexes are used properly, and the query normally takes two seconds to run.
Tests without Entity Framework give the same result.
Transient fault handling is not applicable in the case of a timeout.
What can be the cause of this behavior?
We have experienced similar issues in the past using SQL Azure: queries running against tables with fewer than 10 rows, and even the standard .NET membership provider queries, all failed intermittently with timeouts. This usually happens when we have little to no activity on our service, mostly at night.
In commonly used areas where it is safe to retry on a SQL timeout (usually read operations), we have added the timeout exception to our custom error detection strategy, taken from the Transient Fault Handling Block; however, as you stated, this is not appropriate in most cases.
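For read-only paths, the retry idea looks roughly like the following Python sketch. Our actual stack is .NET with the Transient Fault Handling Block; this is just the same pattern transplanted, assuming pyodbc and the ODBC query-timeout SQLSTATE HYT00:

    # Retry idempotent reads on SQL timeout with exponential backoff.
    import time
    import pyodbc

    def query_with_retry(conn_str, sql, params=(), attempts=3, backoff=2.0):
        for attempt in range(1, attempts + 1):
            conn = pyodbc.connect(conn_str, timeout=30)
            try:
                return conn.cursor().execute(sql, params).fetchall()
            except pyodbc.OperationalError as exc:
                sqlstate = exc.args[0] if exc.args else ""
                if sqlstate != "HYT00" or attempt == attempts:
                    raise  # not a timeout, or retries exhausted
                time.sleep(backoff ** attempt)  # back off, then retry
            finally:
                conn.close()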
The best explanation we have received from Azure support thus far is that SQL Azure is really a shared SQL Server instance used by multiple clients, so if one user performs an intensive operation it can affect other users in this way. However, believing this not to be acceptable, we are still in contact with SQL Azure support to ascertain why throttling is not stopping this sort of activity from affecting us.
Your best bet is to:
Contact SQL Azure support, either through the forums or directly (if you have a support package)
If possible, try setting up a new SQL Azure instance and migrating your database across
Whilst we get this issue intermittently on one SQL Azure instance, we have never experienced it on our other two instances.
As a side note, we are still waiting on Azure support to get back to us regarding why we were receiving these timeout exceptions.

Dealing with error message in Cognos

I have installed Cognos 10 Framework Manager on my client computer, and when I tried creating a new project, a pop-up message showed up.
I heard that you need to have a web service in this context. The question is: how should I solve this problem?
Framework Manager should typically be configured to connect to a running Cognos BI instance. If you don't have one of those, you will get errors trying to perform a variety of activities within an FM project (such as applying row-level security). Even getting started will be difficult, since Framework Manager will reach out to the BI Server to retrieve things like data source information.
If you have a running BI Server somewhere, you need to configure the FM installation (Cognos Configuration on your client system) to point at the Gateway and Dispatcher URIs for the Cognos BI Server. Here are the relevant docs on IBM's site:
http://publib.boulder.ibm.com/infocenter/cbi/v10r1m1/topic/com.ibm.swg.ba.cognos.inst_cr_winux.10.1.1.doc/t_steps_configure_fm_environment_properties.html#steps_configure_FM_environment_properties
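For reference, the two values usually follow the stock Cognos defaults; the host name below is a placeholder for your own BI Server:

    Gateway URI:    http://bi-server.example.com:80/ibmcognos/cgi-bin/cognos.cgi
    Dispatcher URI: http://bi-server.example.com:9300/p2pd/servlet/dispatch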
