Issues while building a Tabular Data Model from Vertica - Azure

I have been assigned a new project where I need to prepare a Power BI report using Azure Analysis Services (data mart). The flow is: data from the Vertica DW -> Azure Analysis Services (via a Tabular Model) -> Power BI. I am fairly new to both Tabular Models and Vertica.
Scenario:
1) The DW is in Vertica Platform online.
2) I am trying to build a data model using Analysis Services Tabular Project in VS 2019
3) This model will be deployed on Azure which will act as data source to PowerBI
4) I cannot select individual tables directly (from Vertica) while performing "Import from Data Source". I have to use a view here.
5) I have been given a single big table with around 30 columns as a source from Vertica
Concerns:
1) While importing data from Vertica, there is no "Transform" option like the one available in the Power BI Query Editor during import. However, when I tried importing a local file, the option was there.
2) With reference to Scenario #5, how can I split the big table into various dimensions in Model.bim? Currently I am adding them as calculated tables. Is this the optimal way, or can you suggest something better?
Also, is there any good online material where I can get my hands dirty with modeling in an Analysis Services Tabular Project? (I can do it very well in Power BI.)
Thanks in advance
Regards

My personal suggestion is to avoid Visual Studio like the plague. Unfortunately, it is not only of little help for this task but can actively get in your way.
Instead, use Tabular Editor. From there you can easily work with the Tabular Model.
I would also suggest avoiding calculated tables as dimensions; instead, create several tables in Tabular Editor and simply modify each one's source query / fields, as sketched below.
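For illustration, here is a minimal sketch of what those per-table source queries could look like; the schema, table, and column names are hypothetical placeholders for the actual 30-column Vertica table:

-- Hypothetical dimension query: one row per distinct customer
SELECT DISTINCT
    customer_id,
    customer_name,
    customer_region
FROM dw_schema.big_source_table;

-- Hypothetical fact query: keep only the keys and the measures
SELECT
    customer_id,
    product_id,
    order_date,
    sales_amount,
    quantity
FROM dw_schema.big_source_table;

Each model table then points at its own query, and the fact keys are related to the dimension keys in the model, giving a normal star schema instead of one wide table.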

In reference to the 1st question, I believe there is some bug when connecting Vertica with Power BI; it works perfectly elsewhere, just not with this combination.
For #2, I can choose "Import new tables" from the connected data source; the option can be found in the Tabular Editor view.

Related

Power BI performance

I need some help with improving Power BI performance when reading some data.
I currently import data from an Excel sheet, a table with lots of different data types, and I was wondering if it is viable to change the data source, since it would have to be a one-man job.
Does Power BI have better performance importing from another data source? I'm considering Access because of the simplicity of the change. Using a proper database like SQL Server is on the table, but it wouldn't be as easy to do as a quick change.
I would suggest using SQL Server Analysis Services (SSAS). You can create all the metrics outside of Power BI and then import only the summarized statistics, going down only to the drill-down level you need.
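To illustrate the "import only the summarized statistics" idea, here is a rough sketch of a pre-aggregation query that could be run on the database side before import; the table and column names are made up for the example:

-- Hypothetical pre-aggregation: summarize to the grain the report
-- actually needs (year, month, product) instead of importing raw rows.
SELECT
    product_id,
    YEAR(order_date)  AS order_year,
    MONTH(order_date) AS order_month,
    SUM(sales_amount) AS total_sales,
    COUNT(*)          AS order_count
FROM dbo.sales
GROUP BY
    product_id,
    YEAR(order_date),
    MONTH(order_date);

Power BI then imports this much smaller result set instead of every raw row.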

Access Excel Data Model (Power Query) tables from ODBC

We can access Excel data using ODBC (the Excel ODBC driver). Can we also access the data in the data model (i.e. Power Query tables)? Basically, I am thinking about (mis)using Excel/Power Query as a database and letting an external application retrieve data from it (using SQL).
To read from Sheet1 I can do:
SELECT ... FROM [Sheet1$]
but
SELECT ... FROM [table in data model]
does not seem to work for me. Is this supposed to work or is this not supported at all?
There is a ton of information about Power Query using ODBC to import data. Here I am looking at the other way around.
You should distinguish between Power Query tables and Data Model (Power Pivot) tables. You can configure some PQ tables to be loaded into the Data Model, so data is "transferred" from PQ to the DM only for those particular tables.
I'm pretty sure that it is impossible to get data out of "PQ only" tables. You can only get the M queries (not their results), via VBA or by unpacking the Excel file.
Regarding PP (DM) tables: there is actually an Analysis Services (VertiPaq) engine inside Excel (and, for that matter, inside Power BI Desktop). So as soon as you start Excel or PBI, you actually start an AS engine instance as well. The data in it is reachable via:
Excel VBA (Visual Basic for Applications). You have the ThisWorkbook.Model.DataModelConnection.* API and can get to the data itself as well as the model. This is the only "official" way to get the data programmatically.
Power Query, using the Analysis Services instance as a data source. This is an unofficial way, but I have read that Microsoft said they are not going to close it off in the future (though you never know :-)). E.g. DAX Studio can do that - https://www.sqlbi.com/tools/dax-studio/.
Unfortunately, while getting to the PBI AS instance is quite easy, I don't know how to get to the Excel AS instance without DAX Studio. As far as I understand, the main problem is finding the port number of the AS instance launched by Excel. I hope this at least points you in the right direction for further searching if you want to go the Power Query way. Or maybe it is more reasonable to use Power BI Desktop for the task.
An Excel workbook is just a zip file, so the AS files are definitely inside it. I have never gone this route, but you can look at what is inside the Excel zip - possibly the AS files are there in some usable form.

Are SharePoint and Analysis Services required for the BI Semantic Model?

I'm new to Report Models. I was planning to test them on our new SSRS 2012 and I just found out Microsoft has already deprecated this feature.
On further reading, it was replaced by the BI Semantic Model. Long story short, I can't seem to confirm whether we need to set up SharePoint for this to work.
1.) Are SharePoint services required for BISM to work?
2.) Do we also need Analysis Services for BISM to work?
thanks
Perhaps this white paper will help you.
BISM refers to a couple of related technologies/tools, so the answers to your questions are not a simple yes/no. As usual, it depends...
BI Semantic models can be in the form of individual Power Pivot models inside of Excel, shared in the same manner you would share any Excel file. They can also refer to Power Pivot models inside of SharePoint. Or they can refer to an SSAS Tabular model, which can be consumed with Power View inside of SharePoint or Power View in Excel (or just base Excel, or for that matter SSRS).
So if you are using Power Pivot in SharePoint or using Power View in SharePoint, then you will need SharePoint services. If you are going to use SSAS Tabular, you will need SSAS. If you are using Power Pivot in SharePoint, you need to install SSAS for Power Pivot and configure Power Pivot.

Reports in Visual Studio 2012

I am trying to use the embedded reporting tools in VS2012. I never used them before.
The report wizard forces the user to select a table, and I do not see any feature that allows adding or selecting more than one table in the same report.
I see that there are not many Q&As on this topic, and I have not found any decent tutorial other than very simple samples with two columns, but let's give it some time.
Is there a way to get data from different tables of the same DB into the same report?
Your best bet is to create a SQL view and then use that. The wizard does however give you the option to choose more than one table when you create a new dataset.
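As a rough sketch of the view approach (the table and column names are placeholders for your own schema), a single view can flatten the join so the wizard only has to pick one object:

-- Hypothetical view combining two tables into one report-friendly object
CREATE VIEW dbo.vw_OrderReport AS
SELECT
    o.OrderID,
    o.OrderDate,
    c.CustomerName,
    c.Country,
    o.TotalAmount
FROM dbo.Orders AS o
JOIN dbo.Customers AS c
    ON c.CustomerID = o.CustomerID;

The report dataset then selects from dbo.vw_OrderReport as if it were a single table.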

Can I query SAP BO WEBI via Excel VBA? Can I do it fast enough?

Following up on my previous post, I need to be able to query a database of 6M+ rows in the fastest way possible, so that this DB can be effectively used as a "remote" data source for a dynamic Excel report.
Like I said, normally I would store the data I need on a separate (perhaps hidden) worksheet and manipulate it through a second "control" sheet. This time, the size (i.e. number of rows) of my database prevents me from doing so (as you all know, an Excel sheet cannot handle much more than 1M rows - 1,048,576, to be exact).
The solution my IT guy put in place consists of holding the data in a txt file inside a network folder. So far, I have managed to query this file through ADO (slow but no maintenance needed) or to use it as a source to populate an indexed Access table, which I can then query (faster but requires more maintenance & additional software).
I feel both solutions, although viable, are sub-optimal. Plus, it seems to me that all of this is an unnecessary overcomplication. The txt file is actually an export from SAP BO, which the IT guy has access to through WEBI. Now, can't I just query the BO database through WEBI myself in a "dynamic" kind of way?
What I'm trying to say is: why can't I extract only bits of information at a time, on a need-to-know basis and directly from the primary source, instead of having all of the data transferred in bulk to a secondary/duplicate database?
Are these sorts of "dynamic" queries even possible? Or will the processing times hinder the success of my approach? I need this whole thing to feel really instantaneous, as if the data were already there and I were not actually retrieving it every time.
And most of all, can I do this through VBA? Unfortunately, that's the only thing I will have access to; I can't do this on the BO side.
I'd like to thank you guys in advance for whatever help you can grant me!
Webi (short for Web Intelligence) is a front-end analytical reporting application from Business Objects. Your IT contact apparently has created (or has access to) such a Webi document, which retrieves data through a universe (an abstraction layer) from a database.
One way that you could use the data retrieved by Web Intelligence as a source and dynamically request bits instead of retrieving all information in one go is to use a feature called BI Web Service. This will make data from Webi available as a web service, which you can then retrieve from within Excel. You can even make this dynamic by adding prompts, which put restrictions on the data retrieved.
Have a look at this page for a quick overview (or Google Web Intelligence BI Web Service for other tutorials).
Another approach could be to use the SDK, though as you're trying to manipulate Web Intelligence, your only language options are .NET or Java, as the Rebean SDK (used to talk to Webi) is not available for COM (i.e. VBA/VBScript/…).
Note: if you're using BusinessObjects BI 4.x, remember that the Rebean SDK is actually deprecated and replaced by a REST SDK. This could make it possible to approach Webi using VBA after all.
That being said, I'm not quite sure if this is the best approach, as you're actually introducing several intermediate layers:
Database (holding the data you want to retrieve)
Universe (semantic abstraction layer)
Web Intelligence
A way to get data out of Webi (manual export, web service, SDK, …)
Excel
Depending on your license and what you're trying to achieve, Xcelsius or Design Studio (BusinessObjects BI 4.x) could also be a viable alternative to the Excel front-end, thereby eliminating layers 3 to 4 (and replacing layer 5). The former's back-end is actually heavily based on Excel (although there's no VBA support). Design Studio allows scripting in JavaScript.
