We can access Excel data using ODBC (the Excel ODBC driver). Can we also access the data in the data model (i.e. Power Query tables)? Basically I am thinking about (mis)using Excel/Power Query as a database and letting an external application retrieve data from it (using SQL).
To read from Sheet1 I can do:
SELECT ... FROM [Sheet1$]
but
SELECT ... FROM [table in data model]
does not seem to work for me. Is this supposed to work or is this not supported at all?
There is a ton of information about Power Query using ODBC to import data. Here I am looking at the other way around.
First, distinguish between Power Query tables and Data Model (Power Pivot) tables. You can configure individual PQ queries to load into the Data Model, so data is "transferred" from PQ to the DM only for those particular tables.
I'm pretty sure it is impossible to get data out of "PQ-only" tables. You can only get the M queries themselves (not their results), via VBA or by unpacking the Excel file.
Regarding PP (DM) tables: there is actually an Analysis Services (VertiPaq) engine inside Excel (and, just in case, inside Power BI Desktop as well). So as soon as you start Excel or PBI, you actually start an AS engine instance too. The data in it is reachable via:
Excel VBA (Visual Basic for Applications). You have the ThisWorkbook.Model.DataModelConnection.* API and can get to the data itself as well as to the model. This is the only "official" way to get the data programmatically (see the sketch after this list).
Power Query, with the running Analysis Services instance as a data source. This is an unofficial way, but I have read that Microsoft said they are not going to close it in the future (though you never know :-)). For example, DAX Studio can do this - https://www.sqlbi.com/tools/dax-studio/.
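For the VBA route (the first option above), here is a minimal sketch. It opens an ADO recordset on the Data Model's own connection and runs a DAX query against it; the table name "SalesTable" and the output range are placeholders you would adjust to your model.

Sub QueryDataModel()
    Dim conn As Object   ' ADODB.Connection (late bound)
    Dim rs As Object     ' ADODB.Recordset
    ' Excel exposes its in-process Analysis Services instance here
    Set conn = ThisWorkbook.Model.DataModelConnection.ModelConnection.ADOConnection
    Set rs = CreateObject("ADODB.Recordset")
    rs.Open "EVALUATE 'SalesTable'", conn   ' any DAX table expression works
    ActiveSheet.Range("A2").CopyFromRecordset rs   ' dump the rows (no headers)
    rs.Close
End Sub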
Unfortunately, while getting to the PBI AS instance is quite easy, I don't know how to get to the Excel AS instance without DAX Studio. As far as I understand, the main problem is how to discover the port number of the AS instance launched by Excel. But I hope this info will at least help you understand where to search further if you want to go the Power Query way. Or maybe it is reasonable to use Power BI Desktop for the task.
An Excel file is just a zip archive, so the AS files are definitely inside it. I have never gone this way, but you can inspect what is inside the zip - the AS files may be there in some usable form.
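If you want to poke around, here is a rough VBA sketch (the paths are placeholders, and it works on a copy because Excel locks the open file). In recent file formats the model payload typically sits under xl\model inside the package:

Sub PeekInsideWorkbook()
    Dim sh As Object, itm As Object
    ' Work on a closed copy renamed to .zip so Shell can browse it
    FileCopy "C:\data\book.xlsx", "C:\data\book.zip"
    Set sh = CreateObject("Shell.Application")
    For Each itm In sh.Namespace("C:\data\book.zip\xl\model").Items
        Debug.Print itm.Name   ' e.g. item.data should hold the model
    Next itm
End Sub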
Related
I'm working on automating a lot of the data reporting in the business I work at.
It's all various tables originating from a central database and spread out across Excel workbooks.
I'm largely limited to MS office tools at the moment.
Power Query is a great deal faster than the current methods and easier to maintain.
I notice that a lot of the reporting uses the same results over and over again. As such, I can write a query and distribute to my coworkers in an ODC file or otherwise through a file server or Teams.
However, loading an ODC copies the raw PQ code into the file.
Which means any changes made to the master query have to be manually loaded into each file.
Is there a way to update Power Query code across multiple workbooks?
I'm trying to avoid writing database-level queries as much as possible. I have minimal support for that, would prefer not to freeze the system, and learning the IBM i series is a disproportionately large undertaking.
Store the M code in flat files in a OneDrive-synced folder, then load the queries dynamically using Expression.Evaluate. Chris Webb has a great article on this: https://blog.crossjoin.co.uk/2014/02/04/loading-power-query-m-code-from-text-files/
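If VBA is an option, the same flat files can also be pushed into each workbook directly through the WorkbookQuery API instead of Expression.Evaluate. A hedged sketch, assuming one .m file per query with matching names (the folder layout is made up):

Sub SyncQueriesFromFolder()
    Dim q As WorkbookQuery
    Dim fso As Object, ts As Object
    Dim folder As String, path As String
    folder = Environ("OneDrive") & "\SharedQueries\"   ' assumed layout
    Set fso = CreateObject("Scripting.FileSystemObject")
    For Each q In ThisWorkbook.Queries
        path = folder & q.Name & ".m"
        If fso.FileExists(path) Then
            Set ts = fso.OpenTextFile(path, 1)   ' 1 = ForReading
            q.Formula = ts.ReadAll               ' overwrite the query's M code
            ts.Close
        End If
    Next q
End Sub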
I have been assigned a new project where I need to prepare a Power BI report on top of Azure Analysis Services (a data mart). The flow is: Vertica DW -> Azure Analysis Services (via a tabular model) -> Power BI. I am pretty new to tabular models and Vertica.
Scenario:
1) The DW is in Vertica Platform online.
2) I am trying to build a data model using Analysis Services Tabular Project in VS 2019
3) This model will be deployed on Azure which will act as data source to PowerBI
4) I cannot select individual tables directly (from Vertica) while performing "Import from Data Source". I have to use a view here.
5) I have been given a single big table with around 30 columns as a source from Vertica
Concerns:
1) While importing data from Vertica, there is no option to "Transform" the data like we have in the Power BI Query Editor. However, when I tried to import a local file, the option was there.
2) With reference to scenario #5, how can I split the big table into various dimensions in Model.bim? Currently I am adding them as calculated tables. Is this the optimal way, or can you suggest something better?
Also, is there any good online material where I can get my hands dirty with modeling in an Analysis Services Tabular Project (I can do it very well in Power BI)?
Thanks in advance
Regards
My personal suggestion is to avoid Visual Studio like the plague. For this task it is not just unhelpful, it actively works against you.
Instead, use Tabular Editor. From there you can easily work with the Tabular Model.
I also suggest avoiding calculated tables as dimensions; instead, create several tables in Tabular Editor and simply modify each one's source query / fields.
In reference to the first concern, I believe there is some bug when connecting Vertica with Power BI; it works perfectly elsewhere, just not with this combination.
For #2, I found I can choose "Import new tables" from the connected data source; it can be found under the Tabular Editor view.
I need some help with improving Power BI performance when reading data.
I currently import data from an Excel sheet, a table with lots of different data types, and I was wondering if it is viable to change the data source, since it would have to be a one-man job.
Does Power BI have better performance importing from another data source? I'm considering Access because of the simplicity of the change. Using a proper database like SQL Server is on the table, but it wouldn't be as easy to do on short notice.
I would suggest using SQL Server Analysis Services (SSAS). You can create all the metrics outside of PBI and then import only the summarized statistics, down to the drill-down level you need.
Following up on my previous post, I need to be able to query a database of 6M+ rows in the fastest way possible, so that this DB can be effectively used as a "remote" data source for a dynamic Excel report.
Like I said, normally I would store the data I need on a separate (perhaps hidden) worksheet and manipulate it through a second "control" sheet. This time, the size (i.e. number of rows) of my database prevents me from doing so (as you all know, Excel cannot handle more than 1,048,576 rows).
The solution my IT guy put in place consists of holding the data in a txt file inside a network folder. So far, I have managed to query this file through ADO (slow, but no maintenance needed) or to use it as a source to populate an indexed Access table, which I can then query (faster, but requires more maintenance and additional software).
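For reference, the ADO route looks roughly like this (the folder, file and column names are made up; a schema.ini in the folder controls delimiters and column types):

Sub QueryTextExport()
    Dim conn As Object, rs As Object
    Set conn = CreateObject("ADODB.Connection")
    conn.Open "Provider=Microsoft.ACE.OLEDB.12.0;" & _
              "Data Source=\\server\share\exports\;" & _
              "Extended Properties=""text;HDR=Yes;FMT=Delimited"";"
    Set rs = CreateObject("ADODB.Recordset")
    rs.Open "SELECT * FROM [bo_export.txt] WHERE Region = 'EMEA'", conn
    ActiveSheet.Range("A2").CopyFromRecordset rs   ' land the filtered rows
    rs.Close: conn.Close
End Sub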
I feel both solutions, although viable, are sub-optimal. Plus, it seems to me that all of this is an unnecessary overcomplication. The txt file is actually an export from SAP BO, which the IT guy accesses through Webi. Now, can't I just query the BO database through Webi myself in a "dynamic" kind of way?
What I'm trying to say is: why can't I extract only bits of information at a time, on a need-to-know basis and directly from the primary source, instead of having all of the data transferred in bulk to a secondary/duplicate database?
Are these sorts of "dynamic" queries even possible? Or will the processing times hinder the success of my approach? I need this whole thing to feel instantaneous, as if the data were already there and I were not actually retrieving it every time.
And most of all, can I do this through VBA? Unfortunately that's the only thing I will have access to; I can't do this BO-side.
I'd like to thank you guys in advance for whatever help you can grant me!
Webi (short for Web Intelligence) is a front-end analytical reporting application from Business Objects. Your IT contact apparently has created (or has access to) such a Webi document, which retrieves data through a universe (an abstraction layer) from a database.
One way that you could use the data retrieved by Web Intelligence as a source, and dynamically request bits instead of retrieving all information in one go, is to use a feature called BI Web Services. This makes data from Webi available as a web service, which you can then retrieve from within Excel. You can even make this dynamic by adding prompts, which put restrictions on the data retrieved.
Have a look at this page for a quick overview (or Google Web Intelligence BI Web Service for other tutorials).
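To give an idea of what the Excel side could look like, here is a hedged VBA sketch posting a SOAP request to a published BI Web Service. The URL, operation name and envelope are placeholders; the real values come from the Webi publication dialog and the service's WSDL:

Sub CallBIWebService()
    Dim http As Object, envelope As String
    Set http = CreateObject("MSXML2.XMLHTTP")
    envelope = "<soapenv:Envelope xmlns:soapenv=""http://schemas.xmlsoap.org/soap/envelope/"">" & _
               "<soapenv:Body><GetReportBlock_MyBlock/></soapenv:Body></soapenv:Envelope>"
    http.Open "POST", "http://boserver:8080/dswsbobje/qaawsservices/biws", False
    http.setRequestHeader "Content-Type", "text/xml; charset=utf-8"
    http.send envelope
    Debug.Print http.responseText   ' parse this XML for the data block
End Sub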
Another approach could be to use the SDK, though as you're trying to manipulate Web Intelligence, your only language options are .NET or Java, as the Rebean SDK (used to talk to Webi) is not available for COM (i.e. VBA/VBScript/…).
Note: if you're using BusinessObjects BI 4.x, remember that the Rebean SDK is actually deprecated and replaced by a REST SDK. This could make it possible to approach Webi using VBA after all.
That being said, I'm not quite sure if this is the best approach, as you're actually introducing several intermediate layers:
1. Database (holding the data you want to retrieve)
2. Universe (semantic abstraction layer)
3. Web Intelligence
4. A way to get data out of Webi (manual export, web service, SDK, …)
5. Excel
Depending on your license and what you're trying to achieve, Xcelsius or Design Studio (BusinessObjects BI 4.x) could also be a viable alternative to the Excel front-end, thereby eliminating layers 3 and 4 (and replacing layer 5). The former's back-end is actually heavily based on Excel (although there's no VBA support); Design Studio allows scripting in JavaScript.
I have a ton of data in a SQL database which I can already import and display in Excel. I would additionally like to modify or append to the dataset within Excel and write the changes/additions back to the database.
What is the best way to go about doing something like this?
Please let me know, thanks!
One way to do this is via SQL Server's DTS/SSIS capabilities. Create SSIS packages for the Excel import and export and execute them as needed.
However, you still have the issue of people having to share this massive spreadsheet. You should consider importing the data into the db permanently and providing a WinForms interface for the data entry. You'd be surprised how quickly you could whip up an app with a data-bound grid view control that would give you decent, Excel-like ability to add/edit/delete table data.
Although Excel is great at displaying/reporting on data stored within a SQL DB, it has no built-in controls for updating the data.
I would recommend investigating VBA (Visual Basic for Applications) or, depending on your coding experience and the tools available to you, VSTO (Visual Studio Tools for Office).
This method will allow all of your users to share the spreadsheet at the same time and allow incremental updates plus validation of the data being entered by the user at the point they enter it.
All the usual gotchas apply though - mainly GIGO (Garbage In, Garbage Out). Correctly authenticate your users and control what they are allowed to update.
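As a starting point, here is a hedged VBA sketch of the write-back side: it pushes one edited row to SQL Server with a parameterised ADO command. The connection string, table and column names are placeholders:

Sub WriteRowBack()
    Dim conn As Object, cmd As Object
    Set conn = CreateObject("ADODB.Connection")
    conn.Open "Provider=SQLOLEDB;Data Source=myServer;" & _
              "Initial Catalog=myDb;Integrated Security=SSPI;"
    Set cmd = CreateObject("ADODB.Command")
    cmd.ActiveConnection = conn
    cmd.CommandText = "UPDATE dbo.Orders SET Qty = ? WHERE OrderID = ?"
    ' 3 = adInteger, 1 = adParamInput; values come from the edited row
    cmd.Parameters.Append cmd.CreateParameter(, 3, 1, , ActiveSheet.Range("B2").Value)
    cmd.Parameters.Append cmd.CreateParameter(, 3, 1, , ActiveSheet.Range("A2").Value)
    cmd.Execute
    conn.Close
End Sub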