I'm trying to build up a report of data in a SQL database using Excel. I am able to create connections and pull data into pivot tables and pivot charts.
However, I don't like having to create a new connection for every different query I want to run. Is it possible to have one single data source connection open to my whole database, and then specify a query for each pivot table? This could also potentially improve refresh performance, since Excel wouldn't have to open a new connection for each query.
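For example, something along these lines is roughly what I have in mind in VBA, with every pivot cache reusing the same connection string but getting its own SQL (the server, database, table, and pivot names here are just placeholders):

    Sub PointPivotsAtQueries()
        ' One shared connection string; each pivot cache gets its own SQL.
        ' Server, database, sheet, and pivot names are placeholders.
        Const CONN As String = _
            "OLEDB;Provider=SQLOLEDB;Data Source=MYSERVER;" & _
            "Initial Catalog=MyDatabase;Integrated Security=SSPI;"

        With ThisWorkbook.Worksheets("Sales").PivotTables("ptSales").PivotCache
            .Connection = CONN
            .CommandType = xlCmdSql
            .CommandText = "SELECT Region, Amount, OrderDate FROM dbo.Sales"
            .Refresh
        End With

        With ThisWorkbook.Worksheets("Costs").PivotTables("ptCosts").PivotCache
            .Connection = CONN
            .CommandType = xlCmdSql
            .CommandText = "SELECT Dept, Cost, CostDate FROM dbo.Costs"
            .Refresh
        End With
    End Sub

(Whether Excel actually keeps one physical connection open underneath, or still opens one per cache on refresh, is part of what I'm unsure about.)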
Are there perhaps better tools for graphing lots of different SQL queries?
Thanks!
My company has a use case for letting business users with no technical knowledge work with data in the Azure cloud. Back in the SQL Server days this was easily solved with OLAP cubes: you wrote a query for the data backing the cube, and business people could simply connect to the cube and consume the data as a pivot table. The only problem with large datasets there was compute (the larger the data, the slower the pivot table), not the row limit.
With the current Azure Synapse setup, Excel seems to try to download the entire data set and so always hits the 1M-row limit. Is there any way to use the data directly in a pivot table without bringing it into Excel in full? All of my tables are >1M rows.
UPD: You can load data directly into a pivot, but that pulls the data into RAM and the loading itself takes time. I'm looking for a cube-like solution, where the pivot table is available immediately and the querying happens only as you add fields and calculations to it.
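For context, this is roughly how I'm loading a table straight into the Data Model today (a VBA sketch; the workspace, database, and table names are made up, and the connection string would still need proper authentication settings):

    Sub LoadSynapseTableIntoDataModel()
        ' Pulls a Synapse table into the workbook Data Model rather than
        ' onto a worksheet, so the 1,048,576-row sheet limit doesn't apply.
        ' This is the "load data directly to Pivot" approach from the UPD:
        ' the whole table is still imported into memory up front.
        ThisWorkbook.Connections.Add2 _
            Name:="SynapseSales", _
            Description:="Sales fact table from Azure Synapse", _
            ConnectionString:="OLEDB;Provider=MSOLEDBSQL;" & _
                "Data Source=myworkspace.sql.azuresynapse.net;" & _
                "Initial Catalog=MyDatabase;", _
            CommandText:="SELECT * FROM dbo.FactSales", _
            lCmdtype:=xlCmdSql, _
            CreateModelConnection:=True, _
            ImportRelationships:=False
    End Sub

What I'd like instead is a connection that behaves like the old Analysis Services cube ones, where nothing is imported until fields are actually dropped onto the pivot.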
I am working with a pivot table in Excel that uses a connection to a SAS db. I want to join a few of my own tables to that connection so they can feed the pivot table; I've added them on separate tabs in the workbook and then used the Get Data feature to import them into the data model. However, it won't let me merge any of the new tables with the original connection. Is there a way to do this?
There is no way for me to alter the SAS db as it's owned by a different team in my organization.
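To make it concrete, this is the kind of relationship I've been trying to create between the SAS-backed table and one of my imported tabs (a VBA sketch; the table and column names are invented, and it assumes the SAS connection's table actually sits in the Data Model alongside mine):

    Sub RelateLocalTableToSasTable()
        ' Attempt to relate a table coming from the SAS connection to a
        ' local lookup table imported from a workbook tab via Get Data.
        ' All table and column names are placeholders.
        Dim mdl As Model
        Set mdl = ThisWorkbook.Model

        mdl.ModelRelationships.Add _
            ForeignKeyColumn:=mdl.ModelTables("SasOrders") _
                .ModelTableColumns("CustomerID"), _
            PrimaryKeyColumn:=mdl.ModelTables("MyCustomerLookup") _
                .ModelTableColumns("CustomerID")
    End Sub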
We need to update about 40 Excel data tables from queries in a single Access database, on demand. Ideally, Data > Refresh All should perform the update.
I’ll want to use VBA to set this up; a master Excel table identifies the data tables and corresponding query names.
What’s a clean way to design this? Should my VBA use ODBC, OLE DB, or ADO to create 40 database connections? (For maintainability, I'd prefer not to have a lot of VBA complexity, since our shop doesn't otherwise use these APIs -- just Excel with simple VBA.)
I’m running MS Office 2013.
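To make the design question concrete, the sort of loop I'm picturing is below. It only creates ordinary workbook connections through the ACE OLE DB provider, so once it has run, Data > Refresh All updates everything with no further code. The database path, the master table layout, and the assumption that each data table lives on a sheet of the same name are all placeholders:

    Sub BuildAccessConnections()
        ' Reads the master table (Excel data table name + Access query name)
        ' and creates one OLE DB query table per row.
        Const DB_PATH As String = "C:\Data\Reporting.accdb"
        Const CONN As String = _
            "OLEDB;Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" & DB_PATH & ";"

        Dim master As ListObject
        Set master = ThisWorkbook.Worksheets("Master").ListObjects("tblMaster")

        Dim r As ListRow
        For Each r In master.ListRows
            Dim tableName As String, queryName As String
            tableName = r.Range.Cells(1, 1).Value   ' column 1: Excel data table
            queryName = r.Range.Cells(1, 2).Value   ' column 2: Access query

            With ThisWorkbook.Worksheets(tableName).ListObjects.Add( _
                    SourceType:=xlSrcExternal, _
                    Source:=Array(CONN), _
                    Destination:=ThisWorkbook.Worksheets(tableName).Range("A1") _
                    ).QueryTable
                .CommandType = xlCmdSql
                .CommandText = "SELECT * FROM [" & queryName & "]"
                .ListObject.DisplayName = tableName
                .Refresh BackgroundQuery:=False
            End With
        Next r
    End Sub

My open question is whether building the connections this way is cleaner than driving ODBC or ADO directly from VBA on every refresh.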
I have a ton of records across 5 different tables in Access that I'm consolidating in one query.
I now want to connect that query to Excel and pivot the data it returns. Is that possible? If so, is there a way to do it without adding VBA? I was thinking of slowly exporting the consolidated data into a new table in Access, but with over 7 million records that will take some time. Any way to save myself that headache? When I connect the database to Excel, the only objects that come up are the 5 tables, not the query.
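If it turns out that a small one-time bit of VBA is the lesser headache, one route I can picture is pointing a pivot cache at the saved query through an OLE DB connection, so the 7 million rows never have to land on a sheet or in a new Access table (a sketch for a recent Excel; the path, query name, and destination are placeholders):

    Sub PivotFromAccessQuery()
        ' Builds a pivot cache directly from the saved Access query.
        ' Path, query name, and destination sheet are placeholders;
        ' Connections.Add2 needs Excel 2013 or later.
        Dim conn As WorkbookConnection
        Set conn = ThisWorkbook.Connections.Add2( _
            Name:="AccessConsolidated", _
            Description:="Consolidated query from Access", _
            ConnectionString:="OLEDB;Provider=Microsoft.ACE.OLEDB.12.0;" & _
                "Data Source=C:\Data\MyDatabase.accdb;", _
            CommandText:="SELECT * FROM [qryConsolidated]", _
            lCmdtype:=xlCmdSql)

        Dim pc As PivotCache
        Set pc = ThisWorkbook.PivotCaches.Create( _
            SourceType:=xlExternal, SourceData:=conn)

        pc.CreatePivotTable _
            TableDestination:=ThisWorkbook.Worksheets("Pivot").Range("A3"), _
            TableName:="ptConsolidated"
    End Sub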
I have a spreadsheet that is emailed to me by an outside vendor. It contains a bunch of pivot tables. I really couldn't care less about the pivot tables; I just want the underlying data. The data comes from a SQL Server that I don't have access to, but the data is stored within the spreadsheet. Is there any way I can access the data (I think it's in the PivotCache) directly, without drilling into one of the pivot tables?
I'd love some sort of ODBC/ADO.NET command that I can use from SSIS, but I'm open to just about anything that doesn't require me to open and save the workbook.
I'd also like to avoid macros if at all possible.
Here's the answer. Or at least what I did to resolve my problem.
There is really no way to access the underlying PivotCache data except via pivot tables, so direct access was out. I ended up using a script task with Excel OLE automation to dynamically create a pivot table with the data I needed at run time. Once the script task is done, I use the new pivot table as my data flow source. Excel can be a little tricky to automate, but it's worth it.
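For anyone hitting the same wall, the heart of that script task boils down to Excel automation calls like these (shown here as VBA for readability; the file path, the sheet names, and the drill-through trick are just one way to get the cached rows out):

    Sub DumpPivotCacheToSheet()
        ' Rebuilds a pivot from the workbook's existing PivotCache, then
        ' drills through its grand total so every cached record spills
        ' onto a new sheet that the data flow can read.
        Dim wb As Workbook
        Set wb = Workbooks.Open("C:\Inbox\VendorReport.xlsx")

        Dim ws As Worksheet
        Set ws = wb.Worksheets.Add

        Dim pt As PivotTable
        Set pt = wb.PivotCaches(1).CreatePivotTable( _
            TableDestination:=ws.Range("A3"), TableName:="ptDump")

        ' At least one data field is needed so a grand total exists.
        pt.AddDataField pt.PivotFields(1), "RowCount", xlCount

        ' Drill-through on the total cell writes the underlying records
        ' to a brand-new worksheet, which becomes the active sheet.
        pt.DataBodyRange.Cells(1, 1).ShowDetail = True
        wb.ActiveSheet.Name = "CacheData"

        wb.Save
    End Sub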