I have been working on a QlikView dashboard for my senior management to use. The current build uses a simple Access database back-end to source all the tables loaded into the dashboard. However, due to our system limitations, if we want to host the dashboard on our intranet the back-end has to be switched to Excel.
Instead of creating multiple Excel files and loading them separately, I was thinking of putting all my tables into a single Excel file, with each sheet representing one table. By default, when you load an Excel file into QV it only reads the first sheet. Is there a way to get it to read all the sheets in that file?
Let me know your thoughts.
Regards,
Yasir
I saw a solution to this a few days ago, but I am not sure where that post is anymore or whether it still works. Regardless, here is what I remember:
Here is the usual single-sheet load; the format specifier looks like this:
(biff, no labels, table is [Table$])
But if you want to load all sheets, you have to loop over them and build the table yourself. In order to do this, make sure all sheets have the same format/column layout, and set the variables vFileName, vStartIndex and vEndIndex before running the loop. (Note that biff applies to .xls files; the example below loads an .xlsx file, so it uses ooxml with embedded labels, and it assumes the sheets are named Page 1, Page 2, and so on.)
// Set these to match your workbook (file name without the extension, and the sheet range).
SET vFileName = MyWorkbook;
SET vStartIndex = 1;
SET vEndIndex = 12;

// Create a dummy table; it will be used in the first concatenation.
Excel:
LOAD * INLINE
[DummyField];

// Loop over all the sheets and build the Excel table.
FOR index = $(vStartIndex) TO $(vEndIndex)
    Concatenate (Excel)
    LOAD *
    FROM [$(vFileName).xlsx]
    (ooxml, embedded labels, table is [Page $(index)]);
NEXT index

// The dummy field is no longer needed once all sheets are loaded.
DROP FIELD DummyField;
I had no previous experience in Access, VBA coding or Excel macros prior to teaching myself over the past month via these forums. Thank you, forums and contributors. I have enjoyed my Access learnings so far and the challenge they have provided, and I appreciate any help that I can get. As such, the code and methods that I have used to this point may well be convoluted and confusing. I will do my best to provide relevant details and accurate terminology.
I work in a lab and I am creating an Access form for semi-automated reporting. Samples are received from clients and are logged into the Excel table R&D Log, on the worksheet InProcess. Samples are sorted based on the site from which they originate and given a one- or two-letter site code (G, D, WH, etc.) and an ID "yy-000" in separate Excel columns (e.g. D 18-096). Samples may be submitted for multiple analyses (Metals, Water, Soil, etc.) and may even have multiple rows of reporting if multiple analytes are identified in the sample. There are several other columns, such as receipt date, reporting date, units, etc. Once samples are reported, I manually copy and paste them into the Archived worksheet, and delete the record and blank row from the InProcess worksheet. Since one sample may have multiple analyses and even more potential results, each record is reported on a new Excel row (with the same D 18-096 ID number). Thus, there is not a single unique identifier or primary key for each sample in the current format. R&D Log is updated manually by lab technicians, and the worksheet InProcess is a linked table in an Access database.
The Access database uses two combo boxes on a form, frmInProcess, to filter a query, qryInProcess, of the linked table. The combo boxes filter the report destination (one client may receive multiple site codes) and the analysis (reports are separated based on the type of analysis). The query also filters out blank results and blank dates, so only completed samples will appear on the filtered form. I have written VBA code to this point that will export the form to a .pdf, save the file with a unique filename, and open Outlook to mail out the report. I have also managed to export the filtered form frmInProcess to an Excel file, Access Test (not the linked file).
What I would like to do now is automate the transfer of completed test results from the Excel worksheet R&D Log: InProcess to R&D Log: Archived and delete the records from the InProcess worksheet. I am not sure if I can export the filtered form into a linked Excel table, or if I must use a separate Excel file (or if it even matters for simplicity of the code). I would like to read the exported filtered form in the Excel file Access Test, look up matching rows in R&D Log based on several criteria (site, ID, Analysis, Analyte, Report Date) and automate the transfer of records between the R&D Log worksheets. The end result would be that Access generates reports for completed tests, and the records are removed from InProcess testing and transferred to Archived testing in Excel. I am guessing that I may need to close the Access application and perform this part in Excel. I hope this is easy enough to follow.
Thank you.
In my experience, importing an Excel document into a temporary NEW (or totally empty) Access table is usually the easiest way to go. Then you do not have to worry about cell references like you do in Excel VBA. Even if the Excel document has old data in it with just a few new changes each time, importing it into a temporary Access table could be the simplest way to go, because then you can compare the data in this table with the data in another, permanent Access table and update the latter based on the former.
As for the original Excel file, if you need to delete rows there, it might be quicker to export a new Excel file with just the data the old one is supposed to end up with, and then use VBA to delete (or - safer! - rename) the old file.
So the development process goes something like this:
1. Save the import steps by first importing the Excel file via Access' ribbon options "External Data" (tab) -> "Excel". When you finish, be sure to check the "Save import steps" box and note the name you give the saved import, because you will need it in your VBA code.
2. In Access, write a function for deleting the table. The VBA code is:
' Delete the temporary import table if it already exists.
Const cTable = "MyExcelTempTable"
If TableExists(cTable) Then
    DoCmd.DeleteObject acTable, cTable
End If
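Note that TableExists is not built into Access; it is a small helper you write yourself. A minimal sketch could look like this:
' Helper: returns True if a table with the given name exists in the current database.
Public Function TableExists(ByVal strName As String) As Boolean
    Dim tdf As DAO.TableDef
    TableExists = False
    For Each tdf In CurrentDb.TableDefs
        If tdf.Name = strName Then
            TableExists = True
            Exit For
        End If
    Next tdf
End Function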
Now you can test your delete function on the data you imported.
3. Write VBA code to import the same spreadsheet to create the same table:
Const cSavedImport = "Import-MyExcelTempTable"
' Import the Excel file
DoCmd.RunSavedImportExport cSavedImport
4. Write more VBA function(s) to check the imported table for bad data and then to copy it into the permanent table. You might be updating existing records or adding new ones. Either way, you could use Access queries or SQL to do this and run them from VBA, for example as sketched below.
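A rough sketch of running such SQL from VBA (the permanent table and field names here are only placeholders based on the description above):
' Append completed rows from the temporary import table into a permanent table.
CurrentDb.Execute _
    "INSERT INTO tblArchive (SiteCode, SampleID, Analysis, Analyte, ReportDate) " & _
    "SELECT SiteCode, SampleID, Analysis, Analyte, ReportDate " & _
    "FROM MyExcelTempTable WHERE ReportDate IS NOT NULL;", dbFailOnError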
5. Write a VBA function to rename the old Excel file. (You could use an InputBox if the Excel file name is different each time. I do this for importing Excel files, and I set a default value so I do not have to type as much.) A one-line sketch follows.
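For the rename itself, VBA's Name statement is enough (the paths are only placeholders):
' Rename rather than delete the old workbook once the new one has been exported.
Name "C:\Lab\R&D Log.xlsx" As "C:\Lab\R&D Log - old.xlsx"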
6. Write a VBA function to export the new version of the Excel file, for instance along the lines sketched below.
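One way to do the export is with TransferSpreadsheet (the query name and path are only placeholders):
' Export a query (or staging table) holding the remaining rows to a new workbook.
DoCmd.TransferSpreadsheet acExport, acSpreadsheetTypeExcel12Xml, _
    "qryInProcess", "C:\Lab\R&D Log - new.xlsx", True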
7. Make yourself a button on a form that, when clicked, runs a VBA function. Inside that function, run Steps 2 through 6 above; a rough sketch of such a click handler is below.
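For example (every routine name here is a placeholder for one of the functions you wrote in the earlier steps):
' Hypothetical Click event for the button that runs the whole workflow.
Private Sub cmdRunReport_Click()
    DeleteTempTable        ' Step 2: drop the temporary import table
    ImportTempTable        ' Step 3: DoCmd.RunSavedImportExport
    CopyTempToPermanent    ' Step 4: validate and copy into the permanent table
    RenameOldExcelFile     ' Step 5: rename the old workbook
    ExportNewExcelFile     ' Step 6: export the new workbook
End Sub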
I am not sure my answer exactly matches what you are trying to do, but hopefully you get enough of a picture of the workflow to figure out the details of what you need.
I have an Excel 2016 workbook with 30 graphs based on PowerPivot. PowerPivot fetches the data from another Excel sheet, but I want it to get the data from a SQL Server table instead.
How can I change the data source type in PowerPivot? I've tried looking in the Excel XML without any luck. It would be a lot of work to re-create all the graphs again just to switch the data source.
Thanks
Dennis
One suggestion I would make for the future, if all the users are on 2016, is to use Power Query, which comes standard with that version of Excel. When Power Query loads data into Power Pivot, all Power Pivot cares about is the column names. This means the query can be switched between data source types without causing issues, as long as the same column names are kept.
As an example, I have one file that, based on a parameter flag, pulls data out of a series of Excel files on either a shared network drive or SharePoint. Those are two different data sources: the first opens a folder as the data source and then the Excel files listed within that folder; the other opens a SharePoint list as its data source and then navigates through the Excel files.
We're looking at allowing our customers to download an Excel file from our web application which contains a raw export of their data along with some basic charts and pivot tables based on that data.
The basic way we want this to work is that we have a fixed Excel file which contains all the reporting elements in one worksheet and has room for the underlying data in another worksheet. When the user requests their Excel report, we programmatically fill out the data worksheet with their results and send them the final Excel file.
Everything seemed a bit too easy when doing the prototyping with a fixed set of data. The dataset we worked with was added to the Excel Data Model and we then set up the charts and other reporting elements. However, when using that file as the template for the generated Excel file in our application, we are finding that the data model definition stays as it was - meaning that we built the "prototype" with a table definition of $A$1:$T$5879, but when generating the report, that definition isn't changed to cover whatever size the added dataset might have.
We're using EPPlus to generate our Excel sheets and have so far been unable to find any sort of solution to this kind of problem. This might very well be due to us being quite Excel novices. The goal is for the user experience to be that the charts and pivot tables contained in the Excel sheet reflect the total dataset contained in the Excel file, without them having to do anything.
OK, I've actually found a solution for this; it was right in front of us.
We define the dataset as a named range - this is done under the "Formulas" tab, inside the "Name Manager". We have a range which defines our dataset, and the "Refers To" field of a defined name can take a formula. So instead of giving it a fixed size, we use this: =OFFSET(Data!$A$1;0;0;COUNTA(Data!$A:$A);COUNTA(Data!$1:$1))
(The argument separators are semicolons because of our regional settings; use commas if that is what your locale expects.) This counts the number of rows and columns, with reference to A1 in our Data worksheet. All our pivots are set to refresh when the file is opened, and that seems to work.
I thought there would be a simple way of doing this, but unfortunately I have not come across one. My company has an Excel workbook with 12 sheets (1 for each month), into which I enter sales data as accounts are written. I reformatted each month's data into tables, thinking that this would provide an easy reference to gather the data into a pivot table that joins all the months and would be updated as I enter data; however, a pivot table based on multiple sets of data allows highly limited manipulation.
So what I want to do is create a new table that is automatically populated as I enter data in any of the 12 current tables, to combine them into a master listing. I have tried doing a query, but when I try to set up the data sources, it doesn't recognize my tables. I tried Power Query, but I couldn't get it to update the data as I updated the source. Consolidate also was not a useful feature, as it required all the data to be somehow calculated, and my columns need to simply be copied over, not summed or averaged.
As you can probably tell from my explanations and terminology, I'm no Excel expert. I don't know what VBA even is, let alone know how to use it, but I've seen it mentioned a lot, so I figure at some point in my life I should learn it.
Is there a formula or some other Excel 2010 feature that can automatically copy all of this data onto one running list, and keep it updating as I enter data in the source tables? It would have to run automatically.
I believe your end goal is to have a pivot table which consolidates data from each of the 12 individual sheets/tables, and not really to have the intermediate "single running list" which aggregates all 12 sheets.
If so, I suggest creating an Excel pivot table based directly on 'Multiple consolidation ranges'.
To start, create a new worksheet, select a cell (say A3) and press the key sequence Alt+D+P. This brings up the PivotTable and PivotChart Wizard; proceed using the third option, 'Multiple consolidation ranges'.
I will have to refer you to the site below for detailed step-by-step instructions: http://www.contextures.com/xlPivot08.html
Please be aware that the difficulty level for this solution is medium; I suggest you bookmark the instructions for maintainability reasons, in case you choose to implement it.
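If you do also want the single running list itself (and not only the consolidated pivot), another option would be a small VBA macro that rebuilds a master sheet from the monthly tables. A minimal sketch, assuming each monthly sheet holds one Excel Table with the same columns and that a blank sheet named Master already exists:
' Rough sketch: rebuild a "Master" sheet by copying every monthly table under one header.
Sub RebuildMasterList()
    Dim ws As Worksheet, lo As ListObject
    Dim wsMaster As Worksheet, nextRow As Long

    Set wsMaster = ThisWorkbook.Worksheets("Master")
    wsMaster.Cells.Clear
    nextRow = 1

    For Each ws In ThisWorkbook.Worksheets
        If ws.Name <> wsMaster.Name Then
            For Each lo In ws.ListObjects
                ' Copy the header row once, from the first table found.
                If nextRow = 1 Then
                    lo.HeaderRowRange.Copy wsMaster.Cells(1, 1)
                    nextRow = 2
                End If
                ' Copy the table body, if it has any rows.
                If Not lo.DataBodyRange Is Nothing Then
                    lo.DataBodyRange.Copy wsMaster.Cells(nextRow, 1)
                    nextRow = nextRow + lo.DataBodyRange.Rows.Count
                End If
            Next lo
        End If
    Next ws
End Sub
You could run it from a button, or from the Workbook_Open event, so the master list is rebuilt each time the file is opened.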
Short version: Is there any way/hack to use the embedded Data Model/PowerPivot cube of an Excel 2013/2016 file from another Excel file?
Long version:
We have a large Excel Data Model with >400k rows and >100 measures, feeding multiple reports (i.e. PivotTables on separate worksheets). As all of this is growing, we want to split it into one (large) data model and multiple report files. I know this could be done with SharePoint or Power BI; however, one of the key requirements is to be able to analyse the data offline. Hence, I'm trying to figure out any way to connect to the data model from another file....
There's no way that I know to do what you're asking. Is there any reason you can't just include all the reports in one workbook with the data model? Since you have to be able to analyze offline, anyway, everyone will need a local copy of the model. If the concern is just that there will be too many sheets in a single workbook, you could just put a thin veneer of VBA in it to hide and unhide sheets in groups for ease of use.
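For example, the hide/unhide part could be a small macro along these lines (the sheet names and grouping rule are only placeholders):
' Rough sketch: show one group of report sheets and hide the rest.
' Assumes a sheet named "Home" that always stays visible, and report sheets
' whose names start with the group name (e.g. "Sales 1", "Sales 2", ...).
Sub ShowReportGroup(ByVal groupName As String)
    Dim ws As Worksheet
    For Each ws In ThisWorkbook.Worksheets
        If ws.Name = "Home" Or ws.Name Like groupName & "*" Then
            ws.Visible = xlSheetVisible
        Else
            ws.Visible = xlSheetHidden
        End If
    Next ws
End Sub
A couple of buttons on the Home sheet calling ShowReportGroup with different group names would then switch between report groups.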
It looks like Microsoft has added an option to establish the connection via an ODC file.
See, for example: https://learn.microsoft.com/en-us/sql/reporting-services/report-data/use-an-office-data-connection-odc-with-reports?view=sql-server-ver15
However, it's not working out for me. I am using Excel 2016 and exported the data model connection from the file containing the data model as a separate ODC file, but when I try to add it as a connection in another file, I get the message that it can't open the file. It looks like creating an ODC file is not that straightforward.
Has anyone had similar issues?