We are looking into HPC to improve processing on some of our larger, more complex spreadsheets. We have demoed this to some colleagues, and the area where they can see it being useful is a spreadsheet tool that, as part of its calculation routine, generates a 5-dimensional array (each dimension about 100 rows by 100 columns) to be used in later calculations. They asked whether HPC could be used to share the calculation load for this object rather than writing the results back to cells.
I've done some searches but am unable to find anything that covers this... does this mean it can't be done, or just that it's not really how HPC is intended to be used?
If you are referring to Microsoft HPC 2012 or 2008 R2, it can be used to do quite a number of things, from processing raw unstructured data to running Excel calculations.
http://msdn.microsoft.com/en-us/library/cc853440(v=vs.85).aspx
That link is for the 2008 R2 documentation (no new samples have been published for 2012). One of the samples is a spreadsheet that has its values calculated via HPC; if memory serves, you want to look at the MonteCarloCLI example in the Excel directory of the SP4 sample pack.
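If you end up going the workbook-offloading route instead, the general shape (going from memory of the HPC Services for Excel documentation, so treat the macro names and signatures below as a recollection to verify against the docs linked above, not gospel) is a handful of VBA callbacks that the cluster framework invokes: one to hand out work items, one to run each item on a compute node, and one to merge the results back into the workbook. A rough sketch only:

    ' Rough sketch - check the 2008 R2 docs/sample pack for the exact macro
    ' names and signatures expected by HPC Services for Excel.
    Public Function HPC_Partition() As Variant
        ' Hand back the next slice of the array to calculate,
        ' or Null when there is nothing left to farm out.
        HPC_Partition = Null
    End Function

    Public Function HPC_Execute(data As Variant) As Variant
        ' Runs on a compute node: do the heavy calculation for one slice.
        HPC_Execute = data
    End Function

    Public Sub HPC_Merge(data As Variant)
        ' Runs back on the client: write the calculated slice into the workbook.
    End Sub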
I have 13 long data sets for different variations of an instrument. For a particular set of parameters (2 parameters, say X collectively), I have a series of formulas that calculate certain outputs (26 outputs, 2 for each data set, say Y collectively) in a table.
Now I build a new table populated with about 20 sets of all the different Xs and all the resultant Ys, using a method suggested here, wherein the INDEX function and the data table feature of Excel are used along with 2 selectors (1 for the parameter set and 1 for the output set) to create the desired multi-variable table from a table full of different parameters.
I have used this method a few times before without any problems. This time, however, the data tables are too long and each output calculation needs to go through all of them using multiple INDEX functions. This is causing Excel to hang a lot. For a single set of parameters it works just fine, but as soon as I try to populate my multi-parameter table it starts acting up, so badly that my PC almost freezes.
Any suggestions on how this process can be made faster? I had seen some options for multi-variable tables and found this one convenient. In terms of computational complexity, is this an inefficient method, and is there a more efficient one?
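For reference, I know Excel has a calculation mode that leaves data tables out of automatic recalculation ("Automatic Except for Data Tables"); a minimal VBA sketch of toggling it, so the data table only recalculates once everything is in place, would be something like:

    ' Minimal sketch: stop data tables recalculating on every change while the
    ' parameter table is being set up, then force one full recalculation at the end.
    Sub RecalcDataTablesOnce()
        Application.Calculation = xlCalculationSemiautomatic  ' "Automatic Except for Data Tables"
        ' ... set the selectors / populate the parameter sets here ...
        Application.Calculation = xlCalculationAutomatic      ' back to normal
        Application.CalculateFull                             ' one full recalculation, data tables included
    End Sub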
I tried using Excel Online, hoping that Microsoft would use its own computing power there, but it acts weirdly online - random deletion of cells and stuff. Has anyone had success with Excel Online for this kind of computation problem? (I have an Office 365 subscription.)
My PC specs:
Intel Core i7 5500U
16GB DDR3 RAM
Windows 10 x64
Let me know if any other information is required. Thank you in advance.
Edit 1: A minimal example is best represented on this page - https://www.mathscinotes.com/2016/10/excel-data-table-with-more-than-two-input-variables/
My own sheet uses exactly the same method, except with a higher number of parameter sets and output sets and a larger amount of data to create the outputs from.
Edit 2: As suggested by BigBen, I am attaching a screenshot of my Excel sheet in an attempt to explain my own formulas. Screenshot of my Excel sheet's formulas
In the event that the link I posted dies, I am also attaching the image of the example that I had previously mentioned. Mark Biegert of Math Encounters' example
I have never used Visual Basic before but could do with a pointer on where to begin.
I have 750 Excel spreadsheets that contain various amounts of data of different types. The columns are always the same, but the number of data rows varies per spreadsheet. I need to extract data and put it into two new spreadsheets.
Obviously, doing this 750 times manually would be a nightmare. I just want to run a script that can do it for me, and thus thought of Visual Basic, although I've never used it before.
My specific questions are:
What type of command should I research that would allow me to copy data where the starting row number varies (as the data above varies in number of rows)? There is a title before this new data - how can I get the script to search for this title and then pick the row below it?
Would all my spreadsheets have to be in one folder so that the script goes through them all, or can I have some kind of folder structure within that folder too?
Can anyone recommend any good resources to help me get to grips with Visual Basic and grasp what I need to do?
Thanks
Tom
So the compilation task got easier with the introduction of MS Power Query. If you are using MS Excel 2013 or later, you may already have this; if not, you can download the add-in from Microsoft.
The following guide outlines how to use Power Query to combine data from multiple Excel files into one table. With Power Query (PQ), MS has made this kind of aggregation possible with a few simple button clicks; PQ is a lightweight alternative for a lot of tasks that used to require VBA.
In this example, you use PQ to point to an entire folder's worth of commonly formatted Excel files (750 should be no problem). The only limitation is that each data file should have a similarly named tab.
I won't repeat the details of the guide for how to do it, as it is in-depth and visual. But if you run into issues, get in touch.
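That said, if you do want the VBA route the question asks about, the two things to research are Dir (to walk every file in a folder) and Range.Find (to locate the title row so you can start copying from the row below it). A rough sketch, in which the folder path, destination sheet and title text are placeholders you would need to change:

    Sub ExtractFromAllWorkbooks()
        Dim folderPath As String, fileName As String
        Dim wb As Workbook, src As Worksheet, titleCell As Range
        Dim dest As Worksheet, nextRow As Long

        folderPath = "C:\Data\Spreadsheets\"             ' placeholder folder holding the 750 files
        Set dest = ThisWorkbook.Worksheets("Extract1")   ' placeholder destination sheet
        nextRow = 1

        fileName = Dir(folderPath & "*.xls*")            ' loop every Excel file in this one folder
        Do While fileName <> ""
            Set wb = Workbooks.Open(folderPath & fileName, ReadOnly:=True)
            Set src = wb.Worksheets(1)

            ' Find the title that marks the start of the data, then copy everything below it
            Set titleCell = src.Columns(1).Find("My Title", LookAt:=xlWhole)   ' placeholder title text
            If Not titleCell Is Nothing Then
                src.Range(titleCell.Offset(1, 0), src.Cells(src.Rows.Count, 1).End(xlUp)) _
                    .EntireRow.Copy Destination:=dest.Cells(nextRow, 1)
                nextRow = dest.Cells(dest.Rows.Count, 1).End(xlUp).Row + 1
            End If

            wb.Close SaveChanges:=False
            fileName = Dir                               ' move on to the next file
        Loop
    End Sub

Note that Dir as used here only looks in the one folder; handling subfolders as well would need a recursive loop (e.g. via FileSystemObject) on top of this.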
I have loads of Excel files with huge amounts of numbers, graphs and formulas. What I would like to do is store everything in a database. It could be Oracle, MS SQL or MySQL. The main goal is to have all the Excel data in one place and to create graphs, tables, comparisons and so on there. Is there a product/software online (free or not) for this kind of system? I could edit it, but I don't want to make it from scratch.
I need a bit of help. My company has data in multiple Excel sheets. Some sheets are straightforward (in that they map easily to data types), but most of them have merged rows and cells, etc., within one header. I am developing an application in C# for maintaining a massive database with proper user and role management and multiple departments as stakeholders.
I have identified the relations from within the Excel sheets and all is well. What I cannot understand is how to import the historical data so that it maps to the data tables. Basically, when a new custom system is designed, how would you import such messy data into it?
The only thing I could think of was writing a utility program that reads every row and every cell of the Excel sheets and then extracts the required values to insert into the proper database table. But this would take ages due to the sheer number of Excel sheets.
I'm wondering whether any of you have been through the same thing. How did you, or how would you, handle this?
Many thanks guys :)
If the data is not regular, you've got a world of pain ahead of you. The object model in Excel can be driven by any of the Windows Scripting Host languages, like VBScript and JScript. In fact, most scripting languages have some support for Excel traversal.
However, as you've already noted, the data layout isn't the same across all spreadsheets. I had a similar problem years ago traversing SCADA data from power stations: each station did it slightly differently and changed their format from time to time.
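To give a concrete idea of what the traversal looks like, here is a rough VBA sketch that walks one sheet, resolves merged cells back to their top-left anchor and spits out flat rows; the real version would insert into a staging table instead of printing, and the per-sheet layout handling is where the pain lives:

    ' Rough sketch, assuming one data sheet per file: walk the used range and
    ' produce flat rows that a later step could insert into a staging table.
    Sub FlattenSheet(ws As Worksheet)
        Dim cell As Range, rowText As String
        Dim r As Long, c As Long
        With ws.UsedRange
            For r = 1 To .Rows.Count
                rowText = ""
                For c = 1 To .Columns.Count
                    Set cell = .Cells(r, c)
                    If cell.MergeCells Then Set cell = cell.MergeArea.Cells(1, 1)  ' use the merged block's anchor value
                    rowText = rowText & cell.Text & vbTab
                Next c
                Debug.Print rowText      ' stand-in for the real "insert into database" step
            Next r
        End With
    End Sub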
I'm creating an Excel dashboard that imports a variable number of months' worth of financial/accounting information from a database into an Excel sheet. Using this information, I have a Calculations sheet that computes some financial indicators, again month by month. Finally, this information is displayed in graphs on a separate sheet (one indicator per graph, with the monthly information plotted to show the trends). Currently I have written VBA code that formats the sheets to accommodate the number of months requested, pulls the data from the SQL server, and updates the graphs. Since there are 53 indicators for each operation (6 operations), this process takes about 3 minutes.
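For concreteness, a stripped-down sketch of the kind of pull I mean (the server, database and query names here are placeholders, not the real ones):

    ' Simplified sketch - placeholder connection string and query.
    Sub PullIndicatorData()
        Dim conn As Object, rs As Object

        Set conn = CreateObject("ADODB.Connection")
        conn.Open "Provider=SQLOLEDB;Data Source=MyServer;" & _
                  "Initial Catalog=MyDb;Integrated Security=SSPI;"

        Set rs = conn.Execute("SELECT MonthEnd, Indicator, Value FROM dbo.IndicatorValues")

        ' One bulk write per result set instead of looping cell by cell.
        Worksheets("Calculations").Range("A2").CopyFromRecordset rs

        rs.Close
        conn.Close
    End Sub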
Does anyone recommend a better way to do this? The current way 'works' but I've often thought that there must be a more efficient way to do this.
Thanks!
Chris
You could look at skipping the Excel part and using SQL Server Reporting Services (SSRS). If you have ever used Business Objects or Crystal Reports, it's kind of the same thing, and I would imagine it would offer better performance than doing things in Excel.