How to make Excel run in multi-CPU mode

I have big Excel files: thousands of rows and columns, gigabytes in size.
When I work inside these files in Excel, it throttles and lags. Sometimes it just gets stuck and freezes.
When I open Task Manager, I see that Excel isn't even using one full CPU.
RAM usage isn't overloaded either.
How can I make Excel use all my cores?
Excel 2019.

I see you are familiar with Python. Why don't you move your data over to Python? Actually, I don't know its capabilities; I work in R. Some of my data in Excel, and even in *.csv, takes 100-200 MB, while in *.Rdata format the same information is less than 5 MB. R functions work faster than heavy Excel.
And yes, how to make Excel use all the cores of the processor is interesting to me too.
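On the multi-core question itself: Excel 2019 already supports multithreaded recalculation, but the feature can be switched off under File > Options > Advanced > Formulas. The same switch is reachable from VBA. Here is a minimal sketch (the Sub name is mine); note that this only parallelizes formula recalculation, so things like most VBA macros and file loading will still pin a single core:

Sub EnableMultiThreadedCalc()
    ' Turn multithreaded recalculation on and let Excel pick the thread count
    With Application.MultiThreadedCalculation
        .Enabled = True
        .ThreadMode = xlThreadModeAutomatic
    End With
    ' Force a full recalculation to see the effect
    Application.CalculateFullRebuild
End Sub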

Related

Jet.OLEDB or ACE.OLEDB MS Access

I'm using Excel VBA to pull data from an MS Access DB - this is using Excel 2013 and Access 2013, 32-bit. The code historically has used:
Provider=Microsoft.Jet.OLEDB.4.0;
However, some computers have been upgraded to Excel 2016 64-bit, and the Jet provider is not available for 64-bit. I have changed the code to:
Provider=Microsoft.ACE.OLEDB.12.0;
which works on both 64-bit and 32-bit systems. However, I have noticed a significant speed drop in loading/saving data just from changing this line. Does anyone know why this might be and how I can improve it?
You are correct that you have to choose the ACE provider for 64-bit.
And the big advantage of Jet was that it was (and still is) installed on all copies of Windows by default, so there is no need to install Access, the runtime, or, previously, the Office connectivity package.
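As an aside, one way to keep Jet on 32-bit installs while falling back to ACE on 64-bit is VBA's conditional compilation; a sketch (the constant name is mine):

#If Win64 Then
    ' 64-bit Office: Jet is unavailable, so use ACE
    Const PROVIDER As String = "Provider=Microsoft.ACE.OLEDB.12.0;"
#Else
    ' 32-bit Office: keep the Jet provider
    Const PROVIDER As String = "Provider=Microsoft.Jet.OLEDB.4.0;"
#End If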
As for performance? There have been a few comments about performance with regard to ACE x64.
However, one trick or suggestion is to ensure that the connection stays open. In other words, are you sure the row processing itself is slow, or is it the overall time?
(Perhaps put a timing test in your code.)
E.g.:
Dim T As Single
T = Timer()
' your code here
Debug.Print Timer() - T
The above will spit out the elapsed time to the debug window (in the VBA IDE, hit Ctrl+G to display the Immediate/Debug window).
The reason I suggest the force-open idea is that you often find ACE takes a VERY long time to open. But once open, the data reading has good performance (same as before).
So, I suggest you check and try this fix.
Open a table (any table) and KEEP it open. Now run your existing code (which may well open + close other tables). The issue is that when ACE attempts to open a table, it tries to put locks on the mdb/accdb file, and it is this process that takes a VERY long time.
However, if you force (keep) open one table, then this very slow process of ACE attempting to lock the file for read/write does not occur each time you execute a query or create additional recordsets in code.
So, if the row reading speed is fast, but the time to START + open is very slow, then before you run + test your routines, force open a table into some recordset (keep it active and in scope), and THEN try your code.
I find that 9 times out of 10 this eliminates the slowdown, and often the results are nothing short of spectacular (it will run faster than before!).
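Here is a minimal sketch of that keep-a-table-open trick using late-bound ADO (the database path and table name are placeholders):

Dim cn As Object, rsKeepOpen As Object
Set cn = CreateObject("ADODB.Connection")
cn.Open "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\path\to\db.accdb;"
' Open any table and leave the recordset in scope for the whole run,
' so the slow file-locking handshake happens only once
Set rsKeepOpen = cn.Execute("SELECT TOP 1 * FROM SomeTable")

' ... run your existing queries against the SAME cn object here ...

rsKeepOpen.Close
cn.Close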

VBA - Export Image from Excel *without* using Clipboard (Copy/Paste)

There are a lot of great examples of how to take an Excel range, create an image from it, and save it to the drive. Here is one: Export pictures from excel file into jpg using VBA
This works great on a small scale, but when you try to run this through 3,000 or more iterations, a "memory leak" caused by the repeated use of the clipboard eventually erodes the process and the macro fails somewhere along the way. This occurs even when running 64-bit Excel on a powerful machine (50+ GB of RAM).
Are there any ways to do this without using the clipboard? My first thought was to try to fix the memory-leak issue, but all of those attempts have been unsuccessful. For context, I'm basically using the exact code provided in the solution at the link above (with a couple of added features to try to reduce memory leaking, like auto-saving the workbook after every 100 images, etc.).
I'm also looking for what you mentioned; here's how to do it with a chart:
Dim file As String ' the path to the saved image, in the temp dir
file = Environ$("temp") & "\chart.gif"
Sheets("Sheet1").ChartObjects(1).Activate
Sheets("Sheet1").ChartObjects(1).Chart.Export Filename:=file, FilterName:="GIF"
There was ultimately no solution for the memory leak; it seems to be a systemic problem with VBA.
For those trying to programmatically generate charts, it is much easier to build them in PHP.

Excel 2016 Upgraded From 2007; Any Workarounds for Apparent Memory Issue?

This isn't really a programming question per se, but at work we have been forced to upgrade from Excel 2007 to Excel 2016, which has caused some productivity issues with respect to opening multiple workbooks at once.
The problem is that our entire file system has a bunch of linked formulas and iterative calculations where it is necessary to have multiple workbooks open at once (around 60+). Previously, with Excel 2007, we were able to do this easily on relatively low-powered computers (4 GB of RAM, with a mediocre hyperthreaded i3 processor), but after the change to Excel 2016 we keep getting an error telling us to "upgrade to 64-bit Excel" or "install more physical memory". The error message can be seen here. We have already upgraded to 64-bit Excel, and on top of that, I tried using co-workers' computers with 8 and 16 GB of RAM, which made little to no difference. Does this imply that Excel 2016 is not well optimized for having this many workbooks open at the same time? Are there solutions or workarounds for this problem? I find it hard to believe that 16 GB of RAM is still insufficient when 2007 handled this process easily, though perhaps the change from MDI to SDI means less efficiency?
As an aside: yes, I do wish we were not using Excel for such a computationally expensive process that it may not have been designed for, but moving everything to something like SAS would take a lot of time, I'd imagine, and ultimately I don't make the decisions :(
Thanks for any help.
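One mitigation worth trying (not from this thread, just a hedged sketch; the file paths are placeholders): open the linked workbooks with recalculation, screen updating, and link updates suspended, then do a single full recalculation at the end, so Excel doesn't rebuild the dependency chain sixty times over:

Sub OpenLinkedBooksQuietly()
    Dim i As Long
    Application.Calculation = xlCalculationManual ' no recalc on every open
    Application.ScreenUpdating = False
    For i = 1 To 60
        ' UpdateLinks:=0 defers link refresh until everything is loaded
        Workbooks.Open "C:\models\book" & i & ".xlsx", UpdateLinks:=0
    Next i
    Application.Calculation = xlCalculationAutomatic
    Application.ScreenUpdating = True
    Application.CalculateFullRebuild ' one full pass instead of sixty
End Sub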

Slow xlsread in MATLAB

Here is the result of a profiled simulation run of my MATLAB program. I need to run this simulation a very large number of times (~100,000).
Thus I need a faster way to read the Excel file.
Specifications: the Excel file is 10000x2 cells, and each simulation run reads one such sheet from each of 5 separate Excel files.
UPDATE: I put xlsread in 'basic' mode and also reduced the number of calls by combining my input into a single file. The next target is xlswrite. Ah, that sinking feeling. :|
NOTE: Although writing to a CSV file using dlmwrite is very fast (around 20 times faster), I need the convenience of separate sheets that an .xls file provides.
I don't think you would be able to wring much out of xlswrite if you need Excel sheets as the output.
How about parallelizing?
Do you have access to the Parallel Computing Toolbox? Or maybe you can run two instances of MATLAB if your box supports it. If so, you could consider two approaches:
Have the first process do the xlsread part and the simulation part, then write to MAT-files/plain binary/CSV, whichever is fastest while preserving your data integrity. Have another process convert the MAT-files/intermediate data files into Excel using xlswrite.
Have N MATLAB instances/workers (N depends on your physical machine's capacity). Parallelize the whole read-process-write pipeline across the N workers. Note, I am not sure how well Excel scales when called by N workers! (xlswrite uses ActiveX/MS Excel to write the data.)
As with any parallel approach, your mileage will vary with the complexity of the simulation vs. the required file I/O and its performance.

Does Monitoring An Excel Spreadsheet Via RDP Make It Slower?

We have a massive spreadsheet which does a lot of calculations and not much drawing/writing to the sheets.
My question is: does monitoring the spreadsheet while it is running via RDP actually make it slower?
Put differently, if RDP were disconnected, would that result in improved speed?
I've actually done a lot of work from home via Remote Desktop that involved an Excel Workbook (and Access Applications) doing lots of hefty calculations. From my experience, I didn't notice any slowdown in the calculations on the Excel sheet, but occasionally the connection would slow and anything that refreshed the screen a lot would make the PC difficult to use.
The most important thing, however, is to write code that modifies the visual elements of the screen as little as possible. For example, instead of looping through a bunch of cells and setting each one as the active cell to read its value, loop through a set of range values that don't require the sheet to refresh. This, by far, has produced the biggest performance boost in my VBA code.
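A sketch of that contrast, assuming Sheet1 is the active sheet and the range address is just an example:

Dim i As Long, total As Double, v As Variant

' Slow: selecting each cell forces screen activity on every iteration
For i = 1 To 10000
    Cells(i, 1).Select
    total = total + ActiveCell.Value
Next i

' Fast: read the whole range into a Variant array and loop in memory
total = 0
v = Range("A1:A10000").Value
For i = 1 To UBound(v, 1)
    total = total + v(i, 1)
Next i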
If your code is already fairly optimized, you'll probably not see any difference monitoring it over RDP. However, if monitoring is your issue, you ought to consider outputting data to a separate Excel or text file stored on a shared server. Done correctly, I imagine that would have a smaller impact on your CPU than RDP, and it still allows you to monitor the progress of the Excel application without having to log in.
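A sketch of that progress-file idea (the share path is a placeholder, and batchNumber stands in for whatever loop counter your code uses):

Dim f As Integer, batchNumber As Long
f = FreeFile ' grab the next free file handle
Open "\\server\share\progress.log" For Append As #f
Print #f, Now & " - finished batch " & batchNumber
Close #f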
Just look at the CPU usage of Excel and the RDP server. If Excel isn't getting its 100% while calculating, or if the RDP server seems to be using too much, then yes, RDP is making things slower.
