Excel automatic calculation crashes, but manual calculation works

I have a relatively large (7MB) but more importantly formula and macro heavy excel file that I have been developing recently.
For the end user it must be able to be used in Automatic Calculation mode.
Calculating the whole workbook in Manual Calculation mode using F9 (Calculate Now) works and completes in ~10 seconds, but when I turn on Automatic mode, it calculates repeatedly (i.e. reaches 100% and then immediately starts again from 0%) and so freezes indefinitely.
So far I have tried:
Putting break points in all the VBA macros to see if it is hanging inside a macro
Removing all of the macros from the file (I was worried one of them was being triggered to run repeatedly in automatic mode)
but neither has worked, leading me to wonder whether the issue is in fact VBA-related at all.
Does anyone have any ideas about:
What might be causing this?
How I might diagnose the cause?
Happy to give more context if helpful. I am a relatively experienced Excel user, and while I generally don't write VBA macros from scratch, I am pretty confident repurposing code I inherit or find online. I am using a relatively powerful 6-core machine, but have tested the file on others with the same results.

The issue turned out to be a volatile action that a macro was triggering. I had a macro that looped through hiding empty rows, but hiding a row is a volatile action, so it triggers a new calculation before the next pass through the loop. The loop ran 500 iterations, so that meant 500 sets of 3-second calculations!
See this link for more on volatile actions:
https://msdn.microsoft.com/en-us/library/bb687891(v=office.15).aspx
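A common way to avoid this is to suspend automatic calculation (and screen updating) while the loop runs, then restore the previous settings, so the 500 deferred recalculations collapse into one. A minimal sketch; the sheet name and loop bounds here are invented for illustration:

```vba
Sub HideEmptyRows()
    Dim prevCalc As XlCalculation
    Dim r As Long

    prevCalc = Application.Calculation
    Application.Calculation = xlCalculationManual   ' stop each Hide from triggering a recalc
    Application.ScreenUpdating = False

    On Error GoTo Cleanup
    With ThisWorkbook.Worksheets("Data")            ' hypothetical sheet name
        For r = 1 To 500
            If Application.WorksheetFunction.CountA(.Rows(r)) = 0 Then
                .Rows(r).Hidden = True              ' volatile action, now deferred
            End If
        Next r
    End With

Cleanup:
    Application.ScreenUpdating = True
    Application.Calculation = prevCalc              ' one recalculation instead of 500
End Sub
```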

Looped .Refresh causing slow Excel

I am trying to use VBA in Excel for Mac to query a database. A simple query works fine and I've been using it for years. Now I'm getting into more complex queries, where the results of one query (about 1,000 records) are used sequentially to query the database again (so about 1,000 consecutive queries). The results (about 5,000 records) return in about 2.5 minutes, but clicking in the Excel sheet is not responsive for about a minute afterward. This behavior continues for another 2 minutes or so before clicking becomes pretty much instantaneous. Running the macro again gives similar results but slower. The 3rd time is even slower. I suspect a memory leak. Restarting Excel makes the problem reset. Here is the code for the actual query:
With ActiveSheet.QueryTables.Add(Connection:=strURL, Destination:=Range(strStartCell))
    .PostText = "user=" & strUserName & ";password=" & strUserPassword
    .RefreshStyle = xlOverwriteCells
    .SaveData = True
    .BackgroundQuery = False
    .Refresh
End With
I've tried to actually send just the first query (so it gets data from which the other queries can be built). The rest of them, I build the query, but don't send it. So, in this experiment, the above code got used only once. Running it this way, it comes back in about 8 seconds. So the other 2 minutes and 20+ seconds are back and forth between my computer and the database. More importantly, after running this way, there is no lag after it is done running. So it seems like if it is a memory leak, the leak is in the query process, or maybe the actual writing of the data.
I have programmatically turned off all of the screen updating, page break showing, and calculating at the beginning and returned them to the original settings at the end.
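That save-and-restore pattern can be sketched roughly like this (a generic sketch, not the asker's actual code; variable names are invented):

```vba
Dim prevCalc As XlCalculation
Dim prevScreen As Boolean, prevBreaks As Boolean

' remember the original settings
prevCalc = Application.Calculation
prevScreen = Application.ScreenUpdating
prevBreaks = ActiveSheet.DisplayPageBreaks

' turn everything off while the queries run
Application.Calculation = xlCalculationManual
Application.ScreenUpdating = False
ActiveSheet.DisplayPageBreaks = False

' ... run the ~1,000 queries ...

' restore the original settings
Application.Calculation = prevCalc
Application.ScreenUpdating = prevScreen
ActiveSheet.DisplayPageBreaks = prevBreaks
```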
My computer is a Mac mini (Late 2014) 3GHz Intel Core i7 with Office 2011, but I've tried running it on a newer M1 with the newest version of Excel also. It was much faster, but the lag after the results were returned, though shorter, was still a problem. My computer is representative of where the spreadsheet will be run for the near future.
The lag afterwards is really going to kill this part of the project. Has anybody seen this problem before? Is there something I can do to trace what is causing the problem and if there is a way around it?
ChatGPT solved the crux of my issue. It pointed out that I'm creating a separate Query Table for each of my 1000+ queries and those are eating up my memory, causing my application to become slow. It said that if I add a .Delete line after the With... End With block it would wipe out the previous Query Table and not use up all that memory. The answer was slightly inaccurate. The .Delete needed to go inside the block (at the end). Anyhow, I tried that and it made my code somewhat slower, but when it was done, the application was not slow. That is a win in my book.
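With the correction described (the .Delete moved inside the block, after .Refresh), the query code would look like this. A sketch based on the code shown above; QueryTable.Delete removes the query definition, while the returned data stays on the sheet:

```vba
With ActiveSheet.QueryTables.Add(Connection:=strURL, Destination:=Range(strStartCell))
    .PostText = "user=" & strUserName & ";password=" & strUserPassword
    .RefreshStyle = xlOverwriteCells
    .SaveData = True
    .BackgroundQuery = False
    .Refresh
    .Delete   ' discard the QueryTable object so it doesn't accumulate; the data remains
End With
```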
It further suggested that ADO might do a better job at this task. So I guess that is something else to learn. Thought this might be useful for someone else experiencing a similar issue.

Getting reasonable performance from Excel while receiving RTD feeds and avoiding blocking

I am handling RTD feeds again and remembering the difficulties, but now we have multi-core machines and multi-threading, so maybe someone can advise.
As I understand/remember it: the RTD server does not push data into Excel directly (for obvious reasons); it sends a friendly nod to say "your parcel is ready, come and get it". Then, when Excel has done its nails and feels in the mood, it might get the data.
So this kind of architecture is right back on the dance-floor and hopefully I can make it work.
Problem.
I need to run a bot that examines the data and responds accordingly:
1. Looping, or even running a WinAPI timer event, is still enough to keep Excel busy so that no data arrives, or so it seems. Executing bot logic, however small, will definitely bring on an Excel fainting fit.
2. I tried responding via the calculation event. Very hit-and-miss and definitely not up to the job. There is no obvious logic as to when and why it fires or does not, other than a "bad hair day".
3. I tried a WinAPI timer looking at the newly acquired data every second, comparing it to the old data held in a separate structure, running some EMAs and making a decision. No dice. The timer alone is enough to put a delay of up to 10 or even 20 seconds between occasional deliveries of data.
Options I am thinking about:
1. Run the timer outside of the Excel environment, looking in at the data, e.g. an add-in via the PIAs etc. What I don't know is whether such an add-in, perhaps in C# or VB.NET, could utilise multithreading (via Tasks, I think) and do its bit without "scaring the panties off her ladyship"?
2. I remember hearing that XLL UDFs can be asynchronous; does anyone know if this is a potential option?
Any ideas?

Run-time Error -2147417848 (80010108) appears for big amounts of data to process

I have a problem with the Excel macro I created. The macro is quite complicated (7 modules with ~2500 lines of code) and it's used to automatically assign pallets to trucks and sort them properly. It works perfectly for 99% cases but when recently it had to assign more than 1200 pallets for one direction it stopped working.
The weird thing is that normally it can process far more than that, but only when the pallets are split across different directions. Also, it doesn't crash in one place: sometimes it's while building pallet number 1108, sometimes 1110, etc., always between 1108 and 1111.
The crashing itself is also quite weird. Before I turned on "Break on all errors", it would just exit the Sub it was working on at the time, go back to the first one, and then step through the rest of the code without actually executing it (ignoring every variable change, every If, etc.). The only line that worked was the MessageBox. Excel was unresponsive afterwards, too.
I am declaring every variable and I have Option Explicit at the top of every module. I am using a few Public variables and a lot of module-level ones (the kind that are scoped across the module, not across the project).
It is crashing mostly at this line:
w2.Cells(r, 4).Delete Shift:=xlUp
where w2 is set as:
Set w2 = Workbooks(wb1).Worksheets("SDP Temp")
Have you ever encountered something like this?
That is what the error means: 0x80010108 RPC_E_DISCONNECTED, "The object invoked has disconnected from its clients." COM uses RPC for out-of-process calls.
I managed to find a fix for the problem.
When the code was executing, it never truly exited the Subs until the whole direction was finished.
Because of that, when there were more than 28 trucks, it had around 50 Subs open on the call stack in the background.
When I fixed the code so the Subs returned properly after they were used, it started working properly.
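The pattern described, where each Sub calls the next and none returns until the whole direction is done, piles frames onto the call stack. Restructuring it so a single top-level loop drives the work keeps the stack shallow. A hypothetical sketch, with invented names like BuildPallet, not the asker's actual code:

```vba
' Before (sketch): each handler calls the next, so no Sub
' returns until the whole direction is finished:
'
'   Sub BuildPallet(n As Long)
'       ' ... build pallet n ...
'       If n < palletCount Then BuildPallet n + 1   ' stack keeps growing
'   End Sub

' After (sketch): one driver loop; each call returns before the next starts.
Sub BuildAllPallets(palletCount As Long)
    Dim n As Long
    For n = 1 To palletCount
        BuildPallet n        ' returns after each pallet; stack stays shallow
    Next n
End Sub

Sub BuildPallet(n As Long)
    ' ... build pallet n (details omitted) ...
End Sub
```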

Excel on exit and macro run: 'Code execution has been interrupted'

I am getting a 'Code execution has been interrupted' message on exit from Excel intermittently of late, and its frequency is increasing. 'End' allows me to leave Excel, but once I get the message, my machine must be rebooted before Excel will open and run macros again. This intermittent message also comes up with basic Excel usage at exit, and is not limited to Excel sessions following VBA macro use. Has anyone seen this, or have a solution? It is getting very annoying.
I have come across this issue a few times during the development of a complex Excel VBA app. Sometimes Excel started to break into VBA code quite randomly, and the only remedy was to reboot the machine. After a reboot, Excel usually started to act normally again.
I soon found out that the solution to this issue is to hit CTRL+Break once while a macro is NOT running. Maybe this can help you too.

Excel VBA Application.OnTime. I think it's a bad idea to use this... thoughts either way?

I have a number of users I support who are asking for things to happen automatically (well, more automagically, but that's another point!).
One wants events to happen every 120 secs (see my other question) and another wants one thing to happen at, say, 5pm each business day. This has to live in the Excel sheet itself, so add-ins etc. will be a no-no, as it needs to be self-contained; therefore VBA.
I have a big dislike of Application.OnTime; I think it's dangerous and unreliable. What does everyone else think?
EDIT:
Cross post is at VBA Macro On Timer style to run code every set number of seconds, i.e. 120 seconds
Application.OnTime is absolutely 100% reliable and is most definitely not dangerous. However, it is only exposed via VBA and you are regarding VBA as a "no no" for some reason here, so this option would appear to be unavailable to you.
I would generally not use OnTime for long-term scheduling, such as scheduling Excel to execute a command each day at 5pm. The problem is that if the user closes Excel, then the OnTime scheduling is lost. What you would need, in this case, is to use the Task Scheduler, or create your own application or windows service to open Excel and execute your commands.
For scheduling an event to occur every 120 seconds, however, using Application.OnTime would be perfect for this -- you would simply need to re-schedule OnTime to occur again in 120 seconds each time that OnTime calls back, because OnTime only fires once per scheduling, not on a repeat basis. I would absolutely use VBA for this task. If you don't want VBA commencing the action, that is fine: just have the VBA contained in a workbook which is then opened by your program or via the Task Scheduler. From that point onward, the VBA code can fire every 120 seconds.
Make sense?
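The re-scheduling pattern described above can be sketched like this (a minimal sketch; the procedure names are made up):

```vba
Public RunWhen As Double

Sub StartPolling()
    RunWhen = Now + TimeSerial(0, 2, 0)   ' 120 seconds from now
    Application.OnTime EarliestTime:=RunWhen, Procedure:="DoWork", Schedule:=True
End Sub

Sub DoWork()
    ' ... the action to perform every 120 seconds ...
    StartPolling                          ' re-schedule: OnTime fires only once
End Sub

Sub StopPolling()
    ' cancel the pending call, e.g. from Workbook_BeforeClose
    On Error Resume Next                  ' ignore error if nothing is scheduled
    Application.OnTime EarliestTime:=RunWhen, Procedure:="DoWork", Schedule:=False
End Sub
```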
One caveat: an "infinite" chain of OnTime calling itself looks like infinite recursion, but it is not. Each Sub returns before the next scheduled call fires (Excel invokes the callback from its own event loop), so the call stack does not actually grow the way a direct recursive call would.
