Not sure if this is a code-specific issue or not, so I will be general for now.
I have a somewhat complicated macro that begins by reading financial market data manually entered by a user into a worksheet, then processes that market data, generating the required market curves, etc., and finally calculates the valuations of interest.
The process requires a lot of looping, since there are thousands of instruments that need to be valued. However, I noticed that every now and then the macro loops extremely slowly, on the order of 2-3 seconds per iteration. When I have the Excel workbook up, I can see down at the bottom it is saying "Calculating (4 Processors x% complete)".
To resolve the issue I have to manually force Excel to shut down; usually this fixes the problem and the next time I run the program it works fine.
I am running Windows 8 (not 8.1) and Excel 2013. I've heard that this combination is particularly prone to crashing/bugs (I've experienced this several times myself where Excel will take a very long time to process basic requests such as font formatting or will spontaneously crash for no apparent reason).
However, I'd like to ask the community whether the problem is more widely known.
Thanks!
As a general tip for creating fast Excel macros: wherever possible, don't loop through cells. You will get much better performance using a With statement on a Range object, or, where you need to operate on the data in a more elaborate way, by copying your range of data into a two-dimensional array. Looping through an array is orders of magnitude faster than looping through cells in a worksheet, and you can then dump the array back to the range.
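For instance, a minimal sketch of the array round trip (sheet and range names are just placeholders):

Dim data As Variant
Dim i As Long

'One read: pull the whole range into a 2-D Variant array (1-based).
data = Worksheets("Inputs").Range("A1:B1000").Value

'Work on the array in memory instead of touching cells.
For i = LBound(data, 1) To UBound(data, 1)
    data(i, 2) = data(i, 1) * 1.05
Next i

'One write: dump the array back to the worksheet.
Worksheets("Inputs").Range("A1:B1000").Value = data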
Try Application.Visible = False at the beginning of your code. Then make sure to set Application.Visible = True at the end. Should help.
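A minimal pattern, with an error handler added purely as a precaution so the window is always restored:

Sub RunHidden()
    On Error GoTo Cleanup
    Application.Visible = False  'hide the Excel window while the macro works
    '...long-running processing here...
Cleanup:
    Application.Visible = True   'restore visibility even if an error occurred
End Sub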
Try changing the cursor to xlIBeam.
It changes everything in terms of speed on Windows 8.1.
Sub CurseurDefault(zz As Boolean)
    If zz = True Then
        Application.Cursor = xlDefault
        'Call ShowCursor(True)
    Else
        Application.Cursor = xlIBeam
        'Call ShowCursor(False)
    End If
End Sub
So, in retrospect, this may not have been the best way to go about de-bugging my application. But here is a brief overview of what has happened:
Created a macro to loop through files, extract data from each file, and dump it into a consolidated workbook
The macro was getting hung in a really long loop near the end
In order to preserve the data after it hung (for the purposes of understanding where/how it was getting hung in the loop), I added a Workbook.Save command at the end of each iteration
The loop made it to the final iteration and got hung.
I re-opened the workbook and all the data is present, however it is still hung in the loop
My questions:
I cannot break (Ctrl + Break) out of the loop; is there any other way to kill the current execution?
Is it possible that when Excel saves the document after the iteration, it will pick back up at the point of execution within the macro when the book is re-opened?
Finally, is there any way to extract the macro from the seemingly corrupted workbook?
I'm walking a little bit in the dark here, as you didn't give any code to refer to.
If a loop doesn't have an exit condition, it usually ends with restarting Excel; breaking doesn't really work in that situation.
If you assigned the workbook to a variable, you can close it and save the changes:
Dim wb As Workbook
...
wb.Close True
You can copy the sheet containing the macro to another workbook.
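For example, a sketch along these lines (the sheet name is an assumption, and note this only carries over code stored in that sheet's own module, not in standard modules):

Dim wbNew As Workbook
Set wbNew = Workbooks.Add
'Copying the sheet also brings along the code in its sheet module.
ThisWorkbook.Worksheets("Sheet1").Copy Before:=wbNew.Worksheets(1)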
I have an Access (2013) database that I use to store/process all of our data received from a Qualtrics online survey. The raw data downloaded from Qualtrics is in a csv file that's poorly formatted for importing to Access, so I've got a fairly complex Macro in Excel (2013) that I use to pre-process the data before importing to Access.
In Access, I use the following code to open the Excel file that contains the macro, run the macro, save the workbook, and close it. This had been working well for several months, but now when I run it, it stops near the end of the Excel macro with the run-time error: -2147417856 Automation error: System call failed.
ActivateOrOpenWorkbook WbkName & ".xlsm", strWbkPath
appExcel.Run ProcName, ProcArg
appExcel.Workbooks(WbkName).Save

If appExcel.Workbooks.Count = 1 Then
    appExcel.Quit
Else
    appExcel.Workbooks(WbkName).Close True
End If
ActivateOrOpenWorkbook is just a custom function to do exactly what the name implies, appExcel is the Excel Application. The workbook always opens fine, and the macro begins to run, but it never actually reaches the point where control returns to the Access VBA and saves the workbook.
It does run fine if I open the workbook before running the Access procedure, insert breakpoints at every major VBA step (in both Access and Excel), and step through the whole thing one Sub at a time. It just fails if I try to let VBA run it all from start to finish on its own.
Based on that evidence plus stories of similar problems that I've seen online, I suspect that the error is occurring because the Excel macro is taking too long to run (we recently added some new variables to the Qualtrics survey), and Access is trying to take back control before Excel is finished. I just haven't found any viable way to solve that suspected problem or investigate it further.
I did try inserting this makeshift Wait routine into my ErrorHandling for the Access Sub, but it didn't work at all, because the error message still popped up in the same amount of time as before.
If Err = -2147417856 Then
    Dim TWait As Date, TNow As Date
    TWait = Time
    TWait = DateAdd("s", 15, TWait)
    Do Until TNow >= TWait
        TNow = Time
    Loop
    Resume Next
End If
Any help will be appreciated!
Have you tried using DoEvents in the pre-processing macro?
If you have a loop where you're processing the CSV file line by line, then use DoEvents every so often to give any pending events a chance to run. It would be overkill to call it on every line, so maybe start with every 100 lines and adjust from there.
Given that the macro works when you step through the Subs, it would appear that it isn't the overall execution time that's the problem. If Access is throwing an error because Excel appears to be unresponsive, then DoEvents might be enough to stop Access from giving up.
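Something like this, assuming a row-by-row loop over the imported CSV data (the last-row lookup is just one way to bound the loop):

Sub ProcessCsvRows()
    Dim lastRow As Long, r As Long
    lastRow = ActiveSheet.Cells(ActiveSheet.Rows.Count, 1).End(xlUp).Row

    For r = 1 To lastRow
        '...process row r of the CSV data here...
        If r Mod 100 = 0 Then DoEvents  'yield so pending events, including the Access caller's, can run
    Next r
End Sub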
I have a rather large workbook that takes a really long time to calculate. It used to be quite a challenge to get it to calculate all the way, since Excel is so eager to silently abort calculation if you so much as look at it.
To help alleviate the problem, I created some VBA code to initiate the calculation, triggered from a form. The result is that the calculation process is not quite as easy to interrupt, but it is still possible. (I can easily do this by clicking the close X on the form, but I imagine there are other ways.)
Rather than taking more steps to try and make it harder to interrupt calculation, I'd like to have the code detect whether calculation is complete, so it can notify the user rather than just blindly forging on into the rest of the steps in my code. So far, I can't find any way to do that.
I've seen references to Application.CalculationState, but the value is xlDone after I interrupt calculation, even if I interrupt the calculation after a few seconds (it normally takes around an hour).
I can't think of a way to do this by checking the value of cells, since I don't know which one is calculated last. I see that there is a way to mark cells as "dirty" but I haven't been able to find a way to check the dirtiness of a cell. And I don't know if that's even the right path to take, since I'd likely have to check every cell in every sheet.
The act of interrupting calculation does not raise an error, so my On Error handler doesn't get triggered.
Is there anything I'm missing? Any ideas?
I think the trick you need to implement (if your application runs in Excel 2007 or later) is to handle this with the Application.AfterCalculate event, which is raised only after calculation is complete and there are no outstanding queries.
If you've never worked with events in VBA before, there is a good overview from cpearson.com.
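Since Application-level events need a WithEvents wrapper, a minimal sketch looks like this (the class name clsAppEvents is just an assumption):

'In a class module named clsAppEvents:
Public WithEvents App As Application

Private Sub App_AfterCalculate()
    'Fires only once calculation is complete and no queries are outstanding.
    MsgBox "Calculation finished."
End Sub

'In a standard module:
Private appEvents As clsAppEvents

Sub HookCalculationEvent()
    Set appEvents = New clsAppEvents
    Set appEvents.App = Application
End Sub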
The (MSDN) solution by Charles Williams above worked for me. I had thousands of VLOOKUPs that needed to recalculate as the code changed the lookup value inside an iteration loop, and results were skewed because calculations were not running to 100% completion.
At the beginning of my subroutine the code executes
Application.Calculation = xlManual
This eliminated unnecessary calculations by Excel until I was ready.
Now at the critical point the code executes
Application.Calculation = xlAutomatic
ThisWorkbook.ForceFullCalculation = True
Application.Calculate
Having forced Excel to perform a full calculation, the code could then save the result and move on to the next iteration ... but before doing so:
ThisWorkbook.ForceFullCalculation = False
Application.Calculation = xlManual
Remembering at the very end
Application.Calculation = xlAutomatic
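Putting those pieces together, the overall shape of the routine is roughly this (the iteration loop and its bounds are illustrative):

Sub IterateWithFullRecalc()
    Application.Calculation = xlManual            'suppress recalculation during setup
    Dim i As Long
    For i = 1 To 100                              'illustrative iteration loop
        '...change the lookup value for this iteration...
        Application.Calculation = xlAutomatic
        ThisWorkbook.ForceFullCalculation = True
        Application.Calculate                     'force the full calculation to run
        '...save the result of this iteration...
        ThisWorkbook.ForceFullCalculation = False
        Application.Calculation = xlManual
    Next i
    Application.Calculation = xlAutomatic         'remember to restore at the very end
End Sub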
I've never actually used it but I think this might work to prevent calculation from being interrupted.
Application.CalculationInterruptKey = xlNoKey
I think I'm hearing that you need a way to monitor whether each step within the calculations being performed was executed.
Assuming that you're not interested in re-engineering the workbook to use methods that are easier to track than spreadsheet calculations (such as volatile calculations within VBA or Pivot Tables), this may work for you:
Within VBA, you can utilize .EnableCalculation and .Calculate to set an entire worksheet as "dirty" (needing calculation) and then recalculate. The key difference between this and your current process is that we will perform these actions one worksheet at a time in manual mode. By initiating the calculations one worksheet at a time from within VBA, you will be able to perform additional intermediate actions that can be used to track how far you got in the calculation process.
Please note that this approach assumes a fairly linear workbook structure such that your workbook will produce the correct results if we first recalculate Sheet1, then Sheet2, Sheet3, and so on, in whatever order you wish. If your formula dependencies are more "spaghetti" than linear, this probably won't work for you. It also assumes you are working in Excel 2000 or later.
For example, you could write a VBA routine that accomplishes the following steps.
You will need to know your dependencies in order to know which calculations must come before others, and start with the worksheet in a "clean" state where no calculations are currently pending.
Step 1: Set the active sheet to the first worksheet where recalculation is needed
Step 2: Set the calculation mode to manual as follows:
Application.Calculation = xlCalculationManual
Step 3: "Dirty" the entire active sheet as follows:
With ActiveSheet
    .EnableCalculation = False
    .EnableCalculation = True
Step 4: Initiate a recalculation for this worksheet only (not the entire workbook) using:
    .Calculate
End With
Note that if the calculation mode were set to automatic, Step 3 would initiate a re-calculation across the entire workbook. By using manual mode and With, we are constraining that calculation to the current sheet.
Now you have dirtied and re-calculated the first sheet (hurray!). Now, by embedding Steps 3 and 4 above into a For/Each or For/Next loop, you can repeat the process for each worksheet in your workbook. Again, make sure you know the order in which your worksheets need to be calculated (if an order is needed).
Now for the big finish - by creating a counter variable within your loop, you can track how far you got in the calculations by updating your counter variable value each time you complete a worksheet calculation. For example, after you recalculate a worksheet, you can set the counter value to current value + 1 and store the results either in a global variable (so that it will persist even after your VBA routine ends), or in a cell within your worksheet. That way, you can check this value later to see how many worksheets were updated before the calculations finished or were interrupted.
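For instance, a sketch of Steps 2 through 4 inside such a loop (the public counter gSheetsCalculated and the simple front-to-back sheet order are assumptions):

Public gSheetsCalculated As Long  'global so the count survives after the routine ends

Sub RecalculateSheetBySheet()
    Dim ws As Worksheet
    Application.Calculation = xlCalculationManual
    gSheetsCalculated = 0

    For Each ws In ThisWorkbook.Worksheets  'order matters if sheets depend on each other
        With ws
            .EnableCalculation = False      'dirty the whole sheet
            .EnableCalculation = True
            .Calculate                      'recalculate this sheet only
        End With
        gSheetsCalculated = gSheetsCalculated + 1
    Next ws

    Application.Calculation = xlCalculationAutomatic
End Sub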
If you have relatively few worksheets in your workbooks, the same approach could be applied to one range at a time rather than a sheet.
I won't go into detail about how to construct a "counter", loops, or global variables here, but if needed, this information can be easily found using your favorite search engine. I would also highly recommend re-enabling automatic calculations once you are done as it is easy to forget that it's been set to manual mode.
I hope this works for you - for more information on calculation modes and recalculation, this is a helpful link:
http://msdn.microsoft.com/en-us/library/bb687891.aspx
Perhaps the following would work:
Do Until Application.CalculationState = xlDone
    DoEvents
Loop
Can't say I've tested it, nor that I know how robust Application.CalculationState really is for determining whether a 'complete' calculation occurred, as opposed to something interrupting the process and flagging the calculation state as done.
Private Sub SomeCodeThatGeneratesFormulas()
    Application.Calculation = xlCalculationManual
    '...Some formulas are copied here
    'By using Application.OnTime, the next method is called only once Excel is
    'idle, which locks the end user out of providing inputs into Excel until
    'the calculation itself is complete.
    Application.OnTime DateAdd("s", 0.01, Now), "Module1.CalculateFullRebuildAndSubsequentSteps"
End Sub

Public Sub CalculateFullRebuildAndSubsequentSteps()
    Application.CalculateFullRebuild
    '...Do next steps, i.e. paste as values
End Sub
On the status bar, right hand side, it will say Calculating (N processors) X% (where N is the number of processors on your computer and X% is how much it has completed) when recalculating. If you don't see text there, it's not recalculating.
I'm using Office 2010, but it should be there in all versions. It's just kinda subtle so it's easy to miss.
Array formulas in Excel can be a bit awkward. To accomplish some tasks, people avoid using intermediate columns/rows to store (temporary) data, so the array formulas have to recalculate everything from the beginning every time, which makes them really slow. My suggestions would be:
Fix array formulas to avoid repeated searches; use hidden cells or even hidden sheets to store intermediate results.
Avoid full-column references like A:A; use A1:A1000 instead, especially in Excel 2007 or later.
Make formulas return zero or an error (e.g. NA()) while previous items aren't calculated yet, so you can clearly see whether an operation has completed at all.
Some VBA could be used to inject formulas in place one step at a time, perform calculations, then proceed to the next step, but this could mean a lot of work (see the sketch below).
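As a sketch of that last idea (the sheet name and formula are placeholders):

With Worksheets("Calc")
    .Range("C1:C1000").Formula = "=A1*B1"  'inject one step's formulas
    .Calculate                             'recalculate just this sheet
    '...verify no NA() errors remain, then inject the next step's formulas...
End With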
Does the size of an Excel workbook affect how VBA code runs? I have written VBA code in a workbook that is 200 MB in size. The workbook has a browse button that picks another Excel file, performs operations on it, and writes some of the needed contents back to the 200 MB file. But this operation is far too slow, taking 8 to 13 minutes. The workbook contains charts, graphs, several modules, 16 sheets, pivot tables, and lots of other stuff. (Could these affect how the VBA code runs?)
But when I put the same code into a new workbook of only 200 KB, with the same browse button and all that stuff, it took 4 seconds to finish. I wonder what's going wrong. Using For loops is supposedly harmful in a big file, but I cannot perform my operation without a loop in my code. If For loops were the problem, they should affect the 200 KB file too, but there's no issue there! What should I do now?
Make sure you have set Application.ScreenUpdating = False and Application.Calculation = xlManual before the code that writes to the workbook. (Then reset them at the end.)
Also, there is a big overhead on each read and write to a Range, so it's usually much faster to read a large block of data into a Variant:
Dim vArr As Variant
Dim j As Long, k As Long

vArr = Worksheets("Sheet1").Range("A1:Z9999").Value2

'Manipulate the resulting array.
For j = LBound(vArr) To UBound(vArr)
    For k = LBound(vArr, 2) To UBound(vArr, 2)
        vArr(j, k) = vArr(j, k) * 2
    Next k
Next j

'And then write it back to its destination.
Worksheets("Sheet2").Range("A1:Z9999") = vArr
Even the mere action of reading and/or writing such a large file takes a significant amount of time, and many intermediate operations may be affected as well.