Macro slow when other Excel files are open

I've seen other users post somewhat the same question, but the core problem doesn't seem to be the same (as far as I can tell).
I have an Excel workbook that goes through about 80,000 rows and four columns of data. This takes about 1-2 seconds at most. The workbook also does operations in other worksheets, but for this test I have turned those subs off. If I open a ~10 MB .xlsx it takes a bit longer, but not much. If I open an .xlsm with considerable code, the same operation takes about 6-7 seconds.
What I have read so far is that one should assign ranges to variables instead of using cell references, set calculation to manual, and turn screen updating off. None of these seem to do the trick, though.

If the events were not already disabled, this can also be added just before the code begins:
Application.EnableEvents = False
setting it back to True when everything is finished.
To optimize things further, one possibility is to read the whole range into a variant array, work on it in memory, and write it back:
Dim Temp As Variant
Temp = ActiveSheet.UsedRange.Value
' analyze and change Temp here, then put it back:
ActiveSheet.UsedRange.Value = Temp
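Putting those pieces together, a typical speed-up wrapper looks something like this (a sketch; the procedure name is a placeholder, and the loop body is where your own analysis goes):

```vba
Sub FastProcess()
    Dim oldCalc As XlCalculation
    oldCalc = Application.Calculation

    Application.ScreenUpdating = False
    Application.EnableEvents = False
    Application.Calculation = xlCalculationManual

    On Error GoTo CleanUp

    Dim Temp As Variant, r As Long
    Temp = ActiveSheet.UsedRange.Value      ' one read into memory

    For r = 1 To UBound(Temp, 1)
        ' analyze / modify Temp(r, 1) .. Temp(r, 4) here
    Next r

    ActiveSheet.UsedRange.Value = Temp      ' one write back

CleanUp:
    Application.Calculation = oldCalc
    Application.EnableEvents = True
    Application.ScreenUpdating = True
End Sub
```

The `On Error GoTo CleanUp` guard matters: if the analysis throws, calculation, events, and screen updating are still restored.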

I finally figured it out. It has to do with instances of Excel. If multiple heavy macro files are opened in the same instance of Excel it takes a lot longer than if one instance per file is used. I don't know why this is, but from what I've read it's a bug that's been around since Office 2003 or something.
So now all I have to do is make sure that everyone uses multiple instances. Unfortunately you have to make some registry changes for that to happen automatically. I'm currently going with the .bat file alternative.
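For reference, the .bat alternative can be as simple as launching Excel with the `/x` command-line switch, which forces a new Excel process for the file (the paths here are assumptions; adjust them to your Office version and workbook location):

```bat
@echo off
rem Open the workbook in its own Excel instance (/x forces a new process)
start "" "C:\Program Files\Microsoft Office\root\Office16\EXCEL.EXE" /x "C:\Path\To\HeavyMacroFile.xlsm"
```

Users double-click the .bat instead of the workbook, and each heavy macro file gets its own instance without any registry edits.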

Related

Providing a list of files in Excel of a directory is slowing down excel

I need to provide a current list of files in a directory in an Excel workbook, and everything is working as required, just too slowly. I really only need the list to be checked once, upon opening the workbook. It takes around 11 seconds to do this, which is acceptable, but the problem is that it rechecks the list every time I make even minor edits to the workbook (I guess because it is brought in as an Excel table). I determined the lag in my workbook using the RangeTimer() function, and this is the only thing taking a long time to calculate. I should also state that the table containing the list of files is ultimately used in a cell on another worksheet to provide a data-validation drop-down list, but I don't believe that is really the issue.
I did some Googling on reducing Excel calculation times and discovered that some Excel functions are definite culprits for increasing calculation times (described as volatile), and three of these (NOW, INDEX and ROW) are used to provide the functionality I would like in this part of the workbook.
I have tried two solutions so far:
1. Force Full Calculation set to True in VBA properties window
2. Switched calculations to manual. I set this back to automatic once I identified that this part of the workbook was the issue as I don't want manual calculation generally.
The formula I have in the 'refers to' box of the named range (TutorFileList) is:
=FILES("\\O008DC01\Shared\Tutor Qualifications\*")&T(NOW())
The formula I have in each cell of the Excel table is:
=IFERROR(INDEX(TutorFileList,ROW()-1),"")
What I would like to have is the ~11secs of calculated time to find these files reduced down to just one check of the networked directory rather than it taking 11secs of automatic recalculation every time the workbook is modified.
If there is a more efficient way to achieve what I am doing I am prepared to redesign things but I do need the functionality of a drop-down list of files in the specific directory in a cell.
Many thanks for assistance from anyone on this.
I have resolved my issue by reducing the number of rows back to around 200 instead of 500. This brings the calculation lag back to about a second, which I can live with.
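Another way to get exactly one directory check per open is to drop the volatile FILES/NOW chain entirely and fill the list from VBA in Workbook_Open; the data-validation drop-down can then point at the filled range. A sketch, assuming the list lives in column A of a sheet named FileList (sheet name is an assumption; the UNC path is the one from the question):

```vba
' In the ThisWorkbook module
Private Sub Workbook_Open()
    Dim ws As Worksheet, f As String, r As Long
    Set ws = Me.Worksheets("FileList")

    ws.Columns(1).ClearContents              ' drop the stale list
    r = 1
    f = Dir("\\O008DC01\Shared\Tutor Qualifications\*")
    Do While Len(f) > 0
        ws.Cells(r, 1).Value = f             ' one file name per row
        r = r + 1
        f = Dir                              ' next match
    Loop
End Sub
```

Because the list is plain values rather than formulas, nothing recalculates on later edits; the network directory is hit once per open.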

Force excel workbook to avoid auto recalculation upon opening it on other PC until changes are made

I have the following situation:
I created a rather gigantic Excel workbook with a bunch of worksheets, a lot of cross-dependencies between them, and a lot of heavy formulas. I saved this file. When I open it on my laptop it doesn't try to automatically recalculate, since Excel realizes that the data didn't change. When I make changes, recalculation is fast, since the changes are localized and don't affect the whole workbook, making it possible to adjust the workbook without waiting hours for calculation to complete.
When I give a copy of this workbook to anyone else and they open it on their PC, Excel decides (not sure why; I wasn't able to find any answer) to recalculate the entire workbook. Probably that's the default behavior when an Excel file is opened on a different PC.
Since the workbook is huge, recalculating everything in it takes forever.
Is there any way to force Excel to assume that whatever values are currently in the cells are 'correct' (that no cells require recalculation) but still preserve the automatic recalculation behavior when the user changes something in the data? Basically, we need to remove the 'dirty' status from all cells in the workbook when it is opened on a new PC.
I cannot answer your question, but I may provide a solution for your problem:
Have you tried entering
Application.Calculation = xlCalculationManual
Application.CalculateBeforeSave = False
into the Immediate window (Ctrl+G in the VBA editor)?
You can reactivate automatic calculation later, for example from a Workbook_SheetChange event, or manually in the Immediate window.
I have a few other possible workarounds in mind; please let me know if you are interested. This might also help in understanding Excel's calculations: http://www.decisionmodels.com/calcsecrets.htm
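One way to wire that idea up in the workbook itself: open in manual mode so the stored values are trusted, then switch back to automatic on the first edit. A sketch of the approach, not a tested fit for every save scenario:

```vba
' In the ThisWorkbook module
Private Sub Workbook_Open()
    Application.Calculation = xlCalculationManual    ' trust stored values on open
End Sub

Private Sub Workbook_SheetChange(ByVal Sh As Object, ByVal Target As Range)
    If Application.Calculation = xlCalculationManual Then
        Application.Calculation = xlCalculationAutomatic   ' recalc resumes once data changes
    End If
End Sub
```

Note that Application.Calculation is application-wide, so this affects any other workbooks open in the same instance until the first change flips it back.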

.xlsm file freezes when activated after long hours of inactivity

Suppose your .xlsm file is larger than 5 MB. It has more than 20 sub procedures, many built-in formulas and functions used in the background on the sheets, and more than 20 worksheets. That is what a large Excel VBA application looks like.
Now, I found one issue with such a large application: it stops working, freezes, or becomes unresponsive when it has been open on your system for more than a few hours without any action on it.
As soon as you activate the workbook after a long stretch of other work, it almost freezes or shows 'Not Responding', and you have to kill the Excel file before you can do anything further.
This is a very common encounter for anyone who has worked on large Excel files. Please discuss the reasons, causes, and solutions here.

Excel will not update links, entire day of research and tests with random results

Short version: my Excel is set properly to automatically update links, and all my files (locally stored) have worked fine for years. Suddenly one will not update linked data. When I click Connections > Edit connections > Check Status, every linked file shows "Warning! Values referring to other workbooks were not updated".
Refresh/Calculate All does nothing. Changing to manual and back to automatic, opening and closing, restarting, using these same files on another PC: nothing I did fixes it.
Clicking into an individual cell (F2) and then back out, though, updates that one cell.
All security settings are correct, I am 98% sure; regardless, whatever settings I had haven't changed, and it did work before.
I read a post that seemed exactly the same, but his solution was to enable protected content. Not the case here; I disabled it fully. There seems to be an error causing this, possibly.
Long version: this is my largest file, which I continue to build.
I have a main Excel sheet that is linked to 35 workbooks. The source workbooks have lists of 3-4 columns, ranging from 1,000 to 10,000 rows long. The main workbook uses INDEX/MATCH for each source to pull two small fields. It takes about 5-6 minutes to do a Calculate All on a desktop i7 3.64 GHz Ivy Bridge with 16 GB RAM, and I never have issues with it. Win 10 64-bit and Office 2016 64-bit.
Some source files are .xls, and I am in the process of changing them to .xlsx, but when I open the .xls file the values update. I then save as .xlsx with a shorter name as well (trying to lighten the formulas), go back to Connections > Edit connections > Check Status, and the same warning is there. However, I can click Update Values and it says OK.
This is a very important file and is not physically monitored; only when something goes wrong do I realize it wasn't updated. I am hoping for a concrete answer I can apply instead of just changing random things and hoping. All help is greatly appreciated!
I actually resolved this by opening all source workbooks with the main workbook open and resaving them one by one (the status on some turned OK; most remained). Then I simply used Edit Links and Update Values for each connection, which changed the status to OK. I saved the main workbook and have had no more issues.
Yes, I need to take those suggestions and the workbook to the next level, as it has clearly outgrown my skill set, but I'm getting there.
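For anyone hitting the same warning, the manual Edit Links > Update Values round can also be scripted, so it runs unattended instead of being forgotten. A sketch using LinkSources and UpdateLink:

```vba
Sub UpdateAllExcelLinks()
    Dim links As Variant, i As Long
    links = ThisWorkbook.LinkSources(xlExcelLinks)   ' all workbook-to-workbook links
    If Not IsEmpty(links) Then
        For i = LBound(links) To UBound(links)
            ' equivalent to clicking Update Values for each entry in Edit Links
            ThisWorkbook.UpdateLink Name:=links(i), Type:=xlLinkTypeExcelLinks
        Next i
    End If
End Sub
```

Calling this from Workbook_Open (or a scheduled task that opens the file) would force the refresh on a file that is not physically monitored.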

Excel Advanced Filter Very slow to run, but only after autofilter has been run

I have a very difficult issue I have been trying to solve for a few days, I would very much appreciate some help as I have tried to research this issue completely already.
On one sheet I have a database (18 columns and 72,000 rows) in 32-bit Excel 2010, so it's a large database. On this sheet I also have some entries to auto-filter some columns, as well as an advanced filter. When I run the advanced filter, the data filters in exactly 1 second. If I run an auto-filter (via VBA macros) and then run the advanced filter afterwards, the advanced filter takes 60 seconds to run, even after turning AutoFilterMode to False. Here is what I have tried, with no luck:
Removing all shapes on the sheet
There are no comments on the sheet, so none to remove
Removing all regular and conditional formatting
Turning off auto-filter mode
Setting all cell text on the sheet to WrappedText = False
Un-protecting the sheet
Un-hiding any rows and columns
Removing any sorting (.sort.sortfields.clear)
What else could cause this code to run 60 times slower, but only after autofilter has previously run on the sheet, and how can I return it to its original state? Any and all help would be greatly appreciated.
In my case I was programmatically creating named ranges for later use, and these named ranges used the .End(xlDown) functionality to find the end of the data set. For example:
Set DLPRange = .Range(.Cells(2, indexHeaders(2)), .Cells(i, indexHeaders(2)).End(xlDown))
DLPRange.Name = "DLPRangeName"
... which, in my case, creates a column range from the starting cell to the end of the document. Originally this 'bad' way of finding the range went unnoticed because the workbook's file format was .xls, which maxes out at 65k rows. When I needed more, I forced it to create a workbook in .xlsm format, which has ~1M rows. The filter then ran on the whole column even though the huge majority of it was empty, resulting in the huge time overhead.
tl;dr: you've tricked excel into thinking it has a huge amount of data to filter. Untrick it by checking and making sure it's only filtering the range you think it should be filtering.
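A quick way to check for (and fix) this from VBA: print the used range's address, and if it runs far past the real data, delete the stray rows below it and save, which resets UsedRange. A sketch; the sheet name is an assumption:

```vba
Sub ResetUsedRange()
    Dim lastRow As Long
    With Worksheets("Database")
        Debug.Print .UsedRange.Address   ' far bigger than the real data? that's the problem
        lastRow = .Cells(.Rows.Count, 1).End(xlUp).Row   ' last row with data in column A
        ' delete everything below the data, then save to make Excel shrink UsedRange
        .Rows(lastRow + 1 & ":" & .Rows.Count).Delete
    End With
    ActiveWorkbook.Save
End Sub
```

After the save, the advanced filter (and the named ranges built with .End(xlDown)) should again cover only the rows that actually hold data.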
After trial and mostly lots of error, I was able to find a solution. I determined that almost any action, even without auto-filter, would cause this slowdown, and I felt that this was simply a memory issue for Excel with all of that data (even though it ran fine sometimes; the 'cache', I'm guessing, would fill up and then run slow).
So what I did was use a new, temporary workbook into which the advanced filter would add the data on filter. I then took this data and copied a portion of it back into my workbook, then closed the temporary workbook without saving it. This also brought the code run from 1 second down to 0.3 seconds, and I never got the slow advanced-filter run time again, regardless of what code I ran or what I did in the original workbook.
I posted this so if anyone else had a similar issue they might use this as a solution for large amounts of data.
A little late, but recently I had the same issue with a not-so-large database (4,000+ rows, 70 columns) and solved it, so I'm just sharing.
In my case the problem was wrapped text in the data range. Setting WrapText to False, which you said helps, is not enough; you also need to replace Chr(10) in the range you are filtering. Huge difference.
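Stripping the embedded line breaks can be done in one pass over the filter range (the sheet name and range address are assumptions; point them at your own data):

```vba
Sub StripLineBreaks()
    With Worksheets("Data").Range("A1:BR4000")
        .WrapText = False                       ' turn off wrapping
        ' remove the Chr(10) line-feed characters that keep the filter slow
        .Replace What:=Chr(10), Replacement:=" ", LookAt:=xlPart
    End With
End Sub
```

Replacing with a space rather than an empty string avoids words running together where a cell had multi-line text.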
