VBA Delete Visible Rows Very Slow w/ large database - excel

I am not clear as to why this Delete command is so slow in VBA when the database is large. The database has about 80,000 records, and the delete command below takes about 5 minutes to run.
'Deletes all Card interactions
ActiveSheet.Range("A1:H1" & LastRowEx).AutoFilter Field:=3, Criteria1:="*card*", Operator:=xlFilterValues
ActiveSheet.Range("A1:H1" & LastRowEx).Offset(1, 0).SpecialCells(xlCellTypeVisible).EntireRow.Delete
ActiveSheet.ShowAllData
Any idea why this is so slow and how I can speed it up? I have to do this a few times for other fields in the database, and it is extending the run time quite a bit. Thanks for the help!
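The usual culprit is that deleting thousands of scattered visible rows triggers a recalculation and redraw for every block removed. Below is a rough sketch of the standard mitigation, not taken from the thread: the column layout and the *card* criterion are carried over from the snippet above, and the rest (sheet reference, last-row calculation) is assumed.
Sub DeleteCardRowsFaster()
    Dim ws As Worksheet, lastRow As Long
    Set ws = ActiveSheet
    lastRow = ws.Cells(ws.Rows.Count, "A").End(xlUp).Row

    Application.ScreenUpdating = False
    Application.Calculation = xlCalculationManual
    Application.EnableEvents = False

    With ws.Range("A1:H" & lastRow)
        .AutoFilter Field:=3, Criteria1:="*card*"
        On Error Resume Next                   ' if nothing matches, SpecialCells raises an error
        .Offset(1, 0).Resize(.Rows.Count - 1).SpecialCells(xlCellTypeVisible).EntireRow.Delete
        On Error GoTo 0
    End With
    If ws.FilterMode Then ws.ShowAllData

    Application.EnableEvents = True
    Application.Calculation = xlCalculationAutomatic
    Application.ScreenUpdating = True
End Sub
Sorting the data on the filter column first, so that the matching rows sit in one contiguous block, often helps even more, because the delete then happens as a single operation instead of many small ones.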

Related

Power BI and Excel daily report

I'm receiving a daily Excel report that contains that day's incidents.
My task is the following:
1- Consolidate the daily sheet with the master sheet (including updating the status of already existing TTs)
2- Calculate the SLA
3- Update charts for the TTs that exceeded the SLA
I was searching online for a mechanism to automate the process: automate the merge using Power Automate, then use Power Query to calculate the SLA, and finally use Power BI to visualise the data.
I would highly appreciate it if anyone could advise me whether my approach is right or suggest another way.
Thanks a lot
Your tasks are not clear: which connections are you using, and are you consolidating in Excel or somewhere else?
If you don't need to do anything in Excel, did you try refreshing it on the server?
Go to Data Hub --> https://app.powerbi.com/datahub (or your local server)
Settings ---> Scheduled Refresh
Your request seems a little vague, but, at least for the first part, it seems like you want to automate the process of consolidating worksheets. Check out the VBA script below. Get that working, then move on to step #2 and step #3.
Sub Combine()
    'UpdatebyExtendoffice
    Dim J As Integer
    On Error Resume Next                     ' ignore the error if a "Combined" sheet already exists
    Sheets(1).Select
    Worksheets.Add                           ' the new sheet becomes Sheets(1)
    Sheets(1).Name = "Combined"
    ' copy the header row from the first data sheet
    Sheets(2).Activate
    Range("A1").EntireRow.Select
    Selection.Copy Destination:=Sheets(1).Range("A1")
    ' append the data (minus the header row) from every other sheet
    For J = 2 To Sheets.Count
        Sheets(J).Activate
        Range("A1").Select
        Selection.CurrentRegion.Select
        Selection.Offset(1, 0).Resize(Selection.Rows.Count - 1).Select
        ' paste below the last used row of the "Combined" sheet
        Selection.Copy Destination:=Sheets(1).Range("A" & Sheets(1).Rows.Count).End(xlUp)(2)
    Next
End Sub
https://www.extendoffice.com/documents/excel/1184-excel-merge-multiple-worksheets-into-one.html

SaveAs function extremely Slow

I have a fairly simple loop that copies data from a few locations, creates a new worksheet with it, creates a workbook from that sheet, saves it, and deletes it.
Sometimes this loop runs nearly instantly and I can save 20-ish files in about 5 seconds; other times, however, each iteration takes 15 to 20 seconds, which unexpectedly stretches the whole run to several minutes. I haven't been able to find any outside cause (I tried with all other programs closed) that seems to slow it down. I've also applied the typical tricks for speeding up a loop, like disabling screen updating and calculation.
I don't know if there is a more efficient way to structure this loop, but I've verified that this section in particular causes the delays: placing breakpoints while debugging shows the difference in timing, and the other sections are all nearly instant. The section below is slightly modified to remove code that is known not to be problematic.
Edit - Additionally, I should mention that each of these sheets is extremely small, holding only around 10 rows and 2 columns of data.
For i = 1 To 100
    featureSheet.Copy                        ' copy the temporary sheet into a new one-sheet workbook
    ActiveWorkbook.SaveAs "C:\DIR\" & HeaderFile.Sheets("SheetName").Range("C" & i).Value & ".json", _
        FileFormat:=xlTextPrinter
    ActiveWorkbook.Close                     ' already saved by SaveAs, so no prompt
    featureSheet.Delete                      ' remove the temporary sheet from the source workbook
Next i
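Not from the thread, but since each sheet only holds around 10 rows and 2 columns, one option is to skip the Copy/SaveAs/Close cycle altogether and write the text with plain VBA file I/O. In the sketch below, featureSheet, HeaderFile and the file names in column C are carried over from the snippet above, and the tab-separated layout is an assumption.
Dim i As Long, r As Long, fnum As Integer, filePath As String
For i = 1 To 100
    filePath = "C:\DIR\" & HeaderFile.Sheets("SheetName").Range("C" & i).Value & ".json"
    fnum = FreeFile
    Open filePath For Output As #fnum
    ' dump the small sheet row by row as tab-separated text
    For r = 1 To featureSheet.UsedRange.Rows.Count
        Print #fnum, featureSheet.Cells(r, 1).Value & vbTab & featureSheet.Cells(r, 2).Value
    Next r
    Close #fnum
Next i
This avoids creating, saving and closing a workbook on every iteration, which is usually where the variable delay comes from.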

Writing to SQL server - too slow

I have an Excel sheet (using Excel version 1902) where I use VBA and an ADODB.Connection (using the Microsoft ActiveX Data Objects 2.8 Library) to read/write to a SQL Server (2016) database.
At first, both the read and write operations were very slow, taking approximately 15 seconds each. There will be multiple read/write sessions, so 15 seconds is not acceptable.
The amount of data to be read/written varies but is approximately 500 rows in 15 columns in total (split between 7 tables, so the number of rows and columns varies per table). In other words, the amount of data to be transferred is not massive.
At first I thought the problem was in my VBA code (loops, searching for text, etc.). But by removing those steps, I narrowed the problem down to moving through the recordset (.MoveNext row by row to read or write to the database).
For the read operation I managed to get acceptable speeds by doing the following:
Changed the CursorType from adOpenKeyset to adOpenForwardOnly
Changed the LockType from adLockOptimistic to adLockReadOnly
This reduced read times from 15 seconds to 5 seconds, which is acceptable.
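For reference, this is roughly what that read-side change looks like in code; the SQL text and the recordset handling are placeholders, not taken from the post.
Dim rs As New ADODB.Recordset
rs.Open "SELECT * FROM SomeTable", cn, adOpenForwardOnly, adLockReadOnly
Do Until rs.EOF
    ' pull the fields into worksheet cells or an array here
    rs.MoveNext
Loop
rs.Close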
However, I have not managed to achieve any improved speeds for the write operations which are still at 15 seconds.
I first tried:
Changed the CursorType from adOpenKeyset to adOpenStatic
Changed the LockType from adLockOptimistic to adLockBatchOptimistic
And then altered the .Update command to .UpdateBatch.
I thought that updating everything in one batch would speed things up, but it did nothing.
Then I tried changing the connection .open statement from:
cn.Open "Provider = sqloledb;" & _
"Data Source=datasourcename;" & _
"Initial Catalog=catalogname;" & _
"User ID=UserIdname;"
etc. to:
cn.Open "Driver={SQL Server Native Client 11.0};" & _
"Server=servername;" & _
"Database=databasename;" & _
"Uid=UserIDname;"
Again, that did nothing. With UpdateBatch, the SQL Server Native Client connection performed worse (28 seconds) than the OLE DB connection, while with .Update (no batch update) both connections performed similarly, at about 15 seconds.
Does anyone have any tips on how to speed up the write operation?
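One approach worth trying, and not something taken from the post: skip the recordset entirely for writes and push the rows through a parameterized ADODB.Command, so each row is a single round trip with no cursor overhead. The table name, column names, worksheet and data types below are placeholders.
Dim cmd As New ADODB.Command
Dim ws As Worksheet, lastRow As Long, r As Long
Set ws = ActiveSheet                                      ' placeholder for the sheet holding the rows
lastRow = ws.Cells(ws.Rows.Count, "A").End(xlUp).Row
With cmd
    Set .ActiveConnection = cn                            ' reuse the existing open connection
    .CommandType = adCmdText
    .CommandText = "INSERT INTO SomeTable (ColA, ColB) VALUES (?, ?)"
    .Parameters.Append .CreateParameter("pA", adVarChar, adParamInput, 255)
    .Parameters.Append .CreateParameter("pB", adInteger, adParamInput)
End With
For r = 2 To lastRow
    cmd.Parameters(0).Value = ws.Cells(r, 1).Value
    cmd.Parameters(1).Value = ws.Cells(r, 2).Value
    cmd.Execute , , adExecuteNoRecords                    ' one round trip per row, no cursor
Next r
If the per-row round trips are still too slow, the same idea extends to concatenating the rows into a handful of multi-row INSERT statements and executing those instead.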

File size of an Excel Workbook increases every time I create a new pivot table and save the file

I have a vba macro that creates a pivot table based on some data in an input sheet. Every time the macro runs, the old pivot is deleted and a new one is created.
The problem I'm facing is that every time I save the file after running the macro, the file size increases by roughly 14MB.
This is how I delete the old pivot table:
Dim pivot As PivotTable
For Each pivot In reportSht.PivotTables
    pivot.TableRange2.Clear    ' TableRange2 includes the page-field area, so this wipes the whole pivot
Next pivot
My theory is that some part of the pivot isn't being deleted but I can't seem to put my finger on what.
I have found the solution to my problem. When I create the pivot tables I also add connections, since I need to display the number of unique entries in the pivot table:
ActiveWorkbook.Connections.Add2 "WorksheetConnection_" & inputDataArea, "", _
"WORKSHEET;" & ActiveWorkbook.Path & "\[" & ActiveWorkbook.name & "]" _
& inputSht.name, inputDataArea, 7, True, False
Where inputDataArea is a String holding the range used for the pivot table. My problem was that I was not deleting these connections when I deleted the pivot table, so a new connection was added every time the macro was executed.
I added this piece of code to also remove any connections that are no longer needed after removing the pivot table:
Dim connection As Object
For Each connection In ActiveWorkbook.Connections
    ' keep the Data Model connection, drop the leftover worksheet connections
    If connection.name <> "ThisWorkbookDataModel" Then connection.Delete
Next connection
The file is still large but manageable and most importantly it's not growing anymore.
Thanks to Pᴇʜ for suggesting that I remove the pivot caches and for pointing out that these are deleted along with the connections.
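A quick way to double-check that no orphaned caches survive the cleanup (not part of the original answer) is to list what is still cached and inspect it in the Immediate window:
Dim pc As PivotCache
For Each pc In ActiveWorkbook.PivotCaches
    Debug.Print pc.Index, pc.RecordCount, pc.MemoryUsed   ' any leftover cache shows up here
Next pc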

Display values greater than a specific number

I have a report in Excel which lists clients and the number of hours that we work on them.
I want to create a separate sheet within this workbook that pulls in the clients that we work on for over 40 hours, using VBA.
For example, a basic version of my report looks like this in Sheet 1:
Client------Client ID---Hours
Client 1------1947--------30
Client 2------6465--------46
Client 3------8787--------20
Client 4------7878--------15
Client 5------4873--------48
I want my new sheet to display
Client------Client ID---Hours
Client 2------6465--------46
Client 5------4873--------48
I am wondering if I need a While loop, but wouldn't it stop as soon as it finds the first value greater than 40 and then never reach the next set of values?
I recorded a quick macro to get the code for the filter criteria and then used Copy Destination:=. Recording macros and reviewing them is a great way to start learning some VBA.
Sub Shorty()
    ' Recorded-macro style: filter column 3 (Hours) for values over 40,
    ' copy the visible rows to Sheet2, then clear the filter.
    Cells.Select
    Selection.AutoFilter
    ActiveSheet.UsedRange.AutoFilter Field:=3, Criteria1:=">40", _
        Operator:=xlAnd
    ' copying a filtered range copies only the visible rows (plus the header)
    ActiveSheet.UsedRange.Copy Destination:=ActiveWorkbook.Sheets("Sheet2").Range("A1")
    Selection.AutoFilter
End Sub
You can find the documentation for the Range.Copy method in Microsoft's Excel VBA reference.
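Since the question asked about loops: a plain For loop also works and never stops early; it simply skips rows that fail the test. Below is a rough sketch, assuming the layout from the example above (data in Sheet1, Hours in column 3, output to Sheet2).
Sub CopyClientsOver40()
    Dim src As Worksheet, dst As Worksheet
    Dim lastRow As Long, r As Long, outRow As Long

    Set src = ThisWorkbook.Sheets("Sheet1")
    Set dst = ThisWorkbook.Sheets("Sheet2")

    src.Rows(1).Copy Destination:=dst.Rows(1)       ' copy the header row
    outRow = 2

    lastRow = src.Cells(src.Rows.Count, "A").End(xlUp).Row
    For r = 2 To lastRow
        If src.Cells(r, 3).Value > 40 Then          ' column 3 holds the hours
            src.Rows(r).Copy Destination:=dst.Rows(outRow)
            outRow = outRow + 1
        End If
    Next r
End Sub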
