VBA: Querying Access with Excel. Why so slow?

I found this code online to query Access and pull the data into Excel (2003), but it is much slower than it should be:
Sub DataPull(SQLQuery As String, CellPaste As String)
    Dim Con As New ADODB.Connection
    Dim RST As New ADODB.Recordset
    Dim DBlocation As String, DBName As String

    If SQLQuery <> "" Then
        ' Build the full path to the .mdb sitting next to this workbook
        DBName = Range("DBName")
        If Right(DBName, 4) <> ".mdb" Then DBName = DBName & ".mdb"
        DBlocation = ActiveWorkbook.Path
        If Right(DBlocation, 1) <> "\" Then DBlocation = DBlocation & "\"

        Con.ConnectionString = DBlocation & DBName
        Con.Provider = "Microsoft.Jet.OLEDB.4.0"
        Con.Open

        ' Run the query and dump the results starting at CellPaste
        Set RST = Con.Execute(SQLQuery)
        Range(CellPaste).CopyFromRecordset RST

        RST.Close
        Con.Close
    End If
End Sub
The problem is that this code takes very long. If I open up Access and just run the query in there, it takes about 1/10th the time. Is there any way to speed this up? Or any reason this might be taking so long? All my queries are simple select queries with simple where clauses and no joins. Even a select * from [test] query takes much longer than it should.
EDIT: I should specify that the line
Range(CellPaste).CopyFromRecordset RST
was the one taking a long time.

I'm no expert, but I run almost exactly the same code with good results. One difference is that I use the Command object as well as the Connection object. Where you have
Set RST = Con.Execute(SQLQuery)
I use
Dim cmd As New ADODB.Command   ' the Command object needs to be instantiated
Set cmd.ActiveConnection = Con
cmd.CommandText = SQLQuery
Set RST = cmd.Execute
I don't know if or why that might help, but maybe it will? :-)

I don't think you are comparing like-with-like.
In Access, when you view a query's dataview, what happens is:
- an existing open connection is used (and kept open);
- a recordset is partially filled, with the first few rows only (and kept open);
- the partial resultset is shown in a grid dedicated to the task and optimized for the native data access method Access employs (direct use of the Access Database Engine DLLs, probably).
In your VBA code:
- a new connection is opened (then later closed and released);
- the recordset is fully populated with all rows (then later closed and released);
- the entire resultset is read into Excel's generic UI using non-native data access components.
I think the most significant point there is that the dataview in Access doesn't fetch the entire resultset until you ask it to, usually by navigating to the last row. ADO will always fetch all rows in the resultset.
Second most significant would be the time taken to read the fetched rows (assuming a full resultset) into the UI element, and the fact that Excel's grid isn't optimized for the job.
Opening, closing and releasing connections and recordsets should be insignificant, but they are still a factor.
I think you need to do some timings on each step of the process to find the bottleneck. When comparing to Access, ensure you are getting a full resultset e.g. check the number of rows returned.
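If it helps, a simple way to locate the bottleneck is to wrap each step with VBA's Timer function. This is only a sketch against the question's DataPull routine (Con, RST, SQLQuery and CellPaste are the names from the question); the output appears in the Immediate window:
Dim t As Double

t = Timer
Con.Open
Debug.Print "Open connection:   "; Timer - t; "s"

t = Timer
Set RST = Con.Execute(SQLQuery)
Debug.Print "Execute query:     "; Timer - t; "s"

t = Timer
Range(CellPaste).CopyFromRecordset RST
Debug.Print "CopyFromRecordset: "; Timer - t; "s"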

Since you're using Access 2003, use DAO instead; it will be faster with the Jet engine.
See http://www.erlandsendata.no/english/index.php?d=envbadacexportdao for sample code.
Note that you should never use the "As New" keyword, as it will lead to unexpected results.
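For reference, a minimal DAO version of the pull might look like this. This is only a sketch: it assumes a reference to the Microsoft DAO 3.6 Object Library and that Range("DBName") holds the full .mdb file name, and it declares objects without As New, as advised above:
Sub DataPullDAO(SQLQuery As String, CellPaste As String)
    Dim db As DAO.Database
    Dim rs As DAO.Recordset

    ' Open the database sitting next to the workbook via the Jet/DAO engine
    Set db = DBEngine.OpenDatabase(ActiveWorkbook.Path & "\" & Range("DBName"))
    Set rs = db.OpenRecordset(SQLQuery, dbOpenSnapshot)

    ' CopyFromRecordset accepts DAO recordsets as well as ADO ones
    Range(CellPaste).CopyFromRecordset rs

    rs.Close
    db.Close
    Set rs = Nothing
    Set db = Nothing
End Sub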

I would recommend creating the Recordset explicitly rather than implicitly through the Execute method.
When you create it explicitly, you can set its CursorType and LockType properties, which have an impact on performance.
From what I see, you're loading data into Excel, then closing the recordset. You don't need to update or count records, etc., so my advice would be to create the Recordset with CursorType = adOpenForwardOnly and LockType = adLockReadOnly:
...
RST.Open SQLQuery, Con, adOpenForwardOnly, adLockReadOnly
Range(CellPaste).CopyFromRecordset RST
...
Recordset Object (ADO)
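Put together with the question's connection handling, that might look like the following sketch (it assumes, for brevity, that Range("DBName") already holds the full .mdb file name):
Sub DataPullReadOnly(SQLQuery As String, CellPaste As String)
    Dim Con As New ADODB.Connection
    Dim RST As New ADODB.Recordset

    Con.Provider = "Microsoft.Jet.OLEDB.4.0"
    Con.ConnectionString = ActiveWorkbook.Path & "\" & Range("DBName")
    Con.Open

    ' Forward-only, read-only: the cheapest cursor for a one-way dump into Excel
    RST.Open SQLQuery, Con, adOpenForwardOnly, adLockReadOnly
    Range(CellPaste).CopyFromRecordset RST

    RST.Close
    Con.Close
End Sub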

I used your code and pulled in a table of 38 columns and 63780 rows in less than 7 seconds - about what I'd expect - and smaller recordsets completed almost instantaneously.
Is this the kind of performance you are experiencing? If so, it is consistent with what I'd expect with an ADO connection from Excel to an MDB back-end.
If you are seeing much slower performance than this then there must be some local environment conditions that are affecting things.

Lots of formulas may reference the query results. Try temporarily switching to manual calculation in the macro and switching back to automatic when all of your queries are done updating.
This should speed it up a bit, but still doesn't fix the underlying problem.
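A minimal sketch of that calculation toggle, wrapped around the query code (turning off screen updating as well is optional but usually helps):
Application.Calculation = xlCalculationManual
Application.ScreenUpdating = False

' ... run DataPull / CopyFromRecordset here ...

Application.Calculation = xlCalculationAutomatic
Application.ScreenUpdating = True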

If you retrieve a lot of records, that would explain why the Range(CellPaste).CopyFromRecordset line takes so long. (If you execute the query in Access, it doesn't retrieve all the records up front, but CopyFromRecordset needs all of them.)
There is a MaxRows parameter for CopyFromRecordset:
Public Function CopyFromRecordset ( _
Data As Object, _
<OptionalAttribute> MaxRows As Object, _
<OptionalAttribute> MaxColumns As Object _
) As Integer
Try whether setting this to a low value (like 10 or so) changes the performance.
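In VBA the extra arguments are passed positionally, so a quick test could look like this (a sketch using the RST and CellPaste names from the question):
' Paste at most 10 rows to see whether the sheer volume of data is the bottleneck
Range(CellPaste).CopyFromRecordset RST, 10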

What about the following workarounds or improvements:
- Once opened, save the recordset as an XML file (rst.Save xxx, adPersistXML) and then have Excel reopen it.
- Once opened, put the recordset data in an array (rst.GetRows xxx) and copy the array onto the active sheet (see the sketch below).
- And, at any time, minimise all memory / access requirements: open the recordset as read-only, forward-only, close the connection once the data is on your side, etc.
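A sketch of the GetRows variant (GetRows returns a zero-based array arranged as fields x records, so it has to be transposed before it is written to the sheet; RST and CellPaste are the names from the question):
Dim data As Variant
Dim out() As Variant
Dim nRows As Long, nCols As Long
Dim r As Long, c As Long

data = RST.GetRows()                 ' data(field, record)
nCols = UBound(data, 1) + 1
nRows = UBound(data, 2) + 1

ReDim out(1 To nRows, 1 To nCols)
For r = 1 To nRows
    For c = 1 To nCols
        out(r, c) = data(c - 1, r - 1)   ' transpose fields x records into rows x columns
    Next c
Next r

' A single range assignment instead of cell-by-cell writes
Range(CellPaste).Resize(nRows, nCols).Value = out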

I don't know if it will help, but I am using VBA and ADO to connect to an Excel spreadsheet.
It was retrieving records lightning-fast (<5 seconds), but then all of a sudden it was awfully slow (15 seconds to retrieve one record). This is what led me to your post.
I realized I accidentally had the Excel file open myself (I had been editing it).
Once I closed it, everything was lightning fast again.

The problem 9 times out of 10 is to do with the Cursor Type/Location you are using.
Using dynamic cursors over network connections can slow down the retrieval of data, even if the query executed very fast.
If you want to get large amounts of data very quickly, you'll need to use CursorLocation = adUseClient on your connection. This means you'll only have a static local cursor, so you won't get live updates from other users.
However, if you are only reading data, you'll save ADO going back to the DB for each individual record to check for changes.
I recently changed this: I had a simple loop populating a list item, and each iteration was taking around 0.3 s. Not too slow, but even on 1,000 records that's 30 seconds! Changing only the cursor location let the entire process complete in under one second.
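In the question's code that is a one-line change made before the connection is opened (a sketch):
Con.Provider = "Microsoft.Jet.OLEDB.4.0"
Con.CursorLocation = adUseClient   ' static, client-side cursor: rows are fetched once, no per-row round trips
Con.Open
Set RST = Con.Execute(SQLQuery)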

Related

How can I have an Excel VBA application in use without multiuser lock?

I have an app I coded in Excel that suits the needs of my project; it serves the purpose of keeping track of quite a lengthy process and prerequisites and such.
It feeds off of a certain number of tables in my file.
The thing is, only one user can currently work on that file; and since we have multiple teams working on different parts in parallel, it would be nice to host that somehow in a way that would remove the single-user restriction.
Do any of you have an idea of how I could work around this?
I worked on a solution for a very similar project of keeping track of a hospital's labor utilization (nursing employee census, if you will) on a day-to-day basis across every nursing-based department in the hospital system. This solution relies on a couple conditions:
That it will be unlikely two or more people will need to save data to the final file at the same time (meaning within seconds of each other).
All the various users of the file will have access to at least one commonly-shared network drive or location.
In our case, we created a new file each day, but it wouldn't be difficult to adjust the data-writing code to append data, rather than create a new file and dump data into that new file.
The rough outline of the process is this:
Create a read-only destination file (.xlsx in our case) in a network location that contains tables of data split between n worksheets.
Create an interactive form (.xlsm) that allows user input and then on form submission, opens the destination .xlsx file and saves the form data to it, then closes it. This interactive .xlsm file can be placed in the same network location, with shortcuts created on as many peoples' desktops (or departmental shares, for example) as necessary.
With the speed of Excel and VBA, this means you're only "opening" the destination file for a second or two to write the form data, no matter how long one user may have a copy of the form open.
One thing that will be necessary is to check if the file is open, and gracefully alert the user if they need to try again, which you can do with a function covering the related error codes, for example:
Function IsFileOpen(FileName As String)
    Dim iFilenum As Long
    Dim iError As Long

    On Error Resume Next
    iFilenum = FreeFile()
    Open FileName For Input Lock Read As #iFilenum
    Close iFilenum
    iError = Err
    On Error GoTo 0

    Select Case iError
        Case 0: IsFileOpen = False          ' file opened cleanly: not in use
        Case 70: IsFileOpen = True          ' permission denied: file is locked by another user
        Case 53: IsFileOpen = "Not Found"
        Case Else: Error iError
    End Select
End Function
which can be called via some code like (pseudo code):
Private Sub UpdateData(ByVal theSheet As String, ByVal fileName As String)
    Dim xlApp As New Excel.Application
    Dim xlWkbk As Excel.Workbook   ' assigned by Workbooks.Open, so no "As New" here

    If Not IsFileOpen(fileName) Then
        Set xlWkbk = xlApp.Workbooks.Open(fileName)
        xlWkbk.Worksheets(theSheet).Activate
    Else
        MsgBox "Sorry, the file is currently in use. Please try again.", vbOKOnly
        Exit Sub
    End If
End Sub
Or you could have it simply wait a few seconds (e.g. Wait 5), or more if the writing process doesn't cover that much data. The specific number of seconds to wait would depend on testing write times against your scenario and your data. That would be added as a nested If Not statement inside the previous one.
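For instance, a single retry after a pause might look like this (a sketch using Application.Wait; the five-second figure is only illustrative):
If Not IsFileOpen(fileName) Then
    Set xlWkbk = xlApp.Workbooks.Open(fileName)
Else
    Application.Wait Now + TimeSerial(0, 0, 5)   ' pause five seconds, then try once more
    If Not IsFileOpen(fileName) Then
        Set xlWkbk = xlApp.Workbooks.Open(fileName)
    Else
        MsgBox "Sorry, the file is currently in use. Please try again.", vbOKOnly
        Exit Sub
    End If
End If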
Then, when the result is that the file is not in use, simply write a series of subroutines to write the form data (stored as variables) to the destination sheet. End with something like
xlWkbk.Save
xlWkbk.Close
Set xlWkbk = Nothing
Set xlApp = Nothing
to save and close the workbook and clear your variables (memory cleanup and all that).
You may already be aware of this practice, but while you'll want to keep Excel visible during development, you'll definitely want to set Application.Visible = False on the production files for two reasons:
This will prevent users from getting confused by a lot of automation
It covers Application.ScreenUpdating as well, which will really speed up data processing.
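On the automated instance that boils down to a couple of settings (a sketch using the xlApp object from the pseudo code above; DisplayAlerts is an optional extra):
xlApp.Visible = False          ' keep the second Excel instance hidden from the user
xlApp.ScreenUpdating = False   ' skip repainting while the form data is written
xlApp.DisplayAlerts = False    ' suppress prompts (e.g. overwrite warnings) during Save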

Run Access query from Excel VBA with module name containing slash

I want to run a MS Access query from Excel VBA. For that I'm using ADODB.Connection and ADODB.Command to call the query. In principle it works, but there is an issue with the name, because it contains slashes: "Query_3/6/1/1". Running the script below, I get an error message like "The Microsoft Office Access database engine cannot find the input table or query 'Query_3'." So it doesn't consider the rest of the name following the slash. I already tried escaping it with brackets [], but it doesn't help and other than that I didn't find a solution.
Renaming the module works, but there are lots of them and there are already other dependencies, so that is not really a solution.
I'm very glad for any help!
Dim con As ADODB.Connection
Dim cmd As ADODB.Command
Dim rs As ADODB.Recordset

Set con = New ADODB.Connection
With con
    .Provider = "Microsoft.ACE.OLEDB.12.0"
    .Open "C:\Users\...\file.accdb"
End With

Set cmd = New ADODB.Command
With cmd
    .ActiveConnection = con
    .CommandText = "Query_3/6/1/1"
    .CommandType = adCmdStoredProc
End With

Set rs = New ADODB.Recordset
rs.Open cmd
...
I don't think you are going to like the answer. I did some testing and Excel is just not going to honor that naming convention. I even called a query built on top of Query_3/6/1/1 and it would not work. I encountered the same issue with queries that used the Nz() function and had to redesign those queries.
The only alternative I can see is a procedure in Access that writes the query records to a 'temp table' (the table is permanent, the records are temporary); Excel then calls this procedure and opens a recordset based on the temp table.
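A rough sketch of that workaround from the Excel side, assuming the Access database exposes a public procedure (here called ExportSlashQuery, a hypothetical name) that refreshes a table named TempResults from the slash-named query, and reusing the con connection opened in the question's code:
Dim accApp As Object
Dim rs As ADODB.Recordset

' Drive Access itself so the slash-named query runs in its native environment
Set accApp = CreateObject("Access.Application")
accApp.OpenCurrentDatabase "C:\Users\...\file.accdb"
accApp.Run "ExportSlashQuery"        ' hypothetical procedure that fills TempResults
accApp.CloseCurrentDatabase
accApp.Quit
Set accApp = Nothing

' The plain table can then be read over ADO without any naming issues
Set rs = New ADODB.Recordset
rs.Open "SELECT * FROM TempResults", con, adOpenForwardOnly, adLockReadOnly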

Oracle DB View -> Copying View to Excel using VBScript

English isn't my native tongue, but I hope I can explain my problem sufficiently.
I made a View in the Oracle DB which only contains the data I need.
Using SQL in my VBScript file, I select the View by using:
"SELECT * FROM TEST_1234"
I have selected the complete view now, that works fine.
Now I need to 'export' or copy the complete View to Excel using VBScript (via UFT [Unified Functional Testing]).
Is there an easy way to just copy the whole thing at once or at least complete rows or columns?
If 1. doesn't work, can I just 'iterate' through the rows and columns using two loops and copy the data from every field to the respective field in Excel?
It would be nice to be able to copy the Data without using the names of the columns in a recordset (is there a way to use numbers until EOC [End of columns]?), because there is a very high amount of columns to be copied and the column names are subject to change.
Thanks for any help!
From a programmer == code writer's point of view, the most attractive solution is your very first approach (copy the whole thing with just one SQL statement). Depending on the providers' capabilities, this statement could look like
INSERT INTO [DstTable] SELECT * FROM [SrcTable] IN '' 'odbc;dsn=DSNName'
or
SELECT * INTO [DstTable] FROM [SrcTable] IN '' 'odbc;dsn=DSNName'
Look here for a working solution that couldn't be simpler; but I admit that a DSN-less connection to the destination database looks more complicated, and your drivers may have other incantations for referring to the external database. Furthermore, your pair of providers may not support an external connection from the source to the destination, and the dirty trick of using the Access OLEDB driver (which came/still comes? with ADO) to connect to both databases externally may not work for you. In all, it's certainly not easy to get "INSERT/SELECT INTO External Database" right. [Look at my (just downvoted) answer to see that people despair and fall back on (and upvote) code that uses single-item copy loops.] In your case, you'll have to research whether at least one of the Oracle providers available to you supports external connections to Excel (or vice versa).
From a programmer == hacker's point of view (let's get the job done with minimal fuss), an easy solution could be to export the views/tables to .csv (I looked at this and was disappointed, but you may know much better) and to import them into Excel (just load the .csv and save as .xls).
If you can't/won't use the file system, you could go thru memory: Use GetRows to get the data into a two dimensional array and assign that to the desired Excel range.
If all the above fails and you need assignments to single cells in row and column loopings over the recordset, remember that the Fields collection gives you access to not only the data but the meta-info (number of columns, column-names, types, ...) too.
Thanks for the help, and the links you provided, Ekkehard and Bond! After reading them and trying a lot, I got a very simple solution.
Here's some working code, if anybody else faces the same or a similar problem:
Option Explicit
Dim conn, rec, xlStat, xlStatW, dbCnnStr, SQLSec, statArt
Set conn = Createobject("ADODB.Connection")
Set rec = CreateObject("ADODB.Recordset")
Set xlStat = CreateObject("Excel.Application")
dbCnnStr = "[your DB-connection]"
conn.open dbCnnStr
'Start Excel XXX
Set xlStatW = xlStat.Workbooks.Add()
xlStatW.Sheets(1).Name = "AAA_123"
xlStatW.Sheets(2).Name = "BBB_123"
xlStatW.Sheets(3).Name = "CCC_123"
SQLSec = "SELECT * FROM XXX_123"
rec.open SQLSec,conn
xlStatW.Sheets(1).cells(2,1).CopyFromRecordset rec
rec.Close
SQLSec = "SELECT * FROM YYY_123"
rec.open SQLSec,conn
xlStatW.Sheets(2).cells(2,1).CopyFromRecordset rec
rec.Close
SQLSec = "SELECT * FROM ZZZ_123"
rec.open SQLSec,conn
xlStatW.Sheets(3).cells(2,1).CopyFromRecordset rec
rec.Close
xlStatW.SaveAs "C:\test.xlsx"
xlStatW.Close
xlStat.Quit   ' make sure the hidden Excel instance does not stay alive
'End Excel XXX
conn.Close

When using Access from within an Excel spreadsheet, the PC won't go to sleep

I've stumbled across a problem I am unable to solve :(
Neither Google nor StackOverflow have given me any usable answers, so I'm turning to you.
The problem is this:
I've created a spreadsheet that loads data from an Access database stored on a network drive.
The data-loading part is only done once, i.e. when opening the file.
I open the connection like this:
Dim con As ADODB.Connection
Dim rs As New ADODB.Recordset
Dim sql As String
Set con = GetConString()
rs.Open "SELECT ID, somevalue FROM sometable", con
Where the connection string is something like this:
"Provider=Microsoft.Jet.OLEDB.4.0;Data Source='" & (network path to file) & "'"
Then I dump the information in the spreadsheet and terminate the connection like this:
rs.Close
con.Close
However, when I try to hibernate the PC while this Excel spreadsheet is still open, I get an error message.
It translates to something along the lines of "Excel has prevented the computer from going to sleep".
This seems to happen only with this particular combination...
Does anyone have any idea on how to prevent this behavior?
I'd like my PC to go to sleep, when I tell it to - even though the Excel spreadsheet is still open.
Thank you very much :)
After your Close lines you could
Set rs = Nothing
Set con = Nothing
although I cannot guarantee that this is what is preventing your computer from hibernating - more information would be required.

Null values reading data from Excel using ADO

I am reading data from an Excel 2007 spreadsheet using ADO. Setting up the connection is easy:
Dim ado As ADODB.Connection
Set ado = CreateObject("ADODB.Connection")
ado.ConnectionString = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=myFilename.xlsx;Extended Properties=""Excel 12.0 Xml;HDR=NO;IMEX=1"";"
ado.Open
I can call ado.OpenSchema without any trouble on this object. However, when I try to query the data:
Dim rs As ADODB.recordSet
Set rs = ado.Execute("SELECT * FROM [Current Work Load$]")
I simply get a table full of Nulls.
This is mentioned as an issue on the Microsoft Support site - but I have explicitly enabled "Import Mode" (as you can see in the code above - IMEX=1).
The Execute method does not return any records as it is for action queries.
You might want to try the OpenRecordset method.
Dim rs As ADODB.recordSet
Set rs = ado.OpenRecordset("SELECT * FROM [Current Work Load$]")
I've found the ADO connection strings here are unbelievably picky. I've gotten reading the spreadsheets to work but with a slightly different connection string:
"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" & fileName & ";Extended Properties=""Excel 12.0;IMEX=1"";"
(I don't have the XML after the Excel 12.0 declaration).
SpreadsheetGear for .NET can read Excel workbooks and enables you to access any cells without the kinds of issues/limitations you can run into with ADO.
You can see live C# & VB samples here and download the free trial here.
Disclaimer: I own SpreadsheetGear LLC
As well as using IMEX=1 in the connection string, you need to review a couple of registry keys. For more details, see this answer on SO.
