Jet.OLEDB or ACE.OLEDB for MS Access from Excel

I'm using Excel VBA to pull data from an MS Access DB - this is with Excel 2013 and Access 2013, 32-bit. The code has historically used:
Provider=Microsoft.Jet.OLEDB.4.0;
However, some computers have been upgraded to Excel 2016 64-bit, and the Jet provider is not available for 64-bit. I have changed the code to:
Provider=Microsoft.ACE.OLEDB.12.0;
which works on both 64-bit and 32-bit systems. However, I have noticed a significant speed drop in loading/saving data just from changing this line. Does anyone know why this might be and how I can improve it?
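For reference, the provider line sits in a connection string that is opened roughly like this (the path and table name below are just placeholders, and the code uses late binding so no ADO reference is needed):

Sub OpenWithACE()
    ' Late-bound ADO, so no reference to "Microsoft ActiveX
    ' Data Objects" is required. Path/table are placeholders.
    Dim cn As Object, rs As Object
    Set cn = CreateObject("ADODB.Connection")
    cn.Open "Provider=Microsoft.ACE.OLEDB.12.0;" & _
            "Data Source=C:\Data\MyDatabase.accdb;"
    Set rs = cn.Execute("SELECT * FROM MyTable")
    ' ... read rs here ...
    rs.Close
    cn.Close
End Sub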

You are correct that you have to choose the ACE provider for 64-bit.
And the big advantage of JET was (and still is) that it is installed on all copies of Windows by default, so there is no need to install Access, the Access runtime, or (previously) the Office connectivity package.
As for performance? There have been a few reports about performance in regards to ACE x64.
However, one trick or suggestion is to ensure that the connection stays open. In other words, are you sure the row processing itself is slow, or is it the overall time?
Perhaps put a test MsgBox, or a timing test, in your code. E.g.:
Dim T As Single
T = Timer()
' your code here
Debug.Print Timer() - T
The above will spit out the elapsed time to the debug window (while in the VBA IDE, hit Ctrl+G to display the Immediate/Debug window).
The reason I suggest the force-open idea is that you often find ACE takes a VERY long time to open. But once open, the data reading has good performance (same as before).
So, I suggest you check and try this fix.
Open a table (any table) and KEEP it open. Now run your existing code (which may well open + close other tables). The issue is that when ACE attempts to open a table, it tries to put locks on the mdb/accdb file, and it is this locking process that takes a VERY, VERY long time.
However, if you force (keep) open one table, then this VERY slow process of ACE attempting to lock the file for read/write does not occur each time you execute a query or create additional recordsets in code.
So, if the row reading speed is fast but the time to START + open is very slow, then before you run + test your routines, force open a table into some recordset (keep it active and in scope), and THEN try your code.
I find that 9 out of 10 times this eliminates the slow speed, and often the results are nothing short of spectacular (it will run faster than before!).
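Here is a minimal sketch of the keep-open trick, assuming late-bound ADO (the path and table names are placeholders, not your actual ones):

' Module-level variables keep the connection and the "anchor"
' recordset alive and in scope for the whole session.
Dim cn As Object
Dim rsAnchor As Object

Sub OpenAndHold()
    Set cn = CreateObject("ADODB.Connection")
    cn.Open "Provider=Microsoft.ACE.OLEDB.12.0;" & _
            "Data Source=C:\Data\MyDatabase.accdb;"
    ' Open any table and KEEP the recordset open - this holds the
    ' file lock so later queries don't pay the locking cost again.
    Set rsAnchor = CreateObject("ADODB.Recordset")
    rsAnchor.Open "SELECT TOP 1 * FROM MyTable", cn
End Sub

Sub RunMyQueries()
    ' Subsequent recordsets reuse the already-locked file and
    ' should open at full speed.
    Dim rs As Object
    Set rs = cn.Execute("SELECT * FROM SomeOtherTable")
    ' ... process rs ...
    rs.Close
End Sub

Call OpenAndHold once at the start of your run, leave rsAnchor open while the rest of the code executes, and only close it (and cn) when you are completely done.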

Related

How to make Excel run in multi-CPU mode

I have big Excel files, with thousands of rows and columns.
They are gigabytes in size.
When I work inside these files in Excel, it throttles and lags. Sometimes it just gets stuck and freezes.
When I open Task Manager, I see that Excel isn't even using one full CPU core.
RAM usage is not overloaded either.
How can I make Excel use all my cores?
Excel 2019.
I see you are familiar with Python. Why don't you move your data to Python? Actually, I don't know its capabilities; I work in R. Some of my data in Excel, and even in *.csv, takes 100-200 MB, while in *.Rdata format the same information is less than 5 MB. R functions work faster than heavy Excel.
And yes, how to make Excel use all the cores of the processor is interesting to me too.

When redemption.dll is loaded and operational, Excel 2016 pauses before launching saved files from disk

I have identified a possible issue with Redemption slowing down Excel when you click a file on disk (with Excel closed): it introduces a delay of at least 4-5 seconds.
If Excel is already open, the files open immediately from disk. The problem also goes away when the program that has loaded the redemption.dll process is closed.
If you launch Excel via the command line with the file as an argument, then it also launches immediately, e.g. "c:\path to office\excel.exe" myfile.xls. It's just when you click the file that there is a pause.
All the important helpful bits
Machine is a 2016 RDS Host (latest patches, inc. the Jan 2020 patch), 8 cores, 16 GB memory; users and admins are affected (tested multiple accounts)
Office is 2016 standard edition with all items installed (again fully patched)
Redemption.dll version is 5.20.0.5298
I have tried just de-registering redemption.dll, but I suspect the program that launches it looks where it expects to find it within its world and re-registers it (kind of expected, to be honest). If I de-register it and delete the DLL from disk, the program falls back to the Outlook security prompt ("tick the box") method, but Excel does not get slowed down at all when launching files with Excel closed.
Side note: Word is unaffected
Thanks in advance
The issue turned out to be a setting within the software that's using Redemption, something calendar-related, as the setting we altered to resolve the issue was related to diary monitoring.
Thanks.

Excel 2016 Upgraded From 2007; Any Workarounds for Apparent Memory Issue?

This isn't really a programming question per se, but at work we have been forced to upgrade from Excel 2007 to Excel 2016, which has caused some productivity issues with respect to opening multiple workbooks at once.
The problem is that our entire file system has a bunch of linked formulas and iterative calculations where it is necessary to have multiple workbooks open at once (around 60+). Previously, with Excel 2007, we were able to do this easily on relatively low-powered computers (4 GB of RAM, with a mediocre hyperthreaded i3 processor), but after the change to Excel 2016 we keep getting an error telling us to "upgrade to 64-bit Excel" or "install more physical memory". The error message can be seen here. We have already upgraded to 64-bit Excel, and on top of that, I tried using coworkers' computers with 8 and 16 GB of RAM, which made little to no difference. Does this imply that Excel 2016 is not well optimized for having this many workbooks open at the same time? Are there solutions or workarounds for this problem? I find it hard to believe that 16 GB of RAM is still insufficient when 2007 was able to handle this process easily; perhaps the change from MDI to SDI means less efficiency?
As an aside: yes, I do wish we were not using Excel for such a computationally expensive process that it may not have been designed for, but moving everything to something like SAS would take a lot of time, I'd imagine, and ultimately I don't make the decisions :(
Thanks for any help.

Apache POI in Windows Server 2012 R2

We have a set of utility programs which read an .xlsx file for some input data and generate reports; Apache POI is used for this purpose. The Excel file has 8 sheets with an average of 50 rows and 20 columns of data. Everything was working fine on a normal Windows 7 box (read: the developer's machine); the file reading would finish in a few seconds.
Recently we moved these jobs to a Windows Server 2012 R2 box, and we have noticed that the last sheet in the Excel file takes a long time to finish reading. I duplicated the last sheet to confirm that this is not a data issue and re-ran the job: the second-to-last sheet (which was the last one in the previous execution) finished reading in milliseconds, and the last one (the duplicated sheet) again got stuck for 15 minutes. My best guess is that the time taken to close the file is getting too high, but that is just a guess with no concrete evidence, and even if that is the case I am not sure why. The only difference between the working and non-working Windows boxes is the OS; all other configurations are similar. I have analyzed the heap and thread dumps and found no issues.
Are there any known compatibility issues between POI and Windows Server boxes? Or is it something related to the code? We are using the POI-XSSF implementation.
OK, we finally found the problem; the issue was with the VM itself. Disk I/O was always at 100%, and file reads/writes were taking a long time to complete, which caused the program to get stuck there. However, we couldn't identify why the disk I/O was so high; we tried suggestions from some blogs, but they didn't work, so we downgraded the OS to Windows Server 2008 and it worked well.
Note that this had nothing to do with POI; it was purely a VM/OS issue.

Does Monitoring An Excel Spreadsheet Via RDP Make It Slower?

We have a massive spreadsheet which does a lot of calculations and not much drawing/writing to the sheets.
My question is: does monitoring the spreadsheet while it is running via RDP actually make it slower?
Put differently, if RDP was disconnected, would this result in improved speed?
I've actually done a lot of work from home via Remote Desktop that involved an Excel workbook (and Access applications) doing lots of hefty calculations. From my experience, I didn't notice any slowdown in the calculations on the Excel sheet, but occasionally the connection would slow, and anything that refreshed the screen a lot would make the PC difficult to use.
The most important thing, however, is to write code that modifies the visual elements of the screen as little as possible. For example, instead of looping through a bunch of cells and setting each one as the active cell to find its value, loop through a set of range values that doesn't require the sheet to refresh. This, by far, has created the biggest performance boost in my VBA code.
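As an illustration of that point, here is a minimal sketch (the sheet and range names are just placeholders): the whole range is read into an array in one shot, so nothing on screen is selected or refreshed while the loop runs.

Sub SumWithoutScreenRefresh()
    Dim data As Variant
    Dim total As Double
    Dim i As Long

    ' One read of the whole range into a 2-D array - no
    ' ActiveCell, no Select, so the screen never refreshes.
    data = Worksheets("Sheet1").Range("A1:A10000").Value

    For i = LBound(data, 1) To UBound(data, 1)
        total = total + data(i, 1)
    Next i

    Debug.Print "Total: " & total
End Sub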
If your code is already fairly optimized, you'll probably not see any difference monitoring it over RDP. However, if monitoring is your issue, you ought to consider outputting data to a separate Excel or text file that might be stored on a shared server. If done correctly, I imagine that would have a smaller impact on your CPU than RDP. This will still allow you to monitor the progress of the Excel application without having to log in.
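For example (a rough sketch; the log path is just a placeholder), progress could be appended to a text file on a share like this:

Sub LogProgress(ByVal msg As String)
    ' Append a timestamped progress line to a log file that can
    ' be watched from another machine instead of over RDP.
    Dim f As Integer
    f = FreeFile
    Open "\\server\share\progress.log" For Append As #f
    Print #f, Format(Now, "yyyy-mm-dd hh:nn:ss") & " " & msg
    Close #f
End Sub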
Just look at the CPU usage of Excel and the RDP server. If Excel isn't getting its 100% while calculating, or if the RDP server seems to be using too much... then yes, RDP is making things slower.