In my organization (a baggage handling system) we need to record every error that occurs in the system. It is a new system, so we are currently recording the errors on paper.
What I need is to build a program that makes this easier:
an error recorder program, which stores the values in Excel sheets (as a backup) and can also export a daily report, in an Excel sheet, containing all the errors that occurred in one day.
Generally speaking: how do I make a program interact (input, output) with Excel sheets? And which is better for me, Java or VB.NET?
Thank you all in advance.
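As a rough illustration of the VB.NET route, here is a minimal sketch that appends one error record to a workbook through the Excel COM interop; the file path, sheet, and column layout are all assumptions:

Imports Microsoft.Office.Interop.Excel ' needs a reference to the Excel interop assembly

Module ErrorRecorder
    Sub LogError(timestamp As Date, description As String)
        Dim xl As New Application()
        Dim wb As Workbook = xl.Workbooks.Open("C:\Logs\errors.xlsx") ' hypothetical backup file
        Dim ws As Worksheet = CType(wb.Worksheets(1), Worksheet)
        ' Find the first empty row below the existing entries in column A.
        Dim lastCell As Range = CType(ws.Cells(ws.Rows.Count, 1), Range)
        Dim row As Integer = lastCell.End(XlDirection.xlUp).Row + 1
        ws.Cells(row, 1).Value = timestamp
        ws.Cells(row, 2).Value = description
        wb.Save()
        wb.Close()
        xl.Quit()
    End Sub
End Module

The daily report could then be built by reading this sheet back, filtering by date, and writing the matching rows out the same way.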
I'm using Excel VBA to pull data from an MS Access database, using Excel 2013 and Access 2013, 32-bit. The code has historically used:
Provider=Microsoft.Jet.OLEDB.4.0;
However, some computers have been upgraded to 64-bit Excel 2016, and the Jet provider is not available for 64-bit. I have changed the code to:
Provider=Microsoft.ACE.OLEDB.12.0;
which works on both 64-bit and 32-bit systems. However, I have noticed a significant speed drop in loading/saving data just from changing this line. Does anyone know why this might be, and how I can improve it?
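For context, the connection is opened from VBA roughly like this (a trimmed sketch; the database path and table name are placeholders):

Dim cn As Object, rs As Object
Set cn = CreateObject("ADODB.Connection")
' Same code as before, with only the provider line changed:
cn.Open "Provider=Microsoft.ACE.OLEDB.12.0;" & _
        "Data Source=C:\Data\backend.accdb;"
Set rs = CreateObject("ADODB.Recordset")
rs.Open "SELECT * FROM SomeTable", cn
' ... load/save data ...
rs.Close
cn.Close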
You are correct that you have to choose the ACE provider for 64-bit.
And the big advantage of JET was (and still is) that it is installed on all copies of Windows by default, so there is no need to install Access, the runtime, or, previously, the Office connectivity package.
As for performance? There have been a few comments about performance in regard to 64-bit ACE.
However, one trick or suggestion is to ensure that the connection stays open. In other words, are you sure it is the row processing that is slow, and not the overall time?
Perhaps put a timing test in your code, e.g.:
Dim t As Single
t = Timer()
' your code here
Debug.Print Timer() - t
The above will print the elapsed time to the debug window (while in the VBA IDE, hit Ctrl+G to display the Immediate/debug window).
The reason I suggest the force-open idea is that you often find ACE takes a VERY long time to open, but once open, the data reading has good performance (the same as before).
So, I suggest you check and try this fix.
Open a table (any table) and KEEP it open. Now run your existing code (which may well open and close other tables). The issue is that when ACE attempts to open a table, it tries to put locks on the mdb/accdb file, and it is this process that takes a VERY, VERY long time.
However, if you force (keep) open one table, then this VERY slow process of ACE attempting to lock the file for read/write does not occur each time you execute a query or create additional recordsets in code.
So, if the row-reading speed is fast, but the time to START and open is very slow, then before you run and test your routines, force open a table into some recordset (keep it active and in scope), and THEN try your code.
I find 9 times out of 10 that this eliminates the slow speed, and often the results are nothing short of spectacular (it will run faster than before!).
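A minimal sketch of that trick, assuming an ADO connection cn is already open (the table name is a placeholder):

Dim rsKeepOpen As Object
Set rsKeepOpen = CreateObject("ADODB.Recordset")
' Open any small table and keep this recordset in scope for the
' whole run: it holds the lock on the accdb/mdb file, so later
' queries and recordsets skip the slow locking step.
rsKeepOpen.Open "SELECT * FROM tblAnySmallTable", cn
' ... run your existing routines here ...
rsKeepOpen.Close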
I developed a VB.NET program that uses Excel files to generate some reports.
Because the program takes a lot of time to generate a report, I usually do other things while it is running. The problem is that sometimes I need to open other Excel files, and the Excel files used by the program are then shown to me. I want to keep the files being processed hidden even when I open other Excel files. Is this possible? Thanks
The FileSystem.Lock method controls access by other processes to all or part of a file opened by using the FileOpen function.
The My feature gives you better productivity and performance in file I/O operations than Lock and Unlock. For more information, see FileSystem.
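A rough sketch of that approach in VB.NET (the path is a placeholder, and this assumes the report job reads the file through this handle rather than through an Excel instance that already has it open):

Dim fileNum As Integer = FreeFile()
' Open the workbook's file exclusively so other processes,
' including another Excel instance, cannot open it.
FileOpen(fileNum, "C:\Reports\data.xlsx", OpenMode.Binary,
         OpenAccess.ReadWrite, OpenShare.LockReadWrite)
Lock(fileNum)       ' lock the whole file
' ... generate the report ...
Unlock(fileNum)
FileClose(fileNum)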
I have been creating a tool in Excel that is based around data entry by a user from a UserForm. The entire program and the transformations done through VBA span approximately 950 lines of code. Additionally, I have several Power Queries built to generate live-updating stats as new data is entered. The file as a whole is 64,597 KB. Currently, when I open the file it takes a decent amount of time to open, with some "Excel not responding" errors before the opening process resumes. I am nervous that once I roll this program out to the users it may "break." I am still fairly new to programming and don't have much formal training in best practices, so any tips for improving the speed/efficiency would be greatly appreciated!
I have a VBA script that extracts information from huge text files and does a lot of data manipulation and calculations on the extracted info. I have about 1,000 files, and each takes an hour to finish.
I would like to run the script on as many computers as possible (among others, EC2 instances) to reduce the time needed to finish the job. But how do I coordinate the work?
I have tried two ways. First, I set up Dropbox as a network drive with one txt file holding the current last job number; the VBA would access it, start the next job, and update the number. But there is apparently too much lag between an update to a file on one computer and that update reaching the rest for this to be practical. Second, I looked for a simple "private" counter service online that updates on each visit, so a machine would access the page and read the number, and the page would update the number for the next visit from another computer. But I have found no such service.
Any suggestions on how to coordinate such tasks between different computers in VBA?
First of all, if you can, use a proper programming language, for example C#, for easy parallel processing.
If you must use VBA, then optimize your code first. Can you show us the code?
Edit:
If you must, then you could do the following. First, you need some sort of file server to store all the text files in one folder.
Then, in the macro, for each .txt file in the folder: try to open the file in exclusive mode. If it can be opened, run your code on it, and once your code is finished, move the file elsewhere; otherwise, move on to the next .txt file.
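A minimal VBA sketch of that loop, assuming a shared job folder with a "done" subfolder (both paths are placeholders):

Sub ProcessNextFreeFile()
    Const FOLDER As String = "\\server\jobs\"
    Const DONE As String = "\\server\jobs\done\"
    Dim f As String, n As Integer, grabbed As Boolean
    f = Dir(FOLDER & "*.txt")
    Do While Len(f) > 0
        n = FreeFile
        On Error Resume Next
        ' Exclusive open fails if another worker already holds this file.
        Open FOLDER & f For Binary Access Read Write Lock Read Write As #n
        grabbed = (Err.Number = 0)
        Err.Clear
        On Error GoTo 0
        If grabbed Then
            ' ... run the extraction/calculation code on FOLDER & f ...
            Close #n
            ' Move the finished file so no other worker picks it up again.
            Name FOLDER & f As DONE & f
        End If
        f = Dir   ' next .txt file
    Loop
End Sub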
We have a massive spreadsheet which does a lot of calculations and not much drawing/writing to the sheets.
My question is: does monitoring the spreadsheet while it is running via RDP actually make it slower?
Put differently, if RDP was disconnected, would this result in improved speed?
I've actually done a lot of work from home via Remote Desktop that involved an Excel Workbook (and Access Applications) doing lots of hefty calculations. From my experience, I didn't notice any slowdown in the calculations on the Excel sheet, but occasionally the connection would slow and anything that refreshed the screen a lot would make the PC difficult to use.
The most important thing, however, is to write code that modifies the visual elements of the screen as little as possible. For example, instead of looping through a bunch of cells and setting each one as the active cell to find its value, loop through a set of range values that don't require the sheet to refresh. This, by far, has created the biggest performance boost in my VBA code.
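To illustrate (a minimal sketch; the sheet name and range are placeholders):

Dim total As Double, r As Long

' Slow: selecting each cell forces Excel to redraw the screen.
For r = 1 To 10000
    ActiveSheet.Cells(r, 1).Select
    total = total + ActiveCell.Value
Next r

' Fast: read the range into an array once and work in memory.
Dim v As Variant
v = Worksheets("Data").Range("A1:A10000").Value
total = 0
For r = 1 To UBound(v, 1)
    total = total + v(r, 1)
Next r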
If your code is already fairly optimized, you'll probably not see any difference monitoring it over RDP. However, if monitoring is your issue, you ought to consider outputting progress data to a separate Excel or text file stored on a shared server. If done correctly, I imagine that would have a smaller impact on your CPU than RDP. This will still allow you to monitor the progress of the Excel application without having to log in.
Just look at the CPU usage of Excel and the RDP server. If Excel isn't getting its 100% while calculating, or if the RDP server seems to be using too much... then yes, RDP is making things slower.