I am currently working on an Excel file that automates a number of actions. About a dozen users each work from their own copy of this file, and I cannot create a shared file because each user has data that is unique to them and cannot be shared with the others.
The problem is that I make regular updates to my VBA code, and every time I have to manually update each user's file.
Is there a way to store my code in a file on the network or on GitHub, where I could modify my macros and have each user's file "read" the code?
I've heard about XLAM files but I haven't been able to get anything functional with them. Is this solution viable?
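To make the idea concrete, here is roughly what I'm imagining (the network path and macro name below are made up, not something I actually have working): each copy of the file would load the shared add-in at startup and run whatever code lives there.

    ' Rough sketch only: open the shared add-in from the network, then run
    ' the central macro, so only the add-in ever needs updating.
    Private Sub Workbook_Open()
        Const ADDIN_PATH As String = "\\server\share\CentralMacros.xlam"  ' hypothetical path
        Workbooks.Open ADDIN_PATH                          ' load the shared code
        Application.Run "'CentralMacros.xlam'!DoUpdates"   ' hypothetical macro name
    End Sub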
I'm working on automating a lot of the data reporting in the business I work at.
It's all various tables originating from a central database and spread out across Excel workbooks.
I'm largely limited to MS office tools at the moment.
Power Query is a great deal faster than the current methods and easier to maintain.
I notice that a lot of the reporting uses the same results over and over again. As such, I can write a query and distribute it to my coworkers as an ODC file, or otherwise through a file server or Teams.
However, loading an ODC copies the raw Power Query (M) code into the workbook.
That means any changes made to the master query have to be manually applied to each file.
Is there a way to update Power Query code across multiple workbooks?
I'm trying to avoid writing database-level queries as much as possible. I have minimal support for them, would prefer not to freeze the system, and learning the IBM iSeries is a disproportionately larger undertaking.
Store the M code in flat files in a OneDrive-synced folder, then load the queries dynamically using Expression.Evaluate. Chris Webb has a great article here: https://blog.crossjoin.co.uk/2014/02/04/loading-power-query-m-code-from-text-files/
An application works with entities that consist of images and texts. For the program, several JPGs, PNGs, TIFFs, text files or JSONs are considered one whole work, but on disk they are a bunch of different files. It's convenient for a user to be able to easily copy works, send them to other people, download and upload them, etc. Since a work is actually several files, it's cumbersome for a user to deal with them.
I could probably zip the required files, but that looks like a heavy workaround. I don't need compression, and the speed of the application is important.
I'm not sure; maybe it's okay to just read all the data from each file and sequentially write it to one final file, structured as JSON or something else... but I suppose there may be a turnkey solution for this.
My question:
How can I pack different files into one so that I can quickly go through them, and copy, move, delete and edit them in a GUI application like any file manager does, while still letting a person manually copy works, send them to a friend, etc.?
I code in Python.
Thank you in advance.
I have an Excel file of about 340 MB which contains more than 2,000 worksheets and dozens of long VBA programs. The file is getting so large that it takes around 10-15 minutes to open, and I often run into "NO RESPONSE" or "NOT ENOUGH USEABLE RESOURCES" when I save or debug the file.
I searched online and people suggest migrating to Access. However, I have never used Access before, so I wonder:
1) How do I migrate the Excel file to Access?
2) Will the VBA programs be carried over to the new Access file?
3) Do I need to modify the Excel VBA code to fit Access?
4) Can Access handle a 300-500 MB file?
Thank you.
A 500 MB Excel file does not mean a 500 MB Access file. Moving from Excel to Access is a very good thought. However, you need to understand the basics first. Excel is completely different from MS Access, except that Access can look similar to Excel in a datasheet form.
Access is a relational database plus graphical front-end software where you can present your data in forms or through components. In a relational database you identify entities and define relationships between them. By doing so you eliminate data inconsistency throughout your application. Proper modelling might reduce your data from 500 MB to just 50 MB.
To answer your questions:
1> How to migrate the Excel file to Access?
First of all, migration in your case needs a fresh new MS Access application. Start by modelling your business first. Read about relational databases; read about MS Access tables, relationships and queries. Think about whether Access is a suitable platform for your business. Create the application in MS Access, and then you can come back to us and ask about migration.
OR:
You can use the current Excel sheets as an external data source and link them as linked tables within the Access application. This is very much not recommended, because you are effectively gaining nothing over your current situation (except queries to find data).
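For reference, linking can be done from the External Data tab in Access or with a short bit of Access VBA. A rough sketch (the file path, table name and sheet contents are placeholders):

    ' Link an Excel sheet into Access as a linked table (acLink)
    ' instead of importing a copy of the data.
    Sub LinkExcelSheet()
        DoCmd.TransferSpreadsheet TransferType:=acLink, _
            SpreadsheetType:=acSpreadsheetTypeExcel12Xml, _
            TableName:="tblSales", _
            FileName:="C:\Data\Workbook.xlsx", _
            HasFieldNames:=True
    End Sub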
2> Will the VBA programs be carried over to the new Access file?
Do you mean VBA UserForms? No, they won't. Access has its own forms and controls, which look better than Excel UserForms. You will have to redesign that part completely in Access.
3> Do I need to modify the Excel VBA code to fit Access?
In most cases YES, unless it's a generic function. Most Excel VBA code refers to a cell or worksheet, which Access does not have! In other words, it depends on how big and complex your code is. Any Excel-specific code has to be adjusted to the MS Access platform, but the adjustments are usually minor rather than a major task. Again, it depends on your code's complexity.
4> Can Access handle a 300-500 MB file?
Yes, it can! The newest versions can go up to 2 GB, but I personally would not stay with Access when I have to work with that amount of data. I would look at splitting/upgrading the database to a proper dedicated database engine such as MS SQL Server or a free MySQL server.
Some advice from me: a 500 MB Excel file is potentially dangerous, and you should seek an alternative very soon. If you are going to fiddle on your own, please always back up, because Access throws random errors which are very hard to understand. Find an experienced IT person to help you before you delete/update/wipe out your data. Good luck.
Interesting. I faced much the same problem years ago on a POS system that recorded the transaction history in Excel via some VBA. I moved to Access and found that Access gets a little more prone to corruption the larger the file gets. I had to build in safeguards to restore from a backup should the file quit on me. I eventually moved to Visual FoxPro, as my VBA could just about be translated straight across. It works to this day, as a matter of fact.
How can I develop an Excel plug-in to edit external data in an Excel data table?
Excel can make connections to external data sources, but as far as I can see they are one-directional, read-only data tables. What I am trying to do is something like the TFS plug-in for Excel. I am sure there are many others like it.
For those who do not know that plug-in:
When installed, the TFS Excel plug-in appears as a new menu in Excel. Through that menu you can open a connection to a TFS server and bring your (work item) records into Excel as an Excel table. You can add new rows or edit the data in the table. Some cells have drop-down lists attached to them, but only valid options are shown in the list, and those differ for each record. You can edit rows in the table and bulk-push those records back to the server.
I don't know if it makes a difference, but the connection and update operations on my data source will go through web services.
I guess this would require some serious development, but I am lost among web pages about external data ranges (which are only for reading). Can someone please direct me to some further reading on the topic?
External Data Ranges will not help you so you can stop reading web pages about them. You're correct that they're read only. You could use them for the read part of your operation, but you'll be doing so much coding around the write part, you might as well just control everything. You just won't get enough benefit from External Data Ranges to warrant using them at all for this type of situation. In my opinion, of course.
If you were reading and writing to a database, you would likely use ActiveX Data Objects (ADO). You would read in a recordset, monitor its changes, then write back to the database using UPDATE, DELETE, and INSERT statements as necessary.
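A minimal ADO sketch of that read-modify-write cycle (the connection string, table and field names are placeholders, and late binding is used so no reference is needed):

    Sub UpdateViaADO()
        Dim cn As Object, rs As Object
        Set cn = CreateObject("ADODB.Connection")
        cn.Open "Provider=SQLOLEDB;Data Source=MyServer;Initial Catalog=MyDb;Integrated Security=SSPI;"
        Set rs = CreateObject("ADODB.Recordset")
        ' Open an updatable recordset: 1 = adOpenKeyset, 3 = adLockOptimistic
        rs.Open "SELECT Id, Status FROM WorkItems WHERE Id = 42", cn, 1, 3
        If Not rs.EOF Then
            rs.Fields("Status").Value = "Closed"
            rs.Update                ' writes the change back to the database
        End If
        rs.Close: cn.Close
    End Sub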
If you'll be interacting with the database through an API, as you seem to indicate, you will probably use the Microsoft XML library, specifically the MSXML2.XMLHTTP object. You can use GET, POST, PUT, DELETE, and anything else you can do through HTTP.
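A hedged sketch of what that looks like from VBA (the URL and JSON payload are placeholders for whatever your web services expect):

    Sub CallApi()
        Dim http As Object
        Set http = CreateObject("MSXML2.XMLHTTP")

        ' Read a record
        http.Open "GET", "https://example.com/api/workitems/42", False
        http.Send
        Debug.Print http.Status, http.responseText

        ' Create a record
        http.Open "POST", "https://example.com/api/workitems", False
        http.setRequestHeader "Content-Type", "application/json"
        http.Send "{""title"":""New item""}"
    End Sub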
If you've never used XMLHTTP before, you'll have a little learning to do. But it's not particularly difficult and there's a ton of info available. The hard part, in my opinion, is tracking the changes made to the Excel sheet. If you allow the user to use Excel's native editing features, it can be difficult to keep track of the changes made. If you go to a total lockdown situation where the user has to, say, use your menu item to delete a record, then you have to ask why you're using Excel (there still may be good reasons, but familiarity with Excel's interface won't be one of them because you'll be replacing it with yours).
Maybe you already have a strategy for this. But if not, search for "detect deleted row with worksheet change event" to get a feel for some of the challenges you'll face. If you have a way forward, then go read up on XMLHTTP and you should be all set.
A little background to my question: I work for a company that is charged with retrieving data from databases in all 50 states and DC. I take this data and reformat it in Excel. Once it's reformatted, I use SQL Server to upload it to our website, vetportal.agdata.net. While some states are not so bad, the information retrieved from others is very painful to sort through.
I have 2 questions:
Can code be written so that a new database can be cross-checked against the old database (our records), updating the information in the old database while also excluding duplicate information?
Can code be written to take a number from an open Excel sheet, switch over to an open website, input the number, search for the individual, extract his/her information, and finally update the Excel file with that information, then move on to the next person? For example, WA State's website is set up so that you can only look up one person at a time, which is very tedious when going through 1,200+ individuals.
I have some experience with C++ and have written programs that read from other files, but mainly only equations or values which then get evaluated in my code, so I know this is a bit different.
I guess if you have a repetitive technological problem you can solve it with some programming.
Your questions:
You can do that with a little app that uses SQL to read the information from the new database and check/update the information in the old database.
This one is a little more difficult, but I guess it can be done. In C++ I don't know of any library that can open Excel files, but in Java you have Apache POI. That way you can open your Excel file in the application; then, while iterating through the information, you open the website from the application, submit the form you want with your number, and get and parse the response.
If you want to do this in Java, I think it will not take you too much time if you know C++. The only exception is opening the website in Java and parsing the response, which will take more time to learn and do.
Hope it helps!
1) Yes. Depending on the databases, you may be able to do a db to db connection. You could then write a query using an INNER JOIN to update information in the old database and exclude duplicates.
2) There are a few ways to approach this problem. Depending on your language (mine is PHP), you could use an open-source class such as PHPExcel to open the sheet and fetch & update website data (via cURL). You could also write some VBA within Excel to do similar work.
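If you go the VBA route, a rough sketch of that one-at-a-time lookup loop might look like the following (the URL pattern, sheet name, and column layout are placeholders, and the raw response would still need to be parsed for the fields you want):

    Sub LookUpEachPerson()
        Dim http As Object, ws As Worksheet
        Dim r As Long, lastRow As Long
        Set http = CreateObject("MSXML2.XMLHTTP")
        Set ws = ThisWorkbook.Worksheets("Lookups")          ' hypothetical sheet

        lastRow = ws.Cells(ws.Rows.Count, "A").End(xlUp).Row
        For r = 2 To lastRow                                 ' ID numbers in column A
            http.Open "GET", "https://example.wa.gov/lookup?id=" & ws.Cells(r, "A").Value, False
            http.Send
            ws.Cells(r, "B").Value = http.responseText       ' raw response; parse as needed
        Next r
    End Sub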