Essentially, I have an Excel file that will need to be worked on concurrently for a prolonged period of time. In the past I would simply 'Share the Workbook', which let users on the same network view and change the file at the same time, but in this particular instance everyone is disconnected from a central network and there is no mechanism available to place them on one. Does anyone know of a service that will allow all parties to edit this document concurrently in a central location?
A MOSS box came to mind, but that seems like overkill for a single document. Thanks.
You could use SharePoint Foundation 2010 (free if you have Windows Server 2008, so perhaps a better fit than paying for MOSS), import the Excel file as a list, and then sync the list with Excel.
Alternatively, Google Docs has a spreadsheet program; I think it allows importing Excel spreadsheets.
Related
I have several 2016 MS Access database applications that are well designed and implemented over a shared network. Currently there are minimal issues, and the databases are used by 100+ people. All users have front ends that are automatically updated whenever the master copy, also on the shared network, changes versions. These front ends took 2-3 years to develop and were designed around the company's limitations at the time.
The company is considering moving entirely to Microsoft Office 365, but I have been told that they plan to hold on to MS Access as long as possible. That said, I'd rather take a proactive approach. I have done some tests with Microsoft SharePoint and found that as long as the linked list meets SharePoint's requirements, like the 'Title' field and the AutoNumber ID field, I can keep all of the complex queries as long as they stay in my front ends. This is ideal as an interim solution, because it would buy time to completely upgrade my front end to something more current.
Is there a better approach to preserving the front-end design without having to start again from scratch? If not, what platform would be best, given that I have access to Visual Studio and SQL Server?
Maybe what I'm going to ask is crazy, but it's what they're asking me and I don't know if it can be done.
Is it possible to expose an OData or REST service with Excel? For example, Excel listens for a request and answers with the sum of two numbers.
Thank you very much for your answers.
Please note that questions asking for product suggestions are usually discouraged, as they attract highly opinionated responses!
While this is possible, without compatible infrastructure it is usually easier to create your own API. It can still use an Excel workbook as a backend, using techniques commonly referred to as Excel Automation.
One MS example: How to automate Microsoft Excel from Microsoft Visual C#.NET
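To make the "sum of two numbers" example from the question concrete, here is a minimal sketch of that approach: a small HTTP listener in C# driving a workbook through the interop API, in the spirit of the MS article above. It assumes Excel is installed on the machine, and the workbook path, port, and cell addresses are placeholders I've invented for illustration. Keep in mind that Microsoft explicitly discourages unattended server-side automation of Office, so a pattern like this suits a small internal tool rather than a production service.

    using System;
    using System.Net;
    using System.Text;
    using Excel = Microsoft.Office.Interop.Excel;

    // Minimal "sum of two numbers" service backed by a workbook.
    // Requires a desktop install of Excel and a COM reference to
    // Microsoft.Office.Interop.Excel.
    class ExcelSumService
    {
        static void Main()
        {
            var excel = new Excel.Application();
            Excel.Workbook wb = excel.Workbooks.Open(@"C:\data\calc.xlsx"); // placeholder path
            Excel.Worksheet ws = (Excel.Worksheet)wb.Worksheets[1];

            var listener = new HttpListener();
            listener.Prefixes.Add("http://localhost:8080/sum/"); // e.g. /sum/?a=1&b=2
            listener.Start();

            while (true)
            {
                HttpListenerContext ctx = listener.GetContext();
                // Push the operands into the sheet; A3 holds the formula =A1+A2,
                // so Excel recalculates the result as soon as the inputs change.
                ws.Range["A1"].Value2 = double.Parse(ctx.Request.QueryString["a"]);
                ws.Range["A2"].Value2 = double.Parse(ctx.Request.QueryString["b"]);
                double result = (double)ws.Range["A3"].Value2;

                byte[] body = Encoding.UTF8.GetBytes(result.ToString());
                ctx.Response.OutputStream.Write(body, 0, body.Length);
                ctx.Response.Close();
                // A real tool would also call wb.Close(false) and excel.Quit() on shutdown.
            }
        }
    }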
As part of requirements gathering, since this is a very specific request, you should ask them to produce at least a sample set of the queries and the expected URL format they intend to use.
A request like this usually means they are coming across from an existing platform, or they are trying to copy the functionality of another provider/competitor.
MS have a general solution for this called the Excel Services REST API, which allows you to upload workbooks to SharePoint Services and query against them. The idea is that you now have a single source of "truth": effectively a live workbook, without passing copies of it around.
NOTE:
The Excel Services REST API applies to SharePoint and SharePoint 2016 on-premises. For Office 365 Education, Business, and Enterprise accounts, use the Excel REST APIs that are part of the Microsoft Graph endpoint.
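For the on-premises case, the query is just an HTTP GET against a URL with a documented shape: the _vti_bin/ExcelRest.aspx entry point, followed by the workbook's library path and a Model element such as a range. A minimal sketch in C#; the server, library, workbook, and range names are placeholders:

    using System;
    using System.Net.Http;
    using System.Threading.Tasks;

    class ExcelRestClient
    {
        static async Task Main()
        {
            // General shape:
            // http://<server>/_vti_bin/ExcelRest.aspx/<path>/<workbook>/Model/Ranges('<range>')
            string url = "http://contoso/_vti_bin/ExcelRest.aspx" +
                         "/Shared%20Documents/Sales.xlsx/Model/Ranges('Sheet1!A1')?$format=atom";

            var handler = new HttpClientHandler { UseDefaultCredentials = true }; // Windows auth
            using var client = new HttpClient(handler);
            string atom = await client.GetStringAsync(url);
            Console.WriteLine(atom); // Atom XML describing the requested cell
        }
    }

The same URL can also return HTML, or a PNG of a chart, by changing the $format parameter, which is what makes it handy for embedding live workbook values elsewhere.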
The other MS offering that can create a REST interface for Excel is Power BI; basic instructions on how to upload an offline workbook to Power BI can be found here.
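The upload itself can also be scripted against the Power BI REST API's imports endpoint. A sketch, assuming you already have an Azure AD access token for the Power BI service; token acquisition is out of scope here, and the file path and dataset name are placeholders:

    using System;
    using System.IO;
    using System.Net.Http;
    using System.Net.Http.Headers;
    using System.Threading.Tasks;

    class PowerBiUpload
    {
        static async Task Main()
        {
            string accessToken = "<Azure AD access token>"; // obtain via MSAL, for example
            string url = "https://api.powerbi.com/v1.0/myorg/imports?datasetDisplayName=Sales";

            using var client = new HttpClient();
            client.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Bearer", accessToken);

            using var content = new MultipartFormDataContent();
            content.Add(new ByteArrayContent(File.ReadAllBytes(@"C:\data\Sales.xlsx")),
                        "file", "Sales.xlsx");

            HttpResponseMessage resp = await client.PostAsync(url, content);
            Console.WriteLine(resp.StatusCode); // 202 Accepted once the import is queued
        }
    }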
I'm offering this advice because I have fielded this type of request from management and clients in the past. It is important to help them fully understand the reasons behind the request before proceeding, as there can be significant licensing and setup costs compared to implementing a custom API manually.
Short version: (after finding out the answer)
I have an Excel VBA application with an MS Access database (.accdb) on a SharePoint library.
The behavior that was examined (and apparently documented - see answer):
It takes a long time to perform the ADODB Open and Close methods (~15 seconds).
If multiple users are connected to the database at the same time, only the changes made by the user who closed the database connection LAST are saved. Changing cursor types, cursor locations, or lock types didn't help. No error is shown.
Why does this happen?
Original Question:
First question here. Hope this isn't too wordy:
I've built an Excel application using VBA to communicate with an MS Access database (.accdb) that should support concurrent users. It is meant to be placed on a SharePoint site as an accessible file (not integrated into it in any other way). When I was testing the Excel file and the database on my home network it worked like a charm, transactions and all. However, once I migrated it to SharePoint, I noticed some extreme differences from the way it behaved on my personal network:
The ADODB .Open and .Close methods take at least 15 seconds each (freezing Excel until they finish). Because of this, I've decided to open and close the connection only once during the lifetime of the application, and to restore the connection if it breaks along the way. I'm aware that this is strongly discouraged, but I can't afford to have my users wait that long. This hasn't caused any problems that I'm aware of, apart perhaps from the one I'm about to explain.
The problem: changes aren't saved to the actual database until all active user connections to the database are closed, even if the only thing still active is the connection. Every update goes through without errors for each user, and each user can see his/her own changes, I suppose, until all connections are terminated. It is as if a local copy of the database were stored on the user's computer (hence the long wait while opening and closing the connection) and updates were applied to that temporary copy, not the actual one.
I tried all possible combinations of cursor types, cursor locations, lock types, and whatnot (and found out along the way that dynamic cursors aren't supported in my case - I wonder if that's the answer).
Because of this I have no choice but to make the program accessible to only one user at a time; otherwise changes seem to get lost along the way, making the program highly unreliable.
I read something about having to "flush the buffer" or "refresh the cursor". Is this even possible or necessary? Is it the issue here? If I'm using a keyset cursor, shouldn't my edited records be visible to all other users? (I'm not talking about new records.)
For what it's worth, I map the path to the SharePoint folder before accessing it.
Have any of you experienced something like this? Or have any suggestions?
If you need samples of my code I'll post them soon. Thanks so much!
I found the solution to my problem:
Although you can save an Access database file to OneDrive or a SharePoint document library, we recommend that you avoid opening an Access database from these locations. The file may be downloaded locally for editing and then uploaded again once you save your changes to SharePoint. If more than one person opens the Access database from SharePoint, multiple copies of the database may get created and some unexpected behaviors may occur. This recommendation applies to all types of Access files including a single database, a split database, and the .accdb, .accdc, .accde, and .accdr file formats. For more information on deploying Access, see Deploy an Access application.
Source: Ways to share an Access desktop database
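For contrast, the supported multi-user setup is an ordinary SMB file share, where the Access database engine coordinates concurrent writers through its .laccdb lock file. A minimal sketch of connecting that way, here from C# via the ACE OLE DB provider (server, share, table, and column names are placeholders; the same connection string works from VBA's ADODB as well):

    using System;
    using System.Data.OleDb;

    class SharedAccessDb
    {
        static void Main()
        {
            // A UNC path on a real file share, not a SharePoint library: this lets the
            // Access engine manage concurrent access via the .laccdb lock file.
            string connStr = @"Provider=Microsoft.ACE.OLEDB.12.0;" +
                             @"Data Source=\\fileserver\share\app.accdb;";
            using var conn = new OleDbConnection(connStr);
            conn.Open();

            using var cmd = new OleDbCommand(
                "UPDATE Settings SET LastUser = ? WHERE Id = 1", conn); // ? = positional parameter
            cmd.Parameters.AddWithValue("@LastUser", Environment.UserName);
            cmd.ExecuteNonQuery(); // committed immediately; other users see it on their next read
        }
    }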
Every month I need to combine my company's energy consumption figures from various sources. I do the calculations in the Excel sheets I receive each month. How do I combine all the sheets into a dashboard, and also have the dashboard update automatically every month once the Excel files are updated?
Which is the best form of Tableau to use (Public, Desktop, or Server)?
What exactly is the difference among the three?
Are Excel sheets a good data source in Tableau?
You are asking a lot of questions which should probably be raised separately, but I will try to answer some of them anyway since they all relate to the same use case.
1. How do I combine and make a dashboard
Since Tableau 9.3 you can use Union. This combines all your Excel files into a single data source. I think your data sources should, however, have the same structure, meaning the sheets containing the information should have the same columns.
You can do this dynamically and automatically using a wildcard search, which will try to add all files that, for example, are located in the same folder.
More information on this here.
Once you have at least one file as a data source, you can start creating a dashboard.
2. Which is the best form of Tableau
I don't think you truly understand the difference between the Tableau applications you mention.
You will need Tableau Desktop to actually create a dashboard.
If you want to share this dashboard through the web, you will need Tableau Server, Tableau Public, or Tableau Online. Everything published on Tableau Public is publicly available, so if your data is considered restricted or sensitive, or should not be shared outside your company, you should not consider this option.
Tableau Server, on the other hand, is server software you can install on your own host; it allows you to publish your dashboards and sheets so that people with a Tableau Server license can access them through a web interface.
Then there is Tableau Online, which offers almost the same thing except that Tableau takes care of the hosting. This is the SaaS solution for making your dashboards available online.
Lastly there is Tableau Reader, a free desktop application that can open your Tableau workbooks but cannot modify them and has limited access to external data sources.
3. Is Excel a good data source
This really depends on your use case and is probably opinion-based. Given the possibility of union and the ability to bring in and update data automatically, I think Excel files can be a useful source. What you need to consider is where the Excel files are stored, how you will connect to them, and how many users will need to access them. If other users can easily modify the Excel file and introduce errors, that is another downside of using it as a source.
When you publish your dashboard on, for example, Tableau Server and want the dashboard to update automatically there as well, the Excel file needs to be accessible from the server too and should not be packaged inside the dashboard. If none of the above is an issue for you, then Excel is, at the moment, a fine choice.
I've been asked to figure out a way to do this, so please fill me in on whether this is even possible or if it shouldn't be done.
The goal is to automate data entry into VMS (we use E-Term32 for connecting to VMS). Things that have been discussed for this purpose: Excel spreadsheets, Dynamic Data Exchange, the macro tools available in E-Term32 (Emulation Command Language - ECL), OLE Automation, etc.
The envisioned process would go like this:
Receive Excel file (or other data format like a text file)
Connect to VMS
Run Macro
Macro navigates the menu system and uses data from Excel file to enter data
I know there are "better" or easier ways to do this like building an application to enter the data, but my supervisors are concerned about circumventing the business logic built into the "Blue Screen" menu/applications. They are also sticklers on building new applications for stuff like this anyways.
How is the data stored on OpenVMS? May we assume native RMS (indexed) files, or some database (RDB, Oracle, Adabas, ...)?
Either way, it is sure to be perfectly possible to write directly to the datastore through some ODBC or JDBC method, freeware or commercial (ConnX, Attunity, ...). Just google: OpenVMS ODBC.
Once you find a method for direct data access, you should indeed be concerned about the business logic: field formatting, value ranges, foreign keys, ...
Thus access can only be granted to (software managed by) trusted team players.
But that can be perfectly manageable and you may find the new method can be made more robust than those green-screen apps.
If direct data access is not negotiable, then there are still many options.
Screen-scrapers have been built before; you should not attempt to write one from scratch.
Check out commercial terminal-centric modernization tools like: http://www.rocketsoftware.com/products/rocket-legasuite-gui
presentation: http://de.openvms.org/Spring2009/03-Dutch_Police_FINAL.pdf
(I am not associated with the fine folks at Rocket; it is just one example I am aware of. There are surely more (commercial) options.)
Now about those business rules. How clean is the implementation?
Strict form/function separation? Probably not, otherwise you would not be asking.
There are several RPC and gSOAP methods available, free and for a fee, that allow one to call OpenVMS service routines, passing in external data. That's probably the best path forward.
The company I work for, Attunity, sells such 'legacy adapter' tools, in addition to direct ODBC/JDBC data access to the files.
For example, using Attunity Connect software you can map a row in a table to the call of a subroutine. The basic plan would be to issue a plain SQL INSERT statement from Linux or Windows against an ODBC data source that is connected to an OpenVMS target. The Connect software will then call an OpenVMS subroutine in a shareable library to process the row, using, or at least re-using, the existing business logic for validation, normalization, or de-normalization as the case may be. And yes, they can also expose a SELECT or MODIFY for lookups that are more complex than can be described in SQL.
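A sketch of what the client side of that plan could look like over ODBC; the DSN, table, and column names are placeholders, and the DSN would point at an OpenVMS data source exposed through a gateway such as ConnX or Attunity Connect:

    using System;
    using System.Data.Odbc;

    class VmsOrderInsert
    {
        static void Main()
        {
            // The DSN hides all the OpenVMS plumbing; to the client this is a plain insert.
            using var conn = new OdbcConnection("DSN=VMS_ORDERS;UID=batchuser;PWD=secret");
            conn.Open();

            using var cmd = new OdbcCommand(
                "INSERT INTO orders (order_id, part_no, qty) VALUES (?, ?, ?)", conn);
            cmd.Parameters.AddWithValue("@order_id", 1001);
            cmd.Parameters.AddWithValue("@part_no", "X-42");
            cmd.Parameters.AddWithValue("@qty", 5);

            // On the VMS side the gateway can map this row to a call into an existing
            // subroutine, so the old validation and normalization logic still runs.
            cmd.ExecuteNonQuery();
        }
    }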
Everybody is happy! You can use modern tools, and they can keep using the old code and checks.
And of course another time-honored method is simply to have an FTP drop point for data to be entered. An OpenVMS job scans an agreed-upon directory for 'action' files and runs an OpenVMS program to process the data in a fashion similar to the terminal UI app: COBOL, BASIC, re-using as much of the existing terminal code and logic as possible.
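The client side of such a drop point is trivial to script. A sketch; the host, credentials, directory, file-naming convention, and record layout are all placeholders to be agreed with whoever writes the OpenVMS scanning job:

    using System;
    using System.IO;
    using System.Net;
    using System.Text;

    class DropPointUpload
    {
        static void Main()
        {
            // Timestamped file name so the scanning job can process files in order.
            string fileName = $"order_{DateTime.UtcNow:yyyyMMddHHmmss}.csv";
            var request = (FtpWebRequest)WebRequest.Create(
                $"ftp://vmshost/action_files/{fileName}");
            request.Method = WebRequestMethods.Ftp.UploadFile;
            request.Credentials = new NetworkCredential("batchuser", "secret");

            byte[] payload = Encoding.ASCII.GetBytes("1001,X-42,5\r\n"); // one action record
            using (Stream s = request.GetRequestStream())
                s.Write(payload, 0, payload.Length);

            using var response = (FtpWebResponse)request.GetResponse();
            Console.WriteLine(response.StatusDescription); // e.g. "226 Transfer complete."
        }
    }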
Good luck!
Hein