I have a custom document-locking process built into an SSJS object. When I click the Edit button on a document in read mode, I call a method that sets a lock date/time and lock owner on the back-end document and then returns true. The ChangeDocumentMode simple action can then be used to change the document to edit mode. However, the first time I save the document (such as with a simple action), it creates a conflict document. It is likely that the front-end document is not aware of the back-end modification and save I performed before going into edit mode.
If I change this process so that my document-locking code sets the two back-end fields and then uses context.redirectToPage, the document opens in edit mode and saving it from the UI does not create any conflict documents. However, if after using my code to unlock the document I use the Open Page simple action to go to "Previous Page" to exit the document, it only goes back to read mode instead of actually closing the document. I am fairly sure the initial redirectToPage disrupts the navigation history and causes this problem.
The question: Does anyone have a suggestion on how I can lock the document before going into edit mode, go into edit mode, save without causing conflict documents, and still be able to exit using the Open Page simple action (after unlocking the document)?
Here is a sample of the relevant code for locking, including code to go into edit mode:
thisDoc.replaceItemValue("LockOwner",context.getUser().getCommonName());
thisDoc.replaceItemValue("LockDate",session.createDateTime(#Now()));
thisDoc.save();
var url = view.getPageName()+"?action=editDocument&documentId="+thisDoc.getNoteID();
context.redirectToPage(url);
It depends on your use case. If your application is the only way users access the documents, I would recommend not writing anything into the documents for locking; you only end up needing an admin unlock function for users who disconnect (network failure, closed browser, crash) before unlocking.
The way locking is done, e.g. in WebDAV, is an in-server-memory time lock that a call renews every 30 seconds. You would use a classic Ajax call for that.
The openNTF WebDAV for Domino project has the server side of such a locking mechanism; you might want to copy it from there.
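As a rough illustration only, here is a minimal SSJS sketch of such an in-memory time lock kept in applicationScope; the key prefix, the 30-second timeout, and the function names are my assumptions, not the openNTF project's actual code:

// Acquire or renew an in-memory lock for a document UNID; returns true if the current user holds it.
function acquireLock(unid) {
    var now = new Date().getTime();
    var user = context.getUser().getCommonName();
    var entry = applicationScope.get("lock_" + unid);
    // Take the lock if it is free, expired (older than 30 seconds), or already ours
    if (entry == null || (now - entry.time) > 30000 || entry.owner == user) {
        applicationScope.put("lock_" + unid, { owner: user, time: now });
        return true;
    }
    return false;
}

// Release the lock when the user is done editing
function releaseLock(unid) {
    applicationScope.remove("lock_" + unid);
}

A periodic Ajax call from the open page (e.g. every 30 seconds) would simply call acquireLock again to renew the expiry, so an abandoned session releases itself automatically.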
In case you must write into the document: change the sequence and update the document in the queryOpen event while it is still in read mode; that also covers the case where a user has bookmarked the edit URL.
Let us know how it goes!
I have a tool that uses an external data connection to keep a dashboard updated from a single CSV file that contains a data log. I don't want anyone touching the log itself and fiddling with it, so I have put together a small dashboard that is fed from the data displayed by the connection.
I have the connection set to update on open and every 10 minutes in the background, but it does not do this. Instead, I need to manually press refresh, whereupon I get the following message:
'Microsoft Office has identified a potential security concern' message box example
I absolutely trust the source in question, but do not want my user to have to click refresh every now and then to make sure their data is up to date.
How can I make this connection 'trusted' and have this persist through sessions?
(Apologies if I have missed anything or this isn't the right way to ask this question. I'm new! :) )
Moved the tool and its peripheral files to a (trusted) shared drive, and no problems.
I am relatively new to VBA and I need a solution to a problem I have.
Currently I have two Excel documents: one is a form where the user enters data, which is then sent to a second document that acts as a database for this information. My issue is that when people are looking at the database, data cannot be sent to it because the file is open, and there are overwrite errors.
It is my intention to have the database's location hidden away on a network drive so it cannot be found, meaning the only way to access it is through a button on the first document.
Is there a way that, when the open button is pressed to access the database, a copy of the document is opened instead of the actual document itself, so data can still be sent?
So, you have a few options here:
a) Use an actual database. Not so much to solve this specific problem (though it would do that too), but because making a "database" in Excel is usually a terrible idea in the long run. But I know that is not always possible, and it's overkill in many situations, so...
b) Have that "database" workbook set as "Shared Workbook". Shared Workbooks come with their own host of issues (namely, can't edit certain aspects like the VBA in them, conditional formatting, and others). But if it's strictly for reading and writing data, it can work. It also comes with its own version control, so that's good.
c) What you originally asked. It is quite simple really: as you said, you make a copy of the document, ideally in one of the user's local folders. You can do this using FSO (see "VBA to copy a file from one directory to another"); a minimal sketch follows below.
The only issue with this last approach is that you need some way to track when the user has closed that temporary file, in order to delete it (if you absolutely don't want to leave any traces). Otherwise, you can just leave it, and overwrite that same file each time the user presses the "View Database" button.
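Here is a rough VBA sketch of option c), under assumed paths and names (the network path, the temp-folder copy, and the procedure name are placeholders, not your actual setup):

' Copy the "database" workbook to the user's local temp folder and open the copy
' read-only, so the original file stays free for the form to write to.
Sub OpenDatabaseCopy()
    Dim fso As Object
    Dim srcPath As String, dstPath As String

    srcPath = "\\NetworkDrive\Hidden\Database.xlsx"       ' assumed location of the real file
    dstPath = Environ$("TEMP") & "\Database_copy.xlsx"    ' local throwaway copy

    Set fso = CreateObject("Scripting.FileSystemObject")
    fso.CopyFile srcPath, dstPath, True                   ' overwrite any previous copy

    Workbooks.Open Filename:=dstPath, ReadOnly:=True
End Sub

You can either delete the copy when the user closes it or simply overwrite the same file the next time the button is pressed, as described above.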
We have an application to track work orders (hundreds of thousands) built on an Oracle database. Data entry is cumbersome and report features are non-existent. IT is inflexible, and we do not get support from that end. Accordingly, users have created Excel "tools" to run queries and make sense of data using ADO or ODBC connections.
What we also need is a way to record comments on specific work order (WO) records and have those comments travel with those records somehow. There are multiple users using their own spreadsheets, all querying the same database.
I'm envisioning a junction-table approach, perhaps using Access, where some VBA could take a user's comment from the row in the worksheet, capture the WO number, user ID, date, and comment text, and store it in an Access table. Those fields could then be retrieved by some more code. This would allow any user to see all comments by any other user related to a specific WO.
I'd greatly appreciate feedback ... on the practicality; preferably constructive, but brutally honest is OK too.
Much thanks,
Kevin
Apart from this being a horrible mess, you could:
1. Use SQL Server instead of Access; you can get a free copy that will probably cater for your needs. Access will also work, but it is a bodge. SQL Server is more professional!
2. I would avoid doing data entry in Excel. You could build a front end (possibly in Access with a SQL Server back end) that allows data entry. You could add pretty simple code to all your workbooks, which would reference code held in an Excel add-in. The add-in has code to simply open an Access database, open a form, and find the WO that the user was viewing in Excel.
2a. Or you could use an Excel add-in with a data-entry form to do data entry, but beware of managing the locking, refreshing, and updating of displayed data.
Creating Excel add-ins is easy, (re-)distributing them is easy, and Access (or whatever front end you use) is designed to do record management (i.e. lock and update, lock and cancel, or just view), etc. Plus, you want to avoid adding the same or similar code to all your workbooks.
Each user's workbook would have very simple code, just to tell the add-in which WO you want to operate on.
Do you have SharePoint? If so, Access 2013 can deliver forms as web pages very easily, so you might not even have to manage a front-end Access file.
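For illustration only, the per-workbook code could be as small as this (the add-in file name and macro name are assumptions):

' Pass the WO number the user is currently viewing to a form defined in the add-in.
Sub ShowWOForm()
    Dim woNumber As String
    woNumber = ActiveCell.Value                              ' assumes the cursor is on the WO number
    Application.Run "WOToolsAddin.xlam!OpenWOForm", woNumber ' macro held in the add-in
End Sub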
Happier now?
For those that may stumble on this post: what we ended up doing was to use VBA to store comments in a separate SQL database. Users double-click the cell with the WO number to get an input form prompting for comments, with options to add new, append, or remove existing entries. Entries are passed to the SQL database and also to columns in the worksheet so users can see all the entries. A time stamp and network ID record when and by whom each comment was provided. Existing comments are fetched when users re-open the file. Works great.
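For anyone wanting a starting point, here is a rough VBA/ADO sketch of the insert step; the connection string, table, and column names are assumptions, not the actual schema used above:

' Store one comment row in the SQL database (late-bound ADO, so no reference is needed).
Sub SaveComment(ByVal woNumber As String, ByVal commentText As String)
    Dim cn As Object, cmd As Object
    Set cn = CreateObject("ADODB.Connection")
    cn.Open "Provider=SQLOLEDB;Data Source=MyServer;Initial Catalog=WOComments;Integrated Security=SSPI;"

    ' A parameterized command avoids quoting problems in free-text comments.
    Set cmd = CreateObject("ADODB.Command")
    Set cmd.ActiveConnection = cn
    cmd.CommandText = "INSERT INTO Comments (WONumber, NetworkID, EnteredAt, CommentText) VALUES (?, ?, ?, ?)"
    cmd.Parameters.Append cmd.CreateParameter(, 200, 1, 50, woNumber)             ' 200 = adVarChar, 1 = adParamInput
    cmd.Parameters.Append cmd.CreateParameter(, 200, 1, 50, Environ$("USERNAME")) ' network ID
    cmd.Parameters.Append cmd.CreateParameter(, 135, 1, , Now)                    ' 135 = adDBTimeStamp
    cmd.Parameters.Append cmd.CreateParameter(, 200, 1, 4000, commentText)
    cmd.Execute

    cn.Close
End Sub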
To remove the stale-object issue in my script (i.e. when we run the test script for multiple inputs, it fails on the second iteration because the object is not cleared at the end of each run), I have added the AlwaysSearch configuration in the designer file. After this, my script runs successfully for multiple inputs, but if I need to add new objects to the same designer file, the designer file is regenerated and the AlwaysSearch configuration changes are lost.
Is there any way to make the AlwaysSearch configuration persist even when the designer file is regenerated?
When you generate a UI map there are actually two files that come with it. Firstly, as you've discovered, there's a generated file with all the ugly code produced by the Coded UI Test Builder. Of course, any change made outside the code (through the builder) will regenerate that file. The second file is a partial class that accompanies the generated designer class. This file does NOT get regenerated but, as a partial class, shares all the same object references and properties as the designer file (it just looks empty). You can reference the control you want to configure here, and your change will not be regenerated.
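As a rough sketch, the non-generated partial file might contain something like this; UIMap, MyWindow, and MyStaleControl are placeholders for your own generated names:

using Microsoft.VisualStudio.TestTools.UITesting;

public partial class UIMap
{
    // Call this once (e.g. from a [TestInitialize] method) before the control is used;
    // because this file is not regenerated, the setting survives UI map changes.
    public void ApplyAlwaysSearch()
    {
        this.MyWindow.MyStaleControl.SearchConfigurations.Add(SearchConfiguration.AlwaysSearch);
    }
}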
The other alternative to this, albeit probably not a good idea, is to put
Playback.PlaybackSettings.AlwaysSearchControls = true;
inside your test method, class initialize, or test initialize. This will force the test(s) to always search for each and every control. As you might imagine, though, this can have a significant performance impact when you're dealing with large UI maps or particularly long test methods.
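For example, a usage sketch with the setting in a test initialize method (class and method names are placeholders):

using Microsoft.VisualStudio.TestTools.UITesting;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[CodedUITest]
public class MyCodedUITests
{
    [TestInitialize]
    public void MyTestInitialize()
    {
        // Force every control lookup to search again instead of reusing the cached object.
        Playback.PlaybackSettings.AlwaysSearchControls = true;
    }
}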
You might also set the control object's search configuration to always search. Keep in mind that this will trigger searching for this control and all of its children, so I would not advise putting it on a parent with several children, such as the document.
aControl.SearchConfigurations.Add(SearchConfiguration.AlwaysSearch);
I have to programmatically move (archive) a document from a document library of one site collection to a document library of another site collection in SharePoint 2010, when a specific value is set for a column in the document library.
Would it be possible to write code for this scenario in an event receiver? Is there any other way?
If anybody has any relevant piece of code or links, please share.
Thanks in advance!
You could perhaps do a copy operation, then delete the original file.
Have a look at the following link, which discusses copying a file from one site to another:
http://msdn.microsoft.com/en-us/library/microsoft.sharepoint.spfilecollection.aspx
The example uses one site collection. However, if you convert the source document to a byte array, you can always instantiate the target site collection and add the binary data to a document library within that site collection.
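A minimal sketch of that byte-array approach inside an ItemUpdated event receiver follows; the URLs, library name, and the trigger column/value are assumptions:

using Microsoft.SharePoint;

public class ArchiveReceiver : SPItemEventReceiver
{
    public override void ItemUpdated(SPItemEventProperties properties)
    {
        SPListItem item = properties.ListItem;

        // Hypothetical trigger: only archive when the Status column is set to "Archive".
        if ((string)item["Status"] != "Archive")
            return;

        // Read the source file into memory so it can cross the site-collection boundary.
        byte[] fileBytes = item.File.OpenBinary();

        // Open the target site collection and add the file to its archive library.
        using (SPSite targetSite = new SPSite("http://server/sites/archive"))
        using (SPWeb targetWeb = targetSite.OpenWeb())
        {
            SPFolder targetLibrary = targetWeb.GetFolder("ArchiveDocuments");
            targetLibrary.Files.Add(item.File.Name, fileBytes, true);
        }

        // Deleting the original here may or may not be safe inside the receiver; see the note below.
        item.File.Delete();
    }
}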
Certainly the copy operation should work within the event receiver. However, I'm not certain what will happen if you try to delete the file within the receiver (there may be concurrency issues). If the delete does not work, consider firing a one-time timer job to delete the file (which would occur in a different process).
You can try the SPExport class of SharePoint; as per the article "Copy or Move SharePoint items", it looks like a few of the operations that we do in the SharePoint UI use this API internally to achieve the task. Also, this approach depends on whether you are trying to do it one time or it is going to be a repetitive process.