I am currently storing a key-value pair in Office.context.document.settings using the following function:
Office.context.document.settings.set(name, value);
Once the key-value pair is stored, I am relaunching the add-in and trying to fetch the value using the following function:
Office.context.document.settings.get(name);
But the function is returning null instead of the proper value. Does the value stored in document settings persist across multiple sessions of an application or does it get refreshed once we close the application?
Your question doesn't have a lot of detail, but there are two common errors when working with settings:
Failure to Load Settings
Prior to reading a given setting, you need to populate the settings object. This is done using refreshAsync():
Office.context.document.settings.refreshAsync(function (asyncResult) {
    // The settings object has now been re-read from the document.
    var value = Office.context.document.settings.get(name);
});
Side-loaded Add-ins
When you side-load an add-in, Office generates a random ID and assigns it to your add-in. If you remove and re-side-load the add-in, it will generate a new ID. You'll also get two distinct IDs if you side-load the same add-in on two different machines.
This will affect how settings function since settings are keyed by the Add-in ID when they are stored or recalled from a document. For details on how this works (and how to get around it), see Issue with Office.context.document.settings.get.
The setting isn't being saved because you did not call saveAsync. The set method only saves the setting in memory, not to the file. To save to the file you must first call set, then call:
Office.context.document.settings.saveAsync(callback);
Then when you reload the add-in you will be able to retrieve the settings with get. Here's the documentation page for the saveAsync method: https://dev.office.com/reference/add-ins/shared/settings.saveasync
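For illustration, here is a minimal sketch of the whole pattern; the setting name and value are just placeholders:

Office.context.document.settings.set('mySetting', 'some value');

Office.context.document.settings.saveAsync(function (asyncResult) {
    if (asyncResult.status === Office.AsyncResultStatus.Failed) {
        console.log('Settings save failed: ' + asyncResult.error.message);
    } else {
        console.log('Settings saved to the document.');
    }
});

// In a later session, after the add-in has initialized:
var value = Office.context.document.settings.get('mySetting'); // 'some value', or null if never saved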
-Michael, PM for add-ins
Background: I'm using Python 3 (with Flask and Bootstrap) to create a website with fields to capture data. There is an API connection to a HighQ database that pulls data from a record and, once the form is submitted, puts the updated/new data back into the same record in the HighQ database (and also into an SQL database). This is all working successfully.
This is a tool that is being used internally within the company I work for, and hosted on the intranet, so I haven't set up user logins as I want it to be as quick and easy as possible for the team to use this form and update records.
Issue: When two or more instances of the form are being used at the same time (whether on one person's computer, or by two people testing on their own computers), the data submitted from one person's form will overwrite the destination record that the other person has called into their form. I'm struggling to find a solution that ensures each time the form is launched, it ring-fences the data so that this does not happen.
I've tried a number of things to resolve this (including lots of stackoverflow searches):
Originally I used global variables to pass data across different routes. I thought the issue was that everyone launching the form could access the same global variables, so I changed the code to remove any global variables. Instead, I take the content of each field on the site and save it as a local variable within the route that needs it, and any variables that cannot be obtained from a field I save to the 'session'.
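Roughly, the session-based approach I moved to looks like this (simplified sketch; the route, field, and helper names are made up):

from flask import Flask, render_template, request, session

app = Flask(__name__)
app.secret_key = "change-me"  # required for Flask's per-user session cookie

@app.route("/record/<record_id>")
def load_record(record_id):
    # Per-user state goes into the session instead of module-level globals.
    session["record_id"] = record_id
    record = fetch_record_from_highq(record_id)  # made-up helper around the HighQ API
    return render_template("form.html", record=record)

@app.route("/submit", methods=["POST"])
def submit():
    record_id = session["record_id"]          # unique to this user's session
    updates = {"name": request.form["name"]}  # values come straight from the submitted form
    push_record_to_highq(record_id, updates)  # made-up helper around the HighQ API
    return "Saved"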
The problem is still persisting and I'm now at a loss on what to try next. Any help on this would be much appreciated.
I am using the test plugin for VS 2012 (although I have just installed 2013), and need to know:
Is it possible to have a parameter pass a different value from a selected list while load testing?
I have used the sample load test located here: http://www.visualstudio.com/get-started/load-test-your-app-vs and created a new web test that meets my needs as below.
I have a simple journey recorded that is an email registration web page. The journey is essentially completing name & address, email, conf email, password, conf password. On submission of the form, a verification email is sent.
I need to check that this process can handle around 3000 users. The email to actually send the verification has been hardcoded for test purposes, but I need a unique email to submit the form. I would essentially like to run 3000 test cases through, and just change the email address each time.
What is the best way to do this?
The simple answer is do a web search for data driving (or data driven) Visual Studio web performance tests. You should find many articles and tutorials.
In more detail:
Outline of how to data drive a test
Firstly, Visual Studio distinguishes different types of test. A Load Test is a way of running individual test cases many times, as if by many simultaneous users, gathering data about the test executions and producing a report. The test cases that a load test can execute include Web Performance Tests and Coded UI Tests; both of these can be data driven.
Data driving a Web Performance Test requires a data source. The data can come from a CSV file, an XML file, a spreadsheet, a database, or TFS. I will describe using CSV.
Create a CSV file, containing something similar to the following. Note that the top line of field names is required and those names are used within the test.
Name,Email,Telephone
Fred,fred@example.com,0123 456789
George,george@example.com,0123 456790
Harry,harry@example.com,0123 456791
See also CodedUI test does not read data from CSV input file for some notes on CSV file creation.
Open the test project in Visual Studio and open the .webtest file for the test. Use the context (right-click) menu of the top node of the test, i.e. the test's name (or use the corresponding icon), and select "Add data source ...". Follow the prompts to add the CSV file into the project.
Within the Web Performance Test, expand the request to show the form parameters, query string, or whatever is to use the data. View the properties panel of the relevant field and select the appropriate property; in many cases it is the Value property. Click the little triangle for choosing a value for the property. The popup should show the data source; expand the items shown and select the required field. After selecting the field, the property will show a value such as {{DataSource1.FileName#csv.Email}}. The doubled curly braces ({{ and }}) indicate the use of a context parameter. All the used data source fields are available as context parameters, and all of the data source fields can be made available by altering the Select Columns property of the data source file. A data source field can also be used as part of a property value, for example
SomeText{{DataSource1.FileName#csv.Email}}AndMoreText
Data source access methods
The data from the data source can be read and used in four ways. The default is Sequential. Other orders are selected using Solution Explorer to access the properties of the file (e.g. FileName#csv). The Access Method property can be set to one of:
Sequential: data is read sequentially through the file. After the last line of the file is read, the first line of the file will be the next line to be read. Thus each line may be read more than once.
Random: data is read in random order.
Unique: data is read sequentially through the file. After the end of the file is read, the test will not be executed again. Thus each line can only be read once.
Do not move cursor automatically: intended for more complex tests where the cursor is moved via calls from plugins.
A web test may use more than one data source file. These files may have different access methods. For example one file containing login names and passwords could be accessed Sequentially and another file with other data could be accessed Randomly. This would allow each login to try many different sets of the other data.
Data sources and loops
Web performance tests may contain loops. The properties of a loop include Advance data cursors. This allows, for example, a data source file to contain items to be found and added to a shopping basket such that each loop iteration adds a new item.
I have a SharePoint document library I am working on. It has a list of document sets. Each document set has a few fields that are marked as "Shared" so that they can be inherited by the documents inside.
When I upload a document inside a document set, a form opens and all the fields on the form are pre-filled with the shared values of the corresponding columns. But when I use "create document from template", it opens the template in the corresponding Office application, and the document property fields are empty and not read-only, which is against the requirements of this project. I require them to be synced and filled exactly as when a document is uploaded.
There is one thing though: the user can fill in any value they want in those fields, and the document will still be saved with a synced copy from the parent in the library, discarding what the user filled in. That is good, but why don't those values show up in the document in the first place?
If anyone has experience handling this, please help. I have searched a lot on the internet, but either my keywords are wrong or no one has had this problem before.
SharePoint version: 2010 Server
Office version: 2010 Professional
It sounds like you need a simple event receiver that fires on ItemAdded. It would then go back up the tree to find the document set, capture which properties are marked as shared, and adjust the item that is being added to force the values.
Probably 8 lines of code.
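Something along these lines (a rough, untested sketch using the SharePoint 2010 server object model; adjust it to how your shared columns are defined):

using Microsoft.SharePoint;
using Microsoft.Office.DocumentManagement.DocumentSets;

public class CopySharedFieldsReceiver : SPItemEventReceiver
{
    public override void ItemAdded(SPItemEventProperties properties)
    {
        SPListItem item = properties.ListItem;
        if (item == null || item.File == null) return;

        // Walk back up to the parent document set (null if the file isn't inside one).
        DocumentSet docSet = DocumentSet.GetDocumentSet(item.File.ParentFolder);
        if (docSet == null) return;

        // Ask the document set template which columns are marked as shared.
        DocumentSetTemplate template =
            DocumentSetTemplate.GetDocumentSetTemplate(docSet.Item.ContentType);

        EventFiringEnabled = false;
        foreach (SPField field in template.SharedFields)
        {
            item[field.Id] = docSet.Item[field.Id];   // force the shared value onto the new document
        }
        item.SystemUpdate(false);
        EventFiringEnabled = true;
    }
}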
I have an NSTableView which is populated via a CoreData-backed NSArrayController. Users are able to edit any field they choose within the NSTableView. When they select the rows that they have modified and press a button, the data is sent to a third-party webservice. Provided the webservice accepts the updated values, I want to commit those values to my persistent store. If, however, the webservice returns an error (or simply fails to return), I want the edited fields to revert to their original values.
To complicate matters, I have a number of other editable controls, backed by Core Data, which do not need this revert behaviour.
I believe the solution to this problem revolves around creating a secondary managed object context, which I would use only for values edited within that particular NSTableView. But I'm confused as to how the two MOCs would interact with each other.
What's the best solution to this problem?
The easiest solution would be to implement Core Data's undo functionality. That way you make the changes in Core Data, but if the server returns an error, you just roll back the changes. See the Core Data docs for details.
I'm currently writing an application that moves Notes documents between databases based on the amount of days that have elapsed from the creation/modified/last accessed dates. I would just like to get ideas on a simple and convenient way to create documents with specific dates, without having to change the time on the Domino server, so that I could test out my application.
The best way I found so far was to create a local replica and change the system clock to the date I want. Unfortunately there are problems associated with this method. It does not work on the modified date - I'm not sure how it is getting the modified date information when the location is set to Island (Disconnected) - and it also changes the modified and last accessed dates when the documents are replicated to the server replica.
Someone suggested trying to create a DXL of the document, modify the date time in the DXL file, then import it back into the database as a Notes document; but that does not work. It just takes on the date-time that it was created.
Can anyone offer any other suggestions?
You can set the created date for a document by setting the UNID (which is fundamentally a struct of timestamps, although the actual implementation has changed in recent versions). Accessed and modified times, though, would be unsettable from within the Notes/Domino environment, since the changes you make would be overwritten by the process of saving the changes. If you have a flair for adventure and a need to run with scissors, you could make the changes in the database file itself either programmatically from an external application, or manually with a hex editor. (Editing the binary will work -- folks have been using hex editors to clear the "hide design" flag safely for years. Keep in mind that signed docs will blow up badly, and that you need to ensure that local encryption is off for the database file.)
There's actually a very simple way to spoof the creation date/time: just add a field called $Created with whatever date/time you want. This is alluded to in the Notes C API header file nsfdata.h:
Time/dates associated with notes:
OID.Note Can be Timedate when the note was created
(but not guaranteed to be - look for $CREATED
item first for note creation time)
Obtained by NSFNoteGetInfo(_NOTE_OID) or
OID in SEARCH_MATCH.
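For example, from a LotusScript test agent (a rough sketch; the form name and date are arbitrary):

Dim session As New NotesSession
Dim db As NotesDatabase
Dim doc As NotesDocument
' Arbitrary creation date/time to spoof
Dim created As New NotesDateTime("01/15/2019 09:30:00 AM")

Set db = session.CurrentDatabase
Set doc = db.CreateDocument()
doc.Form = "TestForm"
Call doc.ReplaceItemValue("$Created", created)
Call doc.Save(True, False)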
Unfortunately, there's no analogous technique for spoofing the mod or access dates. At least none that's ever been documented, as far as I know.
I imagine given how dependent Lotus Notes is on timestamps (for replication, mainly), there isn't an API call that allows you to change the modified, created, or last access dates of a note. (More on the internals of Lotus Notes can be found here.)
I dug around the Notes C API documentation, and found only one mention on how to get/set information in the note's header, including the modified date. However, the documentation states that when you try to update that note (i.e. write it to disk), the last modified date will be overwritten with the date/time it is written to disk.
As an alternative, I would suggest creating your own set of date items within the documents that only you control, for example MyCreated, MyModified, and MyAccessed, and reference those in your code that moves documents based on dates. You would then be able to change these dates as easily as changing any other document item (via agents, forms, etc.)
For MyCreated, create a hidden calculated form field with the formula @Created or @Now. Set the type to computed when composed.
For MyModified, create a hidden calculated form field with the formula @Now, and set the type to computed.
MyAccessed gets a bit tricky. If you can do without it, I suggest you work with just MyCreated and MyModified. If you need it, you should be able to manage it by setting a field value within the QueryOpen or PostOpen events. Problems occur if your users have only read access to a document - the code to update the MyAccessed field won't be able to store that value.
Hope this helps!