RestKit - delete an object based on incoming attribute - core-data

I am using basic RestKit mapping with JSON I am receiving via a REST call.
RKManagedObjectMapping* contactMapping = [RKManagedObjectMapping mappingForClass:[Contact class]];
[contactMapping mapAttributes:@"id", @"firstName", @"middleName", @"lastName", @"title", @"email", nil];
contactMapping.primaryKeyAttribute = @"id";
[contactMapping mapRelationship:@"addresses" withMapping:addressMapping];
[contactMapping mapRelationship:@"phoneNumbers" withMapping:phoneMapping];
// Register our mappings with the provider
[objectManager.mappingProvider setMapping:contactMapping forKeyPath:@"contacts"];
I have a new field, status, which I now need to check: if it is "Delete" (or similar), that contact should be deleted from the database if it exists. However, I'm at a loss as to where to put this logic. Do I need to create my own extended object mapping class to handle this? Are there any examples of this anywhere? My Google-fu is weak on this one.

Related

How to add collections in transformations when writing(creating) a Document in MarkLogic

I wrote a transformation in XQuery which unquotes an XML string and inserts an element with its content. This works fine.
I also need to create a collection that depends on the root element of this inserted element. I can't do this on new documents, as xdmp:document-add-collections() is not working. How do I add the collection to new documents in transformations?
Here is my server-side XQuery code:
xquery version "1.0-ml";
module namespace transform = "http://marklogic.com/rest-api/transform/smtextdocuments";
import module namespace mem = "http://xqdev.com/in-mem-update" at '/MarkLogic/appservices/utils/in-mem-update.xqy';
declare function transform:transform(
$context as map:map,
$params as map:map,
$content as document-node()
) as document-node()
{
let $uri := base-uri($content)
let $doccont := $content/smtextdocuments/documentcontent
let $newcont := xdmp:unquote($doccont)
let $contname := node-name($newcont/*)
let $result := if ( exists($content/smtextdocuments/content))
then mem:node-replace($content/smtextdocuments/content, <content>11{$newcont}</content>)
else mem:node-insert-after($doccont, <content>{$newcont}</content>)
let $log := xdmp:log($content)
return (
$result,
xdmp:document-add-collections($uri, fn:string($contname)),
xdmp:document-remove-collections($uri, "raw")
)
};
The script is invoked by the Java API (4.0.4) create method via the ServerTransform transform parameter. As per the documentation, the transformation script runs before the document is stored in the database.
It's a new document; I need to transform the content and then create the collection.
I can see the document after the create and the content is available; just the collection is missing. I could try the xdmp:document-insert method, but is it correct to write the document while the create is still running?
The transform mechanism of the Java API / REST API takes responsibility for the document write. At present, there's no way for the transform to supply collections to the writer. That would be a reasonable request for enhancement.
The transform shouldn't attempt to write the document, because the writer would also attempt to write the same document.
One alternative would be to transform the document in Java before writing it and specify the collection as part of the write request.
Another alternative would be to rewrite the transform as a resource service extension, implement the write within the resource service extension, and modify the Java client to send the document to the resource service extension.
Depending on the model, a final alternative might be to use a range index on an element within the document to collect documents into sets instead of using a collection on the document.
Hoping that helps,
What do you mean by "new documents"? Is the document already inserted into the MarkLogic database at the time you are adjusting the collections of it? If not, you may want to modify your return to ($result, xdmp:document-insert($uri, $result, xdmp:default-permissions(), fn:string($contname)) ) for that case.
Otherwise, can you edit your question to share the error or problem more specifically you are facing?
It is a pity that REST transforms do not allow this, the way MLCP transforms do. Until that changes, you have the options outlined by ehennum, or you can consider deferring the adding of collections to a pre- or post-commit trigger. It adds some overhead, but it sometimes makes perfect sense to do something like that in a trigger, since that ensures it is always enforced, and a trigger is a good place for content validation, audit logging, and things like that as well.
HTH!

How to load document out of database instead of memory

Using Raven client and server #30155. I'm basically doing the following in a controller:
public ActionResult Update(string id, EditModel model)
{
var store = provider.StartTransaction(false);
var document = store.Load<T>(id);
model.UpdateEntity(document); // overwrite document property values with those of edit model.
document.Update(store); // tell document to update itself if it passes some conflict checking
}
Then in document.Update, I try do this:
var old = store.Load<T>(this.Id);
if (old.Date != this.Date)
{
// Resolve conflicts that occur by moving document period
}
store.Update(this);
Now I run into the problem that old is loaded from memory instead of from the database, so it already contains the updated values. Thus, it never enters the conflict check.
I tried working around the problem by changing the Controller.Update method into:
public ActionResult Update(string id, EditModel model)
{
var store = provider.StartTransaction(false);
var document = store.Load<T>(id);
store.Dispose();
model.UpdateEntity(document); // overwrite document property values with those of edit model.
store = provider.StartTransaction(false);
document.Update(store); // tell document to update itself if it passes some conflict checking
}
This results in me getting a Raven.Client.Exceptions.NonUniqueObjectException with the text: Attempted to associate a different object with id
Now, the questions:
Why would Raven care if I try and associate a new object with the id as long as the new object carries the proper e-tag and type?
Is it possible to load a document in its database state (overriding default behavior to fetch document from memory if it exists there)?
What is a good solution to getting the document.Update() to work (preferably without having to pass the old object along)?
Why would Raven care if I try and associate a new object with the id as long as the new object carries the proper e-tag and type?
RavenDB relies on being able to serve documents from memory (which is faster). By checking whether an object is already tracked for the same id, hard-to-debug errors are prevented.
EDIT: See the comment from Rayen below. If you enable concurrency checking / provide an etag in the Store call, you can bypass the error.
Is it possible to load a document in its database state (overriding default behavior to fetch document from memory if it exists there)?
Apparently not.
What is a good solution to getting the document.Update() to work (preferably without having to pass the old object along)?
I went with refactoring the document.Update method to also have an optional parameter to receive the old date period, since #1 and #2 don't seem possible.
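A rough sketch of that refactoring (the IStore wrapper and the parameter names here are hypothetical, only to show the shape):
// Hypothetical sketch: the caller passes along the date it originally loaded,
// so Update no longer has to re-Load the document (which would come back from
// memory with the new values already applied).
public void Update(IStore store, DateTime? oldDate = null)
{
    if (oldDate.HasValue && oldDate.Value != this.Date)
    {
        // Resolve conflicts that occur by moving the document period
    }

    store.Update(this);
}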
RavenDB supports optimistic concurrency out of the box. The only thing you need to do is to call it.
session.Advanced.UseOptimisticConcurrency = true;
See:
http://ravendb.net/docs/article-page/3.5/Csharp/client-api/session/configuration/how-to-enable-optimistic-concurrency
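A minimal sketch of how that fits a load/modify/save flow against the plain RavenDB session API (the ContactDocument type and the handling inside the catch are placeholders, not code from the question):
using (var session = documentStore.OpenSession())
{
    // Ask RavenDB to compare etags on SaveChanges instead of silently
    // overwriting a document that changed since it was loaded.
    session.Advanced.UseOptimisticConcurrency = true;

    var document = session.Load<ContactDocument>(id);
    document.Date = newDate;

    try
    {
        session.SaveChanges();
    }
    catch (Raven.Abstractions.Exceptions.ConcurrencyException)
    {
        // The document changed on the server in the meantime:
        // reload it and re-apply the edit, or surface the conflict.
    }
}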

Sharepoint Client Object Model: Usage of Load / Update / Delete methods

Can someone explain the difference between, or the reason for, the two methods ClientContext.Load and, e.g. for list items, ListItem.RefreshLoad()? Is there a difference?
Why does ClientContext have no equivalent .Update or .Delete methods?
And when do I have to call the ClientContext.ExecuteQuery method?
ListItem item = ...;
// 1. Is there a difference between ClientContext.Load(ListItem) and ListItem.RefreshLoad()?
clientContext.Load(item);
item.RefreshLoad();
// 2. Why aren't there methods like ClientContext.Update(...) or ClientContext.Delete(...)?
item.Update();
item.DeleteObject();
// 3. When is the ClientContext.ExecuteQuery needed (load / update / delete)?
clientContext.ExecuteQuery();
Thank you!
The main thing to realize is that the client object model is designed to be asynchronous from the get-go.
Think of your client context object as a vessel for sending instructions and receiving data. The .Load() method queues up instructions, such as .Load(item) queuing up the instructions to retrieve data about a given list item.
The .ExecuteQuery() and .ExecuteQueryAsync() methods send those queued instructions and retrieve the results from the server.
Those operations are different from the operations you can perform against actual SharePoint objects, such as lists and list items. Consider this example from Microsoft:
ListItemCreationInformation itemCreateInfo = new ListItemCreationInformation();
ListItem newListItem = targetList.AddItem(itemCreateInfo);
newListItem["Title"] = "New Announcement";
newListItem["Body"] = "Hello World!";
newListItem.Update();
clientContext.Load(newListItem);
clientContext.ExecuteQuery(); // only at this point is the item actually created
When you create a ListItem object in the client object model, all you're doing is creating an object in local memory; you haven't sent anything to the server yet to actually create an item in the list. The ListItem object is just a placeholder, and anything you do to it (such as creating it and setting its field values in the example above) is stored as instructions that need to be carried out.
When you load that object into a client context object (via clientContext.Load(newListItem)) you're just feeding those instructions to your client context. Once you run clientContext.ExecuteQuery(), those instructions are carried out and the placeholder object gets populated with any relevant data returned from the server.
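To tie this back to the update/delete questions: the same queue-then-execute pattern applies to existing items. A sketch (the list title and item id below are placeholders):
// Nothing is sent to the server until ExecuteQuery() runs.
List list = clientContext.Web.Lists.GetByTitle("Announcements");
ListItem item = list.GetItemById(1);

// Queue an update of the item...
item["Title"] = "Updated announcement";
item.Update();

// ...or queue its deletion instead:
// item.DeleteObject();

// One round trip carries out everything queued above.
clientContext.ExecuteQuery();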

CRM 2011 JavaScript How to access data stored in an entity passed from a lookup control?

As the question suggests, I need to find out how to access entity data that has been passed into a JavaScript function via a lookup.
JavaScript Code Follows:
// function to generate the correct Weighting Value when these parameters change
function TypeAffectedOrRegionAffected_OnChanged(ExecutionContext, Type, Region, Weighting, Potential) {
var type = Xrm.Page.data.entity.attributes.get(Type).getValue();
var region = Xrm.Page.data.entity.attributes.get(Region).getValue();
// if we have values for both fields
if (type != null && region != null) {
// create the weighting variable
var weighting = type[0].name.substring(4) + "-" + region;
// recreate the Weighting Value
Xrm.Page.data.entity.attributes.get(Weighting).setValue(weighting);
}
}
As you can see in the following line, using the name property I can access my Type entity's Type field.
// create the weighting variable
var weighting = type[0].name.substring(4) + "-" + region;
I am looking for a way now to access the values stored inside my type object. It has the following fields new_type, new_description, new_value and new_kind.
I guess I'm looking for something like this:
// use value of entity to assign to our form field
Xrm.Page.data.entity.attributes.get(Potential).setValue(type[0].getAttribute("new_value"));
Thanks in advance for any help.
Regards,
Comic
REST OData calls are definitely the way to go in this case. You already have the id, and you just need to retrieve some additional values. Here is a sample to get you started. The hardest part of working with OData, IMHO, is creating the request URLs. There are a couple of tools you can find on CodePlex, but my favorite is actually to use LINQPad. Just connect to your org's OData URL, and it'll retrieve all of your entities and allow you to write a LINQ statement that will generate the URL for you, which you can test right in the browser.
For your instance, it'll look something like this (it is case sensitive, so double check that if it doesn't work):
"OdataRestURL/TypeSet(guid'" + type[0].Id.replace(/{/gi, "").replace(/}/gi, "") + "'select=new_type,new_description,new_value,new_kind"
Replace OdataRestURL with whatever your odata rest endpoint is, and you should be all set.
Yes, Guido Preite is right. You need to retrieve the entity by the id that comes from the lookup, via REST, either synchronously or asynchronously, and then get the JSON object. However, to keep the returned object light, you can specify which fields should come back as part of the JSON. Then you can access whichever of those fields you want.

LINQ-to-NHibernate Filter IQueryable by Composite Field - "Could not resolve property" error

Quick background: I have a form that offers a handful of optional filters to users, and a search method on my service that accepts all those fields and attaches the necessary Where() conditions to the master IQueryable list.
One of those filters is a list of strings that must be compared against a combination of three different fields in the IQueryable. Here's the code throwing the "could not resolve property" error:
var searchResults = _transactionHeaders.Retrieve();
if (subgroups.Any())
searchResults = searchResults.Where(s => subgroups.Contains(s.CustomerType + s.RusNumber + s.GroupNumber));
return searchResults.ToList();
I've read a few posts that suggest an alias needs to be created for any properties not directly mapped in the NHibernate mapping. I'm not quite sure that is the solution to my problem.
Suggestions? Thanks for any help you can offer.
LINQ-to-NHibernate can't translate a .Contains call against a concatenation of mapped properties like that, because the combined value isn't a property it can resolve. You'll have to change your query so it's compatible with LINQ-to-NHibernate.
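One possible workaround, assuming the result set is small enough to finish this particular filter in memory (a sketch only; building an OR expression per subgroup on the server side would be the alternative if it isn't):
var searchResults = _transactionHeaders.Retrieve();

if (subgroups.Any())
{
    // Switch to LINQ to Objects for the composite comparison that
    // LINQ-to-NHibernate cannot translate into SQL.
    return searchResults
        .AsEnumerable()
        .Where(s => subgroups.Contains(s.CustomerType + s.RusNumber + s.GroupNumber))
        .ToList();
}

return searchResults.ToList();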
