SPPersistedObject and List&lt;T&gt; - SharePoint

I want SharePoint to "persist" a list of objects.
I wrote a class SMSAlert which inherits from SPPersistedObject:
public class SMSAlert : SPPersistedObject
{
    [Persisted]
    private DateTime _scheduledTime;
    [Persisted]
    private Guid _listId;
    [Persisted]
    private Guid _siteID;
}
Then I wrote a class which inherits from SPJobDefinition and adds a List of my previous objects:
public sealed class MyCustomJob : SPJobDefinition
{
    [Persisted]
    private List<SMSAlert> _SMSAlerts;
}
The problem is: when I call the Update method of my MyCustomJob,
myCustomJob.Update();
it throws an exception.
Message:
An object in the SharePoint administrative framework depends on other objects which do not exist. Ensure that all of the objects dependencies are created and retry this operation.
Stack trace:
at Microsoft.SharePoint.Administration.SPConfigurationDatabase.StoreObject(SPPersistedObject obj, Boolean storeClassIfNecessary, Boolean ensure)
at Microsoft.SharePoint.Administration.SPConfigurationDatabase.PutObject(SPPersistedObject obj, Boolean ensure)
at Microsoft.SharePoint.Administration.SPPersistedObject.Update()
at Microsoft.SharePoint.Administration.SPJobDefinition.Update()
at Sigi.Common.AlertBySMS.SmsAlertHandler.ScheduleJob(SPWeb web, SPAlertHandlerParams ahp)
Inner exception:
An object in the SharePoint administrative framework depends on other objects which do not exist.
The INSERT statement conflicted with the FOREIGN KEY constraint "FK_Dependencies1_Objects". The conflict occurred in database "SharePoint_Config", table "dbo.Objects", column 'Id'. The statement has been terminated.
Can anyone help me with that?

Ensure that your class is marked with a unique GUID, using [System.Runtime.InteropServices.Guid("GUID")], and ensure that the persisted object's class has a default constructor. Hope this helps.
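For example, a sketch against the SMSAlert class from the question (the GUID is a placeholder):
using System.Runtime.InteropServices;
using Microsoft.SharePoint.Administration;

[Guid("0C7E2A19-4F6B-4D2C-8E91-6A5D3B8C1F24")]  // placeholder GUID, generate your own
public class SMSAlert : SPPersistedObject
{
    // [Persisted] fields as in the question ...

    public SMSAlert() { }  // default constructor so SharePoint can rehydrate the object
}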

Both of the above suggestions are very important: adding a Guid attribute and ensuring you have a default constructor. Make sure you have these not only for your persisted SMSAlert object but for your SPJobDefinition as well.
Additionally, if you create collections of SPPersistedObject you have to ensure that each object in the collection is also updated. A better alternative is to make SMSAlert an SPAutoSerializingObject; collections of SPAutoSerializingObject, as the name implies, are serialized automatically (sketched below).
For more information on persisted objects see this extremely useful post:
http://www.chaholl.com/archive/2011/01/30/the-skinny-on-sppersistedobject-and-the-hierarchical-object-store-in.aspx
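Putting both answers together, here is a minimal sketch of what that could look like; the GUIDs are placeholders to replace with your own, and the non-default job constructor follows the usual SPJobDefinition pattern:
using System;
using System.Collections.Generic;
using System.Runtime.InteropServices;
using Microsoft.SharePoint.Administration;

// SMSAlert now derives from SPAutoSerializingObject, so each alert is
// serialized inline with the job instead of needing its own entry in
// the configuration database.
[Guid("1B2F0D44-5A9E-4C63-9D8A-0F3C2E7B6A10")]  // placeholder GUID
public class SMSAlert : SPAutoSerializingObject
{
    [Persisted]
    private DateTime _scheduledTime;
    [Persisted]
    private Guid _listId;
    [Persisted]
    private Guid _siteId;

    public SMSAlert() { }  // default constructor required for deserialization
}

[Guid("7C4E9F21-3D85-4B07-A6E2-58D1C0FB9E33")]  // placeholder GUID
public sealed class MyCustomJob : SPJobDefinition
{
    [Persisted]
    private List<SMSAlert> _SMSAlerts = new List<SMSAlert>();

    public MyCustomJob() : base() { }  // default constructor required for deserialization

    public MyCustomJob(string name, SPWebApplication webApp)
        : base(name, webApp, null, SPJobLockType.Job) { }
}
Because the alerts are serialized inside the job itself, myCustomJob.Update() persists the whole list in one call and there are no separate child objects for the configuration database to track, which should make the FK_Dependencies1_Objects conflict go away.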

Did you specify the default constructor for SMSAlert?

Related

How to use the strategy pattern with managed objects

I process messages from a queue. I use data from the incoming message, for example its origin and type, to determine which class should process it. I use the combination of origin and type to look up an FQCN and use reflection to instantiate an object to process the message. At the moment these processing objects are all simple POJOs that implement a common interface, hence I am using a strategy pattern.
The problem I am having is that all my external resources (mostly databases accessed via JPA) are injected (@Inject), and when I create the processing object as described above all these injected objects are null. The only way I know to populate these injected resources is to make each implementation of the interface a managed bean by adding @Stateless. This alone does not solve the problem, because the injected members are only populated if the class implementing the interface is itself injected (i.e. container managed) as opposed to being created by me.
Here is a made up example (sensitive details changed)
public interface MessageProcessor
{
    public void processMessage(String xml);
}
@Stateless
public class VisaCreateClient implements MessageProcessor
{
    @Inject private DAL db;
    …
}
public class MasterCardCreateClient implements MessageProcessor …
In the database there is an entry "visa.createclient" = "fqcn.VisaCreateClient", so if the message origin is "Visa" and the type is "Create Client" I can look up the appropriate processing class. If I use reflection to create VisaCreateClient, the db variable is always null. Even if I add @Stateless and use reflection, the db variable remains null. It's only when I inject VisaCreateClient that the db variable gets populated. Like so:
@Stateless
public class QueueReader
{
    @Inject VisaCreateClient visaCreateClient;
    @Inject MasterCardCreateClient masterCardCreateClient;
    @Inject … // many more times
    private Map<String, MessageProcessor> processors...
    private void init()
    {
        processors.put("visa.createclient", visaCreateClient);
        processors.put("mastercard.createclient", masterCardCreateClient);
        … // many more times
    }
}
Now I have dozens of message processors and if I have to inject each implementation then register it in the map I'll end up with dozens of injections. Also, should I add more processors I have to modify the QueueReader class to add the new injections and restart the server; with my old code I merely had to add an entry into the database and deploy the new processor on the class path - didn't even have to restart the server!
I have thought of two ways to resolve this:
Add an init(DAL db, OtherResource or, ...) method to the interface that gets called right after the message processor is created with reflection, passing in the required resources. The resources themselves are injected into the QueueReader.
Add an argument to processMessage(String xml, Context context), where Context is just a map of resources that were injected into the QueueReader.
But does this approach mean that I will be using the same instance of the DAL object for every message processor? I believe it would and as long as there is no state involved I believe it is OK - any and all transactions will be started outside of the DAL class.
So my question is will my approach work? What are the risks of doing it that way? Is there a better way to use a strategy pattern to dynamically select an implementation where the implementation needs access to container managed resources?
Thanks for your time.
In a similar problem statement I used an extension to the processor interface so each processor can decide which type of message it handles. Then you can inject all variants of the handler via Instance<MessageProcessor> and simply iterate over them:
public interface MessageProcessor
{
    public boolean canHandle(String xml);
    public void processMessage(String xml);
}
And in your QueueReader:
@Inject
private Instance<MessageProcessor> allProcessors;
public void handleMessage(String xml) {
    MessageProcessor processor = StreamSupport.stream(allProcessors.spliterator(), false)
        .filter(proc -> proc.canHandle(xml))
        .findFirst()
        .orElseThrow(...);
    processor.processMessage(xml);
}
This does not let you add a processor to a running server, but adding a new one only requires implementing the interface and deploying it; QueueReader itself never changes.

How to access Obsolete variables in graph extension

I have a field declared as obsolete in another project. How can I access this field in my extended graph?
[Obsolete("Use AMShiftMst.shiftType", true)]
public abstract class shftDiff : IBqlField, IBqlOperand
{
    protected shftDiff();
}
You cannot use the class shftDiff because the ObsoleteAttribute is applied with its error argument set to true, which turns any usage into a compile-time error. The message in the obsolete attribute mentions the new object to use.
See ObsoleteAttribute(String, Boolean) here:
ObsoleteAttribute Class
However, as far as the product is concerned, it looks like the message should point to AMShiftMst.shftDiff rather than AMShiftMst.shiftType, since shiftType doesn't exist.
Using AMShiftMst.shftDiff should solve your issue.
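For illustration, here is a minimal stand-alone sketch (with made-up member names) showing how the second argument of ObsoleteAttribute changes what the compiler allows:
using System;

public static class LegacyFields
{
    // error: true -> any reference is rejected with compiler error CS0619
    [Obsolete("Use NewField instead", true)]
    public static int OldField;

    // error: false -> references only raise warning CS0618
    [Obsolete("Use NewField instead", false)]
    public static int SoftDeprecatedField;

    public static int NewField;
}

public class Consumer
{
    public int Demo()
    {
        // int a = LegacyFields.OldField;          // does not compile (CS0619)
        int b = LegacyFields.SoftDeprecatedField;  // compiles, with warning CS0618
        return b + LegacyFields.NewField;          // the replacement the message points to
    }
}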

spring-ldap and #attributes annotation with spring-ldap 2.x ODM interface

There seem to be some things missing in the Spring-LDAP ODM annotations. This is a question by way of a feature request; if there is a better way to contribute such requests, please say so.
I'd like to mark an @Attribute as read-only, so it will populate the bean from LDAP for reference but not be persisted back to LDAP. I'd suggest adding a readonly element to @Attribute, defaulting to false for the usual case. The default attribute list of * misses all the operational attributes, some of which are very useful, and also transfers more data than is required, slowing down the LDAP query with attributes which will never be used.
An example of where this would be very useful is literally read-only attributes such as entryUUID, etag, etc., which you cannot map at all if you wish to persist only some fields back to LDAP, as the bean then fails to persist with an exception when you save it. It would also be useful for general fields which you want to structurally prevent the user from ever updating.
You can get around this by not annotating read-only fields and then manually populating them with a separate call. Very messy, and it kills the query speed.
Also, on a related topic, query() could have a default list of attributes that you have already annotated in your classes, something like:
public static String[] getBeanAttributes(Class<?> beanClass) {
    ArrayList<String> attrsObj = new ArrayList<>();
    for (Field field : beanClass.getDeclaredFields()) {
        if (field.isAnnotationPresent(Attribute.class)) {
            Attribute attr = field.getAnnotation(Attribute.class);
            attrsObj.add(attr.name());
        }
    }
    String[] attrs = attrsObj.toArray(new String[attrsObj.size()]);
    return attrs;
}
The above just returns a simple String[] of your declared attributes to pass to query.attributes(). Now, I realize that as a static member query() is built before the bean class is known, but at least there could be a helper function like the above, or an attributes() signature that takes a bean Class as an argument.
I created LDAP-312 on Jira. Thanks.

Adding attributes to parent class fields and properties

To illustrate my use case, I have a parent class that is purposely database agnostic (let's say I can't change its code for some reason, because the class comes from a commercial assembly or the .NET Framework, or is auto-generated by Entity Framework):
public class Father
{
    public string Field1;
    public string Field2;
}
Now I'd like to store an object derived from it in MongoDB (again, it's only for the example; there are a lot of other use cases, and my question has nothing to do with MongoDB):
public class Child : Father
{
    public ObjectId Id;
    public DateTime DateCreation;
}
But I'd like to add attributes to some members of the Father class, like [BsonIgnoreIfNull], without overriding them (they are not marked as virtual) and without having to fully reimplement Father in my Child class.
What would be the cleanest way to do this?
Thanks!

App Fabric: GET misses an enum property

I have a class marked as CollectionDataContract which has an enum member. Putting an object of this class into AppFabric succeeds, but when I get it back, the enum member is not deserialized. I am not sure whether the enum was already dropped during serialization itself.
Please do help.
If you need more information let me know.
Thanks.
[CollectionDataContract]
public partial class RuleConditionList : List<IRuleCondition>, IRuleCondition
{
    public LogicalOperator Operator;
}
where LogicalOperator is an enum
I think there is a problem when serializing/deserializing your object. AppFabric uses the NetDataContractSerializer class for serialization before storing the items in the cache.
You can use the NetDataContractSerializer on any type that is marked with the DataContractAttribute or SerializableAttribute, or on types that implement the ISerializable interface.
So depending on your object, there is probably something wrong, like a private type, a private field, a missing attribute, ...
Edit
You should add [DataMember] to your field:
[DataMember]
public LogicalOperator Operator;
Data members in a class marked with CollectionDataContract are not serialized by NetDataContractSerializer, which is the serialization mechanism AppFabric uses when storing data.
To make things work, we have two options:
Make a wrapper for RuleConditionList.
Instead of inheriting from List, hold the list as a property and change the attribute to DataContract (see the sketch below).
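A rough sketch of the second option (the Conditions property name is made up here): the type becomes a plain [DataContract] that holds the list as a member, so both the list and the enum go through NetDataContractSerializer.
using System.Collections.Generic;
using System.Runtime.Serialization;

[DataContract]
public class RuleConditionList : IRuleCondition
{
    // the list is now a member instead of the base class,
    // so it can be marked [DataMember] like any other field
    [DataMember]
    public List<IRuleCondition> Conditions { get; set; }

    [DataMember]
    public LogicalOperator Operator { get; set; }
}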
