Hazelcast MapStore based on map name

I have a Hazelcast instance that organizes data based on map names. The map names are dynamic, so I do not know what they will be until the Hazelcast instance is already started. I now want to store this data in a database via the MapStore functionality, but retain the organization I have set up with the map names. Looking at the MapStore functionality, I do not see any way to retrieve the map or map name that the object came from. It looks like all I get is the key-value pair in the MapStore implementation.
On a broader note, is there any way to get ANY information (not just the map name) about the key-value pair being stored? I would like to pass along some knowledge about how to store the data. I know that information when I call map.put(..), but I do not know how to get it to the MapStore call.

I just needed something similar and found that you can implement the com.hazelcast.core.MapStoreFactory interface, whose newMapStore method gives you the map name and the properties from the config. From there, 'new' your own MapStore implementation, passing it the map name.
public class MyMapStoreFactory implements MapStoreFactory {
    @Override
    public MapStore newMapStore(String mapName, Properties properties) {
        return new MyMapStoreImplementation(mapName, properties);
    }
}
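For completeness, here is a minimal sketch of what such a MapStore implementation might look like; the body below is an assumption, it simply keeps the map name so the store/load callbacks can route data to the right table:

import java.util.Collection;
import java.util.Map;
import java.util.Properties;
import com.hazelcast.core.MapStore;

// Hypothetical MapStore that uses the map name to decide where to persist entries.
public class MyMapStoreImplementation implements MapStore<String, Object> {
    private final String mapName;
    private final Properties properties;

    public MyMapStoreImplementation(String mapName, Properties properties) {
        this.mapName = mapName;
        this.properties = properties;
    }

    @Override
    public void store(String key, Object value) {
        // e.g. persist into a table derived from mapName
    }

    @Override
    public void storeAll(Map<String, Object> map) { /* batch variant of store */ }

    @Override
    public void delete(String key) { }

    @Override
    public void deleteAll(Collection<String> keys) { }

    @Override
    public Object load(String key) { return null; }

    @Override
    public Map<String, Object> loadAll(Collection<String> keys) { return null; }

    @Override
    public Iterable<String> loadAllKeys() { return null; }
}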
Then configure the factory class in the hazelcast.xml config like this:
<map name="MapX*">
    <map-store enabled="true" initial-mode="EAGER">
        <factory-class-name>MyMapStoreFactory</factory-class-name>
    </map-store>
</map>
Notice the wildcard in the map name, and that the <class-name> element must not appear once you set <factory-class-name>.
Tested this with Hazelcast 3.6.1 and it works fine.

As far as I understand, there is no out-of-the-box support for this in Hazelcast. Here are a couple of workarounds I can think of:
Encapsulate the extra information (map name, how to store the data, etc.) in a context object and store it in a separate Java map against your key. Later, use this map in your MapStore implementation to retrieve the information that will help you persist your key-value pairs.
Your put operation might look like this:
hzMap.put(key, value);

Context context = new Context();
context.setHowToStoreData();
context.setMapName();
// any other information
context.xxx();

// create a unique context key which can later be deduced from the (key, value) pair
contextKey = getUniqueContextKey(key, value);
contextMap.put(contextKey, context);
In your MapStore implementation, you can then use this contextMap to retrieve the additional values.
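As a rough sketch (the Context getters and the persistUsingContext helper are assumptions; getUniqueContextKey is the helper from above), the store callback could then look like this:

public void store(Object key, Object value) {
    Object contextKey = getUniqueContextKey(key, value);
    Context context = contextMap.get(contextKey);
    // context tells us the map name and how to store the data,
    // so we can pick the right table / statement for this pair
    persistUsingContext(context, key, value);
}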
The second way would be to encapsulate the information within the (key, value) pair itself. You can create a new class, say CacheEntry, to hold the cache value as well as the additional information. You can then later retrieve your cache value and the additional information from the IMap itself.
Your put operation might look like this:
CacheEntry<YourValueClass> cacheEntry = new CacheEntry<YourValueClass>();
cacheEntry.setValue(value);
cacheEntry.howToStoreData(..);
cacheEntry.setMapName(..);
imap.put(key, cacheEntry);
In your MapStore implementation, you can then make use of the value (which would be a CacheEntry object) itself to retrieve additional information as well as actual value (instance of YourValueClass).
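A minimal sketch of such a CacheEntry wrapper (the field and method names here are assumptions):

import java.io.Serializable;

// Hypothetical wrapper that carries the value plus storage metadata through the IMap.
public class CacheEntry<T> implements Serializable {
    private T value;
    private String mapName;
    private String howToStoreData;

    public T getValue() { return value; }
    public void setValue(T value) { this.value = value; }

    public String getMapName() { return mapName; }
    public void setMapName(String mapName) { this.mapName = mapName; }

    public String getHowToStoreData() { return howToStoreData; }
    public void setHowToStoreData(String howToStoreData) { this.howToStoreData = howToStoreData; }
}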


spring-ldap and @Attribute annotation with spring-ldap 2.x ODM interface

There seem to be some things missing in the Spring LDAP ODM annotations. This is a question by way of a feature request; if there is a better way to contribute such requests, please say so.
I'd like to mark an @Attribute as read-only, so it will populate the bean from LDAP for reference but not be persisted back to LDAP. I'd suggest adding a read-only attribute to @Attribute, defaulting to false for the usual case. The default attribute list of * misses all the operational attributes, some of which are very useful, and transfers more data than is required, slowing down the LDAP query with attributes that will never be used.
An example of this: it would be very useful for literally read-only attributes, such as entryUUID, etag, etc., which you cannot use if you wish to persist only some fields back to LDAP, as the bean fails to persist with an exception when you save it. But it would also be useful for general fields which you want to structurally prevent the user from ever updating.
You can get around this by not annotating read-only fields and then manually populating them with a separate call. Very messy, and it kills the query speed.
Also, on a related topic, query() could have a default list of attributes, which you have already annotated in your classes, something like:
public static String[] getBeanAttributes(Class<?> beanClass) {
    ArrayList<String> attrsObj = new ArrayList<>();
    for (Field field : beanClass.getDeclaredFields()) {
        if (field.isAnnotationPresent(Attribute.class)) {
            Attribute attr = field.getAnnotation(Attribute.class);
            attrsObj.add(attr.name());
        }
    }
    String[] attrs = attrsObj.toArray(new String[attrsObj.size()]);
    return attrs;
}
The above just returns a simple String[] of your declared attributes, to pass to query.attributes(). Now I realize that, as a static member, query() is built before the bean class is known, but at least there could be a helper function like the above, or an overload of query's attributes() that took a bean Class as an argument.
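For instance, with the 2.x query builder it could be used roughly like this (assuming a static import of LdapQueryBuilder.query(); Person and the surrounding setup are illustrative, not part of the original request):

// look up only the attributes that are actually annotated on the bean
String[] attrs = getBeanAttributes(Person.class);
List<Person> people = ldapTemplate.find(
        query().attributes(attrs)
               .where("objectclass").is("person"),
        Person.class);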
I created LDAP-312 on Jira. Thanks.

How to use ObjectContext with Model Builder?

Is there a way we can use ObjectContext with DbContext's ModelBuilder? We don't want to use POCOs because we have customized property code that updates only the modified properties of an object rather than the entire object. We also have lots of serialisation and auditing code that uses EntityObject.
Since POCO creates a proxy with EntityObject, we want our classes to be derived from EntityObject; we don't want proxies. We also use CreateSourceQuery heavily. The only problem is the EDMX file and its big connection string syntax in web.config.
Is there any way I can get rid of the EDMX file? That would be useful, as we could dynamically compile new classes based on reverse engineering the database.
I would also like to use DbContext with EntityObject instead of POCOs.
Internal Logic
1. Access the modified properties in SaveChanges, which are available in ObjectStateEntry, and save them to an audit with the old and new values.
2. Most of the time we only need to check an Any condition on a navigation property, for example:
User.EmailAddresses.CreateSourceQuery()
    .Any(x => x.EmailAddress == givenAddress);
3. Access property attributes, such as XmlIgnore etc.; we rely heavily on attributes defined on the properties.
A proxy for a POCO is a dynamically created class which derives from (inherits from) the POCO. It adds functionality previously found in EntityObject, namely lazy loading and change tracking, as long as the POCO meets the requirements. A POCO or its proxy does not contain an EntityObject as the question suggests; rather, a proxy contains the functionality of EntityObject. You cannot (AFAIK) use ModelBuilder with EntityObject derivatives, and you cannot get to an underlying EntityObject from a POCO or a proxy, since there isn't one as such.
I don't know what features of ObjectContext your existing serialisation and auditing code uses, but you can get to the ObjectContext from a DbContext by casting the DbContext to IObjectContextAdapter and accessing the IObjectContextAdapter.ObjectContext property.
EDIT:
1. Access the modified properties in SaveChanges, which are available in ObjectStateEntry, and save them to an audit with the old and new values.
You can achieve this with POCOs by using DbContext.ChangeTracker. First you call DbContext.ChangeTracker.DetectChanges to detect the changes (if you use proxies this is not needed, but it can't hurt), and then you use DbContext.ChangeTracker.Entries().Where(e => e.State != EntityState.Unchanged && e.State != EntityState.Detached) to get the list of DbEntityEntry objects for the changed entities for auditing. Each DbEntityEntry has OriginalValues and CurrentValues, and the actual entity is in the Entity property.
You also have access to ObjectStateEntry, see below.
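As a rough illustration of the auditing loop described above (the helper name and EF6-style namespaces are assumptions):

using System;
using System.Data.Entity;
using System.Data.Entity.Infrastructure;
using System.Linq;

public static class AuditHelper
{
    // Sketch: collect old/new values of modified entities from the DbContext change tracker.
    public static void AuditChanges(DbContext context)
    {
        context.ChangeTracker.DetectChanges(); // not required with change-tracking proxies

        foreach (DbEntityEntry entry in context.ChangeTracker.Entries()
                     .Where(e => e.State == EntityState.Modified))
        {
            foreach (string propertyName in entry.OriginalValues.PropertyNames)
            {
                object oldValue = entry.OriginalValues[propertyName];
                object newValue = entry.CurrentValues[propertyName];
                if (!Equals(oldValue, newValue))
                {
                    // write entry.Entity, propertyName, oldValue and newValue to your audit store
                }
            }
        }
    }
}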
2. Most of the time we only need to check an Any condition on a navigation property, for example:
User.EmailAddresses.CreateSourceQuery().Any( x=> x.EmailAddress == givenAddress);
You can use CreateSourceQuery() with DbContext by utilizing IObjectContextAdapter as described above. Once you have the ObjectContext, you can get to the source query for a related end like this:
public static class DbContextUtils
{
    public static ObjectQuery<TMember> CreateSourceQuery<TEntity, TMember>(
        this IObjectContextAdapter adapter,
        TEntity entity,
        Expression<Func<TEntity, ICollection<TMember>>> memberSelector) where TMember : class
    {
        var objectStateManager = adapter.ObjectContext.ObjectStateManager;
        var objectStateEntry = objectStateManager.GetObjectStateEntry(entity);
        var relationshipManager = objectStateManager.GetRelationshipManager(entity);
        var entityType = (EntityType)objectStateEntry.EntitySet.ElementType;
        var navigationProperty = entityType.NavigationProperties[(memberSelector.Body as MemberExpression).Member.Name];
        var relatedEnd = relationshipManager.GetRelatedEnd(navigationProperty.RelationshipType.FullName, navigationProperty.ToEndMember.Name);
        return ((EntityCollection<TMember>)relatedEnd).CreateSourceQuery();
    }
}
This method uses no dynamic code and is strongly typed since it uses expressions. You use it like this:
myDbContext.CreateSourceQuery(invoice, i => i.details);
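Tying this back to the original question, the Any check could then be written roughly like this (user, EmailAddresses and givenAddress are taken from the question; the rest is an assumption):

// requires using System.Linq for Any(); ObjectQuery<T> is IQueryable<T>
bool exists = myDbContext
    .CreateSourceQuery(user, u => u.EmailAddresses)
    .Any(x => x.EmailAddress == givenAddress);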

Storing object in Esent persistent dictionary gives: Not supported for SetColumn Parameter error

I am trying to save an object which implements an interface, say IInterface.
private PersistentDictionary<string, IInterface> Object = new PersistentDictionary<string, IInterface>(Environment.CurrentDirectory + @"\Object");
Since many classes implement the same interface (all of which need to be cached), for a generic approach I want to store an object of type IInterface in the dictionary.
That way, anywhere I can pull that object out, cast it as IInterface, and use that object's internal implementation of its methods, etc.
But, as soon as the Esent cache is initialized it throws this error:
Not supported for SetColumn
Parameter name: TColumn
Actual value was IInterface.
I have tried to use XmlSerializer to do the same, but it is unable to deserialize an interface type. Also, the [Serializable] attribute cannot be used on top of an interface, so I am stuck.
I have also tried marking all the implementations (classes) of the interface as [Serializable] in a last-ditch attempt, but to no avail.
Does anyone know a way out? Thanks in advance!
The only reason that only structs are supported (as well as some basic immutable classes such as string) is that the PersistentDictionary is meant to be a drop-in replacement for Dictionary, SortedDictionary and other similar classes.
Suppose I have the following code:
class MyClass
{
    public int val;
}

// ...

var dict = new Dictionary<int, MyClass>();
var x = new MyClass();
x.val = 1;
dict.Add(0, x);
x.val = 2;
var y = dict[0];
Console.WriteLine(y.val);
The output in this case would be 2. But if I'd used the PersistentDictionary instead of the regular one, the output would be 1. The class was created with value 1, and then changed after it was added to the dictionary. Since a class is a reference type, when we retrieve the item from the dictionary, we will also have the changed data.
Since the PersistentDictionary writes the data to disk, it cannot really handle reference types this way. Serializing it, and writing it to disk is essentially the same as treating the object as a value type (an entire copy is made).
Because it's intended to be used instead of the standard dictionaries, and the fact that it cannot handle reference types with complete transparency, the developers instead opted to support only structs, because structs are value types already.
However, if you're aware of this limitation and promise to be careful not to fall into this trap, you can allow it to serialize classes quite easily. Just download the source code and compile your own version of the EsentCollections library. The only change you need to make to it is to change this line:
if (!(type.IsValueType && type.IsSerializable))
to this:
if (!type.IsSerializable)
This will allow classes to be written to the PersistentDictionary as well, provided that it's Serializable, and its members are Serializable as well. A huge benefit is that it will also allow you to store arrays in there this way. All you have to keep in mind is that it's not a real dictionary, therefore when you write an object to it, it will store a copy of the object. Therefore, updating any of your object's members after adding them to the PersistentDictionary will not update the copy in the dictionary automatically as well, you'd need to remember to update it manually.
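To make that caveat concrete, here is a small sketch, assuming the patched library behaves as described above (the class and directory name are made up):

[Serializable]
class MyClass
{
    public int val;
}

// ...

var dict = new PersistentDictionary<int, MyClass>(@".\MyClassStore");
var x = new MyClass { val = 1 };
dict[0] = x;                     // a serialized copy of x is written to disk
x.val = 2;                       // changes only the in-memory object
Console.WriteLine(dict[0].val);  // prints 1: the stored copy stays unchanged until you re-assign dict[0]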
PersistentDictionary can only store value-structs and a very limited subset of classes (string, Uri, IPAddress). Take a look at ColumnConverter.cs, at private static bool IsSerializable(Type type) for the full restrictions. You'd be hitting the typeinfo.IsValueType() restriction.
By the way, you can also try posting questions about PersistentDictionary at http://managedesent.codeplex.com/discussions .
-martin

How to auto-generate early bound properties for Entity specific (ie Local) Option Set text values?

After spending a year working with the Microsoft.Xrm.Sdk namespace, I just discovered yesterday that the Entity.FormattedValues property contains the text value for entity-specific (i.e. local) option set texts.
The reason I didn't discover it before is that there is no early bound way of getting the value; i.e. entity.new_myOptionSet is of type OptionSetValue, which only contains the int value. You have to call entity.FormattedValues["new_myoptionset"] to get the string text value of the OptionSetValue.
Therefore, I'd like to get CrmSvcUtil to auto-generate a text property for local option sets, i.e. along with Entity.new_myOptionSet being generated as it currently is, Entity.new_myOptionSetText would be generated as well.
I've looked into the Microsoft.Crm.Services.Utility.ICodeGenerationService, but that looks like it is mostly for specifying what CodeGenerationType something should be...
Is there a supported way, using CrmSvcUtil, to add these properties, or am I better off writing a custom app that generates these properties as a partial class alongside the auto-generated ones?
Edit - an example of the code that I would like to be generated
Currently, whenever I need to access the text value of an OptionSetValue, I use this code:
var textValue = OptionSetCache.GetText(service, entity, e => e.New_MyOptionSet);
The option set cache will use the entity.LogicalName and the property expression to determine the name of the option set I'm asking for. It will then query the SDK using a RetrieveAttributeRequest to get the list of option set int and text values, which it then caches so it doesn't have to hit CRM again. It then looks up the int value of the entity's New_MyOptionSet and cross-references it with the cached list to get the text value of the option set.
Instead of doing all of that, I can just do this (assuming that the entity has been retrieved from the server, and not just populated client side):
var textValue = entity.FormattedValues["new_myoptionset"];
but the "new_myoptionset" is no longer early bound. I would like the early bound entity classes that gets generated to also generate an extra "Text" property for OptionSetValue properties that calls the above line, so my entity would have this added to it:
public string New_MyOptionSetText
{
    get
    {
        return this.GetFormattedAttributeValue("new_myoptionset"); // a protected method on the Entity class itself
    }
}
Could you utilize the CrmSvcUtil extension that generates enums for your option sets, and then add your new_myOptionSetText property to a partial class that compares the int value to the enums and returns the enum string?
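Roughly, such a partial class might look like this (MyEntity and new_myoptionsetEnum are placeholders for your generated entity and enum; the names here are assumptions):

public partial class MyEntity
{
    public string new_MyOptionSetText
    {
        get
        {
            if (new_MyOptionSet == null)
                return null;

            // cast the stored int to the generated enum and use its name as the text
            return ((new_myoptionsetEnum)new_MyOptionSet.Value).ToString();
        }
    }
}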
Again, I think that specifically for this case, getting CrmSvcUtil.exe to generate the code you want is a great idea, but more generally, you can access the property name via reflection using an approach similar to the accepted answer to "workarounds for nameof() operator in C#: typesafe databinding".
var textValue = entity.FormattedValues["new_myoptionset"];
// becomes
var textValue = entity.FormattedValues
[
// renamed the class from Nameof to NameOf
NameOf(Xrm.MyEntity).Property(x => x.new_MyOptionSet).ToLower()
];
The latest version of the CRM Early Bound Generator includes a Fields struct that contains the field names. This makes accessing the FormattedValues as simple as this:
var textValue = entity.FormattedValues[MyEntity.Fields.new_MyOptionSet];
You could create a new property via an interface for the CrmSvcUtil, but that's a lot of work for a fairly simple call, and I don't think it justifies creating additional properties.

groovy domain objects in Db4O database

I'm using db4o with Groovy (actually Griffon). I'm saving a dozen objects into a db4o ObjectSet and see that the .yarv file size is about 11 MB. I've checked its content and found that it stores the metaClass with all nested fields for every object. That's a waste of space.
I'm looking for a way to avoid storing the metaClass and therefore reduce the size of the resulting .yarv file, since I'm going to use db4o to store millions of entities.
Should I try the callConstructors(true) db4o configuration? Do you think it would help?
Any help would be highly appreciated.
As an alternative, you can just store Groovy-bean instances. Those are compiled down to regular Java-ish classes with no special Groovy-specific code attached to them.
Just like this:
class Customer {
    // properties
    Integer id
    String name
    Address address
}

class Address {
    String street
}

def customer = new Customer(id: 1, name: "Gromit", address: new Address(street: "Fun"))
I don't know Groovy, but based on your description every Groovy object carries metadata, and you want to skip storing those metadata objects.
If that is the case, installing a "null translator" (the TNull class) will cause the "translated" objects not to be stored.
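A minimal sketch of installing such a translator (the exact metaclass type to exclude is an assumption; you may need to register it for the concrete class that actually ends up in your stored objects):

import com.db4o.Db4oEmbedded
import com.db4o.config.TNull

def config = Db4oEmbedded.newConfiguration()
// translate the Groovy metaclass type to nothing, so it is not persisted
config.common().objectClass(groovy.lang.MetaClass).translate(new TNull())
def db = Db4oEmbedded.openFile(config, "customers.yarv")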
PS: The callConstructors configuration has no effect on what gets stored in the db; it only affects how objects are instantiated when reading from the db.
Hope this helps
