Hazelcast distributed query using indexes - hazelcast

I am trying to query a Hazelcast map using predicates. My predicate code works fine without indexes, but for better performance I want to put an index on the key of my Hazelcast map.
Map Structure: IMap<Event, Long> - Event is a POJO class.
<map name="event.map">
    <in-memory-format>NATIVE</in-memory-format>
    <backup-count>2</backup-count>
    <async-backup-count>0</async-backup-count>
    <time-to-live-seconds>30</time-to-live-seconds>
    <max-idle-seconds>0</max-idle-seconds>
    <eviction-policy>LFU</eviction-policy>
    <max-size policy="FREE_NATIVE_MEMORY_PERCENTAGE">25</max-size>
    <cache-deserialized-values>INDEX-ONLY</cache-deserialized-values>
    <indexes>
        <index ordered="true">eventType</index>
    </indexes>
</map>
With the above map config, I get the following exception:
Jul 19, 2019 10:04:21 PM com.hazelcast.map.impl.operation.HDEntryOperation
SEVERE: [127.0.0.1]:5701 [dev] [3.11.2] java.lang.IllegalArgumentException: There is no suitable accessor for 'eventType' on class 'java.lang.Long'
com.hazelcast.query.QueryException: java.lang.IllegalArgumentException: There is no suitable accessor for 'eventType' on class 'java.lang.Long'
at com.hazelcast.query.impl.getters.ReflectionHelper.createGetter(ReflectionHelper.java:175)
at com.hazelcast.query.impl.getters.Extractors.instantiateGetter(Extractors.java:124)
at com.hazelcast.query.impl.getters.Extractors.getGetter(Extractors.java:101)
at com.hazelcast.query.impl.getters.Extractors.extract(Extractors.java:63)
at com.hazelcast.query.impl.QueryableEntry.extractAttributeValueFromTargetObject(QueryableEntry.java:144)
at com.hazelcast.query.impl.QueryableEntry.extractAttributeValue(QueryableEntry.java:82)
at com.hazelcast.query.impl.QueryableEntry.getAttributeValue(QueryableEntry.java:48)
at com.hazelcast.query.impl.QueryableEntry.getConverter(QueryableEntry.java:67)
at com.hazelcast.query.impl.IndexImpl.saveEntryIndex(IndexImpl.java:79)
at com.hazelcast.query.impl.Indexes.saveEntryIndex(Indexes.java:164)
at com.hazelcast.map.impl.recordstore.AbstractRecordStore.saveIndex(AbstractRecordStore.java:165)
at com.hazelcast.map.impl.recordstore.DefaultRecordStore.putInternal(DefaultRecordStore.java:709)
at com.hazelcast.map.impl.recordstore.DefaultRecordStore.setWithUncountedAccess(DefaultRecordStore.java:987)
at com.hazelcast.map.impl.operation.EntryOperator.onAddedOrUpdated(EntryOperator.java:288)
at com.hazelcast.map.impl.operation.EntryOperator.doPostOperateOps(EntryOperator.java:219)
at com.hazelcast.map.impl.operation.HDEntryOperation.runVanilla(HDEntryOperation.java:257)
at com.hazelcast.map.impl.operation.HDEntryOperation.runInternal(HDEntryOperation.java:95)
at com.hazelcast.map.impl.operation.HDMapOperation.run(HDMapOperation.java:88)
at com.hazelcast.spi.Operation.call(Operation.java:170)
at com.hazelcast.spi.impl.operationservice.impl.OperationRunnerImpl.call(OperationRunnerImpl.java:208)
at com.hazelcast.spi.impl.operationservice.impl.OperationRunnerImpl.run(OperationRunnerImpl.java:197)
at com.hazelcast.spi.impl.operationexecutor.impl.OperationExecutorImpl.run(OperationExecutorImpl.java:407)
at com.hazelcast.spi.impl.operationexecutor.impl.OperationExecutorImpl.runOrExecute(OperationExecutorImpl.java:434)
at com.hazelcast.spi.impl.operationservice.impl.Invocation.doInvokeLocal(Invocation.java:586)
at com.hazelcast.spi.impl.operationservice.impl.Invocation.doInvoke(Invocation.java:571)
at com.hazelcast.spi.impl.operationservice.impl.Invocation.invoke0(Invocation.java:530)
at com.hazelcast.spi.impl.operationservice.impl.Invocation.invoke(Invocation.java:220)
at com.hazelcast.spi.impl.operationservice.impl.InvocationBuilderImpl.invoke(InvocationBuilderImpl.java:60)
at com.hazelcast.client.impl.protocol.task.AbstractPartitionMessageTask.processMessage(AbstractPartitionMessageTask.java:67)
at com.hazelcast.client.impl.protocol.task.AbstractMessageTask.initializeAndProcessMessage(AbstractMessageTask.java:123)
at com.hazelcast.client.impl.protocol.task.AbstractMessageTask.doRun(AbstractMessageTask.java:111)
at com.hazelcast.client.impl.protocol.task.AbstractMessageTask.run(AbstractMessageTask.java:101)
at com.hazelcast.spi.impl.operationservice.impl.OperationRunnerImpl.run(OperationRunnerImpl.java:161)
at com.hazelcast.spi.impl.operationexecutor.impl.OperationThread.process(OperationThread.java:159)
at com.hazelcast.spi.impl.operationexecutor.impl.OperationThread.process(OperationThread.java:127)
at com.hazelcast.spi.impl.operationexecutor.impl.OperationThread.run(OperationThread.java:110)
Caused by: java.lang.IllegalArgumentException: There is no suitable accessor for 'eventType' on class 'java.lang.Long'
at com.hazelcast.query.impl.getters.ReflectionHelper.createGetter(ReflectionHelper.java:168)
... 35 more
From the exception I understand that Hazelcast is trying to apply the index to the value field of the IMap.
Is there a way an index can be put on the key field of the IMap?

Try
<indexes>
    <index ordered="true">__key.eventType</index>
</indexes>
As a key-value store, it's usual to search on the values, so that's what the index expects: when you put eventType in the index definition, Hazelcast looks for that field on the value. You need to change it to __key.eventType to make it look in the key.
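For example, the programmatic equivalent might look roughly like this (a sketch against the Hazelcast 3.x API; the hazelcastInstance variable and the "LOGIN" value are placeholders, and it assumes Event exposes a getEventType() accessor):
import java.util.Map;
import java.util.Set;

import com.hazelcast.core.HazelcastInstance;
import com.hazelcast.core.IMap;
import com.hazelcast.query.Predicate;
import com.hazelcast.query.Predicates;

// Index an attribute of the key by prefixing the attribute path with __key.
IMap<Event, Long> events = hazelcastInstance.getMap("event.map");
events.addIndex("__key.eventType", true); // same as <index ordered="true">__key.eventType</index>

// The same attribute path is used in predicates, so the index can be applied to the query.
Predicate byType = Predicates.equal("__key.eventType", "LOGIN");
Set<Map.Entry<Event, Long>> matches = events.entrySet(byType);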
If you need frequent search access to part of a composite key, which is effectively what such an index gives you, then the structure of the key itself may need to be reviewed.
Also, if you can, upgrade from 3.11.2 to 3.12.1. There are some querying improvements behind the scenes.

Related

Hazelcast Supplier and Aggregation gives Concurrent Execution Exception

I am trying to get a set of the distinct values of an object's field stored in a Hazelcast map.
This line of java code:
instructions.aggregate(Supplier.all(value -> value.getWorkArea()), Aggregations.distinctValues());
has the following stack trace:
java.util.concurrent.ExecutionException: com.hazelcast.nio.serialization.HazelcastSerializationException: java.lang.ClassNotFoundException: com.example.instruction.repository.HazelcastInstructionRepository$GeneratedEvaluationClass
com.hazelcast.nio.serialization.HazelcastSerializationException: java.lang.ClassNotFoundException: com.example.instruction.repository.HazelcastInstructionRepository$GeneratedEvaluationClass
java.lang.ClassNotFoundException: com.example.instruction.repository.HazelcastInstructionRepository$GeneratedEvaluationClass
If I were to try this line:
instructions.aggregate(Supplier.all(), Aggregations.distinctValues());
or:
instructions.aggregate(Supplier.fromPredicate(Predicates.and(Predicates.equal("type", "someType"), Predicates.equal("groupId", null),
Predicates.equal("workArea", "someWorkArea"))), Aggregations.distinctValues());
It just works... Something seems to go wrong when I make a reference to the object's field (I also tried it with other fields of the object and the same error is returned).
This is running in my local environment, and I am sure the objects are being placed correctly in the Hazelcast map, since the other aggregations/predicates work.
Do you have any ideas about what I am doing wrong?
Many thanks!
EDITED: So the problem is the closure: the generated class is only available on the calling node, not on the other members, hence the ClassNotFoundException.
Also, this feature is deprecated. Please use fast aggregations instead:
http://docs.hazelcast.org/docs/latest/manual/html-single/#fast-aggregations
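For reference, a rough fast-aggregations equivalent (assuming Hazelcast 3.8+ and that the attribute path "workArea" resolves to getWorkArea() returning a String) might be:
import java.util.Set;

import com.hazelcast.aggregation.Aggregators;

// Runs on the members against the stored entries, so no user-defined closure class
// has to be serialized and shipped around the cluster.
Set<String> workAreas = instructions.aggregate(Aggregators.distinct("workArea"));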

How to write a class to do database operations - spring integration

I have this code:
<int-jpa:updating-outbound-gateway
    request-channel="nativeQlChannel" auto-startup="true"
    native-query="update Transactions t set t.transaction_Status = :transactionStatus where t.bank_Reference_Number = :bankReferenceNumber"
    entity-manager="entityManager" persist-mode="PERSIST" reply-channel="nativeQlChannelOne"
    use-payload-as-parameter-source="false"/>
It works fine, but I need to execute insert operations on more than one table, and I am not able to do that with this configuration.
How can I write code using a Spring Integration JPA class such as JpaOutboundGatewayFactoryBean, or any other, so that I can perform the DB operations in my Java code?
First of all, it isn't JPA's responsibility to worry about "more than one table".
It operates only with entities, as a high-level abstraction.
Although yes, you can map your entity to several tables, and there is cascaded insert when you have dependencies.
In addition, that component supports native queries if you need more precise control over your DB operation.
As for Java configuration: correct, you should use a JpaOutboundGatewayFactoryBean @Bean together with @ServiceActivator to reach similar behavior.
You can find more samples in the Reference Manual.
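A rough sketch of such a Java configuration (not tied to a specific Spring Integration version; it builds the JpaOutboundGateway that the factory bean would otherwise produce, and reuses the channel names and query from the XML above; check the Reference Manual for your version):
import javax.persistence.EntityManagerFactory;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.annotation.ServiceActivator;
import org.springframework.integration.jpa.core.JpaExecutor;
import org.springframework.integration.jpa.outbound.JpaOutboundGateway;
import org.springframework.integration.jpa.support.OutboundGatewayType;
import org.springframework.messaging.MessageHandler;

@Configuration
public class JpaGatewayConfig {

    @Bean
    public JpaExecutor updatingJpaExecutor(EntityManagerFactory entityManagerFactory) {
        // Same native query as the XML variant in the question.
        JpaExecutor executor = new JpaExecutor(entityManagerFactory);
        executor.setNativeQuery("update Transactions t set t.transaction_Status = :transactionStatus "
                + "where t.bank_Reference_Number = :bankReferenceNumber");
        executor.setUsePayloadAsParameterSource(false);
        return executor;
    }

    @Bean
    @ServiceActivator(inputChannel = "nativeQlChannel")
    public MessageHandler updatingJpaGateway(JpaExecutor updatingJpaExecutor) {
        JpaOutboundGateway gateway = new JpaOutboundGateway(updatingJpaExecutor);
        gateway.setGatewayType(OutboundGatewayType.UPDATING);
        gateway.setOutputChannelName("nativeQlChannelOne"); // reply-channel from the XML
        return gateway;
    }
}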
I was able to fix it by using ExpressionEvaluatingParameterSourceFactory:
ExpressionEvaluatingParameterSourceFactory paramFactory = new ExpressionEvaluatingParameterSourceFactory();
paramFactory.setParameters(paramList);
But I got this exception:
WARN - o.s.i.e.ExpressionUtils: Creating EvaluationContext with no beanFactory
java.lang.RuntimeException: No beanfactory
It is not stopping any functionality, though, so I just ignored it.

Spring Integration jdbc stored procedure custom rowmapper

I am a newbie to Spring Integration and am using Spring 4.2.4.
I am trying to invoke a stored procedure with jdbc:stored-proc-outbound-gateway; I am already using Spring JDBC.
The stored procedure returns a cursor, and I currently map it with a custom row mapper like below:
new SqlOutParameter(A_RC, OracleTypes.CURSOR, null, new MyCustomDataExtractor())
MyCustomDataExtractor implements SqlReturnType and returns a custom object.
Now the question is how I can achieve this with the SI JDBC stored procedure gateway. A piece of my code is here:
...
<int-jdbc:sql-parameter-definition name="A_RC" type="#{T(oracle.jdbc.OracleTypes).CURSOR}" direction="OUT"/>
...
<int-jdbc:returning-resultset name="A_RC" row-mapper="a.b.c.MyCustomDataExtractor"/>
...
Spring expects this to be a row mapper. Should I use a transformer here? Please advise.
Note: I have to return multiple result sets.
Actually, with the CURSOR type you are good to go with just a returning-resultset and a RowMapper implementation.
With that you don't need to worry about any SqlReturnType; just map the row to your domain object directly.
I'm even sure you can rework your MyCustomDataExtractor to the RowMapper contract.
Note: with a returning-resultset definition you don't need to specify the sql-parameter-definition for the same OUT parameter; the component identifies it correctly as an OUT parameter.
And yes, you can have several returning-resultset elements for CURSOR parameters.
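For illustration, reworking the extractor to the RowMapper contract could look roughly like this (MyDomainObject and the column names are invented here, not taken from the question):
package a.b.c;

import java.sql.ResultSet;
import java.sql.SQLException;

import org.springframework.jdbc.core.RowMapper;

public class MyCustomDataExtractor implements RowMapper<MyDomainObject> {

    @Override
    public MyDomainObject mapRow(ResultSet rs, int rowNum) throws SQLException {
        // Map one row of the returned cursor to the domain object.
        MyDomainObject result = new MyDomainObject();
        result.setId(rs.getLong("ID"));        // hypothetical columns
        result.setName(rs.getString("NAME"));
        return result;
    }
}
It would then be referenced exactly as in the question: <int-jdbc:returning-resultset name="A_RC" row-mapper="a.b.c.MyCustomDataExtractor"/>.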
I added the return-type to the sql-parameter-definition and removed the returning-resultset:
<int-jdbc:sql-parameter-definition name="A_RC" type="#{T(oracle.jdbc.OracleTypes).CURSOR}" direction="OUT" return-type="ed"/>
Here "ed" is simply the bean reference for a.b.c.MyCustomDataExtractor:
<bean id="ed" class="a.b.c.MyCustomDataExtractor"/>

Hazelcast mapstore based on map name

I have a Hazelcast instance that organizes data based on map names. The map names are dynamic, so I do not know what they will be until the Hazelcast instance is already started. I now want to store this data in a database via the MapStore functionality, but retain the organization I have setup with the map names. When looking at the MapStore functionality, I do not see any way to retrieve the map or map name that the object came from. Looks like all I get is the key-value pair in the MapStore implementation.
On a broader note, is there any way to get ANY information (not just the map name) about the key-value pair that needs to be stored? I would like to pass along some knowledge about how to store the data: I know that information when I call map.put(..), but I do not know how to get it to the MapStore call.
I just needed something similar and found that you can implement the com.hazelcast.core.MapStoreFactory interface, whose newMapStore method gives you the map name and the properties from the config. From there, 'new' up your own MapStore implementation, passing it the map name.
public class MyMapStoreFactory implements MapStoreFactory {

    @Override
    public MapStore newMapStore(String mapName, Properties properties) {
        return new MyMapStoreImplementation(mapName, properties);
    }
}
And configure this factory class in hazelcast.xml config like this:
<map name="MapX*">
    <map-store enabled="true" initial-mode="EAGER">
        <factory-class-name>MyMapStoreFactory</factory-class-name>
    </map-store>
</map>
Notice the wildcard in the map name, and that the <class-name> element must not appear once you set <factory-class-name>.
Tested with Hazelcast 3.6.1 and it works fine.
As per my understanding, there is no out-of-the-box support for this in Hazelcast. The following are a couple of workarounds I can think of:
Encapsulate the extra information (map name, how to store the data, etc.) in a context object and store it in a separate map against your key. Later, use this map in your MapStore implementation to retrieve the respective information, which will help you persist your key-value pairs.
Your put operation might look like:
hzMap.put(key, value);

Context context = new Context();
context.setHowToStoreData();
context.setMapName();
// any other information
context.xxx();

// create a unique context key which can later be deduced from the (key, value) pair
String contextKey = getUniqueContextKey(key, value);
contextMap.put(contextKey, context);
In your MapStore implementation, you can then use this contextMap to retrieve the additional values.
A second way would be to encapsulate the information within the (key, value) pair itself. You can create a new class, say CacheEntry, to hold the cache value as well as the additional information, and later retrieve both the cache value and the additional information from the IMap itself.
Your put operation might look like:
CacheEntry<YourValueClass> cacheEntry = new CacheEntry<YourValueClass>();
cacheEntry.setValue(value);
cacheEntry.howToStoreData(..);
cacheEntry.setMapName(..);
imap.put(key, cacheEntry);
In your MapStore implementation, you can then use the value (which will be a CacheEntry object) to retrieve the additional information as well as the actual value (an instance of YourValueClass).
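For completeness, the MapStore side of this second workaround might look roughly like the sketch below (a String key is assumed for brevity; CacheEntry's getters and the persistToDatabase helper are hypothetical and would need to match your own classes):
import java.util.Collection;
import java.util.Map;

import com.hazelcast.core.MapStore;

public class CacheEntryMapStore implements MapStore<String, CacheEntry<YourValueClass>> {

    @Override
    public void store(String key, CacheEntry<YourValueClass> entry) {
        // The wrapper carries the real value plus the extra storage hints.
        persistToDatabase(entry.getMapName(), key, entry.getValue());
    }

    @Override
    public void storeAll(Map<String, CacheEntry<YourValueClass>> entries) {
        entries.forEach(this::store);
    }

    @Override
    public void delete(String key) {
        // remove the persisted row for this key
    }

    @Override
    public void deleteAll(Collection<String> keys) {
        keys.forEach(this::delete);
    }

    @Override
    public CacheEntry<YourValueClass> load(String key) {
        return null; // or rebuild the CacheEntry from the database
    }

    @Override
    public Map<String, CacheEntry<YourValueClass>> loadAll(Collection<String> keys) {
        return null;
    }

    @Override
    public Iterable<String> loadAllKeys() {
        return null;
    }

    private void persistToDatabase(String mapName, String key, YourValueClass value) {
        // hypothetical persistence call using the map name as routing information
    }
}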

Groovy Griffon with Hibernate4 plugin class not found exception

I was looking for a fast and easy way to write a very cross-platform desktop application, which led me to think that the JVM is the place to be. Since Groovy (Grails) is used in my workplace, I thought I would try Griffon, since they claim it is essentially Grails for the desktop.
I wanted a persistence management layer, and it doesn't appear that GORM is showtime-ready in this environment, so I moved towards Hibernate using the Hibernate4 plugin for Griffon.
I haven't really used Hibernate before, but I believe, based on the guides, that I am doing things correctly. My reading indicates that this setup doesn't support annotations to wire up classes, so I am using hbm.xml files.
The provided sample for the plugin isn't complex, but I don't understand where I am deviating.
Here is a sample class file as it stands:
package gwash

import groovy.beans.Bindable

class DeliveryMethodModel {
    // @Bindable String propName
}
Here is some of the stack trace:
org.hibernate.InvalidMappingException: Could not parse mapping document from resource gwash\DeliveryMethod.hbm.xml
at org.hibernate.cfg.Configuration$MetadataSourceQueue.processHbmXml(Configuration.java:3415)
at org.hibernate.cfg.Configuration$MetadataSourceQueue.processHbmXmlQueue(Configuration.java:3404)
at org.hibernate.cfg.Configuration$MetadataSourceQueue.processMetadata(Configuration.java:3392)
at org.hibernate.cfg.Configuration.secondPassCompile(Configuration.java:1341)
at org.hibernate.cfg.Configuration.buildSessionFactory(Configuration.java:1737)
at org.hibernate.cfg.Configuration.buildSessionFactory(Configuration.java:1788)
at org.hibernate.cfg.Configuration$buildSessionFactory.call(Unknown Source)
at griffon.plugins.hibernate4.Hibernate4Connector.connect(Hibernate4Connector.groovy:72)
at griffon.plugins.hibernate4.Hibernate4Connector.connect(Hibernate4Connector.groovy)
at griffon.plugins.hibernate4.Hibernate4Connector$connect.call(Unknown Source)
at Hibernate4GriffonAddon.addonInit(Hibernate4GriffonAddon.groovy:27)
at griffon.core.GriffonAddon$addonInit.call(Unknown Source)
at griffon.core.GriffonAddon$addonInit.call(Unknown Source)
at org.codehaus.griffon.runtime.util.AddonHelper.handleAddon(AddonHelper.groovy:155)
at org.codehaus.griffon.runtime.util.AddonHelper.handleAddonsAtStartup(AddonHelper.groovy:105)
at org.codehaus.griffon.runtime.core.DefaultAddonManager.doInitialize(DefaultAddonManager.java:33)
at org.codehaus.griffon.runtime.core.AbstractAddonManager.initialize(AbstractAddonManager.java:101)
at org.codehaus.griffon.runtime.util.GriffonApplicationHelper.initializeAddonManager(GriffonApplicationHelper.java:320)
at org.codehaus.griffon.runtime.util.GriffonApplicationHelper.prepare(GriffonApplicationHelper.java:123)
at org.codehaus.griffon.runtime.core.AbstractGriffonApplication.initialize(AbstractGriffonApplication.java:221)
at griffon.swing.AbstractSwingGriffonApplication.bootstrap(AbstractSwingGriffonApplication.java:74)
at griffon.swing.AbstractSwingGriffonApplication.run(AbstractSwingGriffonApplication.java:131)
at griffon.swing.SwingApplication.main(SwingApplication.java:36)
Caused by: org.hibernate.PropertyNotFoundException: field [id] not found on gwash.DeliveryMethodModel
at org.hibernate.property.DirectPropertyAccessor.getField(DirectPropertyAccessor.java:182)
at org.hibernate.property.DirectPropertyAccessor.getField(DirectPropertyAccessor.java:189)
at org.hibernate.property.DirectPropertyAccessor.getField(DirectPropertyAccessor.java:189)
at org.hibernate.property.DirectPropertyAccessor.getField(DirectPropertyAccessor.java:189)
at org.hibernate.property.DirectPropertyAccessor.getField(DirectPropertyAccessor.java:189)
at org.hibernate.property.DirectPropertyAccessor.getField(DirectPropertyAccessor.java:174)
at org.hibernate.property.DirectPropertyAccessor.getGetter(DirectPropertyAccessor.java:197)
at org.hibernate.internal.util.ReflectHelper.getter(ReflectHelper.java:253)
at org.hibernate.internal.util.ReflectHelper.reflectedPropertyClass(ReflectHelper.java:229)
at org.hibernate.mapping.SimpleValue.setTypeUsingReflection(SimpleValue.java:326)
at org.hibernate.cfg.HbmBinder.bindSimpleId(HbmBinder.java:449)
at org.hibernate.cfg.HbmBinder.bindRootPersistentClassCommonValues(HbmBinder.java:382)
at org.hibernate.cfg.HbmBinder.bindRootClass(HbmBinder.java:322)
at org.hibernate.cfg.HbmBinder.bindRoot(HbmBinder.java:173)
at org.hibernate.cfg.Configuration$MetadataSourceQueue.processHbmXml(Configuration.java:3415)
My xml mapping file:
<!DOCTYPE hibernate-mapping PUBLIC
    "-//Hibernate/Hibernate Mapping DTD 3.0//EN"
    "http://www.hibernate.org/dtd/hibernate-mapping-3.0.dtd">
<hibernate-mapping package="gwash">
    <class name="DeliveryMethodModel" table="[DELIVERY METHODS]">
        <id name="id" column="[DELIVERY METHOD ID]">
            <generator class="increment"/>
        </id>
        <property name="method" column="[DELIVERY METHOD]"/>
    </class>
</hibernate-mapping>
EDIT: I've removed the brackets and spaces as indicated, and changed DataSource.groovy to 'create' on the DB side. I'm still experiencing the same issues. The examples for Hibernate integration with Griffon/HSQLDB/Groovy are scant on details. Do I need to create all of the mapped properties on the model class for this to parse correctly? I've never used Hibernate. Nor Groovy. Nor Griffon. I would definitely provide feedback for the community if I can get this resolved; if not, I'll be rolling my own ORM since this is a rather small project. I'd rather not roll my own.
Do you actually have the strings wrapped with [ and ]?
I would suspect that for a class defined as
package gwash

import groovy.beans.Bindable

class DeliveryMethodModel {
    Long id
    @Bindable String method
}
the mapping file would be
<!DOCTYPE hibernate-mapping PUBLIC
    "-//Hibernate/Hibernate Mapping DTD 3.0//EN"
    "http://www.hibernate.org/dtd/hibernate-mapping-3.0.dtd">
<hibernate-mapping package="gwash">
    <class name="DeliveryMethodModel" table="DELIVERY_METHODS">
        <id name="id" column="DELIVERY_METHOD_ID">
            <generator class="increment"/>
        </id>
        <property name="method" column="DELIVERY_METHOD"/>
    </class>
</hibernate-mapping>
