My use case is that I am using the DynamoDB mapper's transactionWrite() method, performing two put operations in a single TransactionWriteRequest.
Now, when I try to verify the call using verify(dynamoDBMapper).transactionWrite(writeRequest);, I get the following output:
=> Argument(s) are different! Wanted:
[java] dynamoDBMapper.transactionWrite(
[java] ...dynamodbv2.datamodeling.TransactionWriteRequest#4ee37ca3
[java] );
[java] -> at
...MyTestClass.myMethod_withValidData_returnSuccess(MyTestClass.java:99)
[java] Actual invocation has different arguments:
[java] dynamoDBMapper.transactionWrite(
[java] ...dynamodbv2.datamodeling.TransactionWriteRequest#45c8d09f
I was able to do this using Unitils ReflectionAssert, which compares objects recursively using reflection.
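A minimal sketch of that approach inside the test method, assuming a mocked dynamoDBMapper and an expectedRequest built the same way as the code under test builds its request (both names are illustrative, not from the original test):

import static org.mockito.Mockito.verify;

import com.amazonaws.services.dynamodbv2.datamodeling.TransactionWriteRequest;
import org.mockito.ArgumentCaptor;
import org.unitils.reflectionassert.ReflectionAssert;

// Capture the request actually passed to transactionWrite(), then compare it
// field by field with ReflectionAssert instead of relying on equals().
ArgumentCaptor<TransactionWriteRequest> captor =
        ArgumentCaptor.forClass(TransactionWriteRequest.class);
verify(dynamoDBMapper).transactionWrite(captor.capture());
ReflectionAssert.assertReflectionEquals(expectedRequest, captor.getValue());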
HybrisContextFactory$ApplicationContextFactory.build(HybrisContextFactory.java:263) [coreserver.jar:?]
[java] at de.hybris.platform.core.HybrisContextHolder.getApplicationInstance(HybrisContextHolder.java:87) [coreserver.jar:?]
[java] at de.hybris.platform.core.AbstractTenant.createCoreApplicationContext(AbstractTenant.java:726) [coreserver.jar:?]
[java] at de.hybris.platform.core.AbstractTenant.doStartupSafe(AbstractTenant.java:765) [coreserver.jar:?]
[java] at de.hybris.platform.core.AbstractTenant.doStartUp(AbstractTenant.java:698) [coreserver.jar:?]
[java] at de.hybris.platform.core.Registry.assureTenantStarted(Registry.java:658) [coreserver.jar:?]
[java] at de.hybris.platform.core.Registry.activateTenant(Registry.java:719) [coreserver.jar:?]
[java] at de.hybris.platform.core.Registry.setCurrentTenant(Registry.java:566) [coreserver.jar:?]
[java] at de.hybris.platform.core.Registry.activateMasterTenant(Registry.java:626) [coreserver.jar:?]
[java] at de.hybris.platform.util.ClientExecuter.execute(ClientExecuter.java:43) [coreserver.jar:?]
[java] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_202]
[java] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_202]
[java] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_202]
[java] at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_202]
[java] at de.hybris.bootstrap.loader.Loader.execute(Loader.java:142) [ybootstrap.jar:?]
[java] at de.hybris.bootstrap.loader.Loader.main(Loader.java:118) [ybootstrap.jar:?]
[java] Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'defaultProductInterestRelationConverter' defined in class path resource [customerinterestsfacades-spring.xml]: Cannot resolve reference to bean 'productInterestRelationPopulator' while setting bean property 'populators' with key [0]; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'productInterestRelationPopulator' defined in class path resource [customerinterestsfacades-spring.xml]: Initialization of bean failed; nested exception is org.springframework.beans.ConversionNotSupportedException: Failed to convert property value of type 'de.hybris.platform.acceleratorfacades.futurestock.impl.DefaultFutureStockFacade' to required type 'de.hybris.platform.customerinterestsfacades.futurestoc
To be able to log in with the mani1 user, please run the following ImpEx:
INSERT_UPDATE Employee;uid[unique=true];backOfficeLoginDisabled;
;mani1;false;
The ImpEx above sets backOfficeLoginDisabled to false, allowing the mani1 user to log in.
Another option would be to assign the OOTB (out-of-the-box) backofficeadmingroup to the user.
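A minimal ImpEx sketch for that second option (assuming the standard backofficeadmingroup exists in your system):

INSERT_UPDATE Employee;uid[unique=true];groups(uid)
;mani1;backofficeadmingroup;

Note that groups(uid) replaces the user's existing group memberships by default; add the [mode=append] modifier to the column if you want to keep them.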
Once we have a stable production setup, we ideally do not perform initialization. To avoid accidents, it is good to block the initialization option in HAC for all users.
Refer to how-to-block-system-initialization-in-production, where you can find a few possible options.
If you are using Hybris 6.1 or any higher version and want to grant tab-specific HAC access using user groups/roles, all you have to do is assign a specific user group to your user and you are done. There are predefined user groups such as ROLE_HAC_PLATFORM_INITIALIZATION, ROLE_HAC_PLATFORM_UPDATE, and many more. You can find a full list of HAC roles and more details here.
You can fix it by assigning the OOTB user group hac_platform_initialization to only the expected non-admin employee users in your system.
Something like...
INSERT_UPDATE Employee;uid[unique=true];groups(uid)
;mani1;hac_platform_initialization;
This way you can expose the function to only a limited set of users.
Hope it helps. Thanks!
I am new to JHipster and get the error below when I try to run my project:
com.mysql.jdbc.exceptions.jdbc4.MySQLNonTransientConnectionException: No operations allowed after connection closed.
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at com.mysql.jdbc.Util.handleNewInstance(Util.java:425)
at com.mysql.jdbc.Util.getInstance(Util.java:408)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:919)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:898)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:887)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:861)
at com.mysql.jdbc.ConnectionImpl.throwConnectionClosedException(ConnectionImpl.java:1184)
at com.mysql.jdbc.ConnectionImpl.checkClosed(ConnectionImpl.java:1179)
at com.mysql.jdbc.ConnectionImpl.rollback(ConnectionImpl.java:4523)
at com.zaxxer.hikari.pool.ProxyConnection.rollback(ProxyConnection.java:377)
at com.zaxxer.hikari.pool.HikariProxyConnection.rollback(HikariProxyConnection.java)
at liquibase.database.jvm.JdbcConnection.rollback(JdbcConnection.java:337)
at liquibase.database.AbstractJdbcDatabase.rollback(AbstractJdbcDatabase.java:1166)
at liquibase.lockservice.StandardLockService.acquireLock(StandardLockService.java:205)
at liquibase.lockservice.StandardLockService.waitForLock(StandardLockService.java:170)
at liquibase.Liquibase.update(Liquibase.java:196)
at liquibase.Liquibase.update(Liquibase.java:192)
at liquibase.integration.spring.SpringLiquibase.performUpdate(SpringLiquibase.java:431)
at liquibase.integration.spring.SpringLiquibase.afterPropertiesSet(SpringLiquibase.java:388)
at io.github.jhipster.config.liquibase.AsyncSpringLiquibase.initDb(AsyncSpringLiquibase.java:103)
at io.github.jhipster.config.liquibase.AsyncSpringLiquibase.lambda$afterPropertiesSet$0(AsyncSpringLiquibase.java:83)
at io.github.jhipster.async.ExceptionHandlingAsyncTaskExecutor.lambda$createWrappedRunnable$1(ExceptionHandlingAsyncTaskExecutor.java:68)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
2018-09-17 18:01:07.135 ERROR 31400 --- [-erp-Executor-1] i.g.j.c.liquibase.AsyncSpringLiquibase : Liquibase could not start correctly, your database is NOT ready: java.sql.SQLException: Connection is closed
liquibase.exception.DatabaseException: java.sql.SQLException: Connection is closed
at liquibase.database.jvm.JdbcConnection.rollback(JdbcConnection.java:340)
at liquibase.database.jvm.JdbcConnection.close(JdbcConnection.java:111)
at liquibase.database.AbstractJdbcDatabase.close(AbstractJdbcDatabase.java:1209)
at liquibase.integration.spring.SpringLiquibase.afterPropertiesSet(SpringLiquibase.java:397)
at io.github.jhipster.config.liquibase.AsyncSpringLiquibase.initDb(AsyncSpringLiquibase.java:103)
at io.github.jhipster.config.liquibase.AsyncSpringLiquibase.lambda$afterPropertiesSet$0(AsyncSpringLiquibase.java:83)
at io.github.jhipster.async.ExceptionHandlingAsyncTaskExecutor.lambda$createWrappedRunnable$1(ExceptionHandlingAsyncTaskExecutor.java:68)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.sql.SQLException: Connection is closed
at com.zaxxer.hikari.pool.ProxyConnection$ClosedConnection.lambda$getClosedConnection$0(ProxyConnection.java:490)
at com.sun.proxy.$Proxy144.getAutoCommit(Unknown Source)
at com.zaxxer.hikari.pool.HikariProxyConnection.getAutoCommit(HikariProxyConnection.java)
at liquibase.database.jvm.JdbcConnection.rollback(JdbcConnection.java:336)
... 9 common frames omitted
18:01:07.136 [restartedMain] ERROR org.springframework.boot.SpringApplication - Application run failed
org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'userService' defined in file [/home/arvind/Documents/sts-workspace2/multichanneApp/build/classes/java/main/com/webbee/b2erp/service/UserService.class]: Unsatisfied dependency expressed through constructor parameter 2; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'userSearchRepository': Invocation of init method failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.springframework.data.elasticsearch.repository.support.NumberKeyedRepository]: Constructor threw exception; nested exception is org.springframework.data.elasticsearch.ElasticsearchException: Failed to build mapping for user:user
Thanks for your prompt response. Please find the code attached for your reference. I am using the following code in the JDL file to create the user module. Let me know if you need anything else.
relationship ManyToOne {
    Role{User(login)} to User{login}
}
entity Role {
    name String
}
package com.webbee.b2erp.repository.search;

import com.webbee.b2erp.domain.User;
import org.springframework.data.elasticsearch.repository.ElasticsearchRepository;

/**
 * Spring Data Elasticsearch repository for the User entity.
 */
public interface UserSearchRepository extends ElasticsearchRepository<User, Long> {
}
I am trying to get a set of the distinct values of an object's field stored in a Hazelcast map.
This line of Java code:
instructions.aggregate(Supplier.all(value -> value.getWorkArea()), Aggregations.distinctValues());
has the following stack trace:
java.util.concurrent.ExecutionException: com.hazelcast.nio.serialization.HazelcastSerializationException: java.lang.ClassNotFoundException: com.example.instruction.repository.HazelcastInstructionRepository$GeneratedEvaluationClass
com.hazelcast.nio.serialization.HazelcastSerializationException: java.lang.ClassNotFoundException: com.example.instruction.repository.HazelcastInstructionRepository$GeneratedEvaluationClass
java.lang.ClassNotFoundException: com.example.instruction.repository.HazelcastInstructionRepository$GeneratedEvaluationClass
If I try this line:
instructions.aggregate(Supplier.all(), Aggregations.distinctValues());
or:
instructions.aggregate(Supplier.fromPredicate(Predicates.and(Predicates.equal("type", "someType"),
        Predicates.equal("groupId", null), Predicates.equal("workArea", "someWorkArea"))), Aggregations.distinctValues());
It just works. Something seems to go wrong only when I reference the object's field. (I also tried other fields of the object, and the same error is returned.)
This is running in my local environment, and I am sure the objects are being placed correctly in the Hazelcast map, since the other aggregations/predicates work.
Do you have any idea what I am doing wrong?
Many Thanks!
EDITED: So the problem is the closure. It is not available on all nodes, only on the calling node.
Also, this feature is deprecated. Please use fast aggregations instead:
http://docs.hazelcast.org/docs/latest/manual/html-single/#fast-aggregations
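For reference, a minimal sketch of the fast-aggregations equivalent (assuming Hazelcast 3.8+, with instructions being the IMap from the question and workArea the field being collected):

import com.hazelcast.aggregation.Aggregators;
import java.util.Set;

// The attribute path is evaluated on the members themselves, so no user
// closure has to be serialized and shipped across the cluster.
Set<String> workAreas = instructions.aggregate(Aggregators.distinct("workArea"));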
How can I register MapWithStateRDDRecord in Kryo? When I try to do
sparkConfiguration.registerKryoClasses(Array(classOf[org.apache.spark.streaming.rdd.MapWithStateRDDRecord]))
I get an error:
class MapWithStateRDDRecord in package rdd cannot be accessed in package org.apache.spark.streaming.rdd
[error] classOf[org.apache.spark.streaming.rdd.MapWithStateRDDRecord]
I'd like to make sure that all serialization is done via Kryo, so I set SparkConf().set("spark.kryo.registrationRequired", "true"). With this setting enabled, I get exceptions at runtime: java.lang.IllegalArgumentException (Class is not registered: org.apache.spark.streaming.rdd.MapWithStateRDDRecord).
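One possible workaround (a sketch, not a confirmed fix): since MapWithStateRDDRecord is private to the org.apache.spark.streaming.rdd package, classOf[] cannot name it at compile time, but the Class object can still be obtained reflectively and registered:

// Look the class up by name at runtime instead of referencing it statically,
// which bypasses the package-private access check at compile time.
sparkConfiguration.registerKryoClasses(Array(
  Class.forName("org.apache.spark.streaming.rdd.MapWithStateRDDRecord")))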
I have a grammar with a semantic predicate in a subrule that requires initialization in the invoking rule in order to execute properly, e.g.:
decl_specifier_seq
@init {
    // some initialization required by a semantic predicate
}
    : decl_specifier+
    ;

decl_specifier
    : storage_class_specifier // auto, register, static, extern, mutable
    | {/*semantic predicate requiring the initialization*/}? type_specifier
    | function_specifier // inline, virtual, explicit
    ;
But some tests show that the semantic predicate throws a NullPointerException, because it is evaluated before the initialization in the @init{} block of the invoking rule has run.
After checking the generated parser code, I found that there is another method containing my semantic predicate:
private boolean decl_specifier_sempred(Decl_specifierContext _localctx, int predIndex)
It seems that this method is called before my @init{} block has done the initialization. Is this a bug or by design? The exception contains the name of the method above:
Exception in thread "main" java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.antlr.v4.runtime.misc.TestRig.process(TestRig.java:249)
at org.antlr.v4.runtime.misc.TestRig.process(TestRig.java:211)
at org.antlr.v4.runtime.misc.TestRig.main(TestRig.java:143)
Caused by: java.lang.NullPointerException
at cppParser.CPPProcessorParser.decl_specifier_sempred(CPPProcessorParser.java:10989)
at cppParser.CPPProcessorParser.sempred(CPPProcessorParser.java:10853)
at org.antlr.v4.runtime.atn.SemanticContext$Predicate.eval(SemanticContext.java:119)
at org.antlr.v4.runtime.atn.ParserATNSimulator.evalSemanticContext(ParserATNSimulator.java:1295)
at org.antlr.v4.runtime.atn.ParserATNSimulator.execATN(ParserATNSimulator.java:539)
at org.antlr.v4.runtime.atn.ParserATNSimulator.adaptivePredict(ParserATNSimulator.java:415)
at cppParser.CPPProcessorParser.cppCompilationUnit(CPPProcessorParser.java:330)
... 7 more
The exception is encountered before the @init{} block is called.
ANTLR 4 determines the behavior of predicates based on whether or not they are "context sensitive". Context-sensitive predicates use the $ syntax to reference a parameter, label, local, or rule/token defined in the current rule. It appears that in your case you are defining and initializing state information outside of the standard ANTLR syntax, so ANTLR has no way to know the predicate is context sensitive. There are two ways to address this issue:
Define one or more of the state variables used in the predicate in a locals block for the rule instead of in a @members block.
Add a reference to $ctx inside a comment in the predicate. For example, you could add /*$ctx*/ at the end of the predicate.
If a context sensitive predicate is encountered but no context information is available (as is the case for your code), the predicate is assumed to be true.
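Applied to the grammar above, option 2 is just a comment added inside the existing predicate (nothing else changes):

decl_specifier
    : storage_class_specifier
    | {/*semantic predicate requiring the initialization*/ /*$ctx*/}? type_specifier
    | function_specifier
    ;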