How do I load spring-data(-neo4j)-repositories lazily? - cdi

I am using spring-data-neo4j (standalone) in my JavaEE7 application as a nice Neo4j OGM.
For the time being, I am trying to integrate spring-data-neo4j repositories into my project via @Autowired.
public interface UserRepository extends GraphRepository<User> {}
I have started writing some JUnit tests, which test beans that themselves use these repositories. Everything works fine so far.
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = { "/spring/application-context.xml" })
@Transactional
@Import(NeighborinoNeo4jConfiguration.class)
public class UserFactoryTest {
    private UserFactory userFactory;
    ...
}
@Named
@Stateless
public class UserFactory {
    @Autowired
    private UserRepository userRepo;
    ...
}
Now I want to integrate these new repository classes into my JavaEE7 application, which I am deploying to a WildFly 8.1.
Adding the mentioned UserRepository to my application and deploying it results in the following error:
javax.enterprise.inject.UnsatisfiedResolutionException: Unable to resolve a bean for 'org.springframework.data.neo4j.support.mapping.Neo4jMappingContext' with qualifiers [@javax.enterprise.inject.Default(), @javax.enterprise.inject.Any()].
at org.springframework.data.neo4j.repository.cdi.Neo4jCdiRepositoryExtension.createRepositoryBean(Neo4jCdiRepositoryExtension.java:109)
at org.springframework.data.neo4j.repository.cdi.Neo4jCdiRepositoryExtension.afterBeanDiscovery(Neo4jCdiRepositoryExtension.java:83)
...
To make myself clear: just adding this new interface to my source code and deploying it results in this error in the application server. The app deploys just fine without this repository.
As far as I can see, Neo4jCdiRepositoryExtension.createRepositoryBean() runs too early. I have my own @ApplicationScoped bean "Application" which, without this repository in the source code, does the Spring configuration. But with this repository added, this @ApplicationScoped bean "Application" is not executed at all; I assume the UnsatisfiedResolutionException occurs because the Spring configuration was not done before Neo4jCdiRepositoryExtension ran. I guess my problem could be solved by having the repository initialization happen after my "Application" bean.
So... how do I load spring-data-neo4j repositories lazily?
Hint #1: @NoRepositoryBean makes the app deployable again. Of course, I then cannot use the UserRepository.
import org.springframework.data.repository.NoRepositoryBean;
@NoRepositoryBean
public interface UserRepository extends GraphRepository<User> {}
Hint #2: @Lazy does not help; same error.
import org.springframework.context.annotation.Lazy;
@Lazy
public interface UserRepository extends GraphRepository<User> {}
versions:
pom.xml:
<spring.version>4.0.7.RELEASE</spring.version>
<spring.data.neo4j.version>3.2.0.RELEASE</spring.data.neo4j.version>

I was finally able to develop a workaround. In short, I stopped using @Autowired altogether and fell back to using javax.inject.Inject etc., and I am bootstrapping spring-data-neo4j in the good old-fashioned ClassPathXmlApplicationContext way.
The main difficulty was finding out how to build a spring-data-neo4j repository on my own. It is easy.
// The repo you want to have an instance of.
@NoRepositoryBean // Has to be here to avoid the UnsatisfiedResolutionException from the question.
public interface UserRepository extends GraphRepository<User> {
}
(Hint: "#NoRepositoryBean" is required on your repository-interfaces. Even tough I removed "#EnableNeo4jRepositories(...)".)
I assume you have a running Spring application context:
final ClassPathXmlApplicationContext applicationContext = new ClassPathXmlApplicationContext(
        "spring/application-context.xml");
final Neo4jTemplate neo4jTemplate = applicationContext.getBean(Neo4jTemplate.class);
final Neo4jMappingContext neo4jMappingContext = applicationContext.getBean(Neo4jMappingContext.class);
Next, create a GraphRepositoryFactory, which is needed to build the repositories:
org.springframework.data.neo4j.repository.GraphRepositoryFactory graphRepositoryFactory = new GraphRepositoryFactory(neo4jTemplate, neo4jMappingContext);
UserRepository userRepository = graphRepositoryFactory.getRepository(UserRepository.class);
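One way to make the manually built repository injectable elsewhere in the EE application is to wrap this bootstrap in a CDI producer. The following is only a rough sketch under the setup above; the class name RepositoryProducer is illustrative and not part of my project:
import javax.annotation.PostConstruct;
import javax.annotation.PreDestroy;
import javax.enterprise.context.ApplicationScoped;
import javax.enterprise.inject.Produces;

import org.springframework.context.support.ClassPathXmlApplicationContext;
import org.springframework.data.neo4j.repository.GraphRepositoryFactory;
import org.springframework.data.neo4j.support.Neo4jTemplate;
import org.springframework.data.neo4j.support.mapping.Neo4jMappingContext;

// Rough sketch: bootstrap Spring once and hand out repositories via CDI.
@ApplicationScoped
public class RepositoryProducer {

    private ClassPathXmlApplicationContext springContext;

    @PostConstruct
    void start() {
        springContext = new ClassPathXmlApplicationContext("spring/application-context.xml");
    }

    @Produces
    public UserRepository userRepository() {
        // Build the repository exactly as shown above.
        GraphRepositoryFactory factory = new GraphRepositoryFactory(
                springContext.getBean(Neo4jTemplate.class),
                springContext.getBean(Neo4jMappingContext.class));
        return factory.getRepository(UserRepository.class);
    }

    @PreDestroy
    void stop() {
        springContext.close();
    }
}
Beans can then use @Inject UserRepository as usual, while the Spring context stays an implementation detail of the producer.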

Related

How to use Micronaut & dependency injection in a single-file Groovy script?

I want to use Micronaut from a Groovy script. It seems that annotations such as @Inject and @PostConstruct are not processed.
Here is the code I tried:
#!/usr/bin/env nix-shell
#!nix-shell -i groovy
@Grapes([
    @Grab('ch.qos.logback:logback-classic'),
    @Grab('io.micronaut:micronaut-runtime')
])
package org.sdf // NPE without package

import io.micronaut.runtime.Micronaut

import javax.inject.*
import javax.annotation.*

@Singleton
class Component {
}

@Singleton
class App implements Runnable {
    @Inject
    Component comp

    @Override
    @PostConstruct
    public void run() {
        // Never runs
        assert this.comp != null
        assert false
    }
}

static void main(String... args) {
    Micronaut.run(App, args);
}
It doesn't run the post-construct method and logs this:
22:17:43.669 [main] DEBUG i.m.context.DefaultBeanContext - Resolved bean candidates [] for type: interface io.micronaut.runtime.EmbeddedApplication
22:17:43.671 [main] INFO io.micronaut.runtime.Micronaut - No embedded container found. Running as CLI application
How can I use Micronaut with dependency injection in a single-file Groovy script?
You would have to compile all of the code in your script with Micronaut's annotation processors configured on the compile-time classpath for your script. Technically that could be done with a single script, but as a practical matter I don't think many folks will have a good reason to do that or to build support for making it easier.
FYI: not an answer to your question as asked, but one simple alternative approach is not to define all of your bean-related classes in a single script. Instead, define them in their own separate source files in a project configured to build with Maven or Gradle using our annotation processors, and then consume those classes from your script using @Grab, as you do for other dependencies.
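As a rough sketch of that alternative (the names com.example.beans and Component are made up here), the bean lives in its own source file inside a small project that has Micronaut's micronaut-inject-java annotation processor configured, gets published, and is then pulled into the script with @Grab like any other dependency:
// Hypothetical pre-built bean, compiled in its own Maven/Gradle project with
// the micronaut-inject-java annotation processor on the compile classpath.
package com.example.beans;

import javax.inject.Singleton;

@Singleton
public class Component {
    public String greet() {
        return "hello from a pre-built Micronaut bean";
    }
}
The script itself would then only @Grab the published artifact and look the bean up from ApplicationContext.run(), instead of declaring @Singleton classes inline.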

How to use the strategy pattern with managed objects

I process messages from a queue. I use data from the incoming message, for example origin and type, to determine which class to use to process the message. I would use the combination of origin and type to look up an FQCN and use reflection to instantiate an object to process the message. At the moment these processing objects are all simple POJOs that implement a common interface; hence I am using a strategy pattern.
The problem I am having is that all my external resources (mostly databases accessed via JPA) are injected (@Inject), and when I create the processing object as described above, all these injected objects are null. The only way I know to populate these injected resources is to make each implementation of the interface a managed bean by adding @Stateless. This alone does not solve the problem, because the injected members are only populated if the class implementing the interface is itself injected (i.e. container-managed) as opposed to being created by me.
Here is a made-up example (sensitive details changed):
public interface MessageProcessor
{
    public void processMessage(String xml);
}
@Stateless
public class VisaCreateClient implements MessageProcessor
{
    @Inject private DAL db;
    …
}
public class MasterCardCreateClient implements MessageProcessor…
In the database there is an entry "visa.createclient" = "fqcn.VisaCreateClient", so if the message origin is "Visa" and the type is "Create Client" I can look up the appropriate processing class. If I use reflection to create VisaCreateClient, the db variable is always null. Even if I add @Stateless and use reflection, the db variable remains null. It's only when I inject VisaCreateClient that the db variable gets populated, like so:
@Stateless
public class QueueReader
{
    @Inject VisaCreateClient visaCreateClient;
    @Inject MasterCardCreateClient masterCardCreateClient;
    @Inject … many more times

    private Map<String, MessageProcessor> processors...

    private void init()
    {
        processors.put("visa.createclient", visaCreateClient);
        processors.put("mastercard.createclient", masterCardCreateClient);
        … many more times
    }
}
Now I have dozens of message processors, and if I have to inject each implementation and then register it in the map, I'll end up with dozens of injections. Also, should I add more processors, I have to modify the QueueReader class to add the new injections and restart the server; with my old code I merely had to add an entry to the database and deploy the new processor on the classpath - I didn't even have to restart the server!
I have thought of two ways to resolve this:
Add an init(DAL db, OtherResource or, ...) method to the interface that gets called right after the message processor is created with reflection, passing in the required resources. The resources themselves were injected into the QueueReader (a rough sketch follows below).
Add an argument to processMessage(String xml, Context context), where Context is just a map of the resources that were injected into the QueueReader.
But does this approach mean that I will be using the same instance of the DAL object for every message processor? I believe it would, and as long as there is no state involved I believe it is OK - any and all transactions will be started outside of the DAL class.
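A rough sketch of the first option, just to illustrate what I mean (fqcnFromDatabase and the or variable are hypothetical names):
// Option 1 sketch: the interface gains an init method, and the QueueReader
// hands over the resources that were injected into it after creating the
// processor reflectively.
public interface MessageProcessor
{
    void init(DAL db, OtherResource or);
    void processMessage(String xml);
}

// inside QueueReader, which has DAL db and OtherResource or injected:
MessageProcessor processor =
        (MessageProcessor) Class.forName(fqcnFromDatabase)
                                .getDeclaredConstructor()
                                .newInstance();
processor.init(db, or);
processor.processMessage(xml);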
So my question is: will my approach work? What are the risks of doing it that way? Is there a better way to use the strategy pattern to dynamically select an implementation when the implementation needs access to container-managed resources?
Thanks for your time.
For a similar problem I used an extension of the processor interface to decide which type of data it can handle. Then you can inject all variants of the handler via Instance and simply filter over them:
public interface MessageProcessor
{
    public boolean canHandle(String xml);
    public void processMessage(String xml);
}
And in your QueueReader:
@Inject
private Instance<MessageProcessor> allProcessors;

public void handleMessage(String xml) {
    MessageProcessor processor = StreamSupport.stream(allProcessors.spliterator(), false)
            .filter(proc -> proc.canHandle(xml))
            .findFirst()
            .orElseThrow(...);
    processor.processMessage(xml);
}
This does not add processors to a running server, but to add a new processor you simply implement the interface and deploy.
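As a hedged illustration of that point (the class name AmexCreateClient is made up; DAL is the type from the question), a new processor only has to implement the extended interface, and the Instance<MessageProcessor> lookup picks it up without touching the QueueReader:
import javax.ejb.Stateless;
import javax.inject.Inject;

// Hypothetical additional processor: as a container-managed bean its
// resources are injected normally, and the Instance<MessageProcessor>
// lookup in the QueueReader discovers it automatically.
@Stateless
public class AmexCreateClient implements MessageProcessor {

    @Inject
    private DAL db;

    @Override
    public boolean canHandle(String xml) {
        // Assumption: origin and type can be derived from the payload.
        return xml.contains("amex") && xml.contains("createclient");
    }

    @Override
    public void processMessage(String xml) {
        // Process the message using the injected resources.
    }
}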

why can't my spring-cloud-stream mockito test Autowire my Processor?

I'm trying to create tests for my spring-cloud-stream project. I've created my own BizSyncProcessor interface instead of using the default Processor, which seems to be what all the documentation uses. I've done this kind of project with tests before, but I can't remember whether I used Mockito at the same time, so I'm wondering if that's the issue, because I'm using @RunWith(MockitoJUnitRunner.class) instead of @RunWith(SpringRunner).
I also had similar problems when building the actual app, before I included the rabbit implementation as a dependency in maven.
IntelliJ flags an error on the @Autowired BizSyncProcessor, saying 'no beans of type BizSyncProcessor could be found'. However, I'm able to run the test, so it compiles, but bizSyncProcessor is null when the test runs.
I'm including Mockito because the handler that listens for the message makes a call to another service (the SFISClient), so I'm mocking out that call.
Here's my test:
@RunWith(MockitoJUnitRunner.class)
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
@EnableAutoConfiguration
@Configuration
@EnableBinding(BizSyncProcessor.class)
public class UpdatedBusinessHandlerTest {

    @Autowired
    private BizSyncProcessor bizSyncProcessor;

    @Autowired
    private MessageCollector messageCollector;

    @Mock
    SFISClient sfisClient;

    @InjectMocks
    UpdatedBusinessHandler updatedBusinessHandler;

    @Test
    public void testWiring() throws Exception {
        UpdatedBusinessAlert updatedBusinessAlert = new UpdatedBusinessAlert();
        updatedBusinessAlert.setBusinessId(UUID.randomUUID());
        Message<UpdatedBusinessAlert> updatedBusinessAlertMessage = MessageBuilder.withPayload(updatedBusinessAlert).build();
        bizSyncProcessor.writeUpdatedBusinessIds().send(updatedBusinessAlertMessage);
        Message<BusinessFlooringSummary> businessFlooringSummaryMessage = (Message<BusinessFlooringSummary>) messageCollector.forChannel(bizSyncProcessor.writeFlooringSummaries()).poll();
        BusinessFlooringSummary businessFlooringSummary = businessFlooringSummaryMessage.getPayload();
        assertNotNull(businessFlooringSummary);
    }
}
@SpringBootTest and everything Spring-based is not going to work in your case because you don't use @RunWith(SpringRunner). There is simply nothing that can trigger those Spring hooks.
On the other hand, there is no reason to use MockitoJUnitRunner. You can simply rely on @MockBean instead for your SFISClient: https://docs.spring.io/spring-boot/docs/2.1.1.RELEASE/reference/htmlsingle/#boot-features-testing-spring-boot-applications-mocking-beans
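A minimal sketch of what I mean, keeping your BizSyncProcessor binding and SFISClient collaborator (exact annotations may vary with your Boot version):
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.boot.test.mock.mockito.MockBean;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.test.binder.MessageCollector;
import org.springframework.test.context.junit4.SpringRunner;

// Sketch only: SpringRunner drives the Spring Boot test context, and
// @MockBean replaces the @Mock/@InjectMocks pair from the question.
@RunWith(SpringRunner.class)
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
@EnableBinding(BizSyncProcessor.class)
public class UpdatedBusinessHandlerTest {

    @Autowired
    private BizSyncProcessor bizSyncProcessor;

    @Autowired
    private MessageCollector messageCollector;

    @MockBean
    private SFISClient sfisClient; // Spring injects this mock into the handler

    // testWiring() stays the same as in the question
}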

Connect XPage with OpenOffice

I have a button on an XPage where I want to connect to a remote OpenOffice instance. OpenOffice is started and is listening for a socket connection.
The onclick event of the button runs the following SSJS:
oo = new com.test.OpenOffice();
oo.init("host=127.0.0.1,port=8107");
oo.openFile("C:\\TEMP\\Test.odt");
The code raises an exception java.lang.IllegalStateException: NotesContext not initialized for the thread.
The exception is raised within the method init of the class OpenOffice.
The relevant parts of the OpenOffice class are the following:
public class DHOpenOffice implements Serializable {
    private static final long serialVersionUID = -7443191805456329135L;
    private XComponentContext xRemoteContext;
    private XMultiComponentFactory xMCF;
    private XTextDocument oTextDocument;

    public DHOpenOffice() {
        xRemoteContext = null;
        xMCF = null;
        oTextDocument = null;
    }

    public void init(String hostAdr) throws java.lang.Exception {
        xRemoteContext = null;
        XComponentContext xLocalContext = Bootstrap.createInitialComponentContext(null);
        XUnoUrlResolver xUrlResolver = UnoUrlResolver.create(xLocalContext);
        String sConnect = "uno:socket," + hostAdr + ",tcpNoDelay=0;urp;StarOffice.ServiceManager";
        Object context = xUrlResolver.resolve(sConnect);
        xRemoteContext = UnoRuntime.queryInterface(XComponentContext.class, context);
        xMCF = xRemoteContext.getServiceManager();
    }
The code line Object context = xUrlResolver.resolve(sConnect); is the one that raises the exception.
Why is this happening? What is the reason for this exception, and how can I resolve the situation?
N.B.: The class code runs smoothly in a standalone application. The error occurs only when the code is started from SSJS.
It looks like a threading issue. There are a number of things you can go and try:
Wrap the whole interaction into a custom class and use it from a managed bean instead of calling it from SSJS (a rough sketch follows below)
Make sure not to hand over any Notes objects into the custom class, only your own
Check if the Open Document Toolkit would be sufficient for the operations you are interested in, so you don't need to run OO at all
Let us know how it goes.
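A rough sketch of the first suggestion (the bean name OpenOfficeBean and the openDocument method are illustrative; the bean would be registered in faces-config.xml and called from SSJS instead of instantiating the class there):
// Sketch only: a serializable wrapper used as a managed bean so that SSJS
// never touches the UNO classes directly.
public class OpenOfficeBean implements java.io.Serializable {
    private static final long serialVersionUID = 1L;

    public void openDocument(String hostAdr, String path) throws Exception {
        DHOpenOffice office = new DHOpenOffice(); // class from the question
        office.init(hostAdr);
        office.openFile(path); // assuming openFile exists, as used in the SSJS snippet
    }
}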
Update
Try to get outside the standard XPages cycle. One way is to deploy a custom plug-in servlet:
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.HttpSession;

public class OpenOfficeServlet extends HttpServlet {
    // Your code goes here
}
You need to get the plugin.xml right:
<?xml version="1.0" encoding="UTF-8"?>
<?eclipse version="3.4"?>
<plugin>
   <extension point="org.eclipse.equinox.http.registry.servlets">
      <servlet alias="/ooproxy" class="com.yourcompany.OpenOfficeServlet" />
   </extension>
</plugin>
Then you could, e.g., post a JSON structure or a serializable Java object with the data to the servlet and process it there (async if necessary). You deploy such a plug-in using the updatesite.nsf.
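A hedged sketch of what the servlet body could look like (the parameter names host and path are made up; error handling is kept minimal):
import java.io.IOException;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Sketch only: the servlet receives the connection string and file path from
// the XPage and runs the UNO calls outside the XPages request processing.
public class OpenOfficeServlet extends HttpServlet {

    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        String host = req.getParameter("host"); // e.g. "host=127.0.0.1,port=8107"
        String path = req.getParameter("path"); // e.g. "C:\\TEMP\\Test.odt"
        try {
            DHOpenOffice office = new DHOpenOffice(); // class from the question
            office.init(host);
            office.openFile(path); // assuming openFile exists, as in the SSJS snippet
            resp.setStatus(HttpServletResponse.SC_OK);
        } catch (Exception e) {
            throw new ServletException(e);
        }
    }
}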
Thanks to the answer of @stwissel I was able to solve the problem (he pointed me in the right direction).
I could solve the problem with a simple OSGi plug-in. The servlet approach solved the problem too, but for me the OSGi plug-in was easier to use.
So these are the steps to create the plug-in:
Start a new plug-in project.
Copy the OpenOffice JAR files into the project and add them to the build path.
Copy the custom class that uses the UNO API into the plug-in.
Create a feature project for the plug-in.
Create an update site.
Deploy the plug-in via the update site.
The following sites were also quite helpful:
Creating an XPages Library
Wrap an existing JAR file into a plug-in

Error - None of the constructors found with 'Orchard.Environment.AutofacUtil.DynamicProxy2.ConstructorFinderWrapper'

I have a custom module, Module1. In this module, I am referencing another custom module, Module2. Everything was working fine last week.
I did a fresh re-install of Orchard this morning. Since then, I have been getting this error.
None of the constructors found with 'Orchard.Environment.AutofacUtil.DynamicProxy2.ConstructorFinderWrapper' on type 'Module1' can be invoked with the available services and parameters: Cannot resolve parameter 'Module2' of constructor 'Void .ctor(...)'.
Any idea how to fix this error?
Thanks.
That means that an implementation of some interface could not be found. Several things can have happened: a module may have failed to compile, or you forgot to make an interface derive from IDependency.
I know the post is quite old now, but just to add another possible mistake that can cause the described problem... here is mine.
I simply forgot to enable the referenced module from the dashboard.
Of course, that didn't prevent me from adding a project reference and module dependency, with the code compiling perfectly.
The point is, my referenced module doesn't contain any content type definition. It is just a module intended to collect some functionality and common utilities. That's why I forgot to enable it.
Cheers.
You can get this error if you manually enabled your modules.
If so, fix it by deleting App_Data\cache.dat and then recycling the app pool.
I had the same issue. It seems that I referenced the concrete class and not the interface in my constructor.
public OrderService(
    IRepository<Order> orderRepository,
    ProductService productService,
    ProductCategoryService productCategoryService
)
Instead of
public OrderService(
    IRepository<Order> orderRepository,
    IProductService productService,
    IProductCategoryService productCategoryService
)
The checklist is:
The interface derives from IDependency.
The implementation derives from the interface.
The constructor references the interface.
Build all and check that all referenced modules compile.
Enable the module in the admin panel.
example:
public class myController : Controller {
    private readonly IMyService _myService;

    public myController(
        IMyService myService
    ) {
        _myService = myService;
    }
}

public interface IMyService : IDependency
{
    int GetOne();
}

public class MyService : IMyService
{
    public MyService()
    {
        // init code
    }

    public int GetOne()
    {
        return 1;
    }
}
