@IfProfileValue vs @ActiveProfiles in the context of Spring test

I am referring to Spring test and the following annotations:
@IfProfileValue
@ActiveProfiles
I currently use @ActiveProfiles in my application, and I recently discovered the existence of the @IfProfileValue annotation, which seems to provide similar functionality.
Can someone please explain the differences between these two annotations, perhaps by providing usage examples that contrast the two?

As stated in the Javadoc, @IfProfileValue is used to indicate that a test is enabled for a specific testing profile or environment.
@ActiveProfiles, on the other hand, is used to declare which active bean definition profiles should be used when loading an ApplicationContext for test classes.
In other words, you use @IfProfileValue to control whether a test class or test method will be executed or skipped, and you use @ActiveProfiles to set the active bean definition profiles that will be used to load the ApplicationContext for your test.
Please note that @IfProfileValue was introduced in Spring Framework 2.0, long before the notion of bean definition profiles, and @ActiveProfiles was first introduced in Spring Framework 3.1.
Both annotations contain the term profile, but they are actually completely unrelated!
The term profile is perhaps misleading when considering the semantics of @IfProfileValue. The key is to think about test groups (like those in TestNG) instead of profiles. See the examples in the Javadoc for @IfProfileValue. Here's an excerpt:
#IfProfileValue(name = "test-groups", values = { "unit-tests", "integration-tests" })
public void testWhichRunsForUnitOrIntegrationTestGroups() {
// ...
}
The above test method would be executed if you set the test-groups system property (e.g., -Dtest-groups=unit-tests or -Dtest-groups=integration-tests).
The Context configuration with environment profiles section of the Testing chapter in the Spring Reference manual provides detailed examples of how to use @ActiveProfiles.
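For contrast, here is a minimal sketch of an @ActiveProfiles test (the AppConfig configuration class and the "dev" profile are hypothetical):

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.test.context.ActiveProfiles;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = AppConfig.class) // hypothetical @Configuration class
@ActiveProfiles("dev") // only beans matching @Profile("dev") (or no profile) are loaded
public class DevProfileTests {

    @Test // this test always runs; @ActiveProfiles never skips tests
    public void contextLoadsWithDevBeans() {
        // assert against dev-profile beans ...
    }
}

Note that unlike @IfProfileValue, this annotation never causes the test to be skipped; it only changes which bean definitions are loaded.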
Regards,
Sam (author of the Spring TestContext Framework)

Related

Quarkus / CDI and "java config" DI definitions

I just started a Quarkus proof of concept. The container start time is amazing!
Right now I'm working on the dependency injection part and figuring out the options.
https://quarkus.io/blog/quarkus-dependency-injection/
My preferences are:
I prefer constructor injection. (This has been going ok).
I prefer "java config" so I can follow the "Composition Root" pattern of putting all my application dependency injections in a common place. (See https://blog.ploeh.dk/2011/07/28/CompositionRoot/ )
With Spring DI, this is done with the
org.springframework.context.annotation.Configuration
annotation, declaring the beans there.
That is, I prefer not to place @ApplicationScoped annotations all over my classes.
Does CDI/Quarkus support a "java config" model? The reason I ask about Quarkus specifically is that I read it has a limited CDI implementation.
To quote the Quarkus blog: "Our primary goal was to implement a supersonic build-time oriented DI solution compatible with CDI. This would allow users to continue using CDI in their applications but also leverage Quarkus build-time optimizations. However, ArC is not a full CDI implementation verified by the TCK - see also the list of supported features and the list of limitations."
So my question isn't solely a CDI question.
I've tried different internet search terms, but they keep showing me Spring links. :(
You should create a CDI bean that produces your beans; this is the standard CDI approach to what Spring calls Java configuration.
So, something like this:
import javax.enterprise.context.ApplicationScoped;
import javax.enterprise.inject.Produces;

@ApplicationScoped
public class MyConfiguration {

    // Each producer method plays the role of a Spring @Bean method
    @Produces
    public MyBean myBean() {
        return new MyBean();
    }
}
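Injection of the produced bean then works as usual; a minimal sketch (the consuming class is hypothetical):

import javax.enterprise.context.ApplicationScoped;
import javax.inject.Inject;

@ApplicationScoped
public class MyConsumer {

    // CDI resolves this injection point against the @Produces method above
    @Inject
    MyBean myBean;
}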

How to use subflow in Spring Integration

I am working on a project where we use Spring Integration, and we have many flows which together make up the full flow of the system.
Now we need to create a main flow with abstract components that internally call the sub-flows. I found the spring-integration-flow project for creating a subflow: https://github.com/spring-projects/spring-integration-flow/tree/master.
But when I tried to find the latest jar, the newest one was built in 2015 (https://mvnrepository.com/artifact/org.springframework.integration/spring-integration-flow). Now I am confused about whether we should use this project or some other Spring Integration approach.
E.g.:
We have 3 flow files:
1) prepare-file.xml
2) prepare-database.xml
3) enrich-object.xml
which are eventually called like prepare-file.xml --> prepare-database.xml --> enrich-object.xml.
Now we would like to create a master-flow.xml file which shows all the components in a diagram at a very high level.
Thanks,
Nishit C.
Well, that project hasn't had enough interest from the community for a while. And now that most people have stepped away from XML configuration in favor of Java and annotation configuration with Spring Boot on top, such a project isn't attractive any more.
On the other hand, we have provided a Java DSL for Spring Integration flows for several years already: https://docs.spring.io/spring-integration/docs/current/reference/html/#java-dsl
I would say its IntegrationFlow definitions with the sub-flow functionality may serve your requirements; see the sketch below.
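A minimal sketch of that idea, mirroring the three XML files from the question (the handlers are placeholders; real flows would contain your actual endpoints):

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;

@Configuration
public class MasterFlowConfiguration {

    // Lambda-style IntegrationFlow beans get an implicit input channel "<beanName>.input"
    @Bean
    public IntegrationFlow prepareFile() {
        return f -> f.handle((payload, headers) -> payload); // placeholder for prepare-file.xml logic
    }

    @Bean
    public IntegrationFlow prepareDatabase() {
        return f -> f.handle((payload, headers) -> payload); // placeholder for prepare-database.xml logic
    }

    @Bean
    public IntegrationFlow enrichObject() {
        return f -> f.handle((payload, headers) -> payload); // placeholder for enrich-object.xml logic
    }

    // The master flow plays the role of master-flow.xml: each gateway() call
    // sends the message through a sub-flow and waits for its reply
    @Bean
    public IntegrationFlow masterFlow() {
        return IntegrationFlows.from("master.input")
                .gateway("prepareFile.input")
                .gateway("prepareDatabase.input")
                .gateway("enrichObject.input")
                .get();
    }
}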
I understand that this might not be the answer you are looking for, but at least it should give you some food for thought.

How to override UserLocalServiceImpl in Liferay 7 without a service wrapper?

I created a service wrapper for UserLocalServiceImpl and declared a new method inside the service wrapper. But when I explicitly call that method using UserLocalServiceUtil, the compiler cannot resolve it. So kindly tell me how to override UserLocalServiceImpl so that I can define new methods inside it. Thanks in advance.
This doesn't work. You'd change the interface of Liferay's published API and basically be incompatible with any other plugin that assumes Liferay's API.
While you technically have access to all of Liferay's source code and can build a modified version of Liferay introducing this change, it would mean that no marketplace plugin (that uses UserLocalService) would be compatible with your customized version. Any OSGi component can hook into Liferay and get into the call stack for the published API, but no OSGi plugin can extend a published interface so that the original interface ends up with more methods than Liferay's published API.
The best thing you can do if you rely on a separate function call: create your own custom service that makes calls to UserLocalService.
Furthermore, in Liferay 7 you shouldn't use UserLocalServiceUtil any more; rather, get the service dependency properly injected through a @Reference annotation. The *LocalServiceUtil classes are there purely for backwards compatibility and should be used only from *.WAR-style plugins.
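A minimal sketch of that injection, assuming an OSGi Declarative Services component (the component class is hypothetical):

import com.liferay.portal.kernel.service.UserLocalService;
import org.osgi.service.component.annotations.Component;
import org.osgi.service.component.annotations.Reference;

@Component(service = MyUserHelper.class)
public class MyUserHelper {

    // Liferay's OSGi runtime injects the service, including any service wrappers
    @Reference
    private UserLocalService userLocalService;

    public int countUsers() {
        return userLocalService.getUsersCount();
    }
}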
You can do
UserLocalServiceUtil.getService()
and then cast the result to your custom wrapper type. Then you should be able to call the new method.
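For example (the wrapper type and the new method are the hypothetical ones from the question):

// getService() returns the wrapped service, so the cast should succeed
// if your wrapper is the outermost one in the wrapper chain
MyUserLocalServiceWrapper wrapper =
        (MyUserLocalServiceWrapper) UserLocalServiceUtil.getService();
wrapper.myNewMethod();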

How to use arquillian rest extension with mockito

Hope you can help me out with something I've been struggling with for some time now. I'm using Arquillian for my integration testing and it works very nicely. So I thought that instead of testing my persistence layer, I would isolate my REST services layer and test it as well, this time using Arquillian together with the Arquillian REST extension (http://arquillian.org/modules/rest-extension/), which sounds perfect for black-box testing of my services layer, combined with Mockito.
My application consists of several layers, from top to bottom:
ui (javascript)
interaction layer (Rest services using resteasy)
business layer (Stateless EJBs)
persistence layer (DAOs)
Everything gets deployed in a Java EE 6 container and uses that technology.
In order to test the interaction layer which holds the REST services, I'm using the Arquillian REST extension, which avoids the hassle of setting up an HTTP client in my code and invoking the REST call myself. Sounds great, doesn't it?
My REST services have several CDI beans injected which represent the business layer. In order to avoid having to test all layers at once, I want the REST layer to be isolated using mocks. So I tried to use Mockito to mock the injected beans in the REST layer. Unfortunately, my mocks work in my test, but they are not injected into my REST services.
In order to fix this I stumbled upon another Arquillian extension called autodiscover, but I have no clue what to do or configure in order to make the mocks work in my project.
Hope you guys can help me out.
In fact, when running black-box tests like REST calls or Selenium-based tests, neither Mockito nor autodiscover nor even the Byteman extension will help you, because your tests are driven from outside the container and you can rely only on what was deployed.
One alternative is to deploy fake services. For example, if you have a REST endpoint like this:
#Path("rest")
public void MyEndpoint{
#Inject
RealService realService;
#Path("rest-resource")
#GET
public Response methodUnderTest(){
realService.doSomething();
return Response.ok().build();
}
}
then in your test @Deployment you can add a FakeService instead of RealService, where the FakeService is either a CDI alternative or a bean specialization. I have written about Arquillian and mocks on my blog; you can read the post here.
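A minimal sketch of the CDI-alternative approach (class names follow the endpoint example above; the deployment details are assumptions):

import javax.enterprise.inject.Alternative;

import org.jboss.arquillian.container.test.api.Deployment;
import org.jboss.shrinkwrap.api.ShrinkWrap;
import org.jboss.shrinkwrap.api.asset.StringAsset;
import org.jboss.shrinkwrap.api.spec.WebArchive;

// Replaces RealService only when enabled as an alternative in beans.xml
@Alternative
public class FakeService extends RealService {
    @Override
    public void doSomething() {
        // canned behavior so the REST layer can be tested in isolation
    }
}

// In the Arquillian test class:
@Deployment
public static WebArchive createDeployment() {
    return ShrinkWrap.create(WebArchive.class)
            .addClasses(MyEndpoint.class, RealService.class, FakeService.class)
            // enable the alternative for this deployment only
            // (use the fully qualified class name inside <class>)
            .addAsWebInfResource(new StringAsset(
                    "<beans><alternatives>"
                    + "<class>FakeService</class>"
                    + "</alternatives></beans>"), "beans.xml");
}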

How to modularize an Enterprise Application with OSGi and EE6?

I know that there are already some questions related to this topic but I couldn't find a real solution yet.
Currently I am developing applications with EE6, using JPA, CDI, JSF. I would like to take a more modular approach than packaging everything into a WAR or EAR and deploy the whole thing on an Application Server.
I am trying to design my applications as modular as possible by separating a module into 3 maven projects:
API - Contains the interfaces for (stateless) services
Model - Contains the JPA Entities for the specific module
Impl - Contains the implementation of the API, mostly CDI beans
The view logic of every module is currently bundled within one big web project, which is ugly. I already thought of web fragments, but if I spread my bean classes and xhtml files across jar files, I would have to implement a hook so that the resources could be looked up by a parent web application. This kind of solution would at least enable me to have a fourth project per module containing all the view logic related to that module, which is a good start.
What I want is not only that I can have those 4 kinds of projects, but also that every project is hot swappable. This led me to OSGi, which was at first really cool until I realized that the EE6 technologies are not very well supported within an OSGi Container.
JPA
Let's look at JPA first. There are some tutorials[1] around that explain how to make a JPA-enabled OSGi bundle, but none of these tutorials shows how to spread entities across different bundles (the model project of a module). I would want to have, for example, three different modules:
Core
User
Blog
The model project of the blog module has a (compile-time)dependency on the model project of user.
The model project of the user module has a (compile-time)dependency on the model project of core.
How can I make JPA work in such a scenario without having to create a persistence unit for each module's model project? I want one persistence unit that is aware of all entities available at runtime. The model projects containing the entities should of course be hot swappable. Maybe I will need a separate project for every client that imports all the needed entities and contains a persistence.xml with all the necessary configuration. Are there any maven plugins available for building such projects, or other approaches to solve that issue?
CDI
CDI is very nice. I really love it and I don't want to miss it any more! I use CDI extensions like MyFaces CODI and DeltaSpike which are awesome!
I inject my (stateless) services into other services or into the view layer, which is just great. Since my services are stateless, it should not be a problem to use them as OSGi services, but what about CDI integration in OSGi? I found a Glassfish CDI extension[2] that would allow the injection of OSGi services into CDI beans, but I also want my OSGi services to be CDI beans. I am not totally sure how to achieve that; probably I would have to use the BeanManager to instantiate the implementations and then register every implementation for its interface in the ServiceRegistry within a BundleActivator. Is there any standard way of doing that? I would like to avoid any (compile-time) dependencies on the OSGi framework.
I would also like to use my services just like I use them right now, without changing anything (implementations not annotated and injection points not qualified).
There is a JBoss Weld extension/sub-project[3] that seems to target that issue, but it appears to be inactive; I can't find any best practices or how-tos.
How can I leave my implementation as it is but still be able to use OSGi? It would not be a big deal to add an annotation to the implementations, since every implementation is already annotated with a stereotype annotation anyway, but I would like to avoid that.
JSF
As mentioned before I would like to be able to spread my view logic module wise. As far as I know this is not really possible out of the box. Pax Web[4] should solve that somehow, but I am not familiar with it.
I would like to have a project "CoreWeb" in the module "core" that contains a Facelet template, let's call it "template.xhtml". A JSF page in a project called "BlogWeb" in the module "blog" should then be able to reference that template and apply a composition.
To be able to extend the view I would introduce a java interface "Extension" that can be implemented by a specific class of a module. A controller for a view would then inject all implementations of the extension. An extension would for example provide a list of subviews that will be included into a main view.
The described extension mechanism can be implemented easily, but the following requirements must be fulfilled:
When adding new OSGi bundles to the application server, the set of available extensions might change; the new extensions must then be available to the controller of the view.
The subviews(from a separate bundle) which should be included into a main view should be accessible.
The concept of a single host but multiple slice applications in Spring Slices[5] is very interesting, but it seems limited to the Spring DM Server, and the project also appears to be inactive.
Summary
After all the examples and behaviors I have described, I hope you know what I would like to achieve. It's simply an EE6 app that is very dynamic and modularized.
What I am looking for, in the end, is at least documentation on how to get everything running as I would expect, or even better, an already working solution!
[1] http://jaxenter.com/tutorial-using-jpa-in-an-osgi-environment-36661.html
[2] https://blogs.oracle.com/sivakumart/entry/typesafe_injection_of_dynamic_osgi
[3] http://www.slideshare.net/TrevorReznik/weldosgi-injecting-easiness-in-osgi
[4] http://team.ops4j.org/wiki//display/paxweb/Pax+Web
[5] https://jira.springsource.org/browse/SLICE
To answer some of your questions: using a single persistence unit but spreading your entities across multiple bundles is not recommended, but may occasionally work. However, if your entities are so closely related that they need to share a persistence unit, splitting them across modules may not make sense. Also, don't forget you can handle compile-time dependencies by separating the implementation and interface for each entity - interface and implementation need not be in the same bundle.
For dependency injection, you may like Blueprint.
Several implementations are available, and most application servers with enterprise OSGi support provide Blueprint out of the box. It uses XML to add metadata, so the classes themselves won't need any modification.
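For illustration, a minimal Blueprint descriptor (bundle layout and class names are hypothetical) that registers an implementation as an OSGi service without touching the Java code:

<!-- OSGI-INF/blueprint/blueprint.xml -->
<blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0">
    <bean id="userService" class="com.example.user.impl.UserServiceImpl"/>
    <service ref="userService" interface="com.example.user.api.UserService"/>
</blueprint>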
