How to make a ServiceLoader-created class handle container-managed objects - JSF

I'm currently writing a library in which I want the user of my library to implement an interface; my library then calls that implementation.
I'm using ServiceLoader to instantiate the implementation provided by the integrator, and it works just fine. The integrator calls a start() method in my library and eventually gets a result back. Along the way, the implementation supplies the pieces of information my library needs to produce that result.
(I'm deliberately not using CDI or any other DI container because I want a library that can be used anywhere: in a desktop application, a Spring application, an application using Guice...)
Now I'm facing a problem. I'm creating a showcase that uses my own library. It's a web application built with JSF and CDI. When my library instantiates the implementation provided in that web app, I end up with a non-container-managed object. But since this implementation needs to use container-managed objects, I'm stuck: it can never work this way.
Example:
Interface in lib:
public interface Example {
    String getInfo();
}
Implementation in war:
public class ExampleImpl implements Example {

    @Inject
    private ManagedBean bean;

    @Override
    public String getInfo() {
        return bean.getSomethingThatReturnsString();
    }
}
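For context, the library resolves the implementation roughly like this (a simplified sketch, not the exact library code):

ServiceLoader<Example> loader = ServiceLoader.load(Example.class);
Example example = loader.iterator().next(); // instantiated with plain `new`, outside any DI container
String info = example.getInfo();            // bean was never injected, so this throws a NullPointerException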
As you can see, this is a huge problem in the way my library is built, since the bean will always be null... This means no one using a DI container can use my library. I know I can obtain the managed bean through a FacesContext lookup, but more importantly, my library isn't very well designed if you think about it.
So to conclude my question(s):
Is there any way I can make the ServiceLoader use a DI container to instantiate the class?
Does anyone know a better way to fix my problem?
Does anyone know a better way to get the information I need from the integrator without making the integrator implement an interface?
I know this is quite an abstract question, but I'm kinda stuck on this one.
Thanks in advance

Since the implementation of Example is not instantiated by the CDI container, the injection doesn't happen. What you can do is look up the bean manually using the BeanManager. According to the docs, the BeanManager is bound to the JNDI name java:comp/BeanManager. Using the following code you can obtain the BeanManager within your implementation class and look up the dependencies manually:
InitialContext context = new InitialContext();
BeanManager beanManager = (BeanManager) context.lookup("java:comp/BeanManager");
Set<Bean<?>> beans = beanManager.getBeans(YourBean.class, new AnnotationLiteral<Default>() {});
Bean<YourBean> provider = (Bean<YourBean>) beans.iterator().next();
CreationalContext<YourBean> cc = beanManager.createCreationalContext(provider);
YourBean yourBean = (YourBean) beanManager.getReference(provider, YourBean.class, cc);
where YourBean is the dependency you are looking for.
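Applied to the ExampleImpl from the question, a rough sketch could look like this (the lookup helper is just an illustration; adjust error handling to your needs):

import java.util.Set;
import javax.enterprise.context.spi.CreationalContext;
import javax.enterprise.inject.Default;
import javax.enterprise.inject.spi.Bean;
import javax.enterprise.inject.spi.BeanManager;
import javax.enterprise.util.AnnotationLiteral;
import javax.naming.InitialContext;
import javax.naming.NamingException;

public class ExampleImpl implements Example {

    @Override
    public String getInfo() {
        // Resolve the CDI-managed bean lazily instead of relying on @Inject,
        // because this instance was created by ServiceLoader, not by CDI.
        ManagedBean bean = lookup(ManagedBean.class);
        return bean.getSomethingThatReturnsString();
    }

    private static <T> T lookup(Class<T> type) {
        try {
            InitialContext context = new InitialContext();
            BeanManager beanManager = (BeanManager) context.lookup("java:comp/BeanManager");
            Set<Bean<?>> beans = beanManager.getBeans(type, new AnnotationLiteral<Default>() {});
            @SuppressWarnings("unchecked")
            Bean<T> bean = (Bean<T>) beans.iterator().next();
            CreationalContext<T> cc = beanManager.createCreationalContext(bean);
            return type.cast(beanManager.getReference(bean, type, cc));
        } catch (NamingException e) {
            throw new IllegalStateException("BeanManager lookup failed", e);
        }
    }
}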

Related

Ad-Hoc type conversion in JOOQ DSL query

Scenario:
We store some encrypted data in the database as a blob. When reading/saving it, we need to decrypt/encrypt it using an external service.
Because the encryption is handled by a Spring bean that calls an external service, we cannot register it with the code generator the way we would for enums.
I don't want to use dslContext.select(field1, field2.convertFrom).from(TABLE_NAME) because then I'd have to specify every field of the table.
It is convenient to use dslContext.selectFrom(TABLE_NAME), so I wonder if there is any way to register the converter bean so that such a query performs encryption and decryption on the fly.
Thanks
Edit: I ended up using a service to encrypt/decrypt the value when it is actually used. Calling an external service is relatively expensive, and sometimes the value isn't used in the request at all, so it may not make sense to always decrypt the value with a converter when reading from the database.
Because the encryption is handled by a Spring bean that calls an external service, we cannot register it with the code generator the way we would for enums.
Why not? Just because Spring favours dependency injection, and you currently (as of jOOQ 3.15) cannot inject anything into jOOQ Converter and Binding instances, doesn't mean you can't use other means of looking up such a service. Depending on what you have available, you could use some JNDI lookup, or other means to discover that service when needed, from within your Converter.
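For example, a rough sketch of such a Converter that looks up a Spring-managed crypto bean at conversion time (CryptoService is an assumed bean, and the static ApplicationContextHolder is the lookup helper shown in the next section; none of these names are jOOQ APIs):

import org.jooq.Converter;

public class EncryptedStringConverter implements Converter<byte[], String> {

    @Override
    public String from(byte[] databaseObject) {
        if (databaseObject == null) {
            return null;
        }
        // Look up the Spring bean on demand, since nothing can be injected here.
        CryptoService crypto = ApplicationContextHolder.getBean(CryptoService.class);
        return crypto.decrypt(databaseObject);
    }

    @Override
    public byte[] to(String userObject) {
        if (userObject == null) {
            return null;
        }
        CryptoService crypto = ApplicationContextHolder.getBean(CryptoService.class);
        return crypto.encrypt(userObject);
    }

    @Override
    public Class<byte[]> fromType() {
        return byte[].class;
    }

    @Override
    public Class<String> toType() {
        return String.class;
    }
}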
Another option would be to use a ConverterProvider and register your logic inside of that. That wouldn't produce your custom type inside of jOOQ records, but it would apply whenever you convert your records to your custom data type, e.g. using reflection.
How to access Spring Beans without Dependency Injection?
You don't need dependency injection to access your Spring beans. Simply create the following class and you can fetch beans from the static getBean() method:
import org.springframework.beans.BeansException;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
import org.springframework.stereotype.Component;

@Component
public class ApplicationContextHolder implements ApplicationContextAware {

    private static ApplicationContext applicationContext;

    public static <T> T getBean(Class<T> type) {
        return applicationContext.getBean(type);
    }

    @Override
    public void setApplicationContext(ApplicationContext applicationContext) throws BeansException {
        ApplicationContextHolder.applicationContext = applicationContext;
    }
}
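With that in place, any non-managed code (for example the Converter sketched above) can fetch a bean like this (CryptoService is again just an illustrative name):

CryptoService crypto = ApplicationContextHolder.getBean(CryptoService.class);
byte[] encrypted = crypto.encrypt("some secret value");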

How to create a CDI Interceptor which advises methods from a Feign client?

I've been trying to figure out how to intercept methods defined in a Feign client with CDI (1.2) interceptors. I need to intercept the response value the client is returning, and extract data to log and remove some data prior to it being returned to the calling process.
I'm running a Weld 2.3 container, which provides CDI 1.2. In it, I would like to create a CDI interceptor which is triggered every time a call to filter() is made.
public interface MyRepository {

    @RequestLine("POST /v1/data/policy/input_data_filtered")
    JsonNode filter(Body body);
}
and a matching Producer method:
@Produces
public MyRepository repositoryProducer() {
    return Feign.builder()
            .client(new ApacheHttpClient())
            .encoder(new JacksonEncoder(mapper))
            .decoder(new JacksonDecoder(mapper))
            .logger(new Slf4jLogger(MyRepository.class))
            .logLevel(feign.Logger.Level.FULL)
            .target(MyRepository.class, "http://localhost:9999");
}
I've tried the standard CDI interceptor way by creating an @InterceptorBinding and adding it to the interface definition, but that didn't work. I suspect that's because the interceptor must be applied to the CDI bean (proxy) and cannot be defined on an interface. I tried applying it to the repositoryProducer() method, but that too was non-functional.
I've read about the javax.enterprise.inject.spi.InterceptionFactory, which is available in CDI 2.0, but I don't have access to it.
How can I do this in CDI 1.2? Or alternatively, is there a better interceptor pattern I can use that is built into Feign somehow?
The short, somewhat incorrect answer is: you cannot. InterceptionFactory is indeed how you would do it if you could.
The longer answer is something like this:
Use java.lang.reflect.Proxy to create a proxy implementation of the MyRepository interface.
Create an InvocationHandler that performs the interception around whatever methods you want.
Target Feign at that proxy implementation.
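A rough sketch of those steps, adapting the producer from the question (scrub() is a hypothetical helper where your logging and data removal would go):

import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;
import javax.enterprise.inject.Produces;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import feign.Feign;
import feign.httpclient.ApacheHttpClient;
import feign.jackson.JacksonDecoder;
import feign.jackson.JacksonEncoder;

public class MyRepositoryProducer {

    private final ObjectMapper mapper = new ObjectMapper();

    @Produces
    public MyRepository repositoryProducer() {
        // The plain Feign-built client, as in the question.
        MyRepository feignTarget = Feign.builder()
                .client(new ApacheHttpClient())
                .encoder(new JacksonEncoder(mapper))
                .decoder(new JacksonDecoder(mapper))
                .target(MyRepository.class, "http://localhost:9999");

        // Wrap it in a JDK proxy that post-processes the result of filter().
        InvocationHandler handler = (proxy, method, args) -> {
            Object result = method.invoke(feignTarget, args);
            if ("filter".equals(method.getName()) && result instanceof JsonNode) {
                result = scrub((JsonNode) result);
            }
            return result;
        };

        return (MyRepository) Proxy.newProxyInstance(
                MyRepository.class.getClassLoader(),
                new Class<?>[] { MyRepository.class },
                handler);
    }

    private JsonNode scrub(JsonNode response) {
        // Hypothetical: extract data to log and strip fields before returning.
        return response;
    }
}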

JSF Singleton Services/DAO/.. vs ApplicationScope [duplicate]

I'm trying to get used to how JSF works with regard to accessing data (coming from a Spring background).
I'm creating a simple example that maintains a list of users; I have something like:
<h:dataTable value="#{userListController.userList}" var="u">
<h:column>#{u.userId}</h:column>
<h:column>#{u.userName}</h:column>
</h:dataTable>
Then the "controller" has something like
@Named(value = "userListController")
@SessionScoped
public class UserListController {

    @EJB
    private UserListService userListService;

    private List<User> userList;

    public List<User> getUserList() {
        userList = userListService.getUsers();
        return userList;
    }
}
And the "service" (although it seems more like a DAO) has
public class UserListService {

    @PersistenceContext
    private EntityManager em;

    public List<User> getUsers() {
        Query query = em.createQuery("SELECT u from User as u");
        return query.getResultList();
    }
}
Is this the correct way of doing things? Is my terminology right? The "service" feels more like a DAO? And the controller feels like it's doing some of the job of the service.
Is this the correct way of doing things?
Apart from performing business logic the inefficient way in a managed bean getter method, and using a too broad managed bean scope, it looks okay. If you move the service call from the getter method to a #PostConstruct method and use either #RequestScoped or #ViewScoped instead of #SessionScoped, it will look better.
See also:
Why JSF calls getters multiple times
How to choose the right bean scope?
Is my terminology right?
It's okay. As long as you're consistent with it and the code is readable in a sensible way. Only your way of naming classes and variables is somewhat awkward (illogical and/or duplication). For instance, I personally would use users instead of userList, and use var="user" instead of var="u", and use id and name instead of userId and userName. Also, a "UserListService" sounds like it can only deal with lists of users instead of users in general. I'd rather use "UserService" so you can also use it for creating, updating and deleting users.
See also:
JSF managed bean naming conventions
The "service" feels more like a DAO?
It isn't exactly a DAO. Basically, JPA is the real DAO here. Previously, when JPA didn't exist, everyone homegrew DAO interfaces so that the service methods can keep using them even when the underlying implementation ("plain old" JDBC, or "good old" Hibernate, etc) changes. The real task of a service method is transparently managing transactions. This isn't the responsibility of the DAO.
See also:
I found JPA, or alike, don't encourage DAO pattern
DAO and JDBC relation?
When is it necessary or convenient to use Spring or EJB3 or all of them together?
And the controller feels like it's doing some of the job of the service.
I can imagine that it does that in this relatively simple setup. However, the controller is in fact part of the frontend not the backend. The service is part of the backend which should be designed in such way that it's reusable across all different frontends, such as JSF, JAX-RS, "plain" JSP+Servlet, even Swing, etc. Moreover, the frontend-specific controller (also called "backing bean" or "presenter") allows you to deal in a frontend-specific way with success and/or exceptional outcomes, such as in JSF's case displaying a faces message in case of an exception thrown from a service.
See also:
JSF Service Layer
What components are MVC in JSF MVC framework?
All in all, the correct approach would be like below:
<h:dataTable value="#{userBacking.users}" var="user">
<h:column>#{user.id}</h:column>
<h:column>#{user.name}</h:column>
</h:dataTable>
@Named
@RequestScoped // Use @ViewScoped once you bring in ajax (e.g. CRUD)
public class UserBacking {

    private List<User> users;

    @EJB
    private UserService userService;

    @PostConstruct
    public void init() {
        users = userService.listAll();
    }

    public List<User> getUsers() {
        return users;
    }
}
@Stateless
public class UserService {

    @PersistenceContext
    private EntityManager em;

    public List<User> listAll() {
        return em.createQuery("SELECT u FROM User u", User.class).getResultList();
    }
}
You can find a real world kickoff project utilizing the canonical Java EE / JSF / CDI / EJB / JPA practices here: Java EE kickoff app.
See also:
Creating master-detail pages for entities, how to link them and which bean scope to choose
Passing a JSF2 managed pojo bean into EJB or putting what is required into a transfer object
Filter do not initialize EntityManager
javax.persistence.TransactionRequiredException in small facelet application
It is a DAO (well, actually a repository, but don't worry about that difference too much), as it is accessing the database using the persistence context.
You should create a service class that wraps that method and is where the transactions are invoked.
Sometimes the service classes feel unnecessary, but when you have a service method that calls many DAO methods, their use is more warranted.
I normally end up just creating the service, even if it does feel unnecessary, to ensure the patterns stay the same and the DAO is never injected directly.
This adds an extra layer of abstraction making future refactoring more flexible.
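For illustration, such a thin wrapping service could look like this (the DAO and method names are made up):

import javax.ejb.Stateless;
import javax.inject.Inject;

@Stateless // one container-managed transaction spans both DAO calls
public class UserRegistrationService {

    @Inject
    private UserDao userDao;

    @Inject
    private AuditDao auditDao;

    public void register(User user) {
        userDao.save(user);
        auditDao.logRegistration(user); // rolls back together with save()
    }
}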

How can I initialize a Java FacesServlet

I need to run some code when the FacesServlet starts, but as FacesServlet is declared final I cannot extend it and override the init() method.
In particular, I want to write some data to the database during development and testing, after Hibernate has dropped and created the data model.
Is there a way to configure Faces to run some method, e.g. in faces-config.xml?
Or is it best to create a singleton bean that does the initialization?
Use an eagerly initialized application scoped managed bean.
@ManagedBean(eager=true)
@ApplicationScoped
public class App {

    @PostConstruct
    public void startup() {
        // ...
    }

    @PreDestroy
    public void shutdown() {
        // ...
    }
}
(The class and method names don't actually matter; they're free for you to choose. It's all about the annotations.)
This is guaranteed to be constructed after the startup of the FacesServlet, so the FacesContext will be available whenever necessary. This is in contrast to the ServletContextListener suggested in the other answer.
You could implement your own ServletContextListener that gets notified when the web application is started. Since it's container managed, you could inject resources there and do whatever you want to do. The other option is to create a @Singleton EJB with @Startup and do the work in its @PostConstruct method. Usually the ServletContextListener works fine; however, if you have more than one web application inside an EAR and they all share the same persistence context, you may consider using a @Singleton bean.
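A minimal sketch of that singleton startup bean variant:

import javax.annotation.PostConstruct;
import javax.ejb.Singleton;
import javax.ejb.Startup;

@Singleton
@Startup // eagerly instantiated when the application starts
public class StartupBean {

    @PostConstruct
    public void init() {
        // seed development/test data here
    }
}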
You may want to use some aspects here. Just set the advice to run before void init(ServletConfig servletConfig), which is where FacesServlet acquires the factory instances it needs. Maybe this will help you.

Mapping to an internal type with AutoMapper for Silverlight

How do I configure my application so AutoMapper can map to internal types and/or properties in Silverlight 5? For example, I have the following type:
internal class SomeInfo
{
    public String Value { get; set; }
}
I try to call Mapper.DynamicMap with this type as the destination and I receive the following error at runtime:
Attempt by security transparent method
'DynamicClass.SetValue(System.Object, System.Object)' to access
security critical type 'Acme.SomeInfo' failed.
I've tried instantiating the class first, then passing the instance to DynamicMap as well as changing the class scope to public with an internal setter for the property. I've also marked the class with the [SecuritySafeCritical] attribute. All of these tests resulted in the same error message.
The only way I've been able to get past this is to completely expose the class with public scope and public setters. This is, of course, a problem as I am developing a class library that will be used by other developers, and using "internal" scope is a deliberate strategy to hide implementation details as well as make sure code is used only as intended (following the no-public-setters concept from DDD and CQRS).
That said, what can I do to make it so AutoMapper can work with internal types and/or properties?
(Note: The class library is built for SL5 and used in client apps configured to run out-of-browser with elevated trust.)
This is more of a Silverlight limitation - it does not allow reflection on private/protected/internal members from outside assemblies, see:
http://msdn.microsoft.com/en-us/library/stfy7tfc(VS.95).aspx
Simply put, AutoMapper can't access the internal members of your assembly.
