How can I override the Docket bean from JHipster's SwaggerConfiguration? I need to add a custom Docket::directModelSubstitute for my API.
To override SwaggerConfiguration, I added a SwaggerConfig class to my project's config/apidocs package that extends SwaggerConfiguration and @Overrides the swaggerSpringfoxDocket bean:
@Configuration
public class SwaggerConfig extends SwaggerConfiguration {
    private final Logger log = LoggerFactory.getLogger(SwaggerConfig.class);

    @Bean
    @Override
    public Docket swaggerSpringfoxDocket(JHipsterProperties jHipsterProperties) {
        // reuse JHipster's default Docket and add the substitutions the API needs, e.g.:
        return super.swaggerSpringfoxDocket(jHipsterProperties)
                .directModelSubstitute(LocalDate.class, String.class);
    }
}
Finally, add SwaggerConfiguration to the exclusions of @EnableAutoConfiguration in App.java:
@EnableAutoConfiguration(exclude = {MetricFilterAutoConfiguration.class, MetricRepositoryAutoConfiguration.class, SwaggerConfiguration.class})
It would be a lot easier if SwaggerConfiguration had @ConditionalOnMissingBean on the swaggerSpringfoxDocket bean.
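For reference, such a guard on the JHipster side would look roughly like this (just a sketch, not the actual JHipster code):

@Bean
@ConditionalOnMissingBean(name = "swaggerSpringfoxDocket")
public Docket swaggerSpringfoxDocket(JHipsterProperties jHipsterProperties) {
    return new Docket(DocumentationType.SWAGGER_2); // JHipster's default Docket configuration would go here
}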
I've coded this aspect:
@Aspect
public class LoggingCacheAspect {

    @Pointcut("call(* javax.cache.integration.CacheLoader.load(..))")
    void cacheLoadCalls() {}

    @Before("cacheLoadCalls()")
    public void beforeCacheCalls() {}
}
Also, I'm using CDI, and I'm trying to figure out how to inject a bean into this aspect.
I guess that just adding an @Inject annotation will not be enough.
Is it possible?
How could I achieve it?
You need to use an interceptor instead of the aspect.
Here is an example:
@InterceptorBinding
@Target({ TYPE, METHOD })
@Retention(RUNTIME)
public @interface CacheLog {
}

@Interceptor
@CacheLog
public class CacheLogInterceptor implements Serializable {

    private static final long serialVersionUID = 1L;

    @Inject
    private YourBean yourBean;

    @AroundInvoke
    public Object cacheLogMethodCall(InvocationContext ctx) throws Exception {
        // @Before part
        yourBean.method();
        ...
        return ctx.proceed();
    }
}

Then annotate the methods you want intercepted:

@CacheLog
public void cacheLoadCalls() {
    ...
}
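Note that a CDI interceptor also has to be activated. Besides listing it in beans.xml, on CDI 1.1+ you can give it a priority, for example:

import javax.annotation.Priority;
import javax.interceptor.Interceptor;

@Interceptor
@CacheLog
@Priority(Interceptor.Priority.APPLICATION)
public class CacheLogInterceptor implements Serializable {
    // same body as above
}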
I'm pretty new to the general procedure of bean injection. I've googled a lot but haven't found a solution to my problem.
Additional information:
Running WildFly 9.0.1 Final
EJB version: 3.1
CDI version: 2.2.16 (SP1)
JSF version: 2.2
import javax.annotation.PostConstruct;
import javax.ejb.Stateless;
import javax.inject.Named;

@Named
@ViewScoped
public class UserEmailSettingsBean extends UserModuleSettingsBean {

    private List<String> store;
    private List<String> selectedStore;

    // getters and setters, some fancy stuff...

    @Override
    public boolean saveProperties() {
        LOG.info("Save called");
        LOG.info(selectedStore.toString());
        LOG.info(store.toString());
        for (String prop : store) {
            getProperties().setProperty(prop, String.valueOf(false));
        }
        for (String selectedProp : selectedStore) {
            LOG.info("selected: " + selectedProp + ":" + getProperties().getProperty(selectedProp) + " -> true");
            getProperties().setProperty(selectedProp, String.valueOf(true));
        }
        super.saveProperties();
        return true;
    }
}
2nd Class:
public abstract class UserModuleSettingsBean implements ModuleSettings {

    private static final long serialVersionUID = 459417872482285085L;

    protected abstract List<String> getPropertiesName();

    @Inject
    private SettingsRepository settingsRepository;

    @Inject
    private SettingsService settingsService;

    private Properties properties = new Properties();

    @Override
    public boolean saveProperties() {
        String username = SecurityContextHolder.getContext().getAuthentication().getName();
        settingsService.store(getProperties(), username);
        return true;
    }
}
The problem is that the settingsService is constructed, but its field settingsRepository is null in my child class.
When my save method is called from UserEmailSettingsBean, getProperties().setProperty() is called with the right values, but it is never stored, as the settingsRepository is null. I believe that is due to a wrong injection for some reason.
Let me know if I need to provide more information ☺
Here is the relevant part of SettingsService:
@Stateless
@TransactionAttribute(TransactionAttributeType.SUPPORTS)
public class SettingsService implements Serializable {

    private static final long serialVersionUID = 1695882717866085259L;

    @Inject
    SettingsRepository settingsRepository;

    //...
}
And here is the SettingsRepository:
@Stateless
public class SettingsRepository extends AbstractBaseRepository<Settings, Long> {

    /**
     * Instantiates a new settings repository.
     */
    public SettingsRepository() {
        super(Settings.class);
    }
}
Just wanted to say that my problem was that I didn't call an init() function on the settingsService to create the properties, so getProperties() was empty.
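In other words, something along these lines (a minimal sketch; init() stands for whatever method the service actually exposes to create or load the properties):

@PostConstruct
public void init() {
    // make sure the service creates the properties before the view calls saveProperties()
    settingsService.init();
}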
I have an application-scoped bean:

@ManagedBean(name = "myController")
@ApplicationScoped
public class MyController implements Serializable {
    ...
    public void allOn() { ... }
}

And I want to call the allOn() method from a Quartz job:
import org.quartz.Job;
import org.quartz.JobExecutionContext;
import org.quartz.JobExecutionException;

public class CronJobAllOn implements Job {

    @Override
    public void execute(JobExecutionContext context) throws JobExecutionException {
        // call allOn() here
    }
}
I tried to pass the FacesContext to the Job class via the JobDataMap:
JobDataMap jobDataMap = new JobDataMap();
jobDataMap.put("facesContext", FacesContext.getCurrentInstance());
JobDetail job = newJob(CronJobAllOn.class)
.usingJobData(jobDataMap)
.withIdentity("job1", "group1")
.build();
But it only throws an IllegalStateException when I try to use it in the CronJobAllOn class:

public void execute(JobExecutionContext context) throws JobExecutionException {
    FacesContext fc = (FacesContext) context.getMergedJobDataMap().get("facesContext");
    MyController test = (MyController) fc.getExternalContext().getApplicationMap().get("MyController");
    test.allOn();
}

How can I call the allOn() method in MyController from a Quartz job?
I found the solution to my problem; the short comment from BalusC put me on the right path.
I switched to TomEE to get CDI.
To use CDI bean injection in my jobs, I had to create my own JobFactory class:
public class CdiJobFactory implements JobFactory {

    @Inject
    @Any
    private Instance<Job> jobs;

    @Override
    public Job newJob(TriggerFiredBundle triggerFiredBundle, Scheduler scheduler) throws SchedulerException {
        final JobDetail jobDetail = triggerFiredBundle.getJobDetail();
        final Class<? extends Job> jobClass = jobDetail.getJobClass();
        for (Job job : jobs) {
            if (job.getClass().isAssignableFrom(jobClass)) {
                return job;
            }
        }
        throw new RuntimeException("Cannot create a Job of type " + jobClass);
    }
}
Then create the scheduler and register the factory (cdiJobFactory is the CDI-managed CdiJobFactory instance, e.g. obtained via @Inject):

Scheduler scheduler = new StdSchedulerFactory().getScheduler();
scheduler.setJobFactory(cdiJobFactory);
After that I was able to inject MyController into the job:

public class CronJobAllOn implements Job {

    @Inject
    private MyController mc;

    @Override
    public void execute(JobExecutionContext context) throws JobExecutionException {
        mc.allOn();
    }
}
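For completeness, scheduling the job itself is plain Quartz; here is a sketch with placeholder identities and an example cron expression:

JobDetail job = JobBuilder.newJob(CronJobAllOn.class)
        .withIdentity("allOnJob", "group1")
        .build();

Trigger trigger = TriggerBuilder.newTrigger()
        .withIdentity("allOnTrigger", "group1")
        .withSchedule(CronScheduleBuilder.cronSchedule("0 0 6 * * ?")) // example: every day at 06:00
        .build();

scheduler.scheduleJob(job, trigger);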
I'm having problems with CDI on Tomcat. This is the relevant part of my code:

public class JPAUtil {

    private static EntityManagerFactory emf = Persistence.createEntityManagerFactory("unit");

    @Produces
    @RequestScoped
    public static EntityManager getEntityManager() {
        return emf.createEntityManager();
    }

    public void close(@Disposes EntityManager em) {
        em.close();
    }
}
My DAO Class:
public class DAO<T> implements Serializable {

    private final Class<T> classe;

    @Inject
    protected EntityManager em;

    public DAO(Class<T> classe) {
        this.classe = classe;
    }
}
and a child class:
public class UserDao extends DAO<User> implements Serializable{
public UserDao() {
super(User.class);
}
}
Because of the generics, I used a producer for the DAO class:

public class DAOFactory {

    @Produces
    @SuppressWarnings({ "rawtypes", "unchecked" })
    public DAO createDAO(InjectionPoint injectionPoint) {
        ParameterizedType type = (ParameterizedType) injectionPoint.getType();
        Class classe = (Class) type.getActualTypeArguments()[0];
        return new DAO(classe);
    }
}
In this example:
public class Test {

    @Inject UserDao userDao;
    @Inject DAO<User> dao;
}
When I use the UserDao, everything works fine, but when I use the DAO<User>, the EntityManager remains null. Does anyone have any idea?
In DAOFactory you instantiate the DAO with the new operator; if you do so, CDI has no chance to inject dependencies into the DAO instance.
In UserDao, by contrast, CDI manages the entity manager injection.
So in DAOFactory you should set the entity manager manually on the newly created DAO instance.
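For example, a minimal sketch of that approach (the setEntityManager setter on DAO is hypothetical; you would have to add it):

public class DAOFactory {

    @Inject
    private EntityManager em; // provided by the JPAUtil producer

    @Produces
    @SuppressWarnings({ "rawtypes", "unchecked" })
    public DAO createDAO(InjectionPoint injectionPoint) {
        ParameterizedType type = (ParameterizedType) injectionPoint.getType();
        Class classe = (Class) type.getActualTypeArguments()[0];
        DAO dao = new DAO(classe);
        dao.setEntityManager(em); // hypothetical setter; CDI never injects into an instance created with 'new'
        return dao;
    }
}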
I have an EAR package that contains a web module and an EJB. The EJB currently exposes its contents to a local app client via a Remote interface:
@Stateless
public class CoreEJB implements CoreEJBRemote {

    @PersistenceContext(unitName = "CoreWeb-ejbPU")
    private EntityManager em;

    @Override
    public void packageProcess(String configFileName) throws Exception {
        // Process logic
    }

    @Override
    public <T> T create(T t) {
        em.persist(t);
        return t;
    }

    @Override
    public <T> T find(Class<T> type, Object id) {
        return em.find(type, id);
    }

    ...
}
The Remote interface is not in the EAR, and looks like this:
@Remote
public interface CoreEJBRemote {

    public void packageProcess(java.lang.String configFileName) throws java.lang.Exception;

    public <T extends java.lang.Object> T create(T t);

    public <T extends java.lang.Object> T find(java.lang.Class<T> type, java.lang.Object id);
}
The Main class of the app client is below:
public class Main {

    @EJB
    private static CoreEJBRemote coreEJB;

    public static void main(String[] args) {
        coreEJB.packageProcess("path/to/a/file");
    }
}
Now I want to create a Local interface for the EJB so that the Managed Bean in my web module can access the EJB via local invocation.
Do I just change CoreEJB from public class CoreEJB implements CoreEJBRemote to public class CoreEJB implements CoreEJBRemote, CoreEJBLocal and create a @Local interface called CoreEJBLocal inside the EJB package? Or is there something extra? I want my Managed Bean code to look like this:
@ManagedBean
@RequestScoped
public class TestView {

    @EJB
    private CoreEJBLocal coreEJB;

    public void add(Something something) {
        coreEJB.add(something); // local invocation
    }
}
Do I just change CoreEJB from public class CoreEJB implements CoreEJBRemote to public class CoreEJB implements CoreEJBRemote, CoreEJBLocal and create a @Local interface called CoreEJBLocal inside the EJB package?
Yes, this should be sufficient. Did you try? Did it fail? Keep in mind that local interfaces are pass-by-reference and remote interfaces are pass-by-value. If callers (or the bean) mutate state on the return value (or the arguments, respectively), you're going to get different behavior between the two. You must manage this carefully in your API contract.
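As a rough sketch of what that looks like (the method list mirrors the question; adjust it to your real API):

@Local
public interface CoreEJBLocal {
    void packageProcess(String configFileName) throws Exception;
    <T> T create(T t);
    <T> T find(Class<T> type, Object id);
}

@Stateless
public class CoreEJB implements CoreEJBRemote, CoreEJBLocal {
    // same implementation as before; the container now exposes both a remote and a local view
}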