To give good user feedback, I am using messages on multiple pages in my web application.
To add a message, I simply use:
FacesContext.getCurrentInstance().addMessage(null, new FacesMessage(type, "", message));
I use variables for type and message because they depend on the particular validation.
I am using different managed beans for different pages, which is normal.
It came to my mind to ask: what is the best practice for adding those messages across different managed beans?
Currently I repeat the above code snippet more than 30 times (and that number will certainly keep growing).
Should I create a bean annotated with @SessionScoped or @ApplicationScoped? Do you have any other hints I should know about?
Just hide the repeated code away in a reusable static method to keep it DRY ("Don't Repeat Yourself").
Design the static method in such a way that you can ultimately refactor from this:
FacesContext.getCurrentInstance().addMessage(null, new FacesMessage(type, "", message));
to something like this:
Messages.addGlobalInfo(message);
or, with an import static com.example.Messages.*; (Eclipse: Ctrl+Shift+M on the line), even to:
addGlobalInfo(message);
It doesn't need to be a managed bean, as it doesn't hold any state. Moreover, you should make the default constructor of such a utility class private, so that Java/JSF cannot construct it in the first place via the new operator or via Class#newInstance() in reflection. If you're using CDI, annotate it if necessary with @Typed with an empty value to prevent it from being registered as a managed bean candidate via Bean<T>.
@Typed
public final class Messages {

    private Messages() {}

    // ...
}
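For illustration, a minimal sketch of one such static helper inside the Messages class above (the addGlobalInfo name and the INFO severity come from the example; treat it as a sketch, not a prescribed API):

// Adds a global message (client ID null) with INFO severity.
public static void addGlobalInfo(String message) {
    FacesContext.getCurrentInstance().addMessage(null,
        new FacesMessage(FacesMessage.SEVERITY_INFO, "", message));
}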
JSF utility library OmniFaces has exactly this utility class: org.omnifaces.util.Messages.
Maybe just move this code to a utility class:

public static void addMessage(FacesMessage.Severity type, String message) {
    FacesContext.getCurrentInstance().addMessage(null, new FacesMessage(type, "", message));
}
You can also create several one-argument convenience methods: addInfoMessage, addErrorMessage, ...
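For example (a sketch of such wrappers, delegating to the addMessage method above):

// Convenience wrappers around addMessage for the common severities.
public static void addInfoMessage(String message) {
    addMessage(FacesMessage.SEVERITY_INFO, message);
}

public static void addErrorMessage(String message) {
    addMessage(FacesMessage.SEVERITY_ERROR, message);
}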
In the past I always implemented authorization with some very proprietary code: an @ApplicationScoped bean with methods like isUserInAnyOfTheseRoles(String... roles), which executes the appropriate database queries. I always had to call these methods at the start of a protected EJB method/REST resource, write an if-statement and possibly throw a "NotAuthorized" exception.
For the future I am considering Jakarta EE's security mechanisms, so I am reading up on them. I have a problem understanding the use of the @javax.annotation.security.DeclareRoles annotation.
Among others, I read the SO answer to "ejb Security questions regarding Roles and Authentication". There it is said that:
The @DeclareRoles annotation on the other hand is merely used to declare a list of roles; [...]. The EJB container does not require knowledge of these roles to enforce access control checks on business methods of an EJB; instead, the bean provider/developer may use these roles in isCallerInRole tests to ensure programmatic security.
If I understand it correctly, I need to declare the roles that I test programmatically with javax.security.enterprise.SecurityContext#isCallerInRole(String). So my class might look like the following.
@DeclareRoles({"ADD_ENTITY", "DELETE_ENTITY", "UPDATE_ENTITY", "SEE_ENTITY", "SEE_ENTITY_DETAILS", "SEE_RESTRICTED_DATA", "MERGE_ENTITY", "ATTACH_METADATA"})
public class PersonService {

    @Inject SecurityContext ctx;

    @RolesAllowed({"SEE_ENTITY"})
    public Person getPerson(long id) {
        if (ctx.isCallerInRole("SEE_ENTITY_DETAILS")) {...}
        else if (ctx.isCallerInRole("SEE_RESTRICTED_DATA")) {...}
        else {...}
    }
    ...
}
1st question:
Now if I have another class, do I need to declare all the roles again?
@DeclareRoles({"ADD_ENTITY", "DELETE_ENTITY", "UPDATE_ENTITY", "SEE_ENTITY", "SEE_ENTITY_DETAILS", "SEE_RESTRICTED_DATA", "MERGE_ENTITY", "ATTACH_METADATA"})
public class CompanyService {

    @Inject SecurityContext ctx;

    @RolesAllowed({"SEE_ENTITY"})
    public Company getCompany(long id) {
        if (ctx.isCallerInRole("SEE_ENTITY_DETAILS")) {...}
        else {...}
    }
    ...
}
In the mentioned answer, the EJB spec is quoted:
The DeclareRoles annotation is specified on a bean class, where it serves to declare roles that may be tested by calling isCallerInRole [...]
2nd question: Is it correct that this declaration is only used for the programmatic access and not for the declarative part, so that I do not need to declare any roles if I only use the @RolesAllowed annotation?
3rd question: What is the reason for this "duplication"? For me it is just annoying to declare the roles twice (by the way, I have plenty of roles in my application). I do not see the point in doing so; after all, it is just a string. I need to write a custom javax.security.enterprise.identitystore.IdentityStore to map roles to a user anyway, so aligning these roles in every EJB seems unnecessary.
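For context, the custom IdentityStore I mean is roughly the following (a minimal sketch; the class name, the hard-coded roles, and the lookup helpers are placeholders):

import java.util.Set;
import javax.enterprise.context.ApplicationScoped;
import javax.security.enterprise.credential.UsernamePasswordCredential;
import javax.security.enterprise.identitystore.CredentialValidationResult;
import javax.security.enterprise.identitystore.IdentityStore;

@ApplicationScoped
public class DatabaseIdentityStore implements IdentityStore {

    // The default IdentityStore.validate(Credential) dispatches to this
    // overload for username/password credentials.
    public CredentialValidationResult validate(UsernamePasswordCredential credential) {
        Set<String> roles = findRolesFor(credential.getCaller());
        if (roles != null && verifyPassword(credential)) {
            return new CredentialValidationResult(credential.getCaller(), roles);
        }
        return CredentialValidationResult.INVALID_RESULT;
    }

    private Set<String> findRolesFor(String caller) {
        // e.g. SELECT role FROM user_roles WHERE username = ? (details omitted)
        return Set.of("SEE_ENTITY", "SEE_ENTITY_DETAILS");
    }

    private boolean verifyPassword(UsernamePasswordCredential credential) {
        // e.g. compare a salted password hash (details omitted)
        return true;
    }
}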
4th question: From a comment on a similar question (Define @DeclareRoles annotation programmatically) I gather that @DeclareRoles might be an ancient relic and is no longer needed if an IdentityStore is used. If that is true, it would make things a little clearer to me.
Sorry for the four questions in one post, but it is all tangled together somehow. Basically it is question 3 that haunts me the most.
I like the JHipster entity generator.
I often have to change my model and regenerate all entities.
I want to keep the generated code and override it for my needs.
On the Angular side, it is quite easy to create a new service extending the default entity service to add my own behavior.
On the Java side, it is more complicated.
For example, I override src/main/java/xxx/web/rest/xxxResource.java with src/main/java/xxx/web/rest/xxxOverrideResource.java.
I have to comment out @RestController in xxxResource.java. I tried giving it a bean name different from the overriding class, but that is not sufficient: @RestController("xxxResource")
In xxxOverrideResource.java, I have to change all @xxxMapping() annotations to different paths.
In xxxOverrideResource.java, I have to change all method names.
This allows me to keep the CRUD UI and API and overload them under another mapping path.
Some code to make it more concrete. Here is the generated xxxResource.java:
/**
 * REST controller for managing WorldCommand.
 */
// Commented out to prevent a duplicated-bean error.
// @RestController
@RequestMapping("/api")
public class WorldCommandResource {

    private final WorldCommandService worldCommandService;

    public WorldCommandResource(WorldCommandService worldCommandService) {
        this.worldCommandService = worldCommandService;
    }

    @PutMapping("/world-commands")
    @Timed
    public ResponseEntity<WorldCommand> updateWorldCommand(@Valid @RequestBody WorldCommand worldCommand)
        throws URISyntaxException {
        log.debug("REST request to update WorldCommand : {}", worldCommand);
        ...
    }
Here is my overriding version, xxxOverrideResource.java:
/**
 * REST controller for managing WorldCommand.
 */
@RestController("WorldCommandOverrideResource")
@RequestMapping("/api")
public class WorldCommandOverrideResource extends WorldCommandResource {

    private final WorldCommandOverrideService worldCommandService;

    public WorldCommandOverrideResource(WorldCommandOverrideService worldCommandService) {
        super(worldCommandService);
        log.warn("USING WorldCommandOResource");
        this.worldCommandService = worldCommandService;
    }

    @PutMapping("/world-commands-override")
    @Timed
    public ResponseEntity<WorldCommand> updateWorldCommandOverride(@Valid @RequestBody WorldCommand worldCommand)
        throws URISyntaxException {
        throw new RuntimeException("WorldCommand updating not allowed");
    }
With xxxResource overridden, it is easy to override xxxService and xxxRepository via constructor injection.
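For instance, the service override might look roughly like this (a sketch, assuming the generated WorldCommandService is a plain Spring bean whose constructor takes the repository, and that its own @Service annotation is commented out just like @RestController above):

@Service
public class WorldCommandOverrideService extends WorldCommandService {

    public WorldCommandOverrideService(WorldCommandRepository worldCommandRepository) {
        super(worldCommandRepository);
    }

    // Override or add business methods here as needed.
}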
I feel like I am overthinking it. Since this is not an external component but code from a generator, maybe the idea is to use the tool to write less code and then make the changes you need directly.
Also, I fear this overriding architecture will prevent me from creating abstract controllers if needed.
Do you think keeping the original generated code untouched is good practice, or should I just make my changes in the generated classes and be careful when regenerating an entity?
Do you know a better way to override a Spring controller?
Your approach looks like the side-by-side approach described here: https://www.youtube.com/watch?v=9WVpwIUEty0
I have often found that the generated REST API is only useful for managing data in a back office, and I usually write a completely separate API with different endpoints, authorizations and DTOs that is consumed by mobile or end users. So I don't see much value in overriding REST controllers; after all, they are supposed to be quite thin, with as little business logic as possible.
You must also consider how long you want to keep this compatibility with the generated code. As your app grows in complexity, you might want to refactor your code and organize it around feature packages rather than technical packages (repositories, REST controllers, services, ...). For many reasons, sooner or later the way the generated code is set up will get in your way, so I would not put too much effort into a compatibility goal that has no real business value, especially when you know that the yearly major release may break it because of changes in the generator itself or, more likely, because of changes in the underlying frameworks.
I know how to add a Bean to a CDI container during AfterBeanDiscovery. My problem is that what I really need is the equivalent of adding a new producer method with a particularly qualified parameter.
That is, I'd like to somehow programmatically create several of these:
@Produces
@SomeQualifier("x")
private Foo makeFoo(@SomeQualifier("x") final FooMaker fm) {
    return fm.makeFoo();
}
...where the domain over which SomeQualifier's value element ranges is known only at AfterBeanDiscovery time. In other words, some other portable extension has installed two FooMaker instances into the container: FooMaker qualified by @SomeQualifier("x") and FooMaker qualified by @SomeQualifier("y"). Now I need to do the equivalent of making two producer methods to "match" them.
@Nonbinding is not an option; I want this resolution to take place at container startup, not at injection time.
I am aware of BeanManager's getProducerFactory method, but the dozens if not hundreds of lines of gymnastics I'd have to go through to add the right qualifier annotation on each AnnotatedParameter "reachable" from the AnnotatedMethod I'd have to create by hand (to avoid generics issues) make me think I'm way off the beaten path here.
Update: So in my extension, I have created a private static method that returns a Foo, and has a FooMaker parameter. I've wrapped this in a hand-tooled AnnotatedMethod that reports SomeQualifier("x") etc. in its getAnnotations() method, and also reports SomeQualifier("x") etc. from its AnnotatedParameter's getAnnotations() method. Then I got a ProducerFactory from the BeanManager and feed that into a new Bean that I create, where I use it to implement the create and destroy methods. Everything compiles and so forth just fine.
(However, Weld (in particular) blows up with this usage, which leads me to think that I'm doing Really Bad Things™.)
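For reference, here is roughly what registering such qualified beans could look like if CDI 2.0's BeanConfigurator API is available (a sketch; SomeQualifier.Literal is an assumed AnnotationLiteral helper, and the qualifier values are assumed to have been collected earlier in the extension):

import java.util.HashSet;
import java.util.Set;
import javax.enterprise.context.Dependent;
import javax.enterprise.event.Observes;
import javax.enterprise.inject.spi.AfterBeanDiscovery;
import javax.enterprise.inject.spi.Extension;

public class FooProducerExtension implements Extension {

    // Qualifier values discovered earlier, e.g. from ProcessBean observers.
    private final Set<String> qualifierValues = new HashSet<>();

    void addFooBeans(@Observes AfterBeanDiscovery event) {
        for (String value : qualifierValues) {
            event.addBean()
                 .scope(Dependent.class)
                 .types(Foo.class, Object.class)
                 .qualifiers(SomeQualifier.Literal.of(value))
                 .produceWith(instance -> instance
                     .select(FooMaker.class, SomeQualifier.Literal.of(value))
                     .get()
                     .makeFoo());
        }
    }
}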
I think I understand how CDI works, and in order to dive deeper into it I would like to try using it on a real-world example. I am stuck on one thing where I need your help to understand. I would really appreciate your help in this regard.
I have my own workflow framework developed using the Java reflection API and XML configuration, where, based on a specific "source" and "eventName", I load the appropriate Module class and invoke its "process" method. Everything works fine in our project.
I got excited about CDI and wanted to give it a try with the workflow framework, where I am planning to inject the Module class instead of loading it via reflection.
Just to give you an idea, I will try to keep things simple here.
"Message.java" is a kind of transfer object which carries "source" and "eventName", so that we can load the module appropriately.
public class Message{
private String source;
private String eventName;
}
The module configuration is as follows:
<modules>
    <module>
        <source>A</source>
        <eventName>validate</eventName>
        <moduleClass>ValidatorModule</moduleClass>
    </module>
    <module>
        <source>B</source>
        <eventName>generate</eventName>
        <moduleClass>GeneratorModule</moduleClass>
    </module>
</modules>
ModuleLoader.java
public class ModuleLoader {

    public void loadAndProcess(Message message) {
        String source = message.getSource();
        String eventName = message.getEventName();
        // Load the module based on the above values.
    }
}
Question
Now, if I want to implement the same via CDI and have it inject a Module (into the ModuleLoader class), I can write a factory class with a @Produces method which can do that. But my question is:
a) How can I pass the Message object to the @Produces method so it can do the lookup based on eventName and source?
Can you please give me suggestions?
Thanks in advance.
This one is a little tricky, because CDI doesn't work the same way as your custom solution (if I understand it correctly). CDI must have the full list of dependencies and their resolutions at boot time, whereas your solution sounds like it finds everything at runtime, where things may change. That being said, there are a couple of things you could try.
You could try injecting an InjectionPoint as a parameter to a producer method and returning the correct object, or creating the correct type.
You could also create your own extension that creates the dependencies and wires them all up (take a look at the ProcessInjectionTarget, ProcessAnnotatedType, and AfterBeanDiscovery events). These two quickstarts may also help get some ideas going.
I think you may be going down the wrong path with a producer. Instead, it would more than likely be much better to use an observer, especially based on what you've described.
I'm assuming that the Message transfer object is used abstractly, like a system-wide event: you fire the event and would like some handler, defined in the XML framework you've created, to determine the correct manager for the event, instantiate it (if need be), and then call that class, passing it the event.
@ApplicationScoped
public class MyMessageObserver {

    public void handleMessageEvent(@Observes Message message) {
        // Load the module based on the above values and process the event.
    }
}
Now let's assume you want to utilize your original interface (I'll guess it looks like):
public interface IMessageHandler {
    public void handleMessage(final Message message);
}

@ApplicationScoped
public class EventMessageHandler implements IMessageHandler {

    @Inject
    private Event<Message> messageEvent;

    public void handleMessage(Message message) {
        messageEvent.fire(message);
    }
}
Then, in any legacy class where you want to use it:

@Inject
IMessageHandler handler;
This will allow you to do everything you've described.
Maybe you need something like this:
First, you need a qualifier: an annotation like @Module, which takes two members, source and eventName; they should be nonbinding (@Nonbinding) values (see the docs).
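A sketch of such a qualifier (assuming both members are @Nonbinding so that one producer serves every combination; in a real project the annotation would need a name or package that does not clash with the workflow's Module class):

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import javax.enterprise.util.Nonbinding;
import javax.inject.Qualifier;

@Qualifier
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.TYPE, ElementType.METHOD, ElementType.FIELD, ElementType.PARAMETER})
public @interface Module {
    @Nonbinding String source() default "";
    @Nonbinding String eventName() default "";
}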
Second, you need a producer:
@Produces
@Module
public Module makeAmodule(InjectionPoint ip) {
    // Load the module, taking source and eventName from ip.
}
Inject it at the proper place like this:
@Inject
@Module(source = "A", eventName = "validate")
Module moduleA;
There is only one issue with that solution: those modules must be @Dependent scoped, otherwise the system will inject the same module regardless of source and eventName.
If you want to use other scopes, then you need to make source and eventName binding qualifier members and either:
write a CDI extension and register the producers programmatically,
or write a producer method for each and every possible combination of source and eventName (which I do not think is nice).
I am having problems with the following class in a multi-threaded environment:
public class Foo
{
    [Inject]
    public IBar InjectedBar { get; set; }

    public bool NonInjectedProp { get; set; }

    public void DoSomething()
    {
        /* The following line is causing a null-reference exception */
        InjectedBar.DoSomething();
    }

    public Foo(bool nonInjectedProp)
    {
        /* This line should inject the InjectedBar property */
        KernelContainer.Inject(this);
        NonInjectedProp = nonInjectedProp;
    }
}
This is a legacy class which is why I am using property rather than constructor injection.
Sometimes when DoSomething() is called, the InjectedBar property is null. In a single-threaded application, everything runs fine.
How can this be occurring, and how can I prevent it?
I am using Ninject 2.0 without any extensions, although I have copied the KernelContainer from the Ninject.Web project.
I have noticed a similar problem occurring in my web services. This problem is extremely intermittent and difficult to replicate.
First of all, let me say that this is wrong on so many levels; the KernelContainer was an infrastructure class kept specifically to work around certain limitations in the ASP.NET WebForms page lifecycle. It was never meant to be used in application code. Using the Ninject kernel (or any DI container) as a service locator is an anti-pattern.
That being said, Ninject itself is definitely thread-safe because it's used to service parallel requests in ASP.NET all the time. Wherever this NullReferenceException is coming from, it's got little if anything to do with Ninject.
I can think of two possibilities:
You have to initialize KernelContainer.Kernel somewhere, and that code might have a race condition. If something tries to use the KernelContainer before the kernel is fully initialized (possible if you use the IKernel.Bind methods instead of loading modules as per the guidance), you'll get errors like this. Or:
It's your IBar implementation itself that has problems, and the NullReferenceException is happening somewhere inside the DoSomething method. You don't actually specify that InjectedBar is null when you get the exception, so that's a legitimate possibility here.
Just to narrow the field of possibilities, I'd eliminate the KernelContainer first. If you absolutely must use Ninject as a service locator due to a poorly-designed legacy architecture, then at least allow it to create the dependencies instead of relying on Inject(this). That is to say, whichever class or classes need to create your Foo, have that class call kernel.Get<Foo>(), and set up your kernel to Bind<Foo>().ToSelf().