Can somebody explain the TAB SAMPLE (GWTP) to me? - gwt-platform

I am using GWTP. I did the nested presenter tutorial, but there is no tutorial for the TAB SAMPLE application (the one where the admin tab appears when you switch to admin mode). Can somebody explain the main concepts of this application to me? Thanks.

Update: You can now download a working sample Maven project from here: gwtp-sample-tab.zip
I used the tabbed presenter feature successfully in my project (I also found that the sample code didn't compile). I think the first thing is to get it working, then learn it and feel the benefits gradually :)
Here are the steps I followed:
1) Copy the following files
BaseTab.java
BaseTabPanel.java
SimpleTab.java
SimpleTabPanel.java
SimpleTab.ui.xml
SimpleTabPanel.ui.xml
UiModule.java
from the sample code to your project. For example, I copied them to this package: com.widenhome.web.client.ui. Also remember to register UiModule in your ClientGinjector class (a sketch of that follows below).
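For reference, registering UiModule usually just means adding it to the @GinModules list on the Ginjector. This is only a sketch; ClientModule stands in for whatever modules the GWTP plugin already generated in your project:
import com.google.gwt.inject.client.GinModules;
import com.google.gwt.inject.client.Ginjector;
import com.widenhome.web.client.ui.UiModule;

// Sketch only: keep the modules your generated Ginjector already lists and add UiModule.
@GinModules({ ClientModule.class, UiModule.class })
public interface ClientGinjector extends Ginjector {
    // the accessor methods generated by the GWTP plugin stay as they are
}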
2) Create a normal presenter (MyPresenter) via the GWTP Eclipse plugin.
3) Change the EventBus import in the presenter to this:
import com.google.web.bindery.event.shared.EventBus;
4) Make sure the MyPresenterView.ui.xml has the following code or similar:
<g:HTMLPanel>
<npui:SimpleTabPanel ui:field="tabPanel" />
<g:SimplePanel ui:field="contentPanel" />
</g:HTMLPanel>
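For orientation, the backing view class for that template ends up looking roughly like the sketch below. This is only a sketch: it assumes MyPresenter.MyView extends GWTP's TabView, and the delegate method signatures (addTab, removeTab, removeTabs, setActiveTab) should be checked against the TabPanel interface of your GWTP version and against the sample's own MainPageView.
import com.google.gwt.uibinder.client.UiBinder;
import com.google.gwt.uibinder.client.UiField;
import com.google.gwt.user.client.ui.SimplePanel;
import com.google.gwt.user.client.ui.Widget;
import com.google.inject.Inject;
import com.gwtplatform.mvp.client.Tab;
import com.gwtplatform.mvp.client.TabData;
import com.gwtplatform.mvp.client.ViewImpl;
import com.widenhome.web.client.ui.SimpleTabPanel;

public class MyPresenterView extends ViewImpl implements MyPresenter.MyView {

    interface Binder extends UiBinder<Widget, MyPresenterView> {
    }

    @UiField
    SimpleTabPanel tabPanel;

    @UiField
    SimplePanel contentPanel;

    private final Widget widget;

    @Inject
    public MyPresenterView(Binder binder) {
        widget = binder.createAndBindUi(this);
    }

    @Override
    public Widget asWidget() {
        return widget;
    }

    // Delegate the tab-related methods of MyView (assumed to extend TabView) to the panel.
    @Override
    public Tab addTab(TabData tabData, String historyToken) {
        return tabPanel.addTab(tabData, historyToken);
    }

    @Override
    public void removeTab(Tab tab) {
        tabPanel.removeTab(tab);
    }

    @Override
    public void removeTabs() {
        tabPanel.removeTabs();
    }

    @Override
    public void setActiveTab(Tab tab) {
        tabPanel.setActiveTab(tab);
    }

    // Show child presenter content in the content area when it is revealed in our slot.
    @Override
    public void setInSlot(Object slot, Widget content) {
        if (slot == MyPresenter.TYPE_SetTabContent) {
            contentPanel.setWidget(content);
        } else {
            super.setInSlot(slot, content);
        }
    }
}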
5) Change the presenter to extend TabContainerPresenter instead of Presenter
public class MyPresenter extends
TabContainerPresenter<MyPresenter.MyView, MyPresenter.MyProxy>
6) Define several variables in MyPresenter, or you can just copy/paste the following code:
/**
* This will be the event sent to our "unknown" child presenters, in order
* for them to register their tabs.
*/
@RequestTabs
public static final Type<RequestTabsHandler> TYPE_RequestTabs = new Type<RequestTabsHandler>();
/**
* Fired by child proxies when their tab content is changed.
*/
@ChangeTab
public static final Type<ChangeTabHandler> TYPE_ChangeTab = new Type<ChangeTabHandler>();
/**
* Use this in leaf presenters, inside their {@link #revealInParent} method.
*/
@ContentSlot
public static final Type<RevealContentHandler<?>> TYPE_SetTabContent = new Type<RevealContentHandler<?>>();
7) Change the constructor of MyPresenter to use the variables:
@Inject
public MyPresenter(final EventBus eventBus, final MyView view, final MyProxy proxy) {
    super(eventBus, view, proxy, TYPE_SetTabContent, TYPE_RequestTabs, TYPE_ChangeTab);
}
8) Now we can start creating tab presenters (e.g. MyFirstTabPresenter). Just create a normal presenter again via the GWTP Eclipse plugin.
9) In MyFirstTabPresenter, change MyProxy so that it extends TabContentProxyPlace instead of ProxyPlace (see the sketch below).
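The proxy declaration then typically looks like the sketch below; the @NameToken value is just an example, and the annotations assume the usual GWTP proxy setup:
// Inside MyFirstTabPresenter:
@ProxyCodeSplit
@NameToken("myFirstTab") // example token, pick your own
public interface MyProxy extends TabContentProxyPlace<MyFirstTabPresenter> {
}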
10) Create a @TabInfo method; please see the javadoc of the @TabInfo annotation, as there are other ways to do this as well. For example, I did this:
@TabInfo(container = MyPresenter.class)
static TabData getTabLabel(ClientGinjector ginjector) {
    return new TabDataBasic("My First Tab", 0);
}
11) In the revealInParent() method of the MyFirstTabPresenter class, make sure it has the following code or similar:
@Override
protected void revealInParent() {
    RevealContentEvent.fire(this, MyPresenter.TYPE_SetTabContent, this);
}
That's all of the tabbed presenter configuration. Now you can add some logic to load data to show in MyFirstTabPresenter's view.
I hope this helps you get started with the GWTP tabbed presenter. Please let me know about any issues you have; I will edit and improve this answer gradually so that it can help more people get started with it.
BTW, I also posted this on my blog to help more people with this.
Thanks,
Jiakuan

It doesn't even compile. The only way to trigger multiple presenters is via nested presenters, which is far too complicated. I built a multiple-presenter app with the plain GWT History mechanism without any pain. This framework has turned GWT History (a simple mechanism) into a very esoteric thing.

Related

How can I know the class names or keywords to remember when using the Navigation framework in Android Studio?

The following code is from the project at https://github.com/mycwcgr/camera/tree/master/CameraXBasic
The project uses the latest Navigation framework, and I find there are some generated class names such as CameraFragmentDirections and GalleryFragmentArgs.
Android Studio gives no prompt information for these class names; must I remember these keywords myself?
Code
/** Method used to re-draw the camera UI controls, called every time configuration changes */
@SuppressLint("RestrictedApi")
private fun updateCameraUi() {
    // Listener for button used to view last photo
    controls.findViewById<ImageButton>(R.id.photo_view_button).setOnClickListener {
        Navigation.findNavController(requireActivity(), R.id.fragment_container).navigate(
            CameraFragmentDirections.actionCameraToGallery(outputDirectory.absolutePath))
    }
}
/** Fragment used to present the user with a gallery of photos taken */
class GalleryFragment internal constructor() : Fragment() {
    /** AndroidX navigation arguments */
    private val args: GalleryFragmentArgs by navArgs()
}
No, you do not need to remember these things yourself if you know a trick.
For example, if you don't remember the "keyword" Directions, but you know you want to do something related to CameraFragment, you can start typing e.g. CameraFragm in Android Studio. It will then suggest CameraFragment and CameraFragmentDirections for you. That way you can find CameraFragmentDirections easily even though you did not remember the keyword Directions.
There are not that many keywords to worry about though. After working with the Navigation framework for a while, you will remember them all.
If you are curious, you can find the generated classes here after a build:
./app/build/generated/source/navigation-args/...
e.g. after a debug build:
./app/build/generated/source/navigation-args/debug/com/android/example/cameraxbasic/fragments/CameraFragmentDirections.java
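Stripped down, a generated Directions class is just plain Java and has roughly the following shape. This is a simplified sketch, not the exact generated source; the method name, action id and argument key come from the actions and arguments declared in the navigation graph:
import android.os.Bundle;

import androidx.navigation.NavDirections;

// Simplified sketch of CameraFragmentDirections as emitted by the Safe Args plugin.
public class CameraFragmentDirections {

    private CameraFragmentDirections() {
    }

    // One static factory method per <action> element on the camera fragment destination.
    public static NavDirections actionCameraToGallery(final String rootDirectory) {
        return new NavDirections() {
            @Override
            public int getActionId() {
                return R.id.action_camera_to_gallery; // id of the action in the nav graph
            }

            @Override
            public Bundle getArguments() {
                Bundle bundle = new Bundle();
                bundle.putString("root_directory", rootDirectory);
                return bundle;
            }
        };
    }
}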
If you are even more curious, the code that generates these classes is here:
https://android.googlesource.com/platform/frameworks/support/+/refs/heads/androidx-master-dev/navigation/navigation-safe-args-generator/src/main/kotlin/androidx/navigation/safe/args/generator/java/JavaNavWriter.kt
There you can for example find this code:
internal fun Destination.toClassName(): ClassName {
    val destName = name ?: throw IllegalStateException("Destination with actions must have name")
    return ClassName.get(destName.packageName(), "${destName.simpleName()}Directions")
}
which is the code that decides what name CameraFragmentDirections gets. (Note "${destName.simpleName()}Directions" at the end.)

JHipster override entity (keep existing + add behaviour)

I like the JHipster entity generator.
I often need to change my model and regenerate all entities.
I would like to keep the generated code and override it for my needs.
On the Angular side, it is quite easy to create a new service extending the default entity service to do my stuff.
On the Java side, it is more complicated.
For example, I override src/main/java/xxx/web/rest/xxxResource.java with src/main/java/xxx/web/rest/xxxOverrideResource.java
I have to comment out @RestController in xxxResource.java. I tried giving it a different bean name from the overriding class, but that is not sufficient: @RestController("xxxResource")
In xxxOverrideResource.java, I have to change all @xxxMapping() annotations to different paths.
In xxxOverrideResource.java, I have to change all method names.
This allows me to keep the CRUD UI and API, and overload them using another mapping path.
Some code to make it more concrete. Here is the generated xxxResource.java:
/**
 * REST controller for managing WorldCommand.
 */
// Commented out to prevent a duplicated-bean error.
// @RestController
@RequestMapping("/api")
public class WorldCommandResource {

    private final WorldCommandService worldCommandService;

    public WorldCommandResource(WorldCommandService worldCommandService) {
        this.worldCommandService = worldCommandService;
    }

    @PutMapping("/world-commands")
    @Timed
    public ResponseEntity<WorldCommand> updateWorldCommand(@Valid @RequestBody WorldCommand worldCommand)
            throws URISyntaxException {
        log.debug("REST request to update WorldCommand : {}", worldCommand);
        ...
    }
Here is my overloaded version: xxxOverrideResource.java
/**
 * REST controller for managing WorldCommand.
 */
@RestController("WorldCommandOverrideResource")
@RequestMapping("/api")
public class WorldCommandOverrideResource extends WorldCommandResource {

    private final WorldCommandOverrideService worldCommandService;

    public WorldCommandOverrideResource(WorldCommandOverrideService worldCommandService) {
        super(worldCommandService);
        log.warn("USING WorldCommandOResource");
        this.worldCommandService = worldCommandService;
    }

    @PutMapping("/world-commands-override")
    @Timed
    public ResponseEntity<WorldCommand> updateWorldCommandOverride(@Valid @RequestBody WorldCommand worldCommand)
            throws URISyntaxException {
        throw new RuntimeException("WorldCommand updating not allowed");
    }
With the xxxResource overridden, it is easy to override the xxxService and xxxRepository by constructor injection, for example:
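A sketch of the service part of that, assuming the generator produced a WorldCommandService class whose constructor takes the generated WorldCommandRepository (adjust the constructor arguments and method signatures to whatever JHipster actually generated):
import org.springframework.stereotype.Service;

// Sketch only: WorldCommandService and WorldCommandRepository are the generated classes.
@Service
public class WorldCommandOverrideService extends WorldCommandService {

    public WorldCommandOverrideService(WorldCommandRepository worldCommandRepository) {
        super(worldCommandRepository);
    }

    // Override whichever generated methods need custom behaviour (e.g. a save() method),
    // then delegate to super for the parts that should stay generated.
}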
I feel like I am overthinking it. As it is not an external component but code from a generator, maybe the aim is to use the tool to write less code and then make the changes you need.
Also, I fear this overriding architecture will prevent me from creating an abstract controller if needed.
Do you think keeping the original generated code is good practice, or should I just make my changes in the generated classes and be careful when regenerating an entity?
Do you know a better way to override a Spring controller ?
Your approach looks like the side-by-side approach described here: https://www.youtube.com/watch?v=9WVpwIUEty0
I often found that the generated REST API is only useful for managing data in a backoffice and I usually write a complete separate API with different endpoints, authorizations and DTOs that is consumed by mobile or end-users. So I don't see much value in overriding REST controllers, after all they are supposed to be quite thin with as little business logic as possible.
You must also consider how long you want to keep this compatibility with the generated code. As your app grows in complexity, you might want to refactor your code and organize it around feature packages rather than technical packages (repository, rest controllers, services, ...). For many reasons, sooner or later the way the generated code is set up will get in your way, so I would not put too much effort into this compatibility goal, which has no real business value, especially when you know that the yearly released major version may break it because of changes in the generator itself or, more likely, because of changes in the underlying frameworks.

Can a CDI producer method take custom parameters?

I think I understand how CDI works, and in order to dive deeper into it, I would like to try it on a real-world example. I am stuck on one thing where I need your help to understand it. I would really appreciate your help in this regard.
I have my own workflow framework developed using the Java reflection API and XML configuration, where, based on a specific "source" and "eventName", I load the appropriate module class and invoke its "process" method. Everything is working fine in our project.
I got excited about CDI and wanted to give it a try with the workflow framework, where I am planning to inject the module class instead of loading it using reflection etc.
Just to give you an idea, I will try to keep things simple here.
"Message.java" is a kind of transfer object which carries the "source" and "eventName", so that we can load the appropriate module.
public class Message {
    private String source;
    private String eventName;

    // Getters used by ModuleLoader below.
    public String getSource() { return source; }
    public String getEventName() { return eventName; }
}
The module configuration is as below:
<modules>
<module>
<source>A</source>
<eventName>validate</eventName>
<moduleClass>ValidatorModule</moduleClass>
</module>
<module>
<source>B</source>
<eventName>generate</eventName>
<moduleClass>GeneratorModule</moduleClass>
</module>
</modules>
ModuleLoader.java
public class ModuleLoader {
public void loadAndProcess(Message message){
String source=message.getSource();
String eventName=message.getEventName();
//Load Module based on above values.
}
}
Question
Now, if I want to implement the same via CDI and have it inject a Module (in the ModuleLoader class), I can write a factory class with a @Produces method, which can do that. BUT my question is:
a) How can I pass the Message object to the @Produces method so it can do the lookup based on eventName and source?
Can you please give me suggestions?
Thanks in advance.
This one is a little tricky, because CDI doesn't work the same way as your custom solution (if I understand it correctly). CDI must have the full list of dependencies and resolutions for those dependencies at boot time, whereas your solution sounds like it finds everything at runtime, where things may change. That said, there are a couple of things you could try.
You could try injecting an InjectionPoint as a parameter into the producer method and returning the correct object, or creating the correct type, for example:
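A minimal sketch of that idea follows. The @ModuleFor qualifier, its non-binding members, and the common Module interface are all assumptions made up for the example, not part of the original framework:
import java.lang.annotation.Annotation;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

import javax.enterprise.inject.Produces;
import javax.enterprise.inject.spi.InjectionPoint;
import javax.enterprise.util.Nonbinding;
import javax.inject.Qualifier;

// ModuleFor.java -- hypothetical qualifier: members are @Nonbinding, so every
// @ModuleFor(...) injection point resolves to the single producer below,
// which then inspects the requested values.
@Qualifier
@Retention(RetentionPolicy.RUNTIME)
@Target({ ElementType.FIELD, ElementType.METHOD, ElementType.PARAMETER })
public @interface ModuleFor {
    @Nonbinding String source() default "";
    @Nonbinding String eventName() default "";
}

// ModuleProducer.java -- reads the qualifier off the injection point and looks the module up.
public class ModuleProducer {

    @Produces
    @ModuleFor
    public Module produceModule(InjectionPoint ip) {
        for (Annotation qualifier : ip.getQualifiers()) {
            if (qualifier instanceof ModuleFor) {
                ModuleFor selector = (ModuleFor) qualifier;
                return lookupModule(selector.source(), selector.eventName());
            }
        }
        throw new IllegalStateException("No @ModuleFor qualifier at the injection point");
    }

    private Module lookupModule(String source, String eventName) {
        // Placeholder: plug in the existing XML-driven lookup here.
        throw new UnsupportedOperationException("wire in the XML-based module lookup");
    }
}

// Usage at an injection point (Module is assumed to be a common interface of your modules):
// @Inject @ModuleFor(source = "A", eventName = "validate") Module validatorModule;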
You could also create your own extension, creating the dependencies and wiring them all up there (take a look at the ProcessInjectionTarget, ProcessAnnotatedType, and AfterBeanDiscovery events). These two quickstarts may also help get some ideas going.
I think you may be going down the wrong path with a producer. It would more than likely be much better to use an observer, especially based on what you've described.
I'm assuming that the "Message" transfer object is used abstractly, like a system-wide event: you fire the event, and you would like some handler defined in the XML framework you've created to determine the correct manager for the event, instantiate it (if need be), and then call that class, passing it the event.
@ApplicationScoped
public class MyMessageObserver {

    public void handleMessageEvent(@Observes Message message) {
        // Load Module based on above values and process the event
    }
}
Now let's assume you want to utilize your original interface (I'll guess it looks like):
public interface IMessageHandler {
    public void handleMessage(final Message message);
}
@ApplicationScoped
public class EventMessageHandler implements IMessageHandler {

    @Inject
    private Event<Message> messageEvent;

    public void handleMessage(Message message) {
        messageEvent.fire(message);
    }
}
Then, in any legacy class where you want to use it:
@Inject
IMessageHandler handler;
This will allow you to do everything you've described.
Maybe you need something like this:
You need a qualifier annotation, say @Module, which takes two parameters, source and eventName; they should be non-binding values. See the docs.
Second, you need a producer:
@Produces
@Module
public Module makeAmodule(InjectionPoint ip) {
    // load the module, take source and eventName from ip
}
Inject it at the proper place like this:
@Inject
@Module(source = "A", eventName = "validate")
Module modulA;
There is only one issue with that solution: those modules must be dependent-scoped, otherwise the system will inject the same module regardless of source and eventName.
If you want to use scopes, then you need to make source and eventName qualifying parameters and either:
make an extension for CDI and register the producers programmatically, or
make a producer method for each and every possible combination of source and eventName (I do not think that is nice).

Synchronising GEF editor with EMF model with two EMF adapters

I'm having trouble synchronising my GEF editor with the EMF-based model. I think this is due to the fact that the model-internal EMF Adapter, or rather the methods it calls, aren't finished before the editor's Adapter's notifyChanged() is called and updates the model children. This leads to the editor view being out-of-sync with the model itself, or rather, the changes to the model not being represented in the view when they should be.
Consider this setup: a command "CreateNodeCommand" adds a node to the underlying model:
@Override
public void execute() {
    ...
    getNewNode().setGraph(getGraph());
    ...
}
The GraphEditPart has an internal class extending org.eclipse.emf.common.notify.Adapter. Its notifyChanged() method is indeed notified, as tested with something similar to the code below (incomplete):
@Override
public void notifyChanged(Notification notification) {
    switch (notification.getEventType()) {
    case Notification.ADD:
        System.err.println("ADD occurred!");
        refreshChildren();
    }
The problem is that the (third-party) model itself also implements an Adapter, which in turn runs a number of methods on the new model element, such as adding an ID, etc.
It seems to me that the fact that the new element's figure doesn't show up in the editor directly after it's been created - but only after the next editing step, the figure for which then doesn't appear - suggests that the model adapter is still busy setting up the new element while refreshChildren() is already being called by the editor adapter.
This seems to call for synchronisation, but I'm unsure whether this can be achieved with built-in Java functionality for multithreading, or calls for an EMF-based approach.
Please share your knowledge about synchronising in EMF.
Many thanks in advance!
EDIT
On request, here is the source code for the getModelChildren() method:
@Override
protected List<EObject> getModelChildren() {
    List<EObject> allModelObjects = new ArrayList<EObject>();
    allModelObjects.addAll(((MyGraph) getModel()).getTokens());
    allModelObjects.addAll(((MyGraph) getModel()).getNodes());
    return allModelObjects;
}
Debugging the (third-party) model, I found out that the Graph's eNotify() fired the notification before the actual adding took place, hence my Adapter received the notification too early, i.e. before the node had been added.
The notification is now fired after the add, and everything works fine.
Thanks for all of your help!
Try extending EContentAdapter instead of AdapterImpl, and don't forget to call
super.notifyChanged(notification);
in it. It's an adapter which will add itself to new elements of the model and notify you when they are changed.
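A minimal sketch of that suggestion (the class name and the reaction to ADD/REMOVE are just placeholders for your own EditPart refresh logic):
import org.eclipse.emf.common.notify.Notification;
import org.eclipse.emf.ecore.util.EContentAdapter;

public class GraphContentAdapter extends EContentAdapter {

    @Override
    public void notifyChanged(Notification notification) {
        // Let EContentAdapter keep attaching itself to added/removed elements first.
        super.notifyChanged(notification);

        switch (notification.getEventType()) {
        case Notification.ADD:
        case Notification.REMOVE:
            // e.g. trigger refreshChildren() on the owning EditPart here
            break;
        default:
            break;
        }
    }
}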

JBehave - all steps marked pending?

I'm trying to create and run a simple JUnitStory to run a .story file.
I have this:
class Scenario1 extends JUnitStory {

    @Delegate MySteps steps = new MySteps()

    @Override
    public Configuration configuration() {
        return new MostUsefulConfiguration()
            .useStoryLoader(new LoadFromRelativeFile(new File('src/test/groovy').toURL()))
            .useStoryReporterBuilder(
                new StoryReporterBuilder()
                    .withDefaultFormats()
                    .withFormats(Format.HTML, Format.CONSOLE, Format.TXT)
            );
    }

    @Override
    public List candidateSteps() {
        final candidateSteps = new InstanceStepsFactory(configuration(), this).createCandidateSteps()
        return candidateSteps;
    }
}
With or without the delegate (copying and pasting in all the annotated methods of MySteps), whenever I run JBehave, I get the following output:
somePattern(){
// PENDING
}
It's like the individual stories don't pick up the steps.
When I create a "Stories" class and pull all the story files in with storyPaths, the individual steps are defined. Using a debugger, I see that candidateSteps is being hit, but it's not pulling in the data it needs to.
What could possibly be going on here?
You don't need to delegate to the steps class. Also, you should not override candidateSteps, but rather stepsFactory. In later versions of JBehave, candidateSteps is deprecated to make the preference for the factory method more prominent ( http://jbehave.org/reference/stable/javadoc/core/org/jbehave/core/ConfigurableEmbedder.html#candidateSteps() ).
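A sketch of that override, in Java, assuming a recent JBehave version and the asker's MySteps class:
import org.jbehave.core.steps.InjectableStepsFactory;
import org.jbehave.core.steps.InstanceStepsFactory;

// In the JUnitStory subclass, override stepsFactory() instead of candidateSteps():
@Override
public InjectableStepsFactory stepsFactory() {
    // Hand the steps instances (not the story class itself) to the factory.
    return new InstanceStepsFactory(configuration(), new MySteps());
}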
See this blog, where I explained how the basic JBehave configuration works in more detail:
http://blog.codecentric.de/en/2012/06/jbehave-configuration-tutorial/
Andreas
Here is your answer, buddy:
The package of Format has changed.
This is the deprecated one:
import static org.jbehave.core.reporters.StoryReporterBuilder.Format.HTML;
This is the new one :)
import static org.jbehave.core.reporters.Format.HTML;
It took a while to find the answer; it was hidden in the JBehave documentation.
Hope it helps!
Cheers!
You shouldn't need to use the @Delegate - your JUnitStory is not your Steps class. Can you try passing in steps where you have this?
When you pass in a class that has been bytecode-manipulated as your Steps class, JBehave may not see the JBehave annotations anymore.
JBehave is old, underdeveloped technology. Don't use it.
