I have been experimenting with the FXMLLoader and its setControllerFactory method, using a custom Callback<P,R> implementation.
The Oracle documentation says the following:
An implementation might return a null value to indicate that it does
not or cannot create a controller of the given type; in this case, the
default controller construction mechanism will be employed by the
loader.
The result I want to achieve is that I can use a dependency injection framework to create any controllers that require parameters but I will let the FXMLLoader load any controllers that do not require parameters.
So if I have the following simple FXML file, which uses the ViewController class (whose constructor takes no parameters)...
<StackPane fx:id="pane"
xmlns:fx="http://javafx.com/fxml"
fx:controller="my.package.ViewController">
</StackPane>
and I use the following simple controller factory implementation to signal to the FXMLLoader that I want it to manage the construction of the controller in this case...
loader.setControllerFactory(new Callback<Class<?>, Object>() {
    @Override
    public Object call(Class<?> type) {
        return null; // Let the FXMLLoader handle construction...
    }
});
after calling the load() method, the initialize() method in my ViewController class is never called (I have verified this with a breakpoint).
If I change my controller factory implementation to return an instance of the ViewController class then everything works as expected.
Can anyone help me to clear up my confusion? Am I using the Callback interface incorrectly, or is the Oracle documentation incorrect?
JavaFX does the following in FXMLLoader:
try {
    if (controllerFactory == null) {
        setController(ReflectUtil.newInstance(type));
    } else {
        setController(controllerFactory.call(type));
    }
} catch (InstantiationException exception) {
    throw new LoadException(exception);
} catch (IllegalAccessException exception) {
    throw new LoadException(exception);
}
So, yes, the Oracle documentation is incorrect: when the factory returns null, the loader simply sets the controller to null rather than falling back to default construction.
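A common workaround is to have the factory itself perform the default construction instead of returning null. A minimal sketch, assuming Java 8+ and a hypothetical injector for the parameterised controllers:
loader.setControllerFactory(type -> {
    try {
        // Hypothetical DI hook: e.g. return injector.getInstance(type) for controllers
        // the DI framework knows how to build.

        // Fallback: default (no-arg) construction, mimicking what the loader would do itself.
        return type.getDeclaredConstructor().newInstance();
    } catch (ReflectiveOperationException e) {
        throw new RuntimeException("Could not construct controller " + type.getName(), e);
    }
});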
Related
I am trying to create a service that will read some data from a remote server and process it using Spring Integration.
I have a class that extends ArrayList because I need to keep a pointer to the next page, so I can read it in the next remote call. I set up a release strategy to collect all these pages until there is no pointer to a next page.
Here is the definition of the class:
public class CustomList extends ArrayList<DataInfo>
{
    private String nextCursor;

    // Methods omitted for readability
}
Everything worked fine until I set up a JdbcMessageStore in the aggregator, so that I can keep messages in case of a service shutdown.
The problem I came across is that when my release strategy class casts the payload to my own list class (because the message group does not define the type), this exception is raised:
java.lang.ClassCastException: com.example.CustomList cannot be cast to com.example.CustomList
This is my release strategy class:
@Component
public class CursorReleaseStrategy implements ReleaseStrategy
{
    @Override
    public boolean canRelease(MessageGroup group)
    {
        return group.getMessages().stream()
                .anyMatch(message -> ((CustomList) message.getPayload()).getNextCursor() == null);
    }
}
If I remove the message store everything works fine, but the problem is that I need the message store.
I am using Spring Boot 2.1.6 and the Spring Integration DSL to create this flow.
From what I have read, this error happens because of different class loaders, but I am doing all of this from the same application.
Is there anything more that I need to configure for this to work?
This is almost certainly a class loader issue; you can find out which class loader loads each component (message store, release strategy) by injecting them into a bean and calling getClass().getClassLoader() on each one.
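A quick diagnostic sketch (the ClassLoaderDiagnostics name is made up; MessageGroupStore, CursorReleaseStrategy and CustomList are the types from the question):
import javax.annotation.PostConstruct;

import org.springframework.integration.store.MessageGroupStore;
import org.springframework.stereotype.Component;

@Component
public class ClassLoaderDiagnostics {

    private final MessageGroupStore messageStore;
    private final CursorReleaseStrategy releaseStrategy;

    public ClassLoaderDiagnostics(MessageGroupStore messageStore, CursorReleaseStrategy releaseStrategy) {
        this.messageStore = messageStore;
        this.releaseStrategy = releaseStrategy;
    }

    @PostConstruct
    void logClassLoaders() {
        // Compare the three loaders printed here; a mismatch points to the
        // duplicate-class-loader problem behind the ClassCastException.
        System.out.println("message store:    " + messageStore.getClass().getClassLoader());
        System.out.println("release strategy: " + releaseStrategy.getClass().getClassLoader());
        System.out.println("CustomList:       " + CustomList.class.getClassLoader());
    }
}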
When the application was packaged as a jar, this error occurred. So, to fix the problem, I created two message store beans, selected by profile.
For example:
@Profile("!prod")
@Bean
public MessageGroupStore messageStore(DataSource dataSource)
{
    JdbcMessageStore jdbcMessageStore = new JdbcMessageStore(dataSource);
    // Deserialize stored messages with the thread context class loader instead of the default one.
    jdbcMessageStore.setDeserializer(inputStream -> {
        ConfigurableObjectInputStream objectInputStream =
                new ConfigurableObjectInputStream(inputStream, Thread.currentThread().getContextClassLoader());
        try {
            return (Message<?>) objectInputStream.readObject();
        } catch (ClassNotFoundException e) {
            throw new NestedIOException("Failed to deserialize object type", e);
        }
    });
    return jdbcMessageStore;
}

@Profile("prod")
@Bean
public MessageGroupStore prodMessageStore(DataSource dataSource)
{
    return new JdbcMessageStore(dataSource);
}
I am using PrimeFaces version 5 and I am adding messages for specific actions like save:
public void save(Tenant tenant) {
    tenantDao.save(tenant);
    FacesContext.getCurrentInstance().addMessage(null, new FacesMessage("Save success"));
}
Since I had a lot of these actions, I tried to simplify this by creating a custom annotation called Message:
@Target(ElementType.METHOD)
@Retention(RetentionPolicy.RUNTIME)
public @interface Message {
    String value();
}
and in my DAO class:
public class TenantDao {

    @Message("Saved Successfully")
    public Tenant save(Tenant t) {
        return em.save(t);
    }
}
To read this annotation I have overridden the invoke method of the ELResolver:
@Override
public Object invoke(ELContext context,
                     Object base,
                     Object method,
                     Class<?>[] paramTypes,
                     Object[] params) {
    Object result = super.invoke(context, base, method, paramTypes, params);
    try {
        Method m = base.getClass().getMethod((String) method, paramTypes);
        if (m.getAnnotation(Message.class) != null) {
            addMessage(m.getAnnotation(Message.class).value());
        }
    } catch (NoSuchMethodException e) {
        // no matching method on the base object; nothing to add
    }
    return result;
}
This was called for properties (rendered, update, ...) but not for action listeners.
After a lot of debugging I discovered that the actionListener is called through the MethodExpression class. So I wrapped the MethodExpression class and overrode its invoke method.
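Roughly, the wrapper looks like this (a sketch; the class name is illustrative, the rest is plain delegation):
import javax.el.ELContext;
import javax.el.MethodExpression;
import javax.el.MethodInfo;

public class MessageAwareMethodExpression extends MethodExpression {

    private final MethodExpression delegate;

    public MessageAwareMethodExpression(MethodExpression delegate) {
        this.delegate = delegate;
    }

    @Override
    public Object invoke(ELContext context, Object[] params) {
        Object result = delegate.invoke(context, params);
        // The underlying java.lang.reflect.Method is not reachable from here,
        // which is exactly the problem described below.
        return result;
    }

    @Override
    public MethodInfo getMethodInfo(ELContext context) {
        return delegate.getMethodInfo(context);
    }

    @Override
    public String getExpressionString() {
        return delegate.getExpressionString();
    }

    @Override
    public boolean isLiteralText() {
        return delegate.isLiteralText();
    }

    @Override
    public boolean equals(Object obj) {
        return delegate.equals(obj);
    }

    @Override
    public int hashCode() {
        return delegate.hashCode();
    }
}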
The problem now is that there is no way to retrieve the java.lang.reflect.Method from the MethodExpression class; also, if I use the expression #{tenantDao.save(tenant)}, the getMethodInfo method of MethodExpression throws an exception.
Is there any way to read the annotation from any JSF context?
I know that using Spring with AOP may solve this but I am not using Spring now.
Thanks
A pretty old .NET tutorial, "Nerd Dinner", talks about using a helper class for rule violations. Everything seems straightforward, except that I'm not sure where to put this class so I can reference it. I am pretty new at MVC.
All of this below was taken from the Nerd Dinner tutorial:
Using an AddRuleViolations Helper Method
Our initial HTTP-POST Edit implementation used a foreach statement within its catch block to loop over the Dinner object's Rule Violations and add them to the controller's ModelState collection:
catch {
    foreach (var issue in dinner.GetRuleViolations()) {
        ModelState.AddModelError(issue.PropertyName, issue.ErrorMessage);
    }
    return View(dinner);
}
We can make this code a little cleaner by adding a "ControllerHelpers" class to the NerdDinner project, and implementing an "AddRuleViolations" extension method within it that adds a helper method to the ASP.NET MVC ModelStateDictionary class. This extension method can encapsulate the logic necessary to populate the ModelStateDictionary with a list of RuleViolation errors:
public static class ControllerHelpers {

    public static void AddRuleViolations(this ModelStateDictionary modelState, IEnumerable errors) {
        foreach (RuleViolation issue in errors) {
            modelState.AddModelError(issue.PropertyName, issue.ErrorMessage);
        }
    }
}
I am using Guice's RequestScoped and Provider in order to get instances of some classes during a user request. This works fine currently. Now I want to do some work in a background thread, using the same instances created during the request.
However, when I call Provider.get(), guice returns an error:
Error in custom provider, com.google.inject.OutOfScopeException: Cannot
access scoped object. Either we are not currently inside an HTTP Servlet
request, or you may have forgotten to apply
com.google.inject.servlet.GuiceFilter as a servlet
filter for this request.
AFAIK, this is because Guice uses thread-local variables to keep track of the current request's instances, so it is not possible to call Provider.get() from a thread other than the one handling the request.
How can I get the same instances inside new threads using a Provider? Is it possible to achieve this by writing a custom scope?
I recently solved this exact problem. There are a few things you can do. First, read up on ServletScopes.continueRequest(), which wraps a callable so it will execute as if it is within the current request. However, that's not a complete solution because it won't forward @RequestScoped objects, only basic things like the HttpServletResponse. That's because @RequestScoped objects are not expected to be thread safe. You have some options:
If your entire @RequestScoped hierarchy is computable from just the HTTP response, you're done! You will get new instances of these objects in the other thread though.
You can use the code snippet below to explicitly forward all RequestScoped objects, with the caveat that they will all be eagerly instantiated.
Some of my @RequestScoped objects couldn't handle being eagerly instantiated because they only work for certain requests. I extended the below solution with my own scope, @ThreadSafeRequestScoped, and only forwarded those ones.
Code sample:
public class RequestScopePropagator {

    private final Map<Key<?>, Provider<?>> requestScopedValues = new HashMap<>();

    @Inject
    RequestScopePropagator(Injector injector) {
        for (Map.Entry<Key<?>, Binding<?>> entry : injector.getAllBindings().entrySet()) {
            Key<?> key = entry.getKey();
            Binding<?> binding = entry.getValue();
            // This is like Scopes.isSingleton() but we don't have to follow linked bindings
            if (binding.acceptScopingVisitor(IS_REQUEST_SCOPED)) {
                requestScopedValues.put(key, binding.getProvider());
            }
        }
    }

    private final BindingScopingVisitor<Boolean> IS_REQUEST_SCOPED = new BindingScopingVisitor<Boolean>() {
        @Override
        public Boolean visitScopeAnnotation(Class<? extends Annotation> scopeAnnotation) {
            return scopeAnnotation == RequestScoped.class;
        }

        @Override
        public Boolean visitScope(Scope scope) {
            return scope == ServletScopes.REQUEST;
        }

        @Override
        public Boolean visitNoScoping() {
            return false;
        }

        @Override
        public Boolean visitEagerSingleton() {
            return false;
        }
    };

    public <T> Callable<T> continueRequest(Callable<T> callable) {
        Map<Key<?>, Object> seedMap = new HashMap<>();
        for (Map.Entry<Key<?>, Provider<?>> entry : requestScopedValues.entrySet()) {
            // This instantiates objects eagerly
            seedMap.put(entry.getKey(), entry.getValue().get());
        }
        return ServletScopes.continueRequest(callable, seedMap);
    }
}
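A hypothetical usage sketch (SomeRequestScopedService, the executor wiring and the BackgroundLauncher name are assumptions; only RequestScopePropagator comes from the code above): wrap the work while still on the request thread, then hand the returned Callable to an executor.
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;

import com.google.inject.Inject;
import com.google.inject.Provider;

public class BackgroundLauncher {

    @Inject private RequestScopePropagator propagator;
    @Inject private Provider<SomeRequestScopedService> serviceProvider;
    @Inject private ExecutorService executorService;

    // Must be called on the request thread so the propagator can seed the scope.
    public void launch() {
        Callable<Void> task = propagator.continueRequest(() -> {
            // Inside the continued request, the provider resolves against the
            // eagerly captured @RequestScoped instances.
            serviceProvider.get().doWork();
            return null;
        });
        executorService.submit(task);
    }
}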
I have faced the exact same problem but solved it in a different way. I use jOOQ in my projects and I have implemented transactions using a request scope object and an HTTP filter.
But then I created a background task which is spawned by the server in the middle of the night, and the injection does not work because there is no request scope.
Well, the solution is simple: create a request scope manually. Of course there is no HTTP request going on, but that's not the point (mostly); it is the concept of the request scope that matters. So I just need a request scope that exists alongside my background task.
Guice has an easy way to create a request scope: ServletScopes.scopeRequest.
public class MyBackgroundTask extends Thread {

    @Override
    public void run() {
        RequestScoper scope = ServletScopes.scopeRequest(Collections.emptyMap());
        try (RequestScoper.CloseableScope ignored = scope.open()) {
            doTask();
        }
    }

    private void doTask() {
        // the actual background work goes here
    }
}
Oh, and you will probably need some injections. Be sure to use providers there; you want to delay creation until you are inside the created scope.
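For example, a sketch of that provider advice (SomeRequestScopedService and doWork() are made-up names; the scoping pattern is the same as in the example above):
import java.util.Collections;

import com.google.inject.Inject;
import com.google.inject.Provider;
import com.google.inject.servlet.RequestScoper;
import com.google.inject.servlet.ServletScopes;

public class MyBackgroundTask extends Thread {

    private final Provider<SomeRequestScopedService> serviceProvider;

    @Inject
    public MyBackgroundTask(Provider<SomeRequestScopedService> serviceProvider) {
        this.serviceProvider = serviceProvider;
    }

    @Override
    public void run() {
        RequestScoper scope = ServletScopes.scopeRequest(Collections.emptyMap());
        try (RequestScoper.CloseableScope ignored = scope.open()) {
            // get() is only called once the scope is open, so the request-scoped
            // instance is created inside this manually created request scope.
            serviceProvider.get().doWork();
        }
    }
}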
In Guice 4 it is better to use ServletScopes.transferRequest(Callable).
I am new to Mockito as a concept. Can you please help me understand how to use Mockito for form handlers in ATG? Some examples would be appreciated.
There is a good answer (related to ATG) to another similar question: using-mockito-for-writing-atg-test-case. Please check whether it covers what you need.
Many ATG-specific components (and form handlers in particular) are known to be "less testable" (in comparison to components developed using a TDD/BDD approach), because the design of OOTB components (including the reference application) doesn't always adhere to the principle of having "low coupling and high cohesion".
Still, the generic approach is applicable to writing unit tests for all ATG components.
Below is a framework we've used for testing ATG FormHandlers with Mockito. Obviously you'll need to put in all the proper bits of the test but this should get you started.
public class AcmeFormHandlerTest {

    @Spy @InjectMocks private AcmeFormHandler testObj;

    @Mock private Validator<AcmeInterface> acmeValidatorMock;
    @Mock private DynamoHttpServletRequest requestMock;
    @Mock private DynamoHttpServletResponse responseMock;

    private static final String ERROR1_KEY = "error1";
    private static final String ERROR1_VALUE = "error1value";

    @BeforeMethod(groups = { "unit" })
    public void setUp() throws Exception {
        testObj = new AcmeFormHandler();
        initMocks(this);
    }

    // Test the happy path scenario
    @Test(groups = { "unit" })
    public void testWithValidData() throws Exception {
        testObj.handleUpdate(requestMock, responseMock);

        // Assume your form handler calls a helper method, then ensure the helper method is called
        // once. You verify the working of your helper method as you would in any unit test.
        Mockito.verify(testObj).update(Matchers.refEq(requestMock), Matchers.refEq(responseMock), Mockito.anyString(), (AcmeBean) Mockito.anyObject());
    }

    // Test a validation exception
    @Test(groups = { "unit" })
    public void testWithInvalidData() throws Exception {
        Map<String, String> validationMessages = new HashMap<String, String>();
        validationMessages.put(ERROR1_KEY, ERROR1_VALUE);
        when(acmeValidatorMock.validate((AcmeInterface) Mockito.any())).thenReturn(validationMessages);

        testObj.handleUpdate(requestMock, responseMock);

        assertEquals(1, testObj.getFormExceptions().size());
        DropletFormException exception = (DropletFormException) testObj.getFormExceptions().get(0);
        Assert.assertEquals(exception.getMessage(), ERROR1_VALUE);
    }

    // Test a runtime exception
    @Test(groups = { "unit" })
    public void testWithRunProcessException() throws Exception {
        doThrow(new RunProcessException("")).when(testObj).update(Matchers.refEq(requestMock), Matchers.refEq(responseMock), Mockito.anyString(), (AcmeBean) Mockito.anyObject());

        testObj.handleUpdate(requestMock, responseMock);

        assertEquals(1, testObj.getFormExceptions().size());
        DropletFormException exception = (DropletFormException) testObj.getFormExceptions().get(0);
        Assert.assertEquals(exception.getMessage(), GENERAL_ERROR_KEY);
    }
}
Obviously the above is just a framework that fits in nicely with the way in which we developed our form handlers. You can also add validation for redirects and the like if you choose:
Mockito.verify(responseMock, Mockito.times(1)).sendLocalRedirect(SUCCESS_URL, requestMock);
Ultimately the caveats of testing other people's code still apply.
Here's what I do when I unit test a form handler (at least until I manage to release a major update for AtgDust). Note that I don't normally use wildcard imports, so I'm not sure whether these cause any namespace conflicts.
import static org.mockito.Mockito.*;
import static org.mockito.MockitoAnnotations.initMocks;
import static org.junit.Assert.assertThat;
import static org.hamcrest.CoreMatchers.*;

import org.junit.*;
import org.junit.runner.RunWith;
import org.junit.runners.JUnit4;
import org.mockito.Mock;

import atg.servlet.*;
import some.form.handler.FormHandler;
@RunWith(JUnit4.class)
public class FormHandlerTest {

    @Mock DynamoHttpServletRequest request;
    @Mock DynamoHttpServletResponse response;

    FormHandler handler;

    @Before
    public void setup() {
        initMocks(this);
        handler = new FormHandler();
    }

    @Test
    public void testSubmitHandlerRedirects() throws Exception {
        handler.handleSubmit(request, response);
        verify(response).sendLocalRedirect(eq("/success.jsp"), eq(request));
        assertThat(handler.getFormError(), is(false));
    }
}
The basic idea is to set up custom behavior for mocks/stubs using when() on the mock object method invocation to return some test value or throw an exception, then verify() mock objects were invoked an exact number of times (in the default case, once), and do any assertions on data that's been changed in the form handler. Essentially, you'll want to use when() to emulate any sort of method calls that need to return other mock objects. When do you need to do this? The easiest way to tell is when you get NPEs or other runtime exceptions due to working with nulls, zeros, empty strings, etc.
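For instance, here is a tiny self-contained sketch of the when()/verify() pattern (PriceService is a made-up collaborator used only to illustrate the mechanics, not an ATG class):
import static org.mockito.Mockito.*;

public class WhenVerifyExample {

    // Made-up collaborator interface, only here to illustrate the pattern.
    interface PriceService {
        double priceFor(String sku);
    }

    public void example() {
        PriceService priceServiceMock = mock(PriceService.class);

        // Stub the call so it returns a test value instead of the default 0.0.
        when(priceServiceMock.priceFor("SKU-1")).thenReturn(9.99);

        // The code under test would call the collaborator here.
        priceServiceMock.priceFor("SKU-1");

        // Verify the collaborator was invoked exactly once with the expected argument.
        verify(priceServiceMock, times(1)).priceFor("SKU-1");
    }
}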
In an integration test, ideally, you'd be able to use a sort of in-between mock/test servlet that pretends to work like a full application server that performs minimal request/session/global scope management. This is a good use for Arquillian as far as I know, but I haven't gotten around to trying that out yet.