How to add multiple outlets for a generated Xtext DSL

By default, the generated Xtext artifacts write the code produced from my DSL to the default outlet (which defaults to the src-gen folder). I know that you can explicitly pass the outlet configuration name in fsa.generateFile("myfile.txt", "MY_OUTLET_NAME", "Some file content").
I need this because I want to apply the generation gap pattern with my Xtext DSL and generate some code into a folder called "src-once".
I'm using Xtext 2.2.1.
My questions:
1) Where and how do I define my outlets like "MY_OUTLET_NAME"?
2) Is there a way to prevent overwriting existing files in a specific outlet?

The hint from Christian Dietrich pointed me in the right direction. Below is the code I ended up with.
I created a new class, MyOutputConfigurationProvider, that implements IOutputConfigurationProvider. Its getOutputConfigurations method returns two output configurations: the default src-gen and a custom src-gen-once with the settings needed to generate sources only once.
package com.my.dsl;

import static com.google.common.collect.Sets.newHashSet;

import java.util.Set;

import org.eclipse.xtext.generator.IFileSystemAccess;
import org.eclipse.xtext.generator.IOutputConfigurationProvider;
import org.eclipse.xtext.generator.OutputConfiguration;

public class MyOutputConfigurationProvider implements IOutputConfigurationProvider {

    public final static String DEFAULT_OUTPUT_ONCE = "DEFAULT_OUTPUT_ONCE";

    /**
     * @return a set of {@link OutputConfiguration} available for the generator
     */
    public Set<OutputConfiguration> getOutputConfigurations() {
        OutputConfiguration defaultOutput = new OutputConfiguration(IFileSystemAccess.DEFAULT_OUTPUT);
        defaultOutput.setDescription("Output Folder");
        defaultOutput.setOutputDirectory("./src-gen");
        defaultOutput.setOverrideExistingResources(true);
        defaultOutput.setCreateOutputDirectory(true);
        defaultOutput.setCleanUpDerivedResources(true);
        defaultOutput.setSetDerivedProperty(true);

        OutputConfiguration onceOutput = new OutputConfiguration(DEFAULT_OUTPUT_ONCE);
        onceOutput.setDescription("Output Folder (once)");
        onceOutput.setOutputDirectory("./src-gen-once");
        onceOutput.setOverrideExistingResources(false);
        onceOutput.setCreateOutputDirectory(true);
        onceOutput.setCleanUpDerivedResources(false);
        onceOutput.setSetDerivedProperty(true);

        return newHashSet(defaultOutput, onceOutput);
    }
}
To use the MyOutputConfigurationProvider implementation add a configure method to your module class:
/**
 * Use this class to register components to be used within the IDE.
 */
public class MyDslUiModule extends com.my.dsl.ui.AbstractMyDslUiModule {

    public MyDslUiModule(AbstractUIPlugin plugin) {
        super(plugin);
    }

    @Override
    public void configure(Binder binder) {
        super.configure(binder);
        binder.bind(IOutputConfigurationProvider.class).to(MyOutputConfigurationProvider.class).in(Singleton.class);
    }
}
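With those two configurations registered, the generator picks the outlet per file via the three-argument generateFile call mentioned in the question. A minimal sketch of that (the generator class and file names here are illustrative, not from the original post):

package com.my.dsl;

import org.eclipse.emf.ecore.resource.Resource;
import org.eclipse.xtext.generator.IFileSystemAccess;
import org.eclipse.xtext.generator.IGenerator;

public class MyDslGenerator implements IGenerator {

    @Override
    public void doGenerate(Resource resource, IFileSystemAccess fsa) {
        // Regenerated on every build, ends up in ./src-gen.
        fsa.generateFile("GeneratedBase.java", IFileSystemAccess.DEFAULT_OUTPUT, "// generated content");
        // Generated into ./src-gen-once and never overwritten afterwards.
        fsa.generateFile("HandEditedSubclass.java", MyOutputConfigurationProvider.DEFAULT_OUTPUT_ONCE, "// edit me");
    }
}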

Implementing a custom IOutputConfigurationProvider should do the trick.
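If the outlets are also needed outside the IDE, for example for standalone/headless generation, the same provider can presumably be bound in the runtime module as well, using Xtext's convention-based bind methods. A rough sketch (the class names follow the usual Xtext naming and may differ in your project):

public class MyDslRuntimeModule extends com.my.dsl.AbstractMyDslRuntimeModule {

    // Xtext picks up bind<Interface>() methods by naming convention.
    public Class<? extends IOutputConfigurationProvider> bindIOutputConfigurationProvider() {
        return MyOutputConfigurationProvider.class;
    }
}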

Related

JukitoRunner, bind mock of final class

How can I bind a mock of a final class in Jukito?
For example:
public final class SomeFinalClass {
    public SomeFinalClass(String someString) {
    }
}
// Testing class
@RunWith(JukitoRunner.class)
public class TestingClass {

    @Inject
    private SomeFinalClass someFinalClassMock;

    public static class TestModule extends JukitoModule {
        @Override
        protected void configureTest() {
            // bind(SomeClient.class).in(TestSingleton.class);
        }

        @Provides
        public SomeFinalClass getSomeFinalClass() {
            return Mockito.mock(SomeFinalClass.class); // throws error
        }
    }
}
Is there a way I can use PowerMockito with JukitoRunner?
You can mock a final class if you're using Mockito 2. From the Mockito 2 Wiki:
Mocking of final classes and methods is an incubating, opt-in feature. It uses a combination of Java agent instrumentation and subclassing in order to enable mockability of these types. As this works differently to our current mechanism and this one has different limitations and as we want to gather experience and user feedback, this feature had to be explicitly activated to be available; it can be done via the mockito extension mechanism by creating the file src/test/resources/mockito-extensions/org.mockito.plugins.MockMaker containing a single line: mock-maker-inline.
After you have created this file, Mockito will automatically use this new engine and one can do:
final class FinalClass {
    final String finalMethod() { return "something"; }
}
FinalClass concrete = new FinalClass();
FinalClass mock = mock(FinalClass.class);
given(mock.finalMethod()).willReturn("not anymore");
assertThat(mock.finalMethod()).isNotEqualTo(concrete.finalMethod());
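Tying this back to Jukito: assuming the mockito-extensions file is on the test classpath, the @Provides method from the question should then be able to return a plain Mockito mock of the final class. A rough sketch (class names taken from the question, the test body is hypothetical):

import static org.junit.Assert.assertNotNull;

import org.jukito.JukitoModule;
import org.jukito.JukitoRunner;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mockito;

import com.google.inject.Inject;
import com.google.inject.Provides;

@RunWith(JukitoRunner.class)
public class TestingClass {

    public static class TestModule extends JukitoModule {
        @Override
        protected void configureTest() {
            // No extra bindings needed for the mock itself.
        }

        @Provides
        public SomeFinalClass someFinalClass() {
            // No longer throws once the inline mock maker is active.
            return Mockito.mock(SomeFinalClass.class);
        }
    }

    @Inject
    private SomeFinalClass someFinalClassMock;

    @Test
    public void mockIsInjected() {
        assertNotNull(someFinalClassMock);
    }
}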

How to use JAXB with PropertyChangeSupport?

I am trying to use JAXB in an Eclipse project. View widgets are bound to model attributes with java.beans.PropertyChangeSupport. This works fine. I want to also bind model attributes to a persistent XML representation on disk with JAXB. I can marshal important state to XML and can unmarshal that back into a pojo/bean thing at runtime but am not sure how best to proceed.
The bean setters bound to my view widgets need to firePropertyChange(), but XJC generates only simple setters (this.value = value).
XJC properties are protected, so it looks like I could override the setters to firePropertyChange(), but I don't know how my overriding subclass could have its unmarshalled superclass magically change state at runtime (like when the user requests a report for a different year, which is when I would unmarshal a different XML file).
Is there an example or pattern for doing this? Surely it is not new. Many thanks. -d
@Adam Thanks! I grokked a workable solution with this:
public class MyBean extends JaxBean {

    public JaxBean getJaxBean() {
        return this;
    }

    public void setJaxBean(JaxBean jaxBean) {
        super.setThis(jaxBean.getThis());
        super.setThat(jaxBean.getThat());
        // etc...
    }

    public MyBean() {
        // etc...
    }
}
I think my confusion was thinking the unmarshalled bean would somehow magically replace my working instance. The solution above requires additional text but it works and the use of JaxBean's dumb setters avoids firing events unnecessarily when loading a new XML.
Your solution, annotating MyBean with JAXB and using schemagen, sounds even better. I will try that next go around. These are very nice technologies. -d
I mentioned another approach to your application in my comment.
It's what we use in our RCP application, except that we marshal/unmarshal over the network, so we use JAX-WS and not just JAXB.
I'm somewhat experienced with this kind of stack, so here's a kick-starter for you:
/**
 * Your UI POJOs should extend this class.
 */
public abstract class UIModel<T extends UIModel> {

    protected final PropertyChangeSupport propertyChangeSupport = new PropertyChangeSupport(this);

    /**
     * This comes in handy at times.
     */
    public void afterUnmarshal(Unmarshaller unmarshaller, Object parent) {
        //....
    }

    /**
     * And this too, trust me.
     */
    public void deepCopy(final T of) {
        // propertyChangeListener and IGNORED_ON_CLIENT are fields the author elided from this sketch.
        removePropertyChangeListener(propertyChangeListener);
        // copyProperties is from the Spring Framework but you can write your own.
        // Spring is a fat-ass payload for a Java SE application.
        BeanUtils.copyProperties(of, this, IGNORED_ON_CLIENT);
        addPropertyChangeListener(propertyChangeListener);
    }

    public void addPropertyChangeListener(PropertyChangeListener listener) {
        propertyChangeSupport.addPropertyChangeListener(listener);
    }

    public void addPropertyChangeListener(String propertyName, PropertyChangeListener listener) {
        propertyChangeSupport.addPropertyChangeListener(propertyName, listener);
    }

    public void removePropertyChangeListener(PropertyChangeListener listener) {
        propertyChangeSupport.removePropertyChangeListener(listener);
    }
}
/**
 * Example of a UI POJO.
 */
public class Car extends UIModel<Car> {

    private String make;
    private int numberOfWheels;
    //... etc.

    /**
     * Example of a setter.
     */
    public void setMake(String make) {
        propertyChangeSupport.firePropertyChange("make", this.make, this.make = make);
    }

    public String getMake() {
        return make;
    }

    //... etc.
}
I don't know how often your schema definition changes, but there's a pattern supporting this:
/**
 * New application (compiled with the class below) can open a file saved by the old application.
 */
public class Car2 extends Car {

    private String fuelType; // Example of a new field

    public void setFuelType(String fuelType) {
        propertyChangeSupport.firePropertyChange("fuelType", this.fuelType, this.fuelType = fuelType);
    }

    //... etc.
}
This way the old application can open XML output of the new one. Dropping a field from such a class's source code will result in a RuntimeException, as JAXB is still looking for it.
If your clients are always up to date, then you should not care about this at all.
When dealing with Java collections and excessive subclassing you will run into JAXB problems, which you can solve by Googling the @XmlRootElement and @XmlSeeAlso annotations.
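For illustration, that typically amounts to declaring the known subclasses on the base type so the JAXBContext can handle them. A minimal sketch (the class names below are made up, not from this answer):

import javax.xml.bind.annotation.XmlRootElement;
import javax.xml.bind.annotation.XmlSeeAlso;

// Registering subclasses up front avoids "class not known to this context" errors
// when a field or collection is declared with the base type.
@XmlRootElement
@XmlSeeAlso({Sedan.class, Truck.class})
public abstract class Vehicle {
    public String make;
}

@XmlRootElement
class Sedan extends Vehicle {
}

@XmlRootElement
class Truck extends Vehicle {
}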
Comments don't format, so I'm trying an "answer". I need to do the Stack Overflow tour. Continuing:
Thanks, Adam, I will bookmark these for future reference. They look similar to my example; the pattern is (unmarshal New, be quiet, copy New to Old, be noisy). I like the mind-bending recursion,
class UIModel<T extends UIModel>
class Car extends UIModel<Car>
and assume you've tested it compiles. ;)
Regards, -d.

Groovy AST - Adding annotations at compilation

I'm trying to dynamically create an annotation that will dynamically add an @XmlElement annotation to every field in a class using metaprogramming and AST transformations. I'm having problems creating the annotations and applying them to the fields properly.
The code I have is formatted here: http://pastebin.com/60DTX5Ya
import javax.xml.bind.annotation.XmlElement

import org.codehaus.groovy.ast.ASTNode
import org.codehaus.groovy.ast.AnnotationNode
import org.codehaus.groovy.ast.ClassHelper
import org.codehaus.groovy.ast.ClassNode
import org.codehaus.groovy.ast.FieldNode
import org.codehaus.groovy.control.CompilePhase
import org.codehaus.groovy.control.SourceUnit
import org.codehaus.groovy.transform.ASTTransformation
import org.codehaus.groovy.transform.GroovyASTTransformation

@GroovyASTTransformation(phase = CompilePhase.CANONICALIZATION)
class WebserviceAnnotationModifier implements ASTTransformation {
    @Override
    void visit(ASTNode[] astNodes, SourceUnit sourceUnit) {
        if (!astNodes) return
        if (!astNodes[0] || !astNodes[1]) return
        if (!(astNodes[0] instanceof AnnotationNode)) return
        if (!(astNodes[1] instanceof ClassNode)) return

        ClassNode node = (ClassNode) astNodes[1]
        List fields = node.getFields()
        fields.each { FieldNode field ->
            field.addAnnotation(ClassHelper.make(new XmlElement.DEFAULT()));
        }
    }
}
@Retention(RetentionPolicy.SOURCE)
@Target([ElementType.TYPE])
@GroovyASTTransformationClass(classes = [WebserviceAnnotationModifier])
public @interface WebresourceAnnotation {}

@WebresourceAnnotation
class TestPerson {
    String name;
    String lastName;
    int Age
}
Am I approaching this all wrong? The reason I do this is that I have a domain that is still in the making, and I'd like to just go in and apply the annotation to all fields. I couldn't find any examples of annotations being added during compilation. Is this not possible?
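For reference, a minimal sketch of how an annotation is usually attached in an AST transformation, wrapping the annotation type's ClassNode in an AnnotationNode rather than instantiating the annotation (written as plain Java here; the class name is made up, and this is only a sketch, not a verified fix for the code above):

import javax.xml.bind.annotation.XmlElement;

import org.codehaus.groovy.ast.ASTNode;
import org.codehaus.groovy.ast.AnnotationNode;
import org.codehaus.groovy.ast.ClassHelper;
import org.codehaus.groovy.ast.ClassNode;
import org.codehaus.groovy.ast.FieldNode;
import org.codehaus.groovy.control.CompilePhase;
import org.codehaus.groovy.control.SourceUnit;
import org.codehaus.groovy.transform.ASTTransformation;
import org.codehaus.groovy.transform.GroovyASTTransformation;

@GroovyASTTransformation(phase = CompilePhase.CANONICALIZATION)
public class XmlElementFieldTransformation implements ASTTransformation {

    @Override
    public void visit(ASTNode[] nodes, SourceUnit sourceUnit) {
        if (nodes == null || nodes.length < 2 || !(nodes[1] instanceof ClassNode)) {
            return;
        }
        ClassNode classNode = (ClassNode) nodes[1];
        for (FieldNode field : classNode.getFields()) {
            // Build an AnnotationNode from the annotation's ClassNode;
            // do not instantiate XmlElement.DEFAULT.
            field.addAnnotation(new AnnotationNode(ClassHelper.make(XmlElement.class)));
        }
    }
}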
Writing code using a Groovy AST transformation alone does not work with the Grails reloading mechanism. Here's a proper way to implement an AST transformation for a Grails app:
Your transformer class must extend AbstractGrailsArtefactTransformer.
Your transformer class must be annotated with @AstTransformer.
Your class must be put under org.codehaus.groovy.grails.compiler or one of its sub-packages. In my case I use org.codehaus.groovy.grails.compiler.zk and it's working fine.
Implement shouldInject() to match only the classes you want, in this case domain classes.
Override performInjection() and write your code there.
Pack your transformer and related classes into a .jar file, or the Grails compiler will not load it.

Is there a way to show partial class name in the testrunner for resharper?

I have created a set of tests that I have grouped together by using a partial class. Is there a way to get the partial class name to show up in the test runner? What I have is something like this:
File 1:
public partial class MyWrapperClass
{
    [TestClass]
    public class This_is_a_descriptive_scenario
    {
        [TestMethod]
        public void This_is_a_descriptive_scenario_outcome() { ... }
    }
}
File 2:
public partial class MyWrapperClass
{
    [TestClass]
    public class This_is_a_descriptive_scenario2
    {
        [TestMethod]
        public void This_is_a_descriptive_scenario2_outcome() { ... }
    }
}
When running tests like that in the built-in test runner in Visual Studio, I can see the result as MyWrapperClass+This_is_a_descriptive_test if I have added the class column to the test results. But when you run the tests in ReSharper's test runner, they are grouped by project and/or namespace and the class name, and I can't see anywhere that the tests are part of a partial class. Is that possible?
I don't think this is supported, although it might be possible to extend it. I can point you in the right direction if you're willing to go down that path.
The easiest solution would be to use namespaces for grouping instead of partial classes.
Hope this helps
Miguel
The name of a partial class is the same in either file... No, there's nothing that currently groups tests by test file name.

Tapestry 4: Asset Cache Control?

I use Tapestry 4, and whenever we push a release that changes any assets (image, style sheet, JS library), we get problems because users still have the old version of the asset in their browser cache. I'd like to set up some easy way to allow caching, but force a new asset download when we update the application. Simply disallowing caching entirely for assets is not an acceptable solution.
I couldn't see any existing mechanism for doing this, but I was figuring that there might be some way to tell Tapestry to add the build number to the URL, something like this:
http://www.test.com/path/to/the/asset/asset.jpg?12345
That way, every new build would make it look like a different asset to the end user.
Does Tapestry provide an easy way to solve the cache problem that I'm not aware of? If not, how would one go about modifying the URL generated by Tapestry? And how would the code responsible for doing that get the build number? (I could get the build number into a Spring bean, for example, but how would the new URL building mechanism get at it?)
After stewing about this problem for a long time, I eventually solved it myself. This solution assumes you have the tapestry-spring library in your project.
In my case, I have a Spring bean that contains some of my application's global properties:
package myapp;

public class AppProperties {

    private String build;

    public String getBuild() {
        return build;
    }

    public void setBuild(String build) {
        this.build = build;
    }

    // other properties
}
Declare this bean in your Spring configuration:
<bean id="appProperties" class="myapp.AppProperties">
<property name="build" value="#BUILD_NUMBER#"/>
</bean>
You can set up your Ant build script to replace #BUILD_NUMBER# with the actual number (see the Copy task in the Ant manual for details).
Now create a class that will wrap IAssets and tack the build number onto the URL:
package myapp;

import java.io.InputStream;

import org.apache.hivemind.Location;
import org.apache.hivemind.Resource;
import org.apache.tapestry.IAsset;

public class BuildAwareAssetWrapper implements IAsset {

    private IAsset wrapped;
    private String build;

    public BuildAwareAssetWrapper(IAsset wrapped, String build) {
        this.wrapped = wrapped;
        this.build = build;
    }

    public String buildURL() {
        return addParam(wrapped.buildURL(), "build", build);
    }

    public InputStream getResourceAsStream() {
        return wrapped.getResourceAsStream();
    }

    public Resource getResourceLocation() {
        return wrapped.getResourceLocation();
    }

    public Location getLocation() {
        return wrapped.getLocation();
    }

    private static String addParam(String url, String name, String value) {
        if (url == null) url = "";
        char sep = url.contains("?") ? '&' : '?';
        return url + sep + name + '=' + value;
    }
}
Next, we need to make Tapestry wrap all assets with our wrapper. The AssetSourceImpl class is responsible for providing IAsset instances to Tapestry. We'll extend this class and override the findAsset() method so that we can wrap the created assets with the wrapper class:
package myapp;

import java.util.Locale;

import org.apache.hivemind.Location;
import org.apache.hivemind.Resource;
import org.apache.tapestry.IAsset;
import org.apache.tapestry.asset.AssetSourceImpl;

public class BuildAwareAssetSourceImpl extends AssetSourceImpl {

    private AppProperties props;

    @Override
    public IAsset findAsset(Resource base, String path, Locale locale, Location location) {
        IAsset asset = super.findAsset(base, path, locale, location);
        return new BuildAwareAssetWrapper(asset, props.getBuild());
    }

    public void setAppProperties(AppProperties props) {
        this.props = props;
    }
}
Notice that the implementation has a setter which can accept our Spring bean. The last step is to get Tapestry to use BuildAwareAssetSourceImpl to create assets instead of AssetSourceImpl. We do this by overriding the corresponding service point in hivemodule.xml:
<!-- Custom asset source -->
<implementation service-id="tapestry.asset.AssetSource">
    <invoke-factory service-id="hivemind.BuilderFactory" model="singleton">
        <construct class="myapp.BuildAwareAssetSourceImpl">
            <set-object property="appProperties" value="spring:appProperties"/>
            <set-configuration property="contributions" configuration-id="tapestry.asset.AssetFactories"/>
            <set-service property="lookupAssetFactory" service-id="tapestry.asset.LookupAssetFactory"/>
            <set-service property="defaultAssetFactory" service-id="tapestry.asset.DefaultAssetFactory"/>
        </construct>
    </invoke-factory>
</implementation>
That's it. If you run your application and view the source for any page that uses an asset, you will see that the URL will have the new build parameter on it.
