How to exclude a package from ArchUnit tests?

In my root project I have two subprojects with the following package structure:
Project 1: com.app
Project 2: com.app.api
In Project 1 I have a class defined with ArchUnit rules, annotated like this:
@AnalyzeClasses(packages = "com.app")
public class ArchTests
The problem is that when I run that test, it analyzes everything from Project 2 as well. How can I exclude the package com.app.api?

You can use custom importOptions:
@AnalyzeClasses(packages = "com.app", importOptions = ExcludeApiImportOption.class)
where the ImportOption is basically a predicate that decides whether a Location should be imported or not:
class ExcludeApiImportOption implements com.tngtech.archunit.core.importer.ImportOption {
    @Override
    public boolean includes(Location location) {
        // import everything except classes whose location lies under com.app.api
        return !location.contains("com/app/api");
    }
}
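For context, here is a minimal sketch of how the test class could combine that import option with a rule (the rule and the package names in it are purely illustrative, not from the question):
import com.tngtech.archunit.junit.AnalyzeClasses;
import com.tngtech.archunit.junit.ArchTest;
import com.tngtech.archunit.lang.ArchRule;

import static com.tngtech.archunit.lang.syntax.ArchRuleDefinition.noClasses;

@AnalyzeClasses(packages = "com.app", importOptions = ExcludeApiImportOption.class)
public class ArchTests {

    // With the import option above, classes under com.app.api are never
    // imported, so no rule in this class will ever check them.
    @ArchTest
    static final ArchRule servicesShouldNotDependOnUi =
            noClasses().that().resideInAPackage("..service..")
                    .should().dependOnClassesThat().resideInAPackage("..ui..");
}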

Related

How to configure environment variable for particular cucumber feature file in eclipse?

I am using Cucumber in Java with JUnit and building the project with Maven. I want to use an environment variable that is accessible across my entire project. Is there any way to do it?
Your problem title says "for a particular cucumber feature file", but the description of your question says the environment variable should be accessible in your whole project.
Can you confirm whether you want this for a particular feature file or for the whole project?
If you want your environment variables to be accessible in the whole project with static values, you can do the following:
Create a singleton class with getter and setter methods to access those environment variables, as below:
import java.util.HashMap;

public class GlobalClass {
    private static GlobalClass ourInstance;
    private final HashMap<String, Object> sessionState = new HashMap<>();

    private GlobalClass() {
        // private constructor: instances are only created through getInstance()
    }

    public static GlobalClass getInstance() {
        if (ourInstance == null) {
            ourInstance = new GlobalClass();
        }
        return ourInstance;
    }

    public void add(String key, Object value) {
        sessionState.put(key, value);
    }

    public String getAsString(String key) {
        Object value = sessionState.get(key);
        return value == null ? null : value.toString();
    }
}
So you can add keys and values to this singleton class and access them from anywhere in the project.
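As a rough usage sketch (the class name, key, and environment variable below are made up for illustration), any step definition or hook can then write and read values through the singleton:
public class EnvironmentSteps {

    // Store a value once, e.g. in a @Before hook or at suite startup
    public void rememberBaseUrl() {
        GlobalClass.getInstance().add("BASE_URL", System.getenv("BASE_URL"));
    }

    // Read it back from any other step definition or helper in the project
    public void openApplication() {
        String baseUrl = GlobalClass.getInstance().getAsString("BASE_URL");
        System.out.println("Opening " + baseUrl);
    }
}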

JSF update managed bean with ServletContext listener for testing

In a JSF 2.2 application, I want to build a war file for testing with Selenium. In that webtest.war, I want to replace a central class, called the NodeCache, with a mock version, called the WebtestNodeCache, to keep the database and other external dependencies out of the tests.
NodeCache is a managed bean:
@javax.faces.bean.ManagedBean(name = NodeCache.INSTANCE)
@javax.faces.bean.ApplicationScoped
public class NodeCache {
    public static final String INSTANCE = "nodecache";
    // ...
}
To sneak in WebtestNodeCache, I use a ServletContextListener like this:
public class WebtestContextListener implements ServletContextListener {
    @Override
    public void contextInitialized(ServletContextEvent event) {
        WebtestNodeCache nodeCache = new WebtestNodeCache();
        ServletContext context = event.getServletContext();
        context.setAttribute(NodeCache.INSTANCE, nodeCache);
    }

    @Override
    public void contextDestroyed(ServletContextEvent sce) {}
}
In normal builds, WebtestContextListener and WebtestNodeCache are excluded from the war file; in test builds, they are included.
This seems to work: when I log in, I get dummy nodes from the WebtestNodeCache.
Is this a reliable way to replace a bean in application context or did I just get lucky?
Is there a better way to sneak in test dummies?
Using both a @ManagedBean annotation and a listener to replace the object did not work: the code always used the unmocked production managed bean.
Defining a new @ManagedBean with the same name is an error and prevents deployment.
I ended up with this:
Put the @ManagedBean annotation with the same name on both the real bean and its mock.
Only include the mocks when building the webtest.war, not in the regular build.
Have the build script (Gradle in my case) copy and filter the sources, looking for a special comment behind the @ManagedBean declaration in the production code and removing those lines, so that only the annotations on the mock remain.
So the original NodeCache looks like this now:
@javax.faces.bean.ManagedBean(name = NodeCache.INSTANCE) // webtest:remove
@javax.faces.bean.ApplicationScoped // webtest:remove
public class NodeCache {
    public static final String INSTANCE = "nodecache";
    // ...
}
and the mocked version has the same annotations, just without the comment:
@javax.faces.bean.ManagedBean(name = NodeCache.INSTANCE)
@javax.faces.bean.ApplicationScoped
public class WebtestNodeCache extends NodeCache {
    // ...
}
Here is the relevant part of the Gradle build script:
boolean isWebtest = false
gradle.taskGraph.whenReady { taskGraph ->
    isWebtest = taskGraph.hasTask(compileWebtestWarJava);
}

task copySrc(type: Copy) {
    from "src"
    into "${buildDir}/src"
    outputs.upToDateWhen {
        // Always execute this task so that resources do or don't get filtered
        // when switching between normal war file and webtests.
        false
    }
    filter { String line ->
        isWebtest && line.contains("webtest:remove") ? null : line;
    }
}
This solves the problem for me. Hope someone else finds it useful.

How to add multiple outlets for a generated Xtext DSL

By default, the generated Xtext artifacts generate code from my DSL into the default outlet (which defaults to the src-gen folder). I know that you can explicitly pass the outlet configuration name in fsa.generateFile("myfile.txt", "MY_OUTLET_NAME", "Some file content").
I need this because I want to generate code with my Xtext DSL using the generation gap pattern and generate code into a folder called "src-once".
I'm using Xtext 2.2.1.
My questions:
1) Where and how do I define my outlets like "MY_OUTLET_NAME"?
2) Is there a way to prevent overwriting existing files in a specific outlet?
The hint from Christian Dietrich pointed me in the right direction. Below is the code that I ended up with.
I have created a new class MyOutputConfigurationProvider that implements IOutputConfigurationProvider. The getOutputConfigurations method returns two output configurations, the default src-gen and a custom src-gen-once with the correct settings for generating sources only once.
package com.my.dsl;

import static com.google.common.collect.Sets.newHashSet;

import java.util.Set;

import org.eclipse.xtext.generator.IFileSystemAccess;
import org.eclipse.xtext.generator.IOutputConfigurationProvider;
import org.eclipse.xtext.generator.OutputConfiguration;

public class MyOutputConfigurationProvider implements IOutputConfigurationProvider {

    public final static String DEFAULT_OUTPUT_ONCE = "DEFAULT_OUTPUT_ONCE";

    /**
     * @return a set of {@link OutputConfiguration} available for the generator
     */
    public Set<OutputConfiguration> getOutputConfigurations() {
        OutputConfiguration defaultOutput = new OutputConfiguration(IFileSystemAccess.DEFAULT_OUTPUT);
        defaultOutput.setDescription("Output Folder");
        defaultOutput.setOutputDirectory("./src-gen");
        defaultOutput.setOverrideExistingResources(true);
        defaultOutput.setCreateOutputDirectory(true);
        defaultOutput.setCleanUpDerivedResources(true);
        defaultOutput.setSetDerivedProperty(true);

        OutputConfiguration onceOutput = new OutputConfiguration(DEFAULT_OUTPUT_ONCE);
        onceOutput.setDescription("Output Folder (once)");
        onceOutput.setOutputDirectory("./src-gen-once");
        onceOutput.setOverrideExistingResources(false);
        onceOutput.setCreateOutputDirectory(true);
        onceOutput.setCleanUpDerivedResources(false);
        onceOutput.setSetDerivedProperty(true);

        return newHashSet(defaultOutput, onceOutput);
    }
}
To use the MyOutputConfigurationProvider implementation, add a configure method to your module class:
/**
 * Use this class to register components to be used within the IDE.
 */
public class MyDslUiModule extends com.my.dsl.ui.AbstractMyDslUiModule {

    public MyDslUiModule(AbstractUIPlugin plugin) {
        super(plugin);
    }

    @Override
    public void configure(Binder binder) {
        super.configure(binder);
        binder.bind(IOutputConfigurationProvider.class).to(MyOutputConfigurationProvider.class).in(Singleton.class);
    }
}
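To actually write into the new outlet, pass its name to IFileSystemAccess from the generator. A minimal Java sketch, assuming the provider above is bound (file names and contents are just examples):
import org.eclipse.emf.ecore.resource.Resource;
import org.eclipse.xtext.generator.IFileSystemAccess;
import org.eclipse.xtext.generator.IGenerator;

public class MyDslGenerator implements IGenerator {

    @Override
    public void doGenerate(Resource resource, IFileSystemAccess fsa) {
        // regenerated on every build (default outlet, ./src-gen)
        fsa.generateFile("MyClassGen.java", IFileSystemAccess.DEFAULT_OUTPUT, "// generated content");
        // written only if it does not exist yet (./src-gen-once)
        fsa.generateFile("MyClass.java", MyOutputConfigurationProvider.DEFAULT_OUTPUT_ONCE, "// user-editable content");
    }
}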
Implementing a custom IOutputConfigurationProvider should do the trick.

Is there a way to show the partial class name in the test runner for ReSharper?

I have created a set of tests that I have grouped together by using a partial class. Is there a way to get the partial class name to show up in the test runner? What I have is something like:
File 1:
public partial class MyWrapperClass
{
    [TestClass]
    public class This_is_a_descriptive_scenario {
        [TestMethod]
        public void This_is_a_descriptive_scenario_outcome() { ... }
    }
}
File 2:
public partial class MyWrapperClass
{
    [TestClass]
    public class This_is_a_descriptive_scenario2 {
        [TestMethod]
        public void This_is_a_descriptive_scenario2_outcome() { ... }
    }
}
When running tests like that in the built-in test runner in Visual Studio, I can see the result as MyWrapperClass+This_is_a_descriptive_scenario if I have added the class column to the test results. But when you run the tests in ReSharper's test runner they are grouped by project and/or namespace and class name, and I can't see anywhere that the tests are part of a partial class. Is that possible?
I don't think this is supported, although it might be possible to extend it. I can point you in the right direction if you're willing to go down that path.
The easiest solution would be to use namespaces for grouping instead of partial classes.
Hope this helps
Miguel
The name of a partial class is the same in either file... No, there's nothing that currently groups tests by test file name.

Tapestry 4: Asset Cache Control?

I use Tapestry 4, and whenever we push a release that changes any assets (image, style sheet, JS library), we get problems because users still have the old version of the asset in their browser cache. I'd like to set up some easy way to allow caching, but force a new asset download when we update the application. Simply disallowing caching entirely for assets is not an acceptable solution.
I couldn't see any existing mechanism for doing this, but I was figuring that there might be some way to tell Tapestry to add the build number to the URL, something like this:
http://www.test.com/path/to/the/asset/asset.jpg?12345
That way, every new build would make it look like a different asset to the end user.
Does Tapestry provide an easy way to solve the cache problem that I'm not aware of? If not, how would one go about modifying the URL generated by Tapestry? And how would the code responsible for doing that get the build number? (I could get the build number into a Spring bean, for example, but how would the new URL building mechanism get at it?)
After stewing about this problem for a long time, I eventually solved it myself. This solution assumes you have the tapestry-spring library in your project.
In my case, I have a Spring bean that contains some of my application's global properties:
package myapp;

public class AppProperties {

    private String build;

    public String getBuild() {
        return build;
    }

    public void setBuild(String build) {
        this.build = build;
    }

    // other properties
}
Declare this bean in your Spring configuration:
<bean id="appProperties" class="myapp.AppProperties">
<property name="build" value="#BUILD_NUMBER#"/>
</bean>
You can set up your Ant build script to replace #BUILD_NUMBER# with the actual number (see the Copy task in the Ant manual for details).
Now create a class that will wrap IAssets and tack the build number onto the URL:
package myapp;

import java.io.InputStream;

import org.apache.hivemind.Location;
import org.apache.hivemind.Resource;
import org.apache.tapestry.IAsset;

public class BuildAwareAssetWrapper implements IAsset {

    private IAsset wrapped;
    private String build;

    public BuildAwareAssetWrapper(IAsset wrapped, String build) {
        this.wrapped = wrapped;
        this.build = build;
    }

    public String buildURL() {
        return addParam(wrapped.buildURL(), "build", build);
    }

    public InputStream getResourceAsStream() {
        return wrapped.getResourceAsStream();
    }

    public Resource getResourceLocation() {
        return wrapped.getResourceLocation();
    }

    public Location getLocation() {
        return wrapped.getLocation();
    }

    private static String addParam(String url, String name, String value) {
        if (url == null) url = "";
        char sep = url.contains("?") ? '&' : '?';
        return url + sep + name + '=' + value;
    }
}
Next, we need to make Tapestry wrap all assets with our wrapper. The AssetSourceImpl class is responsible for providing IAsset instances to Tapestry. We'll extend this class and override the findAsset() method so that we can wrap the created assets with the wrapper class:
package myapp;

import java.util.Locale;

import org.apache.hivemind.Location;
import org.apache.hivemind.Resource;
import org.apache.tapestry.IAsset;
import org.apache.tapestry.asset.AssetSourceImpl;

public class BuildAwareAssetSourceImpl extends AssetSourceImpl {

    private AppProperties props;

    @Override
    public IAsset findAsset(Resource base, String path, Locale locale, Location location) {
        IAsset asset = super.findAsset(base, path, locale, location);
        return new BuildAwareAssetWrapper(asset, props.getBuild());
    }

    public void setAppProperties(AppProperties props) {
        this.props = props;
    }
}
Notice that the implementation has a setter which can accept our Spring bean. The last step is to get Tapestry to use BuildAwareAssetSourceImpl to create assets instead of AssetSourceImpl. We do this by overriding the corresponding service point in hivemodule.xml:
<!-- Custom asset source -->
<implementation service-id="tapestry.asset.AssetSource">
    <invoke-factory service-id="hivemind.BuilderFactory" model="singleton">
        <construct class="myapp.BuildAwareAssetSourceImpl">
            <set-object property="appProperties" value="spring:appProperties"/>
            <set-configuration property="contributions" configuration-id="tapestry.asset.AssetFactories"/>
            <set-service property="lookupAssetFactory" service-id="tapestry.asset.LookupAssetFactory"/>
            <set-service property="defaultAssetFactory" service-id="tapestry.asset.DefaultAssetFactory"/>
        </construct>
    </invoke-factory>
</implementation>
That's it. If you run your application and view the source for any page that uses an asset, you will see that the URL will have the new build parameter on it.
