Subtracting Enum elements in Groovy

Let's assume we have an enum:
enum MyEnum
{
ONE, TWO
}
we can easily subtract MyEnum values from each other by using:
def a = MyEnum.values()
println (a - MyEnum.values()) //results: []
but if we try to use strong typing, we don't receive an empty list:
Collection<MyEnum> a = MyEnum.values()
println (a - MyEnum.values()) //results: [ONE, TWO]
What type should we use to properly subtract MyEnum values and why?

Given that the return type of E.values() is E[], the obvious answer would be
MyEnum[] a = MyEnum.values()
How two arrays are "subtracted" is defined in
org.codehaus.groovy.runtime.DefaultGroovyMethods:
public static <T> T[] minus(T[] self, Object[] removeMe) {
But if you are using something "collection-y", this one is used (e.g. for a Set here):
public static <T> Set<T> minus(Set<T> self, Object removeMe)
(Only the single given element is removed, not each of its elements.)
So if you want something "collection-y", you also have to turn the array into something iterable or "collection-y". E.g. this also works:
Set<MyEnum> a = MyEnum.values()
println (a - MyEnum.values().toList())
// → []
And if you want to be explicit, you might as well use a method like removeAll.
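For example, a removeAll-based variant could look like this (a sketch; note that, unlike minus, removeAll mutates the collection in place):
Set<MyEnum> a = MyEnum.values()
a.removeAll(MyEnum.values().toList())
println a // → []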

Related

Comparator vs Closure in min call in groovy?

I am trying to understand the difference between passing a Closure vs a Comparator to the min function on a collection:
// Example 1: Closure/field/attribute?
Sample min = container.min { it.timespan.start }
// Example 2: Comparator
Sample min2 = container.min(new Comparator<Sample>() {
@Override
int compare(Sample o1, Sample o2) {
return o1.timespan.start <=> o2.timespan.start
}
})
They both return the correct result.
Where:
class Sample {
TimeSpan timespan
static constraints = {
}
}
And:
class TimeSpan {
LocalDate start
LocalDate end
}
In Example 1 I just pass the field timespan.start to min, which I guess means that I am passing a Closure (even though it's just a field in a class)?
In Example 1, does Groovy convert the field timespan.start into a Comparator behind the scenes, like the one I create explicitly in Example 2?
The difference is that these are two different min methods, each taking different arguments: there is one that takes a closure and one that takes a comparator (there is also a third one using identity, and some deprecated ones, but we can ignore those for now).
In the first version (a Closure with one, possibly implicit, parameter) you extract from each passed element the value you want the minimum to be determined by. Therefore this version has some inner workings to deal with comparing those extracted values.
But the docs also state:
If the closure has two parameters it is used like a traditional
Comparator. I.e. it should compare its two parameters for order,
returning a negative integer, zero, or a positive integer when the
first parameter is less than, equal to, or greater than the second
respectively. Otherwise, the Closure is assumed to take a single
parameter and return a Comparable (typically an Integer) which is then
used for further comparison.
So you can also use the Closure version to do the same as your second example (you have to declare the two parameters explicitly):
container.min { a, b -> a.timespan.start <=> b.timespan.start }
And there is also a shorter version of your second example: with Groovy you can coerce a Closure to an interface. So this works too:
container.min({ a, b -> a.timespan.start <=> b.timespan.start } as Comparator)
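Putting it together, a small self-contained sketch (using plain numbers instead of the Sample class) of the three equivalent call styles:
def container = [5, 3, 9]

// one-parameter closure: extract the value to aggregate over
assert container.min { it } == 3

// two-parameter closure: used like a traditional Comparator
assert container.min { a, b -> a <=> b } == 3

// closure coerced to a Comparator
assert container.min({ a, b -> a <=> b } as Comparator) == 3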

Groovy: Constructor hash collision

I have the following groovy code:
def script
String credentials_id
String repository_path
String relative_directory
String repository_url
CredentialsWrapper(script, credentials_id, repository_name, repository_group, relative_directory=null) {
this(script, credentials_id, 'git@gitlab.foo.com:' + repository_group +'/' + repository_name + '.git', relative_directory);
}
CredentialsWrapper(script, credentials_id, repository_url, relative_directory=null) {
this.script = script;
this.credentials_id = credentials_id;
this.repository_url = repository_url;
if (null == relative_directory) {
int lastSeparatorIndex = repository_url.lastIndexOf("/");
int indexOfExt = repository_url.indexOf(".git");
this.relative_directory = repository_url.substring(lastSeparatorIndex+1, indexOfExt);
}
}
Jenkins gives me the following:
Unable to compile class com.foo.CredentialsWrapper due to hash collision in constructors @ line 30, column 7.
I do not understand why: the constructors are different, and they do not have the same number of arguments.
Also, "script" is an instance of "WorkflowScript", but I do not know what I should import to access this class, which would allow me to declare script explicitly instead of using "def".
Any idea?
When you call the constructor with four parameters, should the first or the second one be called?
If you write a constructor/method with default values, Groovy actually generates two or more versions.
So
Test(String x, String y ="test")
will result in
Test(String x, String y) {...}
and
Test(String x) { this(x, "test") }
So your code would compile to four constructors, but it contains the constructor with the signature
CredentialsWrapper(def, def, def, def)
twice.
If I understand your code correctly, you can omit the =null from the first constructor (or from both). The behaviour stays the same, but you get only three (or two) signatures, and you can then choose between the two variants by calling them with the right number of parameters.
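A sketch of the first option, dropping the =null from the five-parameter constructor so that only the URL-based constructor keeps a default:
CredentialsWrapper(script, credentials_id, repository_name, repository_group, relative_directory) {
    this(script, credentials_id, 'git@gitlab.foo.com:' + repository_group + '/' + repository_name + '.git', relative_directory)
}
CredentialsWrapper(script, credentials_id, repository_url, relative_directory = null) {
    // body unchanged: derives relative_directory from repository_url when it is null
}
This compiles to the three signatures (def, def, def, def, def), (def, def, def, def) and (def, def, def), none of which collide.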

Spock reusing generic closure

I have this code in Spock:
then:
1 * dao.getByValue(Something.ONE, _ as String) >> {Something smth, String value ->
return createSomething(smth).withValue(value).build()
}
It doesn't look exactly like that, but you get the point. I want to return an object based on arguments passed to the method, in the real version this object is loaded from database.
The point is that I have this call in a lot of places and it looks exactly the same everywhere. Could I somehow extract this closure and use it everywhere, like this:
then:
1 * dao.getByValue(Something.ONE, _ as String) >> Closures.makeSomething
I tried using IntelliJ's extract feature, but it went a bit crazy with the types; after I edited the types manually I got weird errors:
public static final Closure<Optional<Something>> makeSomething = { Something smth, String value ->
return createSomething(smth).withValue(value).build()
}
...
1 * dao.getByValue(Something.ONE, _ as String) >> makeSomething
org.codehaus.groovy.runtime.typehandling.GroovyCastException: Cannot cast object 'mypackage.MySpec$__clinit__closure1#1757cd72' with class 'mypackage.MySpec$__clinit__closure1' to class 'java.util.Optional'
Even that one didn't work, and I thought it would:
public static final Closure<Optional<Something>> makeSomething = { Something smth, String value ->
return createSomething(smth).withValue(value).build()
}
...
1 * dao.getByValue(Something.ONE, _ as String) >> {args -> makeSomething.call(args[0], args[1]) }
groovy.lang.MissingMethodException: No signature of method: mypackage.MySpec$__clinit__closure2.call() is applicable for argument types: (java.util.Arrays$ArrayList) values: [[mypackage.Something$$Lambda$6/1105423942#6f45df59, ...]]
I'm not good at Groovy or Spock in general, I'm just trying this out for now.
Edit:
Working code after @tim_yates' suggestion (the whole interaction is in the helper method):
then:
interaction {
somethingCall(2, Something.TWO)
somethingCall(3, Something.ONE)
}
}
private void somethingCall(int times, Something something) {
times * dao.getByValue(something, _ as String) >> { Something smth, String value ->
return createSomething(smth).withValue(value).build()
}
}
Not working code that I'd like (only the return value is in the helper method):
then:
2 * dao.getByValue(Something.TWO, _ as String) >> makeSomething
3 * dao.getByValue(Something.ONE, _ as String) >> makeSomething
}
public static final Closure<Optional<Something>> makeSomething = { Something smth, String value ->
return createSomething(smth).withValue(value).build()
}
If I simply inline each >> makeSomething and write its body there instead, then it works.
You have a conceptual problem here. You cannot just split the closure from the preceding code because if you look at it
dao.getByValue(something, _ as String) >> { Something smth, String value ->
return createSomething(smth).withValue(value).build()
}
you might notice that smth and value inside the closure get their values from getByValue(something, _ as String).
Now if you factor the inline closure out into a stand-alone closure instance, you lose that connection. First of all, >> makeSomething passes no parameters; secondly, you do not evaluate the makeSomething closure, i.e. on the right-hand side you do not get your Optional instance but a Closure instance. In order to evaluate the closure you have to call it with parameters, i.e. something like >> makeSomething(something, "dummy") would work. But this way you have to repeat the first getByValue parameter and make up a dummy for the second, unspecified one, because you have no easy way to refer to it other than introducing yet another closure like >> { Something smth, String value -> makeSomething(smth, value) }. But then you are not saving a lot of code.
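Spelled out, that contrived construct would look roughly like this (a sketch, reusing the static makeSomething closure from the question):
then:
2 * dao.getByValue(Something.TWO, _ as String) >> { Something smth, String value ->
    makeSomething(smth, value)
}
3 * dao.getByValue(Something.ONE, _ as String) >> { Something smth, String value ->
    makeSomething(smth, value)
}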
It is your decision whether this is nicer than somethingCall(2, Something.TWO) (I like it, actually) or whether you go for my contrived construct. What I cannot do for you is change the Groovy or Spock DSL syntax just because you would prefer it to look different.

Creating Map using withDefault causing null when putting element

I'm trying to use the Groovy way of creating a TreeMap<String, List<Data>> with default values so I easily add data to a new list if the key isn't already present.
TreeMap<String, List<Data>> myData = (TreeMap<String, List<Data>>) [:].withDefault { [] }
As you can see, I have the requirement to use a TreeMap and withDefault only returns a Map instance, so I need to cast.
When I attempt to add a new list to the map,
myData[newKey].add(newData)
myData[newKey] is null. However, if I change my Map initialization to remove the TreeMap cast (and change the type to just Map instead of TreeMap), myData[newKey].add(newData) works as expected.
What's the reasoning for this? Can I not use withDefault if I cast the map?
The problem isn't just about the cast. It also has to do with the declared type. The problem can be simplified to something like this:
def map1 = [:].withDefault { 0 }
TreeMap map2 = map1
When that is executed map1 is an instance of groovy.lang.MapWithDefault and map2 is an instance of java.util.TreeMap. They are 2 separate objects on the heap, not just 2 references pointing to the same object. map2 will not have any default behavior associated with it. It is as if you had done this:
def map1 = [:].withDefault { 0 }
TreeMap map2 = new TreeMap(map1)
That is what is happening with your code. The cast and the generics just make it less obvious.
This:
TreeMap<String, List<Data>> myData = (TreeMap<String, List<Data>>) [:].withDefault { [] }
Can be broken down to this:
def tmpMap = [:].withDefault { [] }
TreeMap<String, List<Data>> myData = (TreeMap<String, List<Data>>)tmpMap
I hope that helps.
EDIT:
Another way to see the same thing happening is to do something like this:
Set names = new HashSet()
ArrayList namesList = names
When the second line executes, a new ArrayList is created, as if you had done ArrayList namesList = new ArrayList(names). That looks different from what you have in your code, but the same sort of thing is happening: you have a reference with a static type, you point that reference at an object of a different type, and Groovy creates an instance of the declared type. In this simple example the declared type is ArrayList; in your example it is TreeMap<String, List<Data>>.
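If you need both the TreeMap ordering and the default values, one option is to call withDefault directly on a TreeMap instance and keep the declared type as Map so that no copying conversion kicks in (a sketch, reusing the Data/newKey/newData names from the question):
// the wrapper delegates to the TreeMap, so the keys stay sorted
Map<String, List<Data>> myData = new TreeMap<String, List<Data>>().withDefault { [] }
myData[newKey].add(newData)   // a missing key gets a fresh empty list first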

Groovy named and default arguments

Groovy supports both default and named arguments. I just don't see them working together.
I need some classes to support construction using simple non named arguments, and using named arguments like below:
def a1 = new A(2)
def a2 = new A(a: 200, b: "non default")
class A extends SomeBase {
def props
A(a=1, b="str") {
_init(a, b)
}
A(args) {
// use the values in the args map:
_init(args.a, args.b)
props = args
}
private _init(a, b) {
}
}
Is it generally good practice to support both at the same time? Is the above code the only way to it?
The given code will cause some problems. In particular, it'll generate two constructors with a single Object parameter. The first constructor generates bytecode equivalent to:
A() // a,b both default
A(Object) // a set, b default
A(Object, Object) // pass in both
The second generates this:
A(Object) // accepts any object
You can get around this problem by adding some types. Even though Groovy has dynamic typing, the type declarations in methods and constructors still matter. For example:
A(int a = 1, String b = "str") { ... }
A(Map args) { ... }
As for good practices, I'd simply use one of the groovy.transform.Canonical or groovy.transform.TupleConstructor annotations. They will provide correct property map and positional parameter constructors automatically. TupleConstructor provides the constructors only; Canonical also applies some other best practices with regard to equals, hashCode, and toString.
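For illustration, something along these lines covers both call styles (a sketch; typing the properties keeps the generated tuple constructors from being mistaken for a single-Map call):
import groovy.transform.TupleConstructor

@TupleConstructor
class A {
    int a = 1
    String b = "str"
}

// positional: @TupleConstructor with defaults generates A(), A(int) and A(int, String)
def a1 = new A(2)
assert a1.a == 2 && a1.b == "str"

// named arguments: no tuple constructor takes a Map, so Groovy falls back to
// the no-arg constructor followed by the property setters
def a2 = new A(a: 200, b: "non default")
assert a2.a == 200 && a2.b == "non default"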
