My question is regarding the count property on the Matcher class in Groovy. Following is an example:
import java.util.regex.Matcher
def p= /[a-z]*/
Matcher m= "abc" =~ p
println m.count
As you can see, Groovy actually uses the Matcher class from Java. According to the Javadoc, Matcher doesn't have any count property; in Java we have to count matches manually in a loop. How can Groovy do this? Is it documented somewhere?
Thanks,
A good start for this is Groovy Backstage, and especially Groovy Method Invocation. Basically, classes get a metaClass augmented with a common set of extension methods (e.g. DefaultGroovyMethods, see below, for Object). On method invocation Groovy takes these into account when deciding "what is there". count on Matcher is already a somewhat special case; a much more commonly used example on Object is println.
If you are interested only in the functionality itself, you can check the GDK documentation, e.g. for Matcher.getCount().
If you are interested in how it is actually implemented, a good start is always DefaultGroovyMethods or, in general, the descendants of DefaultGroovyMethodsSupport. Following this for Matcher.getCount() leads to https://github.com/groovy/groovy-core/blob/master/src/main/org/codehaus/groovy/runtime/StringGroovyMethods.java#L1508.
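To make that concrete, here is a minimal sketch (assuming a reasonably recent Groovy): the property access m.count is just sugar for the GDK extension method getCount(Matcher), contributed through the metaClass rather than by java.util.regex.Matcher itself.
import java.util.regex.Matcher

Matcher m = "abc abc" =~ /[a-z]+/
assert m.count == 2                              // m.count resolves to the GDK getCount(m)

// the metaClass knows about the extension method even though the JDK class does not
println m.metaClass.respondsTo(m, "getCount")    // a non-empty list if the extension method is registered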
I have a kotlin function of this form in an interface:
fun foo(bar: String, vararg baz: Pair<String, ByteArray>): Boolean
Using Mockito to mock this interface, how do I verify that this function was called with no Pairs?
It doesn't work to leave the second matcher off, because then Mockito complains that it needs two matchers.
Using any of the any*() matchers, including anyVararg(), fails because of typing.
A non-answer to give some inspiration:
Keep in mind that Mockito doesn't know or care what you are writing in some Kotlin source code file.
Mockito only deals with the compiled byte code. In other words: Mockito looks at the final class file created by the Kotlin compiler.
Thus your first stop should be javap, to disassemble the class file that contains that method definition. Check the signature of the method there; that should tell you how to specify the correct argument matchers to Mockito.
And just another idea: Java varargs translate to arrays, so "no" vararg arguments means an empty array. You probably want to match specifically on something like an empty array of Pairs.
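A hedged sketch of that idea in Kotlin (MyInterface is a stand-in name for your interface; exact behaviour can vary between Mockito versions): since the vararg compiles to a Pair[] parameter, one option is to skip matchers entirely and verify with concrete values, so that calling with zero vararg arguments means "called with an empty array".
import org.mockito.Mockito.mock
import org.mockito.Mockito.verify

interface MyInterface {
    fun foo(bar: String, vararg baz: Pair<String, ByteArray>): Boolean
}

fun main() {
    val m = mock(MyInterface::class.java)
    m.foo("bar")             // the call under test: no pairs passed
    verify(m).foo("bar")     // verifies foo("bar") with an empty vararg array
}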
I have a variable of this type: XSSFSheet sheet (I'm using Apache POI to read from Excel).
In the simple Groovy console I can do sheet[4][5] to access a certain cell by coordinates. When I try the same thing in IntelliJ, it gives me the exception:
No signature of method: org.apache.poi.xssf.usermodel.XSSFSheet.getAt() is applicable for argument types: (java.lang.Integer) values: [0]
Possible solutions: getAt(java.lang.String), getRow(int), putAt(java.lang.String, java.lang.Object), wait(), last(), first()
I looked in the reference and indeed, XSSFSheet can't be indexed by an int. But why, then, is it possible in the simple Groovy console that ships with Groovy? Can I do the same in IntelliJ?
This is old, and I'm only answering because I was wondering the same thing you asked in your last comment and found the answer while trying to understand.
As you mention in your last comment, the subscript operator in Groovy translates to the getAt() method, and as you say, the XSSFSheet class doesn't have such a method in Java.
Since it can indeed be called from Groovy, it does exist somewhere. With a bit of metaprogramming, we get the following:
def getAt = org.apache.poi.xssf.usermodel.XSSFRow.metaClass.getMetaMethod("getAt", [java.lang.Integer] as Class[])
println getAt
org.codehaus.groovy.runtime.dgm$240@e7edb54[name: getAt params: [int] returns: class java.lang.Object owner: interface java.lang.Iterable]
Which means it comes from the Iterable interface. And still, no such method exists in Java. But Groovy adds many methods to standard Java classes, and indeed we can see here that it adds a getAt() method to Iterable.
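A small sketch of that (plain Groovy, no POI needed, assuming a reasonably recent Groovy version): the GDK registers getAt(int) on java.lang.Iterable, so the subscript operator works on anything iterable, which is why sheet[4][5] resolves even though XSSFSheet has no getAt(int) of its own.
def s = new LinkedHashSet([10, 20, 30])   // a JDK Iterable with no getAt() in Java
assert s[1] == 20                         // translated to the GDK getAt(Iterable, int)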
This doesn't answer your original question (why doesn't it work in your IntelliJ? It should, and it does here), but it answers a question in the comments. I would have posted it there, but I lack reputation.
In Swift, some keywords are plain keywords, and others start with an @.
For instance, weak, unowned, inout, class are plain. But @final, @lazy start with @.
Sometimes we even have both! prefix and @prefix, infix and @infix, for instance.
It is not entirely an Objective-C inheritance, since we have @class and not class in Objective-C. I could understand why we have class and not @class in Swift, but since we have @final or @lazy, I would have thought that it should be @weak and not weak.
Why this choice? Is there some intuitive rule that would tell you: "hey, it is logical that this keyword starts with @"?
Even if I think with a preprocessor perspective in mind, it is not obvious that @ would invoke some kind of specific preprocessing before compilation (e.g. @final is not really a preprocessor directive).
@-prefixed items in Swift are not keywords; they are attributes.
Apple's book on Swift says that
Attributes provide more information about a declaration or type. There are two kinds of attributes in Swift, those that apply to declarations and those that apply to types.
Some attributes (such as @objc(isEnabled)) accept parameters.
The main difference between attributes and keywords is that keywords tell the compiler what you are defining (a class, a method, a property, a variable, and so on), while attributes tell the compiler in what contexts you intend to use that definition. For example, you would use the func keyword to tell the compiler that you are defining a function, and decorate that function with an @infix attribute to tell the compiler that you plan to use it as an infix operator.
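As a hedged illustration using attribute names from current Swift (the @final and @infix above come from the early Swift betas): keywords say what is being declared, attributes add information about the declaration.
@available(iOS 13.0, *)      // attribute: constrains where this declaration may be used
final class Counter {        // `class` is a keyword; `final` is a declaration modifier today
    private var value = 0

    @discardableResult       // attribute: callers may ignore the returned value
    func increment() -> Int {
        value += 1
        return value
    }
}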
I am trying to test an overridden toString() in Groovy (I know it is trivial, but that is what you get after reading Kent Beck's TDD book).
I use assertSame on the expected string and the actual one.
Here is the code block:
@Test void testToString(){
    def study = new Study(identifier:"default-study", OID:"S_DEFAULTS1", name:"Default Study")
    def expected = "org.foo.oc.model.bar(OID:S_DEFAULTS1, name:Default Study, identifier:default-study)"
    assertSame "Should be equal", expected, study.toString()
}
Here is the stack trace for the failed test:
junit.framework.AssertionFailedError: Should be equal expected same:org.foo.oc.model.bar(OID:S_DEFAULTS1, name:Default Study, identifier:default-study) was not:org.foo.oc.model.bar(OID:S_DEFAULTS1, name:Default Study, identifier:default-study)
at junit.framework.Assert.fail(Assert.java:47)
at junit.framework.Assert.failNotSame(Assert.java:273)
at junit.framework.Assert.assertSame(Assert.java:236)
Just to add that assertEquals works well with the same parameters.
I know it is no biggie but I want to understand why it fails.
Thanks
Why aren't you using assertEquals, which uses .equals()? assertSame compares object references (the == operator). Even though the strings have the same content, they are two different objects, hence the assertion fails.
UPDATE: This is a very common mistake in Java: String.equals() and the == operator work differently. This has been discussed several times:
How do I compare strings in Java?
Java String.equals versus ==
Difference between Equals/equals and == operator?
I know you are using Groovy which does not suffer this problem, but JUnit is written in Java and behaves according to the rules above.
UPDATE: actually, your strings are different:
org.foo.oc.model.bar(OID:S_DEFAULTS1, name:Default Study, identifier:default-study)
org.foo.oc.model.bar(OID:S_DEFAULTS1, name:Default Study, identifier:default-study)
Your original uses lowercase d in "default Study" but your expected string does not.
EDIT: when comparing strings you should always use equals() rather than comparing references. Two strings that pass the equals() test may or may not also be the same object.
BTW, in Groovy == is the same as equals().
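A small Groovy sketch of the difference (the string literal is just an example value): two strings with equal content need not be the same object, which is exactly the gap between assertEquals (equals()) and assertSame (identity).
def expected = "org.foo.oc.model.bar(OID:S_DEFAULTS1)"
def actual = new String(expected)   // same characters, different object
assert expected == actual           // Groovy ==  ->  equals(), so assertEquals passes
assert !expected.is(actual)         // identity check, which is what assertSame relies on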
(All in ActivePython 3.1.2)
I tried to change class (rather than instance) attributes. The __dict__ of the metaclass seemed like the perfect solution, but when I tried to modify it, I got:
TypeError: 'dict_proxy' object does not support item assignment
Why, and what can I do about it?
EDIT
I'm adding attributes inside the class definition.
setattr doesn't work because the class is not yet built, and hence I can't refer to it yet (or at least I don't know how).
The traditional assignment doesn't work because I'm adding a large number of attributes, whose names are determined by a certain rule (so I can't just type them out).
In other words, suppose I want class A to have attributes A.a001 through A.a999; and all of them have to be defined before it's fully built (since otherwise SQLAlchemy won't instrument it properly).
Note also that I made a typo in the original title: it's __dict__ of a regular class, not a metaclass, that I wanted to modify.
The creation of a large number of attributes following some rule smells like something is seriously wrong. I'd go back and see if there isn't a better way of doing it.
Having said that, here is "Evil Code" (but it'll work, I think):
class A:
    locals()['alpha'] = 1

print(A.alpha)
This works because, while the class is being defined, there is a dictionary that tracks the local variables you are defining. These local variables eventually become the class attributes. Be careful with locals, as it won't necessarily act "correctly": you aren't really supposed to be modifying locals, but it does seem to work when I tried it.
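Building on that trick, a hedged sketch for the rule-based attributes from the question (a001 through a999); relying on the class-body namespace like this is an implementation detail, so treat it as "evil code" too:
class A:
    _ns = vars()                   # the dict collecting the names of the class body
    for _i in range(1, 1000):
        _ns['a%03d' % _i] = None   # becomes class attribute a001 .. a999
    del _ns, _i                    # keep the helpers out of the finished class

print(A.a001, A.a999)              # -> None None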
Instead of using the declarative syntax, build the table separately and then use mapper() on it; see http://www.sqlalchemy.org/docs/05/ormtutorial.html#. I think there is just no good way to add computed attributes to a class while defining it.
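A hedged sketch of that classical-mapping idea (SQLAlchemy's older mapper() API, matching the 0.5-era tutorial linked above; table and column names are made up):
from sqlalchemy import MetaData, Table, Column, Integer, String
from sqlalchemy.orm import mapper

metadata = MetaData()

# build the columns by rule instead of typing them out
columns = [Column('id', Integer, primary_key=True)]
columns += [Column('a%03d' % i, String(50)) for i in range(1, 1000)]

a_table = Table('a', metadata, *columns)

class A(object):
    pass

mapper(A, a_table)   # instruments A with attributes a001 .. a999 after the table is built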
Alternatively, I don't know whether this will work, but:
class A(object):
    pass

A.all_my_attributes = values

class B(Base, A):   # Base being the result of declarative_base()
    pass
might possibly work.
I'm not too familiar with how Python 3 treats dict, but you might be able to circumvent this problem by simply inheriting from the dictionary class, like so:
class A(dict):
    def __init__(self, dict_of_args):
        self['key'] = 'myvalue'
        self.update(dict_of_args)
        # whatever else you need to do goes here...
An instance of A can then be used like so:
d = {1:2, 3:4}
obj = A(d)
print(obj['key'], obj[3])   # this will print myvalue and 4
Hope this helps.