Alloy appears to have a bug when relations include unconstrained Strings. No instance is found for the following:
sig foo {
  bar: String,
  yak: Int
}
pred show[] { one f: foo | f.yak = 0 }
run show for 1
If we change this to bar: Int, Alloy finds an instance with an arbitrary value.
This long-standing bug thankfully has a workaround.
For things to work, you need to "implicitly declare" some String values by using them in a fact or a predicate.
As an example, the following signature fact will allow bar to take its value in {"a","b","c"} :
sig foo {
  bar: String,
  yak: Int
} {
  bar in "a" + "b" + "c"
}
You can also define a pool of strings to be used instance-wide, as follows:
fact stringPool {
  none != "a" + "b" + "c" + "d" + "e"
}
See:
Provide Alloy with a "pool" of custom Strings
Problem in generation of world in predicate
How to use String in Alloy?
and so on ...
Thanks for the bug report. Can you file an issue?
BTW, strings are not well supported in Alloy. In general it's best to avoid any concrete types unless you really need them, and do everything with abstract ones. Most uses of integers aren't necessary either.
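To sketch that advice (the Name atoms below are an assumption, not part of the original model), the String field can be replaced with an abstract signature, which the Analyzer handles without any workaround:

```alloy
-- Hedged sketch: model the string-valued field with ordinary atoms
-- instead of the built-in String type.
abstract sig Name {}
one sig A, B, C extends Name {}

sig foo {
  bar: Name,
  yak: Int
}

run { some f: foo | f.yak = 0 } for 3
```

Because Name atoms are generated by the Analyzer like any other signature, no "pool" fact is needed.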
Say we have
class Foo {}
Is there a way to obtain "Foo" from within the class?
Yes.
class Foo {
    say ::?CLASS.^name; # OUTPUT: Foo
}
Kaiepi's solution has its place, which I'll get to below, but also consider:
class Foo {
    say Foo.perl;  # Foo
    say OUR.WHO;   # Foo
    ::?PACKAGE     # see below
}
Foo.perl
This provides a simple answer to your literal question (though it ignores what you're really after, as explained in your comment below and as suggested by the metaprogramming tag and your use of the word "introspection").
OUR.WHO
I think this is typically more appropriate than ::?CLASS.^name for several reasons:
Looks less line-noisy.
Works for all forms of package, i.e. ones declared with the built-in declarators package, module, grammar, or role as well as class, and also custom declarators like actor, monitor, etc.
Will lead readers to mostly directly pertinent issues if they investigate OUR and/or .WHO in contrast to mostly distracting arcana if they investigate the ::?... construct.
::?CLASS vs ::?PACKAGE
OUR.WHO only works in a value grammatical slot, not a type grammatical slot. For the latter you need a suitable ::?... form, eg:
class Foo { has ::?CLASS $bar }
And these ::?... forms also work as values:
class Foo { has $bar = ::?CLASS }
So, despite their relative ugliness, they're more general in this particular sense. That said, if generality is at a premium then ::?PACKAGE goes one better because it works for all forms of package.
I'm trying to create a record where one of the fields has a type that is not exported, since it's using a smart constructor. Using the smart constructor as the type does not work.
Not in scope: type variable `domain'
Maybe there is a language extension that allows me to do this, or something similar?
Exporting the constructor along with the smart constructor would allow me to solve this problem, but that in turn creates the possibility of creating values that the smart constructor wouldn't allow.
The (non-working) code I have right now:
import Domain (domain) -- Domain is not exported, and domain is a smart constructor for Domain
data Rec = Rec
{ dint :: domain Int -- what do I do here? I want it to be `Domain Int` but `Domain` isn't exported.
...
}
The issue here is confusion between the concept of a type constructor and a data constructor. For brevity, I will illustrate the difference with an example.
data Foo a = Bar [a]
In the expression above, Foo is the type constructor and Bar is the data constructor. The key difference is that Foo is a value in Haskell's type space and Bar is a value in its data space. A value in type space cannot be used in data space, and vice versa. For example, the compiler would error at the following expressions.
someVariable :: Bar Int
someVariable = Foo [15]
The next expression, however, is completely valid.
someVariable :: Foo Int
someVariable = Bar [15]
Additionally, all type constructors must start with an upper case letter. Any types starting with a lower case letter will be considered type variables, not type constructors (the a in our definition above is an example of this).
The introduction of smart constructors adds another layer to this problem, but the key thing to understand is that smart constructors are data constructors, not type constructors. In your definition of Rec, you tried to use your smart constructor, domain, in the type declaration for the dint field. However, because domain is a data constructor, not a type constructor, and it is lowercase, the Haskell compiler tried to interpret domain as the name of a type variable. Because you never specified a variable named domain in your definition of the Rec type, the compiler raised an error.
You don't actually need to export the data constructor for Domain to solve the issue, just the type itself. This can be accomplished with the following.
module Domain (
Domain(), domain,
...
) where
Including Domain() in the export definition tells Haskell to export the Domain type constructor, but not any of its data constructors. This preserves the safety you wanted with the safe constructor, and allows you to define types correctly. You can now use your newly exported type in your definition of Rec.
import Domain (Domain(), domain)
data Rec = Rec
{ dint :: Domain Int
...
}
For more information, I strongly recommend you read the HaskellWiki articles on constructors and smart constructors.
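Putting the pieces together, here is a minimal self-contained sketch of such a Domain module (the MkDomain constructor name, the Maybe-returning validation, and the unDomain accessor are assumptions, since the original module isn't shown):

```haskell
-- A sketch of the smart-constructor pattern described above.
module Domain
  ( Domain()  -- export the type constructor, hide the data constructor
  , domain
  , unDomain
  ) where

-- MkDomain stays private to this module.
newtype Domain a = MkDomain a
  deriving Show

-- Smart constructor: the only way to build a Domain from outside.
-- The check here is a placeholder for whatever invariant Domain enforces.
domain :: a -> Maybe (Domain a)
domain x = Just (MkDomain x)

-- Accessor, since callers cannot pattern match on MkDomain.
unDomain :: Domain a -> a
unDomain (MkDomain x) = x
```

Importers can then write `Domain Int` in type signatures (as in Rec above) while only ever building values through domain.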
To illustrate the following example I created a little Spock test (but this is about Groovy itself, not Spock):
void "some spock test"() {
given: String value = null
expect: someMethod(value) == 3
}
int someMethod(String s) {
return 3
}
int someMethod(Map s) {
return 5
}
There are two methods whose signatures differ only in the type of their parameter. I thought that when I pass a null value that is explicitly typed as a String, the String method would be called.
But that doesn't happen; the test fails, because the Map method is called! Why?
I guess Groovy ignores the type and treats all nulls the same. There seems to be some kind of priority among types: when I use Object instead of Map as the parameter type of the wrong method, the result is the same, but when I use Integer, for instance, the test succeeds.
But then again: if Groovy really ignores the type of nulls, why does the following fix the original test?
expect: someMethod((String) value) == 3
If you read my answer to the question Tim already mentioned, you will see that I talk there about runtime types. The static type normally plays no role in this. I also described there how the distance calculation is used, and that for null the distance to Object is used to determine the best-fitting method. What I did not mention is that you can force method selection by using a cast. Internally, Groovy will use a wrapper for the object that also transports the type; the transported type is then used instead. But you surely understand that this means one additional object creation per method call, which is very inefficient, so it is not the default. In the future Groovy may change to include the static type information, but this requires a change to the MOP as well, and that is difficult.
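The behaviour described above can be condensed into a small Groovy script (the two overloads are taken from the question):

```groovy
// Overloads differing only in parameter type, as in the question.
int someMethod(String s) { 3 }
int someMethod(Map m)    { 5 }

String value = null
// The static type of `value` is ignored; the runtime distance calculation
// picks the Map overload for a plain null.
assert someMethod(value) == 5
// An explicit cast wraps the null together with its static type,
// forcing selection of the String overload.
assert someMethod((String) value) == 3
```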
I would like to be able to use plain java-style implicit/explicit casting instead of asType overrides so that sources written in Java work properly. I've overridden asType on String similarly to the approach suggested in How to overload some Groovy Type conversion for avoiding try/catch of NumberFormatException? like:
oldAsType = String.metaClass.getMetaMethod("asType", [Class] as Class[])
String.metaClass.asType = { Class typ ->
    if (Foo.class.isAssignableFrom(typ)) {
        Foo.myCast(delegate)
    } else {
        oldAsType.invoke(delegate, typ)
    }
}
I'd like all of these options to work:
// groovy
String barString
Foo foo = barString.asType(Foo.class) // asType works, but...
Foo foo = barString                   // implicit cast fails
Foo foo = (Foo) barString             // explicit cast fails
The latter two fail because groovy is using DefaultTypeTransformation.castToType, which doesn't attempt to invoke new Foo() unless the object to be cast is either one of a slew of special cases or is some sort of Collection type.
Note that the solution Can I override cast operator in Groovy? doesn't solve the issue because the code that is doing the casting is regular Java code that I cannot alter, at least not at the source code level. I'm hoping that there is either a secret hook into casting or a way to override the static castToType method (in a Java class, called by another Java class - which Can you use Groovy meta programming to override a private method on a Java class says is unsupported)... or some other clever approach I haven't thought of.
Edit: The question is about using Java-style casting syntax, essentially using Groovy facilities to add an autoboxing method. Groovy calls this mechanism "casting," for better or worse (see DefaultTypeTransformation.castToType, referenced above). In particular, I have replaced an enum with a resourced class and want to retain JSON serialization. Groovy's JSON package automatically un/marshals enum values of instance members to strings, and I'm trying to make the replacement class serialize compatibly with minimal changes to the source code.
Part of the problem here is you are confusing conversion with casting. Using the "as" operator is not the same thing as imposing a cast. They seem similar, but they serve separate purposes.
Foo foo = (Foo) barString
That doesn't say something like "create a Foo out of barString". That says "Declare a reference named foo, associate the static type Foo with that reference and then point that reference at the object on the heap that the reference barString currently points to.". Unlike languages like C++, Groovy and Java do not allow you to ever get in a situation where a reference points at an object that is of a type that is incompatible with the reference's type. If you ever got into a situation where a Foo reference was pointing to a String on the heap, that would represent a bug in the JVM. It cannot be done. You can come up with ways to create Foo objects out of String objects, but that isn't what the code above is about.
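To make that distinction concrete, here is a plain-Java sketch (the class names are hypothetical): the cast compiles, because an Object reference might point at a Foo, but at runtime it only reinterprets the reference and throws when the heap object is incompatible; it never converts anything.

```java
// A cast reinterprets a reference; it never converts the object on the heap.
public class CastDemo {
    static class Foo {}

    // Try to view the given object through a Foo reference and report what happens.
    static String tryCast(Object o) {
        try {
            Foo f = (Foo) o;  // compiles: an Object reference *might* point at a Foo
            return "cast succeeded";
        } catch (ClassCastException e) {
            return "ClassCastException";  // the heap object was not a Foo
        }
    }

    public static void main(String[] args) {
        Object o = "hello";                      // the heap object is a String
        System.out.println(tryCast(o));          // prints "ClassCastException"
        System.out.println(tryCast(new Foo()));  // prints "cast succeeded"
    }
}
```

Creating a Foo from a String is a separate conversion step (a constructor or factory call producing a new object), which is exactly what the "as" operator mechanism is for.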
The answer appears to be "no". Absent a rewrite of the DefaultTypeTransformation.castToType to allow for this sort of metaprogramming, the implication is to use another implementation strategy or use a different language.
I have a class diagram with numerous classes, some of them containing attributes of type string. I want all my strings to be of length at least 1.
The easy (yet ugly) solution is as follows:
context Class1
inv: self.attributeOfTypeString.size > 0
context Class2
inv: self.attributeOfTypeString.size > 0
...
Do you know a way to define an OCL constraint for all attributes matching a template? Something like:
global.select(attr | attr.TYPE = string) -> forall (str : string | str.size > 0)
Finally got an answer from somewhere else. I share it in case someone needs it someday.
There are three possible ways to solve the problem.
1°) The first one is to remember that multiple inheritance is allowed in UML. We can therefore make every class that has a string attribute inherit from a WithString class, and put the OCL constraint on that parent class. However, this makes the diagrams rather unreadable.
2°) Another possibility is to create a String class and store an instance of it in place of each string attribute. The drawback of this encapsulation solution is performance (a getter call for every string access).
3°) Finally, the cleanest solution in my opinion is the following: we can declare the OCL constraint at the meta level. In the class diagram describing class diagrams, we can simply state that all strings are non-empty.
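Option 1°) can be sketched in OCL as follows, assuming the shared string attribute is pulled up into the parent class (the names WithString and value are assumptions for illustration):

```ocl
-- Class1, Class2, ... inherit from WithString in the diagram,
-- so the single invariant covers all of them.
context WithString
inv NonEmptyString: self.value.size() > 0
```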