Odd ambiguous type variable error message for code in "where" statement with TypeFamilies extension

Does anyone know why this code fails?
{-# LANGUAGE NoMonomorphismRestriction,
             TypeFamilies #-}
module Test where

asExprTyp :: Expr γ =>
             γ α
          -> α
          -> γ α
asExprTyp x _ = x

int = undefined :: Integer

class Expr γ where
    a :: γ α

-- this works fine
b = a `asExprTyp` int

-- this fails
mcode = do
    return ()
  where b = a `asExprTyp` int
The error is as follows,
Test.hs:23:15:
Ambiguous type variable `γ0' in the constraint:
(Expr γ0) arising from a use of `a'
Probable fix: add a type signature that fixes these type variable(s)
In the first argument of `asExprTyp', namely `a'
In the expression: a `asExprTyp` int
In an equation for `b': b = a `asExprTyp` int
Failed, modules loaded: none.

I don't see what ghc complains about either. I thought it might be because it's trying to give the local binding a monomorphic type, but adding NoMonoLocalBinds to the language pragma didn't change anything.
However, the code compiles as is with a recent HEAD (7.3.20111026). Together with the fact that it compiles without TypeFamilies enabled, this supports the bug hypothesis.
If it's a real problem you have to solve: adding type signatures makes GHC happy.
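For example (just a sketch of one possible fix, using the code above), giving the where-bound b an explicit signature resolves the ambiguity:

mcode :: Monad m => m ()
mcode = do
    return ()
  where
    b :: Expr γ => γ Integer
    b = a `asExprTyp` int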

Okay, this is somewhat over my head since I've never used type families in Haskell. However, your example isn't actually using type families either, so I thought I'd see what happens when I remove the TypeFamilies language extension from the LANGUAGE pragma. Turns out: it compiles just fine then! :)
So it could very well be a GHC bug.
That being said, I poked at it a bit and noticed that the following compiles happily with TypeFamilies enabled:
mcode = do
    b
    return ()
  where b = a `asExprTyp` int
This is probably nonsensical, since its inferred type is mcode :: (Expr m, Monad m) => m (), rather than just mcode :: Monad m => m (), but my point is that GHC seems happy only when a's type is tied in some way to mcode's type.
Not sure if this is helpful, but it definitely piqued my curiosity!

Related

Apply constraint within constraint in Haskell

Is there any way to apply a constraint within another constraint such that this
{-# LANGUAGE ConstraintKinds #-}
{-# LANGUAGE KindSignatures #-}
module Test where
type Con a = (Num a, Show a)
type App c a b = (c a, c b)
program :: App Con a b => a -> b -> String
program a b = show a ++ " " ++ show (b+1)
will work?
Currently GHC is giving me the following errors:
[1 of 1] Compiling Test ( Test.hs, interpreted )
Test.hs:9:12: error:
• Expected a constraint, but ‘App Con a b’ has kind ‘*’
• In the type signature: program :: App Con a b => a -> b -> String
|
9 | program :: App Con a b => a -> b -> String
| ^^^^^^^^^^^
Test.hs:9:16: error:
• Expected kind ‘* -> *’, but ‘Con’ has kind ‘* -> Constraint’
• In the first argument of ‘App’, namely ‘Con’
In the type signature: program :: App Con a b => a -> b -> String
|
9 | program :: App Con a b => a -> b -> String
| ^^^
Failed, no modules loaded.
Thanks!
An easy way to fix this is to use the LiberalTypeSynonyms extension. This extension allows GHC to first treat the type synonyms as substitutions and only afterwards check that the synonyms are fully applied. Note that GHC can be a little silly at kind inference, so you'll need to be very clear with it (i.e., an explicit signature). Try this:
{-# LANGUAGE ConstraintKinds #-}
{-# LANGUAGE KindSignatures #-}
{-# LANGUAGE LiberalTypeSynonyms #-}
module Test where
import Data.Kind (Constraint)
type Con a = (Num a, Show a)
type App c a b = (c a, c b) :: Constraint
program :: App Con a b => a -> b -> String
program a b = show a ++ " " ++ show (b+1)
Before I understood that this could be solved with LiberalTypeSynonyms, I had a different solution, which I'll keep here in case anyone's interested.
Although the error message you're getting is a bit misleading, the fundamental problem with your code comes down to the fact that GHC does not support partial application of type synonyms, which you have in App Con a b. There are a few ways to fix this, but I find the simplest is to convert the type synonym constraint into a class constraint following this pattern:
{-# LANGUAGE ConstraintKinds #-}
{-# LANGUAGE FlexibleInstances #-}
{-# LANGUAGE UndecidableInstances #-}

type Con' a = (Num a, Show a)

class Con' a => Con a
instance Con' a => Con a
You can use this definition of Con anywhere you were intending to use your old one.
If you're interested in how/why this works, it's basically a trick to get around GHC's lack of support for partial type synonym/family application for the particular cases where those type synonyms/families define simple constraints.
What we're doing is defining a class, and every class comes with a constraint of the same name. Now, notice that the class has no body, but critically, the class itself has a constraint (in the above case Con' a), which means that every instance of the class must have that same constraint.
Next, we make an incredibly generic instance of Con, one that covers any type so long as the constraint Con' holds on that type. In essence, this assures that any type that is an instance of Con' is also an instance of Con, and the Con' constraint on the Con class instance assures that GHC knows that anything that's an instance of Con also satisfies Con'. In total, the Con constraint is functionally equivalent to Con', but it can be partially applied. Success!
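Putting it all together, a minimal sketch of the whole approach (the module layout is just for illustration) might look like this:

{-# LANGUAGE ConstraintKinds #-}
{-# LANGUAGE FlexibleInstances #-}
{-# LANGUAGE UndecidableInstances #-}
module Test where

type Con' a = (Num a, Show a)

-- Con is a partially applicable stand-in for Con'
class Con' a => Con a
instance Con' a => Con a

type App c a b = (c a, c b)

program :: App Con a b => a -> b -> String
program a b = show a ++ " " ++ show (b+1)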
As another side note, the GHC proposal for unsaturated type families was recently accepted, so there may be a not-too-far-off future where these tricks are unnecessary because partial application of type families becomes allowed.
Haskell does not support type-level lambdas, nor partial application of type families / type synonyms. Your Con must always be fully applied; it cannot be passed unapplied to another type synonym.
At best, we can try to use "defunctionalization" as follows, effectively giving names to the type-level lambdas we need.
{-# LANGUAGE ConstraintKinds, KindSignatures, TypeFamilies #-}
import Data.Kind
-- Generic application operator
type family Apply f x :: Constraint
-- A name for the type-level lambda we need
data Con
-- How it can be applied
type instance Apply Con x = (Show x, Num x)
-- The wanted type-level function
type App c a b = (Apply c a, Apply c b)
-- Con can now be passed since it's a name, not a function
program :: App Con a b => a -> b -> String
program a b = show a ++ " " ++ show (b+1)
To call App with a different first argument, one would need to repeat this technique: define a custom dummy type name (like Con) and describe how to apply it (using type instance Apply ... = ...).
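For instance, a second defunctionalized constraint (OnlyShow is a made-up name, purely for illustration) would be wired up the same way, alongside the definitions above:

-- another "name" for a type-level lambda
data OnlyShow
type instance Apply OnlyShow x = Show x

describe :: App OnlyShow a b => a -> b -> String
describe a b = show a ++ " " ++ show b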

Using a default implementation of typeclass method to omit an argument

I want to be able to define a (multi-parameter) typeclass instance whose implementation of the class's method ignores one of its arguments. This can be easily done as follows.
instance MyType MyData () where
    specific _ a = f a
As I'm using this pattern in several places, I tried to generalize it by adding a specialized class method and adequate default implementations. I came up with the following.
{-# LANGUAGE MultiParamTypeClasses, AllowAmbiguousTypes #-}
{-# LANGUAGE ScopedTypeVariables #-}

class MyType a b where
    specific :: b -> a -> a
    specific = const dontCare

    dontCare :: a -> a
    dontCare = specific (undefined :: b)

    {-# MINIMAL specific | dontCare #-}
This however yields the error Could not deduce (MyType a b0) arising from a use of ‘dontCare’ [..] The type variable ‘b0’ is ambiguous. I don't see why the latter should be the case with the type variable b being scoped from the class signature to the method declaration. Can you help me understand the exact problem that arises here?
Is there another reasonable way to achieve what I intended, namely to allow such trimmed instances in a generic way?
The problem is in the default definition of specific. Let's zoom out for a second and see what types your methods are actually given, based on your type signatures.
specific :: forall a b. MyType a b => b -> a -> a
dontCare :: forall a b. MyType a b => a -> a
In the default definition of specific, you use dontCare at type a -> a. So GHC infers that the first type argument to dontCare is a. But nothing constrains its second type argument, so GHC has no way to select the correct instance dictionary to use for it. This is why you ended up needing AllowAmbiguousTypes to get GHC to accept your type signature for dontCare. The reason these "ambiguous" types are useful in modern GHC is that we have TypeApplications to allow us to fix them. This definition works just fine:
{-# LANGUAGE MultiParamTypeClasses, AllowAmbiguousTypes #-}
{-# LANGUAGE ScopedTypeVariables, TypeApplications #-}

class MyType a b where
    specific :: b -> a -> a
    specific = const (dontCare @_ @b)

    dontCare :: a -> a
    dontCare = specific (undefined :: b)

    {-# MINIMAL specific | dontCare #-}
The type application specifies that the second type argument is b. You could fill in a for the first one, but GHC can actually figure that out just fine.
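As a quick sanity check, here is a hypothetical instance (the Int/() choice and the (+ 1) body are made up) that defines only dontCare and inherits the argument-ignoring specific:

instance MyType Int () where
    dontCare = (+ 1)

example :: Int
example = specific () 41  -- 42; the () argument is ignored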

Resolving type ambiguities using available class instances

Given the following code:
import Data.Word
data T = T deriving (Eq, Show)
class C a where f :: a -> ()
instance C T where f _ = ()
instance C Word16 where f _ = ()
main = return $ f 0x16
GHC complains that it can't infer what the type for the literal 0x16 should be with the error:
No instance for (Num a0) arising from the literal ‘22’
The type variable ‘a0’ is ambiguous
It is easy to see why this would be -- Haskell allows numeric literals to be of any type which has an instance of Num, and here we can't disambiguate what the type for the literal 0x16 (or 22) should be.
It's also clear as a human reading this what I intended to do -- there is only one available instance of the class C which satisfies the Num constraint, so obviously I intended to use that one, and 0x16 should be treated as a Word16.
There are two ways that I know to fix it: Either annotate the literal with its type:
main = return $ f (0x16 :: Word16)
or define a function which essentially does that annotation for you:
w16 x = x :: Word16
main = return $ f (w16 0x16)
I have tried a third way, sticking default (Word16) at the top of the file in the hope that Haskell would pick that as the default type for numeric literals, but I guess I'm misunderstanding what the default keyword is supposed to do because that didn't work.
I understand that typeclasses are open, so just because you can make the assumption in the context quoted above that Word16 is the only numeric instance of C that may not hold in some other module. But my question is: is there some mechanism by which I can assume/enforce that property, so that it is possible to use f and have Haskell resolve the type of its numeric argument to Word16 without explicit annotations at the call site?
The context is that I am implementing an EDSL, and I would rather not have to include manual type hints when I know that my parameters will either be Word16 or some other non-numeric type. I am open to a bit of dirty types/extensions abuse if it makes the EDSL feel more natural! Although if solutions do involve the naughty pragmas I'd definitely appreciate hints on what I should be wary about when using them.
Quick solution with "naughty pragmas" with GHC 7.10:
{-# LANGUAGE TypeFamilies, FlexibleInstances #-}
class C a where f :: a -> ()
instance C T where f _ = ()
instance {-# INCOHERENT #-} (w ~ Word16) => C w where f _ = ()
And with GHC 7.8:
{-# LANGUAGE TypeFamilies, FlexibleInstances, IncoherentInstances #-}
class C a where f :: a -> ()
instance C T where f _ = ()
instance (w ~ Word16) => C w where f _ = ()
Here, GHC essentially picks an arbitrary most specific instance that remains after trying to unify the instance heads and constraints.
You should only use this if
You have a fixed set of instances and don't export the class.
For all use cases of the class method, there is a single possible most specific instance (given the constraints).
Many people advise against ever using IncoherentInstances, but I think it can be quite fun for DSL-s, if we observe the above considerations.
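For reference, here is a self-contained sketch of the 7.10-style version (reusing T and C from the question) that I would expect to compile:

{-# LANGUAGE TypeFamilies, FlexibleInstances #-}
import Data.Word

data T = T deriving (Eq, Show)

class C a where f :: a -> ()
instance C T where f _ = ()
instance {-# INCOHERENT #-} (w ~ Word16) => C w where f _ = ()

main :: IO ()
main = return $ f 0x16  -- 0x16 is now inferred as Word16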
For anybody else wondering about default (I know I was!)
https://www.haskell.org/onlinereport/haskell2010/haskellch4.html#x10-750004.3
Quoting section 4.3.4:
In situations where an ambiguous type is discovered, an ambiguous type variable, v, is defaultable if:
v appears only in constraints of the form C v, where C is a class, and
at least one of these classes is a numeric class, (that is, Num or a subclass of Num), and
all of these classes are defined in the Prelude or a standard library.
So that explains why your default clause is being completely ignored; C is not a standard library type-class.
(As to why this is the rule… can't help you there. Presumably to avoid breaking arbitrary user-defined code.)
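To see the rule in action, here is a sketch of a case where defaulting does kick in, because Show and Num are both standard classes:

import Data.Word

default (Word16)

-- The constraints on the literal are (Show a, Num a), all standard,
-- so the default list applies and 0x16 resolves to Word16.
main :: IO ()
main = print 0x16  -- prints 22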

Type inference in Haskell and Arrows

I'm trying to use arrows and ran into an annoying problem: I have to provide explicit types for all the functions I implement.
If I don't provide them, GHC outputs an error like
No instance for (Arrow a0) arising from a use of ‘...’
The type variable ‘a0’ is ambiguous
I can provide explicit types, but it's VERY annoying: every time I change some function, I may have to manually alter the types of every function that depends on the one I changed.
Is it possible to force ghc to infer function types automatically?
Trivial case
import Control.Arrow
ss = arr
causes
No instance for (Arrow a0) arising from a use of ‘arr’
The type variable ‘a0’ is ambiguous
Relevant bindings include
ss :: (b -> c) -> a0 b c (bound at src/Main.hs:62:1)
Note: there are several potential instances:
instance Arrow Coroutine -- Defined at src/Main.hs:33:10
instance Arrow (->) -- Defined in ‘Control.Arrow’
instance Monad m => Arrow (Kleisli m) -- Defined in ‘Control.Arrow’
In the expression: arr
In an equation for ‘ss’: ss = arr
while code with exactly the same semantics
import Control.Arrow
ss :: forall a b c. (Arrow a) => (b -> c) -> a b c
ss = arr
compiles just fine.
Easiest thing is to turn off the monomorphism restriction - put this at the top of your source file:
{-# LANGUAGE NoMonomorphismRestriction #-}
The reason for your error is that although Haskell can infer the type of ss fine, the monomorphism restriction requires that in a top-level definition of a value, the type is not polymorphic over a type class (e.g. Arrow) unless there is an explicit type signature.
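With the pragma in place, the trivial case from the question should compile unchanged and get the inferred polymorphic type:

{-# LANGUAGE NoMonomorphismRestriction #-}
import Control.Arrow

-- GHC now infers ss :: Arrow a => (b -> c) -> a b c
ss = arr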

List of existentially quantified values in Haskell

I'm wondering why this piece of code doesn't type-check:
{-# LANGUAGE ScopedTypeVariables, Rank2Types, RankNTypes #-}
{-# OPTIONS -fglasgow-exts #-}
module Main where
foo :: [forall a. a]
foo = [1]
ghc complains:
Could not deduce (Num a) from the context ()
arising from the literal `1' at exist5.hs:7:7
Given that:
Prelude> :t 1
1 :: (Num t) => t
Prelude>
it seems that the (Num t) context can't match the () context of the list element. The point I can't understand is that since () is more general than (Num t), the latter should be an inclusion of the former. Does this have anything to do with Haskell's lack of support for subtyping?
Thank you for any comment on this.
You're not using existential quantification here. You're using rank N types.
Here [forall a. a] means that every element must have every possible type (not any, every). So [undefined, undefined] would be a valid list of that type and that's basically it.
To expand on that a bit: if a list has type [forall a. a] that means that all the elements have type forall a. a. That means that any function that takes any kind of argument, can take an element of that list as argument. This is no longer true if you put in an element which has a more specific type than forall a. a, so you can't.
To get a list which can contain any type, you need to define your own list type with existential quantification. Like so:
{-# LANGUAGE ExistentialQuantification #-}

data MyList = Nil | forall a. Cons a MyList

foo :: MyList
foo = Cons 1 Nil
Of course, unless you constrain the element types to at least be instances of Show, you can't do anything with a list of that type.
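For example, a hypothetical variant with a Show constraint on the elements lets you at least render them:

{-# LANGUAGE ExistentialQuantification #-}

data ShowList = SNil | forall a. Show a => SCons a ShowList

render :: ShowList -> [String]
render SNil         = []
render (SCons x xs) = show x : render xs

-- render (SCons (1 :: Int) (SCons "two" SNil)) == ["1","\"two\""]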
First, your example doesn't even get that far with me for the current GHC, because you need to enable ImpredicativeTypes as well. Doing so results in a warning that ImpredicativeTypes will be simplified or removed in the next GHC. So we're not in good territory here. Nonetheless, adding the proper Num constraint (foo :: [forall a. Num a => a]) does allow your example to compile.
Let's leave aside impredicative types and look at a simpler example:
data Foo = Foo (forall a. a)
foo = Foo 1
This also doesn't compile with the error Could not deduce (Num a) from the context ().
Why? Well, the type promises that you're going to give the Foo constructor something with the quality that for any type a, it produces an a. The only thing that satisfies this is bottom. An integer literal, on the other hand, promises that for any type a that is of class Num it produces an a. So the types are clearly incompatible. We can however pull the forall a bit further out, to get what you probably want:
data Foo = forall a. Foo a
foo = Foo 1
So that compiles. But what can we do with it? Well, let's try to define an extractor function:
unFoo (Foo x) = x
Oops! Quantified type variable 'a' escapes. So we can define that, but we can't do much interesting with it. If we gave a class context, then we could at least use some of the class functions on it.
There is a time and place for existentials, including ones without class context, but its fairly rare, especially when you're getting started. When you do end up using them, often it will be in the context of GADTs, which are a superset of existential types, but in which the way that existentials arise feels quite natural.
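For completeness, here is the same idea written in GADT syntax (Showy and describe are made-up names); the constructor's signature makes the hidden type explicit:

{-# LANGUAGE GADTs #-}

-- 'a' appears in the constructor's argument but not in the result
-- type Showy, so it is existentially quantified
data Showy where
  MkShowy :: Show a => a -> Showy

describe :: Showy -> String
describe (MkShowy x) = show x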
Because the declaration [forall a. a] is (in meaning) the equivalent of saying, "I have a list, and if you (i.e. the computer) pick a type, I guarantee that the elements of said list will be that type."
The compiler is "calling your bluff", so-to-speak, by complaining, "I 'know' that if you give me a 1, that its type is in the Num class, but you said that I could pick any type I wanted to for that list."
Basically, you're trying to use the value of a universal type as if it were the type of a universal value. Those aren't the same thing, though.
