Ad hoc polymorphic functions - Haskell

I have some data that I'd like to print (some of it is wrapped in Maybe and some is not), and I'm trying to write a generic showField function as follows:
showField :: (Show a) => a -> Text
showField x
  | isJust x = Text.pack $ show $ fromJust x
  | isNothing x = "None"
  | otherwise = Text.pack $ show x
This is throwing a rigid type error:
• Couldn't match expected type ‘Maybe a0’ with actual type ‘a’
  ‘a’ is a rigid type variable bound by
    the type signature for:
      showField :: forall a. Show a => a -> Text
    at /data/users/jkozyra/fbsource/fbcode/experimental/jkozyra/hs/holdout_cleanup/HoldoutReaper.hs:244:18
• In the first argument of ‘isNothing’, namely ‘x’
  In the expression: isNothing x
  In a stmt of a pattern guard for
  an equation for ‘showField’:
    isNothing x
I generally understand this error, but I don't understand if there's a way to achieve what I'd like to. I've also tried pattern matching rather than guards, but can't quite work that out either. Is there something that I could construct in this format that would work?

It looks like you're trying to construct an ad hoc polymorphic function - a function whose definition varies according to its type.
A parametrically polymorphic function does the same thing for every type:
both :: a -> (a,a)
both a = (a,a)
In Haskell, ad hoc polymorphism is implemented using type classes:
class ShowField a where
  showField :: a -> Text

instance Show a => ShowField (Maybe a) where
  showField Nothing = "None"
  showField (Just a) = Text.pack $ show a
However, there's no way to define an instance for "all types other than Maybe a" with type classes, so you just have to define instances for the types you actually care about:
instance ShowField Int where
  showField = Text.pack . show

instance ShowField Float where
  showField = Text.pack . show
You can cut down on boilerplate by using -XDefaultSignatures:
class ShowField' a where
  showField :: a -> Text
  default showField :: Show a => a -> Text
  showField = Text.pack . show

instance ShowField' Int
instance ShowField' Float
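Putting those pieces together, a minimal self-contained sketch might look like this (the module layout and the Maybe instance are my own additions, following the answer above):
{-# LANGUAGE DefaultSignatures #-}

import Data.Text (Text)
import qualified Data.Text as Text

class ShowField' a where
  showField :: a -> Text
  default showField :: Show a => a -> Text
  showField = Text.pack . show

-- Types that should just reuse their Show instance get empty instances.
instance ShowField' Int
instance ShowField' Float

-- The Maybe case still gets a hand-written definition.
instance Show a => ShowField' (Maybe a) where
  showField Nothing = Text.pack "None"
  showField (Just x) = Text.pack (show x)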

The error tells us:
‘a’ is a rigid type variable bound by
the type signature for:
showField :: forall a. Show a => a -> Text
Basically, this tells us that according to the type signature you provided, the first parameter has type a, where a can be any type with a Show instance (the forall a. part is implicit in the signature). a is a rigid type variable because it is fixed by an explicit type signature.
It also tells us:
Couldn't match expected type ‘Maybe a0’ with actual type ‘a’
By applying the functions isJust and isNothing, both of type Maybe a -> Bool, to the first parameter, you are also claiming that its type is Maybe a0, which cannot be unified with the rigid type variable a.
You can turn this into a correct program simply by removing the type signature for showField, but that won't have the behavior you desire: the inferred type will be (Show a) => Maybe a -> Text, which only accepts values of type Maybe a (where a is an instance of Show).
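Written out explicitly, that inferred version would be the following (a sketch; Text.pack is used here instead of relying on an OverloadedStrings literal):
showField :: Show a => Maybe a -> Text
showField (Just x) = Text.pack (show x)
showField Nothing = Text.pack "None"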
In Haskell, you can't have a function that accepts values of both a and Maybe a¹. Without more context, it's unclear what your actual goal is, but there is almost certainly a more idiomatic way to achieve it.
¹ Unless you have a type class that has instances for both a and Maybe a.

Related

What is a quick way to determine how many typed holes to give to a function?

Typed holes offer a great way of finding out how to implement something: if you know what function to use, say foo, you can just write out something like foo _ _ _ and let the compiler tell you what types it expects for each argument. This makes it largely unnecessary to look up any documentation.
However, it only works properly if you actually write out the correct number of underscores. At the moment, I usually determine this by some trial-and-error, but it's not always obvious what hints to look out for, because in Haskell functions can always be just partially applied.
What is a good way to find out this number as quickly as possible?
As @chi suggests, the best thing I've found is to "apply the hole to the function". I don't know if this answers the question, but hopefully it is at least somewhat helpful.
I'm guessing that by "functions can always be just partially applied" you mean you can have such a function:
foldr :: Foldable t => (a -> b -> b) -> b -> t a -> b
foldr = undefined
for which you can't tell just from the type how many arguments it should take in order to typecheck at any particular type. The best that the typechecker can do here is to tell you the minimum number of arguments it will accept:
bar :: String -> String
bar = _1 foldr
* Found hole:
_1 :: ((a0 -> b0 -> b0) -> b0 -> t0 a0 -> b0) -> String -> String
Where: `t0' is an ambiguous type variable
`b0' is an ambiguous type variable
`a0' is an ambiguous type variable
* In the expression: _
In the expression: _ foldr
In an equation for `bar': bar = _ foldr
* Ambiguous type variable `t0' arising from a use of `foldr'
prevents the constraint `(Foldable t0)' from being solved.
Probable fix: use a type annotation to specify what `t0' should be.
These potential instances exist:
instance Foldable (Either a) -- Defined in `Data.Foldable'
instance Foldable Maybe -- Defined in `Data.Foldable'
instance Foldable ((,) a) -- Defined in `Data.Foldable'
...plus one other
...plus 22 instances involving out-of-scope types
(use -fprint-potential-instances to see them all)
Aside: the second error isn't particularly helpful here, as t0 could really be any of those types. But if you find yourself in such a situation often, then -fprint-potential-instances may actually be useful.
You can now do a little bit of typechecking in your head:
(a0 -> b0 -> b0) -> b0 -> (t0 a0  -> b0)
      <_1>         <_2>   (String -> String)
For the types to match, you must supply at least two arguments, one for each of the <_1> and <_2> slots above. You may need more, but that will depend on the instantiation of b0. Substituting holes in those positions, you get a pretty easy problem:
bar :: String -> String
bar = foldr _1 _2
* Found hole: _1 :: Char -> String -> String
* Found hole: _2 :: String
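Filling those two holes with a concrete choice that typechecks at String -> String (this particular completion is my own pick, not from the question):
bar :: String -> String
bar = foldr (:) ""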
You may even encounter a (in my opinion, silly) function like
class C a where foo :: a
instance C String where
instance C a => C (Int -> a) where
in which case you can do the same thing, and the typechecker helpfully notifies you of all the possible instances:
bar :: String -> String
bar = _ foo
test0.hs:6:7: warning: [-Wtyped-holes]
* Found hole: _ :: t0 -> String -> String
Where: `t0' is an ambiguous type variable
* In the expression: _
In the expression: _ foo
In an equation for `bar': bar = _ foo
* Relevant bindings include
bar :: String -> String (bound at test0.hs:6:1)
test0.hs:6:9: warning: [-Wdeferred-type-errors]
* Ambiguous type variable `t0' arising from a use of `foo'
prevents the constraint `(C t0)' from being solved.
Probable fix: use a type annotation to specify what `t0' should be.
These potential instances exist:
instance C a => C (Int -> a) -- Defined at test0.hs:3:10
instance C String -- Defined at test0.hs:2:10
* In the first argument of `_', namely `foo'
In the expression: _ foo
In an equation for `bar': bar = _ foo
Here you really have to guess. In this (admittedly contrived) example, I would probably guess you want one argument, because bar takes one argument.
bar :: String -> String
bar = foo . _
test0.hs:6:7: warning: [-Wdeferred-type-errors]
* Ambiguous type variable `b0' arising from a use of `foo'
prevents the constraint `(C (b0 -> String))' from being solved.
(maybe you haven't applied a function to enough arguments?)
Probable fix: use a type annotation to specify what `b0' should be.
These potential instance exist:
instance C a => C (Int -> a) -- Defined at test0.hs:3:10
test0.hs:6:13: warning: [-Wtyped-holes]
* Found hole: _ :: String -> b0
Where: `b0' is an ambiguous type variable
Now it tells you there is one potential instance, and so you can guess that the type of that hole really should be String -> Int.
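Following that hint, one completion that typechecks is the following (my own guess; note that foo is never actually defined for the String instance in this contrived example, so evaluating bar would still fail at runtime):
bar :: String -> String
bar = foo . length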

How do I understand the set of valid inputs to a Haskell type constructor?

Warning: very beginner question.
I'm currently mired in the section on algebraic types in the Haskell book I'm reading, and I've come across the following example:
data Id a = MkId a deriving (Eq, Show)

idInt :: Id Integer
idInt = MkId 10

idIdentity :: Id (a -> a)
idIdentity = MkId $ \x -> x
OK, hold on. I don't fully understand the idIdentity example. The explanation in the book is that:
This is a little odd. The type Id takes an argument and the data constructor MkId takes an argument of the corresponding polymorphic type. So, in order to have a value of type Id Integer, we need to apply a -> Id a to an Integer value. This binds the a type variable to Integer and applies away the (->) in the type constructor, giving us Id Integer. We can also construct a MkId value that is an identity function by binding the a to a polymorphic function in both the type and the term level.
But wait. Why only fully polymorphic functions? My previous understanding was that a could be any type. But apparently a constrained polymorphic type doesn't work: (Num a) => a -> a won't work here, and the GHC error suggests that only completely polymorphic types or "qualified types" (I'm not sure what those are) are valid:
f :: (Num a) => a -> a
f = undefined
idConsPoly :: Id (Num a) => a -> a
idConsPoly = MkId undefined
Illegal polymorphic or qualified type: Num a => a -> a
Perhaps you intended to use ImpredicativeTypes
In the type signature for ‘idIdentity’:
idIdentity :: Id (Num a => a -> a)
EDIT: I'm a bonehead. I wrote the type signature below incorrectly, as pointed out by @chepner in his answer below. This also resolves my confusion in the next sentence below...
In retrospect, this behavior makes sense because I haven't defined a Num instance for Id. But then what explains my being able to apply a type like Integer in idInt :: Id Integer?
So in general, I guess my question is: what exactly is the set of valid inputs to type constructors? Only fully polymorphic types? And what are "qualified types"?
You just have the type constructor in the wrong place. The following is fine:
idConsPoly :: Num a => Id (a -> a)
idConsPoly = MkId undefined
The type constructor Id here has kind * -> *, which means you can give it any type of kind * (which includes all "ordinary" types) and get back a new type of kind *. More generally, anything with an arrow kind can be applied to further types; type constructors are just one example.
TypeProd is a ternary type constructor whose first two arguments have kind * -> *:
-- Based on :*: from Control.Compose
newtype TypeProd f g a = Prod { unProd :: (f a, g a) }
Either Int is a type expression of kind * -> *, but it is not itself a type constructor: it is the partial application of the type constructor Either to the nullary type constructor Int.
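Querying kinds in GHCi makes this concrete (a sketch of a session; the exact output formatting may vary between GHC versions):
ghci> :kind Id
Id :: * -> *
ghci> :kind Id Integer
Id Integer :: *
ghci> :kind Either
Either :: * -> * -> *
ghci> :kind Either Int
Either Int :: * -> *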
Also contributing to your confusion is that you've misinterpreted the error message from GHC. It means "Num a => a -> a is a polymorphic or qualified type, and therefore illegal (in this context)".
The last sentence that you quoted from the book is not very well worded, and maybe contributed to that misunderstanding. It's important to realize that in Id (a -> a) the argument a -> a is not a polymorphic type, but just an ordinary type that happens to mention a type variable. The thing which is polymorphic is idIdentity, which can have the type Id (a -> a) for any type a.
In standard Haskell polymorphism and qualification can only appear at the outermost level of the type in a type signature.
Your type signature is almost correct:
idConsPoly :: (Num a) => Id (a -> a)
This should be right, though I have no GHC on my phone to test it.
Also, I think your question is quite broad, so I'm deliberately answering only the concrete problem here.

How to apply polymorphic function to a concrete type?

Below is a distilled version of a problem I encountered while learning Haskell:
data A = A
data B = B

data Test = TestA A
          | TestB B

test :: (a -> a) -> (Test -> Test)
test op t =
  case t of
    TestA a -> TestA $ op a
    TestB b -> TestB $ op b

testA = test id (TestA A)
testB = test id (TestB B)
Trying to compile this gives the following error:
Couldn't match expected type ‘B’ with actual type ‘a’
‘a’ is a rigid type variable bound by
the type signature for test :: (a -> a) -> Test -> Test
What's going on? I thought that when I pass in a polymorphic function, I should be able to apply it to values of different concrete types.
The basic problem here is how Haskell infers quantification from free variables in type signatures. Given the following type signature...
test :: (a -> a) -> (Test -> Test)
...the type variable a is unbound. Haskell automatically quantifies unbound type variables universally, so the above type is actually interpreted like this:
test :: forall a. (a -> a) -> (Test -> Test)
Now the error you are getting might make a little bit more sense—the type variable a can only unify with one type per invocation of test, which is decided by the caller. Therefore, the (a -> a) function could be String -> String or Int -> Int or any other type, but it can never be a function that works on both A and B.
Obviously, though, you had a different intent when you wrote that type signature. You wanted the (a -> a) function to be a type signature like the one for id: a function that truly works for any value, not some particular function for some particular choice of a. To specify this, you must make the forall explicit so that the compiler knows precisely how that type variable should be quantified:
test :: (forall a. a -> a) -> (Test -> Test)
However, the above type is actually not valid in standard Haskell. It is supported by GHC, though, by using the Rank2Types or RankNTypes extension, which permits “higher rank” polymorphism like the type signature above.
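For completeness, here is the question's example adjusted accordingly (a sketch that should compile as written once the extension is enabled):
{-# LANGUAGE RankNTypes #-}

data A = A
data B = B

data Test = TestA A
          | TestB B

-- op must now work at every type, so it can be applied to both an A and a B.
test :: (forall a. a -> a) -> (Test -> Test)
test op t =
  case t of
    TestA a -> TestA $ op a
    TestB b -> TestB $ op b

testA = test id (TestA A)
testB = test id (TestB B)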

Can't properly define a transformation from a universal type defined with a GADT

I've defined a universal data type that can contain anything (well, with the current implementation, not totally anything)!
Here it is (complete code):
{-# LANGUAGE NoMonomorphismRestriction #-}
{-# LANGUAGE GADTs #-}
{-# LANGUAGE StandaloneDeriving #-}

data AnyT where
  Any :: (Show a, Read a) => a -> AnyT

readAnyT :: (Read a, Show a) => (String -> a) -> String -> AnyT
readAnyT readFun str = Any $ readFun str

showAnyT :: AnyT -> String
showAnyT (Any thing) = show thing

deriving instance Show AnyT -- Just for convenience!

a = [Any "Hahaha", Any 123]
And I can play with it in console:
*Main> a
[Any "Hahaha",Any 123]
it :: [AnyT]
*Main> readAnyT (read::String->Float) "134"
Any 134.0
it :: AnyT
*Main> showAnyT $ Any 125
"125"
it :: String
Well, I have it, but I need to process it somehow. For example, let's define transformation functions (add these definitions to the previous code):
toAnyT :: (Show a, Read a) => a -> AnyT -- Rather useless
toAnyT a = Any a
fromAny :: AnyT -> a
fromAny (Any thing) = thing
And there is the problem! The fromAny definition above is incorrect, and I don't know how to make it correct. I get this error in GHCi:
2.hs:18:23:
Could not deduce (a ~ a1)
from the context (Show a1, Read a1)
bound by a pattern with constructor
Any :: forall a. (Show a, Read a) => a -> AnyT,
in an equation for `fromAny'
at 2.hs:18:10-18
`a' is a rigid type variable bound by
the type signature for fromAny :: AnyT -> a at 2.hs:17:12
`a1' is a rigid type variable bound by
a pattern with constructor
Any :: forall a. (Show a, Read a) => a -> AnyT,
in an equation for `fromAny'
at 2.hs:18:10
In the expression: thing
In an equation for `fromAny': fromAny (Any thing) = thing
Failed, modules loaded: none.
I tried some other ways, but they give errors too.
I have a rather bad solution for this: defining the necessary functions via showAnyT and read (replace the previous function definitions):
toAnyT :: (Show a, Read a) => a -> AnyT -- Rather useless
toAnyT a = Any a
fromAny :: Read a => AnyT -> a
fromAny thing = read (showAnyT thing)
Yes, it works. I can play with it:
*Main> fromAny $ Any 1352 ::Float
1352.0
it :: Float
*Main> fromAny $ Any 1352 ::Int
1352
it :: Int
*Main> fromAny $ Any "Haha" ::String
"Haha"
it :: String
But I think this is bad, because it goes through a String to do the transformation.
Could you please help me to find neat and good solution?
First, a disclaimer: I don't know the whole context of the problem you are trying to solve, but my first impression is that this kind of use of existentials is the wrong tool for the job, and that you might be trying to implement a code pattern that is common in object-oriented languages but a poor fit for Haskell.
That said, existential types like the one you have here are usually like black holes where once you put something in, the type information is lost forever and you can't cast the value back to its original type. However, you can operate on existential values via typeclasses (as you've done with Show and Read) so you can use the typeclass Typeable to retain the original type information:
import Data.Typeable
data AnyT where
  Any :: (Show a, Read a, Typeable a) => a -> AnyT
Now you can implement all the functions you have, as long as you add the new constraint to them as well:
readAnyT :: (Read a, Show a, Typeable a) => (String -> a) -> String -> AnyT
readAnyT readFun str = Any $ readFun str
showAnyT :: AnyT -> String
showAnyT (Any thing) = show thing
toAnyT :: (Show a, Read a, Typeable a) => a -> AnyT -- Rather useless
toAnyT a = Any a
fromAny can be implemented as returning a Maybe a (since you cannot be sure if the value you are getting out is of the type you are expecting).
fromAny :: Typeable a => AnyT -> Maybe a
fromAny (Any thing) = cast thing
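With that in place, a GHCi session would look roughly like this (my example; cast succeeds only when the requested type matches the stored type exactly):
*Main> fromAny (Any (1352 :: Int)) :: Maybe Int
Just 1352
*Main> fromAny (Any (1352 :: Int)) :: Maybe Float
Nothing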
You're using GADTs to create an existential data type. The type a in the constructor existed at the point where the value was built, but there's no way to recover it afterwards. The only information available to you is that it has Show and Read instances. The exact type is forgotten, because that's what your constructor's type instructs the type system to do: "Make sure this type has the proper instances, then forget what it is."
There is one function you've missed, by the way:
readLike :: String -> AnyT -> AnyT
readLike s (Any a) = Any $ read s `asTypeOf` a
Within the context of the pattern match, the compiler knows that whatever type a has, there is a Read instance, and it can apply that instance. Even though it's not sure what type a is. But all it can do with it is either show it, or read strings as the same type as it.
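For example (my own usage sketch), reading a string at whatever type is already stored inside the existential:
*Main> showAnyT (readLike "256" (Any (1 :: Int)))
"256"
*Main> showAnyT (readLike "2.5" (Any (1.0 :: Float)))
"2.5"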
What you have is something called an existential type. If you read up on that pattern, you will find that the only way to work with the "data" inside the container type is through type classes.
In your current example you required that a have Read and Show instances, which means that only the functions from those type classes can be used on a, and nothing else. If you want to support more operations on a, it has to be constrained with the required type class.
Think of it like this: you can put anything in a box. When you take something out of that box, you have no way of saying what you will get, because anything could have been put inside. But if you say that only edible things may go into the box, then you know that whatever you take out of it will be edible.
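A small sketch of that restriction (my own example, reusing the AnyT type from the question): inside the pattern match you can show the contents, but you cannot, say, add 1 to them, because no Num constraint was packed into the constructor.
describeAnyT :: AnyT -> String
describeAnyT (Any thing) = "contains: " ++ show thing
-- describeAnyT (Any thing) = show (thing + 1)  -- rejected: no Num instance is known for 'thing'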

Why doesn't the compiler resolve implicit constrained type variables inside a function?

Here's the code:
class Problem p where
  readProblem :: String -> p
  solveProblem :: p -> String

readAndSolve = solveProblem . readProblem
And this is the error message GHC yields:
Ambiguous type variable `b0' in the constraint:
(Problem b0) arising from a use of `readProblem'
Probable fix: add a type signature that fixes these type variable(s)
In the second argument of `(.)', namely `readProblem'
In the expression: solveProblem . readProblem
In an equation for `readAndSolve':
readAndSolve = solveProblem . readProblem
As I understand it, I have to somehow tell the compiler that the Problem instance used by solveProblem and readProblem is the same type, but I see no way to declare that. And why can't the compiler figure that out by itself?
You don't need to tell the compiler that it has to be the same type; the compiler figures that out by itself. However, it can't figure out which type to use. The canonical example of this problem is
foo = show . read
If foo had a legal type, that would be
foo :: (Read a, Show a) => String -> String
Now, how could the compiler determine a?
Your readAndSolve would have the type
readAndSolve :: Problem p => String -> String
where p appears only in the constraint and nowhere else in the type, so there is nothing the compiler could use to decide which Problem instance to pick.
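One common way out (my suggestion, not part of the original answer) is to pass an extra argument that pins down p, for example a Proxy:
{-# LANGUAGE ScopedTypeVariables #-}

import Data.Proxy (Proxy (..))

class Problem p where
  readProblem :: String -> p
  solveProblem :: p -> String

-- The Proxy argument tells the compiler which Problem instance to use.
readAndSolve :: forall p. Problem p => Proxy p -> String -> String
readAndSolve _ = solveProblem . (readProblem :: String -> p)

-- At a call site you would write something like
--   readAndSolve (Proxy :: Proxy MyProblem) input
-- where MyProblem is whatever concrete instance you have (a hypothetical name here).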
