Haskell bitwise operators in Data.Bits

I am new to Haskell and am trying to use bitwise operations from Data.Bits. Every time I try, I get an error message:
Prelude Data.Bits> 1 `shiftL` 16
<interactive>:1:0:
Ambiguous type variable `t' in the constraint:
`Bits t' arising from a use of `shiftL' at <interactive>:1:0-12
Probable fix: add a type signature that fixes these type variable(s)
This happens for a number of operations; I also tried .|. and .&.
I must be missing something very simple. Please let me know if you can spot the problem.

In the interactive session, Haskell can't infer the types of 1 and 16. The solution, then, is to give a hint:
> :m +Data.Bits
> let a = 1 :: Int
> let b = 16 :: Int
> a `shiftL` b
65536
>

ghci doesn't know what type to choose. However, that shouldn't be so:
Prelude Data.Bits> 1 `shiftL` 16
65536
From the expression entered at the prompt, the constraint Bits t is inferred (also Num t, which in the base libraries of that era was implied anyway, Num being a superclass of Bits at the time; and since the result is to be displayed by the interpreter, also Show t).
Now, since one of the constraints is a numeric class and all of the classes are defined in the Prelude or the standard libraries, the ambiguous type variable t is eligible for defaulting. In the absence of explicit default declarations, the ambiguity is resolved by choosing Integer as the type.
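As a side note (my own sketch, not part of the original answer), the defaulting choice can be steered with an explicit default declaration in a module; here Int is listed before Integer, so Int is tried first:

import Data.Bits (shiftL)

-- Without this declaration, the ambiguous Bits/Num/Show constraints default
-- to Integer; with it, Int is tried first (a numeric type, as the rules require).
default (Int, Integer)

main :: IO ()
main = print (1 `shiftL` 16)  -- prints 65536, at type Int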
Off the top of my head, I can't think of a language extension that would prevent the resolution of the ambiguity by defaulting, so the conclusion is that your ghci is old. The Bits class was not in the standard libraries as defined by the Haskell98 report, so a Bits constraint was not eligible for defaulting in compilers adhering to that standard, for example GHC < 7.
In that case, the immediate workaround is to specify a type signature,
Prelude Data.Bits> 1 `shiftL` 16 :: Int
65536
and the fix to your problem is to upgrade your GHC to a version adhering to the newer Haskell2010 standard.

Related

Haskell Types in some examples

I'm in the process of learning Haskell and I'm a beginner.
I wish I could search this question on StackOverflow.
But honestly I'm not quite sure what to search for.
I already tried to get the answers without much success, so
please bear with me. It seems this is still really low-level
stuff.
So my ghci interactive session never seems to output
"primitive types" like Int for example. I don't know how
else to put it. At the moment I'm trying to follow the
tutorial on http://book.realworldhaskell.org/read/getting-started.html.
Unfortunately, I can't seem to produce the same results.
For example:
Prelude> 5
5
Prelude> :type it
it :: Num a => a
I have to specifically say:
Prelude> let e = 5 :: Int
Prelude> e
5
Prelude> :type it
it :: Int
This is all very confusing to me so I hope somebody
can clear up this confusion a little bit.
EDIT:
On http://book.realworldhaskell.org/read/getting-started.html it says: "Haskell has several numeric types. For example, a literal number such as 1 could, depending on the context in which it appears, be an integer or a floating point value. When we force ghci to evaluate the expression 3 + 2, it has to choose a type so that it can print the value, and it defaults to Integer." I can't seem to force ghci to evaluate the type.
So for example:
Prelude> 3 + 2
5
Prelude> :t it
it :: Num a => a
Where I expected "Integer" to be the correct type.
There are a number of things going on here.
Numeric literals in Haskell are polymorphic; the type of the literal 5 really is Num a => a. It can belong to any type that adheres to the Num type class.
Addition is part of the Num type class, so an addition of two numeric literals is still Num a => a.
Interactive evaluation in ghci is very similar to evaluating actions in the IO monad. When you enter a bare expression, ghci acts as if you ran something like the following:
main = do
  let it = 5 + 5
  print it
It's not exactly like that, though, because in a program like that, inference would work over the entire do expression body to find a specific type. When you enter a single line, it has to infer a type and compile something with only the context available as of the end of the line you entered. So the print doesn't affect the type inferred for the let-binding, as it's not something you entered.
There's nothing in that program that would constrain it to a particular instance of Num or Show; this means that it is still a polymorphic value. Specifically, GHC compiles values with type class constraints to functions that accept a type class dictionary that provides the instance implementations required to meet the constraint. So, although it looks like a monomorphic value, it is actually represented by GHC as a function. This was surprising to enough people that the dreaded "Monomorphism Restriction" was invented to prevent this kind of surprise. It disallows pattern bindings (such as this one) where an identifier is bound to a polymorphic type.
The Monomorphism Restriction is still on by default when compiling modules, but it has been off by default in GHCi since version 7.8.
See the GHC manual for more info.
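To make the dictionary-passing point above concrete, here is a hand-written sketch of roughly what that translation looks like (my own illustration with made-up names, not actual GHC output):

data NumDict a = NumDict
  { dictPlus        :: a -> a -> a
  , dictFromInteger :: Integer -> a
  }

-- The polymorphic binding it = 5 + 5 corresponds roughly to a function
-- that waits for a Num dictionary before producing a value:
it :: NumDict a -> a
it d = dictPlus d (dictFromInteger d 5) (dictFromInteger d 5)

intDict :: NumDict Int
intDict = NumDict (+) fromInteger

main :: IO ()
main = print (it intDict)  -- 10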
Haskell provides a special bit of magic for polymorphic numbers; each module can include a default declaration that sets the type defaulting rules for them. At your ghci prompt, those defaulting rules made ghci choose Integer when it was forced to pick instance dictionaries in order to show it and produce an IO action.
Here's the relevant section in the Haskell 98 Report.
To sum it up: it was bound to the expression 5 + 5, which has type Num a => a because that's the more general inferred type based on the polymorphic numeric literals.
Polymorphic values are represented as functions waiting for a typeclass dictionary. So evaluating it at a particular instance doesn't force it to become monomorphic.
However, Haskell's type defaulting rules allow it to pick a particular type when you implicitly print it as part of the ghci interaction. It picks Integer, and so it supplies the Integer type class instance dictionaries for Show and Num when forced to by print it.
I hope that makes it somewhat less confusing!
By the way, here is an example of how you can get the same behavior outside of ghci by explicitly requesting the polymorphic let-binding. Without the type signature in this context, it will infer a monomorphic type for foo and give a type error.
main = do
  let foo :: Num a => a
      foo = 5 + 5
  let bar = 8 :: Double
  let baz = 9 :: Int
  print (foo + bar)
  print (foo + baz)
This will compile and run, printing the following:
18.0
19
UPDATE:
Looking at the Real World Haskell example and the comment thread, some people included different ghci logs along with their ghc versions. Using that information, I looked at ghc release notes and found that starting in version 7.8, the Monomorphism Restriction was disabled in ghci by default.
If you run the following command, you'll re-enable the Monomorphism Restriction and in order to be friendly, ghci will default the binding to Integer rather than giving you either an error or a polymorphic binding:
Prelude> :set -XMonomorphismRestriction
Prelude> 5 + 5
10
Prelude> :t it
it :: Integer
Prelude>
It appears that GHCi is performing some magic here. It correctly defaults the numbers to Integer so that they can be printed; however, it binds it to the polymorphic type before the defaulting happens.
I guess you want to see the type after the defaulting takes place. For that, I would recommend using the Data.Typeable library as follows:
> import Data.Typeable
> let withType x = (x, typeOf x)
> withType 5
(5,Integer)
Above, GHCi has to default 5 to Integer, and typeOf x then reports the representation of the type after the defaulting has happened, so we get the type we wanted.
The following also works, precisely because typeOf is called after the defaulting has happened:
> :type 5
5 :: Num a => a
> typeOf 5
Integer
Keep however in mind that typeOf only works for monomorphic types. In general, the polymorphic result of :type is more useful.
Numbers in Haskell are polymorphic: there are separate types for fixed- and arbitrary-precision integers, rationals, floating point numbers, and user-defined number types. All of them can be instantiated from plain literals, because a literal is translated into a call to the fromInteger method of the Num typeclass. The value you've given, (True, 1, "hello world", 3), contains two integral literals, and they can stand for two numbers of possibly different types. The part of the type before the fat arrow, (Num t, Num t1), says that in the inferred type, t and t1 can be anything, as long as they have a Num instance, i.e. as long as they can be produced with fromInteger.
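As a small, self-contained sketch of that last point (my own illustration; the Cents type is made up), a literal works for any type with a Num instance because GHC rewrites the literal into a fromInteger call:

newtype Cents = Cents Integer
  deriving Show

instance Num Cents where
  fromInteger = Cents
  Cents a + Cents b = Cents (a + b)
  Cents a * Cents b = Cents (a * b)
  abs (Cents a)    = Cents (abs a)
  signum (Cents a) = Cents (signum a)
  negate (Cents a) = Cents (negate a)

main :: IO ()
main = print (3 + 2 :: Cents)  -- Cents 5: both literals go through fromInteger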

How to work around the ambiguity issue when the monomorphism restriction is turned *on*?

So, learning Haskell, I came across the dreaded monomorphism restriction soon enough, with the following (in ghci):
Prelude> let f = print.show
Prelude> f 5
<interactive>:3:3:
No instance for (Num ()) arising from the literal `5'
Possible fix: add an instance declaration for (Num ())
In the first argument of `f', namely `5'
In the expression: f 5
In an equation for `it': it = f 5
So there's a bunch of material about this, e.g. here, and it is not so hard to work around.
I can either add an explicit type signature for f, or I can turn off the monomorphic restriction (with ":set -XNoMonomorphismRestriction" directly in ghci, or in a .ghci file).
There's some discussion about the monomorphism restriction, but it seems like the general advice is that it is ok to turn this off (and I was told that this is actually off by default in newer versions of ghci).
So I turned this off.
But then I came across another issue:
Prelude> :set -XNoMonomorphismRestriction
Prelude> let (a,g) = System.Random.random (System.Random.mkStdGen 4) in a :: Int
<interactive>:4:5:
No instance for (System.Random.Random t0)
arising from the ambiguity check for `g'
The type variable `t0' is ambiguous
Possible fix: add a type signature that fixes these type variable(s)
Note: there are several potential instances:
instance System.Random.Random Bool -- Defined in `System.Random'
instance System.Random.Random Foreign.C.Types.CChar
-- Defined in `System.Random'
instance System.Random.Random Foreign.C.Types.CDouble
-- Defined in `System.Random'
...plus 33 others
When checking that `g' has the inferred type `System.Random.StdGen'
Probable cause: the inferred type is ambiguous
In the expression:
let (a, g) = System.Random.random (System.Random.mkStdGen 4)
in a :: Int
In an equation for `it':
it
= let (a, g) = System.Random.random (System.Random.mkStdGen 4)
in a :: Int
This is actually simplified from example code in the 'Real World Haskell' book, which wasn't working for me, and which you can find on this page: http://book.realworldhaskell.org/read/monads.html (it's the Monads chapter, and the getRandom example function, search for 'getRandom' on that page).
If I leave the monomorphism restriction on (or turn it back on) then the code works. It also works (with the monomorphism restriction on) if I change it to:
Prelude> let (a,_) = System.Random.random (System.Random.mkStdGen 4) in a :: Int
-106546976
or if I specify the type of 'a' earlier:
Prelude> let (a::Int,g) = System.Random.random (System.Random.mkStdGen 4) in a :: Int
-106546976
but, for this second workaround, I have to turn on the 'scoped type variables' extension (with ":set -XScopedTypeVariables").
The problem is that in this case (problems with the monomorphism restriction on) neither of the workarounds seems generally applicable.
For example, maybe I want to write a function that does something like this and works with arbitrary (or multiple) types, and of course in this case I most probably do want to hold on to the new generator state (in 'g').
The question is then: How do I work around this kind of issue, in general, and without specifying the exact type directly?
And, it would also be great (as a Haskell novice) to get more of an idea about exactly what is going on here, and why these issues occur..
When you define
(a,g) = random (mkStdGen 4)
then even if g itself is always of type StdGen, the value of g depends on the type of a, because different types can differ in how much they use the random number generator.
Moreover, when you (hypothetically) use g later, as long as a was polymorphic originally, there is no way to decide which type of a you want to use for calculating g.
So, taken alone, as a polymorphic definition, the above has to be disallowed because g actually is extremely ambiguous and this ambiguity cannot be fixed at the use site.
This is a general kind of problem with let/where bindings that bind several variables in a pattern, and is probably the reason why the ordinary monomorphism restriction treats them even stricter than single variable equations: With a pattern, you cannot even disable the MR by giving a polymorphic type signature.
When you use _ instead, presumably GHC doesn't worry about this ambiguity as long as it doesn't affect the calculation of a. Possibly it could have detected that g is unused in the former version, and treated it similarly, but apparently it doesn't.
As for workarounds without giving unnecessary explicit types, you might instead try replacing let/where by one of the binding methods in Haskell which are always monomorphic. The following all work:
case random (mkStdGen 4) of
  (a, g) -> a :: Int

(\(a, g) -> a :: Int) (random (mkStdGen 4))

do (a, g) <- return $ random (mkStdGen 4)
   return (a :: Int) -- The result here gets wrapped in the Monad
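For a complete, runnable version of the case-based workaround (a sketch; the Bool draw at the end is only there to show that the new generator g remains usable):

import System.Random (mkStdGen, random)

main :: IO ()
main =
  -- The case alternative binds (a, g) monomorphically, so the ambiguity
  -- caused by the let/where pattern binding never arises, and g is kept.
  case random (mkStdGen 4) of
    (a, g) -> do
      print (a :: Int)
      print (fst (random g) :: Bool)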

Haskell function composition confusion

I'm trying to learn haskell and I've been going over chapter 6 and 7 of Learn you a Haskell. Why don't the following two function definitions give the same result? I thought (f . g) x = f (g (x))?
Def 1
let{ t :: Eq x => [x] -> Int; t xs = length( nub xs)}
t [1]
1
Def 2
let t = length . nub
t [1]
<interactive>:78:4:
No instance for (Num ()) arising from the literal `1'
Possible fix: add an instance declaration for (Num ())
In the expression: 1
In the first argument of `t', namely `[1]'
In the expression: t [1]
The problem is with your type signatures and the dreaded monomorphism restriction. You have a type signature in your first version but not in your second; ironically, it would have worked the other way around!
Try this:
λ>let t :: Eq x => [x] -> Int; t = length . nub
λ>t [1]
1
The monomorphism restriction forces things that don't look like functions to have a monomorphic type unless they have an explicit type signature. The type you want for t is polymorphic: note the type variable x. However, with the monomorphism restriction, x gets "defaulted" to (). Check this out:
λ>let t = length . nub
λ>:t t
t :: [()] -> Int
This is very different from the version with the type signature above!
The compiler chooses () for the monomorphic type because of defaulting. Defaulting is just the process Haskell uses to choose a type from a typeclass. All this really means is that, in the repl, Haskell will try using the () type if it encounters an ambiguous type variable in the Show, Eq or Ord classes. Yes, this is basically arbitrary, but it's pretty handy for playing around without having to write type signatures everywhere! Also, the defaulting rules are more conservative in files, so this is basically just something that happens in GHCi.
In fact, defaulting to () seems to mostly be a hack to make printf work correctly in GHCi! It's an obscure Haskell curio, but I'd ignore it in practice.
Apart from including a type signature, you could also just turn the monomorphism restriction off in the repl:
λ>:set -XNoMonomorphismRestriction
This is fine in GHCi, but I would not use it in real modules--instead, make sure to always include a type signature for top-level definitions inside files.
EDIT: Ever since GHC 7.8.1, the monomorphism restriction is turned off by default in GHCi. This means that all this code would work fine with a recent version of GHCi and you do not need to set the flag explicitly. It can still be an issue for values defined in a file with no type signature, however.
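For example, in a source file (where the restriction still applies by default) the composed version compiles fine once it carries the polymorphic signature; a minimal sketch:

import Data.List (nub)

-- With an explicit signature the monomorphism restriction does not apply,
-- so t stays polymorphic in the element type.
t :: Eq x => [x] -> Int
t = length . nub

main :: IO ()
main = do
  print (t [1 :: Int, 1, 2])  -- 2
  print (t "haskell")         -- 6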
This is another instance of the "Dreaded" Monomorphism Restriction which leads GHCi to infer a monomorphic type for the composed function. You can disable it in GHCi with
> :set -XNoMonomorphismRestriction

Why doesn't GHC complain when number constant out of range

GHC silently ignores out-of-range bits in numerical constants.
This behavior led me to wrestle with a rather strange bug today:
[0..256]::[Word8] -- evaluates to [0]!
I know what caused this bug (256 == 0 in a rot256 world).... I am interested in why GHC/Haskell was designed not to complain about it at compile time.
(This behavior is true for Int also- for instance, 18446744073709551617::Int = 1).
I've grown used to Haskell catching trivial compile time issues, and I was surprised when I had to track this down.
I suspect the honest answer is "because nobody implemented it yet". But I think there's another layer to that answer, which is that there are some subtle design issues.
For example: how should we know that 256 is out of range for Word8? Well, I suppose one answer might be that the compiler could notice that Word8 is an instance of all three of Integral, Ord, and Bounded. So it could generate a check like
(256 :: Integer) > fromIntegral (maxBound :: Word8)
and evaluate this check at compile time. The problem is that all of a sudden we are running potentially user-written code (e.g. maxBound, fromIntegral, and (>) presumably all come from instance declarations that can be written by a programmer) at compile time. That can be a bit dangerous -- since it's impossible to tell if we'll ever get an answer! So at the very least you would want this check to be off by default, and presumably at least as hard to turn on as Template Haskell is.
On the other hand, it might also be possible to just build in a handful of instances that we "trust" -- e.g. Word8 and Int, as you say. I would find that a bit disappointing, though perhaps such a patch would not be rejected.
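Nothing stops you from making such a check yourself at runtime, though. A sketch (my own illustration; checkedFromInteger is a made-up helper) of a conversion that rejects out-of-range values instead of wrapping:

{-# LANGUAGE ScopedTypeVariables #-}
import Data.Word (Word8)

-- Refuse values outside the target type's range instead of silently wrapping.
checkedFromInteger :: forall a. (Integral a, Bounded a) => Integer -> Maybe a
checkedFromInteger n
  | n < toInteger (minBound :: a) = Nothing
  | n > toInteger (maxBound :: a) = Nothing
  | otherwise                     = Just (fromInteger n)

main :: IO ()
main = do
  print (checkedFromInteger 255 :: Maybe Word8)  -- Just 255
  print (checkedFromInteger 256 :: Maybe Word8)  -- Nothing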
It depends on the implementation of the individual Num instance. If you perform
> :type 1
1 :: Num a => a
So Haskell initially treats the literal as a generic Num value, and it only takes on a concrete type such as Word8 when you specify one. If you try
> (maxBound :: Word8) + 1
0
> maxBound :: Word8
255
This is what is known as an overflow, and holds true in many languages, notably C. Haskell does not prevent you from doing this because there are legitimate cases where you might want to have overflow. Instead, it is up to you, the programmer, to ensure that your input data is valid. Also, as jozefg points out, it is impossible to know at compile time if every conversion is valid.
You could implement a Cyclic class that gives you the behavior you want, if you already have Eq, Bounded, and Enum:
class (Eq a, Bounded a, Enum a) => Cyclic a where
  next :: a -> a
  next a = if a == maxBound then minBound else succ a
  prev :: a -> a
  prev a = if a == minBound then maxBound else pred a

instance Cyclic Word8
> next 255 :: Word8
0
> prev 0 :: Word8
255
Luckily, for all Integral types, you already have Enum and Eq, and the only Integral I know of that doesn't have Bounded is Integer. It's just a matter of adding instance Cyclic <Int Type> for each that you want to use.
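For example (a sketch extending the answer's own class), Int only needs an empty instance, since both methods have default implementations:

instance Cyclic Int

-- next (maxBound :: Int) wraps around to minBound, and prev minBound to maxBound.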

Ambiguous type variable with HashMap

I recently started playing with Haskell, and I came across a problem while using HashMap that can be illustrated by this toy example:
import Data.HashMap as HashMap
foo = HashMap.insert 42 53 HashMap.empty
Here is the error I get when I load my file in the interpreter or compile it:
Prelude List HashMap> :load TypeError.hs
[1 of 1] Compiling Main ( TypeError.hs, interpreted )
TypeError.hs:3:22:
Ambiguous type variable `k0' in the constraints:
(Num k0) arising from the literal `42' at TypeError.hs:3:22-23
(Ord k0) arising from a use of `insert' at TypeError.hs:3:7-20
(Data.Hashable.Hashable k0)
arising from a use of `insert' at TypeError.hs:3:7-20
Possible cause: the monomorphism restriction applied to the following:
foo :: Map k0 Integer (bound at TypeError.hs:3:1)
Probable fix: give these definition(s) an explicit type signature
or use -XNoMonomorphismRestriction
In the first argument of `insert', namely `42'
In the expression: insert 42 53 empty
In an equation for `foo': foo = insert 42 53 empty
Failed, modules loaded: none.
Prelude List HashMap>
However if I define the exact same function directly in the interpreter, I get no error:
Prelude List HashMap> let foo = HashMap.insert 42 53 HashMap.empty
Prelude List HashMap>
Does anyone have any clue about this?
Thanks.
The reason is that ghci uses extended default rules, while the compiler uses (by default) the defaulting rules specified in the language report:
Ambiguities in the class Num are most common, so Haskell provides another way to resolve them—with a default declaration: default (t1 , … , tn) where n ≥ 0, and each ti must be a type for which Num ti holds. In situations where an ambiguous type is discovered, an ambiguous type variable, v, is defaultable if:
v appears only in constraints of the form C v, where C is a class, and
at least one of these classes is a numeric class, (that is, Num or a subclass of Num), and
all of these classes are defined in the Prelude or a standard library
The relevant rule is the last one: per the report, defaulting happens only if all of the involved classes are defined in the Prelude or a standard library, and the Hashable constraint on the key comes from a class that doesn't meet that requirement.
Thus the compiler rejects it, because it can't resolve the ambiguous type variable arising from the key, while ghci defaults it to Integer, since ghci's extended rules also default when other classes are involved.
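In the compiled module, the fix the error message suggests looks like this (a sketch; Map is the type constructor named in the error, from the same Data.HashMap module the question imports):

import Data.HashMap as HashMap

-- The signature fixes the key type to Int, so the Num, Ord and Hashable
-- constraints are all resolved and nothing is left ambiguous.
foo :: HashMap.Map Int Integer
foo = HashMap.insert 42 53 HashMap.empty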
