Tags: haskell, typeclass, literals

Haskell arithmetic with literals


In Haskell, a standalone expression 3 + 5 means fromInteger (3 :: Integer) + fromInteger (5 :: Integer). In GHCi it is evaluated to fromInteger 8 of type Num a => a. It is clear that Haskell can calculate (3 :: Integer) + (5 :: Integer), but evaluating the original polymorphic expression seems to require some rule of the form fromInteger x + fromInteger y = fromInteger (x + y), and I don't see such a rule in the Prelude. How exactly does GHCi evaluate 3 + 5?

Does defaulting play a role here? It does not seem so, because if I write default () at the prompt, 3 + 5 is still computed correctly, while, for example, \(x :: Float) -> x^2 fails to typecheck. If I restore the default with default (Integer, Double), then \(x :: Float) -> x^2 typechecks as expected.
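For reference, GHCi reports the polymorphic type of the expression before any instance is chosen:

    ghci> :type 3 + 5
    3 + 5 :: Num a => a
    ghci> 3 + 5
    8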


Solution

  • Yes, defaulting is how this works.

    + has type Num a => a -> a -> a, so 3 + 5 can indeed be used to write an expression for a value of type Num a => a. But that value will never actually be evaluated while we leave it at that type! We can't identify which code to run for + until we've determined which Num instance to use, so there is no way for 3 + 5 to be evaluated at the polymorphic type Num a => a. In fact, if you create such a value, it is internally represented as a function: the parameter is a dictionary identifying the Num instance, and the body uses the dictionary's definition of + (and of fromInteger, to evaluate the literals 3 and 5; those can't be evaluated polymorphically either) to evaluate the expression. A sketch of this translation is given below.
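    To make the dictionary-passing translation concrete, here is a hand-rolled sketch of roughly what GHC generates. The names NumDict, integerDict, doubleDict and threePlusFive are invented for illustration; GHC's real dictionaries are compiler-internal.

        -- A value of type Num a => a is compiled to a function taking a
        -- dictionary that supplies the Num operations at some concrete type a.
        data NumDict a = NumDict
          { dictPlus        :: a -> a -> a
          , dictFromInteger :: Integer -> a
          }

        -- Dictionaries for the Num Integer and Num Double instances.
        integerDict :: NumDict Integer
        integerDict = NumDict (+) fromInteger

        doubleDict :: NumDict Double
        doubleDict = NumDict (+) fromInteger

        -- Roughly how 3 + 5 :: Num a => a is compiled: no addition happens
        -- until a dictionary (i.e. a concrete instance) is supplied.
        threePlusFive :: NumDict a -> a
        threePlusFive d = dictPlus d (dictFromInteger d 3) (dictFromInteger d 5)

        main :: IO ()
        main = do
          print (threePlusFive integerDict) -- 8
          print (threePlusFive doubleDict)  -- 8.0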

    When you just type 3 + 5 into GHCi, it might appear to have evaluated your expression polymorphically, as it will (with default settings) print 8. But what actually happens is that the expression creates a value that doesn't evaluate the addition (internally, the function just described), and when GHCi tries to print your last-entered value, it can't do so without actually evaluating it, so it applies defaulting to that use of the value. Defaulting (with default settings) comes up with Integer, so GHCi passes the dictionary for Num Integer to the function representing 3 + 5, which can then use Integer addition to produce the Integer value 8.
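    You can watch the defaulting happen by enabling the -Wtype-defaults warning in GHCi (output abbreviated; the exact wording varies between GHC versions):

        ghci> :set -Wtype-defaults
        ghci> 3 + 5
        <interactive>: warning: [-Wtype-defaults]
            • Defaulting the following constraints to type 'Integer'
                (Show a0) arising from a use of 'print'
                (Num a0) arising from a use of '+'
        8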

    If the same expression were used in a context requiring it to be a Double, the underlying function representing 3 + 5 would instead be passed the Num Double instance dictionary, and would use Double addition to produce the Double value 8.0.
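    For example, a type annotation selects the instance directly:

        ghci> 3 + 5 :: Double
        8.0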

    The result of the addition is never actually represented as a polymorphic value of type Num a => a. It only gets evaluated once a concrete type has been chosen and the instance dictionary passed in. If no concrete type is ever identified, then any attempt to actually evaluate the expression (such as printing a concrete value) produces an ambiguous type variable error, unless defaulting chooses a concrete type for you.
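    A minimal compiled module showing that failure mode, assuming no ExtendedDefaultRules (the exact error text varies by GHC version):

        module Main where

        -- An empty default list: defaulting can no longer pick a type.
        default ()

        main :: IO ()
        main = print (3 + 5)
        -- error: Ambiguous type variable 'a0' arising from a use of 'print'
        --        prevents the constraints (Show a0, Num a0) from being solved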