How dangerous is the AllowAmbiguousTypes extension when used with the TypeApplications extension?
The GHC manual gives the following example of an ambiguous type:
```haskell
class C a

f :: C a => Int
f = 3
```
Without AllowAmbiguousTypes, this fails to compile with the following message:
```
file.hs:8:6: error:
    • Could not deduce (C a0)
      from the context: C a
        bound by the type signature for:
                   f :: forall a. C a => Int
        at file.hs:8:6-15
      The type variable ‘a0’ is ambiguous
    • In the ambiguity check for ‘f’
      To defer the ambiguity check to use sites, enable AllowAmbiguousTypes
      In the type signature: f :: C a => Int
  |
8 | f :: C a => Int
  |      ^^^^^^^^^^
```
With AllowAmbiguousTypes, it compiles correctly.
However, the following example fails to compile even with AllowAmbiguousTypes:
```haskell
class C a

f :: C a => Int
f = 3

g :: C a => Int
g = f
```
Compiling it gives the following error:
```
file.hs:12:5: error:
    • Could not deduce (C a0) arising from a use of ‘f’
      from the context: C a
        bound by the type signature for:
                   g :: forall a. C a => Int
        at file.hs:11:1-15
      The type variable ‘a0’ is ambiguous
    • In the expression: f
      In an equation for ‘g’: g = f
   |
12 | g = f
   |     ^
```
This can be made to compile by enabling TypeApplications and writing it as follows:
```haskell
{-# LANGUAGE AllowAmbiguousTypes, ScopedTypeVariables, TypeApplications #-}

class C a

f :: C a => Int
f = 3

g :: forall a. C a => Int
g = f @a
```
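To see it run end to end, here is a sketch with a hypothetical instance C Bool added, so that a caller has a concrete type to pick at the use site:

```haskell
{-# LANGUAGE AllowAmbiguousTypes, ScopedTypeVariables, TypeApplications #-}

class C a

-- Hypothetical instance, just so there is something to instantiate 'a' with.
instance C Bool

f :: C a => Int
f = 3

g :: forall a. C a => Int
g = f @a

main :: IO ()
main = print (g @Bool)  -- the caller resolves the ambiguity with @Bool; prints 3
```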
AllowAmbiguousTypes sounds scary, but based on the description in the GHC manual it seems pretty benign, especially when used with TypeApplications. It's not like AllowAmbiguousTypes is going to cause runtime errors, right?
This combination of AllowAmbiguousTypes and TypeApplications also seems to be used in some pretty popular packages, like constraints.
Alexis King nailed it in her comment; it deserves to be elevated to the level of an answer.
AllowAmbiguousTypes is completely safe, in the sense that it is perfectly sound, will not make the compiler diverge, and cannot lead to runtime errors. It was just mostly useless prior to the introduction of TypeApplications. The combination of the two is a perfectly reasonable thing.
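As a small illustration of why the combination is useful (a sketch using Data.Typeable; typeName is a made-up name, not from the original), a type variable can appear only in a constraint, making the signature ambiguous, yet callers can still pin it down with a type application:

```haskell
{-# LANGUAGE AllowAmbiguousTypes, ScopedTypeVariables, TypeApplications #-}

import Data.Proxy (Proxy (..))
import Data.Typeable (Typeable, typeRep)

-- 'a' occurs only in the constraint, so this needs AllowAmbiguousTypes ...
typeName :: forall a. Typeable a => String
typeName = show (typeRep (Proxy @a))

main :: IO ()
main = putStrLn (typeName @Int)  -- ... and TypeApplications picks 'a' here; prints Int
```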