When I use a type union in Crystal of String and Int32 and then assign a value to it, it works fine:
test : (String | Int32) = 100
puts "Hello, World! #{test}"
Resulting in:
~/Projects/learn-crystal crystal hello.cr
Hello, World! 100
But when I change the Int32 to UInt32, it becomes an error:
test : (String | UInt32) = 100
puts "Hello, World! #{test}"
Resulting in:
~/Projects/learn-crystal crystal hello.cr
Showing last frame. Use --error-trace for full trace.
In hello.cr:1:1
1 | test : (String | UInt32) = 100
^---
Error: type must be (String | UInt32), not Int32
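The error message hints at the cause: a bare integer literal defaults to Int32 in Crystal, and Int32 is not a member of the union. A minimal check (the typeof call is illustrative only, not part of the original program):

# A bare literal in the Int32 range is typed Int32 by default,
# so it does not satisfy the (String | UInt32) restriction.
puts typeof(100) # => Int32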
This works fine, though:
test : (String | UInt32) = 100_u32
puts "Hello, World! #{test}"
Resulting in:
~/Projects/learn-crystal crystal hello.cr
Hello, World! 100
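An explicit conversion should also work on these compiler versions; a small variation on the snippet above:

# Converting the Int32 literal to UInt32 explicitly satisfies the union.
test : (String | UInt32) = 100.to_u32
puts "Hello, World! #{test}"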
Why does Crystal not automatically cast the integer literal for a union of an integer type and String?
This issue has been fixed since Crystal v0.36.0, which added automatic casting of number literals to the expected type.
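Assuming a compiler at or above that version, the original snippet should compile as written:

# On Crystal >= 0.36.0 the literal 100 is autocast to UInt32 because the
# type restriction names UInt32, so no suffix or conversion is needed.
test : (String | UInt32) = 100
puts "Hello, World! #{test}" # => Hello, World! 100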