I'm using OpenJDK 17 and am perplexed as to why `Integer.parseInt("10000000000000000000000000000000", 2)` throws a `NumberFormatException`, since that string is the binary representation of `Integer.MIN_VALUE` (or -2147483648 in decimal). Is this expected, or some known edge case/quirk?
I passed the binary representation of Java's `Integer.MIN_VALUE` to `Integer.parseInt`, expecting to get back -2147483648, but instead got a `NumberFormatException`.
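A minimal reproduction (the class name is just for illustration):

```java
public class ParseIntDemo {
    public static void main(String[] args) {
        // 32-bit two's-complement pattern of Integer.MIN_VALUE: 1 followed by 31 zeros
        String bits = "10000000000000000000000000000000";
        // Throws NumberFormatException here
        int value = Integer.parseInt(bits, 2);
        System.out.println(value);
    }
}
```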
Note: I'm not trying to achieve anything in particular; I was just fiddling around on the JVM and came across this.
The `Integer.parseInt()` method with a radix of 2 does not convert the bit representation of a two's-complement integer to an `int` value. Instead, it "just" parses the string as a signed integer in the given radix. That means "10000000000000000000000000000000" in binary is 2147483648 in decimal, which is too large to store in an `int`, because the maximum `int` value in Java is 2147483647 (`Integer.MAX_VALUE`). That's why you get a `NumberFormatException`. The result will never be a negative value like the `Integer.MIN_VALUE` you expected; it is only negative when the first character of the string is a `-` sign, as mentioned in the javadoc of `Integer.parseInt()`:
> [...] except that the first character may be an ASCII minus sign '-' ('\u002D') to indicate a negative value [...]
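To illustrate both points, here is a short sketch (the class name is just for illustration; `Integer.parseUnsignedInt` exists since Java 8, so it is available on your OpenJDK 17):

```java
public class MinValueParsing {
    public static void main(String[] args) {
        String bits = "10000000000000000000000000000000"; // 1 followed by 31 zeros

        // Signed parse: 2^31 = 2147483648 > Integer.MAX_VALUE, so this throws
        try {
            Integer.parseInt(bits, 2);
        } catch (NumberFormatException e) {
            System.out.println("parseInt failed: " + e.getMessage());
        }

        // With an explicit minus sign, the magnitude 2^31 is in range for a negative int
        System.out.println(Integer.parseInt("-" + bits, 2)); // -2147483648

        // parseUnsignedInt treats the string as an unsigned 32-bit value; the
        // resulting bit pattern, reinterpreted as a signed int, is Integer.MIN_VALUE
        System.out.println(Integer.parseUnsignedInt(bits, 2)); // -2147483648

        // Round trip: toBinaryString produces the two's-complement bit pattern
        System.out.println(Integer.toBinaryString(Integer.MIN_VALUE)); // the original string
    }
}
```

If what you actually want is to reinterpret a raw 32-bit pattern as an `int`, `parseUnsignedInt` is arguably the more direct choice, since it pairs naturally with `Integer.toBinaryString`.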