It seems like ones' complement representation of signed numbers is the most popular now (and probably the only representation used in modern hardware). Why exactly is it better than the others?
Actually the dominant representation is two's complement.
Representation methods include sign-magnitude, ones' complement, and two's complement.
Ones' complement replaced sign-magnitude because the circuitry needed to implement it was much simpler: negation is just bit inversion, so subtraction can reuse the adder instead of requiring separate logic.
Ones' complement has two representations of zero, which complicates programming since code has to test for both -0 and +0.
Two's complement does not have this problem (it has a single representation of zero), which is one reason it became the dominant representation, used almost universally today.
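To make the zero issue concrete, here is a minimal C sketch. It simulates ones' complement negation with a plain bitwise NOT on ordinary two's complement hardware (the variable names are just for illustration):

```c
#include <stdint.h>
#include <stdio.h>

int main(void) {
    // Two's complement: negation is "invert all bits, then add 1",
    // so -x == ~x + 1 and zero has exactly one bit pattern (all zeros).
    uint8_t x = 5;
    uint8_t twos_neg = (uint8_t)(~x + 1);               // 0xFB, i.e. -5
    printf("two's complement -5: 0x%02X\n", twos_neg);
    printf("two's complement -0: 0x%02X\n",
           (uint8_t)(~(uint8_t)0 + 1));                 // 0x00, same as +0

    // Ones' complement: negation is bit inversion alone, which leaves
    // two zeros: +0 = 0x00 and -0 = 0xFF.
    uint8_t ones_neg_zero = (uint8_t)~(uint8_t)0;
    printf("ones' complement -0: 0x%02X (a second, distinct zero)\n",
           ones_neg_zero);
    return 0;
}
```

Negating zero in two's complement yields the same all-zeros pattern, so an equality test against 0 just works; in ones' complement the comparison would have to accept both 0x00 and 0xFF.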