Tags: python, comparison, types, operators, logic

Why is Decimal('0') > 9999.0 True in Python?


This is somewhat related to my question Why is ''>0 True in Python?

In Python 2.6.4:

>>> from decimal import Decimal
>>> Decimal('0') > 9999.0
True

From the answer to my original question, I understand that, when comparing objects of different types in Python 2.x, the types are ordered by their name. But in this case:

>>> type(Decimal('0')).__name__ > type(9999.0).__name__
False
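
Other mixed-type pairs do seem to follow the name ordering, as far as I can tell:

>>> () > []          # 'tuple' > 'list'
True
>>> [] > 'foo'       # 'list' < 'str'
False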

Why does Decimal('0') > 9999.0 evaluate to True, then?

UPDATE: I usually work on Ubuntu (Linux 2.6.31-20-generic #57-Ubuntu SMP Mon Feb 8 09:05:19 UTC 2010 i686 GNU/Linux, Python 2.6.4 (r264:75706, Dec 7 2009, 18:45:15) [GCC 4.4.1] on linux2). On Windows (WinXP Professional SP3, Python 2.6.4 (r264:75706, Nov 3 2009, 13:23:17) [MSC v.1500 32 bit (Intel)] on win32), the same comparison gives the opposite result:

>>> Decimal('0') > 9999.0
False

I'm even more puzzled now. %-(


Solution

  • Because the decimal module does not compare against any type except long, int, and Decimal. For anything else, it silently gives up: the "not something it knows about" operand is rejected with NotImplemented, and the comparison falls through to Python's default mixed-type ordering. You can see this behavior in the _convert_other() function of decimal.py.
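
    Roughly, that helper behaves like this (a simplified reconstruction for illustration, not the verbatim stdlib source; the name _convert_other_sketch is mine):

        from decimal import Decimal

        def _convert_other_sketch(other):
            # Only Decimals, ints and longs are accepted; anything else
            # (a float, a string, ...) is rejected with NotImplemented.
            if isinstance(other, Decimal):
                return other
            if isinstance(other, (int, long)):
                return Decimal(other)
            return NotImplemented

        print _convert_other_sketch(9999.0)    # prints: NotImplemented

    Every comparison method on Decimal pushes its operand through this helper first; when NotImplemented comes back, Decimal gives up and leaves the result to the interpreter's default mixed-type machinery.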

    Silly, silly Decimal class.

    Oh, see http://bugs.python.org/issue2531 as well.

    So, here is what happens:

    1. Decimal('0') > 9999.0 first asks the Decimal operand; _convert_other() cannot handle the float, so Decimal gives up and returns NotImplemented.

    2. Python then tries the comparison from the float's side, and float knows nothing about Decimal, so that attempt fails as well.

    3. With both operands refusing, CPython 2.x falls back to default_3way_compare() in Objects/object.c. That fallback normally orders objects by type name, but anything that passes PyNumber_Check() is treated as having the empty string for a name, and both Decimal and float pass, so the names tie.

    4. The tie between two incomparable numeric types is broken by comparing the memory addresses of the two type objects. That ordering is consistent within a single interpreter run but arbitrary across builds, which is exactly why you see True on Linux and False on Windows.
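
    In Python terms, that fallback looks roughly like this (a loose reconstruction of default_3way_compare() for illustration, not the real C code; the name default_3way_compare_sketch is mine):

        import operator
        from decimal import Decimal

        def default_3way_compare_sketch(v, w):
            # CPython 2.x's last-resort ordering for objects that refuse
            # to compare themselves with each other.
            if type(v) is type(w):
                return cmp(id(v), id(w))   # same type: order by address
            if v is None:                  # None sorts before everything
                return -1
            if w is None:
                return 1
            # Numbers (anything passing PyNumber_Check, exposed in Python
            # as operator.isNumberType) get '' as their "type name", so
            # they sort before non-numbers.  Decimal passes this check.
            vname = '' if operator.isNumberType(v) else type(v).__name__
            wname = '' if operator.isNumberType(w) else type(w).__name__
            c = cmp(vname, wname)
            if c != 0:
                return c
            # Two different numeric types tie on the empty name; break
            # the tie by the addresses of the type objects, which is
            # arbitrary from build to build.
            return cmp(id(type(v)), id(type(w)))

        # 1 or -1, depending on where the two type objects happen to live:
        print default_3way_compare_sketch(Decimal('0'), 9999.0)

    The practical fix is to compare like with like by converting one side explicitly:

        >>> Decimal('0') > Decimal('9999.0')
        False
        >>> float(Decimal('0')) > 9999.0
        False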