Tags: python, memory-management, tuples, frozenset

Which takes less memory, a frozenset or a tuple?


I have an object which needs to be "tagged" with 0-3 strings (out of a set of 20-some possibilities); within any one object the tags are unique, and their order doesn't matter. The only operation that needs to be done on the tags is checking whether a particular one is present (specific_value in self.tags).

However, there's an enormous number of these objects in memory at once, to the point that it pushes the limits of my old computer's RAM. So saving a few bytes can add up.

With so few tags on each object, I doubt the lookup time is going to matter much. But: is there a memory difference between using a tuple and a frozenset here? Is there any other real reason to use one over the other?
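For concreteness, here's a stripped-down sketch of what I mean (the Tagged class is just illustrative):

    class Tagged:
        def __init__(self, tags):
            self.tags = tuple(tags)    # or frozenset(tags)?

    obj = Tagged(("alpha", "beta"))
    print("alpha" in obj.tags)         # the only operation performed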


Solution

  • Tuples are very compact. Sets are based on hash tables, and depend on having "empty" slots to make hash collisions less likely.
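
    One way to see that empty-slot overhead directly is to watch reported sizes grow (a quick sketch; exact numbers vary by CPython version and platform, but the pattern holds):

    import sys

    # A tuple grows by one pointer (8 bytes on a 64-bit build) per
    # element; a frozenset's reported size jumps whenever its hash
    # table resizes to keep enough empty slots free.
    for n in range(9):
        t = tuple(range(n))
        print(n, sys.getsizeof(t), sys.getsizeof(frozenset(t)))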

    For a recent enough version of CPython, sys._debugmallocstats() displays lots of potentially interesting info. Here under a 64-bit Python 3.7.3:

    >>> from sys import _debugmallocstats as d
    >>> tups = [tuple("abc") for i in range(1000000)]
    

    tuple("abc") creates a tuple of 3 1-character strings, ('a', 'b', 'c'). Here I'll edit out almost all the output:

    >>> d()
    Small block threshold = 512, in 64 size classes.
    
    class   size   num pools   blocks in use  avail blocks
    -----   ----   ---------   -------------  ------------
    ...
        8     72       17941         1004692             4
    

    Since we created a million tuples, it's a very good bet that the size class using 1004692 blocks is the one we want ;-) Each of the blocks consumes 72 bytes.

    After switching to frozensets, the output shows that each one consumes 224 bytes, a bit over 3x as much:

    >>> tups = [frozenset(t) for t in tups]
    >>> d()
    Small block threshold = 512, in 64 size classes.
    
    class   size   num pools   blocks in use  avail blocks
    -----   ----   ---------   -------------  ------------
    ...
       27    224       55561         1000092             6
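
    At this scale the difference adds up quickly: a million 3-element tuples occupy about 72 MB (1,000,000 × 72 bytes), while a million frozensets occupy about 224 MB, so the tuples save roughly 150 MB here.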
    

    In this particular case, the other answer you got happens to give the same results:

    >>> import sys
    >>> sys.getsizeof(tuple("abc"))
    72
    >>> sys.getsizeof(frozenset(tuple("abc")))
    224
    

    While that's often true, it's not always so, because an object may require allocating more bytes than it actually needs in order to satisfy hardware alignment requirements. getsizeof() doesn't know anything about that, but _debugmallocstats() shows the number of bytes Python's small-object allocator actually needs to use.

    For example,

    >>> sys.getsizeof("a")
    50
    

    On a 32-bit box, 52 bytes actually need to be used, to provide 4-byte alignment. On a 64-bit box, 8-byte alignment is currently required, so 56 bytes need to be used. Under Python 3.8 (not yet released), on a 64-bit box 16-byte alignment is required, and 64 bytes will need to be used.
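
    If you want to estimate what the allocator really hands out, here's a small sketch (allocated_size is a hypothetical helper, and the right alignment constant depends on your CPython version and platform):

    import sys

    def allocated_size(obj, alignment=8):
        # Round sys.getsizeof() up to the small-object allocator's
        # alignment: 8 bytes on a 64-bit CPython before 3.8, 16 bytes
        # under 3.8. Only objects at or below the 512-byte small-block
        # threshold go through this allocator at all.
        size = sys.getsizeof(obj)
        return (size + alignment - 1) // alignment * alignment

    print(sys.getsizeof("a"))     # 50
    print(allocated_size("a"))    # 56 on a 64-bit box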

    But ignoring all that, a tuple will always need less memory than any form of set with the same number of elements - and even less than a list with the same number of elements.
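
    To check that general claim on your own build, compare the reported sizes directly (exact numbers vary across CPython versions, and getsizeof() ignores the alignment padding discussed above):

    import sys

    # Compare a tuple, frozenset, and list holding the same elements;
    # the tuple should come out smallest every time.
    for n in (0, 1, 3, 10, 100):
        items = tuple(range(n))
        print(n,
              sys.getsizeof(items),
              sys.getsizeof(frozenset(items)),
              sys.getsizeof(list(items)))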