python, entropy

Shannon entropy function returns -0.0 instead of 0.0


My Shannon entropy function for "aaaaa" should return 0.0 but gives me -0.0. How do I fix that?



Solution

  • Since -0.0 == 0.0 is true, what you have is mathematically correct: for "aaaaa" the sum of f*log2(f) is exactly 0.0, and negating it produces IEEE-754 negative zero. If you find the -0.0 output unaesthetic, the fix is simple. Don't multiply by -1; take the absolute value instead (entropy is never negative, so the result is the same):

    from collections import Counter
    import math
    
    def entropy(string):
        # Count occurrences of each symbol
        counts = Counter(string)
        # Relative frequency of each symbol
        rel_freq = (n / len(string) for n in counts.values())
        # sum(f * log2(f)) is always <= 0, so abs() gives the
        # non-negative entropy and never produces -0.0
        return abs(sum(f * math.log2(f) for f in rel_freq))
    

    Then entropy('aaaaa') evaluates to 0.0.
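
    For context, the -0.0 comes from IEEE-754 signed zero: negating a sum that is exactly 0.0 flips the sign bit, even though the two values compare equal. A minimal sketch of that behavior (the variable names here are just for illustration):

    import math
    
    p = 1.0                  # relative frequency of 'a' in "aaaaa"
    term = p * math.log2(p)  # 1.0 * 0.0 == 0.0
    print(-term)             # -0.0 (negating positive zero flips the sign bit)
    print(-term == 0.0)      # True (signed zeros compare equal)
    print(abs(-term))        # 0.0  (abs strips the sign, as in the fixed function)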