How does skimage.measure.shannon_entropy() compute the histogram before returning the entropy result? I pass it grayscale images stored as 1-, 2- and 4-byte signed and unsigned integers, as well as 32-bit floating point.
The source code is here: https://github.com/scikit-image/scikit-image/blob/main/skimage/measure/entropy.py
As you can see, it is quite short, just two lines of code. It does not do any binning at all; it uses the count output of numpy.unique. This means the histogram has as many elements as there are unique values in the image. For an 8-bit image you'll have up to 256 bins, but possibly fewer. For a 16-bit image you have up to 65536 bins, etc.
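In essence, the computation boils down to something like the sketch below (the function name shannon_entropy_sketch is mine, not part of scikit-image; see the linked source for the actual implementation):

    import numpy as np
    from scipy.stats import entropy as scipy_entropy

    def shannon_entropy_sketch(image, base=2):
        # Count occurrences of each unique pixel value (no binning),
        # then compute the Shannon entropy of those counts.
        _, counts = np.unique(image, return_counts=True)
        return scipy_entropy(counts, base=base)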
For images with higher bit depth, and especially for floating-point images, you should quantize the image to get meaningful results. In a floating-point image you can assume that nearly every pixel has a unique value, which makes the entropy computed this way essentially log2(number of pixels), not a property of the image content. Note also that Shannon entropy does not take the relationship between neighboring pixels into account: a floating-point image of a smooth ramp has the same Shannon entropy as a floating-point image with random values.
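To illustrate the effect of quantizing first, here is a small example (the specific images and the 8-bit quantization via skimage.util.img_as_ubyte are just one possible choice):

    import numpy as np
    from skimage.measure import shannon_entropy
    from skimage.util import img_as_ubyte

    rng = np.random.default_rng(0)

    # A nearly flat float image (tiny noise) and a full-range noise image.
    flat = 0.5 + 1e-6 * rng.random((256, 256))
    noise = rng.random((256, 256))

    # Without quantization every pixel value is unique, so both images
    # report about log2(256 * 256) = 16 bits, regardless of content.
    print(shannon_entropy(flat), shannon_entropy(noise))

    # Quantized to 8 bits, the values reflect the content: the flat image
    # collapses to (almost) a single gray level, giving near-zero entropy,
    # while the noise image spreads over ~256 levels, giving about 8 bits.
    print(shannon_entropy(img_as_ubyte(flat)), shannon_entropy(img_as_ubyte(noise)))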