Tags: python, bitmap, bmp, bit-depth

Specific image returns strange value when parsed with struct.unpack_from


I'm using the following bit of code to find the bit depth of a given image:

def parseImage(self):
    with open(self.imageAddress, "rb") as image:
        data = bytearray(image.read())
        bitDepth = struct.unpack_from("<L", data, 0x0000001c)
        print("the image's colour depth is " + str(bitDepth[0]))

It works as it should on my other test images, but when I input the small sample image from this page, it outputs 196640. I've viewed the file in Hex Editor Neo, and the value of the byte at that offset is 32 (0x20). Does anyone know why the program doesn't return this value?
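
For completeness, a minimal way to dump the exact bytes that unpack_from is reading (the path below is just a placeholder for the sample file) would be:

with open("sample.bmp", "rb") as image:  # placeholder path for the sample image
    data = image.read()

# The 4 bytes consumed by struct.unpack_from("<L", data, 0x1c)
print(data[0x1c:0x20].hex(" "))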


Solution

  • The 4 bytes starting at offset 0x1c are 20 00 03 00, which, read as a little-endian unsigned 32-bit integer, is indeed 196640 in decimal. The problem is that you only want the first two bytes, 20 00, which, again in little-endian byte order, is 32 in decimal (there is a quick struct check of this after the code below).

    The Wikipedia article on the BMP file format (in the Windows BITMAPINFOHEADER section) shows that the bits-per-pixel field is only a two-byte value, so the problem is that you're parsing too many bytes.

    The fix is simple: use the format character for a two-byte unsigned integer in the struct format string ("<H" instead of "<L"). Note that I've also added some scaffolding to turn the posted code into something runnable.

    import struct
    
    
    class Test:
        def __init__(self, filename):
            self.imageAddress = filename
    
        def parseImage(self):
            with open(self.imageAddress, "rb") as image:
                data = bytearray(image.read())
                # Offset 0x1c is the bits-per-pixel field of the BITMAPINFOHEADER,
                # which is 2 bytes wide, so "<H" (little-endian unsigned short) is
                # the correct format rather than "<L".
                bitDepth = struct.unpack_from("<H", data, 0x1c)
                print("the image's colour depth is " + str(bitDepth[0]))
    
    
    t = Test('Small Sample BMP Image File Download.bmp')
    t.parseImage()  # -> the image's colour depth is 32
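
    As a quick check of the byte arithmetic above, here is a minimal standalone snippet (no image file needed) that decodes the literal bytes 20 00 03 00 both ways:

    import struct

    raw = bytes([0x20, 0x00, 0x03, 0x00])   # the 4 bytes at offset 0x1c in the sample image

    # All four bytes as a little-endian 32-bit unsigned integer
    print(struct.unpack("<L", raw)[0])       # 196640

    # Only the first two bytes, i.e. the actual bit-depth field
    print(struct.unpack("<H", raw[:2])[0])   # 32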