python, matplotlib, image-processing, astropy, fits

How to display a 16-bit colour FITS image using Python


I want to display a FITS image in Python. I have this working using astropy (as the FITS loader) and matplotlib for a greyscale image. However, I also have images that are 16-bit colour with an RGGB Bayer matrix, and I don't know how to display them as RGB colour images.

This works for a greyscale image:

import numpy as np
import matplotlib.pyplot as plt
from astropy.io import fits

# Load the FITS file and take the image data from the primary HDU
m42 = fits.open('FITS Data/frame-u-006073-4-0063.fits')
imagedata = m42[0].data

plt.imshow(imagedata, cmap='gray')
plt.colorbar()
plt.show()


However, I have a second image that is 16 bits per pixel, and I don't know how to map those bits to R, G, and B values for display in matplotlib.

For example (the first pixel is 3148):

pixel = imagedata[0][0]
r = (pixel & 0b1111000000000000) >> 12
g = (pixel & 0b0000111100000000) >> 8
g = int((g + ((pixel & 0b0000000011110000) >> 4)) / 2)
b = pixel & 0b0000000000001111

This gives 0, 8, and 12 for red, green, and blue respectively. How do I map the entire imagedata array to RGB and get matplotlib to display it? Am I right in assuming the two green values should be averaged? Any help appreciated.

Update: have I misunderstood the format of a 16-bit image with a Bayer matrix? Does each pixel hold a full 16-bit sample of just one of R, G, G, or B? In that case, do I need to look at demosaicing / debayering the image first?
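Assuming that interpretation is right, a naive half-resolution demosaic can be sketched in plain NumPy (the `demosaic_rggb` helper below is just illustrative, not a library function): each 2x2 cell [[R, G], [G, B]] collapses to one RGB pixel, with the two green samples averaged.

```python
import numpy as np

def demosaic_rggb(bayer):
    """Naive half-resolution demosaic of an RGGB Bayer mosaic.

    Each 2x2 cell [[R, G], [G, B]] becomes one RGB pixel;
    the two green samples are averaged.
    """
    r = bayer[0::2, 0::2].astype(np.float32)
    g = (bayer[0::2, 1::2].astype(np.float32) + bayer[1::2, 0::2]) / 2
    b = bayer[1::2, 1::2].astype(np.float32)
    return np.dstack([r, g, b])
```

Real debayering interpolates each colour up to full resolution rather than halving it, which is why a library routine is preferable.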


Solution

  • The answer was to use OpenCV to debayer the image and then normalise the data.

    import cv2

    # OpenCV names Bayer patterns by the 2x2 block starting at the second row
    # and column, so COLOR_BayerBG... is the conversion code for an RGGB sensor.
    # Use the ...2RGB variant because matplotlib expects RGB channel order.
    # (cv2.cvtColor requires uint8 or uint16 input; cast with
    # imagedata.astype(np.uint16) if the FITS data loads as e.g. big-endian int16.)
    debayered_image = cv2.cvtColor(imagedata, cv2.COLOR_BayerBG2RGB)

    # Rescale to 32-bit floats in [0, 1] for display
    normalised_image = cv2.normalize(debayered_image, None, alpha=0, beta=1,
                                     norm_type=cv2.NORM_MINMAX, dtype=cv2.CV_32F)
    imgplot = plt.imshow(normalised_image)
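If OpenCV is only wanted for the normalisation step, the same min-max rescale can be reproduced in plain NumPy, since plt.imshow accepts float RGB data in the [0, 1] range (a sketch equivalent to the cv2.normalize call above; the `normalise_minmax` name is just for illustration):

```python
import numpy as np

def normalise_minmax(img):
    """Linearly rescale image data to [0, 1],
    matching cv2.normalize with NORM_MINMAX, alpha=0, beta=1."""
    img = img.astype(np.float32)
    lo, hi = img.min(), img.max()
    if hi == lo:
        # A constant image carries no contrast; return zeros
        return np.zeros_like(img)
    return (img - lo) / (hi - lo)
```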