Tags: c++, opencv, image-processing, computer-vision, webcam

Filter fluctuating lighting with OpenCV


I need to do fairly sensitive color (brightness) measurements in webcam footage using OpenCV. The problem is that the ambient light fluctuates, which makes it hard to get accurate results. I'm looking for a way to continuously correct successive frames of the video to smooth out the global lighting differences. The light changes I'm trying to filter out occur globally, in most or all of the image. I have tried calculating a frame-to-frame difference and subtracting it, but with little luck. Does anyone have advice on how to approach this problem?

EDIT: The two images below are from the same video, with the color changes slightly magnified. If you alternate between them, you'll see that there are slight changes in lighting, probably due to clouds shifting outside. The problem is that these changes obscure any other color changes I might want to detect.

So I would like to filter out these particular changes. Given that I only need part of each frame I capture, I figured it should be possible to estimate the lighting changes from the rest of the footage, outside my area of interest, and filter them out.

I have tried to capture the dominant frequencies of the changes using cv::dft, so that I could simply ignore the lighting changes, but I am not familiar enough with that function. I have only been using OpenCV for a week, so I am still learning.

[frame 1: still from the video] [frame 2: same scene under slightly different lighting]


Solution

  • Short answer: temporal low-pass filter on illumination as a whole

    Consider the illumination, conceptually, as a time sequence of values representing something like the light flux impinging upon the scene being photographed. Ideally this function is constant; the second-best situation is that it varies as slowly as possible. A low-pass filter turns a function that varies rapidly into one that varies more slowly. The basic steps are thus: (1) calculate a total illumination function, (2) compute a new illumination function using a low-pass filter, and (3) normalize the original image sequence to the new illumination values.
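    Here is a compact sketch of that whole loop, assuming a live cv::VideoCapture source; the reference rectangle `roi` and the smoothing factor `alpha` are made-up values you would tune for your own footage (each step is refined below):

    ```cpp
    #include <opencv2/opencv.hpp>
    #include <algorithm>

    int main()
    {
        cv::VideoCapture cap(0);            // camera index 0 is an assumption
        if (!cap.isOpened()) return 1;

        const cv::Rect roi(0, 0, 320, 40);  // hypothetical patch of constant reflectivity; tune
        const double alpha = 0.05;          // low-pass strength; tune
        double smoothed = -1.0;

        cv::Mat frame, gray;
        while (cap.read(frame)) {
            // (1) illumination proxy: mean luminance over the reference region
            cv::cvtColor(frame(roi), gray, cv::COLOR_BGR2GRAY);
            double measured = std::max(cv::mean(gray)[0], 1.0);

            // (2) temporal low-pass: exponential moving average
            smoothed = (smoothed < 0.0) ? measured
                                        : alpha * measured + (1.0 - alpha) * smoothed;

            // (3) normalize the frame to the smoothed illumination
            cv::Mat corrected;
            frame.convertTo(corrected, -1, smoothed / measured);

            cv::imshow("corrected", corrected);
            if (cv::waitKey(1) == 27) break;  // Esc quits
        }
        return 0;
    }
    ```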

    (1) The simplest way of calculating an illumination function is to add up the luminance values of every pixel in the image. In simple cases this might even work; as you might guess from my tone, though, there are a number of caveats.

    An important issue is that you'd prefer to add up illumination values not in some color space (such as HSV) but rather in some physical measure of illumination. Going back from a color space to the actual light in the room requires data that isn't in the image, such as the spectral reflectivity of every surface in the scene, so that's usually infeasible. As a proxy, you can use only a part of the image that has a consistent reflectivity. In the sample images, the desk surface at the top of the frame could be used: select a fixed geometric region and compute a total illumination value from that.

    Related to this, if there are regions of the image where the camera has saturated, you've lost a lot of information there and the total illumination value won't relate well to the physical illumination. Simply cut any such regions out of the calculation (but do it consistently across all frames).
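    As a sketch of step (1) under those assumptions: the rectangle `roi` (a hypothetical patch of constant reflectivity, like the desk surface) and the saturation threshold are values you would choose for your own scene:

    ```cpp
    #include <opencv2/opencv.hpp>

    // Relative illumination measure for one frame: mean luminance over a
    // fixed reference region, ignoring (near-)saturated pixels. Grayscale
    // luminance is only a proxy for physical illumination (see caveats above).
    double measureIllumination(const cv::Mat& frameBgr, const cv::Rect& roi,
                               int saturationThreshold = 250)
    {
        cv::Mat gray;
        cv::cvtColor(frameBgr(roi), gray, cv::COLOR_BGR2GRAY);

        // Mask out saturated pixels; apply the same rule on every frame.
        cv::Mat valid = gray < saturationThreshold;
        return cv::mean(gray, valid)[0];
    }
    ```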

    (2) Compute a low-pass filter of the illumination function. Such filters are a fundamental part of every signal-processing package. I don't know enough about OpenCV to say whether it has an appropriate function built in, so you might need another library. There are many kinds of low-pass filter, but they should all give similar results here.
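    OpenCV itself isn't needed for this part. A minimal sketch, assuming a single-pole IIR filter (an exponential moving average) is good enough; it is causal, so it works frame by frame on a live stream:

    ```cpp
    // Exponential moving average: the simplest causal low-pass filter.
    // Smaller alpha means stronger smoothing (lower cut-off frequency).
    class LowPass {
    public:
        explicit LowPass(double alpha) : alpha_(alpha) {}

        double update(double x) {
            state_ = initialized_ ? alpha_ * x + (1.0 - alpha_) * state_ : x;
            initialized_ = true;
            return state_;
        }

    private:
        double alpha_;
        double state_ = 0.0;
        bool initialized_ = false;
    };
    ```

    If you process the video offline instead, a symmetric moving average or a proper FIR design avoids the phase lag this filter introduces.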

    (3) Once you've got the low-pass time series, use it to normalize the illumination of each frame: multiply the image by the ratio of the low-pass value to the measured value, so the corrected frame takes on the smoothed illumination instead of the fluctuating one. (If you want constant output brightness instead, divide the low-pass series by its average value first, giving a series with average value 1, and normalize against that.) All the earlier warnings about working ideally in a physical illumination space rather than a color space apply.
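    A minimal sketch of that per-frame correction, assuming the helpers above; multiplying the BGR channels directly is an approximation of scaling physical illumination:

    ```cpp
    #include <opencv2/opencv.hpp>
    #include <algorithm>

    // Scale a frame so its measured illumination matches the low-passed value.
    cv::Mat normalizeFrame(const cv::Mat& frameBgr, double measured, double smoothed)
    {
        const double gain = smoothed / std::max(measured, 1e-6);  // guard division
        cv::Mat corrected;
        frameBgr.convertTo(corrected, -1, gain);  // per-channel scale, saturating
        return corrected;
    }
    ```

    Fed with the output of measureIllumination() and LowPass::update(), this divides out the fast flicker while keeping slow drift, which is exactly the low-pass behaviour described above.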