Tags: c++, opencv, colors, rgb, bgr

Converting an OpenCV BGR 8-bit Image to CIE L*a*b*


I am trying to convert a given Mat representing an RGB image with 8-bit depth to Lab using the function provided in the documentation:

cvtColor(source, destination, <conversion code>);

I have tried the following conversion codes:

CV_RGB2Lab
CV_BGR2Lab
CV_LBGR2Lab

Each time I have received bizarre results, with an L* value greater than 100 for some samples, literally <107, 125, 130>.

I am also using Photoshop to check the results, but given that 107 is beyond the accepted range of 0 ≤ L* ≤ 100, I cannot work out what my error is.

Update: I'll post my overall results here. Given an image (Mat) represented as 8-bit BGR, it can be converted with the following:

cvtColor(source, destination, CV_BGR2Lab);

The pixel values can then be accessed in the following manner:

int step = destination.step;
int channels = destination.channels();
for (int i = 0; i < destination.rows; i++) {
    for (int j = 0; j < destination.cols; j++) {
        Point3_<uchar> pixelData;
        //L*: 0-255 (elsewhere represented as 0 to 100)
        pixelData.x = destination.data[step*i + channels*j + 0];
        //a*: 0-255 (elsewhere represented as -128 to 127)
        pixelData.y = destination.data[step*i + channels*j + 1];
        //b*: 0-255 (elsewhere represented as -128 to 127)
        pixelData.z = destination.data[step*i + channels*j + 2];
    }
}
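
For reference, the conventional Lab ranges can be recovered from these packed 8-bit values. Below is a minimal sketch; the helper name labConventional is mine, and it simply undoes the 8-bit mapping described in the comments above (L* scaled by 255/100, a* and b* offset by 128):

#include <opencv2/opencv.hpp>

// Illustrative helper: unpack one 8-bit Lab pixel into the conventional
// ranges (L*: 0..100, a*/b*: about -128..127).
cv::Vec3f labConventional(const cv::Vec3b& packed)
{
    float L = packed[0] * 100.0f / 255.0f; // undo the 255/100 scaling
    float a = packed[1] - 128.0f;          // undo the +128 offset
    float b = packed[2] - 128.0f;          // undo the +128 offset
    return cv::Vec3f(L, a, b);
}

// Example: the "bizarre" sample <107, 125, 130> unpacks to roughly
// <42.0, -3, 2>, which is a perfectly ordinary Lab value.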

Solution

  • That's because the L* value is in the range [0..255] in OpenCV for 8-bit images. You can simply scale this value to the interval you need ([0..100] in your case).
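
    As an alternative to rescaling by hand, converting the source to floating point first makes cvtColor produce L* in [0..100] and a*, b* in roughly [-127..127] directly. A minimal sketch, assuming an input file name of "input.png" (CV_BGR2Lab is the legacy constant used above; newer OpenCV spells it cv::COLOR_BGR2Lab):

    #include <opencv2/opencv.hpp>

    int main()
    {
        // The file name is just an example.
        cv::Mat source = cv::imread("input.png");    // 8-bit BGR
        cv::Mat sourceFloat, labFloat;

        // Scale the 8-bit BGR values into [0, 1] floats first; cvtColor then
        // yields L* in [0, 100] and a*, b* in roughly [-127, 127] directly,
        // so there is no 8-bit packing to undo.
        source.convertTo(sourceFloat, CV_32FC3, 1.0 / 255.0);
        cv::cvtColor(sourceFloat, labFloat, CV_BGR2Lab);

        cv::Vec3f lab = labFloat.at<cv::Vec3f>(0, 0); // lab[0] is already 0..100
        return 0;
    }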