ios, xcode, opencv, image-processing, adaptive-threshold

thresholding an image with bright zones


I am developing an iOS app with OpenCV that takes a picture of a monitor and extracts a curve, but when the image has some bright zones, after thresholding I don't get the complete curve, just some black zones.

Original image

Processed image after thresholding

        // Convert the UIImage to a single-channel grayscale cv::Mat
        cv::Mat original = [MAOpenCV cvMatGrayFromUIImage:_sourceImage];
        // With CV_THRESH_OTSU the value 70 is ignored; the threshold is computed automatically
        cv::threshold(original, original, 70, 255, CV_THRESH_BINARY | CV_THRESH_OTSU);

        std::vector<std::vector<cv::Point> > contours;
        std::vector<cv::Vec4i> hierarchy;
        cv::findContours(original, contours, hierarchy, CV_RETR_CCOMP, CV_CHAIN_APPROX_SIMPLE);

        cv::Mat skel(original.size(), CV_8UC1, cv::Scalar(0));

        // Walk the top-level contours and keep only the large ones
        for (int idx = 0; idx >= 0 && !contours.empty(); idx = hierarchy[idx][0])
        {
            if (contours[idx].size() > 250) {
                cv::Scalar color(255, 255, 255);
                cv::drawContours(skel, contours, idx, color, CV_FILLED, 8, hierarchy);
            }
        }
        cv::threshold(skel, skel, 100, 255, CV_THRESH_BINARY_INV);
        cv::erode(skel, skel, cv::Mat(), cv::Point(-1, -1), 2);

So how can I process the image to extract the curve when the image has bright zones like in the example?


Solution

  • When you have a background with uneven illumination, you may want to first apply a White Top-Hat (available in both MATLAB and OpenCV); a rough code sketch is shown below.

    Here is the result I got using a disk-shaped structuring element of radius 3.

    Then, whatever thresholding method you choose will work.
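
    A minimal C++ sketch of this approach, assuming a 7x7 elliptical kernel as a stand-in for the radius-3 disk, an Otsu threshold after the top-hat, and placeholder file names; it illustrates the technique rather than reproducing the exact code used for the result above.

        #include <opencv2/opencv.hpp>

        int main()
        {
            // Load the monitor photo as grayscale ("curve.png" is a placeholder name)
            cv::Mat gray = cv::imread("curve.png", cv::IMREAD_GRAYSCALE);
            if (gray.empty()) return 1;

            // Elliptical structuring element approximating a disk of radius 3
            cv::Mat disk = cv::getStructuringElement(cv::MORPH_ELLIPSE, cv::Size(7, 7));

            // White top-hat = image minus its morphological opening: it keeps
            // bright structures thinner than the disk (the curve) and suppresses
            // the slowly varying bright background
            cv::Mat tophat;
            cv::morphologyEx(gray, tophat, cv::MORPH_TOPHAT, disk);

            // With the background flattened, a global Otsu threshold is enough
            cv::Mat curve;
            cv::threshold(tophat, curve, 0, 255, cv::THRESH_BINARY | cv::THRESH_OTSU);

            cv::imwrite("curve_mask.png", curve);
            return 0;
        }

    The key point is that the top-hat removes any bright region wider than the structuring element, so only thin bright structures such as the curve survive to the thresholding step.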