Tags: c++, bitmap, grayscale, argb, hbitmap

c++ - Conversion from ARGB to Grayscale - Results are well, but upside down


I wrote a small function to convert an ARGB bitmap to grayscale. The conversion itself works fine, but the result is upside down. I cannot find the mistake.

Code:

#include <windows.h> // the post's two include names were stripped by the site; windows.h provides the GDI types used below

inline BYTE GrayScaleValue(BYTE* r, BYTE* g, BYTE* b) { return /*ceil*/(0.21f * (*r) + 0.72f * (*g) + 0.07f * (*b)); }

extern "C" __declspec(dllexport) HBITMAP ConvertToMonocrom(HBITMAP bmp) {
    INT x = 0, y = 0;
    char Gray;
    BITMAP bm;
    GetObject(bmp, sizeof(BITMAP), (LPSTR)&bm);
    BYTE * pImgByte = (BYTE *)bm.bmBits;
    INT iWidthBytes = bm.bmWidth * 4;
    for (y = 0; y < bm.bmHeight; y++) {
        for (x = 0; x < bm.bmWidth; x++) {
            Gray = GrayScaleValue(&pImgByte[y * iWidthBytes + x * 4 + 3], &pImgByte[y * iWidthBytes + x * 4 + 2], &pImgByte[y * iWidthBytes + x * 4 + 1]);
            pImgByte[y * iWidthBytes + x * 4] = Gray;
            pImgByte[y * iWidthBytes + x * 4 + 1] = Gray;
            pImgByte[y * iWidthBytes + x * 4 + 2] = Gray;
            pImgByte[y * iWidthBytes + x * 4 + 3] = Gray;
        }
    }
    return CreateBitmapIndirect(&bm);
}

Here is the Picture:

Basic picture

The picture after conversion, without setting A - only RGB: Conversion without setting Alpha-Value

The picture after conversion, as shown in the code (with setting the Alpha-Value): Conversion with setting Alpha-Value

Well, I don't know why it sets "transparent" to black...


Solution

  • An HBITMAP can reference bitmaps stored in many different formats. It could be a DIBSECTION or a device-dependent bitmap, either of which may represent the pixel values in a variety of ways. Note that the GetObject documentation lists two different ways to get information about an HBITMAP.

    My guess is that your input is a DIBSECTION with the scanlines stored top-to-bottom. When you ask for the BITMAP (with GetObject), you lose some of that format information: specifically, whether the image is bottom-up or top-down, whether it has an alpha channel, and the order of the channels (e.g., BGRA vs. ARGB). A BITMAP cannot represent as much format detail as a DIBSECTION.

    After manipulating the pixel data, you're creating a new bitmap with CreateBitmapIndirect. Since the BITMAP structure doesn't record whether the data is bottom-up or top-down, CreateBitmapIndirect uses the default, which is bottom-up. Since you (apparently) started with top-down pixel data, you've effectively flipped the image upside down.

    Your difficulty with the alpha channel can also be explained by the fact that you lost information when you tried to describe the format with just a BITMAP. If the color channels are in a different order than the default that CreateBitmapIndirect assumes, then you would see exactly this problem. By setting alpha to the same gray value as all the other channels, you've effectively hidden the fact that you've scrambled the order of the color channels.

    A Solution

    There are several different ways to go about this. Here's one possible solution:

    Using GetDIBits, you can ask Windows to copy the pixel data into a buffer in the format you want to work in, regardless of the bitmap's native format. After modifying the pixel values, you can have Windows convert them back into the bitmap's own format with SetDIBits.

    You'll have to fill out a BITMAPINFO (there are several variants) to describe the in-memory format you're expecting. You'd pass this BITMAPINFO to both GetDIBits and SetDIBits.
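    Here is a minimal sketch of that approach, assuming the image should be processed as 32-bpp BGRA; the negative biHeight requests top-down rows so the loop indexing matches memory order. This modifies the bitmap in place rather than creating a new one, and error handling is omitted:

```cpp
#include <windows.h>
#include <vector>

// Sketch: grayscale an HBITMAP via GetDIBits/SetDIBits, independent of
// the bitmap's native storage format.
HBITMAP ConvertToGrayscale(HBITMAP bmp) {
    BITMAP bm = {};
    GetObject(bmp, sizeof(bm), &bm);

    BITMAPINFO bmi = {};
    bmi.bmiHeader.biSize        = sizeof(BITMAPINFOHEADER);
    bmi.bmiHeader.biWidth       = bm.bmWidth;
    bmi.bmiHeader.biHeight      = -bm.bmHeight;  // negative = top-down rows
    bmi.bmiHeader.biPlanes      = 1;
    bmi.bmiHeader.biBitCount    = 32;            // ask for 32-bpp BGRA
    bmi.bmiHeader.biCompression = BI_RGB;

    std::vector<BYTE> pixels(
        static_cast<size_t>(bm.bmWidth) * bm.bmHeight * 4);

    HDC hdc = GetDC(nullptr);
    // Windows converts the bitmap's native format into our BGRA buffer.
    GetDIBits(hdc, bmp, 0, bm.bmHeight, pixels.data(), &bmi, DIB_RGB_COLORS);

    for (size_t i = 0; i < pixels.size(); i += 4) {
        // pixels[i] = B, [i+1] = G, [i+2] = R, [i+3] = A
        BYTE gray = static_cast<BYTE>(0.07f * pixels[i] +
                                      0.72f * pixels[i + 1] +
                                      0.21f * pixels[i + 2]);
        pixels[i] = pixels[i + 1] = pixels[i + 2] = gray;  // leave alpha alone
    }

    // Windows converts our BGRA buffer back into the bitmap's format.
    SetDIBits(hdc, bmp, 0, bm.bmHeight, pixels.data(), &bmi, DIB_RGB_COLORS);
    ReleaseDC(nullptr, hdc);
    return bmp;
}
```

    Because GetDIBits and SetDIBits do the format conversion, the loop never needs to know whether the source was bottom-up, top-down, or in a different channel order.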