rust, xlib

Image displays incorrectly in Xlib


I'm trying to display an image with plain Xlib. I almost succeeded, but for some reason the resulting image has the wrong colors. Here's the snippet of code that shows the image:

/* preceding initialization */

image = XCreateImage(display, CopyFromParent, XDefaultDepth(display, screen), ZPixmap, 0, buffer, width, height, 32, 0);

XPutImage(display, window, XDefaultGC(display, screen), image, 0, 0, 0, 0, width, height);

/* handling events */

buffer contains the image pixels retrieved by another library. Here's the original image: [original image]

And here's what I got: [rendered result]

For some reason blue and red are swapped, and yellow somehow became cyan. I think the problem is a mismatch between the picture's color depth (8 bits) and the screen depth (24 bits). If I understood correctly, Xlib expects each pixel as 3 bytes (24 bits), but my image contains only 1 byte (8 bits) per channel. I also tried to change the window's color depth with XGetVisualInfo, but the returned list had no option below 24 bits. Or maybe I misunderstood how pixels are stored in memory? Isn't it just an array of bytes, like this:

11111111 R
11111111 G
11111111 B

11111111 R
00000000 G
11111111 B

00000000 R
11111111 G
11111111 B

And so on...
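The R, G, B layout sketched above is, in fact, where the mismatch comes from. On a typical little-endian (x86) server with a 24/32-bit TrueColor visual, a 32-bit ZPixmap pixel is a u32 of the form 0xAARRGGBB, whose little-endian byte representation is B, G, R, A, not R, G, B, A. A minimal sketch of the effect, with no X calls involved and an illustrative helper name:

```rust
/// Pack an (r, g, b) triple into the 0xAARRGGBB pixel value that
/// 24/32-bit TrueColor visuals typically expect (alpha set to 0xFF).
fn pack_argb(r: u8, g: u8, b: u8) -> u32 {
    0xFF00_0000 | (r as u32) << 16 | (g as u32) << 8 | b as u32
}

fn main() {
    let pixel = pack_argb(0xFF, 0x00, 0x00); // pure red
    // The little-endian byte representation of 0xFFFF0000 is
    // [0x00, 0x00, 0xFF, 0xFF] -- i.e. B, G, R, A order.
    assert_eq!(pixel.to_le_bytes(), [0x00, 0x00, 0xFF, 0xFF]);

    // Feeding an RGBA byte buffer ([0xFF, 0x00, 0x00, 0xFF] for opaque
    // red) straight to the server therefore puts R where B is expected:
    let misread = u32::from_le_bytes([0xFF, 0x00, 0x00, 0xFF]);
    assert_eq!(misread, 0xFF0000FF); // the server sees blue, not red
    println!("{:08X} vs {:08X}", pixel, misread);
}
```

This also explains the yellow-to-cyan change: yellow is red plus green, so swapping the red and blue channels turns it into blue plus green, which is cyan.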

I also tried to display the same pixel buffer with another library and everything worked fine, so the problem is somewhere in my code. I didn't use a Pixmap because it makes no difference; the result is the same.


Solution

  • Finally I fixed it. All I had to do was put each pixel manually by calling XPutPixel; here's the snippet:

    // Copy the buffer into the XImage one pixel at a time, reordering each
    // pixel's bytes from R, G, B, A in the buffer to the 0xAARRGGBB value
    // the server expects.
    let mut x = 0;
    let mut y = 0;
    let mut idx = 0;

    loop {
        XPutPixel(
            image,
            x,
            y,
            u32::from_be_bytes([buffer[idx + 3], buffer[idx], buffer[idx + 1], buffer[idx + 2]]) as u64,
        );

        if x < img_info.width - 1 {
            x += 1;
        } else {
            // End of row: stop after the last row, otherwise move down.
            if y == img_info.height - 1 {
                break;
            }

            x = 0;
            y += 1;
        }

        idx += 4;
    }
    
    /* further event handling, window mapping etc. */
    

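Calling XPutPixel once per pixel works but is slow for large images. An alternative (a sketch under the same LSBFirst/x86 assumptions, not part of the original fix) is to reorder the whole buffer into 0xAARRGGBB pixels up front and hand the converted buffer to XCreateImage; the helper name `rgba_to_argb` is made up for illustration:

```rust
/// Reorder an RGBA byte buffer into the ARGB pixel values a typical
/// little-endian TrueColor server expects.
fn rgba_to_argb(rgba: &[u8]) -> Vec<u32> {
    rgba.chunks_exact(4)
        // Each chunk is [R, G, B, A]; build the u32 as A, R, G, B from
        // the most significant byte down, i.e. 0xAARRGGBB.
        .map(|px| u32::from_be_bytes([px[3], px[0], px[1], px[2]]))
        .collect()
}

fn main() {
    // One opaque red pixel: R = 0xFF, G = 0x00, B = 0x00, A = 0xFF.
    let converted = rgba_to_argb(&[0xFF, 0x00, 0x00, 0xFF]);
    assert_eq!(converted, vec![0xFFFF0000]);
    println!("{:08X}", converted[0]);
}
```

The converted Vec<u32> can then be passed (as a byte pointer) where the original code passed buffer, avoiding one libX11 call per pixel.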
    Also, the first byte can be 0; I didn't spot any difference. I thought it was alpha, but it seems to be just padding for alignment. I'm still not sure whether every X server uses this pixel format or it's just my setup; I couldn't find any info about it in the Xlib docs, and if I have any luck finding out, I'll complete this answer. You also need to know the server's image byte order: read the byte_order field of the XImage and compare it against the Xlib constants LSBFirst and MSBFirst. In my case the X server uses LSBFirst and I'm on x86, so I don't need any additional transformations.
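The byte_order check described above can be sketched like this. The constant values match X11's X.h (LSBFirst is 0, MSBFirst is 1), but the constant names here are local stand-ins rather than imports from an X binding:

```rust
const LSB_FIRST: i32 = 0; // X11's LSBFirst
const MSB_FIRST: i32 = 1; // X11's MSBFirst

/// Returns true when the host's native byte order matches the server's
/// image byte order (the XImage byte_order field), meaning pixel values
/// need no extra byte swapping before being written into the image.
fn matches_host_order(server_byte_order: i32) -> bool {
    let host = if cfg!(target_endian = "little") {
        LSB_FIRST
    } else {
        MSB_FIRST
    };
    server_byte_order == host
}

fn main() {
    // On x86 (little-endian) an LSBFirst server needs no extra swapping,
    // which matches the observation above.
    #[cfg(target_endian = "little")]
    assert!(matches_host_order(LSB_FIRST));
}
```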