
cocoa: Read pixel color of NSImage


I have an NSImage, and I would like to read the NSColor for the pixel at some x and y. Xcode's autocomplete seems to think that there is a colorAtX:y: method on NSImage, but calling it crashes with an error saying no such method exists on NSImage. I have seen examples where you create an NSBitmapImageRep and call that method on it instead, but I have not been able to successfully convert my NSImage to an NSBitmapImageRep: the pixels in the NSBitmapImageRep come out different for some reason.

There must be a simple way to do this. It cannot be this complicated.


Solution

  • Without seeing your code it's difficult to know what's going wrong.

    You can create an NSBitmapImageRep from the image using the initWithData: initializer, passing in the image's TIFFRepresentation.

    You can then get the pixel value using colorAtX:y:, which is a method of NSBitmapImageRep, not NSImage:

    // Create a bitmap rep from the image's TIFF data (manual retain/release: we own this object)
    NSBitmapImageRep* imageRep = [[NSBitmapImageRep alloc] initWithData:[yourImage TIFFRepresentation]];
    NSSize imageSize = [yourImage size];
    // colorAtX:y: takes integer pixel coordinates with the origin at the top left,
    // so flip the y value from NSImage's bottom-left coordinate system
    NSInteger y = (NSInteger)(imageSize.height - 100.0);
    NSColor* color = [imageRep colorAtX:100 y:y];
    [imageRep release];
    

    Note that you must adjust the y value because the colorAtX:y: method uses a coordinate system that starts at the top left of the image, whereas the NSImage coordinate system starts at the bottom left.
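    A likely cause of the "pixels are different" problem is that on a Retina-backed image the bitmap rep's pixel dimensions (pixelsWide/pixelsHigh) can be larger than the NSImage's point size, so point coordinates must be scaled into the pixel grid. A minimal sketch, assuming a valid `yourImage` and a hypothetical point-space coordinate `(px, py)` that you want to sample:

    ```objc
    // Sketch: map a point-space coordinate (px, py) into the rep's pixel grid.
    // `px` and `py` are hypothetical CGFloat values in NSImage point coordinates.
    NSBitmapImageRep* rep = [[NSBitmapImageRep alloc] initWithData:[yourImage TIFFRepresentation]];
    NSSize pointSize = [yourImage size];
    CGFloat scaleX = rep.pixelsWide / pointSize.width;   // e.g. 2.0 for a 2x Retina backing
    CGFloat scaleY = rep.pixelsHigh / pointSize.height;
    NSInteger x = (NSInteger)(px * scaleX);
    NSInteger y = (NSInteger)((pointSize.height - py) * scaleY); // flip to top-left origin
    NSColor* color = [rep colorAtX:x y:y];
    [rep release];
    ```

    If the scale factors come out as 1.0, the rep and the image agree and no scaling is needed.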

    Alternatively, if the pixel is visible on-screen then you can use the NSReadPixel() function to get the color of a pixel in the current coordinate system.
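    A minimal sketch of the NSReadPixel() route, assuming a hypothetical `someView` that is currently on screen; note that NSReadPixel() reads from the currently focused view, so you must lock focus first (and that the function has been deprecated since macOS 10.14):

    ```objc
    // Sketch: sample the on-screen pixel at (100, 100) in someView's coordinates.
    // `someView` is a hypothetical NSView that is visible on screen.
    [someView lockFocus];
    NSColor* color = NSReadPixel(NSMakePoint(100.0, 100.0));
    [someView unlockFocus];
    ```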