How can I create a CGImageRef from an NSBitmapImageRep? Or how can I define a completely new CGImageRef in the same way as the NSBitmapImageRep? Defining the NSBitmapImageRep works fine, but I need the image as a CGImageRef.
unsigned char *plane = (unsigned char *)[data bytes];   // data = 3 bytes per RGB pixel, interleaved
NSBitmapImageRep *imageRep = [[NSBitmapImageRep alloc]
    initWithBitmapDataPlanes: &plane
                  pixelsWide: width
                  pixelsHigh: height
               bitsPerSample: depth            // e.g. 8
             samplesPerPixel: channel          // e.g. 3 (RGB, no alpha)
                    hasAlpha: NO
                    isPlanar: NO
              colorSpaceName: NSCalibratedRGBColorSpace
              //bitmapFormat: NSAlphaFirstBitmapFormat
                 bytesPerRow: channel * width
                bitsPerPixel: channel * depth];
I have no idea how to create the CGImageRef from the NSBitmapImageRep, or how to define a new CGImageRef directly:
CGImageRef imageRef = CGImageCreate(width, height, depth, channel*depth, channel*width, CGColorSpaceCreateDeviceRGB(), ... );
Can somebody please give me a hint?
The easy way is to use the CGImage property (introduced in Mac OS X 10.5):
CGImageRef image = imageRep.CGImage;
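Note that, as the documentation quoted below says, the returned CGImageRef is autoreleased; if you need it to outlive the current autorelease pool (or the image rep), retain it yourself, for example:

CGImageRef image = CGImageRetain(imageRep.CGImage); // keep it alive past the autorelease pool
// ... use the image ...
CGImageRelease(image);                              // balance the retain when done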
Documentation:
Return Value
Returns an autoreleased CGImageRef opaque type based on the receiver’s current bitmap data.
Discussion
The returned CGImageRef has pixel dimensions that are identical to the receiver’s. This method might return a preexisting CGImageRef opaque type or create a new one. If the receiver is later modified, subsequent invocations of this method might return different CGImageRef opaque types.
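For the second part of the question (defining a new CGImageRef directly with CGImageCreate), here is a minimal sketch. It assumes the same variables as in the question: data is the NSData holding the interleaved RGB bytes, depth is 8 bits per sample, channel is 3 samples per pixel, and the rows are tightly packed (channel * width bytes per row):

// Wrap the existing NSData in a data provider; the provider retains the data, no copy is made.
CGDataProviderRef provider = CGDataProviderCreateWithCFData((CFDataRef)data); // use (__bridge CFDataRef) under ARC
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

CGImageRef imageRef = CGImageCreate(width,
                                    height,
                                    depth,                    // bits per component, e.g. 8
                                    channel * depth,          // bits per pixel, e.g. 24
                                    channel * width,          // bytes per row
                                    colorSpace,
                                    kCGBitmapByteOrderDefault | kCGImageAlphaNone,
                                    provider,
                                    NULL,                     // no decode array
                                    NO,                       // shouldInterpolate
                                    kCGRenderingIntentDefault);

CGColorSpaceRelease(colorSpace);
CGDataProviderRelease(provider);
// Release imageRef with CGImageRelease() when you no longer need it.

Because the data provider holds a reference to the CFData, the pixel bytes stay valid for the lifetime of the image.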