`capture` is a `CGImageRef` returned from a call to `CGWindowListCreateImage()`. When I try to turn it into an `NSImage` directly via `initWithCGImage:size:`, it mysteriously doubles in size. If I instead manually create an `NSBitmapImageRep` from `capture` and then add it to an empty `NSImage`, everything works fine.
My hardware setup is a Retina MBP plus a non-Retina external display; the capture is taking place on the non-Retina screen.
NSLog(@"capture image size: %zu %zu", CGImageGetWidth(capture), CGImageGetHeight(capture)); // CGImageGetWidth/Height return size_t
NSLog(@"logical image size: %f %f", viewRect.size.width, viewRect.size.height);
NSBitmapImageRep *debugRep;
NSImage *image;
//
// Create NSImage directly
image = [[NSImage alloc] initWithCGImage:capture size:NSSizeFromCGSize(viewRect.size)];
debugRep = [[image representations] objectAtIndex:0];
NSLog(@"pixel size, NSImage direct: %ld %ld", (long)debugRep.pixelsWide, (long)debugRep.pixelsHigh); // pixelsWide/High are NSInteger
//
// Create representation manually
NSBitmapImageRep *imageRep = [[NSBitmapImageRep alloc] initWithCGImage:capture];
image = [[NSImage alloc] initWithSize:NSSizeFromCGSize(viewRect.size)];
[image addRepresentation:imageRep];
[imageRep release];
debugRep = [[image representations] objectAtIndex:0];
NSLog(@"pixel size, NSImage + manual representation: %ld %ld", (long)debugRep.pixelsWide, (long)debugRep.pixelsHigh);
Log output:
capture image size: 356 262
logical image size: 356.000000 262.000000
pixel size, NSImage direct: 712 524
pixel size, NSImage + manual representation: 356 262
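(For what it's worth, the 2x factor matches the backing scale of the Retina screen. A small diagnostic sketch, assuming AppKit is available, that logs each attached screen's scale factor; `backingScaleFactor` and `NSStringFromRect` are real AppKit/Foundation APIs, the helper function itself is just illustration:)

```objectivec
#import <Cocoa/Cocoa.h>

// Log the backing scale factor of every attached screen, to correlate
// the doubled pixel size with the Retina display's 2.0 scale.
static void LogScreenScales(void) {
    for (NSScreen *screen in [NSScreen screens]) {
        NSLog(@"screen %@ backingScaleFactor: %f",
              NSStringFromRect(screen.frame),
              screen.backingScaleFactor);
    }
}
```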
Is this expected behaviour? The documentation for `initWithCGImage:size:` states:

> You should not assume anything about the image, other than that drawing it is equivalent to drawing the `CGImage`.
In the end I just continued working with `NSBitmapImageRep` instances directly.
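(For reference, a minimal helper wrapping the workaround above; the name `ImageFromCGImageExactPixels` is mine, not an AppKit API, and it assumes manual retain/release as in the snippet:)

```objectivec
#import <Cocoa/Cocoa.h>

// Build an NSImage whose single representation has exactly the CGImage's
// pixel dimensions, paired with the given logical (point) size.
static NSImage *ImageFromCGImageExactPixels(CGImageRef cgImage, NSSize pointSize) {
    NSBitmapImageRep *rep = [[NSBitmapImageRep alloc] initWithCGImage:cgImage];
    NSImage *image = [[NSImage alloc] initWithSize:pointSize];
    [image addRepresentation:rep];
    [rep release]; // manual retain/release, matching the original snippet
    return [image autorelease];
}
```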