The NSOpenGLView class has a read/write property openGLContext, and as I read the documentation, the getter should never return nil:

"If the receiver has no associated context object, a new NSOpenGLContext object is created. The new object is initialized with the receiver’s pixel format information."
On macOS 10.13, that seems to be true, but on macOS 10.9, it returns nil after I add the view to a window. That is, if I do this:
NSOpenGLView* glView = [[[NSOpenGLView alloc]
    initWithFrame: NSMakeRect( 0.0, 0.0, 100.0, 100.0 )
    pixelFormat: [NSOpenGLView defaultPixelFormat]] autorelease];
NSLog(@"Pixel format %@, context %@", glView.pixelFormat, glView.openGLContext );
[self.host addSubview: glView];
NSLog(@"Pixel format %@, context %@", glView.pixelFormat, glView.openGLContext );
Then on the second log, the context is nil. Is there any fix? Making my own NSOpenGLContext and assigning it doesn't help. I have verified that the host view is part of a window and that the windowNumber is positive.
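For reference, that verification amounts to something like this (a minimal sketch; the log format is mine):

// After [self.host addSubview: glView]:
NSWindow* win = glView.window;
NSLog(@"Window %@, number %ld", win, (long)win.windowNumber);
// win is non-nil and win.windowNumber > 0, yet openGLContext is still nil.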
Apparently, on Mavericks you must set the context's view as well as the view's context. Something like this works:
NSOpenGLPixelFormat* pf = [NSOpenGLView defaultPixelFormat];
NSOpenGLView* glView = [[[NSOpenGLView alloc]
    initWithFrame: NSMakeRect( 0.0, 0.0, 100.0, 100.0 )
    pixelFormat: pf] autorelease];
NSOpenGLContext* glc = [[[NSOpenGLContext alloc]
    initWithFormat: glView.pixelFormat shareContext: nil] autorelease];
[self.host addSubview: glView];
glc.view = glView; // the key step I was missing
glView.openGLContext = glc;
NSLog(@"Pixel format %@, context %@", glView.pixelFormat, glView.openGLContext );