Core Video pixel buffers as GL_TEXTURE_2D

Tags: macos, opengl, core-video


So I've set up CVPixelBuffers and tied them to OpenGL FBOs successfully on iOS. But now trying to do the same on OS X has me snagged.

The textures returned by CVOpenGLTextureCacheCreateTextureFromImage come back with a GL_TEXTURE_RECTANGLE target instead of GL_TEXTURE_2D.

I've found the kCVOpenGLBufferTarget key, but it seems to be meant for use with CVOpenGLBufferCreate, not CVPixelBufferCreate.

Is it even possible to get GL_TEXTURE_2D-targeted textures on OS X with CVPixelBufferCreate, and if so, how?

FWIW, a listing of the Core Video pixel buffer setup:

NSDictionary *bufferAttributes = @{ (__bridge NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA),
                                    (__bridge NSString *)kCVPixelBufferWidthKey : @(size.width),
                                    (__bridge NSString *)kCVPixelBufferHeightKey : @(size.height),
                                    (__bridge NSString *)kCVPixelBufferIOSurfacePropertiesKey : @{ } };

if (pool)
{
    error = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool, &renderTarget);
}
else
{
    // CVPixelBufferCreate takes size_t dimensions
    error = CVPixelBufferCreate(kCFAllocatorDefault, (size_t)size.width, (size_t)size.height, kCVPixelFormatType_32BGRA, (__bridge CFDictionaryRef)bufferAttributes, &renderTarget);
}

ZAssert(!error, @"Couldn't create pixel buffer");

// NSOpenGLContext has no +context/+format class methods; assuming here that
// glContext and glPixelFormat are the NSOpenGLContext and NSOpenGLPixelFormat in use.
error = CVOpenGLTextureCacheCreate(kCFAllocatorDefault, NULL, [glContext CGLContextObj], [glPixelFormat CGLPixelFormatObj], NULL, &textureCache);
ZAssert(!error, @"Could not create texture cache.");

error = CVOpenGLTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache, renderTarget, NULL, &renderTexture);
ZAssert(!error, @"Couldn't create a texture from cache.");

GLuint reference = CVOpenGLTextureGetName(renderTexture);
GLenum target = CVOpenGLTextureGetTarget(renderTexture);
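
(Not shown above, but for context: the texture then gets attached to an FBO. A minimal sketch of that step, using whichever target Core Video reported — on OS X, target comes back as GL_TEXTURE_RECTANGLE, which desktop GL accepts as the textarget:)

    GLuint fbo;
    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    // Attach using the target/name queried from Core Video above.
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, target, reference, 0);
    ZAssert(glCheckFramebufferStatus(GL_FRAMEBUFFER) == GL_FRAMEBUFFER_COMPLETE, @"FBO incomplete");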

UPDATE: I've been able to successfully use the resulting GL_TEXTURE_RECTANGLE textures. However, this causes a lot of shader-compatibility problems between iOS and OS X, and in any case I'd rather keep using normalised texture coordinates.
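
(One detail from getting that working, sketched here for anyone else: rather than hard-coding 0..1 coordinates, Core Video will report the texture's own per-corner coordinates — unnormalised pixel coordinates for GL_TEXTURE_RECTANGLE — via CVOpenGLTextureGetCleanTexCoords:)

    // Sketch: fetch the coordinate convention Core Video expects for this
    // texture; for GL_TEXTURE_RECTANGLE these come back in pixels.
    GLfloat lowerLeft[2], lowerRight[2], upperRight[2], upperLeft[2];
    CVOpenGLTextureGetCleanTexCoords(renderTexture, lowerLeft, lowerRight, upperRight, upperLeft);
    // Feed these four pairs in as the quad's texcoord attribute when drawing.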

If it isn't possible to get GL_TEXTURE_2D textures directly from a CVPixelBuffer in this manner, would it be possible to create a CVOpenGLBuffer and have a CVPixelBuffer attached to it to pull the pixel data?


Solution

  • Just came across this, and I'm going to answer it even though it's old, in case others encounter it.

    iOS uses OpenGL ES (1.1 at first, then 2.0 and 3.0). OS X uses regular old (non-ES) OpenGL, with a choice of the legacy/compatibility profile (OpenGL 2.1 only) or the Core profile (OpenGL 3.2 and later).

    The difference here is that OpenGL (non-ES) was designed a long time ago, when there were real restrictions on texture sizes — most notably, texture dimensions had to be powers of two. As hardware lifted those restrictions, extensions were added, including GL_TEXTURE_RECTANGLE, which allows arbitrary sizes but uses unnormalised, pixel-based coordinates. Nowadays it's no big deal for any GPU to support any texture size, but for API-compatibility reasons OpenGL can't really be fixed. Since OpenGL ES is technically a parallel but separate API, designed much more recently, it could correct the problem from the beginning (i.e. it never had to worry about breaking old code). So OpenGL ES never defined a GL_TEXTURE_RECTANGLE; it simply defines GL_TEXTURE_2D without the power-of-two restriction (fully in ES 3.0; ES 2.0 allows non-power-of-two sizes with some limits on mipmapping and wrap modes).

    Short answer: OS X uses desktop OpenGL, which for legacy-compatibility reasons still treats rectangle textures separately, while iOS uses OpenGL ES, which places no such size restrictions on GL_TEXTURE_2D and so never offered a GL_TEXTURE_RECTANGLE at all. Thus, on OS X, Core Video produces GL_TEXTURE_RECTANGLE objects, because forcing GL_TEXTURE_2D could waste a lot of memory (on hardware without non-power-of-two support, a 1280×720 video frame would have to be padded out to a 2048×1024 texture — more than double the pixels), while on iOS it produces GL_TEXTURE_2D objects because GL_TEXTURE_RECTANGLE doesn't exist, nor is it necessary.

    It's an unfortunate incompatibility between OpenGL and OpenGL ES, but it is what it is, and there's nothing to be done but code around it — for example, by switching the sampler type and coordinate scaling per platform. Or, these days, you can (and probably should) consider moving on to Metal.
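
    (A sketch of one way to code around it — not from the original answer; the RECTANGLE macro and the SAMPLER/TEXTURE/COORD/uTexSize names are made up for illustration. The idea is to compile one fragment shader source for both platforms, prepending "#define RECTANGLE" on OS X:)

        // One fragment shader source for both platforms. On OS X, prepend
        // "#define RECTANGLE\n" before compiling; on iOS, compile as-is.
        static const char *kFragSrc =
            "#ifdef RECTANGLE\n"
            "#extension GL_ARB_texture_rectangle : require\n"
            "#define SAMPLER sampler2DRect\n"
            "#define TEXTURE texture2DRect\n"
            "uniform vec2 uTexSize;\n"              // texture size in pixels
            "#define COORD(uv) ((uv) * uTexSize)\n" // normalised -> pixel coords
            "#else\n"
            "precision mediump float;\n"            // required by GLSL ES fragment shaders
            "#define SAMPLER sampler2D\n"
            "#define TEXTURE texture2D\n"
            "#define COORD(uv) (uv)\n"
            "#endif\n"
            "uniform SAMPLER uTexture;\n"
            "varying vec2 vTexCoord;\n"
            "void main() { gl_FragColor = TEXTURE(uTexture, COORD(vTexCoord)); }\n";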