Tags: ios, objective-c, video, opengl-es, glkit

Create an IOSurface-backed CVPixelBuffer from YUV data


I am getting raw YUV data in three separate arrays from a network callback (VoIP app). From what I understand, you cannot create IOSurface-backed pixel buffers with CVPixelBufferCreateWithPlanarBytes, according to here:

Important: You cannot use CVPixelBufferCreateWithBytes() or CVPixelBufferCreateWithPlanarBytes() with kCVPixelBufferIOSurfacePropertiesKey. Calling CVPixelBufferCreateWithBytes() or CVPixelBufferCreateWithPlanarBytes() will result in CVPixelBuffers that are not IOSurface-backed

So you have to create it with CVPixelBufferCreate, but how do you transfer the data from the callback into the CVPixelBufferRef that you create?

- (void)videoCallbackWithYPlane:(uint8_t *)yPlane
                         uPlane:(uint8_t *)uPlane
                         vPlane:(uint8_t *)vPlane
                          width:(size_t)width
                         height:(size_t)height
                        yStride:(size_t)yStride
                        uStride:(size_t)uStride
                        vStride:(size_t)vStride
{
    NSDictionary *pixelAttributes = @{(id)kCVPixelBufferIOSurfacePropertiesKey : @{}};
    CVPixelBufferRef pixelBuffer = NULL;
    CVReturn result = CVPixelBufferCreate(kCFAllocatorDefault,
                                          width,
                                          height,
                                          kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange,
                                          (__bridge CFDictionaryRef)(pixelAttributes),
                                          &pixelBuffer);
    // ... how do I copy yPlane/uPlane/vPlane into pixelBuffer from here?
}

I am unsure what to do after this. Eventually I want to turn this into a CIImage, which I can then use with my GLKView to render the video. How do people "put" the data into the buffer once they have created it?
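
For reference, the rendering side I have in mind is a CIContext created from the GLKView's EAGLContext, along these lines (just a sketch; _glkView, _ciContext, and ciImage are placeholder names):

    // One-time setup (placeholder names, not from the callback above).
    EAGLContext *eaglContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
    _glkView = [[GLKView alloc] initWithFrame:self.view.bounds context:eaglContext];
    _ciContext = [CIContext contextWithEAGLContext:eaglContext];

    // Per frame, once the CIImage exists:
    [_glkView bindDrawable];
    [_ciContext drawImage:ciImage
                   inRect:CGRectMake(0, 0, _glkView.drawableWidth, _glkView.drawableHeight)
                 fromRect:ciImage.extent];
    [_glkView display];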


Solution

  • I figured it out, and it was fairly trivial. Here is the full code below. The only issue is that I get a "BSXPCMessage received error for message: Connection interrupted" log, and it takes a while for the video to show.

    NSDictionary *pixelAttributes = @{(id)kCVPixelBufferIOSurfacePropertiesKey : @{}};
    CVPixelBufferRef pixelBuffer = NULL;
    CVReturn result = CVPixelBufferCreate(kCFAllocatorDefault,
                                          width,
                                          height,
                                          kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange,
                                          (__bridge CFDictionaryRef)(pixelAttributes),
                                          &pixelBuffer);
    if (result != kCVReturnSuccess) {
        DDLogWarn(@"Unable to create cvpixelbuffer %d", result);
        return;
    }
    
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    
    // Copy the luma plane. This assumes yStride == width; otherwise copy row by row.
    uint8_t *yDestPlane = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
    memcpy(yDestPlane, yPlane, width * height);
    
    // Copy the chroma plane. uvPlane is the interleaved CbCr data (see the note below)
    // and numberOfElementsForChroma is width * height / 2 for 4:2:0 data.
    uint8_t *uvDestPlane = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1);
    memcpy(uvDestPlane, uvPlane, numberOfElementsForChroma);
    
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    
    CIImage *coreImage = [CIImage imageWithCVPixelBuffer:pixelBuffer]; // success!
    CVPixelBufferRelease(pixelBuffer);
    

    I forgot to add the code to interleave the two U and V planes, but that shouldn't be too bad; a rough sketch of that step is below.
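
    Something like the following should work, assuming I420-style half-resolution U and V planes (the variable names come from the callback signature and the code above):

    // Interleave the separate U and V planes into the NV12 CbCr plane.
    // Run this while the buffer is still locked, in place of the uvDestPlane memcpy above.
    size_t chromaWidth  = width / 2;
    size_t chromaHeight = height / 2;
    size_t uvDestStride = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 1);
    for (size_t row = 0; row < chromaHeight; row++) {
        uint8_t *uvDestRow = uvDestPlane + row * uvDestStride;
        const uint8_t *uRow = uPlane + row * uStride;
        const uint8_t *vRow = vPlane + row * vStride;
        for (size_t col = 0; col < chromaWidth; col++) {
            uvDestRow[2 * col]     = uRow[col]; // Cb
            uvDestRow[2 * col + 1] = vRow[col]; // Cr
        }
    }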