Tags: ios, ios8, uiimage, screenshot, cmsamplebufferref

Video stream in AVSampleBufferDisplayLayer doesn't show up in screenshot


I've been using the new Video Toolbox methods to take an H.264 video stream and display it in a view controller using AVSampleBufferDisplayLayer. This all works as intended and the stream looks great. However, when I try to take a screenshot of the entire view, the contents of the AVSampleBufferDisplayLayer (i.e., the decompressed video stream) do not show up. The snapshot includes all the other UI elements (buttons, labels, etc.), but where the AVSampleBufferDisplayLayer should be it shows only the layer's background color (which I had set to bright blue), not the live video feed.
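
For reference, my layer setup looks roughly like this (a simplified sketch; the frame and videoGravity values here are placeholders, not my exact code):

    self.AVSampleDisplayLayer = [[AVSampleBufferDisplayLayer alloc] init];
    self.AVSampleDisplayLayer.frame = self.view.bounds;  // placeholder frame
    self.AVSampleDisplayLayer.videoGravity = AVLayerVideoGravityResizeAspect;
    // The bright blue that shows up in the snapshots:
    self.AVSampleDisplayLayer.backgroundColor = [UIColor blueColor].CGColor;
    [self.view.layer addSublayer:self.AVSampleDisplayLayer];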

In the method below (inspired by this post), I take the CMSampleBufferRef from my stream and enqueue it for display on the AVSampleBufferDisplayLayer. Then I call my method imageFromLayer: to get the snapshot as a UIImage. I then either display that UIImage in the UIImageView imageDisplay, or save it to the device's camera roll to verify what it looks like; both approaches yield the same result.

- (void)h264VideoFrame:(CMSampleBufferRef)sample
{
    // Hand the sample buffer to the display layer for decode and display.
    [self.AVSampleDisplayLayer enqueueSampleBuffer:sample];

    // Snapshot the layer on the main queue and show the result.
    // (dispatch_sync assumes this method is never called on the main
    // queue itself, or it would deadlock.)
    dispatch_sync(dispatch_get_main_queue(), ^{
        UIImage *snapshot = [self imageFromLayer:self.AVSampleDisplayLayer];
        [self.imageDisplay setImage:snapshot];
    });
}

Here I simply take the contents of the AVSampleBufferDisplayLayer and attempt to convert it to a UIImage. If I pass the entire screen's layer into this method, all the other UI elements (labels, buttons, images) show up, but the AVDisplayLayer does not. If I pass in just the AVDisplayLayer, I get a solid blue image (since its background color is blue).

- (UIImage *)imageFromLayer:(CALayer *)layer
{
    // Opaque context at 1.0 scale, sized to the layer's frame.
    UIGraphicsBeginImageContextWithOptions([layer frame].size, YES, 1.0);

    // Render the layer tree into the current context.
    [layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *outputImage = UIGraphicsGetImageFromCurrentImageContext();

    //UIImageWriteToSavedPhotosAlbum(outputImage, self, nil, nil);

    UIGraphicsEndImageContext();

    return outputImage;
}

I've tried using UIImage *snapshot = [self imageFromLayer:self.AVSampleDisplayLayer.presentationLayer]; and .modelLayer, but that didn't help. I've tried enqueueing the sample buffer and waiting before taking a snapshot, I've tried adjusting the opacity and xPosition of the AVDisplayLayer... I've even tried setting different values for the CMTimebase of the AVDisplayLayer.
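
In case it matters, the control timebase experiment looked roughly like this (a simplified sketch; the time and rate values are placeholders):

    CMTimebaseRef timebase = NULL;
    CMTimebaseCreateWithMasterClock(kCFAllocatorDefault,
                                    CMClockGetHostTimeClock(),
                                    &timebase);
    self.AVSampleDisplayLayer.controlTimebase = timebase; // property retains it
    CMTimebaseSetTime(timebase, kCMTimeZero); // placeholder start time
    CMTimebaseSetRate(timebase, 1.0);         // placeholder rate
    CFRelease(timebase);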

Also, according to this post and this post, other people are having similar trouble with snapshots in iOS 8. Any hints are appreciated!


Solution

  • I fixed this by switching from AVSampleBufferDisplayLayer to VTDecompressionSession. In VTDecompressionSession's didDecompress output callback, I send the decompressed image (CVImageBufferRef) into the following method to get a screenshot of the video stream and turn it into a UIImage.

    - (void)screenshotOfVideoStream:(CVImageBufferRef)imageBuffer
    {
        // Wrap the decoded pixel buffer in a CIImage.
        CIImage *ciImage = [CIImage imageWithCVPixelBuffer:imageBuffer];
        CIContext *temporaryContext = [CIContext contextWithOptions:nil];

        // Render the full pixel buffer into a CGImage.
        CGImageRef videoImage = [temporaryContext
                                 createCGImage:ciImage
                                 fromRect:CGRectMake(0, 0,
                                                     CVPixelBufferGetWidth(imageBuffer),
                                                     CVPixelBufferGetHeight(imageBuffer))];

        UIImage *image = [[UIImage alloc] initWithCGImage:videoImage];
        [self doSomethingWithOurUIImage:image];
        CGImageRelease(videoImage); // UIImage retains its own reference
    }
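
For context, the decompression session wiring that feeds this method looks roughly like the sketch below. Names like ViewController and formatDescription are illustrative; building the CMVideoFormatDescriptionRef from the stream's SPS/PPS and all error handling are omitted:

    // C callback that Video Toolbox invokes for each decoded frame.
    static void didDecompress(void *decompressionOutputRefCon,
                              void *sourceFrameRefCon,
                              OSStatus status,
                              VTDecodeInfoFlags infoFlags,
                              CVImageBufferRef imageBuffer,
                              CMTime presentationTimeStamp,
                              CMTime presentationDuration)
    {
        ViewController *vc = (__bridge ViewController *)decompressionOutputRefCon;
        if (status == noErr && imageBuffer != NULL) {
            [vc screenshotOfVideoStream:imageBuffer];
        }
    }

    // Session setup:
    VTDecompressionOutputCallbackRecord callbackRecord;
    callbackRecord.decompressionOutputCallback = didDecompress;
    callbackRecord.decompressionOutputRefCon = (__bridge void *)self;

    VTDecompressionSessionRef session = NULL;
    VTDecompressionSessionCreate(kCFAllocatorDefault,
                                 formatDescription, // from the stream's SPS/PPS
                                 NULL,              // decoder specification
                                 NULL,              // destination pixel buffer attributes
                                 &callbackRecord,
                                 &session);

    // Then, for each incoming CMSampleBufferRef from the stream:
    VTDecompressionSessionDecodeFrame(session, sample, 0, NULL, NULL);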