ios core-video video-toolbox

Image buffer display order with VTDecompressionSession


I have a project where I need to decode H.264 video from a live network stream and eventually end up with a texture I can display in another framework (Unity3D) on iOS devices. I can successfully decode the video using VTDecompressionSession and then grab the texture with CVMetalTextureCacheCreateTextureFromImage (or the OpenGL variant). It works great when I use a low-latency encoder and the image buffers come out in display order. However, when I use the regular encoder, the image buffers do not come out in display order, and reordering them is apparently far more difficult than I expected.

The first attempt was to set the VTDecodeFrameFlags with kVTDecodeFrame_EnableAsynchronousDecompression and kVTDecodeFrame_EnableTemporalProcessing. However, it turns out that VTDecompressionSession treats these flags only as hints and can ignore them, and in my case it does: it still outputs the buffers in decode (encode) order, not display order. Essentially useless.

The next attempt was to associate each image buffer with its presentation time stamp and push them into a vector, which would let me grab the buffer I needed when creating the texture. The problem seems to be that the sample buffer that goes into the VTDecompressionSession with a given time stamp is not the buffer that comes out of the callback with that time stamp, which makes the time stamp useless for reordering.

For example, going into the decoder...

  VTDecodeFrameFlags flags = kVTDecodeFrame_EnableAsynchronousDecompression;
  VTDecodeInfoFlags flagOut;
  // Presentation time stamp to be passed with the buffer
  NSNumber *nsPts = [NSNumber numberWithDouble:pts];

  VTDecompressionSessionDecodeFrame(_decompressionSession, sampleBuffer, flags,
                                          (void*)CFBridgingRetain(nsPts), &flagOut);

On the callback side...

void decompressionSessionDecodeFrameCallback(void *decompressionOutputRefCon, void *sourceFrameRefCon, OSStatus status, VTDecodeInfoFlags infoFlags, CVImageBufferRef imageBuffer, CMTime presentationTimeStamp, CMTime presentationDuration)
 {
      // The presentation time stamp...
      // No longer seems to be associated with the buffer that it went in with!
      NSNumber* pts = CFBridgingRelease(sourceFrameRefCon);
 }

When sorted, the time stamps on the callback side increase monotonically at the expected rate, but the buffers themselves are not in the right order. Does anyone see where I am making an error here? Or know how to determine the order of the buffers on the callback side? At this point I have tried just about everything I can think of.


Solution

  • In my case, the problem wasn't with VTDecompressionSession at all; it was the demuxer handing over the wrong PTS. While I couldn't get VTDecompressionSession to output frames in temporal (display) order with the kVTDecodeFrame_EnableAsynchronousDecompression and kVTDecodeFrame_EnableTemporalProcessing flags, I could sort the frames myself by PTS using a small vector.

    First, make sure you associate all of your timing information with your CMSampleBuffer along with the block buffer so you receive it in the VTDecompressionSession callback.

    // Wrap our CMBlockBuffer in a CMSampleBuffer...
    CMSampleBufferRef sampleBuffer;
    
    CMTime duration = ...;
    CMTime presentationTimeStamp = ...;
    CMTime decodeTimeStamp = ...;
    
    // CMSampleTimingInfo field order: duration, presentationTimeStamp, decodeTimeStamp
    CMSampleTimingInfo timingInfo{duration, presentationTimeStamp, decodeTimeStamp};
    
    _sampleTimingArray[0] = timingInfo;
    _sampleSizeArray[0] = nalLength;
    
    // Wrap the CMBlockBuffer...
    status = CMSampleBufferCreate(kCFAllocatorDefault, blockBuffer, true, NULL, NULL, _formatDescription, 1, 1, _sampleTimingArray, 1, _sampleSizeArray, &sampleBuffer);
    

    Then, decode the frame. Even though the session may ignore them, it is still worth setting the flags to try to get the frames out in display order.

    VTDecodeFrameFlags flags = kVTDecodeFrame_EnableAsynchronousDecompression | kVTDecodeFrame_EnableTemporalProcessing;
    VTDecodeInfoFlags flagOut;
    
    // The PTS now arrives via the callback's presentationTimeStamp parameter,
    // so no sourceFrameRefCon is needed
    VTDecompressionSessionDecodeFrame(_decompressionSession, sampleBuffer, flags,
                                      NULL, &flagOut);
    

    On the callback side of things, we need a way of sorting the CVImageBufferRefs we receive. I use a struct that contains the CVImageBufferRef and the PTS. Then a vector with a size of two that will do the actual sorting.

    struct Buffer
    {
        CVImageBufferRef imageBuffer = NULL;
        double pts = 0;
    };
    
    std::vector<Buffer> _buffer;
    

    We also need a way to sort the Buffers. Always writing to and reading from the index with the lowest PTS works well.

     -(int) getMinIndex
     {
         if(_buffer[0].pts > _buffer[1].pts)
         {
             return 1;
         }
    
         return 0;
     }      
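
    If you ever need a reorder window larger than two frames, the same min-PTS lookup generalizes with std::min_element. This is a hedged, plain-C++ sketch (the `minPtsIndex` name and the `void*` stand-in for CVImageBufferRef are illustrative, not part of the answer's code):

```cpp
#include <algorithm>
#include <vector>

// Hypothetical stand-in for the CVImageBufferRef + PTS pair used above.
struct Buffer {
    void *imageBuffer = nullptr; // would be a retained CVImageBufferRef on iOS
    double pts = 0;
};

// Index of the slot holding the lowest PTS. Unlike the two-slot
// getMinIndex above, this works for any reorder window size.
static std::size_t minPtsIndex(const std::vector<Buffer> &buffers) {
    auto it = std::min_element(buffers.begin(), buffers.end(),
                               [](const Buffer &a, const Buffer &b) {
                                   return a.pts < b.pts;
                               });
    return static_cast<std::size_t>(it - buffers.begin());
}
```

    A larger window is only needed when the encoder can reorder frames by more than one position (e.g. multiple consecutive B-frames).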
    

    In the callback, we need to fill the vector with Buffers...

     void decompressionSessionDecodeFrameCallback(void *decompressionOutputRefCon, void *sourceFrameRefCon, OSStatus status, VTDecodeInfoFlags infoFlags, CVImageBufferRef imageBuffer, CMTime presentationTimeStamp, CMTime presentationDuration)
     {
         StreamManager *streamManager = (__bridge StreamManager *)decompressionOutputRefCon;

         @synchronized(streamManager)
         {
             if (status != noErr)
             {
                 NSError *error = [NSError errorWithDomain:NSOSStatusErrorDomain code:status userInfo:nil];
                 NSLog(@"Decompression error: %@", error);
             }
             else
             {
                 // Get the PTS in seconds
                 double pts = CMTimeGetSeconds(presentationTimeStamp);

                 // Fill the buffer until the window is primed
                 if(!streamManager->_bufferReady)
                 {
                     Buffer buffer;

                     buffer.pts = pts;
                     buffer.imageBuffer = imageBuffer;

                     // Retain the CVImageBufferRef so it outlives the callback
                     CVBufferRetain(buffer.imageBuffer);

                     streamManager->_buffer[streamManager->_bufferIndex++] = buffer;
                 }
                 else
                 {
                     // Push new buffers to the index with the lowest PTS
                     int index = [streamManager getMinIndex];

                     // Release the old CVImageBufferRef
                     CVBufferRelease(streamManager->_buffer[index].imageBuffer);

                     Buffer buffer;

                     buffer.pts = pts;
                     buffer.imageBuffer = imageBuffer;

                     // Retain the new CVImageBufferRef
                     CVBufferRetain(buffer.imageBuffer);

                     streamManager->_buffer[index] = buffer;
                 }

                 // Mark the window ready once it has been filled
                 // _bufferWindow = 2
                 if(streamManager->_bufferIndex == streamManager->_bufferWindow)
                 {
                     streamManager->_bufferReady = YES;
                     streamManager->_bufferIndex = 0;
                 }
             }
         }
     }
    

    Finally we need to drain the Buffers in temporal (display) order...

     - (void)drainBuffer
     {
         @synchronized(self)
         {
             if(_bufferReady)
             {
                 // Drain buffers from the index with the lowest PTS
                 int index = [self getMinIndex];

                 Buffer buffer = _buffer[index];

                 // Do something useful with the buffer, now in display order
             }
         }
     }
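
To check that a two-slot window really is enough to undo a single-B-frame reorder, here is a small self-contained C++ sketch of the fill / replace-at-min-PTS / drain scheme above. The `Frame` and `ReorderWindow` names are illustrative (plain values stand in for retained CVImageBufferRefs), not part of any Apple API:

```cpp
#include <cstddef>
#include <vector>

// Illustrative stand-in for a decoded frame; on iOS the payload would be
// a retained CVImageBufferRef rather than an int.
struct Frame {
    int id = 0;
    double pts = 0;
};

// A tiny PTS reorder window mirroring the answer's scheme:
// fill the window first, then emit/replace at the lowest-PTS slot.
class ReorderWindow {
public:
    explicit ReorderWindow(std::size_t window) : slots_(window) {}

    // Returns true and writes the frame leaving the window in display
    // order; returns false while the window is still priming.
    bool push(const Frame &in, Frame *out) {
        if (filled_ < slots_.size()) {
            slots_[filled_++] = in;
            return false;
        }
        std::size_t i = minIndex(filled_);
        *out = slots_[i];  // emit the oldest (lowest-PTS) frame
        slots_[i] = in;    // reuse the slot for the incoming frame
        return true;
    }

    // Drain the remaining frames in PTS order at end of stream.
    bool drain(Frame *out) {
        if (filled_ == 0) return false;
        std::size_t i = minIndex(filled_);
        *out = slots_[i];
        slots_[i] = slots_[--filled_]; // compact the window
        return true;
    }

private:
    std::size_t minIndex(std::size_t count) const {
        std::size_t best = 0;
        for (std::size_t i = 1; i < count; ++i)
            if (slots_[i].pts < slots_[best].pts) best = i;
        return best;
    }

    std::vector<Frame> slots_;
    std::size_t filled_ = 0;
};
```

Feeding this window frames in decode order (e.g. PTS 0.0, 0.066, 0.033, 0.133, 0.100 for an I-P-B pattern) and draining at the end yields the frames in strictly increasing PTS, i.e. display order, at the cost of a fixed latency of one window's worth of frames.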