ios, http-live-streaming, fairplay

FairPlay Streaming: Calling copyPixelBufferForItemTime on AVPlayerItemVideoOutput returns NULL


Has anybody had experience using HLS with FairPlay and succeeded in retrieving the pixel buffer?

I'm using an AVURLAsset with its resourceLoader delegate set. My AVAssetResourceLoaderDelegate handles the FairPlay key-exchange process.
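For context, a minimal sketch of the setup described above; the URL, the queue label, and the `keyLoader` delegate object are illustrative placeholders, not part of the original question:

```swift
import AVFoundation

// Placeholder URL for the FairPlay-protected HLS stream.
let url = URL(string: "https://example.com/stream.m3u8")!
let asset = AVURLAsset(url: url)

// `keyLoader` is a hypothetical object conforming to AVAssetResourceLoaderDelegate
// that performs the FairPlay key exchange (the SPC/CKC round trip).
let delegateQueue = DispatchQueue(label: "fairplay.resource.loader")
asset.resourceLoader.setDelegate(keyLoader, queue: delegateQueue)

let item = AVPlayerItem(asset: asset)
let player = AVPlayer(playerItem: item)
```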

It displays fine on an AVPlayerLayer. However, when I attach an AVPlayerItemVideoOutput to the AVPlayerItem and call copyPixelBufferForItemTime on it, the returned pixel buffer is always NULL.

On the other hand, when I play a non-encrypted stream without the resourceLoader delegate, copyPixelBufferForItemTime returns a pixel buffer as expected.
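A sketch of the video-output path being described, assuming `item` is the AVPlayerItem backed by the asset above; the pixel-format choice and the display-link timing are illustrative:

```swift
import AVFoundation
import QuartzCore

// Request BGRA pixel buffers from the player item.
let attributes: [String: Any] = [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
]
let output = AVPlayerItemVideoOutput(pixelBufferAttributes: attributes)
item.add(output) // `item` is the AVPlayerItem playing the stream

// Later, typically from a CADisplayLink callback:
let itemTime = output.itemTime(forHostTime: CACurrentMediaTime())
if output.hasNewPixelBuffer(forItemTime: itemTime) {
    // For a clear stream this yields a CVPixelBuffer;
    // for a FairPlay-protected stream it is nil, per the question.
    let pixelBuffer = output.copyPixelBuffer(forItemTime: itemTime,
                                             itemTimeForDisplay: nil)
}
```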

Is this working as intended? (i.e., do FairPlay-protected streams withhold their pixel buffers to prevent misuse?)

And if not, is there any way to get the pixel buffer from a FairPlay-protected stream on iOS?


Solution

  • After researching this issue further, I have concluded that Apple engineered AVPlayer so that, once you use FairPlay-protected HLS, copyPixelBufferForItemTime (the only exit point for the decrypted data; see A below) always returns NULL.

    It appears that once you use FairPlay, the only way to display your protected video content is through an AVPlayerLayer. As of today, there appears to be no way to retrieve FairPlay-protected HLS media through Apple's APIs in order to, for example, render it onto an OpenGL texture in 3D space.

    A: copyPixelBufferForItemTime is the only exit point, since calling renderInContext on an AVPlayerLayer doesn't work either.
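For completeness, a minimal sketch of the AVPlayerLayer path the answer says does work; the `PlayerView` class name is an assumption, and the commented-out usage assumes a `player` driving the FairPlay item:

```swift
import AVFoundation
import UIKit

// A UIView backed by an AVPlayerLayer, the supported way to
// display FairPlay-protected content.
final class PlayerView: UIView {
    override class var layerClass: AnyClass { AVPlayerLayer.self }
    var playerLayer: AVPlayerLayer { layer as! AVPlayerLayer }
}

// Usage (assuming `player` plays the FairPlay-protected item):
// let playerView = PlayerView(frame: view.bounds)
// playerView.playerLayer.player = player
// playerView.playerLayer.videoGravity = .resizeAspect
```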