Tags: ios, swift, webrtc, ios15, videocall

How to add Picture in Picture (PiP) for WebRTC Video Calls in iOS Swift


We used the following steps to integrate PiP (Picture in Picture) for a WebRTC video call:

  1. We enabled the "Audio, AirPlay, and Picture in Picture" background mode capability in our project.

  2. We added an entitlements file with the camera-access-while-multitasking entitlement (see Accessing the Camera While Multitasking in Apple's documentation); a sketch of both configuration entries follows this list.

  3. From that documentation, we followed the provisioning steps:

    Provision Your App

    After your account has permission to use the entitlement, you can create a new provisioning profile with it by following these steps:

    1. Log in to your Apple Developer Account.

    2. Go to Certificates, Identifiers & Profiles.

    3. Generate a new provisioning profile for your app.

    4. Select the Multitasking Camera Access entitlement from the additional entitlements for your account.

  4. We also followed Adopting Picture in Picture for Video Calls (https://developer.apple.com/documentation/avkit/adopting_picture_in_picture_for_video_calls?changes=__8), but it gives us no particular hint about how to add the video render layer view to this SampleBufferVideoCallView.

  5. Also, RTCMTLVideoView is backed by MTKView, which isn't supported for PiP, so we used WebRTC's default render view, RTCEAGLVideoView, which renders video into a GLKView.
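For reference, steps 1 and 2 boil down to the following two entries (a sketch: the Info.plist key is what Xcode writes when the Background Modes capability is checked, and the entitlement key requires the provisioning profile from step 3):

<!-- Info.plist: Background Modes -> Audio, AirPlay, and Picture in Picture -->
<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>

<!-- App .entitlements file: camera access while multitasking -->
<key>com.apple.developer.avfoundation.multitasking-camera-access</key>
<true/>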

Our PiP integration code for WebRTC in iOS Swift:

import AVFoundation
import UIKit

// A UIView whose backing layer is an AVSampleBufferDisplayLayer, so decoded
// video frames can be enqueued directly onto it.
class SampleBufferVideoCallView: UIView {
    override class var layerClass: AnyClass {
        return AVSampleBufferDisplayLayer.self
    }
    
    var sampleBufferDisplayLayer: AVSampleBufferDisplayLayer {
        return layer as! AVSampleBufferDisplayLayer
    }
}

func startPIP() {
    if #available(iOS 15.0, *) {
        let sampleBufferVideoCallView = SampleBufferVideoCallView()
        let pipVideoCallViewController = AVPictureInPictureVideoCallViewController()
        pipVideoCallViewController.preferredContentSize = CGSize(width: 1080, height: 1920)
        // Note: the subview is added here without a frame or constraints.
        pipVideoCallViewController.view.addSubview(sampleBufferVideoCallView)
        
        let remoteVideoRenderer = RTCEAGLVideoView()
        remoteVideoRenderer.contentMode = .scaleAspectFill
        remoteVideoRenderer.frame = viewUser.frame
        viewUser.addSubview(remoteVideoRenderer)
        
        let pipContentSource = AVPictureInPictureController.ContentSource(
            activeVideoCallSourceView: self.viewUser,
            contentViewController: pipVideoCallViewController)
        
        let pipController = AVPictureInPictureController(contentSource: pipContentSource)
        pipController.canStartPictureInPictureAutomaticallyFromInline = true
        pipController.delegate = self
        // Note: pipController should be kept in a stored property; a local
        // constant is deallocated as soon as startPIP() returns.
    } else {
        // Fallback on earlier versions
    }
}

How do we add the viewUser GLKView to the pipContentSource, and how do we feed the remote video buffers into SampleBufferVideoCallView?

Is this approach possible, or is there another way to render the video buffers into an AVSampleBufferDisplayLayer?


Solution

  • Apple Code-Level Support gave the following advice when asked about this problem (sketches applying each point follow the quote):

    In order to make recommendations, we'd need to know more about the code you've tried for rendering the video.

    As discussed in the article you referred to, to provide PiP support you must first provide a source view to display inside the video-call view controller -- you need to add a UIView to AVPictureInPictureVideoCallViewController. The system supports displaying content from an AVPlayerLayer or AVSampleBufferDisplayLayer depending on your needs. MTKView/GLKView isn’t supported. Video-calling apps need to display the remote view, so use AVSampleBufferDisplayLayer to do so.

    In order to handle the drawing in your source view, you could gain access to the buffer stream before it is turned into a GLKView, and feed it to the content of the AVPictureInPictureViewController. For example, you can create CVPixelBuffers from the video feed frames, then from those, create CMSampleBuffers. Once you have the CMSampleBuffers, you can begin providing these to the AVSampleBufferDisplayLayer for display. Have a look at the methods defined there to see how this is done. There's some archived ObjC sample code AVGreenScreenPlayer that you might look at to help you get started using AVSampleBufferDisplayLayer (note: it's Mac code, but the AVSampleBufferDisplayLayer APIs are the same across platforms).

    In addition, for implementing PiP support you'll want to provide delegate methods for AVPictureInPictureControllerDelegate and, for the AVSampleBufferDisplayLayer, AVPictureInPictureSampleBufferPlaybackDelegate. See the recent WWDC video What's new in AVKit for more information about the AVPictureInPictureSampleBufferPlaybackDelegate delegates, which are new in iOS 15.
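Putting this advice into practice, first a minimal sketch (reusing the names from the question's startPIP()) of embedding the SampleBufferVideoCallView so it actually fills the AVPictureInPictureVideoCallViewController; the original code added the subview without ever giving it a size:

let sampleBufferVideoCallView = SampleBufferVideoCallView()
sampleBufferVideoCallView.translatesAutoresizingMaskIntoConstraints = false
pipVideoCallViewController.view.addSubview(sampleBufferVideoCallView)
NSLayoutConstraint.activate([
    sampleBufferVideoCallView.leadingAnchor.constraint(equalTo: pipVideoCallViewController.view.leadingAnchor),
    sampleBufferVideoCallView.trailingAnchor.constraint(equalTo: pipVideoCallViewController.view.trailingAnchor),
    sampleBufferVideoCallView.topAnchor.constraint(equalTo: pipVideoCallViewController.view.topAnchor),
    sampleBufferVideoCallView.bottomAnchor.constraint(equalTo: pipVideoCallViewController.view.bottomAnchor)
])
// Scale the video to fill the PiP window.
sampleBufferVideoCallView.sampleBufferDisplayLayer.videoGravity = .resizeAspectFill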
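Next, a sketch of the buffer pipeline described above: instead of (or alongside) the RTCEAGLVideoView, attach a custom RTCVideoRenderer to the remote track, wrap each incoming frame's CVPixelBuffer in a CMSampleBuffer, and enqueue it on the display layer. This assumes the remote frames are CVPixelBuffer-backed (RTCCVPixelBuffer); i420-backed frames would need a pixel-format conversion first:

import AVFoundation
import CoreMedia
import WebRTC

// Bridges WebRTC's remote video frames to an AVSampleBufferDisplayLayer.
final class SampleBufferRenderer: NSObject, RTCVideoRenderer {
    private weak var displayLayer: AVSampleBufferDisplayLayer?

    init(displayLayer: AVSampleBufferDisplayLayer) {
        self.displayLayer = displayLayer
    }

    func setSize(_ size: CGSize) {
        // Nothing to do; the layer's videoGravity handles scaling.
    }

    func renderFrame(_ frame: RTCVideoFrame?) {
        guard let rtcPixelBuffer = frame?.buffer as? RTCCVPixelBuffer else { return }
        let pixelBuffer = rtcPixelBuffer.pixelBuffer

        // Describe the pixel buffer for Core Media.
        var formatDescription: CMVideoFormatDescription?
        CMVideoFormatDescriptionCreateForImageBuffer(
            allocator: kCFAllocatorDefault,
            imageBuffer: pixelBuffer,
            formatDescriptionOut: &formatDescription)
        guard let format = formatDescription else { return }

        // Timestamp with the host clock so each frame displays immediately;
        // WebRTC delivers frames already paced.
        var timing = CMSampleTimingInfo(
            duration: .invalid,
            presentationTimeStamp: CMClockGetTime(CMClockGetHostTimeClock()),
            decodeTimeStamp: .invalid)

        var sampleBuffer: CMSampleBuffer?
        CMSampleBufferCreateReadyWithImageBuffer(
            allocator: kCFAllocatorDefault,
            imageBuffer: pixelBuffer,
            formatDescription: format,
            sampleTiming: &timing,
            sampleBufferOut: &sampleBuffer)
        guard let buffer = sampleBuffer else { return }

        DispatchQueue.main.async { [weak self] in
            self?.displayLayer?.enqueue(buffer)
        }
    }
}

Attach it to the remote track in place of the GLKView-based view, e.g. remoteVideoTrack.add(SampleBufferRenderer(displayLayer: sampleBufferVideoCallView.sampleBufferDisplayLayer)).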
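Finally, a sketch of the two delegate conformances mentioned above. CallViewController is a hypothetical name for whatever class owns the PiP controller, assumed to target iOS 15+. Note that AVPictureInPictureSampleBufferPlaybackDelegate is the delegate taken by ContentSource(sampleBufferDisplayLayer:playbackDelegate:); with the video-call content source shown in the question only the controller delegate is strictly required, but Apple's advice suggests implementing both. For a live call, report an infinite, never-paused time range:

import AVKit
import CoreMedia

extension CallViewController: AVPictureInPictureControllerDelegate {
    func pictureInPictureControllerDidStartPictureInPicture(
        _ pictureInPictureController: AVPictureInPictureController) {
        // E.g. hide the in-app remote view while PiP is showing it.
    }

    func pictureInPictureController(
        _ pictureInPictureController: AVPictureInPictureController,
        failedToStartPictureInPictureWithError error: Error) {
        print("PiP failed to start: \(error)")
    }
}

extension CallViewController: AVPictureInPictureSampleBufferPlaybackDelegate {
    func pictureInPictureController(
        _ pictureInPictureController: AVPictureInPictureController,
        setPlaying playing: Bool) {
        // A live call cannot be paused; mute/unmute could go here instead.
    }

    func pictureInPictureControllerTimeRangeForPlayback(
        _ pictureInPictureController: AVPictureInPictureController) -> CMTimeRange {
        // An infinite range tells the system this is live content.
        return CMTimeRange(start: .negativeInfinity, duration: .positiveInfinity)
    }

    func pictureInPictureControllerIsPlaybackPaused(
        _ pictureInPictureController: AVPictureInPictureController) -> Bool {
        return false
    }

    func pictureInPictureController(
        _ pictureInPictureController: AVPictureInPictureController,
        didTransitionToRenderSize newRenderSize: CMVideoDimensions) {
        // Optionally ask the sender for a matching resolution.
    }

    func pictureInPictureController(
        _ pictureInPictureController: AVPictureInPictureController,
        skipByInterval skipInterval: CMTime,
        completion completionHandler: @escaping () -> Void) {
        completionHandler() // Seeking is meaningless in a live call.
    }
}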