ios swift live-streaming replaykit

How to send CMSampleBuffer to WebRTC?


So I am using ReplayKit to try to stream my phone screen to a web browser.

override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer, with sampleBufferType: RPSampleBufferType) {
    switch sampleBufferType {
    case RPSampleBufferType.video:
        // Handle video sample buffer
        break
    case RPSampleBufferType.audioApp:
        // Handle audio sample buffer for app audio
        break
    case RPSampleBufferType.audioMic:
        // Handle audio sample buffer for mic audio
        break
    @unknown default:
        break
    }
}

So how do I send that data to WebRTC? I learned that using WebRTC requires a signaling server. Is it possible to run a signaling server on the phone itself, just like an HTTP server?


Solution

  • Hi Sam, WebRTC has a function that can turn CMSampleBuffer frames into video frames, but it works with CVPixelBuffer, so you first have to convert your CMSampleBuffer to a CVPixelBuffer and then feed those frames into your localVideoSource through an RTCVideoCapturer. I solved a similar problem with AVCaptureVideoDataOutputSampleBufferDelegate, which produces CMSampleBuffer just like ReplayKit does. I hope the code below helps you solve your problem.

    private var videoCapturer: RTCVideoCapturer?
    private var localVideoSource = RTCClient.factory.videoSource()
    private var localVideoTrack: RTCVideoTrack?
    private var remoteVideoTrack: RTCVideoTrack?
    private var peerConnection: RTCPeerConnection?

    // Shared factory, created once with the default hardware codecs.
    public static let factory: RTCPeerConnectionFactory = {
        RTCInitializeSSL()
        let videoEncoderFactory = RTCDefaultVideoEncoderFactory()
        let videoDecoderFactory = RTCDefaultVideoDecoderFactory()
        return RTCPeerConnectionFactory(encoderFactory: videoEncoderFactory, decoderFactory: videoDecoderFactory)
    }()
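
    The snippet above never creates the videoCapturer that gets force-unwrapped later. Because frames are pushed in manually rather than captured by WebRTC itself, the base RTCVideoCapturer initialized with the video source as its delegate is enough. A minimal sketch; setupVideoCapturer is a hypothetical helper name, not part of the original answer:

    private func setupVideoCapturer() {
        // The base capturer never captures anything on its own here; it only
        // identifies the source of the frames pushed via didCapture calls.
        self.videoCapturer = RTCVideoCapturer(delegate: self.localVideoSource)
    }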
    
    
    extension RTCClient: AVCaptureVideoDataOutputSampleBufferDelegate {
        func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
            guard let imageBuffer: CVImageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
            // WebRTC wants the presentation timestamp in nanoseconds.
            let timeStampNs = Int64(CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer)) * 1_000_000_000)

            // Wrap the pixel buffer in an RTCVideoFrame and hand it to the
            // local video source, which forwards it to the encoder.
            let rtcPixelBuffer = RTCCVPixelBuffer(pixelBuffer: imageBuffer)
            let rtcVideoFrame = RTCVideoFrame(buffer: rtcPixelBuffer, rotation: ._90, timeStampNs: timeStampNs)
            self.localVideoSource.capturer(self.videoCapturer!, didCapture: rtcVideoFrame)
        }
    }
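
    Applied to your ReplayKit handler, the same conversion would look like the sketch below. It assumes the broadcast extension can reach the same localVideoSource and videoCapturer properties shown above; ReplayKit frames usually arrive upright, hence rotation ._0:

    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer, with sampleBufferType: RPSampleBufferType) {
        switch sampleBufferType {
        case .video:
            // Same conversion as in captureOutput:
            // CMSampleBuffer -> CVPixelBuffer -> RTCVideoFrame.
            guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
            let timeStampNs = Int64(CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer)) * 1_000_000_000)
            let rtcPixelBuffer = RTCCVPixelBuffer(pixelBuffer: pixelBuffer)
            let rtcVideoFrame = RTCVideoFrame(buffer: rtcPixelBuffer, rotation: ._0, timeStampNs: timeStampNs)
            self.localVideoSource.capturer(self.videoCapturer!, didCapture: rtcVideoFrame)
        case .audioApp, .audioMic:
            // App and mic audio need a separate audio pipeline; not covered here.
            break
        @unknown default:
            break
        }
    }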
    

    You also need a configuration like this for the media senders:

    func createMediaSenders() {
        let streamId = "stream"

        let videoTrack = self.createVideoTrack()
        self.localVideoTrack = videoTrack
        self.peerConnection!.add(videoTrack, streamIds: [streamId])
        // Keep a handle on the remote track so incoming video can be rendered.
        self.remoteVideoTrack = self.peerConnection!.transceivers.first { $0.mediaType == .video }?.receiver.track as? RTCVideoTrack
    }

    private func createVideoTrack() -> RTCVideoTrack {
        // The track must be backed by the localVideoSource declared above.
        return RTCClient.factory.videoTrack(with: self.localVideoSource, trackId: "video0")
    }
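
    The setup order matters: the peer connection and the capturer must exist before createMediaSenders() runs and before the first frame arrives. A hypothetical startup sequence tying the pieces together (the STUN server, the unified-plan setting, and startStreaming are illustrative assumptions, not from the original answer):

    func startStreaming() {
        let config = RTCConfiguration()
        config.iceServers = [RTCIceServer(urlStrings: ["stun:stun.l.google.com:19302"])]
        // Unified plan is required for the transceivers used in createMediaSenders().
        config.sdpSemantics = .unifiedPlan
        let constraints = RTCMediaConstraints(mandatoryConstraints: nil, optionalConstraints: nil)
        // Pass your real RTCPeerConnectionDelegate instead of nil in practice.
        self.peerConnection = RTCClient.factory.peerConnection(with: config, constraints: constraints, delegate: nil)

        // Wire the capturer (see the setupVideoCapturer sketch above).
        self.setupVideoCapturer()

        // Attach the local track; from here on, frames you push reach the peer.
        self.createMediaSenders()
    }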