ios, objective-c, webrtc, video-capture, apprtc

How do I show the local stream in a video view with the latest WebRTC framework <Anakros/WebRTC>? (iOS)


After updating to the latest WebRTC framework, I can't figure out how to show the local stream to the user, because the methodology has changed and there is no sample for it in the repository's "iOS" folder.

In the old code...

   RTCVideoCapturer *capturer = [RTCVideoCapturer capturerWithDeviceName:cameraID];
   RTCMediaConstraints *mediaConstraints = [self defaultMediaStreamConstraints];
   RTCVideoSource *videoSource = [_factory videoSourceWithCapturer:capturer constraints:mediaConstraints];
   localVideoTrack = [_factory videoTrackWithID:@"ARDAMSv0" source:videoSource];

Here the RTCVideoCapturer object and the RTCVideoSource object were linked to each other.

But in the new code...

  RTCVideoSource *source = [_factory videoSource];
  RTCCameraVideoCapturer *capturer = [[RTCCameraVideoCapturer alloc] initWithDelegate:source];
  [_delegate appClient:self didCreateLocalCapturer:capturer];
    localVideoTrack = [_factory videoTrackWithSource:source
                                             trackId:kARDVideoTrackId];

There is no visible connection between them. So what does the delegate call [_delegate appClient:self didCreateLocalCapturer:capturer]; actually do? I am not getting it. [Help required!]


Solution

  • Implement this delegate method in your video call view controller:

    - (void)appClient:(ARDAppClient *)client didCreateLocalCapturer:(RTCCameraVideoCapturer *)localCapturer {
    
        NSLog(@"%s %@", __PRETTY_FUNCTION__, localCapturer);
    
        // Keep a strong reference to the capture controller and start capturing
        // with the current settings (camera, resolution, frame rate).
        _captureController = [[ARDCaptureController alloc] initWithCapturer:localCapturer
                                                                   settings:[[ARDSettingsModel alloc] init]];
        [_captureController startCapture];
    }
    
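    Note that starting the capture only feeds camera frames into the RTCVideoSource; it does
    not put anything on screen by itself. In the AppRTCMobile sample the local preview is
    shown by pointing an RTCCameraPreviewView at the capturer's AVCaptureSession. A minimal
    sketch, assuming your view controller owns an RTCCameraPreviewView (the _localVideoView
    name is only for illustration):

    #import <WebRTC/RTCCameraPreviewView.h>

    @property(nonatomic, strong) RTCCameraPreviewView *localVideoView;

    - (void)appClient:(ARDAppClient *)client didCreateLocalCapturer:(RTCCameraVideoCapturer *)localCapturer {
        // Feed the capturer's AVCaptureSession into the preview view so the user
        // sees their own camera while the capturer delivers frames to the
        // RTCVideoSource behind the scenes.
        self.localVideoView.captureSession = localCapturer.captureSession;

        _captureController = [[ARDCaptureController alloc] initWithCapturer:localCapturer
                                                                   settings:[[ARDSettingsModel alloc] init]];
        [_captureController startCapture];
    }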

    Then this delegate gets called from the method that creates the local video track:

     - (RTCVideoTrack *)createLocalVideoTrack {
         RTCVideoTrack *localVideoTrack = nil;
         // The iOS simulator doesn't provide any sort of camera capture
         // support or emulation, so don't bother trying to open a local stream.
     #if !TARGET_IPHONE_SIMULATOR
         if (![_settings currentAudioOnlySettingFromStore]) {
             RTCVideoSource *source = [_factory videoSource];
             // The capturer delivers its frames to the source, which is why it is
             // initialised with the source as its delegate.
             RTCCameraVideoCapturer *capturer =
                 [[RTCCameraVideoCapturer alloc] initWithDelegate:source];
             // Hand the capturer to the UI layer, which starts it via
             // ARDCaptureController (see the delegate method above).
             [_delegate appClient:self didCreateLocalCapturer:capturer];
             localVideoTrack = [_factory videoTrackWithSource:source
                                                      trackId:kARDVideoTrackId];
             [_delegate appClient:self didReceiveLocalVideoTrack:localVideoTrack];
         }
     #endif
         return localVideoTrack;
     }
    

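    The didReceiveLocalVideoTrack: callback is where you can attach the track to a renderer
    if you would rather draw the local video through WebRTC instead of the camera preview.
    A minimal sketch, assuming an RTCEAGLVideoView property named _localRenderView (the name
    is only for illustration):

    #import <WebRTC/RTCEAGLVideoView.h>

    - (void)appClient:(ARDAppClient *)client didReceiveLocalVideoTrack:(RTCVideoTrack *)localVideoTrack {
        // RTCEAGLVideoView conforms to RTCVideoRenderer, so the track can push
        // its frames straight into the view.
        [localVideoTrack addRenderer:_localRenderView];
    }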
    Then call ...

    _localVideoTrack = [self createLocalVideoTrack]; 
    

    in your init method...

    - (void)initCall {
        NSLog(@"%s",__PRETTY_FUNCTION__);
        if (!_isTurnComplete) {
            return;
        }
        self.state = kARDAppClientStateConnected;
        _localVideoTrack = [self createLocalVideoTrack];
        // Create peer connection.
        _constraints = [self defaultPeerConnectionConstraints];
    
    }
    
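    One more thing worth adding: when the call ends the capturer should be stopped, otherwise
    the camera stays on. A minimal sketch for a hang-up handler, reusing the _captureController
    ivar from the delegate method above (the method name is only for illustration):

    - (void)hangup {
        // Stop delivering camera frames and let the capture session shut down.
        [_captureController stopCapture];
        _captureController = nil;
    }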

    This is what allowed me to achieve it!