I am using the OpenTok iOS SDK to stream from an iPhone to Chrome. What I would like to do is record a high-res version of the video while streaming.
Using a custom video capturer via the OTVideoCapture interface from "Project 2: Let's Build OTPublisher", I can successfully record the video sample buffers to a file. The problem is that I cannot find any reference to the audio data gathered from the microphone.
I assume an audio input (AVCaptureDeviceInput) feeding an audio output (AVCaptureAudioDataOutput) via AVCaptureAudioDataOutputSampleBufferDelegate is used somewhere.
Does anyone know how to access it from the OpenTok iOS SDK?
In captureOutput:didOutputSampleBuffer:fromConnection:, the fromConnection parameter will differentiate the audio and video connections and provide the corresponding sample buffer.
To set up the audio input/output, you can try the following in the Let's-Build-OTPublisher initCapture method:
// add audio input/output to the existing capture session
AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];

NSError *error = nil;
_audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];
if (_audioInput && [_captureSession canAddInput:_audioInput]) {
    NSLog(@"added audio device input");
    [_captureSession addInput:_audioInput];
}

_audioOutput = [[AVCaptureAudioDataOutput alloc] init];
if ([_captureSession canAddOutput:_audioOutput]) {
    NSLog(@"audio output added");
    [_captureSession addOutput:_audioOutput];
}

// deliver audio sample buffers on the same serial queue as the video buffers
[_audioOutput setSampleBufferDelegate:self queue:_capture_queue];
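Once the audio output is wired up, both audio and video buffers arrive in the same delegate callback. A minimal sketch of telling them apart (the `_videoOutput` ivar and the writer-input calls in the comments are assumptions for illustration, not part of the sample project):

```objc
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    if (captureOutput == _audioOutput) {
        // Microphone audio buffer: append it to your recording, e.g. via an
        // AVAssetWriterInput configured for audio.
        // [_audioWriterInput appendSampleBuffer:sampleBuffer];
    } else if (captureOutput == _videoOutput) {
        // Video buffer: hand it to OpenTok as before, and optionally to a
        // video AVAssetWriterInput for the high-res recording.
    }
}
```

Comparing the captureOutput pointer (or equivalently connection, as each output has its own connection) is enough to route the two streams.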