ios · macos · avfoundation · core-audio · audiounit

AudioUnitRender produces empty audio buffers


I'm trying to mix audio from a couple of audio sources using Audio Units. The structure is quite simple: two inputs feed a mixer audio unit, which feeds a generic output. The problem is that the buffers rendered from the output AudioUnit contain only zeros.

I removed all OSStatus checks to keep the code short; however, all Audio Unit calls return noErr. The setup method is:

private func setupAudioUnits() {
    // Setting up mixer unit
    var mixerComponentDescription = AudioComponentDescription(
        componentType: kAudioUnitType_Mixer,
        componentSubType: kAudioUnitSubType_MultiChannelMixer,
        componentManufacturer: kAudioUnitManufacturer_Apple,
        componentFlags: 0,
        componentFlagsMask: 0)
    let mixerComponent = AudioComponentFindNext(nil, &mixerComponentDescription)
    AudioComponentInstanceNew(mixerComponent!, &mixerAudioUnit)
    guard let mixerAudioUnit else { fatalError() }

    var streamFormat = outputFormat.streamDescription.pointee
    AudioUnitSetProperty(mixerAudioUnit,
                         kAudioUnitProperty_StreamFormat,
                         kAudioUnitScope_Output,
                         0,
                         &streamFormat,
                         UInt32(MemoryLayout<AudioStreamBasicDescription>.size))
    AudioUnitInitialize(mixerAudioUnit)

    var busCount = UInt32(inputs)
    AudioUnitSetProperty(mixerAudioUnit,
                         kAudioUnitProperty_ElementCount,
                         kAudioUnitScope_Input,
                         0,
                         &busCount,
                         UInt32(MemoryLayout<UInt32>.size))

    // Setting up mixer's inputs
    for input in 0..<inputs {
        var callbackStruct = AURenderCallbackStruct(inputProc: inputRenderCallback, 
                                                    inputProcRefCon: Unmanaged.passUnretained(self).toOpaque())
        AudioUnitSetProperty(mixerAudioUnit,
                             kAudioUnitProperty_SetRenderCallback,
                             kAudioUnitScope_Input,
                             UInt32(input),
                             &callbackStruct,
                             UInt32(MemoryLayout<AURenderCallbackStruct>.size))
    }

    // Setting up output unit
    var outputDescription = AudioComponentDescription(
          componentType: kAudioUnitType_Output,
          componentSubType: kAudioUnitSubType_GenericOutput,
          componentManufacturer: kAudioUnitManufacturer_Apple,
          componentFlags: 0,
          componentFlagsMask: 0)
    let outputComponent = AudioComponentFindNext(nil, &outputDescription)
    AudioComponentInstanceNew(outputComponent!, &outputAudioUnit)
    guard let outputAudioUnit else { fatalError() }

    AudioUnitSetProperty(outputAudioUnit,
                         kAudioUnitProperty_StreamFormat,
                         kAudioUnitScope_Input,
                         0,
                         &streamFormat,
                         UInt32(MemoryLayout<AudioStreamBasicDescription>.size))

    AudioUnitInitialize(outputAudioUnit)

    // Setting up a connection between the mixer and output units
    var connection = AudioUnitConnection(sourceAudioUnit: mixerAudioUnit,
                                         sourceOutputNumber: 0,
                                         destInputNumber: 0)

    AudioUnitSetProperty(outputAudioUnit,
                         kAudioUnitProperty_MakeConnection,
                         kAudioUnitScope_Input,
                         0,
                         &connection,
                         UInt32(MemoryLayout<AudioUnitConnection>.size))

    AudioOutputUnitStart(outputAudioUnit)
}
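For reference, the stripped-out status checks can be restored with a small helper (a sketch; `checkStatus` is a hypothetical name, relying only on the fact that `noErr` is OSStatus 0):

```swift
import Foundation

/// Hypothetical helper: logs any Core Audio call that does not return
/// noErr (0) instead of silently discarding the OSStatus result.
@discardableResult
func checkStatus(_ status: Int32, _ operation: String) -> Int32 {
    if status != 0 {
        print("\(operation) failed with OSStatus \(status)")
    }
    return status
}
```

Each call in the setup method can then be wrapped, e.g. `checkStatus(AudioUnitInitialize(mixerAudioUnit), "AudioUnitInitialize")`.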

Here is the part that initiates the rendering. I'm passing nil for mData so that the audio unit provides its own internal buffer.

private func render(numberOfFrames: AVAudioFrameCount) {
    timeStamp.mFlags = .sampleTimeValid
    timeStamp.mSampleTime = Float64(sampleTime)

    let channelCount = outputFormat.channelCount
    let audioBufferList = AudioBufferList.allocate(maximumBuffers: Int(channelCount))
    for i in 0..<Int(channelCount) {
        audioBufferList[i] = AudioBuffer(mNumberChannels: 1,
                                          mDataByteSize: outputFormat.streamDescription.pointee.mBytesPerFrame,
                                          mData: nil)
    }

    // always returns noErr
    AudioUnitRender(outputAudioUnit,
                    nil,
                    &timeStamp,
                    0,
                    numberOfFrames,
                    audioBufferList.unsafeMutablePointer)

    for channel in 0..<Int(channelCount) {
        let outputBufferData = audioBufferList[channel].mData?.assumingMemoryBound(to: Float.self)
        // Inspecting the data
        // Audio buffer has only zeros
    }

    sampleTime += Int64(numberOfFrames)
}
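While debugging, the "only zeros" observation can be made explicit with a tiny check run on each channel after `AudioUnitRender` (a sketch; `isSilent` is a hypothetical helper, not part of the original code):

```swift
/// Hypothetical debugging helper: returns true when every sample in a
/// channel buffer is (near-)zero, i.e. the channel rendered silence.
func isSilent(_ samples: [Float], threshold: Float = 1e-7) -> Bool {
    samples.allSatisfy { abs($0) <= threshold }
}
```

In the inspection loop, the channel data could be copied out with `Array(UnsafeBufferPointer(start: outputBufferData, count: Int(numberOfFrames)))` and passed to `isSilent`.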

AudioUnitRender always returns noErr, and the buffers have the correct size based on the provided AVAudioFormat. The input callback is being called, and its ioData is filled with data:

[screenshot: non-zero sample values in inputRenderCallback's ioData]

However, the output buffer is always filled with zeros:

[screenshot: outputBufferData containing only zeros]

Is there something wrong with my output audio unit setup or with the connection? Or is the mixer not passing the data forward? Any help is appreciated.


Solution

  • The main issue was that kAudioUnitSubType_MultiChannelMixer always produces empty buffers on macOS in this setup. It's unclear whether that's a bug, but replacing the subtype with kAudioUnitSubType_StereoMixer fixed the issue:

    let mixerSubType: OSType
    #if os(macOS)
    mixerSubType = kAudioUnitSubType_StereoMixer
    #else
    mixerSubType = kAudioUnitSubType_MultiChannelMixer
    #endif

    There were also some other things worth mentioning: