macosaudiokit

Minimal AudioKit microphone configuration (macOS + iOS)


I've been building an app for macOS (and also iOS) using AudioKit.

The app will play sounds with a MIDISampler, and this bit works!

It will also listen with a device microphone and use a PitchTap to provide tuning information. I have not been able to get this to work.

My audio graph setup looks like this…

import AudioKit
import AudioKitEX          // Fader – module layout may differ between AudioKit versions
import SoundpipeAudioKit   // PitchTap

final class AudioGraph {
    let engine = AudioEngine()
    let sampler = MIDISampler(name: "sampler")
    let pitchTracker: PitchTap

    init(file: SamplerFile? = nil) throws {
        let mic = engine.input!

        // Mix the mic into the output at zero gain so it is part of the
        // running graph without being audible.
        engine.output = Mixer(
            sampler,
            Fader(mic, gain: 0.0)
        )

        pitchTracker = PitchTap(mic) { f, a in
            guard let f = f.first, let a = a.first else { return }
            print("Frequency \(f) – Amplitude \(a)")
        }

        if let file {
            try setFile(file)
        }
    }

    func setFile(_ file: SamplerFile) throws {
        try sampler.loadInstrument(url: file.url)
    }
}

// MARK: -

extension AudioGraph: NotePlayer {
    func startAudioEngine() throws {
        print("### Start engine")
        try engine.start()
        pitchTracker.start()
    }

    func stopAudioEngine() {
        pitchTracker.stop()
        engine.stop()
    }
}

When I run this lot on macOS, I can play notes (yay). The PitchTap callback does get called with frequency and amplitude values, but the frequency is always 100.0 and the amplitude always 2.4266092e-05 (i.e. pretty much zero).

I've done some experiments…

I've built and run the AudioKit Cookbook app, and the Tuner in it works great on both macOS and iOS, which is very encouraging!

My speculation is that I have not configured microphone input correctly on macOS, or perhaps haven't obtained the necessary permissions, so engine.input is just a stream of silence?
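
A quick way to rule out the permission half of that speculation (this is plain AVFoundation, independent of AudioKit, and the helper name is just for illustration) would be something like…

import AVFoundation

// Illustrative helper: reports whether the process is currently allowed to
// capture audio, prompting the user the first time it is asked.
func checkMicrophoneAccess(_ completion: @escaping (Bool) -> Void) {
    switch AVCaptureDevice.authorizationStatus(for: .audio) {
    case .authorized:
        completion(true)
    case .notDetermined:
        // First call triggers the system permission prompt.
        AVCaptureDevice.requestAccess(for: .audio, completionHandler: completion)
    default:
        // .denied or .restricted
        completion(false)
    }
}

// e.g. before calling startAudioEngine():
// checkMicrophoneAccess { granted in print("Microphone access granted: \(granted)") }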

I wondered if there is a super minimal "Hello, World!" level demo application showing just how to configure the microphone for use on macOS?

The key points for microphone input, or just the code, would be a great answer to this question.
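
For reference, the smallest graph I can think of that exercises the microphone path looks something like the sketch below. It assumes an AudioKit 5 style package layout (Fader in AudioKitEX, PitchTap in SoundpipeAudioKit – adjust the imports to your version), and the MinimalTuner name is just for this example…

import AudioKit
import AudioKitEX          // Fader
import SoundpipeAudioKit   // PitchTap

final class MinimalTuner {
    let engine = AudioEngine()
    let tracker: PitchTap

    init() {
        guard let mic = engine.input else {
            fatalError("No audio input available")
        }

        // Keep the mic in the output chain at zero gain so it is silent in
        // the mix (the same pattern the Cookbook Tuner uses).
        engine.output = Fader(mic, gain: 0.0)

        tracker = PitchTap(mic) { frequency, amplitude in
            guard let f = frequency.first, let a = amplitude.first else { return }
            print("Frequency \(f) – Amplitude \(a)")
        }
    }

    func start() throws {
        try engine.start()
        tracker.start()
    }
}

If this minimal version also reports a near-zero amplitude, the graph code is probably fine and the problem is environmental (permissions / capabilities) – which is what it turned out to be, see the solution below.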

I also want to get this running on iOS, where the PitchTap callback isn't firing at all. I'll get to that in another question, though, once I've resolved this one.

Many Thanks!


Solution

  • The answer to this turned out to be as simple as it was completely non-obvious.

    It seems to be more general than the AudioKit library and will probably affect any use of the microphone on all recent Apple hardware.

    You need to give your Project Target the necessary "Capabilities" to access audio input.

    If you do not do this, it is still possible to access audio inputs, but the input devices will just provide silence. There doesn't seem to be any log message, compile-time warning, or thrown error (see the diagnostic sketch at the end of this answer).

    Getting audio input capability…

    I've found two "Capabilities" to enable in the Xcode project. I'm not sure if both are necessary – perhaps one is used by Catalyst targets and the other by non-Catalyst targets. I've not investigated this yet, so I suggest just checking both.

    Here's what those sections look like…

    App Sandbox UI (screenshot)

    Hardened Runtime UI (screenshot)
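
    For what it's worth, both of those checkboxes appear to correspond to the com.apple.security.device.audio-input entitlement, and depending on the OS version you may also need an NSMicrophoneUsageDescription entry in Info.plist for the permission prompt. A quick way to confirm whether you are in the "silent input" state described above, without involving AudioKit at all, is the diagnostic sketch below: it taps the raw AVAudioEngine input and prints the peak level of each buffer.

    import AVFoundation

    // Diagnostic sketch only: without the Audio Input capability this tends to
    // print 0.0 forever; with the capability (and permission) in place, speaking
    // into the microphone should move the value. Run it from main.swift or a
    // playground.
    let engine = AVAudioEngine()
    let input = engine.inputNode
    let format = input.outputFormat(forBus: 0)

    input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
        guard let samples = buffer.floatChannelData?[0] else { return }
        var peak: Float = 0
        for i in 0..<Int(buffer.frameLength) {
            peak = max(peak, abs(samples[i]))
        }
        print("Input peak: \(peak)")
    }

    try engine.start()
    RunLoop.main.run()   // keep the process alive so the tap keeps firing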