I'm working on an audio visualizer app that plays files from the user's music library using MediaPlayer and AVAudioEngine. I'm getting the music library playback working before tackling the visualizer itself. After setting up the engine for file playback, my app inexplicably crashes with an EXC_BREAKPOINT (code = 1). Usually that means I'm force-unwrapping a nil value, but I believe I'm handling the optionals correctly with guard statements. I've pinpointed the crash to the installTap call: I removed the processAudioBuffer function entirely and the code still crashes the same way, so it's not that. The device I'm testing on is running iOS 26 beta 3, although the app targets iOS 18 and above.
Here is the setupAudioEngine function:
private func setupAudioEngine() {
    do {
        try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default)
        try AVAudioSession.sharedInstance().setActive(true)
    } catch {
        print("Audio session error: \(error)")
    }
    engine = AVAudioEngine()
    guard let engine else { return }
    _ = engine.mainMixerNode
}
Here is the play function:
func play(_ mediaItem: MPMediaItem) {
    guard let url = mediaItem.assetURL else {
        print("No asset URL for media item")
        return
    }
    player = AVAudioPlayerNode()
    guard let engine,
          let player else { return }
    do {
        audioFile = try AVAudioFile(forReading: url)
        guard let audioFile else {
            print("Failed to create audio file")
            return
        }
        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: audioFile.processingFormat)
        duration = Double(audioFile.length) / audioFile.fileFormat.sampleRate
        engine.prepare()
        if !engine.isRunning {
            try engine.start()
        }
        player.scheduleFile(audioFile, at: nil)
        DispatchQueue.main.async { [weak self] in
            self?.isPlaying = true
            self?.startDisplayLink()
        }
    } catch {
        print("Error playing audio: \(error)")
        DispatchQueue.main.async { [weak self] in
            self?.isPlaying = false
            self?.stopDisplayLink()
        }
    }
    engine.mainMixerNode.installTap(onBus: 0, bufferSize: 1024, format: engine.mainMixerNode.outputFormat(forBus: 0)) { [weak self] buffer, _ in
        self?.processAudioBuffer(buffer)
    }
    player.play()
}
Here is a link to my MRE project if you want to try it out for yourself: https://github.com/aabagdi/VisualMan-example
Thanks to the help of Apple DTS engineers, there are two workarounds:
Quoting them directly:
This hasn't made it back to you yet, but there are two workarounds which were added to your bug and should eventually be sent back to you. Those are:
Option 1:
Annotate the tap block closure as @Sendable
Isolate the call to `self?.processAudioBuffer(buffer)` on the main actor
However, since AVAudioBuffer is not marked as Sendable, either import AVFAudio.AVAudioBuffer or AVFoundation with the @preconcurrency annotation:
@preconcurrency import AVFAudio.AVAudioBuffer // or @preconcurrency import AVFoundation
[…]
engine.mainMixerNode.installTap(onBus: 0, bufferSize: 1024, format: format) { @Sendable [weak self] buffer, _ in
    Task { @MainActor in
        self?.processAudioBuffer(buffer)
    }
}
Option 2:
To avoid annotating the import with @preconcurrency:
Annotate the tap block closure as @Sendable
Extract the data from the AVAudioBuffer within the closure
Isolate the call to `self?.processAudioData(array)` on the main actor
engine.mainMixerNode.installTap(onBus: 0, bufferSize: 1024, format: format) { @Sendable [weak self] buffer, _ in
    // Extract the data from the buffer
    guard let channelData = buffer.floatChannelData?[0] else { return }
    let frameCount = Int(buffer.frameLength)
    let audioData = Array(UnsafeBufferPointer(start: channelData, count: frameCount))
    Task { @MainActor in
        self?.processAudioData(audioData)
    }
}
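For anyone adapting Option 2, here's a minimal sketch of what the main-actor side could look like. The processAudioData(_:) name comes from the DTS reply, but the RMS calculation and the currentLevel property are placeholders I'm assuming for the visualizer, not anything prescribed:

@MainActor
func processAudioData(_ samples: [Float]) {
    // Hypothetical receiver for the copied samples: compute a simple
    // RMS level and publish it for the visualizer to draw.
    guard !samples.isEmpty else { return }
    let sumOfSquares = samples.reduce(into: Float(0)) { $0 += $1 * $1 }
    let rms = (sumOfSquares / Float(samples.count)).squareRoot()
    currentLevel = rms // placeholder published property driving the visualizer
}

Because the closure copies the samples into a plain [Float] before hopping to the main actor, nothing non-Sendable crosses the concurrency boundary, which is why this variant doesn't need the @preconcurrency import.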