Tags: swift, audio, avplayer, avaudioplayer, audiokit

How to play two audio files at once with Swift using AudioKit


I have two short wav audio files that I'm trying to play at the same time. Using AudioKit, I have an AudioEngine(), and I'm assuming I should use a MultiSegmentAudioPlayer() as the engine's output and schedule both files with scheduleSegments(). Here is what I have:

import AudioKit
import AVFoundation
import Combine

class AudioPlayClass: ObservableObject {

    var player = MultiSegmentAudioPlayer()
    let engine = AudioEngine()

    init() {}

    func playFiles() {
        self.engine.output = player
        do {
            try AVAudioSession.sharedInstance().setCategory(.playAndRecord, mode: .default, options: .defaultToSpeaker)
            try AVAudioSession.sharedInstance().setActive(true)
            try engine.start()

            guard let url = Bundle.main.url(forResource: note1, withExtension: "wav", subdirectory: instrumentDirectory) else { return }
            guard let url2 = Bundle.main.url(forResource: note2, withExtension: "wav", subdirectory: instrumentDirectory) else { return }

            let audioFile = try AVAudioFile(forReading: url)
            let audioFile2 = try AVAudioFile(forReading: url2)

            // Convert each file's length in samples to a duration in seconds.
            let fileSampleRate = audioFile.processingFormat.sampleRate
            let file2SampleRate = audioFile2.processingFormat.sampleRate
            let fileNumberOfSamples = audioFile.length
            let file2NumberOfSamples = audioFile2.length
            let audioFileEndTime = Double(fileNumberOfSamples) / fileSampleRate
            let audioFile2EndTime = Double(file2NumberOfSamples) / file2SampleRate

            // Both segments have playbackStartTime 0.0, so they should sound together.
            let segment1 = segment(audioFile: audioFile,
                                   playbackStartTime: 0.0, fileStartTime: 0, fileEndTime: audioFileEndTime)
            let segment2 = segment(audioFile: audioFile2,
                                   playbackStartTime: 0.0, fileStartTime: 0, fileEndTime: audioFile2EndTime)

            player.scheduleSegments(audioSegments: [segment1, segment2])
            player.play()
        } catch {
            print(error.localizedDescription)
        }
    }
}

public struct segment: StreamableAudioSegment {
    public var audioFile: AVAudioFile
    public var playbackStartTime: TimeInterval
    public var fileStartTime: TimeInterval
    public var fileEndTime: TimeInterval
    public var completionHandler: AVAudioNodeCompletionHandler?
}

I just have a basic understanding of playing audio in Swift and using AudioKit, so any feedback would be greatly appreciated. Thanks!


Solution

  • Solved this by creating two separate instances of AudioEngine(), each with its own AudioPlayer(), then loading the files and playing them immediately one after the other (see the sketch below).
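
A minimal sketch of that two-engine approach, assuming AudioKit 5's AudioPlayer with load(url:) and play(), and passing in the resource names that the question refers to as note1, note2, and instrumentDirectory:

import AudioKit
import AVFoundation

class TwoEnginePlayer {
    // One engine per file, each driving its own player.
    let engine1 = AudioEngine()
    let engine2 = AudioEngine()
    let player1 = AudioPlayer()
    let player2 = AudioPlayer()

    func playFiles(note1: String, note2: String, instrumentDirectory: String) {
        guard let url1 = Bundle.main.url(forResource: note1, withExtension: "wav", subdirectory: instrumentDirectory),
              let url2 = Bundle.main.url(forResource: note2, withExtension: "wav", subdirectory: instrumentDirectory) else { return }
        do {
            engine1.output = player1
            engine2.output = player2
            try player1.load(url: url1)
            try player2.load(url: url2)
            try engine1.start()
            try engine2.start()
            // Calling play() back to back keeps the two short samples
            // effectively simultaneous.
            player1.play()
            player2.play()
        } catch {
            print(error)
        }
    }
}

Usage would look something like TwoEnginePlayer().playFiles(note1: "C4", note2: "E4", instrumentDirectory: "Sounds"), where the resource names and subdirectory are placeholders for whatever the project actually bundles.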