I have an app that streams mov files. One of the features in my app is recording audio over these streamed files. To accomplish that, I download the streamed file in the background and then start an audio recording to capture the user's audio. Once the user is done, I take the recorded audio and merge it with the mov file I previously downloaded in the background.
All of this works well except when headphones are plugged in. The experience is the same, but when you play back the recording, only the recorded audio is there. The mov file never makes it into the final asset, and I'm not sure why.
Here is how I'm producing the recording:
let video = AVAsset(URL: currentCacheUrl)
let audioRecording = AVAsset(URL: currentRecordingUrl)

// 1 - Create AVMutableComposition object. This object will hold your AVMutableCompositionTrack instances.
let mixComposition = AVMutableComposition()

// 2 - Video track
let videoTrack = mixComposition.addMutableTrackWithMediaType(AVMediaTypeVideo,
                                                             preferredTrackID: kCMPersistentTrackID_Invalid)
do {
    try videoTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, audioRecording.duration),
                                   ofTrack: video.tracksWithMediaType(AVMediaTypeVideo)[0],
                                   atTime: kCMTimeZero)
} catch _ {
    print("Failed to load video track")
}

// 3 - Audio track (the user's recorded audio)
let audioTrack = mixComposition.addMutableTrackWithMediaType(AVMediaTypeAudio,
                                                             preferredTrackID: kCMPersistentTrackID_Invalid)
do {
    try audioTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, audioRecording.duration),
                                   ofTrack: audioRecording.tracksWithMediaType(AVMediaTypeAudio)[0],
                                   atTime: kCMTimeZero)
} catch _ {
    print("Failed to load audio track")
}

// 4 - Get path
let recordingsPath = MisueroKaraokeLatinoHelper.Variables.getRecordingsDirectory
let currentDate = NSDate()
let date = formatDate(currentDate)
let savePath = recordingsPath.URLByAppendingPathComponent("\(date).mov")

// 5 - Create exporter
guard let exporter = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality) else { return }
exporter.outputURL = savePath
exporter.outputFileType = AVFileTypeQuickTimeMovie
exporter.shouldOptimizeForNetworkUse = true

// 6 - Perform the export
exporter.exportAsynchronouslyWithCompletionHandler {
    dispatch_async(dispatch_get_main_queue()) {
        print("save and merge complete")
        let recording = Recording(title: self.currentRecordSong.title, artist: self.currentRecordSong.artist, genre: self.currentRecordSong.genre, timestamp: currentDate, fileUrl: savePath)
        MisueroKaraokeLatinoHelper.Variables.Recordings.append(recording)
        MisueroKaraokeLatinoHelper.Variables.Recordings.sortInPlace({ $0.timestamp.compare($1.timestamp) == NSComparisonResult.OrderedDescending })
        let recordingsData = NSKeyedArchiver.archivedDataWithRootObject(MisueroKaraokeLatinoHelper.Variables.Recordings)
        NSUserDefaults.standardUserDefaults().setObject(recordingsData, forKey: "Recordings")
    }
}
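One aside on step 6: the completion handler fires whether or not the export succeeded, so "save and merge complete" is printed unconditionally. A minimal sketch of checking the status before persisting anything (reusing the exporter and savePath names from above) would be:

exporter.exportAsynchronouslyWithCompletionHandler {
    dispatch_async(dispatch_get_main_queue()) {
        switch exporter.status {
        case .Completed:
            print("save and merge complete: \(savePath)")
            // safe to build and persist the Recording here
        case .Failed, .Cancelled:
            print("export failed: \(exporter.error)")
        default:
            break
        }
    }
}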
Something similar happens when the iOS device's audio output is set to a Bluetooth device: the final recording captures the user's recorded audio and the video from the mov file, but not the audio in the mov file. What gives?
addMutableTrackWithMediaType adds a track of the specified media type; by specifying AVMediaTypeVideo when I add the video track, it ONLY adds the video track. So when headphones are not plugged in, the audio recorder happens to pick up the mov's audio as it plays through the device speaker, but when headphones are plugged in the mic cannot pick up the audio playing through the earbuds, and the mov's audio never reaches the composition at all.
The solution is to add both the audio and the video track of the cached mov file to the composition and deal with the lag/synchronization of all the tracks. The cached mov's audio should be added conditionally: only when headphones are plugged in, or when the device's audio output is routed to something like a Bluetooth speaker, since in those cases the audio recorder cannot pick up that audio cleanly.
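As a rough sketch of that conditional (assuming the same video, audioRecording, and mixComposition from the code above, and using AVAudioSession's current route to decide whether the mic can hear the playback), something like this can slot in next to the other two insertTimeRange calls:

// Pull the cached mov's own audio into the composition only when the
// device's output is routed somewhere the microphone can't hear cleanly.
let outputs = AVAudioSession.sharedInstance().currentRoute.outputs
let quietRoutes = [AVAudioSessionPortHeadphones,
                   AVAudioSessionPortBluetoothA2DP,
                   AVAudioSessionPortBluetoothHFP,
                   AVAudioSessionPortBluetoothLE]
let needsMovAudio = outputs.contains { quietRoutes.contains($0.portType) }

if needsMovAudio && !video.tracksWithMediaType(AVMediaTypeAudio).isEmpty {
    let movAudioTrack = mixComposition.addMutableTrackWithMediaType(AVMediaTypeAudio,
                                                                    preferredTrackID: kCMPersistentTrackID_Invalid)
    do {
        try movAudioTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, audioRecording.duration),
                                          ofTrack: video.tracksWithMediaType(AVMediaTypeAudio)[0],
                                          atTime: kCMTimeZero)
    } catch _ {
        print("Failed to add the cached mov's audio track")
    }
}

This sketch doesn't address the lag/synchronization part; if the recorded audio starts later than playback, you would shift the atTime of one of the tracks by the measured offset.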
Open to other solutions/suggestions, but this approach is working for me for now.