Tags: swift, avfoundation, avassetexportsession

AVAssetExportSession fails on iOS 13 when muxing together audio and video


This code works (and still does) on all pre-iOS 13 devices. On iOS 13, however, I am getting this error after the exportAsynchronously call runs:

Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (-12735), NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x282e194a0 {Error Domain=NSOSStatusErrorDomain Code=-12735 "(null)"}}

I'm unsure whether iOS 13 adds or changes some requirement in the basic setup of the AVAssetExportSession object. Could this be an iOS bug?


Here is the code:

import AVFoundation

func compileAudioAndVideoToMovie(audioInputURL: URL, videoInputURL: URL) {
    let docPath = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)[0]
    let videoOutputURL = URL(fileURLWithPath: docPath).appendingPathComponent("video.mp4")

    // Remove any previous output file; ignore the error if none exists
    try? FileManager.default.removeItem(at: videoOutputURL)

    let mixComposition = AVMutableComposition()
    let videoTrack = mixComposition.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)
    let videoInputAsset = AVURLAsset(url: videoInputURL)
    let audioTrack = mixComposition.addMutableTrack(withMediaType: .audio, preferredTrackID: kCMPersistentTrackID_Invalid)
    let audioInputAsset = AVURLAsset(url: audioInputURL)

    do {
        // Insert a 3-second video clip into the video track
        try videoTrack?.insertTimeRange(CMTimeRangeMake(start: CMTimeMake(value: 0, timescale: 1000), duration: CMTimeMake(value: 3000, timescale: 1000)),
                                        of: videoInputAsset.tracks(withMediaType: .video)[0],
                                        at: CMTimeMake(value: 0, timescale: 1000))
        // Insert a 3-second audio clip into the audio track
        try audioTrack?.insertTimeRange(CMTimeRangeMake(start: CMTimeMake(value: 0, timescale: 1000), duration: CMTimeMake(value: 3000, timescale: 1000)),
                                        of: audioInputAsset.tracks(withMediaType: .audio)[0],
                                        at: CMTimeMake(value: 0, timescale: 1000))

        guard let assetExporter = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetPassthrough) else {
            print("Could not create export session")
            return
        }
        assetExporter.outputFileType = .mp4
        assetExporter.outputURL = videoOutputURL
        assetExporter.shouldOptimizeForNetworkUse = false
        assetExporter.exportAsynchronously {
            switch assetExporter.status {
            case .cancelled:
                print("Exporting cancelled")
            case .completed:
                print("Exporting completed")
            case .exporting:
                print("Exporting ...")
            case .failed:
                print("Exporting failed")
            default:
                print("Exporting with other result")
            }
            if let error = assetExporter.error {
                print("Error:\n\(error)")
            }
        }
    }
    catch {
        print("Exception when compiling movie: \(error)")
    }
}

Solution

  • The issue seems to be related to AVAssetExportPresetPassthrough (possibly in combination with AAC audio).

    Changing the preset to AVAssetExportPresetLowQuality or AVAssetExportPresetHighestQuality causes the video and audio to be properly muxed into one file. Again, this only happens on iOS 13 and is likely a bug.
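Applied to the code in the question, the workaround amounts to swapping the preset when creating the export session. A sketch of the changed lines (note that unlike passthrough, a quality preset re-encodes the media, so exports take longer; it may also be worth checking the preset/file-type combination with AVAssetExportSession.determineCompatibility before exporting):

```swift
// Workaround for the iOS 13 -11800/-12735 failure: avoid the passthrough
// preset. This re-encodes the composition rather than copying samples.
let assetExporter = AVAssetExportSession(asset: mixComposition,
                                         presetName: AVAssetExportPresetHighestQuality)
assetExporter?.outputFileType = .mp4
assetExporter?.outputURL = videoOutputURL
```

The rest of the export code (exportAsynchronously and the status handling) stays the same.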