Tags: ios, swift, avfoundation, swift4, avmutablecomposition

Trouble applying scaleTimeRange on multiple videos in an AVMutableComposition video


I am attempting to merge videos with scaleTimeRange (to make them slow-motion or sped-up); however, it is not working as desired. Only the first video has the time-range effect applied... not all of them.

The work is done in the merge-videos function; it is pretty simple... however, I am not sure why the time-range scaling works only for the first video and not for the subsequent ones...

This is a test project to test with, it has my current code: https://github.com/meyesyesme/creationMergeProj

This is the merge function I use, with the time range scaling currently commented out (you can uncomment to see it not working):

func mergeVideosTestSQ(arrayVideos:[VideoSegment], completion:@escaping (URL?, Error?) -> ()) {


let mixComposition = AVMutableComposition()


var instructions: [AVMutableVideoCompositionLayerInstruction] = []
var insertTime = CMTime(seconds: 0, preferredTimescale: 1)

print(arrayVideos, "<- arrayVideos")
/// for each URL add the video and audio tracks and their duration to the composition
for videoSegment in arrayVideos {
    
    let sourceAsset = AVAsset(url: videoSegment.videoURL!)
    
    let frameRange = CMTimeRange(start: CMTime(seconds: 0, preferredTimescale: 1), duration: sourceAsset.duration)
    
    guard
        let nthVideoTrack = mixComposition.addMutableTrack(withMediaType: .video, preferredTrackID: Int32(kCMPersistentTrackID_Invalid)),
        let nthAudioTrack = mixComposition.addMutableTrack(withMediaType: .audio, preferredTrackID: Int32(kCMPersistentTrackID_Invalid)), //0 used to be kCMPersistentTrackID_Invalid
        let assetVideoTrack = sourceAsset.tracks(withMediaType: .video).first
    else {
        print("didnt work")
        return
    }
    
    var assetAudioTrack: AVAssetTrack?
    assetAudioTrack = sourceAsset.tracks(withMediaType: .audio).first
    print(assetAudioTrack, ",-- assetAudioTrack???", assetAudioTrack?.asset, "<-- hes", sourceAsset)
    
    do {
        
        try nthVideoTrack.insertTimeRange(frameRange, of: assetVideoTrack, at: insertTime)
        try nthAudioTrack.insertTimeRange(frameRange, of: assetAudioTrack!, at: insertTime)
        
        //MY CURRENT SPEED ATTEMPT:
        let newDuration = CMTimeMultiplyByFloat64(frameRange.duration, multiplier: videoSegment.videoSpeed)
        nthVideoTrack.scaleTimeRange(frameRange, toDuration: newDuration)
        nthAudioTrack.scaleTimeRange(frameRange, toDuration: newDuration)
        
        print(insertTime.value, "<-- fiji, newdur --->", newDuration.value, "sourceasset duration--->", sourceAsset.duration.value, "frameRange.duration -->", frameRange.duration.value)
        
        //instructions:
        let nthInstruction = ViewController.videoCompositionInstruction(nthVideoTrack, asset: sourceAsset)
        nthInstruction.setOpacity(0.0, at: CMTimeAdd(insertTime, newDuration)) //sourceasset.duration
        
        instructions.append(nthInstruction)
        insertTime = insertTime + newDuration //sourceAsset.duration
        
        
    } catch {
        DispatchQueue.main.async {
            print("didnt wor2k")
        }
    }
    
}


let mainInstruction = AVMutableVideoCompositionInstruction()
mainInstruction.timeRange = CMTimeRange(start: CMTime(seconds: 0, preferredTimescale: 1), duration: insertTime)

mainInstruction.layerInstructions = instructions

let mainComposition = AVMutableVideoComposition()
mainComposition.instructions = [mainInstruction]
mainComposition.frameDuration = CMTimeMake(value: 1, timescale: 30)
mainComposition.renderSize = CGSize(width: 1080, height: 1920)

let outputFileURL = URL(fileURLWithPath: NSTemporaryDirectory() + "merge.mp4")

//below to clear the video from the documents folder for the new vid...
let fileManager = FileManager()
try? fileManager.removeItem(at: outputFileURL)

print("<now will export: 🔥 🔥🔥🔥🔥🔥🔥🔥🔥🔥🔥🔥🔥🔥🔥🔥🔥🔥🔥🔥🔥🔥🔥🔥🔥🔥🔥🔥")


/// try to start an export session and set the path and file type
if let exportSession = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality) { //DOES NOT WORK WITH AVAssetExportPresetPassthrough
    exportSession.outputFileType = .mov
    exportSession.outputURL = outputFileURL
    exportSession.videoComposition = mainComposition
    exportSession.shouldOptimizeForNetworkUse = true
    
    /// try to export the file and handle the status cases
    exportSession.exportAsynchronously {
        if let url = exportSession.outputURL{
            completion(url, nil)
        }
        if let error = exportSession.error {
            completion(nil, error)
        }
    }
    
}

}

You'll see this behavior: the first video works well, but the next ones do not and have issues with when their opacity is set, etc... I have tried different combinations and this is the closest one yet.

I've been stuck on this for a while!


Solution

    1. After you scale the video, the duration of the composition gets recalculated, so you need to append the next part according to this change. Replace

       insertTime = insertTime + duration
      

      with

       insertTime = insertTime + newDuration
      
    2. You also need to update the setOpacity at: value. I'd advise you to move that line after the insertTime update and use the new value, to remove duplicate work here.

    3. When you're applying the scale, it's applied to the new composition, so you need to use a range relative to the composition:

       let currentRange = CMTimeRange(start: insertTime, duration: frameRange.duration)
       nthVideoTrack.scaleTimeRange(currentRange, toDuration: newDuration)
       nthAudioTrack.scaleTimeRange(currentRange, toDuration: newDuration)
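Putting the three fixes together, the time bookkeeping inside the loop can be sketched in isolation. This is a minimal, self-contained illustration (the 4-second segments and the 0.5/2.0 speeds are made-up values, and the track/instruction calls are shown as comments since they need the full composition setup from the question):

```swift
import CoreMedia

// Two hypothetical 4-second segments: one sped up 2x (multiplier 0.5),
// one slowed to half speed (multiplier 2.0).
let sourceDuration = CMTime(seconds: 4, preferredTimescale: 600)
let speeds: [Float64] = [0.5, 2.0]

var insertTime = CMTime.zero
for speed in speeds {
    let frameRange = CMTimeRange(start: .zero, duration: sourceDuration)
    let newDuration = CMTimeMultiplyByFloat64(frameRange.duration, multiplier: speed)

    // Fix 3: the range to scale is relative to the composition,
    // so it starts at insertTime, not at zero.
    let currentRange = CMTimeRange(start: insertTime, duration: frameRange.duration)
    // nthVideoTrack.scaleTimeRange(currentRange, toDuration: newDuration)
    // nthAudioTrack.scaleTimeRange(currentRange, toDuration: newDuration)

    // Fix 1: advance the insertion point by the *scaled* duration...
    insertTime = insertTime + newDuration
    // Fix 2: ...then set opacity using the updated insertTime,
    // instead of recomputing CMTimeAdd(insertTime, newDuration).
    // nthInstruction.setOpacity(0.0, at: insertTime)

    print(CMTimeGetSeconds(insertTime))
}
```

With these values the insertion point lands at 2 s after the first segment (4 s × 0.5) and 10 s after the second (2 s + 4 s × 2.0), which is exactly where the next segment's tracks and the previous segment's opacity cutoff should go.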