Tags: ios, objective-c, video, avfoundation, video-editing

How to merge two videos with transparency


I have successfully merged video-1 and video-2 over each other, with video-2 on top being transparent, using the AVFoundation framework. But after merging, the bottom video (video-1) is not displayed; only video-2 is visible, unless I use the code below:

// Make the top (video-2) layer semi-transparent so video-1 shows through.
AVMutableVideoCompositionLayerInstruction *SecondlayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:secondTrack];
[SecondlayerInstruction setOpacity:0.6 atTime:kCMTimeZero];

This sets opacity on the video-2 layer. But the actual problem is that there is some content over the video-2 layer which is not transparent, and after applying opacity to the video-2 layer, it is also applied to that content, which should stay opaque.
I am adding two images here which show both scenarios after setting opacity using AVMutableVideoCompositionLayerInstruction:

[screenshot 1] [screenshot 2]
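For reference, the composition is assembled roughly like this; a minimal sketch in Swift, where firstTrack/secondTrack stand for the video-1 and video-2 composition tracks and firstAsset for video-1's asset (these names are assumptions, not my exact code):

    // Minimal sketch (assumed names: firstAsset, firstTrack, secondTrack).
    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, firstAsset.duration)

    let secondLayerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: secondTrack)
    secondLayerInstruction.setOpacity(0.6, at: kCMTimeZero) // video-2 on top, semi-transparent

    let firstLayerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: firstTrack)

    // Order matters: the first layer instruction in this array is rendered topmost.
    instruction.layerInstructions = [secondLayerInstruction, firstLayerInstruction]

    let videoComposition = AVMutableVideoComposition()
    videoComposition.instructions = [instruction]
    videoComposition.frameDuration = CMTimeMake(1, 30) // 30 fps
    videoComposition.renderSize = firstTrack.naturalSize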

Edit 1: Taking reference from this old question (link), I also tried setting a background color on my AVMutableVideoCompositionInstruction, which did not help either.

Edit 2: In AVVideoComposition.h, I found:

Indicates the background color of the composition. Solid BGRA colors only are supported; patterns and other color refs that are not supported will be ignored. If the background color is not specified the video compositor will use a default backgroundColor of opaque black. If the rendered pixel buffer does not have alpha, the alpha value of the backgroundColor will be ignored.

I didn't get what this means. Can anyone help?
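In code, I read that comment as something like this (a sketch in Swift; whether it helps at all seems to depend on the rendered pixel buffer actually having an alpha channel):

    // Sketch: the compositor fills each frame with this solid color
    // before drawing the video layers. Only solid BGRA colors are used;
    // the alpha component is ignored if the pixel buffer has no alpha.
    let instruction = AVMutableVideoCompositionInstruction()
    instruction.backgroundColor = UIColor.clear.cgColor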


Solution

  • Good question! Try this:

    import AVFoundation

    // arrayVideos: [AVAsset] is assumed to hold the clips to merge.
    var totalTime = kCMTimeZero   // Running total of all inserted durations.
    var videoSize = CGSize.zero   // Size of the last inserted video track.

    func mergeVideoArray() {
        let mixComposition = AVMutableComposition()
        for videoAsset in arrayVideos {
            let videoTrack =
                mixComposition.addMutableTrack(withMediaType: AVMediaTypeVideo,
                                               preferredTrackID: Int32(kCMPersistentTrackID_Invalid))
            do {
                let atTimeM: CMTime
                if videoAsset == arrayVideos.first {
                    atTimeM = kCMTimeZero
                } else {
                    atTimeM = totalTime // <-- Use the total time for all the videos seen so far.
                }
                try videoTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration),
                                               of: videoAsset.tracks(withMediaType: AVMediaTypeVideo)[0],
                                               at: atTimeM)
                videoSize = videoTrack.naturalSize
            } catch let error as NSError {
                print("error: \(error)")
            }
            totalTime = CMTimeAdd(totalTime, videoAsset.duration) // <-- Update the total time for all videos.
        }

    ...
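The elided remainder then typically exports mixComposition, continuing inside mergeVideoArray(); a minimal export sketch, where the output location, file name, and preset are assumptions:

    // Minimal export sketch (assumed output location and preset).
    let outputURL = URL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent("merged.mp4")
    guard let exporter = AVAssetExportSession(asset: mixComposition,
                                              presetName: AVAssetExportPresetHighestQuality) else { return }
    exporter.outputURL = outputURL
    exporter.outputFileType = AVFileTypeMPEG4
    exporter.exportAsynchronously {
        print("export finished with status: \(exporter.status.rawValue)")
    }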