Tags: ios, avfoundation, cifilter, avasset, core-video

Rendering a video in a CALayer hierarchy using CIFilters


In the UI of my iOS app, I display a complex hierarchy of CALayers. One of these layers is an AVPlayerLayer that displays a video with CIFilters applied in real time (using AVVideoComposition(asset:applyingCIFiltersWithHandler:)).

Now I want to export this layer composition to a video file. There are two tools in AVFoundation that seem helpful:

A: AVVideoCompositionCoreAnimationTool, which allows rendering a video inside a (possibly animated) CALayer hierarchy

B: AVVideoComposition(asset:applyingCIFiltersWithHandler:), which I also use in the UI, to apply CIFilters to a video asset.

However, these two tools cannot be used simultaneously: if I start an AVAssetExportSession that combines them, AVFoundation throws an NSInvalidArgumentException:

Expecting video composition to contain only AVCoreImageFilterVideoCompositionInstruction

I tried to work around this limitation as follows:

Workaround 1

1) Set up an export using AVAssetReader and AVAssetWriter.

2) Obtain the sample buffers from the asset reader, apply the CIFilter, and save the result in a CGImage.

3) Set the CGImage as the content of the video layer in the layer hierarchy. Now the layer hierarchy "looks like" one frame of the final video.

4) For each frame, get the base address of the asset writer's CVPixelBuffer using CVPixelBufferGetBaseAddress and create a CGContext backed by that data.

5) Render my layer into that context using CALayer.render(in:).

This setup works, but it is extremely slow: exporting a 5-second video sometimes takes a minute. The CoreGraphics calls appear to be the bottleneck (I guess that's because with this approach the composition happens on the CPU). A sketch of steps 4 and 5 follows below.
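For illustration, steps 4 and 5 amount to something like this (a sketch; `layer` and `pixelBuffer` are placeholders for the actual objects in my setup):

    import AVFoundation
    import QuartzCore

    // Render one frame of the CALayer hierarchy into a pixel buffer
    // obtained from the asset writer's pixel buffer adaptor pool.
    func render(layer: CALayer, into pixelBuffer: CVPixelBuffer) {
        CVPixelBufferLockBaseAddress(pixelBuffer, [])
        defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, []) }
        guard let context = CGContext(
            data: CVPixelBufferGetBaseAddress(pixelBuffer),
            width: CVPixelBufferGetWidth(pixelBuffer),
            height: CVPixelBufferGetHeight(pixelBuffer),
            bitsPerComponent: 8,
            bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
            space: CGColorSpaceCreateDeviceRGB(),
            bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue | CGBitmapInfo.byteOrder32Little.rawValue
        ) else { return }
        layer.render(in: context) // CPU-bound: this call is the bottleneck
    }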

Workaround 2

Another approach would be to do this in two steps: first, export the source video with only the filters applied to a file, as in B; then use that intermediate file to embed the video in the layer composition, as in A. However, since this uses two passes, I suspect it isn't as efficient as it could be. A sketch of the first pass follows below.
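The first pass would use the handler-based API from B; a minimal sketch, assuming `asset` is the source AVAsset and using an example filter:

    import AVFoundation
    import CoreImage

    // Pass 1: bake the filters into the composition for an intermediate export.
    let filteredComposition = AVVideoComposition(asset: asset) { request in
        // Apply whatever filter chain the UI uses; CIPhotoEffectNoir is just an example.
        let filtered = request.sourceImage.applyingFilter("CIPhotoEffectNoir")
        request.finish(with: filtered, context: nil)
    }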

Summary

What is a good approach to export this video to a file, ideally in a single pass? How can I use CIFilters and AVVideoCompositionCoreAnimationTool simultaneously? Is there a native way to set up a "pipeline" in AVFoundation which combines these tools?


Solution

  • The way to achieve this is to use a custom AVVideoCompositing. This object lets you process (in this case, apply the CIFilter to) each video frame.

    Here's an example implementation that applies a CIPhotoEffectNoir effect to the whole video:

    import AVFoundation
    import CoreImage

    class VideoFilterCompositor: NSObject, AVVideoCompositing {
    
        // Source frames and output frames both use 32BGRA.
        var sourcePixelBufferAttributes: [String : Any]? = [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
        var requiredPixelBufferAttributesForRenderContext: [String : Any] = [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
        private var renderContext: AVVideoCompositionRenderContext?
    
        func renderContextChanged(_ newRenderContext: AVVideoCompositionRenderContext) {
            renderContext = newRenderContext
        }
    
        func cancelAllPendingVideoCompositionRequests() {
        }
    
        private let filter = CIFilter(name: "CIPhotoEffectNoir")!
        private let context = CIContext()
    
        func startRequest(_ asyncVideoCompositionRequest: AVAsynchronousVideoCompositionRequest) {
            // Grab the source frame of the first video track.
            guard let trackID = asyncVideoCompositionRequest.sourceTrackIDs.first?.int32Value,
                  let frame = asyncVideoCompositionRequest.sourceFrame(byTrackID: trackID) else {
                asyncVideoCompositionRequest.finish(with: NSError(domain: "VideoFilterCompositor", code: 0, userInfo: nil))
                return
            }
            // Run the frame through the filter and render the result into a
            // fresh pixel buffer from the render context's pool.
            filter.setValue(CIImage(cvPixelBuffer: frame), forKey: kCIInputImageKey)
            if let outputImage = filter.outputImage, let outBuffer = renderContext?.newPixelBuffer() {
                context.render(outputImage, to: outBuffer)
                asyncVideoCompositionRequest.finish(withComposedVideoFrame: outBuffer)
            } else {
                asyncVideoCompositionRequest.finish(with: NSError(domain: "VideoFilterCompositor", code: 0, userInfo: nil))
            }
        }
    
    }
    

    If you need different filters at different times, you can use custom AVVideoCompositionInstructionProtocol instructions, which you can read back from the AVAsynchronousVideoCompositionRequest via its videoCompositionInstruction property.
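
    A minimal sketch of such an instruction; the `filterName` property is a custom addition (not part of AVFoundation), while the remaining properties are required by the protocol:

    import AVFoundation

    class FilterInstruction: NSObject, AVVideoCompositionInstructionProtocol {
        // Required by AVVideoCompositionInstructionProtocol:
        let timeRange: CMTimeRange
        let enablePostProcessing = true // must stay true for the Core Animation tool to run
        let containsTweening = true
        let requiredSourceTrackIDs: [NSValue]? = nil // nil = all source tracks required
        let passthroughTrackID = kCMPersistentTrackID_Invalid

        // Custom payload: which filter to apply during this time range.
        let filterName: String

        init(timeRange: CMTimeRange, filterName: String) {
            self.timeRange = timeRange
            self.filterName = filterName
            super.init()
        }
    }

    In startRequest, you would then cast asyncVideoCompositionRequest.videoCompositionInstruction to FilterInstruction and pick the filter accordingly.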

    Next, you need to use this with your AVMutableVideoComposition, so:

    let videoComposition = AVMutableVideoComposition()
    videoComposition.customVideoCompositorClass = VideoFilterCompositor.self
    // Add your animation tool as usual; `videoLayer` and `parentLayer`
    // are the layers from your existing CALayer hierarchy.
    let animator = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videoLayer, in: parentLayer)
    videoComposition.animationTool = animator
    // Finish setting up the composition (see below)
    
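    "Finishing the setup" means at least providing a render size, a frame duration, and instructions. The values below are illustrative assumptions (1080p at 30 fps, using the FilterInstruction sketch from above and an `asset` from your own code):

    videoComposition.renderSize = CGSize(width: 1920, height: 1080)
    videoComposition.frameDuration = CMTime(value: 1, timescale: 30)
    videoComposition.instructions = [
        FilterInstruction(timeRange: CMTimeRange(start: .zero, duration: asset.duration),
                          filterName: "CIPhotoEffectNoir")
    ]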

    With this, you should be able to export the video using a regular AVAssetExportSession, setting its videoComposition.
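
    A minimal export sketch, assuming `asset`, `videoComposition`, and `outputURL` come from your own code:

    import AVFoundation

    func export(asset: AVAsset, videoComposition: AVVideoComposition, to outputURL: URL) {
        guard let session = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetHighestQuality) else { return }
        session.videoComposition = videoComposition
        session.outputURL = outputURL
        session.outputFileType = .mp4
        session.exportAsynchronously {
            switch session.status {
            case .completed:
                print("Exported to \(outputURL)")
            default:
                print("Export failed: \(String(describing: session.error))")
            }
        }
    }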