I'm using the `AVVideoComposition` API to get `CIImage`s from a local video, and after scaling down the `CIImage` I'm getting `nil` when trying to get the `CVPixelBuffer`. Before scaling down the source frame, I can get the original frame's `CVPixelBuffer` just fine. Is there any reason the buffer is `nil` after scaling down?
Sample:
```swift
AVVideoComposition(asset: asset) { [weak self] request in
    let source = request.sourceImage
    let pixelBuffer = source.pixelBuffer // returns a value
    let scaledDown = source.transformed(by: .init(scaleX: 0.5, y: 0.5))
    let scaledPixelBuffer // returns nil
})
```
I think the last line in your sample is incomplete. Did you mean `let scaledPixelBuffer = scaledDown.pixelBuffer`? If so, then yes, this won't work. The reason is that the `pixelBuffer` property is only available if the `CIImage` was created directly from a `CVPixelBuffer`. From the docs:

> If this image was created using the `init(cvPixelBuffer:)` initializer, this property’s value is the `CVPixelBuffer` object that provides the image’s underlying image data. […] Otherwise, this property’s value is `nil`.
The `CIImage` that is passed to the composition block was created from a pixel buffer provided by AVFoundation, so its `pixelBuffer` property is populated. But when you apply a filter or transform to it, the result is a new `CIImage` that is no longer backed by that buffer. To get a `CVPixelBuffer` for the transformed image, you need to render it into a pixel buffer explicitly using a `CIContext`.
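Here is a minimal sketch of that rendering step, assuming a 32BGRA output format; the helper name `makePixelBuffer(for:)` and the context setup are just illustrative:

```swift
import CoreImage
import CoreVideo

// Reuse one CIContext; creating a context per frame is expensive.
let ciContext = CIContext()

// Hypothetical helper: allocates a new CVPixelBuffer matching the
// image's extent and renders the CIImage into it.
// Assumes the image has a finite extent anchored at the origin.
func makePixelBuffer(for image: CIImage) -> CVPixelBuffer? {
    var buffer: CVPixelBuffer?
    let status = CVPixelBufferCreate(kCFAllocatorDefault,
                                     Int(image.extent.width),
                                     Int(image.extent.height),
                                     kCVPixelFormatType_32BGRA,
                                     nil, // no extra buffer attributes
                                     &buffer)
    guard status == kCVReturnSuccess, let buffer = buffer else { return nil }
    ciContext.render(image, to: buffer)
    return buffer
}
```

With something like this, `makePixelBuffer(for: scaledDown)` would give you a buffer for the scaled image.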
If you want to change the size of the video frames the composition is producing, you can use an `AVMutableVideoComposition` instead and set its `renderSize` to your desired size after it is initialized:
```swift
let composition = AVMutableVideoComposition(asset: asset) { … }
composition.renderSize = CGSize(width: 1280, height: 720)
```
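For completeness, a sketch of how the pieces might fit together (assuming `asset` is an `AVAsset` you already loaded); note the handler must hand its result back via `finish(with:context:)`:

```swift
import AVFoundation
import CoreImage

let composition = AVMutableVideoComposition(asset: asset) { request in
    // Scale the source frame down to half size.
    let scaled = request.sourceImage.transformed(by: CGAffineTransform(scaleX: 0.5, y: 0.5))
    // Hand the processed frame back to the composition.
    request.finish(with: scaled, context: nil)
}
// Render the composition's output at 720p.
composition.renderSize = CGSize(width: 1280, height: 720)
```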