I am trying to use an AVPlayerItem and AVPlayerItemVideoOutput to grab frame images of a video. Most examples I have seen attach a CADisplayLink and simply pass the display link's time to the itemTime(forHostTime:) function on AVPlayerItemVideoOutput, or they pass CACurrentMediaTime() instead.
let itemTime = playerItemVideoOutput.itemTime(forHostTime: CACurrentMediaTime())
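For context, the display-link pattern those examples use looks roughly like this (a minimal sketch; displayLinkFired is an illustrative name, and this assumes an NSObject subclass that owns a playerItemVideoOutput already added to the player item):

import AVFoundation
import UIKit

func startDisplayLink() {
    let displayLink = CADisplayLink(target: self, selector: #selector(displayLinkFired(_:)))
    displayLink.add(to: .main, forMode: .common)
}

@objc func displayLinkFired(_ link: CADisplayLink) {
    // Map the host clock's "now" onto the player item's timeline.
    let itemTime = playerItemVideoOutput.itemTime(forHostTime: CACurrentMediaTime())
    if playerItemVideoOutput.hasNewPixelBuffer(forItemTime: itemTime),
       let pixelBuffer = playerItemVideoOutput.copyPixelBuffer(forItemTime: itemTime, itemTimeForDisplay: nil) {
        // Process whichever frame corresponds to "now".
        let image = CIImage(cvPixelBuffer: pixelBuffer)
        _ = image
    }
}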
The problem with this, for me, is that I would like to grab a specific second of the video. In most functions I have seen, you can simply build a time and get that frame. For example:
CMTime(seconds: 10.0, preferredTimescale: 600) // 10 seconds
However, this is not the way AVPlayerItemVideoOutput works. It seems to calculate its own time from some reference point (probably from when it is added and available on the player). I need help understanding what time I need to pass to get a specific frame of the video.
// Get the item time so that I can grab a specific frame of the video
if playerItemVideoOutput.hasNewPixelBuffer(forItemTime: itemTime),
   let pixelBuffer = playerItemVideoOutput.copyPixelBuffer(forItemTime: itemTime, itemTimeForDisplay: nil) {
    let image = CIImage(cvPixelBuffer: pixelBuffer)
    // ... use image
}
Some might suggest using AVAssetImageGenerator instead, and it does work, but I have found the speed and efficiency of AVPlayerItemVideoOutput to be superior for grabbing the number of frames that I need.
You could try calling player.seek(to:) and then copyPixelBuffer with CACurrentMediaTime().
But I don't know how successful you're going to be if the player is playing from some point and you try to get pixel buffers from other points in the timeline.
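As a rough sketch of that idea (player and playerItemVideoOutput are assumed to be set up already, with the output attached to the current item):

import AVFoundation

let target = CMTime(seconds: 10.0, preferredTimescale: 600)
player.seek(to: target, toleranceBefore: .zero, toleranceAfter: .zero) { finished in
    guard finished else { return }
    // After the seek lands, ask the output what "now" is on the item's timeline.
    let itemTime = playerItemVideoOutput.itemTime(forHostTime: CACurrentMediaTime())
    if playerItemVideoOutput.hasNewPixelBuffer(forItemTime: itemTime),
       let pixelBuffer = playerItemVideoOutput.copyPixelBuffer(forItemTime: itemTime, itemTimeForDisplay: nil) {
        // If this works at all, this should be a frame near the 10-second mark.
        let image = CIImage(cvPixelBuffer: pixelBuffer)
        _ = image
    }
}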
I don't think AVPlayerItemVideoOutput is meant for truly "random access". It probably (reasonably) assumes that time will progress in the same way that it does for its AVPlayerItem.
Using an AVAssetReader with an AVAssetReaderTrackOutput, however, would let you seek relative to the asset's timeline by setting the reader's timeRange property.
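A sketch of what that could look like, assuming a local asset URL (error handling abbreviated, and the synchronous track-loading API used for brevity):

import AVFoundation
import CoreImage

func copyFrames(from url: URL, startingAt seconds: Double, duration: Double) throws -> [CIImage] {
    let asset = AVURLAsset(url: url)
    guard let track = asset.tracks(withMediaType: .video).first else { return [] }

    let reader = try AVAssetReader(asset: asset)
    let output = AVAssetReaderTrackOutput(
        track: track,
        outputSettings: [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
    )
    reader.add(output)

    // This is the "seek": restrict decoding to the slice of the timeline you want.
    // It must be set before startReading().
    reader.timeRange = CMTimeRange(
        start: CMTime(seconds: seconds, preferredTimescale: 600),
        duration: CMTime(seconds: duration, preferredTimescale: 600)
    )

    reader.startReading()
    var images: [CIImage] = []
    while let sampleBuffer = output.copyNextSampleBuffer() {
        if let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
            images.append(CIImage(cvPixelBuffer: pixelBuffer))
        }
    }
    return images
}

The reader decodes sequentially within that range, so this is well suited to pulling a run of consecutive frames starting at a given second, which sounds like your use case.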