How can I get the size of a video from AVPlayer so I can set the geometry size of my node?
For example, I have an SCNPlane with a width and height:
let planeGeo = SCNPlane(width: 5, height: 5)
So now I instantiate my video player
let videoURL = NSURL(string: someURL)
let player = AVPlayer(URL: videoURL!)
and my SKVideoNode
let spriteKitScene = SKScene(size: CGSize(width: 1920, height: 1080))
spriteKitScene.scaleMode = .AspectFit
videoSpriteKitNode = SKVideoNode(AVPlayer: player)
videoSpriteKitNode.anchorPoint = CGPointMake(0,0)
videoSpriteKitNode.size.width = spriteKitScene.size.width
videoSpriteKitNode.size.height = spriteKitScene.size.height
spriteKitScene.addChild(videoSpriteKitNode)
planeGeo.firstMaterial?.diffuse.contents = spriteKitScene
videoSpriteKitNode.play()
Now I want to use the video size to resize my plane to the correct aspect ratio. I already fiddled around with AVPlayerLayer, but it always gives me 0:
let avLayer = AVPlayerLayer(player: player)
print(avLayer.videoRect.width) //0
print(avLayer.videoRect.height) //0
I also tried this, but it doesn't work either:
let avLayer = AVPlayerLayer(player: player)
let layer = avLayer.sublayers![0]
let transformedBounds = CGRectApplyAffineTransform(layer.bounds, CATransform3DGetAffineTransform(layer.sublayerTransform))
print(transformedBounds.width) //0
print(transformedBounds.height) //0
OK, I figured it out: KVO is the way to go. In viewDidLoad, add:
player.currentItem?.addObserver(self, forKeyPath: "presentationSize", options: .New, context: nil)
in deinit:
player.currentItem?.removeObserver(self, forKeyPath: "presentationSize")
and then add:
override func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?, change: [String : AnyObject]?, context: UnsafeMutablePointer<Void>) {
    if keyPath == "presentationSize" {
        if let item = object as? AVPlayerItem {
            let size = item.presentationSize
            let width = size.width
            let height = size.height
            // Set the size of the geometry here
        }
    }
}
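For example, the "Set the size of the geometry here" step could resize the plane from the question (a minimal sketch, assuming the geometry is kept in a planeGeo property and you keep its width of 5):
if size.width > 0 && size.height > 0 {
    // Keep the plane's width; derive the height from the video's aspect ratio.
    planeGeo.width = 5
    planeGeo.height = 5 * (size.height / size.width)
}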
var py: AVPlayer?
private var pyContext = 0
...
guard let url = URL(string: "https:// .. /test.m4v") else { return }
py = AVPlayer(url: url)
someNode.geometry?.firstMaterial?.diffuse.contents = py
py?.currentItem?.addObserver(self,
                             forKeyPath: "presentationSize",
                             options: [.new],
                             context: &pyContext)
...
override func observeValue(forKeyPath keyPath: String?,
                           of object: Any?,
                           change: [NSKeyValueChangeKey: Any]?,
                           context: UnsafeMutableRawPointer?) {
    if context == &pyContext && keyPath == "presentationSize" {
        print("Found it ...")
        guard let item = object as? AVPlayerItem else { return }
        let ps = item.presentationSize
        let aspect: Float = Float(ps.width) / Float(ps.height)
        // Derive the scale factors from aspect; the calculation is shown below.
        someNode.geometry?.firstMaterial?.diffuse.contentsTransform =
            SCNMatrix4MakeScale( .. , .. , 1)
    }
}
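As a side note, if you can target Swift 4 / iOS 11 or later, the same observation can be written with block-based, key-path KVO, which gets rid of the string key path and the context pointer. A minimal sketch under the same assumptions (someNode is your SCNNode; VideoAttacher is a hypothetical wrapper class):
import AVFoundation
import SceneKit

final class VideoAttacher {
    // Keep the observation alive; it is invalidated when this object is deallocated.
    private var sizeObservation: NSKeyValueObservation?

    func attach(_ player: AVPlayer, to node: SCNNode) {
        node.geometry?.firstMaterial?.diffuse.contents = player
        sizeObservation = player.currentItem?.observe(\.presentationSize,
                                                      options: [.new]) { item, _ in
            let ps = item.presentationSize
            guard ps.width > 0, ps.height > 0 else { return } // size not known yet
            let aspect = Float(ps.width) / Float(ps.height)
            print("video aspect: \(aspect)")
            // Feed aspect into the contentsTransform calculation shown below.
        }
    }
}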
Unfortunately, any time you work with video, if the exact size of the streamed content is not guaranteed to be the same, shaping the video is a huge pain. There are many considerations, such as whether the content is letterboxed, and so on. In some simple cases the calculation looks like this:
// 1. You have finally received the info on the video from the HLS stream:
let ps = item.presentationSize
let vidAspect: Float = Float(ps.width) / Float(ps.height)
// 2. Over in your 3D code, you need to know the current facts about the mesh:
let planeW = ...  // width of your mesh
let planeH = ...  // height of your mesh
let planeAspect = planeW / planeH
It's possible you are using Apple's simple provided flat mesh, SCNPlane, such as:
let simplePlane = SCNPlane()
simplePlane.width = 2.783
simplePlane.height = 1.8723
One way or another, you need the width/height of your mesh. And then:
// 3. In many (not all) cases, the solution is:
let final = vidAspect / planeAspect
print("we'll try this: \(final)")
yourNode.geometry?.firstMaterial?.diffuse.contentsTransform =
    SCNMatrix4MakeScale(1.0, final, 1.0)
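To make the arithmetic concrete, here is a worked example with hypothetical numbers: a 16:9 stream playing on the SCNPlane above.
let vidAspect: Float = 1920.0 / 1080.0     // ≈ 1.778
let planeAspect: Float = 2.783 / 1.8723    // ≈ 1.486
let final = vidAspect / planeAspect        // ≈ 1.196, the Y scale passed in step 3
The closer final is to 1, the closer your mesh already matches the video's shape; a value far from 1 means the texture is being scaled heavily to compensate.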