I'm trying to use CoreMotion to correctly rotate a SceneKit camera. The scene I've built is rather simple: all I do is create a bunch of boxes distributed in an area, and the camera just points down the Z axis.
Unfortunately, the data coming back from device motion doesn't seem to relate to the device's physical position and orientation in any way; the camera just meanders randomly.
As suggested in this SO post, I'm passing the attitude's quaternion directly to the camera node's orientation property.
Am I misunderstanding what data Core Motion is giving me here? Shouldn't the attitude reflect the device's physical orientation? Or is it incremental movement, meaning I should be building on the prior orientation?
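Concretely, the update handler is doing roughly this (a sketch of my current code; motionManager and cameraNode are set up elsewhere in my view controller):

// Sketch of the quaternion approach described above.
// `cameraNode` is the SCNNode that holds my SCNCamera.
motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
motionManager.startDeviceMotionUpdatesToQueue(
    NSOperationQueue.mainQueue(),
    withHandler: { (motion: CMDeviceMotion!, error: NSError!) -> Void in
        // CMQuaternion and SCNQuaternion both expose x, y, z, w,
        // so the components are copied across directly.
        let q = motion.attitude.quaternion
        self.cameraNode.orientation = SCNQuaternion(
            x: Float(q.x), y: Float(q.y), z: Float(q.z), w: Float(q.w))
    })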
This snippet here might help you:
import SceneKit
import CoreMotion

// Keep the manager around (e.g. as a property); if it's deallocated, updates stop.
let motionManager = CMMotionManager()

// 60 Hz matches SceneKit's render loop.
motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
motionManager.startDeviceMotionUpdatesToQueue(
    NSOperationQueue.mainQueue(),
    withHandler: { (motion: CMDeviceMotion!, error: NSError!) -> Void in
        let currentAttitude = motion.attitude

        // Offset roll by 90 degrees (pi/2) for the landscape-right mapping.
        let roll = Float(currentAttitude.roll) + (0.5 * Float(M_PI))
        let yaw = Float(currentAttitude.yaw)
        let pitch = Float(currentAttitude.pitch)

        // Remap the device axes onto the camera's axes (landscape right).
        self.cameraNode.eulerAngles = SCNVector3(
            x: -roll,
            y: yaw,
            z: -pitch)
    })
This mapping is for the device in landscape right. You can play around with different orientations by changing the + and - signs.
Import CoreMotion.
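If it helps, here is a minimal sketch of how this might be wired into a view controller; the class name GameViewController and the scene-building step are placeholders for your own setup, not anything required by the API:

import UIKit
import SceneKit
import CoreMotion

class GameViewController: UIViewController {

    // Held as properties so they outlive viewDidLoad.
    let motionManager = CMMotionManager()
    let cameraNode = SCNNode()

    override func viewDidLoad() {
        super.viewDidLoad()
        // ... build the scene, attach an SCNCamera to cameraNode, add it to the scene ...

        if motionManager.deviceMotionAvailable {
            motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
            motionManager.startDeviceMotionUpdatesToQueue(
                NSOperationQueue.mainQueue(),
                withHandler: { (motion: CMDeviceMotion!, error: NSError!) -> Void in
                    let attitude = motion.attitude
                    let roll = Float(attitude.roll) + (0.5 * Float(M_PI))
                    self.cameraNode.eulerAngles = SCNVector3(
                        x: -roll,
                        y: Float(attitude.yaw),
                        z: -Float(attitude.pitch))
                })
        }
    }

    override func viewWillDisappear(animated: Bool) {
        super.viewWillDisappear(animated)
        // Stop updates when the view goes away.
        motionManager.stopDeviceMotionUpdates()
    }
}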