ios, swift, matrix, scenekit, core-motion

How to cast a CMRotationMatrix from CoreMotion to be used by a SceneKit camera (or any SCNNode)


I'm trying to use the attitude given by CMHeadphoneMotionManager to guide a camera inside an SCNView. If I'm not mistaken, they use different reference systems, so directly initialising a float4x4 transform from the rotation matrix would not work without some permutation or change (the axes do not match between CoreMotion and SceneKit).
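
For reference, here is a minimal sketch of how I get the matrix in the first place (the makeFloat4x4 helper is just a name I'm using here, and it assumes the CMRotationMatrix fields are laid out as m<row><column>, so each simd_float4 below is one column of the resulting matrix):

    import CoreMotion
    import simd

    let motionManager = CMHeadphoneMotionManager()

    // Pack the 3x3 CMRotationMatrix into a 4x4 simd matrix.
    // Assumes m<row><column> layout, so each simd_float4 is one column.
    func makeFloat4x4(_ r: CMRotationMatrix) -> simd_float4x4 {
        simd_float4x4([
            simd_float4(Float(r.m11), Float(r.m21), Float(r.m31), 0.0),
            simd_float4(Float(r.m12), Float(r.m22), Float(r.m32), 0.0),
            simd_float4(Float(r.m13), Float(r.m23), Float(r.m33), 0.0),
            simd_float4(0.0, 0.0, 0.0, 1.0)
        ])
    }

    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let attitude = motion?.attitude else { return }
        let rotation = makeFloat4x4(attitude.rotationMatrix)
        // Using `rotation` directly as the camera transform is what does not
        // work, because the two reference frames differ.
    }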

To add some context (and I might well be wrong here, because I couldn't find the exact reference system documented and had to run some tests to find out): the coordinate system used as reference by the attitude given by your AirPods (or other motion-enabled headphones) has positive Y pointing forward from your nose, positive Z pointing up through your head against gravity, and positive X pointing right (relative to an arbitrary heading picked when you start capturing motion):

[diagram of the CoreMotion headphone reference frame]

However, the reference coordinate system for SceneKit has positive Y pointing upwards and positive Z pointing backwards (assuming you are the camera, which looks towards negative Z). The X axis seems to be the same:

[diagram of the SceneKit camera reference frame]

My linear algebra knowledge at this level is somewhat limited, and even though I've been trying for a few days, I don't know how to convert the rotation matrix given by CMHeadphoneMotionManager so it can be used as the transform of a SceneKit camera. That is the question. (Ideally, paired with the concepts behind the required permutation of columns, so I can learn how it's done.)

Also, I would like to avoid using eulerAngles or quaternions at this point.


Solution

  • After digging a bit more into matrices, I came up with a solution. I'm not sure if it's the right way, nor of the reasoning behind it (if someone knows and wants to elaborate, please feel free).

    Basically, assuming the schemes I drew of both coordinate systems are correct (especially the CoreMotion one), we can calculate a rotation matrix T that relates the two systems; that is, a matrix that would turn one system into the other.

    By looking at those schemes, we can see that the SceneKit coordinate system is 90° apart (a rotation about the X axis) from the CoreMotion coordinate system, and the rotation matrix T would be as follows (in Swift, column by column):

    // T: a 90° rotation about the X axis, given column by column.
    let matrixT = simd_float4x4([
        simd_float4(1.0,  0.0, 0.0, 0.0),
        simd_float4(0.0,  0.0, 1.0, 0.0),
        simd_float4(0.0, -1.0, 0.0, 0.0),
        simd_float4(0.0,  0.0, 0.0, 1.0)
    ])
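
    As a quick sanity check of what matrixT does (just my own verification, nothing more), multiplying it by the basis vectors shows that X stays put, Y goes to Z, and Z goes to -Y, which is indeed a 90° rotation about X:

    import simd

    // Where does matrixT send each basis vector?
    let xAxis = simd_float4(1, 0, 0, 0)
    let yAxis = simd_float4(0, 1, 0, 0)
    let zAxis = simd_float4(0, 0, 1, 0)

    print(matrixT * xAxis)   // SIMD4<Float>(1.0, 0.0, 0.0, 0.0)  -> X stays X
    print(matrixT * yAxis)   // SIMD4<Float>(0.0, 0.0, 1.0, 0.0)  -> Y goes to Z
    print(matrixT * zAxis)   // SIMD4<Float>(0.0, -1.0, 0.0, 0.0) -> Z goes to -Y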
    

    Knowing this, we can obtain R', which is just the initial rotation matrix R expressed in the target coordinate system, with a change of basis (and this is the key):

    R' = T^-1 * R * T

    Which, in Swift, would be:

    let newRotation = matrixT.inverse * originalRotationMatrix * matrixT
    

    Just like that, newRotation can be used as the transform of the SceneKit camera node, and it works.
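
    To show how everything might fit together, here is a rough sketch of the full update path (cameraNode is my own node, makeFloat4x4 is a helper that packs the CMRotationMatrix into a simd_float4x4 column by column, and matrixT is the matrix defined above; none of these names are SceneKit or CoreMotion API):

    import SceneKit
    import CoreMotion
    import simd

    // Assumed setup: `makeFloat4x4` packs a CMRotationMatrix into a 4x4 simd
    // matrix, and `matrixT` is the 90° rotation defined above.
    let cameraNode = SCNNode()
    cameraNode.camera = SCNCamera()

    let motionManager = CMHeadphoneMotionManager()
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let attitude = motion?.attitude else { return }
        // Original attitude, packed into a 4x4 matrix.
        let originalRotationMatrix = makeFloat4x4(attitude.rotationMatrix)
        // Re-express it in SceneKit's coordinate system and drive the camera.
        cameraNode.simdTransform = matrixT.inverse * originalRotationMatrix * matrixT
    }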