I am using a WorldTrackingProvider in a visionOS application:
let session = ARKitSession()
let worldInfo = WorldTrackingProvider()
...
try? await session.run([worldInfo])
so that I can query the position of the device using queryDeviceAnchor on each frame:
guard let pose = worldInfo.queryDeviceAnchor(atTimestamp: CACurrentMediaTime()) else {
    return
}
let toDeviceTransform = pose.originFromAnchorTransform
let devicePosition = toDeviceTransform.translation
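(Here .translation is a convenience helper, not a built-in member of simd_float4x4; roughly something like this:)

import simd

// Assumed convenience extension: the device position is the last column
// of the 4x4 origin-from-anchor transform.
extension simd_float4x4 {
    var translation: SIMD3<Float> {
        SIMD3<Float>(columns.3.x, columns.3.y, columns.3.z)
    }
}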
This works great, but I realize I don't understand what point on the actual device this represents for a stereo device. The left eye, a notional middle eye, somewhere in between the two? I assume it's the middle point, with the cameras offset from it for stereo. Anyone know?
I believe that AnchorEntity(.head), as well as DeviceAnchor, are tethered to the same point: the middle of an imaginary line between the left and right main cameras. If you attach a 1 cm sphere to the head anchor in the visionOS simulator, it will only become visible to you at a distance of 14+ cm from the cameras (I don't know what the distance is for a real device).
RealityView { rvc in
    // 1 cm diameter sphere (the radius is in meters)
    let sphere = ModelEntity(mesh: .generateSphere(radius: 0.005))
    // offset 15 cm in front of the head anchor
    sphere.position.z = -0.15
    let anchor = AnchorEntity(.head)
    anchor.addChild(sphere)
    rvc.add(anchor)
}
A similar behavior pattern can be observed with ARFaceAnchor: face tracking only starts working once the user's face is at least 10 cm away from the selfie camera, which comes down to the focusing of the TrueDepth sensor's IR dot pattern. In the visionOS simulator the object is blocked artificially, possibly using OcclusionMaterial or the opacity property.
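To illustrate what that kind of blocking could look like (this is pure speculation about the simulator's internals, not a documented mechanism), you could hide a head-anchored entity yourself with RealityKit's OpacityComponent once its local offset from the anchor drops below some threshold:

import RealityKit
import simd

// Speculative sketch: hide an entity parented to AnchorEntity(.head) when it
// sits closer than a threshold to the anchor. The 0.14 m default is just a
// guess based on the observation above.
func applyNearFade(to entity: Entity, minimumDistance: Float = 0.14) {
    let distanceFromHead = simd_length(entity.position)  // offset relative to the head anchor
    let opacity: Float = distanceFromHead < minimumDistance ? 0 : 1
    entity.components.set(OpacityComponent(opacity: opacity))
}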
By the way, when working with stereoscopic or spatial imagery (a.k.a. Multiview HEVC), it seems to me that the anchor's location should be exactly where it is: in the middle of the aforementioned imaginary line. This becomes logical once you start controlling disparity.
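As a rough numerical illustration (the helper and the 63 mm value below are my own, not from Apple documentation): with the anchor sitting at the midpoint, both cameras are offset by ±ipd/2 from it, so the convergence angle that drives disparity stays symmetric around that point.

import Foundation

// Sketch: convergence angle (radians) for a point straight ahead at distance z,
// with the two cameras placed symmetrically about the anchor point.
func convergenceAngle(ipd: Float, distance z: Float) -> Float {
    2 * atan((ipd / 2) / z)
}

// Example: ipd = 63 mm, object 1 m away -> roughly 0.063 rad (~3.6 degrees).
let angle = convergenceAngle(ipd: 0.063, distance: 1.0)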