I have created an augmented reality app with Xcode 15 and Reality Composer that lets a person use their iPhone to virtually try on glasses before purchasing them. The key to the project is using Reality Composer to define a face anchor and position 3D objects from a .usdz file; the iPhone's face tracking does the rest. However, in Xcode 16 the project no longer compiles, and Reality Composer Pro doesn't even allow setting a target device other than Vision Pro.
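For context, the Xcode 15 version relied on the Swift code Xcode generates from the .rcproject, roughly along these lines (Experience and Glasses stand in for my actual project and scene names):

import SwiftUI
import RealityKit

struct LegacyARFaceView: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        // Xcode 15 generated this loader from the Reality Composer project;
        // the returned scene already carries the face anchoring, and ARView
        // configures the face-tracking session automatically.
        let glassesScene = try! Experience.loadGlasses()
        arView.scene.anchors.append(glassesScene)
        return arView
    }

    func updateUIView(_ view: ARView, context: Context) { }
}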
Obviously, Reality Composer Pro was made to boost sales of Vision Pro, but it's quite possible that in the near future it will gain all the functionality that was in the deprecated Reality Composer app. Since the latest version of RCP still only supports tracking the user's head (the AnchorEntity(.head) anchor) and has no face tracking (the AnchorEntity(.face) anchor), you can implement the face-tracking functionality programmatically.
Here's the code that lets you try on virtual glasses on a tracked face in an iOS 18 SwiftUI app:
import SwiftUI
import RealityKit
import ARKit

struct ContentView : View {
    var body: some View {
        ARFaceView()
    }
}

struct ARFaceView : UIViewRepresentable {
    let anchor = AnchorEntity(.face)                        // anchor tied to the tracked face
    let glasses = try! Entity.loadModel(named: "glasses")   // glasses.usdz in the app bundle
    let arView = ARView(frame: .zero)

    init() {
        // Activate the selfie (TrueDepth) camera with face tracking
        arView.automaticallyConfigureSession = false
        let config = ARFaceTrackingConfiguration()
        arView.session.run(config)
    }

    func makeUIView(context: Context) -> ARView {
        // Fit the model to the face and attach it to the face anchor
        glasses.scale /= 1.1
        glasses.position.y = -0.05
        anchor.addChild(glasses)
        arView.scene.anchors.append(anchor)
        return arView
    }

    func updateUIView(_ view: ARView, context: Context) { }
}
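Two practical notes: the app's Info.plist must contain an NSCameraUsageDescription entry, and face tracking only works on devices with a TrueDepth camera. If you want to guard against unsupported devices, a minimal check placed inside init() where the session is started could look like this:

// Run the face-tracking session only on devices with a TrueDepth camera.
if ARFaceTrackingConfiguration.isSupported {
    let config = ARFaceTrackingConfiguration()
    arView.session.run(config)
} else {
    print("Face tracking is not supported on this device")
}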