Tags: swiftui, augmented-reality, virtual-reality, realitykit, realityview

How to make AR work in a stereo RealityView on iOS 18 using SwiftUI


I have an issue using RealityView to show two side-by-side AR views. I did get this working as a non-AR scene, but now my code isn't working.

It also works with Storyboard and Swift using SceneKit, so why isn't it working in RealityView?

The goal is to see a cube on both sides in augmented reality on the iPhone, to create an AR/VR (XR) view for use with a VR Box headset. This works very well using SceneKit with Swift, but here only the left side shows AR; the right side doesn't move and appears a bit small.


Here is my code:

import SwiftUI
import RealityKit


struct ContentView : View {
    var body: some View {
        HStack (spacing: 0){
            MainView()
            MainView()
        }
        .background(.black)
    }
}

struct MainView : View {
    @State var anchor = AnchorEntity()

    var body: some View {
        RealityView { content in
            let item = ModelEntity(mesh: .generateBox(size: 0.2), materials: [SimpleMaterial()])

            content.camera = .spatialTracking
            anchor.addChild(item)
            anchor.position = [0.0, 0.0, -1.0]
            anchor.orientation = .init(angle: .pi/4, axis: [0, 1, 1])

            // Add the anchor (with the cube attached) to the scene
            content.add(anchor)
        }
    }
}

Solution

  • AR Stereo view in RealityKit for iOS 18

    Since SwiftUI doesn't let you mirror a single AR-mode RealityView, and you can't create two separate RealityViews with simultaneous (in-sync) world or planar tracking (you cannot run two AR sessions at once), all you can do is attach your primitives to the AR camera's position, because the camera's coordinates are known in every frame even without world tracking. All you need to do is give each AnchorEntity the .camera target. However, one small issue awaits you here: in the right view the cube appears closer to the camera than in the left view. I don't know whether this is a bug or not...

    import SwiftUI
    import RealityKit
    
    struct Controller : View {
        @State var position: UnitPoint = .zero
        @Binding var pose: SIMD3<Float>
        let anchor: AnchorEntity
        
        // Convert the 2D drag translation (screen points) into metres and
        // move the camera-anchored box in its local X/Y plane.
        var dragGesture: some Gesture {
            DragGesture(minimumDistance: 15, coordinateSpace: .global)
                .onChanged {
                    anchor.position.x = Float($0.translation.width +
                                                 position.x) * 0.005
                    anchor.position.y = Float($0.translation.height +
                                                 position.y) * -0.005
                    pose = anchor.position
                }
                .onEnded {
                    position.x += $0.translation.width
                    position.y += $0.translation.height
                }
        }
        var body: some View {
            RealityView { rvc in
                rvc.camera = .spatialTracking
                let box = ModelEntity(mesh: .generateBox(size: 0.25))
                anchor.addChild(box)
                rvc.add(anchor)
                
                // Offset relative to the camera: one metre in front of it.
                anchor.position.z = -1.0
                anchor.orientation = .init(angle: .pi/4, axis:[0,1,1])
            }
            .gesture(dragGesture)
        }
    }
    

    struct ContentView : View {
        @State var pose: SIMD3<Float> = .zero
        let anchor1 = AnchorEntity(.camera)
        let anchor2 = AnchorEntity(.camera)
    
        var body: some View {
            HStack(spacing: 0) {
                VStack {
                    Text(anchor1.position.description)
                        .font(.subheadline)
                        .padding(20)
                    Controller(pose: $pose, anchor: anchor1)
                        .onChange(of: pose) { (_, new) in
                            anchor2.position = SIMD3<Float>(new)
                        }
                }
                VStack {
                    Text(anchor2.position.description)
                        .font(.subheadline)
                        .padding(20)
                    Controller(pose: $pose, anchor: anchor2)
                        .onChange(of: pose) { (_, new) in
                            anchor1.position = SIMD3<Float>(new)
                        }
                }
            }
        }
    }
    

    P.S.

    The problem is that the tracking configuration in ARKit's brand-new API for RealityView is launched globally and is in no way tied to the view itself (unlike with ARView).

    let session = SpatialTrackingSession()
    
    let config = SpatialTrackingSession.Configuration(
        tracking: [.world],
        sceneUnderstanding: [.occlusion],
        camera: .back
    )
    Task {
        await session.run(config)
    }
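
    As an illustration only (this StereoRoot wrapper is my own sketch, not part of the original answer), the shared session could be started once in the parent view, for example in a .task modifier, so that both camera-anchored RealityViews rely on the same globally running configuration:

    import SwiftUI
    import RealityKit
    
    struct StereoRoot : View {
        // Hypothetical parent view: starts the global SpatialTrackingSession once
        // and shows the two camera-anchored views side by side.
        @State var session = SpatialTrackingSession()
        @State var pose: SIMD3<Float> = .zero
        let anchor1 = AnchorEntity(.camera)
        let anchor2 = AnchorEntity(.camera)
    
        var body: some View {
            HStack(spacing: 0) {
                // Cross-syncing of the two anchors is omitted here; see ContentView above.
                Controller(pose: $pose, anchor: anchor1)
                Controller(pose: $pose, anchor: anchor2)
            }
            .task {
                // The configuration is global; it isn't tied to either RealityView.
                let config = SpatialTrackingSession.Configuration(
                    tracking: [.world],
                    sceneUnderstanding: [.occlusion],
                    camera: .back
                )
                _ = await session.run(config)
            }
        }
    }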
    

    Solution – ARView in SwiftUI

    You can also realize your idea using two ARViews because, as I said earlier, a world-tracking configuration can only run when a single session is shared by both views. In this case you'll have to implement a UIGestureRecognizer inside the ARContainer structure if you want gestures (a minimal sketch of that is included at the end of this answer).

    import SwiftUI
    import RealityKit
    import ARKit
    
    struct ARContainer : UIViewRepresentable {
        @State var pose: SIMD3<Float> = [0,-0.5,-1.5]
        let arView = ARView(frame: .zero)
        let worldAnchor: AnchorEntity
        let session: ARSession
    
        func makeUIView(context: Context) -> ARView {
            arView.cameraMode = .ar
            // Both containers are driven by one shared ARSession, so only a
            // single world-tracking configuration is ever running.
            let config = ARWorldTrackingConfiguration()
            arView.session = session
            session.run(config)
            
            let box = ModelEntity(mesh: .generateBox(size: 0.25))
            worldAnchor.addChild(box)
            arView.scene.anchors.append(worldAnchor)
            return arView
        }
        func updateUIView(_ arView: ARView, context: Context) {
            // Apply the pose after a short delay (3 s after the view update).
            DispatchQueue.main.asyncAfter(deadline: .now() + 3.0) {
                worldAnchor.position = pose
            }
        }
    }
    

    struct ContentView : View {
        let anchor1 = AnchorEntity(world: [0, 0,-1])
        let anchor2 = AnchorEntity(world: [0, 0,-1])
        let session = ARSession()
    
        var body: some View {
            HStack(spacing: 0) {
                VStack {
                    Text(anchor1.position.description)
                        .font(.subheadline)
                        .padding(20)
                    ARContainer(worldAnchor: anchor1, session: session)
                }
                VStack {
                    Text(anchor2.position.description)
                        .font(.subheadline)
                        .padding(20)
                    ARContainer(worldAnchor: anchor2, session: session)
                }
            }
        }
    }
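
    P.P.S.

    If you want gestures in the ARView version, here's a minimal sketch of how a UIPanGestureRecognizer could be wired into the container through a Coordinator. The GestureARContainer name and the handlePan selector are my own illustrative assumptions, not part of the original answer.

    import SwiftUI
    import RealityKit
    import ARKit
    import UIKit
    
    struct GestureARContainer : UIViewRepresentable {
        let arView = ARView(frame: .zero)
        let worldAnchor: AnchorEntity
        let session: ARSession
    
        final class Coordinator : NSObject {
            let worldAnchor: AnchorEntity
            init(worldAnchor: AnchorEntity) { self.worldAnchor = worldAnchor }
    
            // Drags the anchored box in the X/Y plane of world space.
            @objc func handlePan(_ recognizer: UIPanGestureRecognizer) {
                let translation = recognizer.translation(in: recognizer.view)
                worldAnchor.position.x += Float(translation.x) * 0.005
                worldAnchor.position.y -= Float(translation.y) * 0.005
                recognizer.setTranslation(.zero, in: recognizer.view)
            }
        }
    
        func makeCoordinator() -> Coordinator {
            Coordinator(worldAnchor: worldAnchor)
        }
    
        func makeUIView(context: Context) -> ARView {
            arView.cameraMode = .ar
            arView.session = session
            session.run(ARWorldTrackingConfiguration())
    
            let box = ModelEntity(mesh: .generateBox(size: 0.25))
            worldAnchor.addChild(box)
            arView.scene.anchors.append(worldAnchor)
    
            // Route pan gestures to the coordinator.
            let pan = UIPanGestureRecognizer(target: context.coordinator,
                                             action: #selector(Coordinator.handlePan))
            arView.addGestureRecognizer(pan)
            return arView
        }
    
        func updateUIView(_ arView: ARView, context: Context) { }
    }

    A variant like this takes the same worldAnchor and session parameters, so it should drop into the ContentView above in place of ARContainer.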