Tags: xcode, sprite-kit, overlay, realitykit, hud

How to manage interaction in 2D and 3D within a RealityKit game


My game is utilising RealityKit to render a lot of 3D content. In the foreground I want to place a SpriteKit HUD that will contain buttons, and at times, complete overlays of the 3D content.

Right now, I have something similar to this, having started from the Xcode-generated Game template code for ContentView:

var body: some View {
    ZStack {
        RealityView { content in
            // If iOS device that is not the simulator,
            // use the spatial tracking camera.
            #if os(iOS) && !targetEnvironment(simulator)
            content.camera = .spatialTracking
            #endif
            createGameScene(content)
        }.gesture(tapEntityGesture)
        // When this app runs on macOS or iOS simulator,
        // add camera controls that orbit the origin.
        #if os(macOS) || (os(iOS) && targetEnvironment(simulator))
        // noting that this is only here for debugging purposes.
        .realityViewCameraControls(.orbit)
        #endif

        let hudScene = HUDScene(size: mainWindowSize)
        
        SpriteView(scene: hudScene, options: [.allowsTransparency])
        
        // this following line either allows the HUD to receive events (true), or
        // the RealityView to receive Gestures.  How can we enable both at the same
        // time so that SpriteKit SKNodes within the HUD node tree can receive and
        // respond to touches as well as letting RealityKit handle gestures when
        // the HUD ignores the interaction?
        //
            .allowsHitTesting(true)
    }
}

What I'm finding is that it seems impossible to have a SpriteView overlay the RealityView while supporting user interaction in both.

Either the SpriteView gets everything, or the RealityView gets everything.

Is there some way to support a HUD this way?

In the previous version of this app I was using SceneKit and was able to simply let the HUD manage everything and talk to the SceneKit SCNView to check for interaction in the 3D space. With RealityKit, I'm not sure how I can do that.
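
For reference, this is roughly the kind of forwarding that worked in the SceneKit version (a minimal sketch; scnView and the function name are illustrative, not my actual code):

import SceneKit

// Sketch of the SceneKit-era approach: the HUD held a reference to the
// SCNView and asked it what sat under a 2D point before deciding whether
// to consume the touch itself.
func forwardTapToSceneKit(_ point: CGPoint, in scnView: SCNView) {
    let hits = scnView.hitTest(point, options: nil)
    if let firstHit = hits.first {
        // A 3D node sits under the tap, so hand it to the game logic.
        print("Tapped 3D node:", firstHit.node.name ?? "<unnamed>")
    }
}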

I do not want to use SwiftUI at all for the HUD (or any 2D interaction) as it's just a bad pattern for this (slow, unresponsive, laggy, and not very versatile or mature). To be honest, if I could get Xcode to generate a non-SwiftUI container for the main body of the RealityKit app, I'd do away with SwiftUI entirely.


Solution

  • After spending ages trying to get this working by setting .allowsHitTesting(true) and letting the SpriteView's children manage all interaction and feed it back to the RealityView when needed, I decided it just wasn't possible. RealityKit doesn't really want to play nicely with anything else.

    So what I did was create a simple ApplicationModel:

    import Combine

    /// Shared application state that decides which layer owns user interaction.
    public final class ApplicationModel: ObservableObject {

        /// When true, the SpriteKit HUD receives hit testing; when false, the RealityView does.
        @Published var hudInControl: Bool

        init() {
            self.hudInControl = false
        }

        static let shared: ApplicationModel = ApplicationModel()
    }
    

    and then in the ContentView do this:

    struct ContentView: View {
        
        @Environment(\.mainWindowSize) var mainWindowSize
    
        @StateObject var appModel: ApplicationModel = .shared
    
        var body: some View {
            ZStack {
                RealityView { content in
                    // If iOS device that is not the simulator,
                    // use the spatial tracking camera.
                    #if os(iOS) && !targetEnvironment(simulator)
                    content.camera = .spatialTracking
                    #endif
                    createGameScene(content)
                }.gesture(tapEntityGesture)
                // When this app runs on macOS or iOS simulator,
                // add camera controls that orbit the origin.
                #if os(macOS) || (os(iOS) && targetEnvironment(simulator))
                .realityViewCameraControls(.orbit)
                #endif
    
                let hudScene = HUDScene(size: mainWindowSize)
                
                SpriteView(scene: hudScene, options: [.allowsTransparency])
                
                // allowsHitTesting now follows appModel.hudInControl:
                // when it is true, the SpriteKit HUD receives touches;
                // when it is false, the RealityView receives gestures.
                .allowsHitTesting(appModel.hudInControl)
            }
        }
    }
    

    This then gives the app control over whether RealityKit or SpriteKit gets the user interaction events. When the app starts, interaction goes to the RealityKit environment by default.

    When the user triggers something that gives control to the 2D environment, appModel.hudInControl is set to true and it just works (a sketch of this is below).
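
    As an example of flipping that flag, the HUD can do it directly from its own touch handling. A minimal sketch, showing only the relevant method of HUDScene (iOS touch path only; the "menuButton" node name is hypothetical):

    import SpriteKit

    class HUDScene: SKScene {

        // Only the touch handling relevant to taking control is shown here.
        override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
            guard let touch = touches.first else { return }
            let location = touch.location(in: self)
            if nodes(at: location).contains(where: { $0.name == "menuButton" }) {
                // Hand interaction to the 2D layer: the SpriteView's
                // .allowsHitTesting(appModel.hudInControl) now evaluates to true,
                // so the HUD keeps receiving touches until the flag is reset.
                ApplicationModel.shared.hudInControl = true
            }
        }
    }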

    For those situations where I have a HUD-based button that should respond to taps even when the HUD is not in control, I offer the tap to the HUD first inside the tapEntityGesture handler; if the HUD does not consume it, I then handle it as needed within the RealityView, as sketched below.
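
    A rough sketch of that ordering, assuming tapEntityGesture is built with SpatialTapGesture().targetedToAnyEntity() (iOS 18 / macOS 15). consumeTap(atViewPoint:) and handleEntityTap are hypothetical helpers, and the HUD scene needs to be reachable from the gesture (e.g. kept on the model) rather than created as a local in body:

    var tapEntityGesture: some Gesture {
        SpatialTapGesture()
            .targetedToAnyEntity()
            .onEnded { value in
                // Give the HUD first refusal, even when it is not "in control".
                if hudScene.consumeTap(atViewPoint: value.location) {
                    return
                }
                // The HUD ignored the tap, so treat it as a 3D interaction
                // on the entity the gesture targeted.
                handleEntityTap(value.entity)
            }
    }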