Tags: swiftui, scenekit, arkit, realitykit

How to get points on the surface of a 3D object in RealityKit and connect them with a line


I'm currently working on a project where you can load a USDZ file into RealityKit. I'm relatively new to this API. What I want is to show a preview of a USDZ object, select some points on its surface, and connect those points with a line. To achieve that, I need to solve a few problems:

  1. Firstly, how do I load an object in a plain white view (not added to the real-world AR camera feed)?
  2. Secondly, how can I select points on the surface of the 3D object via gesture recognition?
  3. How do I connect those points with a line?

This is my current code:


struct ContentView: View {
    var body: some View {
        ARViewContainer()
            .edgesIgnoringSafeArea(.all)
    }
}

struct ARViewContainer: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        context.coordinator.arView = arView
        
        // Load the USDZ model
        let modelEntity = try! ModelEntity.loadModel(named: "yourModel.usdz")
        
        // Create an anchor to hold the model
        let anchorEntity = AnchorEntity(world: [0, 0, 0])
        anchorEntity.addChild(modelEntity)
        
        // Add the anchor to the ARView
        arView.scene.addAnchor(anchorEntity)
        
        // Configure the background color to white (simulates a plane white view)
        arView.environment.background = .color(.white)
        
        // Setup gesture recognizer
        context.coordinator.setupGestureRecognition()
        
        return arView
    }
    
    func updateUIView(_ uiView: ARView, context: Context) {}
    
    func makeCoordinator() -> Coordinator {
        return Coordinator()
    }
}

class Coordinator: NSObject {
    var arView: ARView?
    var selectedPoints: [SIMD3<Float>] = []
    
    override init() {
        super.init()
    }
    
    func setupGestureRecognition() {
        guard let arView = arView else { return }
        let tapGesture = UITapGestureRecognizer(target: self, action: #selector(handleTap(_:)))
        arView.addGestureRecognizer(tapGesture)
    }
    
    @objc private func handleTap(_ gesture: UITapGestureRecognizer) {
        guard let arView = arView else { return }
        let location = gesture.location(in: arView)
        
        // Perform a hit test to find where the user tapped on the model
        let results = arView.hitTest(location)
        
        if let firstResult = results.first {
            let position = firstResult.worldTransform.translation
            selectedPoints.append(position)
            addSphere(at: position)
            
            if selectedPoints.count > 1 {
                drawLine(from: selectedPoints[selectedPoints.count - 2], to: position)
            }
        }
    }
    
    private func addSphere(at position: SIMD3<Float>) {
        guard let arView = arView else { return }
        
        let sphere = MeshResource.generateSphere(radius: 0.01)
        let material = SimpleMaterial(color: .red, isMetallic: false)
        let sphereEntity = ModelEntity(mesh: sphere, materials: [material])
        sphereEntity.position = position
        
        let anchorEntity = AnchorEntity(world: position)
        anchorEntity.addChild(sphereEntity)
        arView.scene.addAnchor(anchorEntity)
    }
    
    private func drawLine(from start: SIMD3<Float>, to end: SIMD3<Float>) {
        guard let arView = arView else { return }
        
        let vertices: [SIMD3<Float>] = [start, end]
        let indices: [UInt32] = [0, 1]
        let lineMesh = MeshResource.generate(from: vertices, indices: indices)
        
        let material = SimpleMaterial(color: .blue, isMetallic: false)
        let lineEntity = ModelEntity(mesh: lineMesh, materials: [material])
        
        let anchorEntity = AnchorEntity(world: [0, 0, 0])
        anchorEntity.addChild(lineEntity)
        arView.scene.addAnchor(anchorEntity)
    }
}

extension matrix_float4x4 {
    /// Extracts the translation vector (position) from the 4x4 matrix
    var translation: SIMD3<Float> {
        return SIMD3<Float>(columns.3.x, columns.3.y, columns.3.z)
    }
}

I appreciate your time!


Solution

  • Your current code and logic look good; only a few changes are needed to achieve your goal. Let's break it down:

    1. NonAR mode with white background

    To run your RealityKit scene without AR functionality, you can initialize ARView with the .nonAR camera mode. This allows you to use RealityKit as a virtual scene:

    let arView = ARView(
        frame: .zero, 
        cameraMode: .nonAR,
        automaticallyConfigureSession: false)
    

    With arView in nonAR mode, you should be able to set the background color using your current approach:

    arView.environment.background = .color(.white)
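
    Since automaticallyConfigureSession is disabled and no AR camera drives the view, you may also want to add a virtual camera so you can frame the model explicitly. A minimal sketch (the camera position and look-at target are placeholder values you will want to adapt to your model's size):

    ```swift
    // Sketch: add a virtual perspective camera to the nonAR scene.
    let cameraEntity = PerspectiveCamera()
    cameraEntity.position = [0, 0.2, 0.5]  // placeholder position
    cameraEntity.look(at: .zero, from: cameraEntity.position, relativeTo: nil)

    let cameraAnchor = AnchorEntity(world: .zero)
    cameraAnchor.addChild(cameraEntity)
    arView.scene.addAnchor(cameraAnchor)
    ```

    Without an explicit camera, ARView in .nonAR mode falls back to a default virtual camera, so this step is optional.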
    

    2. Select points on model surface with tap gesture

    To be able to select points on the surface of the mesh you need to add collision shapes to the imported model.

    The quality of this feature depends on the complexity of the mesh geometry and the ability to represent that geometry using collision shapes. Adding accurate collision shapes is straightforward for primitive shapes but can be challenging for complex meshes.

    You can generate primitive or convex shapes using ShapeResource. From iOS 18, you can also attempt to generate a per-face static collision shape using generateStaticMesh(from:).

    For example, you can generate a convex shape from the imported mesh in makeUIView and add it to modelEntity with a new collision component:

    if let modelComponent = modelEntity.model {
        let convex = ShapeResource.generateConvex(from: modelComponent.mesh)
        modelEntity.collision = .init(shapes: [convex])
    }
    
    // Visualize the collision shapes: 
    arView.debugOptions.insert(.showPhysics) 
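
    On iOS 18 and later, where a convex hull is too coarse for your mesh, you could instead try the per-face static shape. A sketch, assuming the async generateStaticMesh(from:) API (it must be awaited, so it runs in a Task here):

    ```swift
    // Sketch: generate an exact, static (non-moving) collision shape (iOS 18+).
    if let modelComponent = modelEntity.model {
        Task {
            do {
                let staticShape = try await ShapeResource.generateStaticMesh(
                    from: modelComponent.mesh)
                modelEntity.collision = CollisionComponent(shapes: [staticShape])
            } catch {
                print("Failed to generate static collision mesh: \(error)")
            }
        }
    }
    ```

    Note that a static mesh shape only supports hit testing against a non-moving entity, which fits this use case.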
    

    To use the position of the hit test in your handleTap function, you can return results as [CollisionCastHit] and extract the position property of the first element:

    let results: [CollisionCastHit] = arView.hitTest(location)
    
    if let firstResult = results.first {
        let position = firstResult.position
        addSphere(at: position)
        // {...}
    }
    

    Different collision shapes and red spheres added at successful hit positions:

    RealityKit scene with different meshes and collision shapes, showing the ability to select positions by performing raycast hit test against the collision shapes

    Edit: Sphere position

    In your addSphere method, the position is currently being applied to both the sphere entity and the anchor entity, which results in an extra offset when the sphere is added as a child to the anchor.

    To ensure the spheres are added accurately, you should set the position for only one of these entities. For instance, you can apply position to the sphere and set the anchor’s position to .zero:

    sphereEntity.position = position
            
    let anchorEntity = AnchorEntity(world: .zero)
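
    Putting that together, the corrected addSphere could look like this (one option; equivalently, you could keep the anchor at position and leave the sphere's local position at .zero):

    ```swift
    private func addSphere(at position: SIMD3<Float>) {
        guard let arView = arView else { return }

        let sphere = MeshResource.generateSphere(radius: 0.01)
        let material = SimpleMaterial(color: .red, isMetallic: false)
        let sphereEntity = ModelEntity(mesh: sphere, materials: [material])

        // Apply the hit position to the sphere only; the anchor stays at the
        // origin, so the offset is no longer applied twice.
        sphereEntity.position = position

        let anchorEntity = AnchorEntity(world: .zero)
        anchorEntity.addChild(sphereEntity)
        arView.scene.addAnchor(anchorEntity)
    }
    ```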
    

    3. Drawing lines between points

    There are various approaches to drawing lines in a 3D scene. Below is one approach that involves constructing a rectangular mesh, which scales according to the distance between the points and aligns with the direction of the line.

    By adding a cornerRadius that is half the line width, you effectively give the line a circular profile (and rounded end-points).

    private func drawLine(from start: SIMD3<Float>, to end: SIMD3<Float>) {
        guard let arView = arView else { return }
     
        let midpoint = (start + end) / 2
        let direction = normalize(end - start)
        let distance = length(end - start)
        let lineWidth: Float = 0.012
    
        let lineMesh = MeshResource.generateBox(
            width: lineWidth, 
            height: lineWidth, 
            depth: distance, 
            cornerRadius: lineWidth / 2)
    
        let material = SimpleMaterial(color: .systemBlue, isMetallic: false)
    
        let lineEntity = ModelEntity(mesh: lineMesh, materials: [material])
     
        lineEntity.orientation = simd_quatf(
            from: SIMD3<Float>(0, 0, 1), 
            to: direction)
        lineEntity.position = midpoint
     
        let anchorEntity = AnchorEntity(world: [0, 0, 0])
        anchorEntity.addChild(lineEntity)
        arView.scene.addAnchor(anchorEntity)
     }
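
    One edge case worth guarding against: simd_quatf(from:to:) has no unique solution when the two vectors are exactly antiparallel, so a point placed directly behind the previous one along -Z can produce a broken orientation. A small guard, as a sketch to drop into drawLine in place of the orientation assignment:

    ```swift
    // If direction is (nearly) antiparallel to +Z, simd_quatf(from:to:) is
    // ill-defined; fall back to an explicit 180° rotation about the Y axis.
    let forward = SIMD3<Float>(0, 0, 1)
    if dot(forward, direction) < -0.9999 {
        lineEntity.orientation = simd_quatf(angle: .pi, axis: [0, 1, 0])
    } else {
        lineEntity.orientation = simd_quatf(from: forward, to: direction)
    }
    ```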
    

    With these changes to your code, you should be able to run a virtual scene in RealityKit, tap to perform raycast hit tests against collision shapes in the scene and draw lines between the tapped positions:

    Animated image demonstrating the ability to draw line meshes between successful raycast hit positions in the scene