swiftui, realitykit, visionos, reality-composer-pro

Custom Component in Reality Composer Pro


I'm working on a visionOS app and I need to create a custom component for Reality Composer Pro. I followed Apple's examples and managed to get some things working. However, I would like to go one step further and be able to edit the preview of the entity containing my custom component in real time. For example, my custom component has an Opacity field; if I change it to 0.5, I would like to see the entity's opacity become 0.5 in Reality Composer Pro. Here is my current code:

    public struct BinOpenerComponent: Component, Codable {
        var dragAxis: DragAxis
        public var canDrag: Bool = true
        public var opacity: Float = 1.0
        public var objectReferenceName: String? = "Enter name"

        // Define min and max translation values per axis
        public var minTranslation: Float = 0
        public var maxTranslation: Float = 0

        public init(dragAxis: DragAxis = .x,
                    minTranslation: Float = 0.0,
                    maxTranslation: Float = 0.4,
                    opacity: Float = 1.0,
                    objectReferenceName: String = "Enter name") {
            self.dragAxis = dragAxis
            self.minTranslation = minTranslation
            self.maxTranslation = maxTranslation
            self.opacity = opacity
            self.objectReferenceName = objectReferenceName
        }

        // MARK: - Drag Logic

        /// Handle `.onChanged` actions for drag gestures.
        mutating func onChanged(value: EntityTargetValue<DragGesture.Value>) {
            guard canDrag else { return }

            let state = EntityGestureState.shared

            // Only allow a single Entity to be targeted at any given time.
            if state.targetedEntity == nil {
                state.targetedEntity = value.entity
                state.initialOrientation = value.entity.orientation(relativeTo: nil)
            }

            handleFixedDrag(value: value) { currentTranslation in
                DispatchQueue.main.async {
                    state.currentTranslation = currentTranslation
                }
            }
        }

        private mutating func handleFixedDrag(value: EntityTargetValue<DragGesture.Value>, onChange: ((SIMD3<Float>) -> Void)? = nil) {
            let state = EntityGestureState.shared
            guard let entity = state.targetedEntity else { fatalError("Gesture contained no entity") }

            if !state.isDragging {
                state.isDragging = true
                state.dragStartPosition = entity.position(relativeTo: nil) // Use world position as reference
            }

            // Convert the gesture translation to the scene's coordinate space
            let translation3D = value.convert(value.gestureValue.translation3D, from: .local, to: .scene)

            // Determine which axis to apply the translation to
            var desiredTranslation = state.dragStartPosition
            switch dragAxis {
            case .x:
                desiredTranslation.x += translation3D.x
                desiredTranslation.x = clamp(desiredTranslation.x, min: minTranslation, max: maxTranslation)
            case .y:
                desiredTranslation.y += translation3D.y
                desiredTranslation.y = clamp(desiredTranslation.y, min: minTranslation, max: maxTranslation)
            case .z:
                desiredTranslation.z += translation3D.z
                desiredTranslation.z = clamp(desiredTranslation.z, min: minTranslation, max: maxTranslation)
            }

            // Create the new position with clamped translation
            let offset = desiredTranslation

            // Apply the clamped position
            entity.setPosition(offset, relativeTo: nil)

            // Optionally, notify about the change
            onChange?(offset)

            // Restore the initial orientation if needed
            if let initialOrientation = state.initialOrientation {
                entity.setOrientation(initialOrientation, relativeTo: nil)
            }

            // Apply opacity whenever dragging occurs
            //applyOpacity(to: entity)
        }

        /// Handle `.onEnded` actions for drag gestures.
        mutating func onEnded(value: EntityTargetValue<DragGesture.Value>) {
            let state = EntityGestureState.shared
            state.isDragging = false

            state.pivotEntity = nil
            state.targetedEntity = nil
        }

        /// Helper function to clamp a value within a range
        private func clamp(_ value: Float, min: Float, max: Float) -> Float {
            return Swift.max(min, Swift.min(value, max))
        }

        // MARK: - Opacity Logic

        /// Apply the current opacity to the entity's materials.
        func applyOpacity(to entity: Entity) {
            guard let modelEntity = entity as? ModelEntity else { return }

            let opacityComponent = OpacityComponent(opacity: opacity)
            modelEntity.components.set(opacityComponent)
        }
    }

    extension Entity {
        var binOpenerComponent: BinOpenerComponent? {
            get { components[BinOpenerComponent.self] }
            set { components[BinOpenerComponent.self] = newValue }
        }
    }

    public class OpacitySystem: System {

        // Specify that this system processes entities with BinOpenerComponent
        static let query = EntityQuery(where: .has(RealityKitContent.BinOpenerComponent.self))

        required public init(scene: RealityKit.Scene) {}

        /// Called every frame to update entities.
        public func update(context: SceneUpdateContext) {
            for entity in context.scene.performQuery(Self.query) {
                guard let binOpener = entity.binOpenerComponent else { continue }
                binOpener.applyOpacity(to: entity)
            }
        }
    }

I thought that using `class OpacitySystem: System` would update the preview in RCP in real time, but it doesn't. However, it works great on the Apple Vision Pro: the entered opacity value is applied to the entity.

Does anyone have an idea? Thanks in advance for your support and help.


Solution

  • RCP 2.0 component parameters cannot be updated from Swift code; it's a one-way ticket (RCP to RealityKit). The Reality Composer Pro editor never executes your app's Swift code, so a custom `System` like `OpacitySystem` only runs in a live RealityKit scene (Simulator or device), never in the RCP preview.
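
    Since the component only takes effect at runtime, the practical way to check the authored opacity value is to register the component and system before loading the scene, then preview in the visionOS Simulator instead of RCP. A minimal sketch (the app name, `ImmersiveSpace` id, and the `"Scene"` entity name are assumptions based on the default RCP project template):

    ```swift
    import SwiftUI
    import RealityKit
    import RealityKitContent

    @main
    struct BinOpenerApp: App {
        init() {
            // Register before any scene containing the component is loaded,
            // otherwise RealityKit silently drops the component data.
            BinOpenerComponent.registerComponent()
            OpacitySystem.registerSystem()
        }

        var body: some SwiftUI.Scene {
            ImmersiveSpace(id: "Immersive") {
                RealityView { content in
                    // Load the scene authored in Reality Composer Pro;
                    // "Scene" is the default root name in the RCP template.
                    if let scene = try? await Entity(named: "Scene", in: realityKitContentBundle) {
                        content.add(scene)
                    }
                }
            }
        }
    }
    ```

    With this in place, `OpacitySystem.update(context:)` runs every frame and applies whatever opacity value was entered in the RCP inspector, so the Simulator acts as the live preview that RCP itself cannot provide.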