swiftui, realitykit, visionos, reality-composer-pro

Play Audio when I use SpatialTapGesture or Direct Gesture


I am learning how to develop for the Vision Pro, and I am trying to make a simple bubble that makes a pop noise when it's interacted with.

I have the model created and looking the way I want, but I cannot get the audio to play when I tap or touch the bubble.

struct ImmersiveView: View {
    
    @State var predicate = QueryPredicate<Entity>.has(ModelComponent.self)
    @State private var timer: Timer?
    @State private var audioController: AudioPlaybackController?

    var body: some View {
        RealityView { content in
            // Add the initial RealityKit content
            if let immersiveContentEntity = try? await Entity(named: "Bubble", in: realityKitContentBundle) {
                content.add(immersiveContentEntity)

                // Put skybox here.  See example in World project available at
                // https://developer.apple.com/
            }
        }.gesture(SpatialTapGesture().targetedToEntity(where: predicate).onEnded({
            value in
            let entity = value.entity
            
            let spatialAudio = entity.findEntity(named: "SpatialAudio")
            
            guard let resource = try? AudioFileResource.load(named: "/Bubble/bubble_sound_43207.mp3", from: "Bubble.usda", in: realityKitContentBundle) else {
                fatalError("Unable to find bubble audio file.")
            }
            
            audioController = (spatialAudio?.prepareAudio(resource))!
            audioController?.play()
            
            var material = entity.components[ModelComponent.self]?.materials.first as! ShaderGraphMaterial
            let frameRate: TimeInterval = 1.0/60.0
            let duration: TimeInterval = 0.25
            let targetValue: Float = 1
            let totalFrames = Int(duration / frameRate)
            
            var currentFrame = 0
            var popValue: Float = 0
            
            timer?.invalidate()
            timer = Timer.scheduledTimer(withTimeInterval: frameRate, repeats: true, block: { timer in
                currentFrame += 1
                let progress: Float = Float(currentFrame) / Float(totalFrames)
                
                popValue = progress * targetValue
                
                do {
                    try material.setParameter(name: "Pop", value: .float(popValue))
                    entity.components[ModelComponent.self]?.materials = [material]
                }
                catch {
                    print(error.localizedDescription)
                }
                
                if currentFrame >= totalFrames {
                    timer.invalidate()
                    entity.removeFromParent()
                    
                }
            
            })
        }))
    }
}

My big guess is that it cannot find the audio file, but I have tried every possible path that I can think of.

I'm not sure how best to share the file structure here, but I can share it if it's needed.

Please let me know what I messed up. Lol.


Solution

  • First of all, remember that a tap gesture will not work without Collision and InputTarget components on the model. In addition, to play spatial audio you'll need a SpatialAudio component with an assigned audio file. My version of the scene in Reality Composer Pro may differ significantly from yours, but you'll get a general idea of how it works from the screenshots and the code below.

    (screenshots of the Reality Composer Pro scene setup)


    import SwiftUI
    import RealityKit
    import RealityKitContent
    
    struct ImmersiveView: View {
        // Only target entities that have a ModelComponent
        @State var predicate = QueryPredicate<Entity>.has(ModelComponent.self)
        @State var audioController: AudioPlaybackController?
    
        var body: some View {
            RealityView { content in
                if let immersiveContentEntity = try? await Entity(
                    named: "Immersive", 
                       in: realityKitContentBundle
                ) {
                    content.add(immersiveContentEntity)
                    
                    // A tap gesture only works if the model has collision shapes
                    // and an InputTargetComponent
                    let modelEntity = immersiveContentEntity.findEntity(named: "Bubble") as! ModelEntity
                    modelEntity.generateCollisionShapes(recursive: false)
                    modelEntity.components.set(InputTargetComponent())

                    // Print the hierarchy to verify entity and resource names
                    print(immersiveContentEntity)
                }
            }
            .gesture(
                SpatialTapGesture()
                    .targetedToEntity(where: predicate)
                    .onEnded { value in
                        let entity = value.entity
                        
                        // Load the audio file packaged inside the Reality Composer Pro scene
                        guard let resource = try? AudioFileResource.load(
                            named: "/Root/bubble_sound_43207", 
                             from: "Immersive.usda", 
                               in: realityKitContentBundle) else {
                            print("Unable to find bubble audio file.")
                            return
                        }
                        audioController = entity.prepareAudio(resource)
                        audioController?.play()
                    }
            )
        }
    }
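
    By the way, the SpatialAudio setup shown in Reality Composer Pro can also be done in code. This is only a rough sketch rather than my exact scene; the entity name "Bubble" is the one assumed from the question:

    // Sketch: attach a SpatialAudioComponent in code instead of adding
    // a SpatialAudio object in Reality Composer Pro.
    if let bubble = immersiveContentEntity.findEntity(named: "Bubble") {
        // Configures spatial rendering for any audio played on this entity
        bubble.components.set(SpatialAudioComponent())
    }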
    

    Sometimes it's much easier to load a separate audio file into a scene.
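
    For example, if you add the mp3 directly to your app target instead of referencing it inside the Reality Composer Pro package, playback could look roughly like this (the file name is the one from the question; treat it as a sketch):

    // Sketch: load the sound from the app's main bundle and play it on the tapped entity.
    // Assumes bubble_sound_43207.mp3 is added to the app target, not to the RealityKitContent package.
    do {
        let resource = try AudioFileResource.load(named: "bubble_sound_43207.mp3")
        audioController = entity.prepareAudio(resource)
        audioController?.play()
    } catch {
        print("Unable to load bubble audio:", error)
    }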