I am developing an immersive visionOS app based on RealityKit and SwiftUI. This app has ModelEntities that have a PerspectiveCamera entity as a child.
I want to display the camera view in a 2D window in visionOS.
I create the camera and add it to the entity with:
let cameraEntity = PerspectiveCamera()
cameraEntity.camera.far = 10000
cameraEntity.camera.fieldOfViewInDegrees = 60
cameraEntity.camera.near = 0.01
entity.addChild(cameraEntity)
There are some posts on SO, like this one, that apparently show such camera views as part of an ARView. However, my app is not an AR app; the immersive view is generated programmatically.
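For illustration, a programmatically generated immersive view of this kind looks roughly like the following simplified sketch (the names ImmersiveScene and the placeholder box geometry are mine, not from my actual app):

import SwiftUI
import RealityKit

struct ImmersiveScene: Scene {
    var body: some Scene {
        ImmersiveSpace(id: "immersive") {
            RealityView { content in
                // A model entity like the ones described above (placeholder geometry).
                let entity = ModelEntity(mesh: .generateBox(size: 0.2),
                                         materials: [SimpleMaterial()])
                // The PerspectiveCamera from the snippet above is attached
                // as a child of such an entity.
                content.add(entity)
            }
        }
    }
}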
My question is:
How can I show the camera view in a SwiftUI 2D window?
I contacted Apple, and they confirmed that this is a bug in visionOS 1.0 … 2.0 beta: one can currently define a PerspectiveCamera, but cannot use its output. They suggested using a RealityRenderer instead. Here is the suggested code, which I have not verified yet:
import Metal
import Observation
import RealityKit

@Observable
@MainActor
final class OffscreenRenderModel {
    private let renderer: RealityRenderer
    private let colorTexture: MTLTexture

    init(scene: Entity) throws {
        renderer = try RealityRenderer()
        renderer.entities.append(scene)

        // The offscreen renderer needs its own camera in its entity graph.
        let camera = PerspectiveCamera()
        renderer.activeCamera = camera
        renderer.entities.append(camera)

        // Offscreen color target that the camera view is rendered into.
        let textureDesc = MTLTextureDescriptor()
        textureDesc.pixelFormat = .rgba8Unorm
        textureDesc.width = 512
        textureDesc.height = 512
        textureDesc.usage = [.renderTarget, .shaderRead]
        let device = MTLCreateSystemDefaultDevice()!
        colorTexture = device.makeTexture(descriptor: textureDesc)!
    }

    func render() throws {
        let cameraOutputDesc = RealityRenderer.CameraOutput.Descriptor.singleProjection(colorTexture: colorTexture)
        let cameraOutput = try RealityRenderer.CameraOutput(cameraOutputDesc)
        try renderer.updateAndRender(deltaTime: 0.1, cameraOutput: cameraOutput, onComplete: { renderer in
            guard let colorTexture = cameraOutput.colorTextures.first else { fatalError() }
            // colorTexture now holds the rendered scene (see the display sketch below).
        })
    }
}
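To actually get that texture into a 2D SwiftUI window, the model would additionally have to publish each finished frame, e.g. via a var latestFrame: CGImage? that the onComplete closure fills (for instance with makeCGImage(from: colorTexture)). Below is a minimal, untested sketch of that display side; makeCGImage, CameraPreviewView, and latestFrame are my own names, not part of Apple's suggestion, and the CPU readback assumes the texture uses .shared storage (the default on visionOS):

import SwiftUI
import Metal
import CoreGraphics

// Hypothetical helper (my addition): copies an .rgba8Unorm texture into a CGImage.
func makeCGImage(from texture: MTLTexture) -> CGImage? {
    let bytesPerRow = texture.width * 4
    var bytes = [UInt8](repeating: 0, count: bytesPerRow * texture.height)
    // Read the rendered pixels back from the GPU texture.
    texture.getBytes(&bytes,
                     bytesPerRow: bytesPerRow,
                     from: MTLRegionMake2D(0, 0, texture.width, texture.height),
                     mipmapLevel: 0)
    return bytes.withUnsafeMutableBytes { buffer in
        CGContext(data: buffer.baseAddress,
                  width: texture.width,
                  height: texture.height,
                  bitsPerComponent: 8,
                  bytesPerRow: bytesPerRow,
                  space: CGColorSpaceCreateDeviceRGB(),
                  bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)?
            .makeImage()
    }
}

// Hypothetical view (my addition): re-renders on each display refresh
// and shows the latest frame in a 2D window.
struct CameraPreviewView: View {
    let model: OffscreenRenderModel
    @State private var frame: CGImage?

    var body: some View {
        TimelineView(.animation) { timeline in
            Group {
                if let frame {
                    Image(frame, scale: 1, label: Text("Camera view"))
                        .resizable()
                        .scaledToFit()
                } else {
                    Color.black
                }
            }
            .onChange(of: timeline.date) {
                try? model.render()        // kick off the next offscreen frame
                frame = model.latestFrame  // hypothetical property, see lead-in text
            }
        }
    }
}

TimelineView(.animation) drives one render per display refresh; a .periodic schedule would be cheaper if a lower frame rate is acceptable. Round-tripping through CGImage is the simplest route, not the fastest; a Metal-backed view that draws the MTLTexture directly would avoid the CPU copy.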