Tags: swift, swiftui, caemitterlayer, caemittercell, nsviewrepresentable

How to use CAEmitterLayer on macOS in a SwiftUI app using NSViewRepresentable


I want to use a CAEmitterLayer within a macOS app that is based on SwiftUI.

Problem: The layer itself is perfectly visible, but it doesn’t emit any particles.

→ Here’s my demo project on GitHub

[Screenshot: the layer's green background is visible, but no particles are emitted]

I basically built an NSViewRepresentable for a custom class EmitterNSView: NSView that handles the emitter layer itself (the wrapper is shown after the class below).

final class EmitterNSView: NSView {

    private let emitterLayer: CAEmitterLayer = {
        let layer = CAEmitterLayer()
        layer.backgroundColor = NSColor.green.withAlphaComponent(0.33).cgColor
        return layer
    }()

    private let emitterCells: [CAEmitterCell] = {
        // https://developer.apple.com/documentation/quartzcore/caemitterlayer
        let cell = CAEmitterCell()
        cell.name = "someParticle"
        cell.birthRate = 10
        cell.lifetime = 5.0
        cell.velocity = 100
        cell.velocityRange = 50
        cell.emissionLongitude = 0.0
        cell.emissionRange = CGFloat.pi * 2.0
        cell.spinRange = 5
        cell.scale = 1.0
        cell.scaleRange = 0.25
        cell.alphaSpeed = 0.25
        cell.contents = NSImage(named: "whiteParticle.png")!.cgImage
        cell.color = NSColor.systemPink.cgColor
        cell.xAcceleration = 4
        cell.yAcceleration = 3
        cell.zAcceleration = 2
        return [cell]
    }()

    override init(frame frameRect: NSRect) {
        super.init(frame: frameRect)
        self.wantsLayer = true
        self.layer = CALayer()
        self.layer?.autoresizingMask = [.layerWidthSizable, .layerHeightSizable]
        self.configureEmitterLayer()
        self.layer?.addSublayer(self.emitterLayer)
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    override func layout() {
        super.layout()
        self.configureEmitterLayer()
    }

    override func updateLayer() {
        super.updateLayer()
        self.configureEmitterLayer()
    }

    func configureEmitterLayer() {
        self.emitterLayer.frame = self.frame
        self.emitterLayer.autoresizingMask = [.layerHeightSizable, .layerWidthSizable]
        self.emitterLayer.masksToBounds = false
        self.emitterLayer.drawsAsynchronously = true
        self.emitterLayer.emitterMode = .points
        self.emitterLayer.birthRate = 2
        self.emitterLayer.emitterShape = CAEmitterLayerEmitterShape.line
        self.emitterLayer.emitterSize = CGSize(width: frame.width * 0.5, height: frame.height * 0.5)
        self.emitterLayer.emitterPosition = CGPoint.zero
        self.emitterLayer.renderMode = CAEmitterLayerRenderMode.additive
        self.emitterLayer.emitterCells = self.emitterCells
        self.emitterLayer.zPosition = 10
        self.emitterLayer.beginTime = CACurrentMediaTime()
        self.emitterLayer.speed = 1.5
    }

}
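
The wrapper itself is minimal, roughly like this (simplified here; the exact code is in the demo project, and the name EmitterView is just for illustration):

import SwiftUI

struct EmitterView: NSViewRepresentable {

    func makeNSView(context: Context) -> EmitterNSView {
        EmitterNSView(frame: .zero)
    }

    func updateNSView(_ nsView: EmitterNSView, context: Context) {
        // Nothing to push in from SwiftUI; the NSView reconfigures its
        // emitter layer in layout() / updateLayer().
    }
}

It is then placed in a SwiftUI hierarchy like any other view, e.g. EmitterView().frame(width: 400, height: 300).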

The fact that I can clearly see the layer's green background in the app suggests that something is wrong with the cells. I'm feeling lost at this point.

Very similar implementations in UIKit work just fine.
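
In UIKit, for example, the equivalent line would be something like:

    cell.contents = UIImage(named: "whiteParticle.png")?.cgImage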

How can I use CAEmitterLayer on macOS within a SwiftUI-based app?


Solution

  • Use cgImage(forProposedRect:context:hints:) to convert the NSImage to a CGImage.

    cell.contents = NSImage(named: "whiteParticle.png").flatMap {
        $0.cgImage(forProposedRect: nil, context: nil, hints: nil)
    }
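
    This works because CAEmitterCell expects its contents to be a CGImage. Unlike UIImage, NSImage has no cgImage property, so the image has to be converted explicitly via cgImage(forProposedRect:context:hints:). The flatMap also handles the optional returned by NSImage(named:), leaving contents nil if the image can't be loaded or converted.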