According to this answer by ldoogy, setting the drawsAsynchronously
property of CALayer
to true enables a Metal-based renderer, vastly improving performance.
This webpage affirms ldoogy's claim.
However, I see no performance difference whether drawsAsynchronously
is set to true or false.
let layer = CALayer()
let drawsAsynchronously = true // makes no difference whether true or false
layer.drawsAsynchronously = drawsAsynchronously

let f = CGRect(x: 0.0, y: 0.0, width: 1024.0, height: 1024.0)
let cgColor = UIColor.orange.cgColor

var lines: [CGPath] = [] // populated with several hundred paths, some with hundreds of points
let start = CFAbsoluteTimeGetCurrent()
for path in lines {
    let pathLayer = CAShapeLayer()
    pathLayer.path = path
    pathLayer.strokeColor = cgColor
    pathLayer.fillColor = nil
    pathLayer.lineWidth = 1.0
    pathLayer.drawsAsynchronously = drawsAsynchronously
    layer.addSublayer(pathLayer)
}
UIGraphicsBeginImageContext(f.size)
let ctx = UIGraphicsGetCurrentContext()
layer.render(in: ctx!)
let newImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()

// time: 0.05170309543609619
print("time: \(CFAbsoluteTimeGetCurrent() - start)")
ldoogy specifies renderInContext
, which is what I am using above; render(in:)
is the modern Swift name for renderInContext
.
Surprisingly, UIGraphicsImageRenderer
is much slower (nearly 3x), but it also makes no difference whether drawsAsynchronously
is set to true or false:
let renderer = UIGraphicsImageRenderer(size: f.size)
let capturedImage = renderer.image { ctx in
    layer.render(in: ctx.cgContext)
}
// time: 0.13654804229736328
print("time: \(CFAbsoluteTimeGetCurrent() - start)")
Is there something I've missed to enable hardware-accelerated rendering with drawsAsynchronously
enabled?
EDIT:
I tried the draw(_:) method too, since ldoogy mentions CGContextStrokePath
, but it was the slowest yet, and it also made no difference whether drawsAsynchronously
is enabled or not.
class LineView: UIView {
    var lines: [CGPath] = [] // populated with several hundred paths, some with hundreds of points
    
    override func draw(_ rect: CGRect) {
        let color = UIColor.orange.cgColor
        if let context = UIGraphicsGetCurrentContext() {
            for path in lines {
                context.saveGState()
                context.addPath(path)
                context.setStrokeColor(color)
                context.setLineWidth(1.0)
                context.strokePath()
                context.restoreGState()
            }
        }
    }
}
class ViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        
        let drawsAsynchronously = true // makes no difference whether true or false
        view.layer.drawsAsynchronously = drawsAsynchronously
        
        let f = CGRect(x: 0.0, y: 0.0, width: 1024.0, height: 1024.0)
        let lineView = LineView(frame: f)
        lineView.layer.drawsAsynchronously = drawsAsynchronously
        view.addSubview(lineView)
        
        let start = CFAbsoluteTimeGetCurrent()
        
        UIGraphicsBeginImageContext(f.size)
        let ctx = UIGraphicsGetCurrentContext()
        lineView.layer.render(in: ctx!)
        let newImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        
        // time: 0.14092397689819336
        print("time: \(CFAbsoluteTimeGetCurrent() - start)")
    }
}
Since it is so much slower (3x) than using CAShapeLayer
, I suspect CAShapeLayer
may be using the GPU to render to the image. I'd still like to get the method described by ldoogy working, as none of the three methods I've tried show any difference when using it.
Several things to talk about...
First, the speed difference you observed between UIGraphicsGetImageFromCurrentImageContext
and UIGraphicsImageRenderer
is because you are rendering different-size images.
Assuming you are on an @3x
device, UIGraphicsImageRenderer
is rendering a 1024 x 1024 UIImage
, but its .scale
is 3, so it's actually a 3072 x 3072 pixel image.
To get equivalent images, change that code block to this:
let fmt = UIGraphicsImageRendererFormat()
fmt.scale = 1
let renderer = UIGraphicsImageRenderer(size: f.size, format: fmt)
let capturedImage = renderer.image { ctx in
    layer.render(in: ctx.cgContext)
}
Now UIGraphicsImageRenderer
will produce the same 1024 x 1024 pixel image as UIGraphicsGetImageFromCurrentImageContext
.
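
As a quick sanity check, you can compare the pixel dimensions of the two results. This is a hypothetical helper (not from the original post); the only point is that a UIImage's pixel size is its point size multiplied by its .scale:

```swift
import UIKit

// Hypothetical helper: a UIImage's pixel dimensions are its point size times its .scale.
// With the default renderer format on an @3x device, scale is 3,
// so a 1024-point image is actually 3072 x 3072 pixels.
func pixelSize(of image: UIImage) -> CGSize {
    return CGSize(width: image.size.width * image.scale,
                  height: image.size.height * image.scale)
}
```

With fmt.scale = 1, pixelSize(of: capturedImage) should report 1024 x 1024, matching the UIGraphicsGetImageFromCurrentImageContext output.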
Next, you're timing blocks of code that are not directly related to layer.drawsAsynchronously
-- creating and adding sublayers, generating objects, etc.
Apple's docs on this are not what I would call "in-depth" -- but in Improving Animation Performance we find:
Use Asynchronous Layer Rendering As Needed
Any drawing that you do in your delegate's
drawLayer:inContext:
method or your view's drawRect:
method normally occurs synchronously on your app's main thread. In some situations, though, drawing your content synchronously might not offer the best performance. If you notice that your animations are not performing well, you might try enabling the drawsAsynchronously
property on your layer to move those operations to a background thread. If you do so, make sure your drawing code is thread safe. And as always, you should always measure the performance of drawing asynchronously before putting it into your production code.
Important to note -- the docs are talking (mainly) about animation performance... not "single-rendering" tasks.
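
Also worth noting: drawsAsynchronously only applies to the Core Graphics commands issued inside draw(in:) / drawRect:. A CAShapeLayer renders its path property internally, without ever calling draw(in:), which is consistent with it showing no difference. A minimal sketch of the kind of layer where the flag can matter (MyAsyncLayer is a hypothetical name):

```swift
import UIKit

// Minimal sketch: the flag defers the CG commands issued in draw(in:),
// allowing them to execute off the main thread's synchronous path.
class MyAsyncLayer: CALayer {
    override func draw(in ctx: CGContext) {
        // with drawsAsynchronously == true, these commands are queued
        // and executed asynchronously rather than run inline
        ctx.setFillColor(UIColor.orange.cgColor)
        ctx.fill(bounds)
    }
}

let asyncLayer = MyAsyncLayer()
asyncLayer.drawsAsynchronously = true
asyncLayer.setNeedsDisplay() // draw(in:) runs on the next display pass
```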
From some testing... (screenshots of timing runs with .drawsAsynchronously = true and with .drawsAsynchronously = false)
So, let's look at some actual example code that will demonstrate the difference. There's too much to detail here, but the in-line comments should make clear what's going on:
Custom CALayer
subclass:
class MyCustomDrawLayer: CALayer {
    
    // a property so the caller can read the last draw() duration
    var lastRenderDuration: Double = -1
    
    var pths: [CGPath] = []
    
    var cgStrokeColors: [CGColor] = []
    var cgFillColors: [CGColor] = []
    
    override init() {
        super.init()
        commonInit()
    }
    required init?(coder: NSCoder) {
        super.init(coder: coder)
        commonInit()
    }
    func commonInit() {
        let lineColors: [UIColor] = [
            .red, .systemGreen, .blue
        ]
        let fillColors: [UIColor] = [
            .yellow, .green, .cyan
        ]
        cgStrokeColors = lineColors.map({ $0.cgColor })
        cgFillColors = fillColors.map({ $0.cgColor })
    }
    
    override func draw(in ctx: CGContext) {
        super.draw(in: ctx)
        
        guard pths.count > 0 else { return }
        
        let drawStart = CFAbsoluteTimeGetCurrent()
        
        // cycle through 3 stroke/fill color sets as we draw the paths
        for (i, pth) in pths.enumerated() {
            ctx.setStrokeColor(cgStrokeColors[i % cgStrokeColors.count])
            ctx.setFillColor(cgFillColors[i % cgFillColors.count])
            ctx.addPath(pth)
            ctx.drawPath(using: .fillStroke)
        }
        
        let drawEnd = CFAbsoluteTimeGetCurrent()
        lastRenderDuration = drawEnd - drawStart
    }
}
"Swift Bird" path:
class SwiftyBird: NSObject {
    func path(inRect: CGRect) -> UIBezierPath {
        let thisShape = UIBezierPath()
        
        thisShape.move(to: CGPoint(x: 0.31, y: 0.94))
        thisShape.addCurve(to: CGPoint(x: 0.00, y: 0.64), controlPoint1: CGPoint(x: 0.18, y: 0.87), controlPoint2: CGPoint(x: 0.07, y: 0.76))
        thisShape.addCurve(to: CGPoint(x: 0.12, y: 0.72), controlPoint1: CGPoint(x: 0.03, y: 0.67), controlPoint2: CGPoint(x: 0.07, y: 0.70))
        thisShape.addCurve(to: CGPoint(x: 0.57, y: 0.72), controlPoint1: CGPoint(x: 0.28, y: 0.81), controlPoint2: CGPoint(x: 0.45, y: 0.80))
        thisShape.addCurve(to: CGPoint(x: 0.57, y: 0.72), controlPoint1: CGPoint(x: 0.57, y: 0.72), controlPoint2: CGPoint(x: 0.57, y: 0.72))
        thisShape.addCurve(to: CGPoint(x: 0.15, y: 0.23), controlPoint1: CGPoint(x: 0.40, y: 0.57), controlPoint2: CGPoint(x: 0.26, y: 0.39))
        thisShape.addCurve(to: CGPoint(x: 0.10, y: 0.15), controlPoint1: CGPoint(x: 0.13, y: 0.21), controlPoint2: CGPoint(x: 0.11, y: 0.18))
        thisShape.addCurve(to: CGPoint(x: 0.50, y: 0.49), controlPoint1: CGPoint(x: 0.22, y: 0.28), controlPoint2: CGPoint(x: 0.43, y: 0.44))
        thisShape.addCurve(to: CGPoint(x: 0.22, y: 0.09), controlPoint1: CGPoint(x: 0.35, y: 0.31), controlPoint2: CGPoint(x: 0.21, y: 0.08))
        thisShape.addCurve(to: CGPoint(x: 0.69, y: 0.52), controlPoint1: CGPoint(x: 0.46, y: 0.37), controlPoint2: CGPoint(x: 0.69, y: 0.52))
        thisShape.addCurve(to: CGPoint(x: 0.71, y: 0.54), controlPoint1: CGPoint(x: 0.70, y: 0.53), controlPoint2: CGPoint(x: 0.70, y: 0.53))
        thisShape.addCurve(to: CGPoint(x: 0.61, y: 0.00), controlPoint1: CGPoint(x: 0.77, y: 0.35), controlPoint2: CGPoint(x: 0.71, y: 0.15))
        thisShape.addCurve(to: CGPoint(x: 0.92, y: 0.68), controlPoint1: CGPoint(x: 0.84, y: 0.15), controlPoint2: CGPoint(x: 0.98, y: 0.44))
        thisShape.addCurve(to: CGPoint(x: 0.92, y: 0.70), controlPoint1: CGPoint(x: 0.92, y: 0.69), controlPoint2: CGPoint(x: 0.92, y: 0.70))
        thisShape.addCurve(to: CGPoint(x: 0.92, y: 0.70), controlPoint1: CGPoint(x: 0.92, y: 0.70), controlPoint2: CGPoint(x: 0.92, y: 0.70))
        thisShape.addCurve(to: CGPoint(x: 0.99, y: 1.00), controlPoint1: CGPoint(x: 1.00, y: 0.86), controlPoint2: CGPoint(x: 1.00, y: 1.00))
        thisShape.addCurve(to: CGPoint(x: 0.75, y: 0.93), controlPoint1: CGPoint(x: 0.92, y: 0.86), controlPoint2: CGPoint(x: 0.81, y: 0.90))
        thisShape.addCurve(to: CGPoint(x: 0.31, y: 0.94), controlPoint1: CGPoint(x: 0.64, y: 1.01), controlPoint2: CGPoint(x: 0.47, y: 1.00))
        thisShape.close()
        
        let tr = CGAffineTransform(translationX: inRect.minX, y: inRect.minY)
            .scaledBy(x: inRect.width, y: inRect.height)
        thisShape.apply(tr)
        
        return thisShape
    }
}
Looks like this if inRect
is (roughly) 200x200:
Test View Controller class:
class MyDrawAsyncTestVC: UIViewController {
    
    let customLayer = MyCustomDrawLayer()
    
    var manySimplePaths: [CGPath] = []
    var fewComplexPaths: [CGMutablePath] = []
    
    var bUseManyPaths: Bool = true
    
    override func viewDidLoad() {
        super.viewDidLoad()
        
        view.backgroundColor = .systemYellow
        
        // generate a 64x64 "grid" of 16x16 paths
        //  (fills the 1024x1024 size)
        let v: CGFloat = 16.0
        var r: CGRect = .init(x: 0.0, y: 0.0, width: v, height: v)
        for col in 0..<64 {
            for row in 0..<64 {
                r.origin = .init(x: CGFloat(col) * v, y: CGFloat(row) * v)
                manySimplePaths.append(SwiftyBird().path(inRect: r).cgPath)
            }
        }
        // manySimplePaths has 4096 paths
        
        fewComplexPaths = [
            CGMutablePath(),
            CGMutablePath(),
            CGMutablePath(),
        ]
        for (j, pth) in manySimplePaths.enumerated() {
            fewComplexPaths[j % fewComplexPaths.count].addPath(pth)
        }
        // fewComplexPaths produces the same output,
        //  but uses only 3 paths:
        //  [0] has 1366 subpaths
        //  [1] has 1365 subpaths
        //  [2] has 1365 subpaths
        
        customLayer.pths = manySimplePaths
        
        // the custom layer MUST be in the view hierarchy,
        //  but it doesn't have to be visible,
        //  so we'll add it as a sublayer but position it "out-of-frame"
        let sz: CGSize = .init(width: 1024.0, height: 1024.0)
        customLayer.frame = .init(x: -(sz.width + 10.0), y: -(sz.height + 10.0), width: sz.width, height: sz.height)
        view.layer.addSublayer(customLayer)
    }
    
    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        
        // let's call the test/render func every second
        Timer.scheduledTimer(withTimeInterval: 1.0, repeats: true, block: { _ in
            self.testMe()
        })
    }
    
    // tap anywhere to toggle between manySimplePaths and fewComplexPaths
    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        bUseManyPaths.toggle()
        customLayer.pths = bUseManyPaths ? manySimplePaths : fewComplexPaths
        print("\nSwitched to:", bUseManyPaths ? "manySimplePaths" : "fewComplexPaths", "\n")
    }
    
    var iCount: Int = 0
    
    func testMe() {
        let f = customLayer.frame
        
        // toggle .drawsAsynchronously each time through
        customLayer.drawsAsynchronously.toggle()
        
        let genImageStart = CFAbsoluteTimeGetCurrent()
        
        UIGraphicsBeginImageContext(f.size)
        guard let ctx = UIGraphicsGetCurrentContext() else { fatalError("Could not get Context!!!") }
        
        let renderStart = CFAbsoluteTimeGetCurrent()
        
        // we want .render(in:) to trigger a call to draw(in:) in the custom layer
        customLayer.setNeedsDisplay()
        
        // render the layer
        customLayer.render(in: ctx)
        
        let renderEnd = CFAbsoluteTimeGetCurrent()
        
        let newImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        
        let genImageEnd = CFAbsoluteTimeGetCurrent()
        
        if iCount == 0 {
            print("\nWe're ignoring the first few timing values, so we're not measuring overhead...")
        }
        iCount += 1
        if iCount < 4 {
            print(iCount)
            return
        }
        
        var s: String = "async: \(customLayer.drawsAsynchronously)"
        s += customLayer.drawsAsynchronously ? "\t\t" : "\t"
        s += "Draw Time: "
        s += String(format: "%0.10f", customLayer.lastRenderDuration)
        s += "\t\t"
        s += "Render Time: "
        s += String(format: "%0.10f", renderEnd - renderStart)
        s += "\t\t"
        s += "Gen Image Time: "
        s += String(format: "%0.10f", genImageEnd - genImageStart)
        print(s)
    }
}
When running, we don't see anything on the screen (just the yellow background, so we know the app is "live").
It starts a timer that renders a 1024x1024 image every second, alternating .drawsAsynchronously
between true and false, and prints timing stats to the debug console.
Tapping anywhere toggles between rendering manySimplePaths
or fewComplexPaths
-- both produce the exact same output image.
You should see something similar to this in the debug console:
2024-01-17 13:00:38.706243-0500 MyProj[66254:6949370] Metal GPU Frame Capture Enabled
2024-01-17 13:00:38.708287-0500 MyProj[66254:6949370] Metal API Validation Enabled
We're ignoring the first few timing values, so we're not measuring overhead...
1
2
3
async: false Draw Time: 0.0868519545 Render Time: 0.0882049799 Gen Image Time: 0.0901809931
async: true Draw Time: 0.0249859095 Render Time: 0.1147090197 Gen Image Time: 0.1166020632
async: false Draw Time: 0.0890671015 Render Time: 0.0899358988 Gen Image Time: 0.0919650793
async: true Draw Time: 0.0232139826 Render Time: 0.1093589067 Gen Image Time: 0.1112560034
Switched to: fewComplexPaths
async: false Draw Time: 0.1343829632 Render Time: 0.1352089643 Gen Image Time: 0.1371099949
async: true Draw Time: 0.0092250109 Render Time: 0.0681159496 Gen Image Time: 0.0701240301
async: false Draw Time: 0.1334309578 Render Time: 0.1342890263 Gen Image Time: 0.1361669302
async: true Draw Time: 0.0110900402 Render Time: 0.0679899454 Gen Image Time: 0.0699119568
The rendered 1024x1024 image should look like this: