I have a chain of CIFilters that ultimately scale & crop the large image (from the iPhone camera) down to a 1080x1920 CIImage.
I then want to save the image as a JPG:
var outputFilter: CIFilter?
...
if let ciImage = outputFilter?.outputImage {
    let outputImage = UIImage(ciImage: ciImage)
    let data = outputImage.jpegData(compressionQuality: 0.8)
    ...
}
The ciImage.extent is 1080x1920, outputImage.size is also 1080x1920, and outputImage.scale is 1.0.
The image saved to disk, however, is 3x as large: 3240x5760.
What am I missing?
UIImage(ciImage:) returns an image that gets rendered based on your screen scale. If you check UIScreen.main.scale you will see it is 3.0, which is exactly the 3x you are seeing (1080 x 3 = 3240, 1920 x 3 = 5760). What you need is to initialize your UIImage with the screen scale:
let outputImage = UIImage(ciImage: ciImage, scale: UIScreen.main.scale, orientation: .up)
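For example, the save path from your question would then look something like this (a minimal sketch, assuming outputFilter is your existing filter chain and destinationURL is a hypothetical file URL of your choosing, neither of which is shown in the question):

if let ciImage = outputFilter?.outputImage {
    // Pass the screen scale so the UIImage's point size matches the CIImage's pixel size.
    let outputImage = UIImage(ciImage: ciImage, scale: UIScreen.main.scale, orientation: .up)
    if let data = outputImage.jpegData(compressionQuality: 0.8) {
        try? data.write(to: destinationURL) // destinationURL is a placeholder for your own output location
    }
}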
To render the image you can use UIGraphicsImageRenderer:
extension CIImage {
    var rendered: UIImage {
        // Render the CIImage into a CGImage covering its full extent.
        let cgImage = CIContext(options: nil).createCGImage(self, from: extent)!
        let size = extent.size

        let format = UIGraphicsImageRendererFormat.default()
        format.opaque = false
        // Render at 1x so the bitmap matches the CIImage's pixel dimensions
        // instead of being multiplied by the screen scale.
        format.scale = 1

        return UIGraphicsImageRenderer(size: size, format: format).image { ctx in
            // Core Graphics draws CGImages with a flipped y-axis relative to UIKit,
            // so flip the context before drawing to keep the image upright.
            var transform = CGAffineTransform(scaleX: 1, y: -1)
            transform = transform.translatedBy(x: 0, y: -size.height)
            ctx.cgContext.concatenate(transform)
            ctx.cgContext.draw(cgImage, in: CGRect(origin: .zero, size: size))
        }
    }
}
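Usage is then a small addition to your existing pipeline (again just a sketch; outputFilter is assumed to be your filter chain from the question):

if let ciImage = outputFilter?.outputImage {
    // With format.scale = 1 above, this UIImage is backed by a 1080x1920 bitmap at scale 1.
    let data = ciImage.rendered.jpegData(compressionQuality: 0.8)
    // ... write data to disk as before
}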