I have a custom UI component that draws a round ball with some labels superimposed on top of it. I am taking a snapshot of the component using the following `UIView` extension, which I found here on Stack Overflow, and using the resulting `UIImage` in a `CAEmitterCell`.

My problem is that the snapshot image is square: my round ball on a white background. I would like the background to be clear when the image is emitted, but I can't seem to find a way to do it. Is there any way I can modify the `UIImage` to make its corners transparent?

Thanks.
extension UIView {

    /// Create snapshot
    ///
    /// - parameter rect: The `CGRect` of the portion of the view to return. If `nil` (or omitted),
    ///                   return snapshot of the whole view.
    ///
    /// - returns: Returns `UIImage` of the specified portion of the view.

    func snapshot(of rect: CGRect? = nil) -> UIImage? {
        // snapshot entire view
        UIGraphicsBeginImageContextWithOptions(bounds.size, isOpaque, 0)
        drawHierarchy(in: bounds, afterScreenUpdates: true)
        let wholeImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()

        // if no `rect` provided, return image of whole view
        guard let image = wholeImage, let rect = rect else { return wholeImage }

        // otherwise, grab specified `rect` of image
        let scale = image.scale
        let scaledRect = CGRect(x: rect.origin.x * scale,
                                y: rect.origin.y * scale,
                                width: rect.size.width * scale,
                                height: rect.size.height * scale)
        guard let cgImage = image.cgImage?.cropping(to: scaledRect) else { return nil }
        return UIImage(cgImage: cgImage, scale: scale, orientation: .up)
    }
}
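For context, this is roughly how the snapshot feeds the emitter (a sketch; `ballView` stands in for my custom component and the particle parameters are placeholders):

```swift
import UIKit

// Sketch: use the view snapshot as the emitter cell's contents.
let cell = CAEmitterCell()
cell.contents = ballView.snapshot()?.cgImage
cell.birthRate = 10
cell.lifetime = 3
cell.velocity = 100
cell.emissionRange = .pi * 2

let emitterLayer = CAEmitterLayer()
emitterLayer.emitterPosition = CGPoint(x: view.bounds.midX, y: view.bounds.midY)
emitterLayer.emitterCells = [cell]
view.layer.addSublayer(emitterLayer)
```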
After figuring out how to word my question here, I thought of another way to search for my answer and found a solution.
Someone posted an extension to turn white backgrounds transparent as an answer to a previous question. White didn't work for me, but a simple edit and a name change made the extension work for black backgrounds instead of white.
extension UIImage {

    func imageByMakingBlackBackgroundTransparent() -> UIImage? {
        // Re-encode as JPEG to flatten the image and strip its alpha channel,
        // since `maskingColorComponents` requires an image without alpha.
        guard let jpegData = self.jpegData(compressionQuality: 1.0),
              let image = UIImage(data: jpegData),
              let rawImageRef = image.cgImage else { return nil }

        // Mask out pixels whose R, G, and B components all fall in 0...0, i.e. pure black.
        let colorMasking: [CGFloat] = [0, 0, 0, 0, 0, 0]
        guard let maskedImageRef = rawImageRef.copy(maskingColorComponents: colorMasking) else { return nil }

        UIGraphicsBeginImageContext(image.size)
        defer { UIGraphicsEndImageContext() }
        guard let context = UIGraphicsGetCurrentContext() else { return nil }

        // Core Graphics draws with a flipped coordinate system; flip it back.
        context.translateBy(x: 0, y: image.size.height)
        context.scaleBy(x: 1, y: -1)
        context.draw(maskedImageRef, in: CGRect(x: 0, y: 0, width: image.size.width, height: image.size.height))

        return UIGraphicsGetImageFromCurrentImageContext()
    }
}
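In practice the workflow looks like this (a sketch; `ballView` and `cell` are placeholders for my component and emitter cell):

```swift
// Temporarily swap the background to black so that black, not white,
// is the color being masked to transparent.
ballView.backgroundColor = .black
let snapshot = ballView.snapshot()
ballView.backgroundColor = .white

cell.contents = snapshot?.imageByMakingBlackBackgroundTransparent()?.cgImage
```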
My image was originally on a white background, but the important part of the image also contained white, so masking white directly would have punched holes in the ball itself. I temporarily changed my background to black, took the snapshot, converted black to transparent, then changed the background back to white.

The end result is that my problem is fixed. Thanks to anyone who took any time to read or think about this problem.
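As an aside, another approach that may avoid the black-background round trip entirely is to render the snapshot into a non-opaque context, so a `.clear` view background is preserved in the image's alpha channel. A minimal sketch, assuming the ball view's `backgroundColor` is set to `.clear`:

```swift
import UIKit

extension UIView {
    /// Snapshot into a transparent (non-opaque) context, passing `false`
    /// for the `opaque` parameter so a `.clear` background stays transparent.
    func transparentSnapshot() -> UIImage? {
        UIGraphicsBeginImageContextWithOptions(bounds.size, false, 0)
        defer { UIGraphicsEndImageContext() }
        drawHierarchy(in: bounds, afterScreenUpdates: true)
        return UIGraphicsGetImageFromCurrentImageContext()
    }
}
```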