I'm building an image cropping feature with perspective correction, and I've had success using the CIPerspectiveCorrection CIFilter.
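For reference, this is roughly how I'm applying the correction; sourceImage and quad are placeholders for my actual input image and detected corners, and the corner points have to be in Core Image's bottom-left-origin coordinate space:

let ciImage = CIImage(image: sourceImage)!
let corrected = ciImage.applyingFilter("CIPerspectiveCorrection", parameters: [
    "inputTopLeft": CIVector(cgPoint: quad.topLeft),
    "inputTopRight": CIVector(cgPoint: quad.topRight),
    "inputBottomLeft": CIVector(cgPoint: quad.bottomLeft),
    "inputBottomRight": CIVector(cgPoint: quad.bottomRight)
])
if let cgImage = CIContext().createCGImage(corrected, from: corrected.extent) {
    let croppedImage = UIImage(cgImage: cgImage)
}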
My question is: how do I animate from the original image to my cropped image (with the correction applied)?
layer.filters is not supported on iOS, per the documentation.
I'm thinking maybe CATransform3D is the way to go, but I'm not familiar with transforms and matrices.
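From examples I've seen, a perspective transform on a layer looks something like the snippet below (the values are arbitrary); what I can't work out is how to derive the right matrix from my crop quad:

var transform = CATransform3DIdentity
transform.m34 = -1.0 / 500.0  // adds perspective
transform = CATransform3DRotate(transform, .pi / 8, 1, 0, 0)  // tilt around the x-axis
imageView.layer.transform = transform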
I'm looking for the same kind of animation as the cropping animation in the CamScanner app.
Found a way, though it's a bit of a brute-force approach: I create multiple images interpolated between the initial image and the final image, then play them back with a CAKeyframeAnimation.
// Interpolation factors from 5% to 100% of the way from the original quad
// to the corrected quad (hardcoded, see the note below).
let points: [CGFloat] = [0.05, 0.1, 0.15, 0.2, 0.25, 0.3, 0.35, 0.4, 0.45, 0.5,
                         0.55, 0.6, 0.65, 0.7, 0.75, 0.8, 0.85, 0.9, 0.95, 1.0]

let context = CIContext(options: nil)   // reuse one context instead of creating one per frame
var images: [CGImage] = []

for i in points {
    // Linearly interpolate each corner between the original quad and the target quad.
    let TL = CGPoint(x: originalQuad.topLeft.x + i * (cartesianScaledQuad.topLeft.x - originalQuad.topLeft.x),
                     y: originalQuad.topLeft.y + i * (cartesianScaledQuad.topLeft.y - originalQuad.topLeft.y))
    let TR = CGPoint(x: originalQuad.topRight.x + i * (cartesianScaledQuad.topRight.x - originalQuad.topRight.x),
                     y: originalQuad.topRight.y + i * (cartesianScaledQuad.topRight.y - originalQuad.topRight.y))
    let BL = CGPoint(x: originalQuad.bottomLeft.x + i * (cartesianScaledQuad.bottomLeft.x - originalQuad.bottomLeft.x),
                     y: originalQuad.bottomLeft.y + i * (cartesianScaledQuad.bottomLeft.y - originalQuad.bottomLeft.y))
    let BR = CGPoint(x: originalQuad.bottomRight.x + i * (cartesianScaledQuad.bottomRight.x - originalQuad.bottomRight.x),
                     y: originalQuad.bottomRight.y + i * (cartesianScaledQuad.bottomRight.y - originalQuad.bottomRight.y))

    // Top/bottom corners are deliberately swapped here: Core Image works in a
    // bottom-left-origin (Cartesian) coordinate space.
    let filteredImage = ciImage.applyingFilter("CIPerspectiveCorrection", parameters: [
        "inputTopLeft": CIVector(cgPoint: BL),
        "inputTopRight": CIVector(cgPoint: BR),
        "inputBottomLeft": CIVector(cgPoint: TL),
        "inputBottomRight": CIVector(cgPoint: TR)
    ])

    if let image = context.createCGImage(filteredImage, from: filteredImage.extent) {
        images.append(image)
    }
}

// Play the frames back on the layer's contents, then leave the final frame in place.
let anim = CAKeyframeAnimation(keyPath: "contents")
anim.duration = 0.5
anim.values = images
anim.delegate = self
imageView.layer.add(anim, forKey: "contents")
imageView.layer.contents = images.last
I had issues incrementing the interpolation factor in a loop, so I hardcoded every point. It works fine and looks good, although the overhead from creating the CGImages is noticeable, especially if more frames are generated.
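For anyone who wants to avoid the hardcoded array: deriving each factor from an integer index, instead of repeatedly adding 0.05, sidesteps the floating-point drift that broke my loop (just a sketch, not what I ended up shipping):

let frameCount = 20
let points = (1...frameCount).map { CGFloat($0) / CGFloat(frameCount) }  // 0.05, 0.10, ..., 1.0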
Hopefully, Apple will support layer.filters on iOS in the future.